I’m not a programmer, coder, or software engineer. Sometimes I wish I were - I’ve always been impressed by the magical things that can happen when someone understands the inner workings of making things work on computers. However, since I’m not a computer marvel, I’ve found great educational support (and patience) from the people I’ve had the good fortune to meet who work in technology. Over time, I’ve learned that I shouldn’t approach them only when something breaks, such as a system that supports the collection of assessment data. I’ve found that when we think together prior to implementing a particular technology, we can do great things and save a great deal of time and money.
But technology folks have much more on their plates than working with assessment folks, just as we don’t only interact with technology. They have to keep the university’s infrastructure working, plan for what to do when the internet stops working on campus (gasp!), and manage a great number of users who are accessing data from a great number of systems. They also assist in the university’s research operations, maintaining systems that support inquiry-based projects that sometimes have millions of dollars associated with their outcomes. Their jobs are high pressure, and when I’m working with our technology groups, I’m aware that they have many other things on their plates, too. I’m fortunate to have their attention when we are able to meet!
So with all the magic that can happen via computers, what do we as assessment people add to the mix, and how can we position ourselves as colleagues who add value? We understand the context of the data. We know why it is important to collect data in particular ways, what it will be used for (ideally), and what political and legal considerations are at play around the use of specific types of data. We know about other systems people already use, and how a new implementation might add to their plates. We’ve heard how people have had difficulty accessing data, and we can bring those difficulties into a solution-oriented space. Above all that, though, we produce and use data daily, and our technology colleagues do, too. We’re kindred spirits, after all. We are driven by curiosity.
On that note, I recently attended the Higher Education Data Warehousing Forum Conference at Oregon State University. I was asked to speak about Student Affairs and data integration at the conference, and, surprise, I was not the only assessment person there from our field of Student Affairs! There were institutional researchers there, as well. This conference was a wonderful chance to connect with people who work very actively with educational technology on college campuses, and the speakers were talking about things I wish I had known about before I started some major projects at my university. I learned a lot, and the atmosphere of the conference was very welcoming, even for someone like me, who felt like I may have been in the wrong place given my limited technological “know-how.” It was good to know that my perspective was just as valuable to the folks in technology as their perspective was to me. Also, membership in their organization is free, just like our organization here at SAAL!
Rather than drone on about my experiences at the conference and with my great technology team, I wanted to share three learning points I’ve applied to my practice:
Application 1: Work with your technology colleagues to come up with a list of questions that you can ask vendors:
I worked early on with my technology folks to develop questions to ask vendors when they approached us with possibilities for our campus. Questions such as: What access will we have to retrieve the data we put into your system? What are the security features of this software? Who is your biggest competitor, and what does your product do differently? What technical support do you provide, versus what are we expected to provide? I now have a list of about fifteen questions, all developed with various technology people on our campus. It helps keep our vendor meetings focused and on task.
Application 2: Develop trainings that can consider data systems working together, rather than separately:
I learned what the term “interoperability” meant from talking with the technology people on our campus. However, I also learned about it in a different way from our technology users - the people we work with to gather data and conduct assessment. Some people were having to go to multiple systems to gather data points for a single purpose and found this burdensome. I worked with my team to consider how we could build reports that integrate multiple datasets so that these data appear in one “place.” Within our Division of Student Affairs, for example, this lays the groundwork for a dashboard (of dashboards) related to the metrics for our work.
Application 3: Look for an existing technology that may already perform the function you are trying to achieve:
Does this sound familiar? Many of us learned to look for existing data before careening down a data rabbit hole. Similarly, pausing before pursuing or purchasing new software can help determine whether something already exists that will perform a similar, or sometimes the same, function. For example, it’s worth considering whether an enterprise, or campus-wide, solution might work in your context, especially if that solution has already been purchased. In one case, I learned that our survey software simply was not being used to the full extent it could be. Alternatively, sometimes a new product can replace aging products that might not be serving the university well.
I’ve learned additional applications, too, but I’d like to hear from you - how has working with your technology people helped you in your work in student affairs assessment?
Daniel Newhart, Oregon State University