The big moment is here! After all your planning and hard work collecting data throughout the year, the results are in and ready to be analyzed. As you begin to dive into the results, your smile turns to a frown and you begin to feel disappointed.
These results are not what you expected or anticipated.
When experiencing an “Oh no” moment like this, what should you do?
What do I mean by unexpected results?
Unexpected results may fall short of what you anticipated, fail to make sense, or diverge from the targets or benchmarks you set. An important part of growing in assessment is honestly reflecting on what can be improved and done better. Even when everything in the assessment process goes well, assessment professionals need to keep thinking about what we can do to challenge our students and further engage their learning.
Think of this idea in relation to the saying, “Don’t cry over spilt milk.” It’s true, you shouldn’t cry over spilt milk… or coffee. Instead, you should think about what caused the milk to spill and make a plan to prevent a spill for next time or be more cautious moving forward. The same applies to assessment; don’t cry over your data. If there were things you could have done differently, plan to do them differently. If all was right in your process but the results weren’t what you expected, then there’s work to be done in order to improve student learning and experience.
Strategies for Handling Unexpected Results
What follows are some tips and considerations to prepare for and address unexpected results. In the planning stage, set benchmarks for your anticipated results. Data analysis yields results which, even if surprising, aren't "bad" unless we make them that way. Furthermore, a great deal of consideration goes into interpreting the data and thinking through what might have caused the unexpected outcome(s). Consider the following strategies for handling unexpected assessment results in the future.
As assessment professionals, setting appropriate benchmarks or standards is important in the planning phase of assessment. This not only documents expectations for student learning, but can also help contextualize aspects of unexpected results. Whether the actual results are more or less than your standard, you should reflect both on how students are learning and how appropriate your expectations are for their learning. Linda Suskie shares in her blog, “one of the biggest barriers to understanding and using assessment results is figuring out whether or not the results are good enough by setting appropriate benchmarks or standards.”
It is Not All Bad
We also need to recognize that unexpected results, while not ideal, are not a completely terrible thing. Unexpected results can help us improve our teaching, processes, and assessment methods for the following year. Yes, this is hard, tedious work and takes time. However, making modifications and improvements can yield improved student learning. As Green, Jones, and Aloi point out in their article, "An Exploration of High-Quality Student Affairs Learning Outcomes Assessment Practices," "If the assessment process is only considered useful when the data contribute to immediate, observable actions for change, then the aims of the assessment process and the existing definition of use are too narrow." As they note, we often need to dig deeper.
Our next step is to share the results with trusted individuals. It is important to create a safe space as you decide what to do with the information. These trusted colleagues can help you assess, reflect on, and analyze what may have happened. They can also offer additional, diverse perspectives, adding insight to help you navigate your path forward.
In his recent commentary, "Assessment Is an Enormous Waste of Time," Erik Gilbert notes that a recent study found no large, significant difference between seniors and freshmen on the VALUE rubrics. He says, "Maybe the rubrics are flawed, and they don't actually measure student learning. That is a problem for assessment…. But an equally plausible explanation would be that students don't learn much about critical thinking or written communication in college." While the results and their interpretation carry more subtlety than Gilbert allows, it is important to reflect on the results and think through aspects of the assessment process for potential changes.
From All Angles
As assessment professionals, it is important that we consider all of the possible factors that could have affected the results. Some possibilities for reflection include:
- Was the teaching detailed enough?
- Does the curriculum need to be more focused in a certain content area?
- Was there an issue with the validity or reliability of the measuring tool?
- Was it the methods or the way we taught the students?
In addition to the practice and process pieces mentioned above, we also need to consider timing. For example, if the distribution method was a survey, when was it administered to the students? Noting other assessment efforts going on, not to mention institutional seasonality (e.g., course starts and stops, midterms and finals, holidays), can both suggest a possible explanation for unexpected results and inform future planning.
To Include or Not Include
We may choose not to include the results because they may not be useful or because we have other reporting priorities. As assessment professionals, we gather a lot of data, and some pieces of data paint the picture better than others. However, we need to consider that the results may come out anyway. We also need to weigh how this will reflect on our department, our division, and the university at large. Completely ignoring the results may leave out a vital piece of the Student Affairs story.
That said, I would strongly encourage that data be shared, whether the results were expected or not. Data on its own is neither good nor bad. Results are ours to interpret and share, and we should do so to help garner support and collaboration for needed changes when there is room for improvement. Additionally, much can be gained by being transparent. Transparency can help guide us toward action items. As we continually learn about and from assessment, we can work toward sustainable improvements and programming.
Reflection to Action
After reflecting on the assessment results and all the factors that could have affected them, we have a call to action. The results should be used as a catalyst for change where necessary, and they help us collect more information over time. For example, upon reviewing the data, we may spot gaps where we need to be more intentional in gathering data in the future. If the unexpected data is included in the results, it is important to explain it, work to improve it, and follow through on your action plan. As mentioned earlier, results that fall short of expectations can help improve assessment measures for the next year, and unexpected assessment results can help improve future programming. Ultimately, the data should lead to an evidence-based action plan.
In conclusion, the next time you receive unexpected assessment results, remember your benchmarks, take a deep breath, share the results with trusted individuals for interpretation, and reflect on what can or should be improved for the future. Once you create an action plan, make sure to follow through. I believe being an assessment leader is not just about the results we receive but about what we do with them. Unexpected assessment results can help produce growth and improvement for Student Affairs.
Have you experienced unexpected assessment results? If so, how did you handle the situation? What was the biggest lesson you learned?
- Suskie, L. (2015). Setting meaningful benchmarks or standards [Blog post]. Linda Suskie: A Common Sense Approach to Assessment in Higher Education. Retrieved April 18, 2019, from https://www.lindasuskie.com/apps/blog/show/43191428-setting-meaningful-benchmarks-or-standards
- Green, S., Jones, E., & Aloi, S. (2008). An exploration of high-quality student affairs learning outcomes assessment practices. NASPA Journal, 45(1), 133-157. https://doi.org/10.2202/1949-6605.1910
- Gilbert, E. (2019, March 20). Assessment is an enormous waste of time. The Chronicle of Higher Education. https://www.chronicle.com/article/Assessment-Is-an-Enormous/245937
Bethany Williams, Liberty University