“Begin with the end in mind” is a phrase all of us assessment professionals have heard often. Familiar as it is, the phrase carries a lot of truth: evaluation is a necessary and critical aspect of assessment. My background is in public health, where we use a specific framework for evaluation, one that parallels the steps we use for evaluation in assessment. For example, within public health, the purposes of the framework are to:
- Summarize the essential elements of program evaluation
- Provide a framework for conducting effective program evaluations
- Clarify steps in program evaluation
- Review standards for effective program evaluation
- Address misconceptions regarding the purposes and methods of program evaluation.1
In order to measure effectiveness, maximize impact, and increase engagement in our communities or with students, we need to conduct effective evaluation. It is not only necessary; it has become an expectation within Student Affairs. While I may be treading into dangerous waters talking about evaluation given differing definitions and understandings of assessment, evaluation, and research, I’m not alone in making the assessment-evaluation connection. Becki Elkins points out the necessity of good evaluation in Student Affairs: “Expectations that student affairs units will engage in rigorous outcomes assessment, evaluation of program effectiveness, and data-driven decision making have increased across institutions of higher education as resources have become more scarce, calls for accountability more common, and research regarding effective student affairs practice more widely disseminated.”2
So, if evaluation should be occurring, what does that look like?
The first step of evaluation is to engage stakeholders.3 For some of us, this may be our immediate supervisor, our division, or an entire organization. Stakeholders can be people involved in implementing the program, using the results of the evaluation, or funding the program.
As we begin with the end in mind, we need to consider what is most important to our stakeholders so that we can capture that information in our results. Don’t leave this to guesswork. Talk to people. Review their guiding documents (strategic plan, goals, past reports). Know where you came from to plan for where you need to go and what you need to do. A beneficial byproduct of this groundwork is that you can use familiar language and information specific to people’s areas, reinforcing that evaluation is not completely foreign to them. We also need to be mindful of the mission of our department so that we can show how the program we are evaluating contributes to that mission. At this phase, it is helpful to work backwards: start with the end goal, then break down the steps you need to complete in order to get there. In public health, we work backwards by building logic models. This helps us engage others by showing that we have thought through the steps necessary to evaluate our program(s).
The second and third steps in the process are to describe the program and focus on evaluation design. As we are developing assessments, we need to ensure we ask “What are we trying to measure?” It is vital to ensure that there is a clear vision and goal for the program, event or class.
For example, I could set a personal goal saying “I want to become a runner.” That’s a productive and positive goal, given the many physical, mental and emotional health benefits of running. Is this an appropriate goal for evaluation? Nope!
This goal is neither specific nor measurable. What are the criteria for me to become a runner? How long do I need to run? How often? How hard? However, if I state, “I want to run a marathon by January 2020,” I’ve now made my goal not only specific and measurable but also time-bound. Detail matters in assessment and evaluation, where language shapes the measurement and the data to come. We need to create SMART goals: goals that are specific, measurable, attainable, realistic and time-bound. We need to ensure we are using proper goal-setting practices as we carry these design and focus considerations into our assessment work.
To help guide our describing the program and the evaluation design, we can ask ourselves the following questions:
- How appropriate are our starting points (program goals/outcomes)?
- What is the best way to measure this information?
- Is the tool we are using reliable?
- Is this tool valid?
- Is it sustainable?
- Is it relevant?
Engaging in this kind of reflection is useful both for orienting and involving collaborators in your process and for ensuring you are intentional in setting a foundation to execute a plan that achieves your desired results. While you cannot control what the data will be, you can do your part by constructing the evaluative design in preparation for your needs.
The fourth step is to gather credible evidence.
This is where we really gather and begin to synthesize the data that inform our program. We need evidence to answer our questions or demonstrate achievement of our goals. Gathering credible evidence includes reviewing the quality of the information obtained and taking practical steps to improve that quality in the future. We then use the evidence to create evidence-based action items and plans to improve, or to sustain, the programs we have. Evaluating student learning is challenging. In our world of Student Affairs, we need to lay a solid groundwork that communicates clearly and concisely what we are measuring.
Report and Recommend
Once credible evidence is gathered, the next step is to justify conclusions. By this point, you have posed questions, made a plan, collected data, and likely made assertions based on the results. It is important to defend your statements with some kind of “why.”
As we make proposals for programs to change or to remain the same, we need to be ready to answer (and anticipate) questions or concerns surrounding our suggestions. I have personally found I need to think about this question on three levels:
- First, to my immediate supervisor and what’s important for our department.
- Next, I need to be ready to explain why to the Senior Vice President of Student Affairs, who is looking at our division and effectiveness as a whole. My assessment report should provide the rationale that defends my “why.”
- Finally, the changes need to benefit students and tie into the mission and vision of our university.
When we justify our conclusions in this manner, we are forward-thinking. Additionally, we are looking at assessment from a bird’s-eye view, which gives us a more complete picture. We need to make evidence-based decisions with our data. There are times when we may realize we need to gather more data; we can tie the data we plan to gather into the action items of our report to make it more robust for the following academic year.
After you justify conclusions, do not stay quiet! Ensure the evaluation is used and share lessons learned. This is where quick visuals of charts and graphs are helpful. Also, share with more than just colleagues at your university! Consider sharing insights learned or successes by posting information on your institution’s website, sharing at conferences, or even through writing a guest blog here!
In conclusion, as we practice evaluation, we need to engage stakeholders, describe the program and focus on what we are trying to measure. Then, we need to use evidence to support our claims and report and make recommendations based on our findings. Share your findings with others so we can all learn from each other as we navigate assessment in Student Affairs.
How are you doing evaluation on your campus? What is the most challenging aspect of evaluation for you? How have you improved your evaluation process? We invite you to share your thoughts on the topic.
1. Framework for Program Evaluation - CDC. Centers for Disease Control and Prevention. https://www.cdc.gov/eval/framework/index.htm. Accessed August 15, 2019.
2. Elkins B. Looking Back and Ahead: What We Must Learn From 30 Years of Student Affairs Assessment. New Directions for Student Services. 2015;2015(151):39-48. doi:10.1002/ss.20136.
3. CDC - Program Evaluation Steps. Centers for Disease Control and Prevention. https://www.cdc.gov/eval/steps/index.htm. Accessed August 15, 2019.
Bethany Williams, Liberty University