So far, it seems like ACPA was the last big higher education conference this year due to the coronavirus. As an attendee and presenter at my second ACPA, I added many new and exciting professionals to my network, as well as spread the word about assessment. With the strategic imperative for racial justice and decolonization taking the forefront of this year’s conference, there were many great sessions, posters, and discussions around how we can act in the best interest of our students. I learned a lot both directly from the presentations I saw and presented myself, as well as through participating in various conference activities with others. Here are 4 things I learned about assessment through attending the annual ACPA conference, along with my own thoughts about how to move forward.
Student Affairs practitioners assume that someone will do assessment, but it won’t be them. At the same time, assessment coordinators can only do so much
On the second day of the conference, I was catching up with a colleague from a previous institution. When I talked about my time in my academic program and how I fell into assessment, they started talking about how assessment is so complex for one person and that one person can’t do everything. I was overjoyed to hear this because it truly takes a village to implement sound assessment practices and create a culture of assessment and learning improvement. As the conversation went on and they asked my opinion on various projects their department was doing, after every opinion I shared, they said “Oh, maybe our assessment coordinator can do that.” I was a bit confused at this point; first I was told that assessment shouldn’t fall on one person, yet here they were delegating every assessment project to the coordinator and no one else.
Now, I realize that assessment coordinators exist for a reason: assessment coordinators have become increasingly common in recent years to help their departments and units with planning and implementing assessment, as well as data analysis and interpretation. To me, it seems like my colleague was expecting their assessment coordinator to design the program and outcomes, collect the data, analyze it, write the report, and recommend changes. The catch? This would be for every single program their department did. Instead of being a facilitator or guide, my colleague seemed to be envisioning the assessment coordinator as a programmer and a data analyst. Nowhere in the conversation did there seem to be a distribution of assessment efforts across the department. This can be troubling because any office in the division of student affairs has a LOT of programs, and if each of those programs has data associated with it, that’s a LOT of data to analyze.
In short: assessment coordinators can’t do it all. Let us assessment coordinators help coordinate assessment efforts in the department and then help where needed, not do everything assessment-related ourselves.
My question: How can we convince department, unit, division, and/or institution leadership that the responsibility for assessment falls on more than one person?
My answer: Remind leadership that assessment exists to inform how we carry out programs and other interventions. Since it’s (usually) not the job of an assessment coordinator to put on programs, leadership needs to be reminded that those who put on the programs should be more involved in the assessment process. The people running the programs have all the context and knowledge needed to make changes, not the assessment coordinators.
Many student affairs professionals don’t use evidence-informed practice as the foundation of their work
One of the presentations I went to involved small-group discussion on how we use assessment in our functional areas. As someone who studies assessment and works as an assessment consultant, I often tell people that my functional area is assessment. After I introduced myself as an assessment practitioner to my small group, someone gave me an odd look, which made me feel immediately uncomfortable. I ignored the look, as I’m used to it (liking assessment can get all sorts of reactions from people) and continued into the discussion. The discussion itself and its aftermath are what alarmed me the most.
Throughout my small group’s discussion, many people used the word “assessment” to mean the process of collecting data and writing a report. Although those are important parts of assessment, assessment isn’t just that: it is a cycle that allows one to create evidence-informed programs and use data to change and improve those programs. As I mentioned this to those at my table, I was met with a mixture of reactions. Some told me that they had never heard of assessment framed as an ongoing process and wanted to learn more, while others did not want to entertain the idea I had proposed.
In particular, one person at my table contested my idea. After a healthy debate, I asked them how they used assessment or evidence-informed practice in their work. They responded in a way that I had heard about from others but had never witnessed myself until that moment. Loudly, they told me “I can either do assessment and meaningless statistics, or I can do my work as a student affairs practitioner that actually matters. I work in student retention. We hire people like you to crunch numbers while we do the real work.” This made my heart hurt for two reasons: 1) This person, who is a well-known leader in student affairs, told me that assessment doesn’t matter for student success, and 2) It is clear that assessment and evidence-informed practice are not used in their work.
Assessment and evidence-informed practice are about more than data collection. Together, they help us explain the why and how of our programming, from creating student learning outcomes and the program itself to using data to make changes. If we can’t reframe assessment and evidence-informed practice as a foundation of our work, I fear that student affairs as a field will be unable to move forward in supporting students through the years.
My question: How can we as pro-assessment practitioners educate others on the importance of evidence-based practice?
My answer: We tend not to use evidence-informed practice in our work because we often become victims of our own perceptions about how students should be responding to a program, rather than how students are actually responding to it. This point can usually be communicated well enough if we start asking people how and why they expect students to reach outcomes with the current programming. This can become an opportunity to advocate for the use of empirical evidence and data in our practices, as they help us determine whether a program helps students reach outcomes.
Culture change with assessment may take time, but we need more and louder advocates
There were times during ACPA that I felt alone, and no, I don’t mean when I was taking breaks and sipping some coffee by myself. In more intimate conversations, many people were fans of assessment: they wanted to learn more and do more with it in their work. But when I voiced my thoughts about assessment out loud, no one else spoke up with me. From these interactions, it seems like individuals want to incorporate assessment into their work, but the greater field of Student Affairs, or at least the large portion of it that I interacted with at ACPA, is silent about the importance of assessment in the work we do to support students.
I had a great conversation with a friend of mine at ACPA about her experience and upbringing in student affairs, which began a bit before mine. She mentioned that she could sense a slow change in the culture of assessment within the field, but it’s “not much of one.” She noted that in her graduate studies, there was no mention of assessment in the practice of helping students, whereas now most programs include either an assessment or a research methods course.
I agree that this is progress: as assessment becomes a more pressing concern within the field of student affairs, it is great that student affairs graduate programs are incorporating assessment into the curriculum. Many individuals feel that assessment is important to what we do in student affairs. Unfortunately, it doesn’t seem like those individuals are willing to take a stand and loudly advocate for that change to occur faster within the field. I know these assessment advocates exist, but more of us need to have no fear in asking “how are we going to assess this?” in every meeting about programming, in any event you put on, and in any room that you are in. If you have that fear, know that you’re not alone, and that nothing ever got done by staying in a comfort zone.
In short, points 2 and 3 are pretty connected: We may not have many advocates for assessment, but that could be because we as a field don’t know or believe in the potential of evidence-informed practice. If we can get more people to see the benefits of evidence-informed practice, it’s more likely that more advocates will appear!
My question: How can we either speed up change or get more advocates?
My answer: I said it above: we need to educate and work with people who don’t understand or know about evidence-informed practice. Sometimes this may mean working closely with others on projects and being with them every step of the way. As their confidence in evidence-informed practice increases, it’ll become easier to step away and watch them be transformed by the power of theory-to-practice!
Images taken from Henning, G.W. & Philippon, R.G. (2019, December 11). Socially just assessment: Theory and examples of practice. Session presented at annual meeting of NECHE, Boston, MA.
Combining the strategic imperative and assessment is essential to the success of the field of student affairs.
With the release of the Framework for the Strategic Imperative for Racial Justice and Decolonization last year, the strategic imperative was the subject of many sessions I attended. There were three sessions (well, two, but I’m biased so I think my session was also good) that I felt embraced the connection between assessment and the strategic imperative’s call for us to work toward critical consciousness, radical democracy, and humanizing others. As I was deliberate about which sessions I chose to go to, I tried to find sessions that wanted to make the connection between these two large pillars of student affairs. After attending these sessions, it was clear that student affairs as a field has embraced and become more knowledgeable about the strategic imperative much faster than it has assessment. To challenge us all as student affairs practitioners, I need to ask this question: How will we know when we’ve reached the outcomes stated in the framework of critical consciousness, radical democracy, and humanization? How do we know that we are headed in the right direction and not harming ourselves or others? That’s what assessment is for. We need to be comfortable with assessment, and with assessing ourselves, before we can move forward with the strategic imperative. I’m hopeful that this will happen!
My question: How can we start putting these two things together?
My answer: Well, first we need to unlearn the idea that diversity, equity, and inclusion (DEI) work is separate from our assessment work. Everything that we do for students through the lens of DEI can be assessed, and assessment itself can be examined for themes of DEI. Culturally inclusive and socially just assessment, which have emerged in the past few years as areas of research, are ways we can look at assessment through the lens of DEI. At the same time, we also need to assess our own knowledge of DEI.
In conclusion, ACPA 2020 was fantastic, filled with rich conversation about assessment and how we can keep implementing it in everything we do as student affairs practitioners. I love that the field is getting a grasp on assessment, yet we can do more. It will take all of us assessment advocates to truly make a change and help this field progress in embracing the potential of assessment.
Chris Patterson, James Madison University