How I Disagreed My Way to a Better Understanding of AER


Assessment, evaluation, and research (AER).


Three separate concepts, right? Some people – like me – think so. I'm of the mind that each of these is a distinct concept with a specific purpose. There can certainly be instances where interests or aspects overlap, but this occurs as each executes its distinct and intended purpose.

Image made by Joe Levy

There are also people who think the lines between these concepts are pretty blurry. The concepts overlap in form and function so much it would not be incorrect to use any of these terms interchangeably. Going further, some people view these concepts as subsets of one overarching practice (e.g., assessment and evaluation are forms of research or vice versa).

By now, you may be aligning yourself with one of the aforementioned groups of people, shaking your head at how wrong some people can be, or you may be more confused than ever. In any case, it's worth considering these concepts in relation to one another, as well as on their own merits, to gain an enhanced perspective. It certainly was for me.


In the fall of 2017, I wrote a blog about assessment and research being different (and why that’s ok). This was in response to a sequence of publications examining assessment through a research lens (Gilbert, 2016; Sriram, 2017). At the time, I saw nuggets of truth in those publications, but I fundamentally disagreed with the comparisons since I viewed assessment and research as different things.

To be honest, I was surprised this conversation was even taking place.

Assessment, evaluation, and research have their own respective (and distinct) literature, practices, and professionals doing this work. Yet assessment was being criticized by research standards (Gilbert, 2016), and assessment professionals were being called to adopt a research paradigm, as well as to refer to themselves as researchers (Sriram, 2017). These attempts to reframe assessment were particularly interesting to me given government and accreditation standards calling for more evidence of student learning assessment – not research – from institutions.

The topic proved to have more momentum than I anticipated, as colleagues were discussing differences and similarities in assessment, evaluation, and research practices at the IUPUI Assessment Institute in October 2017. Throughout informal conversations, I heard nuanced points made about assessment and research, which began to place the conversation on a spectrum rather than the black-and-white scale I had imagined. As conversation continued with more folks throughout the conference, I came to realize there were (and still are) multiple perspectives representing practice – even among those who consider assessment to be research.

Jan Romero via Unsplash

Expanding the Conversation

The NASPA Assessment, Evaluation, and Research Knowledge Community had the good sense to expand this conversation to a larger audience. Dr. J. Patrick Biddix and Dr. Darby Roberts helped coordinate the details and frame what would be an on-demand presentation on the topic. Roberts would facilitate a panel of perspectives examining the differences between assessment and research, as well as whether those differences matter.

Biddix, a higher education faculty member and author of books on both research methods and assessment practice in student affairs, was part of the panel. Dr. Rishi Sriram – the person whose work I had previously disagreed with – was also invited to be a panel member. Like Biddix, Sriram is a higher education faculty member, as well as an author of a book about quantitative research and statistics in student affairs. Not an established faculty member or book author, I was just lucky to be involved in the conversation.

The presentation was contextualized with the blurred lines between concepts, approaches, and methods for assessment and research in student affairs. The focus was to compare and contrast practices, establish and clarify language related to assessment and research, and discuss expectations and implications for practitioners. Despite little introduction or previous coordination, the conversation flowed naturally and proved very rich between us panel members.

Going into the conversation, I thought I was on one end of the spectrum with the mindset of “assessment and research are different”, while Sriram was on the opposite end of the spectrum and Biddix somewhere in between (admittedly closer to Sriram). However, once the conversation turned to practice, expectations, and implications, there were many points we all agreed upon. Moreover, we found ourselves supporting and expanding each other’s arguments or stances in relation to implications for practitioners.


@alegri via 4FreePhotos

Lessons Learned

Without spoiling the content of the session, I can say I walked away with several lessons learned (or reiterated).

1. Seek first to understand; remain open-minded.

First of all, I was reminded that not everything is black and white. Just as you may have sources for your understanding and beliefs, assume someone else has theirs, which can be just as (or perhaps even more) valid. Approach situations and topics with more of an open mind, recognizing there are multiple factors and perspectives to consider.


2. Utilize your personal and extended network for support.

As I encourage sourcing your understanding, it helps to be informed by multiple perspectives on a given topic. I had been informed by and cited various sources in my previous writings, but I certainly took away concepts from additional sources leading up to and after the panel conversation. Don't be shy about drawing on your colleagues (internal and external) to get tips or recommendations to learn more about a given topic. While we all have our specific passions or functional-area interests, we have also come across excellent writings on different topics worth sharing with others.


3. Create space and invite views different from your own.

I cannot overstate the value of conversation. Having read one another's works, we each came to the conversation with basic knowledge (and assumptions) of each panelist's stance on the topic. However, the more we talked and had the opportunity to share, collaborate, and collectively brainstorm to offer recommendations for others, the more we found similarities in perspective. Create the space for conversations on topics, allowing people to voice, share, and dynamically contribute to the narrative as opposed to making assumptions or drawing conclusions based on static information. This is a point I continue to struggle with amid meetings, projects, and tasks: knowing I should take more time and put more effort into understanding a situation without making assumptions, as well as take time to talk with the people involved.


In the end, I very much appreciated the opportunity to participate in the panel. It was beneficial to talk more about the topic, increasing my own understanding while also giving the audience context and understanding of the issue. What sticks with me are the underlying commonalities and values we possessed. Despite the diversity of our institutions and roles, we agreed it is right to expand and excel in evidence-based practice to support student learning. Whatever name or combination of practices is used, understanding and facilitating student development and success is what matters. Competition and criticism aside, we must remember the whole field benefits from individuals and institutions doing better.

Because I don’t want this to be all about me, it’d be great to hear from you in response to this post. Where are you on the assessment-research spectrum? Have you had these kinds of conversations on your campus or beyond? Do any of my lessons learned resonate with you or align with goals you have for this coming year?

Kyle Glenn via Unsplash


Gilbert, E. (2016, November). Why assessment is a waste of time. Inside Higher Ed.

Sriram, R. (2017). We need researchers…So let's stop using the term assessment. About Campus, 22(2), 28–31.



Joe Levy, Professional Development Committee Chair

