5 Takeaways from the 2019 Assessment Institute

This year was the first time I've attended the IUPUI Assessment Institute in Indianapolis. While I'm not new to assessment, I've recently stepped into an assessment management role, so naturally I was excited to attend this year's conference to learn more about assessment through the lens of those who do it: student affairs staff, faculty, adjuncts, and other assessment specialists.

While there was a lot of great information shared across many sessions, I walked away from the conference with five things I want to do more of, or do better. I imagine you'll find these ideas worth considering in your own assessment lives.

Survey Smarter


https://media.giphy.com/media/1iu8uG2cjYFZS6wTxv/giphy.gif

The buzz at every conference involving measurement and human beings always pokes fun at survey research:

Surveyed to death.
Bludgeoned by surveys.
Knocking down doors with surveys.

I’m here to tell you that anyone who hates surveys clearly doesn’t know how to craft a good survey.

Good surveys:

  • Have a clear purpose ("I was told to do a survey" isn't a clear purpose, FYI!)
  • Ask the right questions in the right way (the hardest part of any survey design process!)
  • Are piloted with the population they're intended for
  • Have a group of people eagerly waiting for the results to make a decision

Good survey development is difficult and takes time, so make sure to survey smarter! Surveys are a common tool for indirect outcomes assessment, so survey design and delivery are critical to getting useful assessment results. Check out the survey development tips linked below!

Don’t put a cat in a box


https://media.giphy.com/media/zCXG6qPHRKrXa/giphy.gif

The best advice of the conference (for me, at least) came from Dr. Andrew J. Young. While discussing collecting, analyzing, and reporting data from LGBTQ+ communities, Andrew used the metaphor of putting cats in boxes for forcing LGBTQ+ students into narrowly defined boxes on survey items.

As a self-proclaimed crazy cat lady, I totally got it. I've tried putting Gatsby the Great (my cat) in a box. Doesn't work (read: I got bit)! He has to choose the container he fits in.

As a survey designer, I had many aha moments. Adding clarity to sensitive identity questions is important so students know why we're asking and how we plan to use their responses. Likewise, we should only ask what we can realistically use and report.

Big takeaway: be inclusive in your survey response options when asking about sexual orientation, gender identity, and other demographic identities. Even if that means the list of options is a mile long, being inclusive shows students we see them and care about them. They aren't just an "other (please specify)".

Even if we end up grouping all these options into a dichotomous variable (LGBTQ+ and not), let's stop forcing cats into boxes we think they belong in and instead give them enough boxes that they can comfortably place themselves (or not! "Prefer not to disclose" and "identity not listed" are still good options to include).
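If it helps to see the mechanics, here's a minimal pandas sketch of that kind of recode. The column name, the option lists, and the data are all hypothetical and for illustration only (not a recommended taxonomy); the point is that nondisclosure stays its own group instead of being forced into a box.

```python
import pandas as pd

# Hypothetical survey export: each student self-selected one option
# from an inclusive list (illustrative values, not a full taxonomy).
df = pd.DataFrame({
    "sexual_orientation": [
        "Bisexual", "Heterosexual/Straight", "Queer",
        "Prefer not to disclose", "Gay", "Identity not listed",
    ]
})

# Options that roll up into the dichotomous LGBTQ+ group for reporting.
lgbtq_options = {"Asexual", "Bisexual", "Gay", "Lesbian", "Pansexual", "Queer"}
# Nondisclosure choices are kept separate rather than forced into a box.
nondisclosure = {"Prefer not to disclose", "Identity not listed"}

def recode(option: str) -> str:
    """Collapse inclusive options for analysis without erasing them."""
    if option in nondisclosure:
        return "Not reported"
    return "LGBTQ+" if option in lgbtq_options else "Not LGBTQ+"

df["lgbtq_group"] = df["sexual_orientation"].map(recode)
print(df["lgbtq_group"].value_counts())
```

The raw, inclusive responses stay in the data; the collapsed variable exists only for reporting, so nothing a student told us gets overwritten.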

Look for Intersectional Experiences


https://media.giphy.com/media/IcUbDfcmMWvU4/giphy.gif

Speaking of student identities, the complexity of identity intersections is worth digging into! Allison BrckaLorenz and the other great people from Indiana University Bloomington gave us four tips for analyzing intersectionality in survey responses (a rough sketch of these techniques follows the list).

  1. Group options into a dichotomous variable (when appropriate) and make some crosstabs
  2. Look at one identity at a time by holding the others constant (example: hold gender constant and look at results by race/ethnicity to see differences)
  3. Use a moderation analysis to see how different identities interact
  4. Create identity groups from continuous variables (such as SES or loan debt) using averages, quartiles, and ranges
  5. Secret fifth idea: create grouping variables based on experiences and survey responses, then dig into the identities most prevalent in each group (example: grouping students who were satisfied with tutoring versus not satisfied)
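To make those tips concrete, here's a rough pandas/statsmodels sketch of each one. The data are randomly generated and every column name (gender, race_ethnicity, loan_debt, belonging, satisfied_tutoring) is hypothetical; this illustrates the general techniques, not the presenters' exact methods.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Made-up student-level survey data; every column name is illustrative.
rng = np.random.default_rng(0)
n = 400
df = pd.DataFrame({
    "gender": rng.choice(["Woman", "Man", "Nonbinary"], n),
    "race_ethnicity": rng.choice(["Asian", "Black", "Latinx", "White"], n),
    "loan_debt": rng.gamma(2.0, 5000.0, n),
    "belonging": rng.normal(4.0, 1.0, n),          # e.g., a 1-6 survey scale
    "satisfied_tutoring": rng.choice([True, False], n),
})

# 1. Crosstab two identity variables (dichotomized first, when appropriate).
print(pd.crosstab(df["gender"], df["race_ethnicity"]))

# 2. Hold one identity constant and compare results across another.
women = df[df["gender"] == "Woman"]
print(women.groupby("race_ethnicity")["belonging"].mean())

# 3. Moderation: does the association between gender and belonging
#    differ by race/ethnicity? The * fits main effects plus interaction.
model = smf.ols("belonging ~ gender * race_ethnicity", data=df).fit()
print(model.summary().tables[1])

# 4. Turn a continuous variable like loan debt into quartile groups.
df["debt_quartile"] = pd.qcut(df["loan_debt"], 4, labels=["Q1", "Q2", "Q3", "Q4"])
print(df.groupby("debt_quartile", observed=True)["belonging"].mean())

# 5. Group by an experience, then look at which identities are most
#    prevalent in each group (shares within each satisfaction group).
print(pd.crosstab(df["satisfied_tutoring"], df["gender"], normalize="index").round(2))
```

Decide on these cuts in an analysis plan first; as the warning below says, fishing for whatever comes out significant is bad practice.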

A couple of final notes and a gentle warning: when doing any analysis of your data, be purposeful and transparent about why you are looking for differences and what you do with what you find. Going fishing for differences is bad practice, so have an analysis plan before starting your data dive. Finally, always tie your interpretation back to the context and look for simple explanations of the data. Sometimes our interpretations feel a little "oh, no duh" once we look at what's happening (or not happening) around us.

Look at more than just student learning


http://giphygifs.s3.amazonaws.com/media/109TWWxRddcIEg/giphy.gif

Include operational outcomes, program outcomes, and student learning outcomes in your assessment plans. Michele C. Soliz and Alana Malik from the University of Toledo described the three types of outcomes they use in residence life and student affairs, and how to come up with outcomes beyond "butts in seats" and "heads in beds".

  • Operational outcomes (sometimes called program objectives) answer how well something operated (example: wait time for appointments with the career center).
  • Program outcomes (sometimes called initiative outcomes) are the outcomes of an overall experience or goal of a program / initiative (example: a motivational interviewing initiative wants to decrease student binge drinking on campus).
  • Student learning outcomes (SLOs) are well…the learning outcomes of students! SLOs are perfect for student affairs programming that was developed for learning (not just an activity).

Sometimes our activities aren't true learning experiences but rather social ones (and shouldn't be included in our assessment plans as learning outcomes). However, having multiple types of outcomes allows us to better align our activities and interventions with our goals and mission. By assessing more than student learning outcomes, we get a more complete view of a student's experience with student affairs. When we only measure operational outcomes, or only learning outcomes, we fail to show the total impact of student affairs services on the student experience. Keep in mind: we don't have to measure everything we do, but we should be purposeful about measuring outcomes in useful ways.

Share your story


https://media.giphy.com/media/YqdWzX5r6SYZW/giphy.gif

Data storytelling is a hot topic lately! Natasha Jankowski quoted George D. Kuh, who challenges us to educate others on 1) what our areas do and 2) how well we do it. Using data visualization best practices, share your assessment results as evidence while telling a student-focused story.

Here are some of my favorite go-to tips for sharing results:

  • Know your audience and speak to them (everyone else is secondary, so don't try to speak to the world!). Speak to your audience's interests in the format that makes sense to them (report, infographic, poster, presentation, dashboard, etc.).
  • Leave the jargon out but keep the high-level impact. Ditch the p-values but keep the overall meaning of your findings, and use informative sidebars and icons to communicate information.
  • Break up heavy text with relevant pictures. Unsplash is my favorite free picture site, but be sure to check out your institution's photography page (Flickr!).
  • Draw readers' eyes to the important pieces using call-out boxes and colorful text; save light grey boxes for the "nice to know" stuff (see the sketch after this list).
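As a small illustration of that last tip, here's a matplotlib sketch that puts the one finding we want eyes on in color and pushes the "nice to know" series into light grey. The program names and numbers are made up for the example.

```python
import matplotlib.pyplot as plt

# Made-up satisfaction trends; only one series is the story we're telling.
years = [2016, 2017, 2018, 2019]
series = {
    "First-year seminar": [62, 64, 63, 71],   # the finding to highlight
    "Tutoring": [55, 56, 57, 58],
    "Peer mentoring": [50, 52, 51, 53],
}

fig, ax = plt.subplots(figsize=(6, 4))
for name, values in series.items():
    emphasized = name == "First-year seminar"
    color = "#d95f02" if emphasized else "#cccccc"   # color vs. light grey
    ax.plot(years, values, color=color, linewidth=3 if emphasized else 1.5)
    ax.text(years[-1] + 0.05, values[-1], name, color=color, va="center")

ax.set_title("First-year seminar satisfaction jumped in 2019")  # headline, not jargon
ax.set_ylabel("% satisfied")
ax.set_xticks(years)
for side in ("top", "right"):                 # declutter the frame
    ax.spines[side].set_visible(False)
plt.tight_layout()
plt.show()
```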

Check out more data sharing tips in the Rad Resources below!


http://giphygifs.s3.amazonaws.com/media/kyeIEFsFQ75ew/giphy.gif

Overall, it was a good first trip to IUPUI's Assessment Institute! I walked away with a lot to consider, and I already have meetings set up to get the ball rolling on many of my ideas. I hope you found this content useful for your area. If you were at the conference, do you have any takeaways to share? Comment below or share on the SAAL social media channels!

Rad Resources

Survey

Data Storytelling

IUPUI Session Shout Outs

  • Analyzing Data and Reporting on LGBTQ+ Communities: Strategies to Assist Data-Driven Decisions
    • Andrew J. Young, Robbie Janik, and Caleb Keith, IUPUI
  • Examining our Narratives through Evidence-Based Storytelling
    • Natasha Jankowski, NILOA
  • Getting Lost at the Crossing? Tips for Assessing Intersectional Experiences
    • Allison BrckaLorenz, Kyle Fassett, Tom Kirnbauer, and Sylvia Washington, Indiana University Bloomington
  • Beyond “Butts in Seats” and “Heads in Beds,” Redefining Assessment in Student Affairs
    • Michele C. Soliz and Alana Malik, The University of Toledo

Julie L. Lamping, National Louis University 

 
