As we continue to explore how we thread intentionality throughout the assessment cycle, we reflect on how we select and use assessment measures, analyze and share results, and use those results for data-driven decision making and improvement. Within each stage of the latter half of the assessment cycle, there is an intentional focus on employing assessment as an opportunity for further learning - whether that is the use of a formative assessment measure, learning about our own biases within our data analysis, or developing a holistic culture of learning through continuous embedded assessment practices.
Identify, Develop, & Administer Measure(s)
Each of the above is a decision that requires intentionality toward effectively measuring the original learning outcomes. For example, when developing the assessment structure for the Rensselaer Union’s student employment program, a universal rubric was crafted to evaluate development of the interpersonal skills tied to the student employment learning outcomes. Evaluation rubrics and review meetings with student employees were completed at the end of their first semester of employment and upon their final semester or departure from the position, with the goal of demonstrating growth and development between the two review points. Here, a rubric was incredibly useful for tracking development against specific performance and learning markers over time and for providing feedback directly to students.
There is also intentionality in crafting assessment methods that can be used as learning tools. In reviewing the assessment methods used in a course taught by staff at Rensselaer Polytechnic Institute (RPI), the question of refining our quiz questions keeps coming up in the instructor group. Instructors have been caught in a middle ground: are we using an appropriate metric to evaluate understanding and application of course content, or are we teaching to the test? Overwhelmingly, instructors would like to use assessment measures in the course as an opportunity for students to continue to learn, and have therefore been reviewing and adapting current testing strategies and including new assessment methods - such as simulations and additional personal reflections. In this way, we are moving toward a formative assessment approach as a developmental tool for students and instructors engaged in the curriculum.
Analyzing Data & Sharing Results
To hearken back to previous SAAL Blog posts, we must be intentional in recognizing our inherent biases when we approach data analysis and draw conclusions, and work to include perspectives and lived experiences outside of our own. This includes thinking critically about the student populations we are reviewing and crafting our results toward equity and social justice. How can we slice and dice the data for deeper meaning? Taking time to disaggregate the data by subpopulation, characteristic, or experience can give greater voice and perspective to your data. This also involves intentional relationship building and bringing in others with different social identities and/or perspectives who can provide diverse voices in reviewing the data.
As we hear the voices within the data, it’s important to make sure those voices know they have been heard. St. Mary’s University is a great example, as they regularly share their survey results with students through their “Question of the Week” initiative. Earning an Honorable Mention for the 2015 NASPA Assessment, Evaluation, & Research Knowledge Community Innovation Award, the Question of the Week solicits engagement in weekly surveys, fostering a culture of assessment where students know their voices are heard. Each week, students see large posters across campus displaying the results of the prior week’s survey question and context about how that data is influencing decisions on campus.
Use the Results for Improvement, Evidence, or Decisions
With the deliberate collection and analysis of data must then come the thoughtful use of data. Drawing back to Alaska PEAK as an example, what began as an initiative to evaluate the student employment experience has grown into a culture of learning through work and reflection within the on-campus employment experience. Analysis of student employment data has provided the framework and guide for supervisors to create meaningful and enriching experiences, turning student employees into student leaders.
The same can be said for the intentional use of information toward further learning at RPI. In building course material for an upper-level Professional Development class, the Archer Center for Student Leadership Development at RPI asked members of engineering corporate communities to identify the skills and qualities required to be successful in their workplace. This initial assessment validated the continued need for curriculum dedicated to developing students’ interpersonal skill sets, as all of the skills identified were non-technical in nature. The results were then crafted into a Leadership Competency matrix, which continues to serve as the foundational personal reflection assessment of the course. Students initially evaluate their strengths and weaknesses in each area of the matrix and identify specific personal experiences that relate to each competency.
This constant attention to intentionality is what helps us advance the field of student affairs, strengthen our programs and services, and critically engage with the evaluation and assessment of our work. How do you maintain intentionality and leverage assessment and data for further learning?
Amy Corron, Rensselaer Polytechnic Institute
Whitney Brown, University of Alaska Anchorage