Collaborating on a Survey Redesign

Image Credit: Andrew Neel from Pexels

 

So. Many. Surveys. Innovations in technology, the move of so much of our lives online, and the desire for information have led to a proliferation of surveys, not only in higher education but in general. While I could criticize this growing dependence on surveys for information collection, I can also see it as a demonstration of our sincere interest in gathering the thoughts and perspectives of others.

Surveys are used to assess program-specific as well as big-picture outcomes. They can provide helpful information about a program, service, or event that students interacted with on our campus. We learn whether students feel valued, safe, or respected by their peers and within the campus community. Students may find an opportunity to speak up about an issue or topic they might otherwise have been unable to raise with faculty and staff. These are all worthy purposes.

The pandemic introduced new challenges to our lives, our work, and how we interact with others. Yet with the stay-at-home orders and social distancing of the past year, the shift from on-campus to online environments also opened new opportunities to connect.

This spring, my colleagues and I collaborated on a survey redesign that fostered professional community-building and refined an existing instrument to serve multiple programs. As a bonus, collaborating on one instrument helped address student survey fatigue by reducing the number of surveys sent at the end of the year. 

Applying assessment and survey review best practices, the following steps helped my colleagues and me redesign a program-specific survey into a broader instrument that illuminates our collective cross-program impact on the student experience:

 

1. Give the survey a clear purpose

Develop a clear purpose that addresses the survey focus, specific areas to measure, and what you hope to learn from participants.

 

2. Make connections across programs to guide the survey 

Engage stakeholders to identify areas of alignment across programs. Allow time to revisit discussions and explore ideas.

 

3. Review the literature and brainstorm updates to the instrument

Revisit the literature and collect examples of instruments and items to support the survey’s updated purpose. Engage stakeholders to gather their perspectives and ideas.

 

4. Seek student input

Share the survey with students to gather their insights on how they understand the items, what might be missing, and what they might recommend. Identify terms and phrasing that participants may misinterpret or find misleading.

 

5. Review results and share findings

Dedicate time to collectively review the results and share findings with stakeholders. Consider the format and delivery of findings for different stakeholder groups.

 

6. Incorporate findings into an action plan

Identify your next steps to apply what you learn from the findings. Make sure these plans are feasible and include strategies for monitoring activities over time.

 

These steps can also be applied independently, so what are some of the benefits we experienced by taking a collaborative approach?
 

  • Forming a clear purpose that spans multiple programs created a community of practitioners working toward a common goal through their unique positions on campus.
  • Aligning and synchronizing survey activities helped reduce survey fatigue, redundant data collection, and information bloat.
  • Revisiting the literature for current examples helped us support changes to existing instruments. We had evidence from the field to inform our items and their phrasing.
  • Incorporating student input on the instrument was beneficial for checking our assumptions on the items. We learned how the survey might be interpreted and other topics students felt were important to include.
  • Sharing an instrument across programs helped focus resources. We reduced time and financial costs by intentionally dedicating time to co-working on the instrument, distributing tasks across multiple staff, and consolidating expenses such as participant incentives.
  • Collaboration itself was a clear benefit: staff discussed how their programs’ missions and activities connected to the shared survey’s purpose, and we built in future plans to discuss results and apply the findings.
  • Stepping back to view the bigger picture allowed us to elevate the outcomes the survey assessed. Seeing students’ interactions with our programs as multifaceted, we could discuss the outcomes from a more holistic perspective of the student experience.
  • Reviewing results together is helping us explore how to integrate findings into practice. We are scheduling time to discuss what we are learning from students who participated in the survey, share findings with our respective programs, and provide updates on our action steps.

 

At the beginning of our process, we established a few considerations to guide our approach to working together. While keeping the common purpose in mind, it was important to acknowledge the unique aspects of each program and how these features shaped both the student experience and the program’s assessment needs. We used an iterative decision-making process to allow space and time to process information and provide feedback. We documented discussion notes and decisions, then revisited them in subsequent meetings to reconfirm agreement or make adjustments. Finally, we set a project timeline that included reflection on and celebration of each milestone along the way.

 

Not every survey can be one-size-fits-all. However, identifying areas for collaboration and shared purpose can help your program feel more connected with other programs and reveal the cross-connections that contribute to students’ total experience.

 

As a final thought, working collaboratively this year helped foster and maintain connections with colleagues that we might not have had otherwise. As we conclude this academic year and venture into the next, I am hopeful that the spirit of collaboration will continue to inform how we develop and implement surveys, and encourage us to keep building connections across our campus communities.


Jennifer Nailos, Ed.D., The University of Texas at Austin

 

Resources

Banta, T. W., & Palomba, C. A. (2015). Assessment essentials: Planning, implementing, and improving assessment in higher education (2nd ed.). Jossey-Bass.

 

Diamond, R. M. (Ed.). (2002). Field guide to academic leadership: A publication of the National Academy for Academic Leadership. Jossey-Bass.

 

Nichols, K. W., & Nichols, J. O. (2020). The department head’s guide to assessment implementation in administrative and educational support units. Agathon Press.

 

Suskie, L. (2018). Assessing student learning: A common sense guide (3rd ed.). Jossey-Bass.

 
