
Creating a Culture of Evidence with Paul Holliday-Millard

Hello SAAL Blog readers! Here is the next installment of our conversation series getting to know the leaders who make up this wonderful group of Student Affairs Assessment Leaders and learning from their personal stories. I joined the SAAL blog team after starting a new director role and, being new, I reached out to others who have been doing this work for a while. Given that SAAL's mission and vision center, in part, on the creation of a thriving community, I thought I would share with you a bit of what I learn from these conversations as well as some information about the humans in these roles, the faces behind the listserv emails, so to speak.

Each conversation has been an hour long, so I’m not going to share all of what we talked about. I’ll share a bit about who these individuals are, some of their thoughts and advice, and hopefully give you some new ideas to ponder in your own work, with my own take-aways following the recap of our conversation.

 

Paul Holliday-Millard, Ed.D. (he, him, his)

Senior Research Associate, Assessment & Qualitative Research

University of North Carolina - Charlotte

In current role: ~2 years

Also serves as: Chair-Elect for ACPA's Commission on Assessment & Evaluation. Previously served as: Education Committee Co-Chair for ACPA's Commission on Assessment & Evaluation; 2023 Co-Chair for ACPA's Student Affairs Assessment Institute; Blog Manager for the Student Affairs Assessment Leaders Blog

 

Tell me about your journey into student affairs assessment. How did you get here?

I had a pretty traditional route into student affairs. I was a really involved undergrad. I was an RA and then a head RA. I worked in our multicultural office, and then I was Vice President of our Pride organization - I think it was called Lambda Alliance. I was also on the Student Conduct Board. I did a lot. I also established safe zones at my undergraduate institution, and that was actually, I think, the very first time I did assessment. I remember doing a mock training because it was all student-led. It was a small college; everyone wears multiple hats, so students did the training. I administered the survey and got the results back, and I remember looking them over with my mentor. That was probably the first time I ever did assessment.

I went on and got my master’s degree. My first job was the coordinator of transfer and special student population services. 

Quote: I feel like I'm still a student affairs practitioner at heart.

It was a new position. I had a boss, but the boss was kind of like - figure out what you need to do. So, I asked for some data. I also served veteran students, but no one knew what veterans wanted or needed, so I did a needs assessment. I reached out to some other institutions, who were very kind to lend me their instruments. We just did descriptive stuff. There was a Provost-appointed veteran services committee on campus. I pulled that data and created a bunch of little graphs: here's where we need to focus. Fast forward through a few positions, and I completed my doctorate. The pandemic was really hard, and I realized that I'd like to do something different. The director of this office at the time posted a position for a research associate, and I reached out to have a conversation about what she was looking for. She debunked this concept I had that they were doing regressions all the time and said she really wanted someone who wanted to learn and could present well.

I don't think I ever really realized I would be doing this type of work, to be honest. I probably was one of the people who was like, "I want to be a vice chancellor for student affairs." I don't want to do that work now. I think I was looking for something that could also have impact. In my doc program I was able to write with some faculty, and that was something that I was excited about. You're having impact, but you're also doing fun work. I will be honest: I don't know if I view myself as an assessment professional. I feel like I'm still a student affairs practitioner at heart, which is why I think I really try to teach student affairs people about outcomes and how to involve students in assessment. I'm lucky that I have a job that fills that bucket.

 

What advice do you have for me as a new director or someone new to this field?

I think assessment is about people. I think finding the balance between technical skills and people is really important. I am really thankful that Ellissa [my Director] values that a lot. I think that's probably part of the success of our office - she puts that front and center. My advice would be to put [people] front and center as much as possible. At the end of the day, assessment is about continuous improvement. If it's not people-centered first, I think you're setting yourself up for failure in the long run. I'm not a director, so I also don't know the pressures from the top. You probably have to win some and lose some at the same time.

What’s the biggest issue/change you are watching in the field right now?

Quote: My advice would be to put people front and center

Well, I think I'm watching a couple of things. I think I'm watching the whole equity piece - equity-minded assessment, equity-centered assessment. It's in our division's strategic plan to do that, to expand it with added assessment practices. I think the other thing that we're talking a lot about on our campus is telling our story, and telling our story through data. It can feel like we're being asked to do all of these big data things, but how do you find all that data? How do you protect that data? How do you ensure it's clean and good data? How do you get buy-in to do that? We as an office have really good relationships with IR. I know at some campuses that's not always the case. We also have good relationships with our IT department. With all that, though, sometimes I am a little worried. I just think some things get lost - I'm a little worried that we're losing track of writing good outcomes and measuring student learning. Talking to students. That's my big fear. I'm lucky to work with a team that values it all.

The discussion around “big” data sent us down another path, talking about the different platforms we use in our respective work and the challenges of using something for a data purpose when it was built for a different reason. I so appreciate any conversation with someone else who is a fan of using one thing for another purpose – I really find it to be a key efficiency practice.

What's the narrative behind the system? People might not be using it effectively, and the narrative becomes "this thing sucks." I think the blog post you wrote about Britt really highlighted that assessment is about controlling the narrative and it's about people. I think that we have spent a lot of time in that space, and it sounds like you are as well. We need to say to people, "This is actually a good instrument. We're just not using it correctly." Then once you recognize that, you're like, "Crap. Now we need to do the work to train people how to use it correctly." Then what's the ROI on that? It's a vicious cycle.

This is so spot on! I've spent a lot of time thinking about the ROI of different aspects of my role as I try to build out an office that had a vacancy for a few years, so this was right in line with some of the challenges I've been facing and thinking about. As I think about data infrastructure, I'm also thinking about the other wins I can make for students that might be totally outside the realm of assessment. For me, if I can identify some pain points and spend a few minutes being intentional about how a platform or an instrument can be leveraged in a non-assessment way to improve the student experience – that's still my job. I see it as part of recognizing student voice and closing the loop on student feedback. For example, how does using Engage to store attendance data change how the platform is structured to support student groups? If we do a survey on basic needs, how are we also using that instrument to educate students on existing supports?

Quote: I'm a little worried that we're losing track of talking to students.

In between these sections of our conversation, we connected on how we both developed and ran an equity-centered assessment professional development cohort over the past semester. Stay on the lookout for a second blog post just on that piece of the conversation! That conversation led us back to a more general assessment topic that always comes up – closing the loop and the influence of leadership.

How can you ensure you're closing the loop and communicating to students what you have found and what you are going to do about it? We had an honest conversation anchored in some case studies. If LGBTQ students weren't using the counseling center, what's going to be your communication plan? What are the action steps you're going to take? What's going to be your follow-through? Another case study was around international students receiving a high percentage of the sanctions. What does that mean, and what are you going to do? How are you closing the loop on that? It's hard to get it done. I think it comes down to leadership at the end of the day. I think you have to have the will to get it done.

If your leadership really cares about it, you're going to really care about it. I think our team is also having some conversations related to this about the assessment plan. There are a lot of departments that do amazing assessment plans. There are some departments whose plans are good, but the follow-through is not there. How do we support them in getting it done while also empowering them to get it done themselves? They are going to leave eventually. I think what we realize is that we need to be more hands-on. I've interviewed 117 students this semester through focus groups, and we've learned a lot. We've learned so many things that we had no idea were going on within these subgroups. Our departments need to hear that, and they need to know that. You're right, I don't know what to do about it. The hardest thing is to figure out the right balance. I've even found that our "assessment champions," our assessment council representatives - they're tired too.

 

Something that I'm thinking about as you're talking - is the mission to use assessment to improve the student experience, or is the mission to develop the capacity of staff to collect and use data to improve the student experience?

Quote: How do we support non-assessment staff in getting it done while also empowering them to get it done themselves?

The mission is to create a culture of evidence. What does that mean? I think that means many different things to many different types of assessment people. The landscape analysis came out. I didn't read it in full, but I know from just talking with Ellissa and from the syllabi project that I've been involved with that a lot of assessment people are not coming from student affairs programs like you and I did anymore. We were taught to assess student learning through outcomes. We map them back to the curricular approach, or to the learning outcomes of our division. I'm wondering if that's changing some.

On that note, what do you do in your spare time?

What do I do in my personal time? Well. My husband and I are big TV and movie buffs. We are obsessed with The Traitors. If you need something trash to watch, watch that.

I am a gardener. My irises are blooming, which I'm a little nervous about because we had a frost advisory come in today. I enjoy that a lot. I do like sports. The Lady Gamecocks are playing in the Final Four this weekend. I think the world is rooting for Caitlin Clark, and I'm the only one rooting for Dawn Staley. Caitlin Clark is amazing. I love her. I do like to read. I have not read in a little while, but I have The Women from the author who wrote The Nightingale. It takes place during the Vietnam War. That's on my nightstand to read. Jordan and I do like to travel. We were supposed to be in Vegas last week or two weeks ago to see Adele. There's a lot of pop music in the Holliday-Millard household. A lot of Dua Lipa as of late, Beyoncé, things like that.

 

Some of the points in this conversation I have been thinking about since Paul and I chatted. It keeps coming up for me as someone who is sort of building out a new assessment office after the position was vacant for a few years. Where do I get joy in my job? What do other people expect me to do based on who or how they've interacted with assessment in the past? What approaches are going to actually move the needle as opposed to being busy work? How do we assess that assessment training is moving the needle on our practice? Sometimes it can feel like I'm training people to sail through their next interview, as the field values practitioners who can talk fluently about assessment, but our campus and our students don't seem to benefit from the training when failing to close the loop is such a pervasive issue in the field.

 

Have answers to the questions above or want to chat about them over Zoom? Reach out to the SAAL Blog Team and we’d love to feature you!

 

[Photo: Paul Holliday-Millard headshot]

 

This series is meant to highlight and lift up those who are working in assessment full-time on a campus, with at least some of their time dedicated to student affairs or co-curricular assessment. Know someone you'd like to see featured in the series? Leave their name in a comment and I'll do my best to connect with them!

 

The blog post was written by Sophie Tullier, blog team writer and Director of Assessment, Data Analytics, & Research at the University of Delaware.

 
