
*Not* an exposé: Publishing a department-level climate survey as a tool for action

[Figure: how faculty, graduate students, and staff felt about belonging, thriving, and growth (on a scale from disagree to strongly agree) and about satisfaction with department climate (on a scale from very unsatisfied to very satisfied). Most results were generally positive, though graduate students responded more negatively overall, and staff gave the most negative responses regarding sense of belonging, thriving, and growth. The full figure is in the PDF of the paper, linked below.]
Figure 3 from our paper depicts how the three employee categories we surveyed felt about sense of belonging and satisfaction in the department

A couple of years ago, I led a team in our department to develop and conduct a climate survey. As we detail in a paper we published (recently-ish) about that effort, the idea of doing a survey at the department level emerged from a lot of discussion and group learning initiated in our department in 2020.


As we converged on the value of doing a department-level climate survey, we assumed there would be a lot of resources available (especially considering all the campus-level climate surveys that are conducted). Instead, we were surprised to find zero published surveys addressing department-level climate. I was especially surprised because I consider the department to be the unit within a university where substantial change can happen most readily.


By that I mean the department is the unit of cohesion and direct affiliation -- it is what people "belong" to. A department is where faculty are employed and where they teach and do research and service (or should!). It's where students study and earn their degrees. (In our case, our staff are technically not under our purview; they are managed and hired at the college level. But we still consider them part of our department, because they work with us to make everything we do in the department possible and successful.) Because the department is thus the unit to which people "belong," it is important to understand what that belonging feels like for them.


And, as I noted above, departments usually operate at a "Goldilocks" scale: enough people to have momentum and a wide range of ideas, but not so many that decision-making and collective action become intractable. (At least, this Goldilocks scale holds for our department.) Given that, it was surprising that there were no published climate surveys for our scale of organization [1]. That led us to a parallel commitment: (1) do the department climate survey, and (2) publish it so that the next department that wants to do this work won't have to start from scratch.


Key takeaways (from the paper)

  • 82% of the people we asked to take the survey (all current faculty, grad students, and staff at the time) actually did. Combined with a balanced distribution of respondents across the employment categories, that response rate was high enough to extrapolate reliably to the department as a whole.

  • The survey was organized into five topical areas: (1) mental health and well-being; (2) importance of DEIJ; (3) sense of belonging; (4) safety; (5) work-life balance. We posed a core set of questions to everyone, plus a subset of questions calibrated to matters relevant to each individual employee category.

  • 65% of respondents made at least one comment in response to open-ended prompts. That extensive engagement assured us the responses weren't coming only from people who were especially content or especially upset. When we coded the open-ended responses, three major themes emerged: equity, community, and accountability. Throughout the paper, we discuss our process and results according to those themes, with special attention to concrete actions that can be taken to improve departmental conditions in each thematic area.

  • Most results for quantitative questions (agree/disagree, etc.) were neutral to positive, although graduate students and/or staff were less positive than faculty on nearly every measure.


[Table 2 from the paper: rows correspond to the five major sections of the survey (mental health and well-being; importance of DEIJ; sense of belonging; safety; work-life balance), and columns to the three themes (equity, community, accountability), with an example at each section-theme intersection. For instance, in the mental-health row, the equity example is "lack of mental health care benefits for students and staff." The full table is in the PDF of the paper, linked below.]
Table 2 in our paper provides examples of the topics people mentioned in open-ended responses. These topics coalesced into three themes: equity, community, and accountability.

  • However, when we coded the open-ended responses, we found a very different picture. As we wrote, "open-ended responses uncovered both positive and negative experiences that would have gone undetected using closed-response questions alone. Notably, open-ended responses identified concrete action items for improving departmental climate, some of which have been implemented already (e.g., cohort-building courses, stipend increases for graduate students) while others constitute future initiatives (e.g., department-level ombuds, mentoring workshops for faculty). Further, specific questions to faculty, graduate students, and staff facilitated the acquisition of nuanced experiences of climate that were unique to each employment category." In other words, our mixed-methods approach was vital for getting a robust picture of the climate in our department. Without the open-ended responses, we might have perceived everything to be just fine, when in fact there are distinct issues that need to be addressed, and respondents have tangible ideas about how to address them. Likewise, we might have misunderstood which aspects of the department climate people most appreciate -- those are things we need to work into our strategic priorities, programming, and budgeting to ensure we sustain them.

We're going to do this again (and again, every 3 years)


I'm thinking about this paper right now because we are gearing up for a re-survey. (It's been 3 years since we did the first one, and we committed the department -- yes, volun-told it/them/us/ourselves -- to doing this every 3 years.) We gleaned some very important takeaways from the survey process, some of which we use as illustrations in the paper. Others were reserved for internal discussion, because the survey was never intended to be an exposé. It was, however, intended to be a genuine tool for our department and others:

  1. We envision the survey serving our department as a process and set of recommendations we can use to enhance workplace experiences for everyone.

  2. For departments interested in doing this work themselves, we hope the paper provides a solid starting point for understanding what they can do to improve their workplace climate.


The great news is that the survey has informed, provided leverage for, and/or at least contributed to collective momentum around a number of positive efforts in our department. These range from a deep dive into grad student stipends (well below the poverty level for our area) and how to raise them, to an independent but complementary effort to boost mental health and provide a food pantry in the department. We also learned about specific things that are great in the department, so we can work to sustain those.


Takeaways (from behind the scenes)


As we prep to run the survey again this fall, I have a few process things front-of-mind.


  1. It was incredibly time-consuming to develop our survey and analysis/reporting from scratch. (As in, I recorded spending 186 hours on this project from summer 2020 to August 2023 [2], including a reading group, meetings, development, distribution, coding, reporting, paper writing, managing it all, etc. That's nearly 5 weeks of 40-hour work, and I didn't do any of the analyses; this project involved 7 people working intensively for multiple years, plus an 8th person involved with a few aspects. If we extrapolate at all, a conservative estimate of everyone working half that time (7 people at roughly 93 hours each is about 650 hours, or more than 16 forty-hour weeks) puts us over an entire semester's worth of work weeks.) That is why we published our process: it is entirely possible that a lot of folks aren't digging into this kind of effort because it is simply not feasible.

  2. We don't have the capacity to do it at that level again this time around. Of course, we've already built out a lot of the process, so we can be more efficient. But also, half our team has moved on and is no longer available. And, the remaining team has shifted responsibilities and isn't really available to lead this into the future. We have to do this differently if we're going to keep doing it, and that includes both survey implementation/analysis/reporting and succession planning for who runs future re-survey efforts.

  3. We need resources beyond people's time. To embed this in the ongoing assessment and strategic priorities of our department, we need money. It's that simple. At minimum, we'll need to pay one of the team members who analyzed the first round of data to develop a GUI or user guide for the code they ran. It will have to be something that someone could virtually plug-and-chug (see the sketch after this list for the flavor of helper I mean), because we aren't actually a department that trains people on social science research methods, and survey data require very particular treatment to be statistically valid. We cannot sail off into the future assuming the re-survey teams of 2024 and beyond will possess that knowledge. Of course, there's never any money, anywhere, for anything. Certainly not for something we're framing as part of our academic assessment efforts. So, we're going to have to get really creative about how we fund this development phase.

  4. Making a difference is an incentive. So is publishing. Neither may be sufficient alone. Part of why we managed to see this project all the way through is that the team of grad students, postdocs, and early-career academics we assembled were all genuinely motivated by (a) wanting to take action toward meaningful improvement in our department and (b) our collective commitment to publishing a tool that could be used by others and that could help our CVs. The very real truth of the matter is that without publishing the paper about this survey, all the work we did would just get dismissed as "service." That people use the academic value system to justify disregarding this kind of work unless we make it into a countable "bean" is reprehensible. It is also our current reality. If we are going to see other departments do this kind of work (or see our own department sustain this work), we have to find ways of doing it that simultaneously result in countable beans and ensure those beans are valued beyond the existence of a DOI. That is, the bean has to actually contribute to the intended change. Without this reciprocal relationship of impact and bean-making, we're not going to see this work happen.
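To make the plug-and-chug point in item 3 concrete, here is a minimal sketch of the kind of reporting helper I mean. This is not our actual analysis code: the file name, column names, and 5-point Likert scale are hypothetical stand-ins, and real survey data would still need the careful statistical treatment mentioned above (handling missing responses, ordinal scales, and so on).

```python
# A minimal sketch of a "plug-and-chug" summary step for a Likert-style
# climate survey. NOT our published analysis; names are hypothetical.
import pandas as pd

# Hypothetical 5-point scale mapping response labels to numeric scores.
LIKERT = {
    "Strongly disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly agree": 5,
}

def summarize_item(responses: pd.DataFrame, item: str,
                   group_col: str = "employment_category") -> pd.DataFrame:
    """Count, mean score, and percent agreement for one item, per group."""
    scored = responses[item].map(LIKERT)  # map labels to numeric scores
    return (
        responses.assign(score=scored)
        .groupby(group_col)["score"]
        .agg(
            n="count",                                   # scored responses
            mean_score="mean",                           # average score
            pct_agree=lambda s: 100 * (s >= 4).mean(),   # "Agree" or stronger
        )
        .round(1)
    )

if __name__ == "__main__":
    # Hypothetical input: one row per respondent, one column per survey item.
    df = pd.read_csv("climate_survey_responses.csv")
    print(summarize_item(df, item="I feel a sense of belonging in the department"))
```

With something like this (plus a short user guide), a future re-survey team could point the function at a new year's responses and get comparable per-category summaries without having to rebuild the statistics from scratch.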


[Figure: four boxes connected by arrows indicating an iterative cycle: education and planning → action → analysis/interpretation/reflection → presenting findings and proposed actions, with the last stage feeding back into both action and continued education and planning. The full figure is in the PDF versions of the paper linked in this post.]
Figure 6 of our paper provides a conceptual overview of our entire planning-implementation-analysis-reporting-action-planning process.

Using our paper as a tool

If you're interested in conducting a department-level climate survey, we hope you'll find our paper both informative and actionable. In addition to the main paper, we provide several supplements that should get you rolling. All are hosted on the journal's website as editable .docx files. These include:

  • The complete climate survey, with all questions and question formats, including information on how we piped respondents (i.e., routed them to the sections of the survey relevant to them)

  • Details regarding the inception of our climate survey and related efforts

  • Template language that we used in the context, consent [3], definitions, and acronyms sections of our climate survey

  • Complete results from the exploratory factor analysis of our quantitative results (the sketch after this list shows the general shape of such an analysis)

  • Complete results from the employee category-specific questions
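For a flavor of what the factor-analysis supplement documents: exploratory factor analysis looks for a small number of latent factors that explain the correlations among many Likert items. The sketch below is not our published analysis (the file name is hypothetical, and ordinal survey data often warrant more specialized methods); it just illustrates the basic shape of such an analysis, here using scikit-learn's FactorAnalysis.

```python
# Minimal sketch of an exploratory factor analysis on Likert-type items.
# NOT our published analysis: the input file is hypothetical, and ordinal
# survey data often need more specialized treatment than shown here.
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical input: one row per respondent, one numeric column per item.
items = pd.read_csv("quantitative_items.csv").select_dtypes("number").dropna()

# Fit a 3-factor model; a varimax rotation aids interpretability.
fa = FactorAnalysis(n_components=3, rotation="varimax", random_state=0)
fa.fit(items)

# Loadings: how strongly each item associates with each latent factor.
loadings = pd.DataFrame(
    fa.components_.T,  # transpose to items x factors
    index=items.columns,
    columns=[f"factor_{i + 1}" for i in range(3)],
)
print(loadings.round(2))
```

The loadings table shows which survey items cluster onto which latent factor, which is the kind of output an EFA supplement reports in full.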


 

Full citation: Barrile, G.M.*, R.F. Bernard, R.C. Wilcox*, J.A. Becker, M.E. Dillon, R.R. Thomas-Kuzilik*, S.P. Bombaci, and B.G. Merkle#. 2023. Equity, community, and accountability: leveraging a department-level climate survey as a tool for action. PLOS ONE 18(8): e0290065. https://doi.org/10.1371/journal.pone.0290065 (* = student author; # = senior author)


If you can't access the paper, let me know, and I'll send you a PDF!

 

NOTES

[1] There are a slim handful of unpublished reports from departments and/or degree programs. We referenced those when we could find them, both to help build out our survey itself and to inform our process of analyzing and actually using the results. At least one additional survey we read wasn't publicly available, but we were able to reference it because a co-author had been involved with it. (Which is all to say, there may be more of this work happening under the radar -- likely for various, ahem!, reasons. If you're able to tap into that work, or decide to publish on such work yourself, it could help make this process more feasible for others in the future. And, as we discuss in our paper, there are many bipartisan reasons to work to understand workplace climate at a scale where we actually have a chance of improving it.)


[2] You've probably also noticed that it took us several years to do this. The work was feasible because we paced ourselves. But also, we would have happily moved the project forward more quickly had we had the capacity to do so.


[3] We submitted our survey and protocols to the university review board for ethics in human-subjects research. The board approved our process and deemed it not subject to IRB oversight because it was an assessment endeavor. We indicated this in the consent materials and in the materials we submitted for publication.


© 2025 by Bethann Garramon Merkle.
