WHAM Study Frequently Asked Questions
Why did HMC do this study?
In response to events in the spring of 2017 that highlighted concerns about workload and balance at HMC, the Teaching and Learning Committee (TLC), in partnership with the Faculty Executive Committee (FEC) and in consultation with several student advocates, undertook a longitudinal study of workload during the fall 2017 semester. The goal was to gain a more accurate estimate of out-of-class workload, satisfaction and wellness for HMC students that the campus could use to inform policies and practices around student well-being.
How and when was the survey administered?
An email (PDF) was sent to all HMC students enrolled in fall semester 2017 inviting them to participate in a study of Workload and Health at Mudd (WHAM). Three hundred and five students signed up to participate. On Sept. 13, 2017, all 305 students who signed up were sent an email with a link to a short demographic questionnaire. Two hundred eighty-six students (94 percent) provided demographic information.
Starting on Sept. 23, 2017, participants received an email with a link to the weekly questionnaire (PDF) every Saturday at 10 a.m. Three reminders (Sunday at 10 a.m., 6 p.m. and 11:45 p.m.) were sent to non-respondents. The weekly questionnaire closed each Monday at 8 a.m. The weekly questionnaires ran between Sept. 23 and Dec. 9, 2017, including the weeks of the October and Thanksgiving breaks.
On Dec. 11, 2017, a second demographics questionnaire was sent to capture changes in participant demographics (e.g., major selection) that may have occurred over the course of the semester. This was the final questionnaire sent to participants.
Detailed information on weekly response rates and the background characteristics of respondents can be found on the demographics page.
Why didn’t you run the survey from the start of the semester through finals?
The study started during the third week of classes in order to let the add/drop deadline pass. By the third week of classes, most participants had settled into their course schedules for the semester.
Regarding the end date, this project asked students how much time they spent on work for each class outside of class or lab time, so the questions stopped when classes ended. Future work may take finals week into account.
How did the WHAM study handle half-courses? Participants who dropped a course?
All of a participant’s courses were listed in each of their weekly questionnaires. The instructions asked them to select N/A for their half-courses during the weeks of the semester when they were not enrolled in the course. If they dropped a course, the instructions indicated they should select “W” for that course.
Does this data represent all of HMC?
No. Because participation was voluntary, and the topics of work and health have the potential to provoke strong opinions at HMC, it is impossible to determine the impact self-selection may have had on the results. Additionally, there is variation in responses across respondents and across weeks. Therefore, it is important to keep in mind that these results do not necessarily represent generalizations about the entire HMC community, or any one gender, race, class year or department in particular. We do know that more than one in three Mudders (36 percent) participated in the study and, of those who participated, the majority (52 percent) provided data for 12 out of 13 weeks. That increases our confidence that the findings do document the experiences of respondents.
Why are you sharing selected findings and not the full results?
This site is designed to share important findings that are relevant to all students, faculty and staff at HMC, to provide the ability to investigate the data in ways that allow the community to understand the findings and to use the data to contribute to the productive conversations on campus about workload, health and wellness. Additional findings are expected in the future and will be shared here.
What about the written comments? Why aren’t those included in the analysis?
To maintain the privacy of the respondents, comments from the weekly questionnaires will not be released. Many comments are individually identifiable, or could be if another variable (e.g., gender or course) were known. As much as possible, information that could identify individuals has been withheld to protect individual privacy, a key factor in ensuring participation in this project. A thematic review of these comments was completed by an outside reviewer and summarized.
Is it possible to provide results by course? For nonbinary respondents? For first-generation respondents?
Maintaining the privacy and confidentiality of respondents remains a key consideration in preparing the findings. Thus, for variables and demographics (e.g., nonbinary, international, first-generation) where the number of respondents is small (n < 14), information will not be made public. A further consideration in the design of our visualization tool was to help our community focus on the big picture. Please note that while the visualization tool does not provide data for some specific categories, these respondents are included whenever “(All)” is selected.
I’m using the department filter to look at results by department, and I see a category called “INTERDISC.” What is that?
INTERDISC (interdisciplinary) courses are courses that do not belong to any single department. Specifically, they include CL 57, IE 179D, IE 197 and Writ1. Core courses are included in their department and can be seen by using the department and Core filters in combination.
Why doesn’t the percentage always add up to 100 percent?
This was a complicated decision for the researchers who analyzed the data but, in the end, there were two primary reasons for keeping blank responses in the totals. First, missing/blank responses (i.e., questions to which respondents didn’t provide an answer) may themselves carry meaning when discussing specific details about workload and mental health. Respondents may have left gender, class year or race/ethnicity questions blank to maintain their privacy and/or because of the difficult nature of the topic. Second, it is important to maintain consistency in reporting. While some of the questions about time use, resources and support or wellness may not have been sensitive or required confidentiality, the same methodology was used throughout the analysis.
So when I’m looking at the findings, what exactly is being shared?
The visualization tool does not plot all responses; it omits respondents who left the item blank (or, in the case of the wellness questions, those who said it was not applicable).
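To make the arithmetic behind this concrete, here is a minimal sketch, using made-up counts rather than actual WHAM data, of how percentages can fall short of 100 percent when blank responses stay in the denominator but are not displayed as their own category:

```python
# Hypothetical counts for one survey item (illustrative only, not WHAM data).
responses = {"Agree": 60, "Disagree": 30, "(blank)": 10}

# The denominator includes every respondent, blanks and all.
total = sum(responses.values())

# Blank responses are omitted from the display but not from the denominator,
# so the displayed percentages sum to less than 100.
shown = {
    answer: round(100 * count / total)
    for answer, count in responses.items()
    if answer != "(blank)"
}

print(shown)               # {'Agree': 60, 'Disagree': 30}
print(sum(shown.values())) # 90
```

Under this convention, the 10 percent "gap" is itself informative: it is the share of respondents who declined to answer.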
When can we expect to see the impact of these results on campus?
As promised, the TLC has shared this data with everyone in the HMC community at the same time. That means that students, faculty and staff are all trying to make sense of it together. In the future, we will decide collectively how to best move forward.
Inquiries can be directed to Chief Communications Officer Tim Hussey at email@example.com.