Understanding the mental health of students of color in the U.S. during heightened public xenophobia and bigotry. Ceglarek, Peter, Adam Kern, Sarah Ketchen Lipson, and Daniel Eisenberg. Healthy Minds Network, University of Michigan, 12 Dec 2016.
Each wave of our project will proceed in four steps: (1) survey design; (2) school recruitment; (3) participant recruitment; and (4) sharing survey results.
(1) Survey Design: To identify the most important issues for examination in our survey, we consulted several experts in the student mental health and substance abuse fields at the University of Michigan and elsewhere, and we reviewed the existing literature on student mental health. In developing the survey, we used previously validated and widely used instruments wherever possible. For example, we included depression and anxiety scales from the well-known Patient Health Questionnaire (PHQ) and the Generalized Anxiety Disorder 7-item (GAD-7) scale. In other parts of the survey we adapted previously used questions to the campus setting, and we created new questions to address issues that have not been covered in this type of survey. In previous iterations, respondents have taken 15-20 minutes on average to complete the survey, and in revising it we have maintained a similar length.
The enhanced version of HMS comprises a series of modules. Three standard modules will be administered at all participating colleges and universities, which will also select from a menu of elective modules (see questionnaire uploaded in this application). Participating colleges and universities have the option to add up to 10 custom questions to the survey (which our research team may use for research purposes). For grid questions with multiple items, each item counts toward the limit of 10, so as not to increase the length of the survey excessively. These questions are chosen by the participating campuses, and our study team imposes no restrictions except reviewing the questions to confirm that they do not introduce any new or substantially augmented risks to participants (e.g., questions that are substantially more sensitive or might cause substantially more discomfort than the core questionnaire). Schools that have included these extra questions have typically chosen questions very similar to the general topics in the core questionnaire: a mix of non-sensitive questions (e.g., perceptions of campus climate) and sensitive questions (e.g., drinking behavior and mental health). When a college or university elects to add custom questions, we will upload the items as an amendment for IRB approval.
(2) School Recruitment: Though HMS is now quite well known throughout campus mental health networks, our research team will continue to actively recruit colleges and universities to participate in the study. To find schools to recruit, the team will advertise HMS through several channels including, but not limited to, listservs, the Healthy Minds Network (healthymindsnetwork.org), and research briefs. If a school is interested in enrolling, it will contact the study team (primary email contact: firstname.lastname@example.org). Schools that elect to participate will sign a participation contract with the University of Michigan, obtain study approval or exemption from their local IRB or ethics board (or provide documentation that they are not engaged in the research), and submit payment to the study team (collected through the School of Public Health Department of Health Management and Policy). Participating colleges and universities will also select which modules to administer on their campus. All schools will participate in the three standard modules and will also select approximately 3 additional modules from a list of ‘elective modules’.
Participating schools will also provide customization specifications for local resources to include in the survey endings and recruitment documents, as well as customized questions if desired.
To finish the enrollment process, schools will provide our team with a sample file of students from their Registrar’s Office. We will need to collect the following variables from each school's Registrar’s office for recruitment, sampling, assessment of nonresponse bias, and analysis:
These data will be obtained before the recruitment of individual students. We are permitted to obtain the administrative data listed above for all students recruited for the study, consistent with the guidelines of the Family Educational Rights and Privacy Act (FERPA). These guidelines allow schools to disclose such records, without consent, to the parties and under the conditions listed in 34 CFR 99.31, including organizations conducting certain studies for or on behalf of the school. In particular, these studies must help develop, administer, or validate predictive tests, administer student aid programs, or improve instruction. Our study is consistent with these specifications in multiple ways. Most notably, it administers and validates mental health screening tests to see how well they predict academic outcomes. More generally, our study aims to understand how the learning environment (i.e., instruction) can be improved by addressing mental health.
The HMS team will receive the data file of students at each participating college or university. The file will be uploaded by the school contact to a designated school folder on the M+ Google Drive (consistent with the process used for our similar studies, including previous iterations of HMS). At that time, only the school contact and study team will have permission to access that folder. After the file is uploaded, the study team will remove access privileges from the school contact. Typically, the data file will contain all students eligible to participate, unless the school’s local IRB/Registrar’s Office decides that it would rather randomly pre-select the HMS recruitment sample (this often happens at campuses that administer many surveys and thus have a system for ensuring that surveys are equally distributed across the student body). If the HMS team conducts the random sampling to create the recruitment sample, the remaining data on students who were not selected will be destroyed.
The HMS study team will assign a unique survey link generated through Qualtrics to each student in the randomly selected sample. In other words, if there are 4,000 students in the random sample from a participating institution, we will generate 4,000 unique survey links. This unique link will be piped into the student emails. It will also be added as a column in the institutional data file provided by the participating institution. The unique survey link has no significance outside of the study.
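The sampling and link-assignment steps above can be sketched roughly as follows. This is a minimal illustration of the logic, not the team's actual tooling; the function names, the placeholder survey URL, and the token scheme are all hypothetical assumptions (Qualtrics generates its own unique links in practice).

```python
import random
import secrets

SAMPLE_SIZE = 4000  # per-school recruitment sample, per the protocol

def draw_sample(students, sample_size=SAMPLE_SIZE, seed=None):
    """Randomly select the recruitment sample from the full eligible list.
    At schools with fewer eligible students than the sample size, everyone
    is invited."""
    rng = random.Random(seed)
    if len(students) <= sample_size:
        return list(students)
    return rng.sample(students, sample_size)

def assign_links(sample, base_url="https://example.qualtrics.com/jfe/form/SV_hms"):
    """Attach a unique survey token to each sampled student.
    base_url is a hypothetical placeholder; the token carries no meaning
    outside the study, mirroring the protocol's description."""
    for student in sample:
        student["survey_link"] = f"{base_url}?token={secrets.token_urlsafe(16)}"
    return sample
```

The resulting `survey_link` column would then be merged back into the institutional data file and piped into the recruitment emails, as described above.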
(3) Participant Recruitment: On campuses with more than 4,000 students, the HMS team will invite 4,000 students per school to complete the survey (or all students at schools with fewer than 4,000 students). All invited participants will be currently enrolled students who are at least 18 years old. In states where the age of majority is greater than 18, a waiver of parental consent will be applied for in this IRB application. The HMS team does not wish to exclude 18-year-olds in these states, as the study does not pose more than minimal risk to participants. We believe that including these 18-year-olds will enhance the value of the research to society and the relevance of the findings for these institutions. If we are to produce results representative of the first-year population, we will need to include 18-year-olds in these states. The risks for 18-year-olds residing in these states should not differ from those for other 18-year-olds: they are at the same stage of education and undergoing the same developmental transitions.
Dissemination of the online survey to students will be conducted by email. The process will involve specific steps by participating institutions and the HMS team. To send out emails to students on each of the participating campuses, the HMS study team is working with Emma, a web-based marketing and communications company. This partnership enhances the legitimacy of the study for students (because emails come from official school accounts—e.g., healthyminds@[school].edu, as described below), is a sustainable approach to recruitment and communication as the study expands to more sites (asking each participating institution’s ITS department to send out the emails is an unrealistic approach for a national study of this nature), and upholds the privacy and confidentiality of human subjects.
The HMS team has a main account (called a master admin account) through Emma: email@example.com. After enrolling in the study, each local contact will create an institutional email address for HMS. The study team will request that the email address be healthyminds@[school].edu, but at some schools there may be restrictions on the form of email addresses that can be created. It is perfectly acceptable for the email address to be hms@[school].edu or hms2015@[school].edu, etc., and this will in no way affect the protection of human subjects. In creating this email account, the local contact will set the forwarding address to firstname.lastname@example.org (the main account for HMS). The HMS study team has access to this main email account.
To articulate this system more clearly, we provide an example of the process and the roles played by the HMS study team, a participating institution (e.g., University of Midwest), and Emma. The HMS study team has a main account in Emma with sub-accounts for each participating institution (i.e., if there are 10 schools participating in HMS in academic year 2015-2016, HMS will have 10 sub-accounts in Emma). The campus contact at University of Midwest will follow the instructions to create a school-specific HMS account (typically called a group account, alias account, or forwarding account by the participating institution). At the University of Michigan, for example, this would involve creating a group called Healthy Minds and an email address associated with this group (at the University of Michigan this is accomplished through MCommunity). This account will not have its own inbox; rather, it will be, as stated above, a forwarding account. In creating this account, the local contact will be asked to provide a list of member email addresses and will list email@example.com as the only member. So at University of Midwest, the local contact will go through the steps to create firstname.lastname@example.org and then inform the HMS study team that this email account has been created. email@example.com would then be added as the sub-account email address for University of Midwest in Emma. When the HMS study team sends out the first email to the randomly selected sample from University of Midwest, the email will come from firstname.lastname@example.org. This legitimizes the study for students (who see that the study is supported by their institution, given the email address) and abides by anti-spam and privacy laws. Furthermore, the survey data remain completely separated from the identifiable information in this process.
Using the built-in Qualtrics email system would not have been possible, because the survey data and the email addresses/names of students would have been stored within the same system.
There are several other reasons that justify our decision to work with Emma and our confidence in the proposed system. HMS is a large-scale study: if we recruit 4,000 students from a given school, that means sending out 4,000 emails in a single day. The University of Michigan Gmail system will not allow more than 2,000 emails to be sent, making it impossible for us to disseminate our survey to students using the University of Michigan email system. All communication from students, including any bounced-back emails or out-of-office replies, will come directly to email@example.com.
At schools that are unable to create an email address for HMS, we will be creating an account ending in @umich.edu (e.g., email@example.com or firstname.lastname@example.org) and will follow the exact same emailing procedures using these email addresses instead.
As mentioned, the HMS study team will recruit students to complete the survey via email. The HMS team will begin recruitment with a brief “pre-notification” email. Survey methodologists have concluded that this initial notification can boost participation rates. Two to three days later, the team will send the recruitment email with a link to the online survey (i.e., data collection will begin with the recruitment email), and the team will follow up with reminder emails to non-responders (up to three reminder emails in total, separated by approximately 5-7 days each). In total, students may receive up to 5 emails about participating in HMS over the roughly 3-week data collection period (see Table 1). Students who complete the survey, or indicate they do not wish to participate (by emailing the research team to indicate this or not consenting on the consent page of the online survey), will not receive any further invitations. We anticipate a total survey response rate of approximately 30%.
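The five-email timeline described above can be sketched as a small scheduling helper. This is an illustration only; the specific gaps chosen (3 days before the invitation, 6 days between reminders) are assumptions within the 2-3 day and 5-7 day ranges stated in the protocol.

```python
from datetime import date, timedelta

def recruitment_schedule(start, prenotify_gap=3, reminder_gap=6, reminders=3):
    """Compute send dates for the up-to-5 recruitment emails:
    a pre-notification, the recruitment email with the survey link,
    and up to three reminders to non-responders.
    Gap defaults are illustrative assumptions, not the team's fixed values."""
    emails = [("pre-notification", start)]
    invite = start + timedelta(days=prenotify_gap)
    emails.append(("recruitment", invite))
    for i in range(1, reminders + 1):
        emails.append((f"reminder {i}", invite + timedelta(days=i * reminder_gap)))
    return emails
```

With these defaults the last reminder lands three weeks after the pre-notification, matching the roughly 3-week data collection period.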
Students will voluntarily access the HMS survey via the unique survey link listed in the recruitment and reminder emails. The link will take students to the online survey, where they will be presented with the HMS consent form. Students must give their consent in order to progress to the first question in the online survey. After the initial page (the consent form), the rest of the online survey will be the same for these students as for all other participants. If students are unable to take the whole survey at one time, they may return to their unique survey link at any point during the data collection period to continue where they left off. The final reminder email indicates the exact date of data collection completion.
Upon completion of the survey, participants will receive a list of helpful resources that is programmed into the last page of the survey. For students who do not consent, the survey ending still includes the full list of resources (with the exception of sexual assault resources, which are included based on student responses and are thus unavailable for students who do not consent to participate). Students who begin but do not complete the survey see the main resources at the top of each page; because they never reach a survey ending, there is no other way to provide them with resources beyond those listed on every page. The list includes custom local resources that school contacts have provided for their students. We will provide these institution-specific resource lists as an amendment (or multiple amendments) to this application.
Survey endings are displayed within Qualtrics and are not sent via email (i.e., sensitive identifiable information is not being transmitted via email). There will be two types of survey endings: (1) for students who do not consent, and (2) for students who complete the survey (see document uploaded in this application for full survey ending text).
Note that the survey endings reflect an important enhancement to this version of HMS relative to previous iterations. There is a growing trend in online health screens/interventions whereby respondents receive immediate feedback (this is increasingly the expectation). In response, the HMS survey endings (for respondents only) now provide some basic information with results from the validated mental health screens. Our hope is that this feedback will increase the relevance of the listed mental health resources and thus the likelihood of help-seeking for students in need. Another important enhancement operationalized in the survey endings is a more proactive approach to connecting students with relevant resources based on their survey results. Over the years we have developed many partnerships with interventionists (e.g., the Healthy Body Image Program at Stanford, through which we ran a pilot program linking from the Healthy Bodies Study). Our hope is that we will be able to connect students in need with available online resources following completion of HMS.
For the pilot study of HMS at UM, we will offer an online program called Refresh to students with sleep disturbances (as identified by the Insomnia Severity Index). The program is disseminated via email: over the course of eight weeks, students receive a PDF in an email (included in Section 44). Within Qualtrics, there is a standard (built-in) function that allows automated notifications to be sent to the study team based on responses; this is called an email “trigger”. For the pilot study at UM, we will utilize this option to notify our team of students’ placement into Refresh. The trigger emails, which have already been developed and tested, contain the student’s unique Qualtrics link. We will then use this link to find the student’s identifiable information (first name and email address) and invite them into Refresh (they are informed in the end-of-survey messages in Qualtrics to expect a follow-up email). They may opt out at any time. The Refresh resources will be delivered to eligible students by the HMS research team using the emailing system Emma. These students will receive a weekly email with Refresh resources over the course of 8 weeks following completion of the survey.
(4) Sharing Survey Results: Following data analysis, we will prepare academic articles for publication. We will also prepare a summary report for each participating institution. This report will not contain any identifiable information or provide detail that would allow an institution to come close to identifying a participant. The report will be univariate: a simple descriptive statistic (percentage or mean) of the overall sample for each survey item, with an indication of how the institution compares to other HMS participating institutions (e.g., significantly higher or lower than the other institutions). An example of this report is available here (note that this is the 2014 national HMS report and is not school-specific): http://healthybodiesstudy.org/wp-content/uploads/2014/07/HMS_national.pdf. We will provide this report to the participating school contact, who is expected to share it with school administrators and communicate the findings to the student community. Identifiable data will never be shared with the school contact or other institutions; any data shared with school contacts or other researchers will always be de-identified.
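The per-item comparison in the summary report (school percentage vs. the rest of the network, flagged as significantly higher or lower) can be illustrated with a simple two-proportion z-test. This is a sketch of the kind of comparison the report describes, not the team's actual analysis code; the function name and the 0.05 threshold are assumptions.

```python
import math

def compare_to_network(p_school, n_school, p_network, n_network):
    """Label a school's item percentage relative to other HMS institutions
    using a two-sided two-proportion z-test at the 0.05 level."""
    pooled = (p_school * n_school + p_network * n_network) / (n_school + n_network)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_school + 1 / n_network))
    z = (p_school - p_network) / se
    # 1.96 is the two-sided critical value at alpha = 0.05
    if abs(z) < 1.96:
        return "not significantly different"
    return "significantly higher" if z > 0 else "significantly lower"
```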
For the partnership with Refresh, we will not share any identifiable data with those researchers, and we will not access any data from them. We will share an aggregate, de-identified data set with the Refresh researchers, with a way (using responses to the sleep screen questions) to limit the sample to those who meet specific criteria (in this case, those who meet criteria for Refresh based on sleep disturbances). The same could be done in HMS data for depression (using PHQ-9 scores), anxiety (using the GAD-7), etc. We will share the HMS data set with the Refresh researchers, likely with a variable flagging students who met criteria for Refresh (all who qualified). In terms of our relationship with Refresh, there is no formal partnership at this stage. Informally, we are interested in working with them to figure out how to deliver sleep health interventions as an add-on to our survey. This is part of our broader interest in offering something of practical value to the thousands of students who take our survey each year. This next step is essentially a feasibility test, to see how it works for us to offer the sleep program at the end of the survey; depending on how that goes, we might consider a more formal and lasting partnership. Refresh is not part of the research (i.e., not part of our analysis), and there is no current plan to collect data on Refresh. The Refresh team will not collect any data from, or interact with, HMS subjects.
The survey will be run through Qualtrics and will be entirely online. Our data analysis will consist of both univariate and multivariate analyses. Univariate analyses will include describing the data and identifying correlations. We will employ more sophisticated methods, such as regression, to publish findings in peer-reviewed journals; multivariate analyses will include linear and nonlinear (e.g., logistic) regressions. For both univariate and multivariate analyses we will examine differences across sub-groups, such as graduate versus undergraduate students and racial/ethnic groups. We will compare the characteristics of non-respondents to those of respondents and assign statistical weights to respondents accordingly. All analyses will be performed using Stata 12.1.
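The nonresponse adjustment mentioned above can be sketched as post-stratification weighting: each respondent in a stratum is up-weighted (or down-weighted) so the weighted sample matches the composition of the invited population. The analysis itself is done in Stata; this Python sketch only illustrates the weighting idea, and the stratum labels (e.g., degree level from registrar data) are assumptions.

```python
from collections import Counter

def nonresponse_weights(population_strata, respondent_strata):
    """Compute post-stratification weights from two lists of stratum labels:
    one per invited student, one per respondent. A respondent's weight is
    (stratum share of population) / (stratum share of respondents), so
    under-represented groups are weighted up."""
    pop = Counter(population_strata)
    resp = Counter(respondent_strata)
    n_pop, n_resp = len(population_strata), len(respondent_strata)
    return {s: (pop[s] / n_pop) / (resp[s] / n_resp) for s in resp}
```

For example, if graduate students are 20% of the invited sample but 40% of respondents, each graduate respondent receives a weight of 0.5.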