Contents
-
Acknowledgments
-
I. Overview
-
II. Survey Design & Pilot Test
II.A Questionnaire Development
II.B Substantive Expansions
II.C Pilot Test Administration
-
III. Survey Administration
III.A Population File
III.B Response Reminders and Incentives
III.C Response Rate
III.D Data Considerations and Preparations
-
IV. Sampling Design & Weighting
IV.A CIP Codes
IV.B Recruitment of Institutional Participants
IV.C Weighting Procedures
-
V. Considerations in Context of Prior SNAAP Surveys
-
Appendices
Appendix A: Annotated 2022 SNAAP Questionnaire
Appendix B: Frequency Report of Unweighted 2022 SNAAP Data
Appendix C: Resources for Participating Institutions (Bulletin & FAQ)
Appendix D: SNAAP 2022 Sample Weighting Procedures
-
References
Acknowledgments
This report was supported in part by an award from the Research: Art Works program of the National Endowment for the Arts: Grant 1891787 – 38-22 with matching funds provided by the University of Illinois College of Fine and Applied Arts, the University of Illinois Investment for Growth Fund, and by the Department of Arts Administration, Education, and Policy at The Ohio State University, in partnership with the Strategic National Arts Alumni Project (SNAAP).
We extend our gratitude to Indiana University’s Center for Postsecondary Research for their contributions to SNAAP’s surveying and research endeavors. We extend sincere thanks to Dr. Y. Michael Yang, Dr. Julia Batishev, and Karen Gregorian at the National Opinion Research Center (NORC) at the University of Chicago who worked with SNAAP to develop a national sampling frame and survey weights; their contributions are included within this report. Thank you also to Janet Husunukpe and Emma Walters, doctoral candidates at the University of Illinois Urbana-Champaign, who provided early research assistance.
Suggested Citation: Novak-Leonard, J. L., Ibrahim, D., Scotto Adams, L., Miller, A.L., Skaggs, R. and Bigelow, S. (2023). 2022 SNAAP Technical Report, Strategic National Arts Alumni Project Report. Austin, TX: Arts + Design Alumni Research, SNAAP.
SNAAP is supported by Arts + Design Alumni Research, the nonprofit that oversees the management of SNAAP, and its institutional partnerships with the University of Texas at Austin’s College of Fine Arts and the University of Illinois at Urbana-Champaign’s College of Fine and Applied Arts.
SNAAP’s 2022 survey administration was made possible by the dedicated efforts of the SNAAP staff — Lee Ann Scotto Adams, Executive Director; Deanna Ibrahim, Director of Research Services; and Angie L. Miller, Data Consultant — and through support from The Mellon Foundation, The Emily Hall Tremaine Foundation, and sponsorships supplied by:
- ArtCenter College of Design
- College of Art & Media at University of Colorado Denver
- College of Arts and Architecture at Penn State University
- College of Fine Arts at University of Nevada, Las Vegas
- Eastman School of Music at University of Rochester
- Emerson College
- Herb Alpert School of Music at University of California, Los Angeles
- Indiana University, Bloomington
- Institute of American Indian Arts
- Kathrine G. McGovern College of the Arts at the University of Houston
- Maryland Institute College of Art
- Meadows School of the Arts at Southern Methodist University
- Pratt Institute
- School of the Art Institute of Chicago
- Thomas S. Kenan Institute for the Arts at University of North Carolina School of the Arts
- Tyler School of Art & Architecture at Temple University
- University of Southern California
- Yale University
Questions? Email: info@snaaparts.org
I. Overview
Since 2008, the Strategic National Arts Alumni Project (SNAAP) has collected, examined, and publicly shared the most comprehensive and detailed data on the educational experiences, careers, and lives of individuals with arts, design, and related degrees. SNAAP serves the dual purpose of providing institutional-level data to colleges and universities to inform their own evidence-based improvement strategies and of providing national survey data to enable and foster research that provides field-wide, systemic insights to policymakers, administrators, scholars, and additional stakeholders. Beginning in 2010, SNAAP provided access to data from its earliest pilot survey administrations. The first full SNAAP survey administration launched in 2011 and was repeated in 2012 and 2013, with nearly 100,000 respondents. SNAAP’s second three-year survey administration cycle took place in 2015, 2016, and 2017, with nearly 82,000 respondents.1 The SNAAP survey was relaunched and administered in 2022.
For its 2022 survey administration, SNAAP implemented two important, overarching changes:
- Designed a probabilistic sampling frame and implemented an updated recruitment process focused on inclusivity of the range of postsecondary institutions offering programs and awarding degrees in arts, design, and related fields. These changes were undertaken in an effort to bolster the representativeness of SNAAP’s national, aggregated data and the generalizability of findings stemming from these data.
- Refined and expanded the 2022 SNAAP questionnaire with broad stakeholder input. The updates made to the questionnaire include a streamlining of SNAAP’s existing question set and an integration of new measures addressing (i) the ways in which alumni experienced or did not experience an inclusive environment and sense of belonging during their studies and training; (ii) how the careers, lives, and needed skills and abilities of arts and design alumni have been impacted by the pandemic and the changing nature of creative and other work environments; and (iii) financing postsecondary education.
This Technical Report documents the 2022 SNAAP survey administration, providing detailed information on the 2022 SNAAP questionnaire design, administration processes, sampling, and weighting. More about SNAAP can be found at www.snaaparts.org.
1. Data from prior SNAAP administrations are available to qualified researchers on approved request and based on a cost-recovery model. Email info@snaaparts.org for further information.
II. Survey Design & Pilot Test
II.A Questionnaire Development
-
II.A.i Stakeholder Input
In spring 2021, a review of past SNAAP survey administrations’ questionnaires and data quality was undertaken with attention paid to means for streamlining, reducing cognitive burden, reducing breakoffs, and updating the overall questionnaire. Key topical areas were identified for continuation and for expansion. Key areas for continuation included, but were not limited to, alumni work and employment, as well as satisfaction with postsecondary education institutions. Key areas for expansion included, but were not limited to, socio-demographic characteristics, sense of belonging, implications of the COVID-19 pandemic for training, employment, and ways of work, and paying for postsecondary arts, design, and related education. Once an initial draft was developed, critical feedback was sought from an array of stakeholders. Additionally, a further revised draft questionnaire was made publicly available for input and comments on the SNAAP website in fall 2021 and spring 2022.
In fall 2021, four focus groups and three interviews were held with higher education administrators, faculty, and additional stakeholders to discuss information needs related to the lives and careers of arts, design, and related alumni under University of Illinois Urbana-Champaign IRB. The purpose of these conversations was twofold: (a) to garner insights on pressing information needs within postsecondary arts and design fields as they relate to understanding and improving the careers and lives of graduates of related degree programs and (b) to generate feedback on the draft 2022 SNAAP questionnaire. The results of the former are synthesized and shared in the report, Data, Pressing Needs, and Biggest Challenges: Insights from the Field (Novak-Leonard, Dempster, Scotto Adams, & Walters, 2022).
-
II.A.ii Cognitive Testing
A series of questions relevant to insights regarding alumni perspectives on their sense of belonging while at their institution were cognitively tested with 33 alumni of arts, design, and media production undergraduate degrees.
The main objective of the cognitive testing was to probe the comprehension and wording of the questions: whether the questions were clear and what the interviewees thought of them. Participants were, however, at liberty to answer the questions as they saw fit.
The interviewees were recruited from amongst recent alumni, defined as those who completed their degrees within the five years prior to receiving the invitation to participate in the interview. Interviewees came from a range of disciplinary degree foci and institutional types, including an R1 research university, a specialized arts and design school, and minority-serving postsecondary institutions. These same interviewees also answered questions about sense of belonging, which are part of a larger qualitative study being led by Dr. Jennifer Novak-Leonard. Interviewees were given a $40 Amazon e‑gift card for participating in the whole interview, paid for with funds supplied by the University of Illinois Urbana-Champaign.
The interviews were conducted via Zoom, and each was 30 – 75 minutes in length, with most interviews concluding within 60 minutes. During the interviews, the interviewer read out the questions to interviewees and placed them in the chat. Interviews were conducted by Dr. Jennifer Novak-Leonard, Emma Walters, and Dr. Shanita Bigelow, under University of Illinois Urbana-Champaign IRB.
II.B Substantive Expansions
-
II.B.i Socio-Demographic Measures
A specific aim of the 2022 SNAAP survey administration was to capture more nuanced data on how individual alumni identify themselves, and on their social identities and socioeconomic circumstances, in an effort to garner more granular insights on the varied experiences of alumni within their postsecondary studies and training and their experiences since. Reviews of current, albeit evolving, best practices for inclusively collecting information on race and ethnicity, gender identity, sexual orientation,2 and personal and household income were conducted, and comparable measures adapted for use within the 2022 SNAAP questionnaire. These measures are included in the latter questionnaire section entitled “Identity & Socio-Demographic Questions.”
2. For example, see Morgan et al. (2020).
-
II.B.ii Sense of Belonging
A key aim of the 2022 SNAAP survey administration was to document arts and design alumni’s experiences with sense of belonging during their postsecondary arts and design education, as well as within their employment if they were in the workforce at the time of 2022 SNAAP survey administration; these measures are included in Section A and Section C, respectively, of the 2022 SNAAP questionnaire. Sense of belonging is recognized as a key indicator of student well-being within higher education and as a predictor of student outcomes (Bentrim & Henning, 2022).
-
II.B.iii COVID-19 Questions
An aim of the 2022 SNAAP survey was to document arts and design alumni’s experiences during and after the COVID-19 pandemic, beginning in March 2020. The items developed for Section D of the questionnaire address pandemic-related experiences with work, artistic and creative practice, professional networks, skill gaps, and new skill acquisition. Attention to these topics is essential for establishing how arts alumni fared during the pandemic, including potential challenges they faced and areas where they may have upskilled or shifted focus to new artistic or non-artistic skills to respond to the changing world of work.
To develop survey questions to meet these research aims, Dr. Rachel Skaggs and Dr. Elizabeth Cooksey at The Ohio State University, along with research assistants MollyJo Burke and Erin J. Hoppe, conducted 66 interviews with arts alumni whose artistic work, educational experiences, and personal demographics closely resemble past SNAAP survey samples (read more in Skaggs, 2023; Skaggs, Hoppe, & Burke, 2022). Interviews were conducted from November 2020 – March 2021 and focused primarily on the impact of the pandemic to that point in time on arts graduates’ careers, personal lives, and futures. This research team also conducted informational interviews with eight administrative leaders in higher education in the arts. These deans, provosts, and presidents identified pressing challenges and sites of resilience and innovation for their institutions, students, and alumni that were instructive in designing Section D of the 2022 SNAAP questionnaire.
Appendix A contains the annotated 2022 SNAAP questionnaire with reference to influences toward the development of these questions. Questions and items that are not explicitly annotated were developed based on findings from the previously described interviews and from considerations of how to amend existing SNAAP questions from prior survey administration to be relevant and responsive to the aims of the 2022 SNAAP survey and related research.
-
II.B.iv Education Financing
The 2022 SNAAP survey also aimed to capture more nuanced information about how alumni had financed their educations and to gauge related debt. Relatedly, the SNAAP survey sought to better understand career-related motivations, in the context of education financing, for those who pursued graduate degrees. The SNAAP survey is administered to alumni of arts, design, and adjacent programs; the SNAAP dataset itself therefore does not contain data on alumni of degree programs in other fields, which limits the types of comparative analyses possible between alumni of arts and design programs and alumni of other fields, such as science and engineering. In an effort to enable some comparative contextualization with alumni of other degree fields, the questions used in this section of the 2022 SNAAP questionnaire are adopted from the 2019 National Survey of College Graduates managed by the National Science Foundation, which largely focuses on alumni of science and engineering programs.3
3. For more information about the National Survey of College Graduates, see: https://www.nsf.gov/statistics/srvygrads-legacy/#
II.C Pilot Test Administration
-
II.C.i Sample Used
In the summer of 2022, a pilot test was conducted with a sample of alumni from two institutions planning to participate in the fall 2022 administration. Each institution was asked to provide contact information for no more than 1,000 of their alumni. Although both participating institutions considered providing incentives to their alumni who completed the survey, neither institution ultimately decided to provide incentives for the pilot study. One institution requested to include a sample of recent alumni (graduation year 2017 or later), and the other included a sample that was representative across graduation years. The contacts at these institutions used the population file instruction guide and worked with the Director of Research Services to generate the alumni population file. In each alumni file, schools were required to include necessary information, such as the name, email address(es), graduation year, area of study, department, college/school, and CIP code of arts and design alumni. Schools were also given the option to include the alum’s street address(es), GPA, gender, alumni ID, and any additional information they would like to be included in the final dataset that was not asked in the survey. Each school submitted a file for 1,000 alumni to SNAAP staff and the Indiana University Center for Survey Research (CSR; SNAAP’s survey programming and administration service partner) using a secure OneDrive folder, resulting in a total pilot pool of 2,000 alumni to recruit from.
-
II.C.ii Recruitment and Promotion
Once the files were reviewed by SNAAP and CSR, they were shared with AlumniSync, the alumni search firm, via a secure folder. AlumniSync was responsible for validating the email addresses on file and providing any updated email addresses if detected. Of the 2,000 alumni included in the file, there were 1,657 (83%) with at least a primary email address on file. Of those 1,657 addresses, AlumniSync was able to validate 1,520 (92%); in other words, 1,520 email addresses were valid and safe to contact, or 76% of the total 2,000 contacts submitted. Conversely, there were 137 email addresses on file that came back as invalid or unsafe to contact. Of those 137, 35 had a secondary email address on file, and AlumniSync was able to find 69 email addresses based on alumni contact information. In total, the search resulted in 1,745 valid email addresses (83% of the total 2,000; 105% of the 1,657 with an email address on file) for CSR to send recruitment messages to, resulting in a recruitment pool of 1,745 alumni.
SNAAP worked closely with CSR to develop a set of five recruitment messages (one initial invitation and four reminders) to be sent out to the recruitment pool. Given the short timeline of the pilot survey administration (just over two weeks), we decided to remove one of the reminder messages. Once the messages were developed, SNAAP coordinated with our institutional contacts to allow them to customize certain sections of the messages and add their school logos. The contacts were also sent a test version of the first recruitment message via email before survey administration began, so that they could get a sense of what alumni would receive and add any final requests.
The pilot survey launched on June 28th, 2022, and closed on July 15th, 2022. A total of four recruitment messages were sent to alumni:
- Initial invitation: Tuesday, June 28, 2022
- Reminder 1 message: Tuesday, July 5, 2022
- Reminder 2 message: Friday, July 8, 2022
- Reminder 3 message: Tuesday, July 12, 2022
In accordance with SNAAP’s IRB protocol, if there was a secondary email address on file, no more than two of the recruitment messages were sent to both addresses. Additionally, one message was sent in HTML formatting to vary the presentation of the text.
In addition to recruitment strategies, SNAAP encouraged contacts at our participating institutions to promote the SNAAP survey to their alumni via email and/or social media accounts, both before and during pilot survey administration. Schools were encouraged to send out a “pre-survey announcement” before the survey launch, as well as promotional notices throughout administration, to reinforce the importance of survey participation and to notify alumni that if they had not yet received an invitation, they would have the chance to participate in the fall. The recruitment messages were later updated slightly for the fall survey launch to match the fall administration dates.
-
II.C.iii Pilot Survey Response Rates
In total, we received 68 complete responses and 26 partial or break-off responses, for a total of 94 responses. The number of valid contacts AlumniSync identified was 1,520. The adjusted response rate across institutions was 5.7%, calculated as (completions + partial completions) / (total alumni − (those with no invitation sent + those with an invitation returned undelivered)).
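The adjusted response rate formula above can be sketched in a few lines of code. The function name and the exclusion counts in the example are illustrative assumptions chosen only to demonstrate the arithmetic, not SNAAP's internal figures.

```python
# Hypothetical sketch of the adjusted response rate formula described above.
# The exclusion counts in the example are assumptions for illustration.

def adjusted_response_rate(completes, partials, total_alumni,
                           no_invitation_sent, undelivered):
    """(completes + partials) / (total alumni - excluded contacts)."""
    eligible = total_alumni - (no_invitation_sent + undelivered)
    return (completes + partials) / eligible

# Reported pilot counts (68 completes, 26 partials, 2,000 alumni) with
# assumed exclusion counts of 300 unsent and 50 undelivered invitations:
rate = adjusted_response_rate(68, 26, 2000, 300, 50)
print(f"{rate:.1%}")  # 94 / 1650 -> 5.7%
```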
Considering lower-than-expected response rates, SNAAP has explored several potential contributing factors.
- Recent research suggests that survey response rates are declining and survey fatigue is increasing, particularly since the COVID-19 pandemic (de Koning et al., 2021). Due to a recent increase in web-based surveys, increased spam messages, and other reasons, web-based surveys have been linked to lower response rates than other methods (Daikeler, Bošnjak, & Lozar Manfreda, 2019).
- The short, two-week window of the pilot survey administration provided less time for alumni to participate than the full survey does.
- Our alumni search firm this year was able to find more alumni email addresses than we normally have access to, enlarging the pool of contacts (the denominator) and thereby lowering the overall response rate.
Given these factors, SNAAP’s pilot response rates are aligned with other recent alumni surveys. For instance, an alumni survey conducted by Fordham University in 2019 yielded a 5% response rate, and one by Cornell University in 2017 yielded a 7% rate.
-
II.C.iv Questionnaire Refinements
Once the pilot data were returned, they were cleaned and thoroughly checked. Alumni comments within the survey were reviewed to determine whether changes to the survey were warranted prior to the fall administration. Based on alumni responses and experiences with the survey, the research team decided on several minor changes to implement for the larger survey administration:
- We decided to include fine arts degrees (BFAs and MFAs) in response options that listed degrees. For instance, in the pilot survey, the response option at the bachelor’s degree level included BA, BS, BM, etc.; we added BFA as an example. We made this change because many participating alumni reported BFA and MFA degrees.
- We decided to edit response option “j” in the curjob_ series (items referring to non-arts jobs). Rather than being specific to social services, we broadened the category to “social and other services” to include food service and travel industry jobs. We made this change because several participants wrote in other types of service industry responses.
- We decided to reword the prompt for the ‘wh_whynot_’ series (items assessing why a participant is not working for pay or profit) to improve readability. This involved a slight adjustment from, “What are your reasons for not currently working for pay or profit?” to, “Please indicate why you are not currently working for pay or profit.” We made this change because an alum indicated in the open-ended response that they did not understand the question.
- We updated the response options to the survey item assessing gender. We made this change because several participants used the ‘other’ text box to indicate that they were nonbinary, which was not an option originally included in the item. We changed the response options from ‘female’, ‘male’, and ‘a gender identity that is not listed (e.g., gender fluid, two-spirit, transgender male),’ to the following: “Woman,” “Man,” “Gender identity that is not listed (e.g., nonbinary, transgender, gender-fluid, two-spirit), please share: [TEXT BOX, 100 ch. limit],” “Prefer not to answer.”
All the above revisions were implemented to improve readability and inclusion in the 2022 survey.
III. Survey Administration
III.A Population File
-
III.A.i Data Collection and Cleaning
The process for collecting alumni data from institutions for the fall survey administration was similar to that of the pilot survey administration. Of the 120 participating institutions, several registered separately for smaller arts and design schools/colleges within the university, resulting in a total of 128 population files. SNAAP worked closely with the Center for Survey Research to assist schools in providing alumni data and to review population files closely for inconsistencies. CSR was able to program a set of initial checks within the upload feature on the SNAAP interface. This allowed the feature to catch large, systematic errors within the files before they were uploaded. One important part of this process was ensuring that the required fields in the population file were complete. Within the population file instructions, SNAAP included thresholds of necessary data for each category. For instance, SNAAP required that each school provide the first and last name of 100% of alumni, but the email address(es) for at least 25% of alumni — given the ability to identify new email addresses with our third-party search firm. The upload feature would not allow institutional contacts to upload the population file if less than 100% of alumni had a first and last name listed, or if less than 25% had an email address listed.
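The completeness thresholds described above (names for 100% of alumni, email addresses for at least 25%) can be sketched as a simple pre-upload check. The record format and function name here are illustrative assumptions, not CSR's actual implementation.

```python
# Minimal sketch of the kind of completeness check described above.
# Thresholds mirror the text (100% names, >= 25% emails); the record
# layout and function name are assumptions for illustration.

def passes_upload_checks(records):
    """Reject a population file whose required fields fall below threshold."""
    n = len(records)
    if n == 0:
        return False
    with_names = sum(1 for r in records
                     if r.get("first_name") and r.get("last_name"))
    with_email = sum(1 for r in records if r.get("email"))
    return with_names == n and with_email / n >= 0.25

records = [
    {"first_name": "A", "last_name": "B", "email": "a@example.org"},
    {"first_name": "C", "last_name": "D", "email": None},
    {"first_name": "E", "last_name": "F", "email": None},
    {"first_name": "G", "last_name": "H", "email": None},
]
print(passes_upload_checks(records))  # one of four emails = 25%, passes
```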
Due to the complex nature of the population file, many schools reached out with questions, which SNAAP and CSR documented. The largest challenges schools faced throughout the process were:
- Locating and providing accurate CIP (Classification of Instructional Programs) codes for alumni.
- Over half of the participating institutions originally provided at least one CIP code for alumni that was outside of the sampling frame. In most cases, the alum had an arts or design major on file, but the CIP code itself was not within the identified frame. Alumni who had an arts or design major on file were included. In a handful of cases, there were more systematic inconsistencies — such that the school provided many alumni with non-arts or design majors and non-arts or design CIPs. In those cases, we reached out to schools to confirm that these alumni were indeed not within arts or design fields, and flagged ineligible alumni so that they would not be recruited.
- Inconsistencies in alumni degree information.
- A portion of schools provided contrasting information when referring to alumni degrees. For instance, the reported degree level (e.g., a doctoral degree, a master’s degree, a bachelor’s degree) did not match the reported degree level detail (e.g., PhD, MFA, BA), resulting in, for example, an alum with a degree level of master’s degree but a degree level detail of BA. Another common inconsistency occurred in reporting the highest arts degree. Schools were asked to report the highest arts degree under degree level and degree level detail, and were asked to report additional (lower) degrees under arts degree 2 and arts degree 3. However, at times schools reported higher arts degrees in arts degree 2 or arts degree 3 rather than in the degree level column. Each of these inconsistencies was corrected through direct communication with schools. SNAAP reached out to institutional contacts and clarified both the correct degree and the highest arts degree received.
- Including duplicate cases of alumni in one population file, or duplicate cases of alumni across multiple population files.
- A portion of schools included duplicate cases of alumni in their institution’s population file; within-file duplicate cases were indicated by multiple cases of the same name and email address or multiple cases of the same email address with different names. SNAAP communicated directly with schools to clarify when duplicate records were included. In most cases of the same email address across multiple names, multiple alumni (family members, spouses) were sharing an email address according to school records. Most cases of the same name and email address were typos within school records. Regarding alumni who were included across multiple population files, SNAAP and CSR developed guidelines for alumni recruitment. To avoid sending the same alumni several survey emails relevant to multiple schools, we decided to contact alumni through contact information provided by the institution at which they received their highest arts degree. If multiple schools reported arts degrees at the same degree level for the same alumni, we used their contact information from the school associated with the most recent graduation year.
Throughout the survey administration, SNAAP received requests from institutions to add alumni to the recruitment list who were not originally included in their population files. In many cases, these alumni saw promotional messages about the survey on social media or via email and requested to be added, or these alumni were cases that schools were otherwise unable to include in their original file. SNAAP and CSR established a system through which schools could upload a condensed version of the population file for only those cases they needed to add by a given cutoff date during the survey administration.
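The cross-file deduplication rule described above (contact each alum through the institution of their highest arts degree, breaking ties by most recent graduation year) can be sketched as a simple selection function. The field names and degree ranking are assumptions for the example.

```python
# Illustrative sketch of the cross-file deduplication rule described above:
# keep the record from the institution of the alum's highest arts degree,
# breaking ties by most recent graduation year. Field names and the degree
# ranking are assumptions, not SNAAP's actual data model.

DEGREE_RANK = {"doctoral": 3, "master's": 2, "bachelor's": 1}

def pick_contact_record(records):
    """Choose one record per alum from duplicates across population files."""
    return max(records, key=lambda r: (DEGREE_RANK[r["degree_level"]],
                                       r["grad_year"]))

dupes = [
    {"school": "School A", "degree_level": "bachelor's", "grad_year": 2015},
    {"school": "School B", "degree_level": "master's", "grad_year": 2018},
]
print(pick_contact_record(dupes)["school"])  # School B (higher arts degree)
```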
-
III.A.ii Updating Contact Information
Similar to the pilot administration, SNAAP engaged a third-party alumni contact search firm, AlumniSync, for the fall survey administration. Two institutions opted out of this process due to data sharing concerns. SNAAP shared a total of 1,044,084 alumni records with AlumniSync; of those, 796,959 records had an email address on file. AlumniSync was able to verify 88% of those alumni’s email addresses as “safe” to send. In addition to verifying email addresses, AlumniSync found an additional 151,535 new email addresses for alumni who only had a ‘.edu’ address (which the pilot study showed to have low response rates) and for alumni with a missing or invalid email address. At the end of AlumniSync’s search, a total of 823,791 alumni were considered to have valid email addresses on file. However, it is important to note that a number of those addresses considered valid bounced back during the recruitment stage (see below); the response rate was adjusted accordingly. After the first survey invitations were emailed to all alumni with a valid email address on file, 42,884 email addresses initially bounced back.
AlumniSync ran an additional validation check and was able to replace 13,999 of those emails with new emails considered safe to send.
III.B Response Reminders and Incentives
-
III.B.i Response Reminders
Throughout the survey administration, alumni received a series of recruitment messages via email. These messages included the link to the survey. Below is the full schedule of messages for the standard administration:
- Invitation: October 11–14
- Reminder 1: October 18–21
- Reminder 2: October 25–28
- Reminder 3: November 1–4
- Reminder 4: November 8–11
- Additional/final reminder: November 15–22
Both CSR and SNAAP closely monitored participant response rates and engagement throughout the administration. CSR programmed daily response rate updates that SNAAP could use to track participation. The response rates following the initial invitation email were not quite as high as SNAAP and CSR expected, and given the historical knowledge that the first invitations tend to bring the highest number of responses of any message, the team immediately brainstormed and implemented the following strategies for boosting response rates for the remainder of the administration:
- SNAAP developed and sent out a bulletin to institutional contacts to encourage them to promote the survey to alumni. The bulletin went out between reminders 1 and 2, in order to prevent survey messages from getting lost in alumni’s inboxes. The bulletin included text that schools could copy and paste into their social media accounts.
- SNAAP established a system with CSR that allowed schools to text the survey link to their alumni if they had the permission and interest to do so. CSR was able to send the school a list of alumni who had not yet taken the survey, and schools could text alumni directly, with guidance from an FAQ SNAAP developed.
- To increase reach, CSR programmed reminder 3 emails going out to “.edu” addresses to come from the “.edu” SNAAP email (snaapsrv@indiana.edu), and the reminder 3 emails going out to “.org” and “.com” addresses to come from the “.org” SNAAP email.
- For reminder 3, CSR conducted split-testing to determine whether a change in subject line affected response rates. The original subject line was: “Don’t miss out! Take part in a national arts alumni survey – FULL INSTITUTION NAME wants to hear from you!” The revised subject line was: “Don’t miss out – SHORT INSTITUTION NAME wants to hear from you”
- The first batch of reminder 3 messages were scheduled to go out using plain text, rather than HTML. However, CSR noticed a drop in responses, consistent with the pilot, when using plain text. Because our IRB status (exempt) allowed for changes to the recruitment strategy, SNAAP decided to send the rest of the reminder 3 messages out in HTML formatting.
- As noted above, SNAAP allowed institutions to upload addendums to their population files to include any alumni who were not originally listed in the recruitment list.
III.B.ii Incentives
Approximately 60 participating schools offered incentives to their alumni to take the 2022 SNAAP survey. This is a substantial increase from previous survey years, when approximately 10 schools offered an incentive. Incentives this year ranged from institutional swag to drawings for iPods, Amazon gift cards, and event tickets.
In addition to school incentive offerings, SNAAP offered direct survey incentives of $30 e‑gift cards to alumni respondents of institutions typically underrepresented in the SNAAP survey, including 1 tribal college, 3 HBCUs, and 7 community colleges.
III.C Response Rate
In total, SNAAP received over 61,000 alumni survey responses, representing an 11.22% average institutional survey response rate. To calculate response rates, SNAAP uses two processes:
- To calculate response rates for individual institutions, SNAAP uses the AAPOR RR6 method (The American Association for Public Opinion Research, 2016). The numerator includes all respondents (partials and completes). The denominator includes only the population that was actually contacted: alumni who had no institution-provided email on record, whose email bounced back, who never responded at the email address provided by our search vendor, or who were already included in another institution’s sample were removed from the population count (the denominator).
- To calculate the response rate for the overall administration, SNAAP averages the response rates of the participating institutions (each calculated as described above). The rationale is that response rates vary widely across institutions, with notable patterns related to institution size: larger institutions generally have lower response rates, a finding consistent with higher education research using surveys across multiple institutions (for example, see the 2023 National Survey of Student Engagement (NSSE) Overview and NSSE 2023 U.S. Response Rates by Institutional Characteristics). Given this pattern, an institutional average or median better reflects overall respondent behavior. Applying the AAPOR RR6 method to the entire contacted population would bias the response rate toward the larger institutions, because their greater share of non-respondents would occupy a disproportionate amount of the denominator compared to smaller institutions.
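The difference between the two calculations above can be sketched as follows. The institution names and counts here are purely illustrative, not actual SNAAP figures; the example only demonstrates why a pooled RR6-style rate is pulled toward the largest institutions while the institutional average is not.

```python
# Illustrative sketch (hypothetical counts, not SNAAP data):
# institution-averaged response rate vs. a single pooled rate.
institutions = {
    "small":  {"respondents": 150,   "contacted": 1_000},
    "medium": {"respondents": 600,   "contacted": 6_000},
    "large":  {"respondents": 2_000, "contacted": 40_000},
}

# Per-institution rates: respondents (partials + completes) over
# contactable alumni only (bounced/missing emails already removed).
rates = {name: d["respondents"] / d["contacted"]
         for name, d in institutions.items()}

# SNAAP-style overall rate: the mean of the institutional rates.
average_rate = sum(rates.values()) / len(rates)

# Pooled rate: all respondents over all contacted alumni.
pooled_rate = (sum(d["respondents"] for d in institutions.values())
               / sum(d["contacted"] for d in institutions.values()))

print(f"average of institutional rates: {average_rate:.3f}")  # 0.100
print(f"pooled rate: {pooled_rate:.3f}")                      # 0.059
```

Because the large institution contributes most of the denominator, the pooled rate (about 5.9% here) sits far below the institutional average (10%), illustrating the bias the institutional-average approach avoids.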
In addition to the overall response rate, CSR and SNAAP tracked response rates in various ways to learn about the effectiveness of recruitment strategies. For instance, compared to the overall adjusted response rate, the response rate for the small group of alumni who were added via addendum population files during the survey administration was about 18%. Additionally, 87% of survey responses came from contacting the alum’s primary email address, 6% from a secondary email, and 6% from an additional email that AlumniSync identified during the validation check. Thirty-four percent of responses came from the initial survey invitation email. As noted earlier, responses typically decline with each recruitment message, with a bump during the final reminder. We saw this trend; however, we also saw a substantial increase in responses from the reminder 2 message, which accounted for 23% of total survey responses. This increase could be attributed to the additional bulletin SNAAP sent between reminders 1 and 2. By incentive type, respondents who were offered an incentive responded at a rate about two percentage points higher than those who were not.
III.D Data Considerations and Preparations
Upon the close of the standard survey on November 28, 2022, CSR compiled survey responses and shared them with SNAAP about one month later. After receiving the data, SNAAP undertook a strategic and intensive data cleaning period, guided by insights from previous administrations, the new questionnaire structure, and needs related to the creation of Institutional Reports. Below, we note a few important considerations and preparations SNAAP implemented, as these are especially relevant to the creation of the national, aggregated dataset:
- Alumni respondents from the pilot survey were merged into the larger sample of respondents from the standard survey administration. Because slight revisions were made to the phrasing of select survey items and response options between the pilot and the standard administration, items were cleaned to match the phrasing of the standard administration. Pilot participants are flagged in the dataset using the variable “AdminType,” in which 0 represents pilot cases and 1 represents standard administration cases.
- Given the complex nature of the skip logic in the 2022 questionnaire, considerable time and attention were devoted to:
- Ensuring that the skip logic worked correctly in routing the correct subgroups to the correct questions
- Using value labels to indicate instances in which subgroups of alumni were filtered out of a question. When only one layer of skip logic influenced a particular item, a single value label (‘-1’) marked alumni who did not receive the question. When several layers of skip logic were in use, multiple negative values distinguished the different reasons why subgroups of alumni did not receive the item. The survey questionnaire PDF includes the full set of reasons why alumni did not receive each question, and the data should be interpreted according to the skip logic.
- Due to the large variation in CIP codes provided for participating alumni, SNAAP created a variable to categorize areas of study in a more condensed manner. Consistent with previous administrations, this variable, called ReportArtsMajor, has only 25 categories. SNAAP used previous categorization lists to group CIP codes into ReportArtsMajor categories based on their official program titles and descriptions. A small percentage of alumni (6%) were assigned a CIP code in the school-reported population files that fell outside SNAAP’s sampling frame for arts or design alumni. For these alumni, if an arts or design major was on file, that major was used to assign the ReportArtsMajor category; as a result, only 2% of the sample were assigned a non-arts or design ReportArtsMajor. In most instances, these remaining cases are not clearly labeled as arts or design, but likely graduated from a program that was adjacent to arts or design, included arts or design-related courses, or was otherwise considered within the arts or design by their institution.
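The ReportArtsMajor assignment logic in the last bullet can be sketched roughly as below. The CIP codes, category names, keyword list, and function name here are all hypothetical illustrations, not SNAAP’s actual crosswalk; they only show the precedence of CIP code first, school-reported major second.

```python
# Hypothetical sketch of the ReportArtsMajor fallback logic; codes and
# categories are illustrative, not SNAAP's actual 25-category crosswalk.
CIP_TO_CATEGORY = {
    "50.0101": "Fine and Studio Arts",
    "50.0501": "Drama/Theatre Arts",
    "50.0901": "Music",
}
ARTS_MAJOR_KEYWORDS = ("art", "design", "music", "theatre", "dance")

def report_arts_major(cip_code: str, major_on_file: str) -> str:
    """Map a CIP code to a condensed category; fall back to the
    school-reported major when the CIP code is outside the frame."""
    if cip_code in CIP_TO_CATEGORY:
        return CIP_TO_CATEGORY[cip_code]
    # CIP code outside the arts/design sampling frame: use the
    # school-reported major if it looks like an arts/design program.
    if any(k in major_on_file.lower() for k in ARTS_MAJOR_KEYWORDS):
        return major_on_file
    return "Non-arts/design"  # the remaining ~2% of cases

print(report_arts_major("50.0901", "Music Performance"))       # Music
print(report_arts_major("09.0102", "Graphic Design"))          # Graphic Design
print(report_arts_major("52.0201", "Business Administration")) # Non-arts/design
```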
IV. Sampling Design and Weighting
IV.A CIP Codes
Through a collaborative process with postsecondary administrative leaders and NORC at the University of Chicago, SNAAP identified the following Classification of Instructional Programs (CIP) codes from the U.S. Department of Education’s National Center for Education Statistics’ Integrated Postsecondary Education Data System (IPEDS) to serve as the basis of the degree program areas used to create weights for its aggregated national dataset:
IV.B Recruitment of Institutional Participants
SNAAP led an extensive recruitment process to secure the participation of institutions in the 2022 survey. In the year leading up to survey administration, SNAAP developed a comprehensive marketing and communications plan, launched a new public-facing brand and website, and published a SNAAP Casebook featuring testimonials from administrators, staff, and faculty at previously participating institutions, to bolster recruitment efforts and emphasize the survey’s value to constituents (Strategic National Arts Alumni Project, 2021). SNAAP focused recruitment efforts on institutions selected for the sample from the sampling frame (further details are in Appendix D, “Sample Weighting Procedures,” of this report). Registration for survey participation occurred between September 2021 and early August 2022, and recruitment efforts included direct email outreach, phone calls, and follow-up emails from members of the SNAAP staff and board. Multiple email campaigns advertising SNAAP participation were sent to the SNAAP mailing list, and direct outreach was made to several professional associations to reach larger networks of prospective participants. Throughout the recruitment process, SNAAP tracked the level of difficulty in recruiting each institution, categorizing institutions as easy, mid-level, or difficult accordingly. Many institutions were granted free or reduced-cost participation to reduce barriers to participation, and discounts were also available upon request to any institution expressing financial hardship, regardless of sampling frame selection.
V. Considerations in Context of Prior SNAAP Surveys
This section highlights several matters relevant to the 2022 SNAAP survey administration, data, and analyses in relation to prior SNAAP survey administrations, data, and analyses. Given the notable changes made to the sampling design and updates made to the questionnaire, data from the 2022 survey administration are not directly comparable to data from prior SNAAP survey administrations. Key changes include, but are not limited to, the following:
- SNAAP’s first use of a sampling strategy and survey weights to bolster the generalizability of insights.
- Several subsets of questions on the 2022 SNAAP questionnaire are asked only of those who graduated within the last 25 years from the participating institution.
- Modifications to the question wording and order used in prior SNAAP administrations.
- A specific change of note is that participating institutions had the option to have select questions reference either the institution, a specific college or school within the institution, or a specific department within the institution in which alumni participating in the survey had studied or trained (see references to “[INSTITUTION2]” within Appendix A), whereas references were made only to the institution in prior SNAAP administrations. Based on insights from prior administrations and engagements with institutions throughout the 2022 SNAAP recruitment process, it was apparent that alumni may hold different perceptions of their overall postsecondary institution than of the specific college, school, or department in which they studied or trained.
Additional reference documentation on prior SNAAP survey administrations is available at www.snaaparts.org.
Appendices
References
Bentrim, E. M., & Henning, G. W. (Eds.). (2022). The Impact of a Sense of Belonging in College. Sterling, VA: Stylus Publishing.
Buchholz, L., Fine, G. A., & Wohl, H. (2020). Art markets in crisis: How personal bonds and market subcultures mediate the effects of COVID-19. American Journal of Cultural Sociology, 8(3), 462–476. doi:10.1057/s41290-020-00119-6
Daikeler, J., Bošnjak, M., & Lozar Manfreda, K. (2019). Web Versus Other Survey Modes: An Updated and Extended Meta-Analysis Comparing Response Rates. Journal of Survey Statistics and Methodology, 8(3), 513–539. doi:10.1093/jssam/smz008
de Koning, R., Egiz, A., Kotecha, J., Ciuculete, A. C., Ooi, S. Z. Y., Bankole, N. D. A., … Kanmounye, U. S. (2021). Survey Fatigue During the COVID-19 Pandemic: An Analysis of Neurosurgery Survey Response Rates. Front Surg, 8, 690680. doi:10.3389/fsurg.2021.690680
Morgan, R. E., Dragon, C., Daus, G., Holzberg, J., Kaplan, R., Menne, H., … Spiegelman, M. (2020). Updates on Terminology of Sexual Orientation and Gender Identity Survey Measures. Retrieved from https://nces.ed.gov/fcsm/pdf/fcsm_sogi_terminology_fy20_report_final.pdf
Munnelly, K. (2022). Motivations and Intentionality in the Arts Portfolio Career: An Investigation into How Visual and Performing Artists Construct Portfolio Careers. Artivate: A Journal of Entrepreneurship in the Arts, 11(1). https://doi.org/10.34053/artivate.11.1.163
Novak-Leonard, J. L., Dempster, D., Scotto Adams, L. A., & Walters, E. (2022). Data, Pressing Needs, and Biggest Challenges: Insights from the Field. Retrieved from Austin, TX: https://snaaparts.org/uploads/downloads/2022-SNAAP-Focus-Group-Report-with-cover.pdf
Skaggs, R. (2023). Socially distanced artistic careers: Professional social interactions in early, established, and late career stages during COVID-19. Poetics, 101769. https://doi.org/10.1016/j.poetic.2023.101769
Skaggs, R., Hoppe, E. J., & Burke, M. J. (2022). Out of Office: The broader implications of changing spaces and places in arts-based work during the COVID-19 pandemic. In I. Woodward, J. Haynes, P. Berkers, A. Dillane, & K. Golemo (Eds.), Remaking Culture and Music Spaces: Affects, Infrastructures, Futures (pp. 88–101). Oxfordshire, England: Routledge.
Strategic National Arts Alumni Project. (2021). Casebook: Interviews from the Field. Retrieved from Bloomington, IN: https://snaaparts.org/uploads/downloads/snaap-casebook-2022.pdf
The American Association for Public Opinion Research. (2016). Standard Definitions: Final Dispositions of Case Codes and Outcome Rates for Surveys. Retrieved from https://www-archive.aapor.org/AAPOR_Main/media/publications/Standard-Definitions20169theditionfinal.pdf
U.S. Census Bureau. National Survey of College Graduates (NSCG). Retrieved from https://www.census.gov/programs-surveys/nscg.html
The Authors
Dr. Jennifer L. Novak-Leonard
Research Associate Professor, College of Fine and Applied Arts at the University of Illinois Urbana-Champaign
Dr. Deanna Ibrahim
Director of Research Services, SNAAP
Lee Ann Scotto Adams
Executive Director, SNAAP
Dr. Angie L. Miller
Associate Research Scientist, Indiana University Bloomington and SNAAP Senior Scholar
Dr. Rachel Skaggs
Lawrence and Isabel Barnett Assistant Professor of Arts Management at The Ohio State University
Dr. Shanita Bigelow
2022 – 2023 Postdoctoral Fellow, Arts Impact Initiative, University of Illinois Urbana-Champaign