A cottage industry has sprung up around the academic job search process, as evidenced by numerous consulting services and guidebooks, such as The Professor Is In (Kelsky, 2015), suggesting that some graduate students are finding insufficient support from their institutions and degree programs. Little is known about the scale and scope of academic career preparation and professional development programs offered by universities. Even less is known about the effect such programs have on student career readiness or which program elements have positive short- and long-term effects on students’ academic career path success (Diggs et al., 2017; Schram et al., 2017), because graduate student professional development program evaluation data are rarely shared beyond one’s institution.

Instead, data about such professional development offerings for graduate students, if collected at all, are commonly reported internally as part of local institutional or departmental program evaluation activities (e.g., student exit surveys). This reduces the opportunity for new and experienced educational developers tasked with developing such programs to engage in cross-program and cross-institution comparison in order to identify high-impact practices and models best suited to their institutional needs. This study seeks to help fill this gap and advance our understanding of the value of such programs by providing participant perception and employment data, as well as program design data, about a centrally offered academic careers preparation program at a research-intensive university. The program description and results from this longitudinal case study can be used to inform the evidence-based decision-making of educational developers and administrators at other institutions seeking to develop or improve their academic career preparation offerings for graduate students.

Background

In the 1960s, researchers began to describe and name the concept of a “hidden curriculum” (behavioral expectations that are unrelated to intellectual development) operating in social learning environments, such as K–12 schools (Jackson, 1968). In the 1980s and 1990s, sociologists began to re-examine the graduate degree experience through a socialization lens. Specifically, efforts were made to parse out the hidden curriculum of graduate education in sociology (Margolis & Romero, 1998) and to examine how the socialization of graduate students into academic culture and academic career paths (Gaff & Lambert, 1996) created barriers that reduced both the diversity of people and ideas included and innovation in the field of study. More recently, researchers such as Gardner (2008) and Bagaka et al. (2015) pointed to doctoral program attrition (and thus academic career employment) being related to a lack of socialization in a given doctoral disciplinary program or to a disconnect between a student’s “fit” and the socialization pattern of a given discipline or department/institution. Gardner (2008) found evidence that this was particularly salient for “women, students of color, students with [children], part-time students, and older students.”

A large-scale effort to work toward the diversification of the academy and to prepare future faculty to be successful academy members and instructors was the Preparing Future Faculty (PFF) initiative. PFF was led by the Association of American Colleges and Universities and the Council of Graduate Schools and spanned from 1993 to 2012 (Applegate, 2002; Preparing Future Faculty, n.d.). The PFF initiative produced locally designed and implemented PFF programs at hundreds of institutions in the United States and involved hundreds more as partnering institutions through the mid-2010s. These programs aimed to socialize graduate students in the value of undergraduate education and the roles and responsibilities of being a faculty member and, in particular, to intentionally train future faculty participants by developing their pedagogical knowledge and skills (Preparing Future Faculty, n.d.).

While evaluation reports from individual participating PFF institutions reported positive outcomes (e.g., DeNeef, 2002) and guidelines for designing and assessing PFFs were created (Winter et al., 2018), generalizable outcomes from across participating PFFs remain less clear due to a lack of standardization for program content, format, and assessment (Diggs et al., 2017). As the PFF initiative expanded over two decades, official PFFs or unofficial spin-offs (e.g., Preparing for the Professoriate) were found to take many programmatic formats (e.g., a single event vs. an ongoing learning community) and involve different degrees of collaboration with partners from different types of institutions (e.g., participant teaching-related experiences at or speakers only from a research-intensive university vs. a community college, a small liberal arts college, etc.). Additionally, programs evolved from the original PFF model goals to institution- or department-specific goals, often focused on increasing faculty diversity and improving inclusion (Diggs et al., 2017).

Since the PFF movement started nearly three decades ago, the landscape of higher education has changed dramatically (Finkelstein et al., 2016), raising the question of how PFF-type programs should adapt to these changes in higher education and evolving student needs. Among the many changes in higher education, three have been identified by Rozaitis et al. (2020) as being particularly relevant to PFF-type program design and implementation. The first change is the expectation and integration of advanced technologies in teaching and learning. If a core learning outcome of PFF-type programs is to prepare graduate students to be successful instructors, then programs should include experiential learning and theory related to emerging technologies (e.g., learning analytics, active learning classrooms). The second change is the decline in traditional tenure-track positions due to the expansion of and reliance on a contingent workforce for core instructional operations. As competition for a limited number of tenure-track positions continues to rise, terminal degree seekers (e.g., doctoral students) across disciplines, who are primarily seeking tenure-track positions, are increasingly making plans for non-academic career paths with their doctoral degree credentials or seeking alternative career paths (alt-ac) within the academy. Third, there has been a shift in how institutions address issues of diversity, equity, and inclusion (DEI) in the professoriate, including calls to diversify the professoriate through structural and social changes (Martinez-Acosta & Favero, 2018). As institutions around the world increasingly prioritize DEI efforts, PFF-type programs need not only to incorporate research-based practices to recruit and support a more diverse set of participants but also to include content addressing these topics from the perspective of faculty roles and responsibilities.

Purpose of the Study

This study seeks to advance our understanding of the value of professional development programs designed to prepare doctoral and MFA students for academic careers. In this article, we describe one institution’s multidisciplinary cohort-based year-long Preparing for Academic Careers program for graduate students near the end of their doctoral or MFA degree and present results from our decadal program evaluation (2011–2021). Program outcomes and lessons learned may inform others designing similar programs and suggest future directions for such programs. Guiding research questions as we evaluated our program included:

  1. Is the program successful as measured by employment of alumni in academic positions?

  2. What are the key program components (interventions) that are perceived by participants as being the most valuable and why?

  3. Is the institution-level multidisciplinary design of the cohort perceived by participants as valuable? (In other words, is a centralized, non-discipline-specific program useful or should we consider shifting to supporting a network of decentralized, discipline-specific programs?)

  4. What needs to change moving forward to continue meeting our students’ needs?

Program Design

Setting and Participants

The program was designed and implemented at a large research-intensive public university located in the southeastern United States. At the beginning of Fall 2021, the institution enrolled approximately 39,000 students across all campuses, of which almost 11,000 (29% of all enrolled students) were degree-seeking graduate students. According to the university’s Office of Institutional Effectiveness and Planning (2021) data, about 40% of graduate students were enrolled full time and 60% identified as female (response options only included “male” and “female”). The same graduate population included a diverse representation of students: 45% White American; 15% Non-Resident Alien; 11% Asian American; 11% African American; 9% Hispanic American; 7% whose race was not reported; 1% who identify with two or more ethnicities; and less than 1% Native American, Pacific Islander, and “foreigners abroad” combined.

The program itself included 154 participants in 10 cohorts between AY2011–2012 and AY2020–2021; however, participant demographic data could be obtained from institutional data for only 145 of the participants. While the program is open to students nearing completion of either an MFA or a doctoral degree, the overwhelming majority of participants (94%) were seeking a doctoral degree. The majority of participants (62%) self-identified through graduate admissions data as female (38% as male; it is unclear whether other options were provided for respondents at the time) and as White (58%, with 28% unknown, 6% Black, 3% Asian, 3% two or more ethnicities, 1% Hispanic, and 1% Pacific Islander). The age of participants while enrolled in the program varied, with the majority aged 25–40 (37% aged 25–30, 36% aged 31–40, 18% aged 41–50, 9% aged 51–60, and 0% aged 60+). Participants came from across the institution’s colleges and schools, with the majority seeking degrees in non-STEM disciplines (68% non-STEM, 32% STEM). At the time of this study (July 2021), the majority (68%) had been awarded degrees (91 PhD, six MFA, and one DMA).

Theoretical Framework

The program design is broadly rooted in social constructivist theory, which explains learning as a function of the interaction between the learner, others, and the environment (Bandura, 1986). As such, a cohort-based structure was selected because cohort-based curriculum models create an intentional community social structure that supports the development of participant feelings of connectedness and facilitates learning (McCarthy et al., 2005; Swayze & Jakeman, 2014). A multidisciplinary cohort model, one in which the cohort is composed of students from across the institution’s colleges and schools, was selected because disciplinarily diverse learning ecosystems have been shown to improve information exchange and decision-making (Gruenfeld et al., 1996) and to prepare students to see the whole academy. The program design is also rooted broadly in theories of experiential learning, in which students learn by engaging in practice (Dewey, 1938). Namely, program activities were designed to involve students working in real, applied, and personal scenarios in which they produced workplace-related materials through a reflective process (e.g., producing a job application portfolio for an actual position). Lastly, program activities also included aspects of reciprocal peer learning (Boud et al., 1999), likewise founded on socio-constructivist theory, in which students teach and learn from one another (e.g., pairs giving each other written or oral feedback on oral presentations and written products).

Program Description

As the program name suggests, primary program goals included increasing graduate students’ knowledge of the academy and academic career paths (e.g., types of academic institutions and their general organization, the variety of academic career paths, what life as a faculty member is like, and strategies for transitioning from a student to a faculty role); their pedagogical knowledge and skills in articulating a teaching philosophy and pedagogical approaches; and their knowledge of the general academic job application and selection process (including how to strengthen their communication skills and distinguish themselves in the academic job market). The primary output of the program was the production of an academic job application portfolio (cover letter, curriculum vitae [CV], teaching philosophy statement, research statement, sample course syllabi). From its start in 2011, the program was designed as a multidisciplinary cohort model in which students applied from across the institution’s colleges and schools, with limited spaces available for each academic unit to ensure a disciplinarily diverse cohort annually. Cohorts ranged in size from 9 to 19, with an average of 16. To qualify, students had to be in the last or second-to-last year of their doctoral or MFA degree (if considered the terminal degree for a given discipline) at the time of the program’s start in order to maximize alignment with job application timing. The application typically consisted of degree progress questions, a statement of interest, and an advisor letter of support. Applications were reviewed by a committee of faculty and administrators who prioritized the disciplinary and demographic diversity of the cohort, yielding an average annual acceptance rate of 69% (range: 53%–90%). Students were not selected if they submitted an incomplete application, were too early in their program, or came from a discipline overrepresented in the applicant pool. While the operationalization of the core program goals may have varied slightly over the 10 years reviewed in this study, the program followed a consistent year-long (fall and spring semesters) format, with the fall semester offered as a one-credit course paid for by the Office of the Provost and the spring semester a required but non-credit-bearing part of the program. The program is described in more detail in Table 1.

Table 1.

Preparing for Academic Careers Program Summary

Fall semester (1-credit course)

Typical format: Regular meetings (3 hours; every 2–3 weeks); attending the institution’s pedagogy conference; assignments (academic job application portfolio: CV, teaching philosophy statement, research statement); in-class activities (discussion; guest speakers; peer review of teaching philosophies, syllabi, etc.; sharing example job ads or interview questions; bringing an example of the scholarship of teaching and learning in one’s discipline); readings; discussion board posts

Key topics (fall semester or across semesters): Higher education in the 21st century; job application process, interviews, and job talks; preparing to teach, perform scholarship, and engage in service; teaching philosophies, course planning, and classroom management; the scholarship of teaching and learning; creating a competitive academic record; faculty roles across institutions

Spring semester (no course credit)

Typical format: One-on-one consultations; mock interviews or job talks; meetings/panels/workshops on cohort-requested topics (e.g., grant writing, alumni panel); celebratory luncheon with the provost or associate provost*

Key topics: Well-being and the academic career; professional branding and networking; writing an effective CV; writing effective cover letters and research statements

* Until the COVID-19 disruption in 2020

Methods

A descriptive mixed methods case study approach was used, with the program as the case (Yin, 2014), to gain a more in-depth understanding of participant success and perceptions. Publicly available data about past participants’ employment and alumni survey data (Lukes, 2021) were the main sources of data analyzed, as interviewing each student was cost-prohibitive. A convergent design was used for the mixed methods approach, in which qualitative and quantitative data were analyzed separately and then merged as a “new” data set that was further interrogated and interpreted using the research questions listed in the “Purpose of the Study” section. This method also strengthened the trustworthiness of interpretations through data triangulation.
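As a concrete but purely illustrative sketch of the merge step in this convergent design, the snippet below assumes each respondent’s quantitative ratings and qualitative theme codes are keyed by an anonymized ID; the IDs, field names, and values are hypothetical placeholders rather than the study’s actual data.

```python
# A minimal sketch of the convergent-design merge step.
# IDs, field names, and values are illustrative placeholders, not study data.

quantitative = {
    "R01": {"helpfulness_clarifying_goals": "very helpful", "belonging": "increased"},
    "R02": {"helpfulness_clarifying_goals": "somewhat helpful", "belonging": "stayed the same"},
}

qualitative = {
    "R01": ["information source / self-efficacy", "social network / support"],
    "R02": ["reflection tool / professional identity"],
}

# Merge the two strands into one record per respondent (the "new" data set),
# which can then be re-read once per research question.
merged = {
    rid: {**quantitative.get(rid, {}), "themes": qualitative.get(rid, [])}
    for rid in sorted(set(quantitative) | set(qualitative))
}

for rid, record in merged.items():
    print(rid, record)
```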

Program Alumni Employment Data

Alumni employment was determined through systematic searches of two public data sets (LinkedIn and the broader internet). The second author searched LinkedIn for each participant’s name to determine their current employment position. If a participant did not appear in the LinkedIn search, a Google search was conducted to locate their current employment position. A participant’s position was recorded as “not found” if it could not be identified through either search. These data were then qualitatively coded by the second author using the following categories: academic (positions at postsecondary institutions, including assistant professor, associate professor, administrative faculty or staff [e.g., program coordinator], full-time instruction, full-time research, part-time instruction, part-time research, and part-time administration); government (K–12 positions were also included in this category); non-profit; industry; and not found. The data coded as “academic” were further coded into “position role” categories deduced from recorded position titles. All classifications were reviewed by the first author, and any discrepancies between researchers were discussed until agreement was reached. Categories were totaled for frequency counts and population percentages (Table 2).
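To illustrate the tallying step, the sketch below shows how coded occupation labels might be totaled into frequency counts and percentages of the kind reported in Table 2; the category labels follow the description above, but the records themselves are placeholders, not the study’s data.

```python
from collections import Counter

# Illustrative coded employment records: one primary-occupation label per alum.
# These values are placeholders, not the study's actual data.
coded_positions = [
    "academic", "academic", "academic", "industry",
    "government", "non-profit", "not found",
]

counts = Counter(coded_positions)
total = len(coded_positions)

# Frequency counts and population percentages, as summarized in Table 2.
for category, n in counts.most_common():
    print(f"{category}: {n} ({n / total:.0%})")
```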

Table 2.

Employment Status of Program Alumni Awarded Degrees at Time of Study (n = 98)

Primary occupation: number of alums (% of all alumni). Academic role categories: number of alums (% of alumni with academic positions). Academic role subtypes: number of alums (% of alumni within the academic role category).

Academic: 52 (53%)
  Assistant professor: 16 (31%)
  Associate professor: 3 (6%)
  Administrative faculty or staff: 7 (13%)
  Full-time instruction (e.g., term faculty): 4 (8%)
  Full-time research: 15 (29%)
    Researcher/associate: 8 (50%)
    Postdoctoral researcher: 8 (50%)
  Part-time instruction (e.g., adjunct): 7 (13%)
  Part-time research: N/A
  Part-time administrative: N/A
Industry: 16 (16%)
Government: 10 (10%)
Non-profit: 6 (6%)
Not found: 10 (10%)
Listed as graduate student role: 4 (4%)

Program Alumni Survey Data

A 24-question survey (see Lukes, 2021) that included both quantitative (e.g., Likert-like scales) and open-ended response questions was created in Qualtrics and sent in the summer of 2021 to all alumni who had a working email address (140, or 91%, of all 154 alumni). Broadly, the alumni survey focused on retrospective participant perceptions of program benefits and present-day reflections identifying any long-term benefits that occurred after program participation. The survey asked alumni to rate the overall helpfulness of the program in clarifying and reaching their professional goals and applying for academic jobs; how helpful individual program elements were in preparing them for academic positions; and how participation affected their sense of belonging in the academy and their discipline, their confidence as an instructor and researcher, their ability and comfort communicating their work, and their interest in cross-disciplinary work. The survey also included questions exploring participant engagement with the program cohort community over time and optional demographic questions (gender, ethnicity, discipline, and cohort). Responses to quantitative questions were analyzed using basic descriptive statistics (frequency counts, percentages). Responses to open-ended questions were qualitatively analyzed using a multi-step coding process: initial concept codes were code mapped into categories and then pattern coded, evolving into metaphorical themes (Miles et al., 2014; Saldaña, 2016). These quantitative and qualitative data sets were then examined as a single data set multiple times, each time applying the lens of one research question.
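The short sketch below illustrates, under assumed item wording and code labels, what these two analysis strands can look like in practice: frequency counts for a Likert-type item, and a concept-code-to-category-to-theme mapping for open-ended responses. All names and values here are hypothetical, not the survey’s actual instruments or data.

```python
from collections import Counter

# Placeholder Likert-type responses for a single quantitative item.
likert_item = ["very helpful", "very helpful", "somewhat helpful", "slightly helpful"]

# Basic descriptive statistics: frequency counts and percentages.
for option, n in Counter(likert_item).most_common():
    print(f"{option}: {n} ({n / len(likert_item):.0%})")

# Multi-step qualitative coding: concept codes are mapped to categories, and
# categories are pattern coded into broader themes (cf. Table 3).
code_to_category = {
    "gave me confidence": "self-efficacy gains",
    "clarified my goals": "identity and goal development",
}
category_to_theme = {
    "self-efficacy gains": "program as an information source",
    "identity and goal development": "program as a reflection tool",
}

concept_codes = ["gave me confidence", "clarified my goals", "gave me confidence"]
theme_counts = Counter(category_to_theme[code_to_category[c]] for c in concept_codes)
print(theme_counts)
```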

Results and Discussion

Current employment data were found through public data sets for 87% (n = 136) of the 154 program alumni. This data set was further reduced to examine only those who had been awarded degrees at the time of this study (n = 98; Table 2). A quarter of program alumni responded to the alumni survey, a solid response rate for an alumni survey of this kind. The survey respondent population (n = 39, or 25%) was closely representative of the total alumni population in terms of demographics (described previously). Of all respondents, 56% self-identified as White and 54% as women. Due to staff capacity limitations, survey data were not analyzed by gender or ethnicity. Each cohort was evenly represented in the respondent pool, and there was representation across the different colleges and schools, with the majority of respondents (59%) coming from non-STEM disciplines. The self-reported primary occupation of survey respondents was similarly representative of the larger alumni population and included academic (54%), government (8%), non-profit (13%), industry (5%), and no response (13%). Respondents also self-reported their current position as full-time tenure-track/tenured (15%), full-time non-tenure-track (15%), postdoctoral (8%), other academic (10%), staff position at a university or college (10%), part-time non-tenure-track (5%), non-academic (18%), and no response (18%). Broadly, the responses to quantitative questions on the alumni survey indicate that participants found the program beneficial in retrospect, with all reporting it was helpful to some degree for clarifying and reaching their professional goals, including 67% and 59% reporting it was “very helpful” for clarifying and reaching goals, respectively. Additionally, when asked if they would recommend the program to others, 85% of survey respondents answered “yes.” Additional survey results are discussed in this section and reported in Figure 1 as well as in Tables 2, 3, and 4.

Figure 1. Participant Survey Responses About Program Impact

Is the Program Successful as Measured by Employment of Alumni in Academic Positions?

From the publicly available data sets, the majority of all program alumni (58% of all 154 alumni) were found to be currently holding an academic position, 74% of which could be identified as full-time positions. When the data were reduced to only those alums who had also been awarded degrees at the time of this study (n = 98), similar results were found, with 53% currently holding academic positions (Table 2). However, only 20 of these (38% of alums with academic positions and awarded degrees; 20% of all the alums awarded degrees) were classified as holding postsecondary positions with the traditional role titles of assistant or associate professor.

A 2020 National Science Foundation (NSF) report documented that 40% of all doctorate recipients from U.S. institutions with definite employment commitments (excluding postdoctoral fellowships) reported that their job would be an academic one (National Center for Science and Engineering Statistics, 2021). Even with removing postdoctoral positions from the classification of “academic position” in this study’s data to align more closely with the NSF approach, 45% of participants who were awarded degrees are still classified as being in academic positions, suggesting that this program may offer some advantage when compared to national trends. However, it is difficult to compare with any certainty, as the national data do not distinguish between full-time and part-time positions or the nature of the academic positions (instructional, traditional tenure-track, staff, etc.).
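As a rough check on this figure, the one-line calculation below reproduces the 45% using counts read off Table 2 (52 degreed alumni in academic positions, 8 of them postdoctoral researchers, 98 degreed alumni in total); it is a back-of-the-envelope illustration based on those table values, not a recomputation from raw data.

```python
# Back-of-the-envelope check of the 45% figure, assuming Table 2's counts.
academic, postdocs, degreed = 52, 8, 98
print(f"{(academic - postdocs) / degreed:.0%}")  # ~45%
```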

It is also important to note that study data suggest the goals of some participants may have evolved away from seeking academic positions; for example, “I did not end up having a career in academia. However, this program helped me feel more confident in pursuing that path if I had chosen it.”

What Are the Key Program Components (Interventions) That Are Perceived by Participants as Being the Most Valuable and Why?

The qualitative analysis of the open-ended alumni survey questions yielded four major models of how alumni described the beneficial nature of the program: as an information source to develop self-efficacy; as a reflection tool to develop professional identity and goals; as a social network to provide feedback and emotional support; and as an experiential learning opportunity to practice and develop skills (see Table 3). These themes are further supported by the quantitative survey data, as discussed in the following subsections.

Table 3.

Participant Perceptions of the Program Benefits

1. Program as an information source to develop self-efficacy (16 coded responses)

Description: Participants report gains in confidence or comfort levels due to gaining knowledge about the job application process, specifics of job application materials, or through skill development.

Example: “It gave me confidence during my job search by equipping me with the right knowledge and training to face the job market.”

2. Program as a reflection tool to develop professional identity and goals (13 coded responses)

Description: Participants report acts of reflection or developing a better sense of their professional identity, aspirations, or perspectives on career paths in the academy.

Example: “The program helped me expand my goals and visions of wanting to be in the academic field including other disciplines.”

3. Program as a social network to provide feedback and emotional support (6 coded responses)

Description: Participants report feelings of support from peers or the instructor or describe activities that are supportive in nature (e.g., troubleshooting scenarios, people sharing personal journey stories).

Example: “One of the greatest benefits of the program is the sense of community during arguably the most challenging phase of one’s PhD life: the stress of finishing up one’s dissertation, looking for a job, and sometimes also taking care of one’s family. Having the encouragement from one another was a tremendous help during this period.”

4. Program as experiential learning opportunity to practice and develop skills (13 coded responses)

Description: Participants report developing a skill or quality of writing product (e.g., job application materials) as a result of participation.

Example: “I took the lessons learned (and materials generated) from the mock application exercise and the interview conducted and applied it directly to my own job search.”

Program as an Information Source to Develop Self-Efficacy

Self-efficacy is rooted in social cognitive theory and is commonly thought of as a person’s belief or confidence in their ability to behave in ways that yield specific performance outcomes (Bandura, 1977, 1986, 1997). As a construct, self-efficacy is thought to vary according to contextual circumstances. In this context, we explore self-efficacy in terms of participants’ belief in their ability to be a qualified candidate for academic positions and to become professional members of the academy.

Evidence of self-efficacy, expressed through comments about developing confidence (e.g., “gave me the confidence”) or feelings of increased preparation (e.g., “prepared me for”), was most common across responses to the open-ended questions. These feelings of confidence and preparation were often explicitly linked to an increase in knowledge gained from the program’s activities. Several respondents also alluded to the program revealing and filling in gaps of the hidden curriculum of academic professionalization (Gaff & Lambert, 1996; Margolis & Romero, 1998) in comments such as “[the program] provides content not found anywhere else in a PhD curriculum.”

This notion of the program serving as an information source that in turn built participants’ sense of academic self-efficacy is further supported by the alumni survey data. Knowledge-building activities were identified by alumni survey respondents as among the most helpful program elements in preparing them for an academic position. Specifically, 74% identified cohort meetings and 69% identified reading materials as among the most helpful program elements (Table 4). This was in response to a “check all that apply” question, so it’s difficult to interpret which activity was valued more or less by participants, but the high selection rate of each item suggests that opportunities to gain knowledge were highly valued. Lastly, the majority of alumni (72%) reported that they felt “completely” or “mostly” prepared to apply for academic jobs based on what they had learned from participating in the program.

Table 4.

Which of the Following Was/Were the Most Helpful for Preparing You for an Academic Position?

Program element: number of respondents selecting it (multiple-response question), with percentage of total respondents (n = 39).

Creating an academic career portfolio (cover letter, CV, etc.): 35 (90%)
Cohort meetings: 29 (74%)
Course reading materials: 27 (69%)
Individual meetings with instructor: 19 (49%)
Peer review activities: 16 (41%)
Attending the Innovations in Teaching & Learning conferences: 12 (31%)
Conducting an aspirational interview/colleague conversation: 10 (26%)
Practice job talk: 9 (23%)
Mock job interviews: 8 (21%)
Buddy groups or informal conversations with classmates: 5 (13%)
Other: 5 (13%)
Discussion board activities: 4 (10%)
No response: 0 (0%)

Program as a Reflection Tool to Develop Professional Identity and Goals

Several respondents to the alumni survey described the program’s value in terms of the self-reflection evoked by program activities, as evidenced in statements such as “[the program] helped me clarify my goals” and “it really helps people conceptualize where they want to go after they finish their program.” There is also evidence that this self-reflection resulted in a re-evaluation of professional identity, as illustrated by one participant’s comment: “the program caused me to feel less prepared, less capable, and less like I belonged in the academy.” Quantitative questions from the alumni survey provide additional evidence that participants developed their professional identity and goals. When alumni were asked how helpful the program was in helping them clarify their professional goals, all respondents reported it was helpful to some degree, with 67% reporting it was “very helpful.” When asked how participation in the program had impacted their sense of belonging in the academy, most respondents (69%) reported an increased sense of belonging (Figure 1).

Program as a Social Network to Provide Feedback and Emotional Support

Several of the respondents to the alumni survey described the program as valuable in terms of providing a supportive community environment, as illustrated in statements such as “the program was … surprisingly ‘comforting’ ” and “one of the greatest benefits of the program is the sense of community during arguably the most challenging phase of one’s PhD life.” Some responses were more subtle and alluded to feelings of trust in the shared environment, such as “[the program] helped clarify so many basic things that I was nervous about and often too shy to ask about.” Quantitative alumni survey questions also appear to support the idea of community building. Nearly half of respondents (49%) reported that they have stayed in contact with at least one of their cohort members after the program, suggesting that program activities facilitated the development of peer relationships and a community beyond the scope of the program itself. Additionally, “cohort meetings” were identified by the majority of alumni (74%) as being among the most helpful program activities for preparing them for an academic position (Table 4). Interestingly, “buddy groups or informal conversations with classmates” was not commonly reported by alumni as valuable for preparing them for an academic position (13%; Table 4). This may suggest that the “public” sharing that happened in the community conversations was an essential program design element that helped create and build relationships between participants. However, the question specifically asked participants to rate how helpful individual program activities were with regard to preparing them for an academic position. It is possible that buddy groups or informal conversations were helpful for emotional support and practice but not for the actual duties of an academic position.

Program as an Experiential Learning Opportunity to Practice and Develop Skills

Less emergent in the qualitative data than the other models, but still notably present, was the description of the program as valuable in terms of practical experience: a few respondents to the alumni survey specifically mentioned the job application material assignments (CV, teaching philosophy statement, mock interview, etc.). For example, one respondent stated, “I took the lessons learned (and materials generated) from the mock application exercise and the interview conducted and applied it directly to my own job search.” The practice of preparing these professional documents was not just an exercise but resulted in documents usable within the academic community beyond the program/course. In the quantitative alumni survey data, however, this value model is more widely supported: “creating an academic career portfolio (cover letter, CV, etc.)” was identified by almost all respondents (90%) as being among the most helpful program activities for preparing them for an academic position (Table 4).

Is the Institution-Level Multidisciplinary Design of the Cohort Perceived by Participants as Valuable?

The program was designed on the assumption that multidisciplinary environments foster innovation and would broaden participant understanding of their counterparts in other units, ideally fostering a future willingness (as faculty members) to seek out those in other disciplines for community and collaboration. When alumni were asked how the multidisciplinary nature of the cohort (graduate students across disciplines) was beneficial to them, if at all, responses ranged from comments around general interest (e.g., “it was interesting to see the variation”) to emphatic declarations of the value (e.g., “it was eye-opening”) to descriptions of clear outputs directly related to the program being multidisciplinary (e.g., “this program helped me collaborate and publish with individuals from different disciplines”). The dominant theme that emerged, however, was alumni reporting an expanded perspective directly attributable to the multidisciplinary nature of the cohort. Participants described novel or expanded perspectives on how the academy is organized and works, the nature of one’s disciplinary work, connections of one’s discipline/work to other disciplines, and/or career paths. As one alum respondent put it, “it instilled in me the idea that the academy is more than the little silo I operate in.” The survey data also support this idea of an expanded perspective and appreciation for looking across disciplines. For example, when asked to look back on the program and how it impacted their interest in cross-disciplinary work, the majority of respondents (64%) reported increased interest in cross-disciplinary work. Others reported their interest stayed the same (33%), and only 3% reported it decreased their interest (Figure 1).

What’s less clear from the responses is how the multidisciplinary nature of the cohort manifested in the program and how other institutions could design programs intentionally to amplify the benefits of having a multidisciplinary cohort. One individual reported that having a multidisciplinary cohort helped them to “better communicate my field/discipline to an audience that was not familiar with my background,” suggesting that peer review activities are enhanced in a multidisciplinary cohort because the reader/listener is aligned with an authentic job application review process that involves experts from outside their subdiscipline.

What Needs to Change Moving Forward to Continue Meeting Our Students’ Needs?

When alumni were asked to make open-ended survey comments on the weaknesses of the program and what they wished the program addressed, responses concentrated on three major themes. First, not all participants were seeking or ending up in tenure-track positions, and the program would benefit from including “information on … positions outside of the tenure track” and “helping students learn how to pivot.” As one respondent noted more pointedly and several echoed, “with the decrease in available [tenure-track] jobs, it was a bit surprising that very little (if any) time in the program was devoted to full-time non-tenure track positions.”

Second, the timeline of the program didn’t necessarily coincide with the job application timeline of the individual participants. Responses describing such a misalignment were often paired with respondent suggestions to encourage participants to connect more with one another outside of and beyond the program, including across cohorts. As one respondent asked, “How do I reconnect and continue with [program] until I actually find the academic placement I so strongly seek?” Comments in this theme also alluded to a post-program feeling of unpreparedness, as illustrated by one participant’s response: “We were given all the tools to apply for academic jobs, but not the skills for dealing with the shrinking job market. It would be helpful to know what to do if you do not get a job the first year you are out on the market.”

Third, some participants wanted more discipline-specific content. As one individual lamented, “I wished the program would have separated scientific disciplines from non-scientific ones.” These types of responses illustrate the tension between participants seeking and valuing the affordances of a centrally offered (e.g., reduced power dynamics) and multidisciplinary (e.g., expanded perspective) program to prepare for academic careers while also seeking and valuing the affordances a discipline-specific program could offer. One individual described this tension further as it also applies to the discipline of the instructor/facilitator: “Having my instructor … be in a similar but not directly related discipline was very important to expand my horizons, improve communication, and honestly help me feel more comfortable in the program. I would not have been able to grow as much if my instructor were from my program/discipline.” The program is designed to be complementary to the discipline-specific advice that participants should be receiving from their disciplinary advisor or department. It’s possible that responses that included requests for discipline-specific topics, guests, or examples reflect gaps in their home departments’ or advisors’ support for doctoral students.

Discussion

Strengths, Limitations, and Future Studies

The key strengths of this study are the methodological approach, the response rate to the alumni survey, and the discovery rate for the employment status search in public records. The descriptive case study approach allowed for a more in-depth interpretation. The mixed methods approach combining multiple data sources collected using different methods also strengthened the trustworthiness of interpretations. Despite this richness of data, due to limits on staff capacity, we were not able to examine the data across demographic populations. As one of the goals was to diversify the professoriate, future studies could interrogate the data for differences in reported experiences across demographic populations and those that were in academic vs. non-academic positions after receiving their degrees.

The main limitation of this study is that there is no control group: participating population data are not compared to data from non-participating populations (e.g., those who applied and were not selected, or the broader population of students in the last year or two of their terminal degree program). Therefore, while we are able to report the percentage of participants with academic employment, we are unable to conclude how participants’ observed academic employment levels compare to those of students not enrolled in the program. Future studies could pull employment data from graduate exit surveys to compare the initial academic employment success of program alumni and non-alumni for a given period of time.

Similarly, while we have some longitudinal impact insights from the alumni survey responses, we aren’t able to conclude that these are uniquely attributable to the participation experience due to the self-selecting nature of the program’s applicant pool. If the program were not available, would participants have sought out comparable alternative support structures? Would they have been successful at finding them? One way to explore this in future studies would be to compare the employment of non-selected applicants to that of selected applicants. This was not possible in this study because the non-selected applicant pool across cohorts is very small, making it difficult to draw meaningful conclusions.

An important limitation is that only 64% to 69% of program alumni had been awarded their degrees at the time of this study (July 2021; uncertainty is due to degree award date data not being available for nine alumni). Those without degrees would not be eligible for many academic positions, so the rates of “success” as measured by academic employment may actually be higher than reported here. Future studies could isolate the employment data for those alums who have been awarded a degree to get better resolution on employment success.

Implications for Program Design

Key takeaways for others designing similar programs are to design a program that establishes and builds community, to include diverse perspectives through a multidisciplinary cohort, and to include experiential learning opportunities (e.g., job application portfolio, mock interviews) in addition to providing hidden curriculum content about academic career pathways that increases students’ knowledge. We also recommend including some needs assessment questions in participant interest forms or program applications. Alumni survey data, as well as annual evaluation data, suggest that student needs have changed over time in response to a changing academic market. By including needs assessment questions up front, program content can be adjusted to student needs, such as simultaneously preparing participants for alternative or non-academic positions.

Strategic approaches to participant recruitment are also critical if a long-term program goal is to work toward diversifying the academy. For example, the applicant and participant pool was less diverse than the broader graduate student population. While efforts were made to maximize the diversity of the program cohorts from the pool of applicants each year, recruitment for the applicant pool itself relied on very limited and traditional systems of communication: emails to deans and chairs to pass along to faculty, inclusion in opt-in newsletters (e.g., those of the institution’s center for teaching and learning and the graduate student organization), flyers in the student union, and word of mouth. More recently, through the work of the institution’s Anti-Racist and Inclusive Excellence initiative, there is more information on who the graduate students are and more ways to reach them.

Implications for Program Assessment

While the annual evaluation data collected and analyzed for the program were helpful for evaluating short-term impact and informing program redesign for the following year, the iterative nature of our approach made these data of limited use for longitudinal assessment (and thus they are not included in this study). We recommend that others who aspire to assess program impacts on participants over time identify a core set of assessment questions that remains consistent across years, paired with a smaller set of questions unique to the circumstances or curricular innovations/interventions of a given year. This approach will allow researchers to better triangulate participant experience and will facilitate sharing information publicly across institutions. Relatedly, we recommend collecting non-university email addresses and employment status from consenting participants at the end of the program. This could increase the response rate for alumni surveys compared to using publicly available email addresses.

Future Directions for This Specific Program

The results of this decadal review are informing current decision-making conversations about the future of the program. Plans include expanding to develop a new complementary program, Preparing for Non-Academic Careers. The specifics are in the ideation stage.

Conclusion

Overall, the program has been a success, meeting its core objectives. A slightly higher percentage of participants are employed in academic positions than national trends would suggest. Participants report a range of perceived benefits, including an increased sense of belonging in the academy, comfort talking to others about their work, confidence as an instructor, and interest in cross-disciplinary work. Another contribution of this work is a participant survey tool that could be used by others to compare data across institutions. Also, the metaphorical themes that emerged from the analysis of participant responses present a novel theoretical framework for educational developers to consider and apply in researching and evaluating future faculty programming and initiatives. New iterations of any future faculty program need to consider and respond to the changing needs of students and the changing landscape of the academy.

Acknowledgments

Thank you to Kim Eby (program founder) and Shelley Reid for their leadership of and contributions to the Preparing for Academic Careers program during its early years.

Biographies

Laura A. Lukes is a former Assistant Director of the Stearns Center for Teaching and Learning at George Mason University (2014–2021) and currently is an Assistant Professor at the University of British Columbia. Her work also appears in the Journal of Geoscience Education, Journal of College Science Teaching, and SPUR. She is an Albert Einstein Distinguished Educator Fellow (2010–2011) and Fellow of the Geological Society of America.

Lamis M. Ibrahim is a graduate student and a data analyst in the Office of Graduate Education at George Mason University.

Laurence Bray is the Associate Provost for Graduate Education at George Mason University, serving as Chair of the Graduate Council and providing leadership for the university’s portfolio of graduate activities. Prior to her role, she oversaw a wide range of educational and research initiatives as a faculty member and administrator in the Department of Bioengineering. She has received multiple awards for her dedication to student success and her work in innovation.

References

Applegate, J. (2002). Engaged graduate education: Seeing with new eyes (PFF Occasional Paper Series). Association of American Colleges and Universities, Council of Graduate Schools. https://www.academia.edu/68618661/Engaged_Graduate_Education_Seeing_with_New_Eyes_PFF_Occasional_Paper_Series

Bagaka, J. G., Badillo, N., Bransteter, I., & Rispinto, S. (2015). Exploring student success in a doctoral program: The power of mentorship and research engagement. International Journal of Doctoral Studies, 10, 323–342. http://ijds.org/Volume10/IJDSv10p323-342Bagaka1713.pdf

Bandura, A. (1977). Self-efficacy: Toward a unifying theory of behavioral change. Psychological Review, 84(2), 191–215. https://doi.org/10.1037/0033-295X.84.2.191

Bandura, A. (1986). Social foundations of thought and action: A social cognitive theory. Prentice Hall.

Bandura, A. (1997). Self-efficacy: The exercise of control. W. H. Freeman.

Boud, D., Cohen, R., & Sampson, J. (1999). Peer learning and assessment. Assessment & Evaluation in Higher Education, 24(4), 413–426. https://doi.org/10.1080/0260293990240405

DeNeef, A. L. (2002). The Preparing Future Faculty program: What difference does it make? (PFF Occasional Paper Series). Association of American Colleges and Universities, Council of Graduate Schools.

Dewey, J. (1938). Experience and education. Collier Books.

Diggs, A. B., Mondisa, J., & Scott, R. D. (2017, June 24–28). Toward a systematic review of the Preparing Future Faculty program initiatives [Paper presentation]. 2017 ASEE Annual Conference and Exposition, Columbus, OH, United States. https://doi.org/10.18260/1-2--29036

Finkelstein, M. J., Conley, V. M., & Schuster, J. H. (2016). The faculty factor: Reassessing the American academy in a turbulent era. Johns Hopkins University Press.

Gaff, J. G., & Lambert, L. M. (1996). Socializing future faculty to the values of undergraduate education. Change, 28(4), 38–45.

Gardner, S. K. (2008). Fitting the mold of graduate school: A qualitative study of socialization in doctoral education. Innovative Higher Education, 33(2), 125–138. https://doi.org/10.1007/s10755-008-9068-x

Gruenfeld, D. H., Mannix, E. A., Williams, K. Y., & Neale, M. A. (1996). Group composition and decision making: How member familiarity and information distribution affect process and performance. Organizational Behavior and Human Decision Processes, 67(1), 1–15. https://doi.org/10.1006/obhd.1996.0061

Jackson, P. W. (1968). Life in classrooms. Holt, Rinehart and Winston.

Kelsky, K. (2015). The professor is in: The essential guide to turning your Ph.D. into a job. Three Rivers Press.

Lukes, L. A. (2021). Preparing for Academic Careers alumni survey. George Mason University. https://mars.gmu.edu/handle/1920/12806

Margolis, E., & Romero, M. (1998). “The department is very male, very White, very old, and very conservative”: The functioning of the hidden curriculum in graduate sociology departments. Harvard Educational Review, 68(1), 1–33. https://doi.org/10.17763/haer.68.1.1q3828348783j851

Martinez-Acosta, V. G., & Favero, C. B. (2018). A discussion of diversity and inclusivity at the institutional level: The need for a strategic plan. Journal of Undergraduate Neuroscience Education, 16(3), A252–A260. https://europepmc.org/article/med/30254540

McCarthy, J., Trenga, M. E., & Weiner, B. (2005). The cohort model with graduate student learners: Faculty-student perspectives. Adult Learning, 16(3–4), 22–25. https://doi.org/10.1177/104515950501600305

Miles, M. B., Huberman, A. M., & Saldaña, J. (2014). Qualitative data analysis: A methods sourcebook (3rd ed.). SAGE Publications.

National Center for Science and Engineering Statistics. (2021). Doctorate recipients from US universities: 2020 (NSF 22–300). National Science Foundation. https://ncses.nsf.gov/pubs/nsf22300/

Office of Institutional Effectiveness and Planning. (2021). Enrollment by demographic [Dashboard]. https://oiep.gmu.edu/data-analytics-research/enrollment/enrollment-by-demographic/

Preparing Future Faculty. (n.d.). The Preparing Future Faculty program. https://preparing-faculty.org

Rozaitis, B., Baepler, P., Gonzalez, A., Ching, P., Wingert, D., & Alexander, I. D. (2020). Preparing Future Faculty: Pedagogical practice in graduate school. New Directions for Teaching & Learning, 2020(163), 35–43. https://doi.org/10.1002/tl.20408

Saldaña, J. (2016). The coding manual for qualitative researchers (3rd ed.). SAGE Publications.

Schram, L. N., Pinder-Grover, T., & Turcic, S., II. (2017). Assessing the long-term impact of the Preparing Future Faculty seminar. To Improve the Academy: A Journal of Educational Development, 36(2), 101–116. https://doi.org/10.1002/tia2.20063

Swayze, S., & Jakeman, R. C. (2014). Students perceptions of communication, connectedness, and learning in a merged cohort course. The Journal of Continuing Higher Education, 62(2), 102–111. https://doi.org/10.1080/07377363.2014.915446

Vygotsky, L. S. (1978). Mind in society: Development of higher psychological processes. Harvard University Press.

Winter, K., Kent, J., & Bradshaw, R. (2018). Preparing Future Faculty: A framework for design and evaluation at the university level. Council of Graduate Schools.

Yin, R. K. (2014). Case study research: Design and methods (5th ed.). SAGE Publications.