Introduction

Introductory Science, Technology, Engineering, and Mathematics (STEM) courses are well-known for consistent patterns of inequity (Matz et al., 2017; Eddy & Brownell, 2016). While many promising reform initiatives have worked to combat these widespread inequities in grade outcomes, retention rates, and graduation rates (see Laursen, 2019), STEM courses in higher education remain inequitable and exclusionary for historically marginalized student populations (Castle et al., 2024; Hatfield et al., 2022; Whitcomb et al., 2021; Riegle-Crumb et al., 2019). Many STEM faculty have been slow to incorporate evidence-based teaching practices (Smith et al., 2024; Stains et al., 2018), and departmental and disciplinary norms often constrain individual reform efforts (Henderson & Dancy, 2007; AAAS, 2011; Reinholz et al., 2017). In addition, faculty attitudes toward equity play an important role in determining their willingness to engage in structural reform of courses and curricula (Russo-Tait, 2023). For example, many stakeholders in STEM education view academic inequities as resulting from student deficits (Valencia, 1997, 2010; Canning et al., 2019; Castro, 2014). This deficit mindset poses a significant barrier to STEM education reform, as attempts to address inequities often focus on “fixing” the students (e.g., growth mindset interventions), rather than on creating systemic change.

University Centers for Teaching and Learning (CTLs) often play a key role in supporting faculty in driving classroom change (Carlisle & Weaver, 2018) by providing training in evidence-based teaching practices (Henderson et al., 2010, 2011). In this paper, we use the terms “instructor” and “faculty” interchangeably to designate anyone taking primary responsibility for teaching a course, including faculty, lecturers, and other instructors of record.

Literature Review

To address inequities in STEM education through systemic change, it is vital to create sustained interdepartmental teams. Teamwork to improve STEM education has taken many forms—learning communities, communities of practice (CoPs), communities of transformation, action teams, and more. Each of these approaches focuses on bringing together like-minded people who share similar challenges or goals for change (Kezar & Gehrke, 2015; Reinholz et al., 2017). Learning communities and CoPs center individual learning and professional development, while communities of transformation and action teams are more oriented toward changing structures (Kezar & Gehrke, 2017; Reinholz et al., 2017).

While many of these teams are faculty-driven, there is a growing call for student partnerships (Quan et al., 2019; Ronayne Sohr et al., 2020; Bovill, 2017), as they provide key perspectives (Oh, 2014; Narayanan & Abbot, 2020; Matthews & Cook-Sather, 2021). The “students as partners” (SaP) framework encompasses activities where students work with faculty or staff as collaborators rather than only as consumers of higher education (Cook-Sather et al., 2018). These collaborations have benefits for both students and staff, including increased confidence, sense of belonging, and empathy (Matthews et al., 2019). However, challenges specific to these partnerships include managing power dynamics, especially for student-faculty relationships (Mercer-Mapstone & Abbot, 2020; Quan et al., 2021), though these can be addressed (e.g., Matthews et al., 2019; Ngai et al., 2020; Abdurrahman et al., 2022).

A focus on equity requires consistent access to institutional data that can be used for disaggregated analysis of grades, graduation rates, student demographic data, and enrollment. However, consistent reporting of equity data has yet to become a standard practice in higher education. Analysts must find useful and authentic methods to analyze the data (McNair et al., 2020; Bhatti, 2021; Almeida, 2022). Converting the findings to action raises further challenges including defining equity, selecting and implementing reforms, and incorporating intersectional approaches (Crenshaw, 2005; Ireland et al., 2018). Multidisciplinary and multi-institutional approaches to STEM reform efforts can bring consistency of methods and measures that makes results comparable. Additionally, they can promote unique community-related benefits (e.g., validation of contributions, care across the network, diversity of viewpoints and context) (Colclough et al., 2023).

Recent research has focused on classroom-level, department-level, and institution-level data-based investigations with an eye toward measuring inequities experienced by historically marginalized students (e.g., Castle et al., 2024; Fiorini et al., 2023; Mack et al., 2019; McNair et al., 2020). Despite multi-institutional initiatives to use data to drive institutional change (e.g., McNair et al., 2020; Michaels & Milner, 2021), efforts remain small in scale and isolated.

Our Project

In 2023-2024, with support from the National Science Foundation (#2215398; #2215689), we ran STEM Equity Learning Communities (SELCs) in parallel across nine participating institutions. Each SELC—consisting of faculty members, undergraduate students, teaching center staff, and institutional data staff—had access to local equity data reports. Participants engaged in a year-long program, including meetings of all SELCs, designed to guide them in developing equity-mindedness. Equity-mindedness for this project was defined as “the perspective or mode of thinking exhibited by practitioners who call attention to patterns of inequity in student outcomes” (Center for Urban Education, n.d.; Bensimon, 2006).

This paper serves as an exploration of this sustained, multi-institutional learning community aimed at increasing equity-mindedness in STEM and motivating structural change. We investigate the value of this learning community for individual participants and institutions, and more specifically the impact of student voices on SELC outcomes. We highlight ways to translate our activities and lessons from our approach into the implementation of future team-based change efforts, regardless of institution size.

Our Approach

Our Context (SEISMIC)

This study was conducted by members of the Sloan Equity and Inclusion in STEM Introductory Courses (SEISMIC) collaboration, which centers a multidisciplinary, multi-institutional, and evidence-focused approach to STEM reform. SEISMIC brings together large, public, research-intensive institutions committed to promoting equity and inclusion in large-enrollment introductory STEM courses. Members include faculty, students, researchers, staff, and administrators. Affiliations include STEM, social science, and humanities disciplines, as well as educator development units. The collaboration engages via multi-institutional working groups, a SEISMIC-wide seminar series, and annual summer meetings (SEISMIC, n.d.).

We ran the SELC project within the infrastructure of the SEISMIC collaboration using SEISMIC’s communication channels and central organizing team to coordinate this multi-institutional project. We used lessons learned from SEISMIC to design the SELC project, including undergraduate student participation, co-developing project materials with participants, and facilitating multidisciplinary collaboration. SEISMIC has previously developed nuanced equity measures that integrate student data, classroom data, and contextual data about the systems surrounding the students, including course structures and institutional policies (Castle et al., 2024; Fischer et al., 2023; Fiorini et al., 2023). Several SEISMIC institutions have also developed equity reports and dashboards to display student enrollment and outcomes data, including through the Foundational Course Initiative at the University of Michigan (CRLT, n.d.) and the Know Your Students tool at the University of California, Davis (CEE, 2024). The SELC project built on SEISMIC’s experience.

The STEM Equity Learning Community (SELC) Experience

Nine large, public research institutions hosted SELCs in 2023-2024, including the University of Michigan, Michigan State University, Purdue University, Indiana University, the University of Pittsburgh, the University of Maryland, the University of California Davis, the University of California Irvine, and the University of California Santa Barbara. These institutions are public, classified as “R1” (or very high research activity) institutions, and feature both large undergraduate enrollments and large introductory STEM courses (100+ students). Three institutions are on the West Coast, four in the Midwest, one in the Mid-Atlantic region, and one in the South Atlantic region. Several of these institutions are minority-serving, including Hispanic-serving and Asian American and Native American Pacific Islander-serving. Multiple are also land-grant institutions.

The central goal of the SELC project was to develop and empower campus community members to use equity-minded decision making to address challenges in their STEM courses through year-long learning communities. SELCs are characterized by critical examinations of course equity data, authentic partnerships between students and instructors, facilitation by experts in local Centers for Teaching and Learning (CTLs), and an emphasis on equity-minded reform. The design for the SELCs was guided by Matthews’ (2017) principles for SaP and Weatherton and Schussler’s (2021) adaptation of Lundy’s (2007) model for honoring student perspectives through four elements: space, voice, audience, and influence. The SELC project also embraced the Departmental Action Team model’s six core guiding principles (Ngai et al., 2020) and drew on strategies shared by Castro (2014) and Garcia and Guerra (2004) for identifying and addressing student-deficit thinking.

SELC Equity Reports

Central to the SELC project is examining student enrollment and performance data, disaggregated by different student demographics. However, many institutions did not start with this resource and many participants had not previously seen data presented in this way. We developed resources to simplify the work required for SELC institutions to examine this equity data and made the resources available via GitHub (Farrar et al., 2023). Institutions can follow our R code to develop course equity reports using their own data. These reports show aggregated student enrollment and outcomes in courses of interest, with breakdowns by (when available) race and ethnicity, sex, income, first-generation status, transfer status, college major, and other student characteristics (Figure 1). The PEER label used in Figure 1 stands for “persons historically excluded based on ethnicity or race” (Asai, 2020).
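The project’s published code for these reports is in R (Farrar et al., 2023). As a language-neutral illustration of the disaggregation such a report performs, the sketch below uses Python with entirely hypothetical records and field names; it is not the SELC schema or the released code:

```python
# Sketch of the disaggregation at the heart of a course equity report:
# enrollment share and mean course grade for each student group.
# The records, group labels, and field names here are hypothetical.
from collections import defaultdict

records = [
    {"group": "PEER", "grade": 2.7},
    {"group": "PEER", "grade": 3.0},
    {"group": "non-PEER", "grade": 3.4},
    {"group": "non-PEER", "grade": 3.1},
    {"group": "non-PEER", "grade": 3.6},
]

# Bucket grades by demographic group.
by_group = defaultdict(list)
for r in records:
    by_group[r["group"]].append(r["grade"])

total = len(records)
for group, grades in sorted(by_group.items()):
    share = len(grades) / total
    mean_grade = sum(grades) / len(grades)
    print(f"{group}: n = {len(grades)} ({share:.0%} of enrollment), "
          f"mean grade = {mean_grade:.2f}")
```

A full report would extend this with the additional breakdowns named above (sex, income, first-generation status, and so on) and with the uncertainty estimates shown in Figure 1.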

Figure 1. Example Figure from SELC Course Equity Reports (Farrar et al., 2023). The triangle data points indicate that students hold the identity labeled on the Y-axis. For example, the bottommost triangle represents students who fall into the PEER category.

Drawing on Castle et al.’s (2024) work in SEISMIC, our course equity reports also use the Systemic Advantage Index to help visualize the ways in which systemic advantages—such as higher income levels or having parents who went to college—correlate with student performance. In addition to figures, the SELC course equity reports also include contextual information about the data being shown, considerations to avoid student-deficit thinking, and guiding questions for individuals and teams to use. Several of the authors on this paper are preparing a manuscript for publication with a more detailed description of the course equity reports.

SELC Participants

In composing the SELCs, we included instructor and student voices, centered institutional data, and created space for difficult conversations about equity. SELC teams comprised four to six instructors, two undergraduate students, one institutional researcher, and one facilitator. Instructors had taught introductory STEM courses or had decision-making power about curriculum. The instructors brought a deep understanding of the local challenges, policies, and practices, and knew which courses had large opportunity gaps. We recruited a variety of undergraduate students who represented diverse demographics and majors, so they could draw upon their own course experiences. Given the inherent power differential between instructors and students, we included two students in each SELC to create a source of peer support and encourage the contribution of their perspectives. Students were paid to participate. We provided training for all participants at an initial in-person launch event to support productive and inclusive teams.

Each SELC also had an institutional researcher who was responsible for generating the course equity reports (Farrar et al., 2023). Their intimate knowledge of institutional data allowed them to answer detailed questions that arose in the local SELCs. Further, being engaged in the equity-oriented conversations allowed them to better understand the needs of the full SELC and tailor the equity reports accordingly. Institutional researchers were compensated for their time creating the reports.

Lastly, our CTLs helped identify a facilitator or co-facilitators with experience in educational development, managing challenging conversations, complex team dynamics, and work around equity. Facilitators were critical in organizing and facilitating each of the SELC meetings and brought in resources to help participants develop knowledge and skills. They were the local SELC leaders and ensured project momentum. Facilitators received SELC-specific training at the beginning of the program, and maintained regular communication with each other throughout the project. Additionally, facilitators who helped create the SELC curriculum or led the Inter-SELC meetings were compensated for their efforts.

SELC Activities

The SELC experience consisted of multiple activities within and across SELCs (Table 1), which helped them to achieve their goals. Each SELC defined its own goals and scope, based in part on faculty participants’ academic units. Some had a broader, campus-wide focus (i.e., when their team composition reflected multiple colleges/departments) while others focused on department-level challenges (i.e., when all members were from a single department).

Table 1. SELC Activities

Local Campus Activities

Local SELC Meetings: Each SELC met monthly to learn equity-minded approaches to examining student data, review course equity data reports, identify equity issues in their courses, and explore structural levers for change.

Campus Presentations: Near the end of the experience, each SELC delivered presentations to key leaders on their campus who could help implement change. The specific audience varied based on the goals of each SELC, but could include department chairs; offices of diversity, equity, and inclusion; deans and vice provosts for undergraduate education; and student groups.

Across Campus Activities

Inter-SELC Meetings: Regular opportunities for SELCs to meet with other SELCs, where they shared progress, explored common challenges, and developed skills.

Launch Event (one of our Inter-SELC Meetings): A special 2-day in-person inter-SELC event that aimed to align participants’ project expectations, introduce equity data, bring together multidisciplinary and multi-role teams, and establish initial team goals and work plans. SELC members connected with others in the same role at different campuses to build community and share concerns.

Local SELC meetings followed an equity-minded curriculum designed by SELC facilitators and other local experts (see SEISMIC, n.d.). Most meetings involved a topic and set of activities designed to help SELCs gain additional context and practice in examining equity data to improve student outcomes. For instance, one activity involved SELC participants brainstorming common responses from peers when they learn of inequities in student outcomes and then practicing responding using an asset-based framework. SELCs developed action plans during their monthly meetings. These plans included recommendations for change—whether the redesign of a course that acted as a barrier to student success or the campus-wide adoption of a new course equity dashboard—which they called for during their Campus Presentations. These presentations provided a concrete goal and deadline that motivated SELC engagement throughout the year. Additionally, SELCs attended five inter-SELC meetings over the course of the project. Discussion topics for these meetings included motivations for joining the project, anticipated challenges and opportunities for the work in local contexts, and connections between SELCs and local campus initiatives.

Research Questions

We developed four research questions to investigate the effectiveness of the SELC project as a sustained effort for increasing equity-mindedness in STEM and motivating structural change.

  1. To what extent did participation in a SELC increase participants’ access to equity data?

  2. To what extent was participating in a SELC valuable for participants?

  3. How did SELC participation impact the institutions?

  4. How did the presence of student voices impact participants’ personal SELC project expectations/outcomes?

Methods

Instruments, Procedures, and Sample

Two survey instruments (SEISMIC, n.d.) were used to collect data from SELC participants:

  1. A pre-project survey collected data about participants’ academic backgrounds, familiarity/experience with and attitudes toward equity-focused work in STEM (including data reports/dashboards), and pre-project expectations/concerns. This survey was sent to participants after they signed up for the project and prior to the local SELC meetings. Most teams completed this survey prior to our Launch Event.

  2. A post-project survey collected data about those same attitudes and the value of the project along multiple dimensions (e.g., overall value of the project, success of the project in relation to expectations, the value of student voices, and local impact of the project). This survey was sent to participants in spring 2024, around the time most SELCs were completing their campus presentations.

Both surveys consisted of a mixture of open-ended and closed-response items. The development of both surveys aligned with typical multi-tiered models of training program evaluation (Kirkpatrick & Kirkpatrick, 2006): level one assessed learner reaction/satisfaction; level two assessed achievement of project learning outcomes; and level three assessed local impact (typically in the workplace of trainees) after the training. Data were collected for assessment purposes and all participants had the opportunity to opt out of having their data included in this research. To protect participant confidentiality, data from both the pre- and post-surveys were collected anonymously. The study was determined exempt by the University of Michigan Institutional Review Board (HUM00217823).

Overall, 55 participants (68% response rate) completed the pre-survey (27 faculty, 13 students, 8 institutional researchers, and 7 facilitators), and 58 participants (72% response rate) completed the post-survey (29 faculty, 13 students, 8 institutional researchers, and 8 facilitators).

Prior to the SELC project, 30% of faculty participants indicated they had no experience working with students as partners, and 38% of student participants indicated they had no experience working with faculty. Additionally, 56% of the faculty indicated that they had no access/little to no familiarity with reports or dashboards at their institution providing information about student demographics and performance. Finally, 29% of faculty reported no prior experience working to explore or improve equity outcomes in STEM courses.

Data Analysis

We used a mixed-methods approach to analyze our pre- and post-survey data (Creswell & Poth, 2016). Quantitative analyses (chi-square tests of independence, Wilcoxon rank-sum tests) were applied to closed-response survey items to measure overall proportions and to test for significant shifts in participants’ experiences. Unless otherwise indicated, all Likert scale response options ranged from Strongly Agree (coded numerically as 1) to Strongly Disagree (coded numerically as 5).
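As an illustration of the second of these tests, the sketch below computes a Wilcoxon rank-sum z-statistic (normal approximation, average ranks for ties, no tie correction) on hypothetical pre/post Likert codes. It is a minimal, dependency-free sketch of the test family named here, not the study’s data or analysis code:

```python
# Wilcoxon rank-sum test via the normal approximation, applied to
# hypothetical Likert responses coded 1 (Strongly Agree) through
# 5 (Strongly Disagree). No tie correction is applied to the variance.
import math

def rank_sum_z(sample_a, sample_b):
    """Z-statistic for the Wilcoxon rank-sum test (average ranks for ties)."""
    values = sorted(sample_a + sample_b)
    # Assign each distinct value the average of the 1-based ranks it spans.
    ranks = {}
    i = 0
    while i < len(values):
        j = i
        while j < len(values) and values[j] == values[i]:
            j += 1
        ranks[values[i]] = (i + 1 + j) / 2  # average of ranks i+1 .. j
        i = j
    n_a, n_b = len(sample_a), len(sample_b)
    w = sum(ranks[v] for v in sample_a)        # rank sum of sample A
    mean_w = n_a * (n_a + n_b + 1) / 2         # expected rank sum under H0
    sd_w = math.sqrt(n_a * n_b * (n_a + n_b + 1) / 12)
    return (w - mean_w) / sd_w

pre = [2, 2, 3, 3, 3, 4, 4, 5]   # hypothetical pre-survey Likert codes
post = [1, 1, 1, 2, 2, 2, 3, 3]  # hypothetical post-survey Likert codes
z = rank_sum_z(pre, post)
print(f"z = {z:.2f}")  # |z| > 1.96 indicates p < .05 (two-sided)
```

Here a positive z indicates the pre-survey sample carries higher codes (more disagreement) than the post-survey sample under the 1-5 coding described above.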

To evaluate participants’ experiences of the SELC project qualitatively, open response items were analyzed using inductive thematic coding (Saldaña, 2015; see also Braun & Clarke, 2006). Initial latent codes were developed by systematically identifying salient features of all survey data before being collapsed into broad themes. Where relevant, existing literature on participant value/meaning-making in similar contexts guided interpretation of the latent codes (Braun & Clarke, 2006; Patton, 1990). We categorized these themes in relation to the research questions and then through the theoretical lens of Communities of Practice (CoPs) (Wenger, 1998, 2009, 2010). Latent codes and themes, along with any intraobserver inconsistencies and interobserver differences, were discussed and resolved among a subset of the authorship team who worked on the analysis to ensure intercoder reliability (Krippendorff, 2018). Codes, along with descriptions and example quotes, are described in Table 2.

Table 2. Example Codes, Descriptions, and Quotes

Community of Practice (CoP)-Based Codes

Domain: Awareness of STEM equity in teaching and learning, such as through gaining/lacking knowledge/understanding, enhancing/reinforcing perspectives, providing insight. Example quotes: “Sometimes I didn’t feel knowledgeable enough to talk about, for example recommendations”; “Understand the issues surrounding equity in introductory STEM classes”

Practice: CoP practices/products, such as specific SELC work processes, facilitation of work processes, information sharing. Example quotes: “When we had a clear goal and objective for our meeting”; “Examine institutional data and be able to observe the statistics of specific STEM introductory courses”

Community: SELC community, such as professional and/or personal aspects of community, meaning derived from working with others. Example quotes: “Welcoming, supportive faculty who expressed genuine interest in what I say and validate”; “Meet and develop relationships with STEM instructors”

Codes For Participant Intent to Improve Equity Outcomes Following SELC

Equity decision making and/or knowledge sharing/development outside the classroom: Participation in institutional programming initiatives beyond applying SELC findings to a course/series of courses. Example quote: “Asking SELC instructors to develop a workshop about what they found and what interventions they have been working on in follow-up projects”

Equity decision making and/or teaching inside the classroom: Applying SELC findings to a course/series of courses. Example quote: “Create more support structures in a course and be more conscious of student-student and student-instructor interactions”

Increasing data access/usage: Increasing access to/future use of SELC data/data reports. Example quote: “Continue to work on getting institutional data more readily available for faculty and consult on how to make changes based on the data”

Results

Our results are organized according to the Research Questions. First, we address the extent to which participation increased individual access to equity data (RQ1). Second, we give a general overview of the extent to which participating in the SELC project was valuable for participants (RQ2). Third, we offer an overview of how the SELC project impacted institutions (RQ3). Fourth, we address how the presence of student voices impacted participants’ personal SELC project expectations/outcomes (RQ4).

Research Question 1: To What Extent Did Participation in a SELC Increase Participants’ Access to Equity Data?

The SELC project was successful in increasing access to data about student identities and outcomes. Participants were asked both pre- (N = 55) and post-survey (N = 58) whether they could access course-specific data about student identities and outcomes (Figure 2). A chi-square test of independence showed that after the SELC project, participants were significantly more likely to have access to data about student identities compared to before the project, χ2(1, N = 113) = 9.6, p < .01.
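As a self-contained illustration of this kind of test, the sketch below computes a chi-square statistic of independence for a 2×2 table (pre/post survey × has access/no access), without continuity correction. The counts are purely hypothetical and chosen only for illustration; they are not the study’s data:

```python
# Chi-square test of independence on a 2x2 contingency table.
# Rows: pre-survey vs post-survey respondents; columns: "has access"
# vs "no access". All counts below are hypothetical illustrations.

def chi_square_2x2(table):
    """Return (statistic, degrees of freedom); no continuity correction."""
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    grand = sum(row_totals)
    stat = 0.0
    for i, row in enumerate(table):
        for j, observed in enumerate(row):
            expected = row_totals[i] * col_totals[j] / grand
            stat += (observed - expected) ** 2 / expected
    dof = (len(table) - 1) * (len(table[0]) - 1)
    return stat, dof

pre = [22, 33]   # hypothetical: 22 of 55 pre-survey respondents had access
post = [36, 22]  # hypothetical: 36 of 58 post-survey respondents had access
stat, dof = chi_square_2x2([pre, post])
print(f"chi2({dof}, N = {sum(pre) + sum(post)}) = {stat:.1f}")
# A statistic above 3.84 exceeds the .05 critical value for df = 1.
```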

Figure 2. Distribution of Pre- and Post-Survey Responses on Data Access

Note. Survey question: “Do you have access to data about student identities and outcomes that would allow you to learn about equity in a specific course?”

Research Question 2: To What Extent Was Participating in a SELC Valuable for Participants?

Overall, program benefits aligned with participants’ expectations and were perceived to be worth the invested time and effort. All facilitators indicated that the project was professionally valuable. More than half of participants (56%) indicated their SELC achieved (or will achieve) what they hoped it would achieve prior to the project. Most participants (80%) also indicated program value by describing future plans to improve equity outcomes in STEM courses.

Most participants (90%) indicated that the program was as or more beneficial than expected and was worth the time and effort investment. Additionally, 97% of respondents (N = 48) strongly agreed/agreed that their contributions were valued by members of their local SELC (mean agreement score = 1.34, SD = 0.55). All respondents strongly agreed/agreed that student contributions were valued by members of their local SELC (mean agreement score = 1.16, SD = 0.37). Most (87%) of respondents strongly agreed/agreed that the SELC experience will be valuable to their future decision making around equity (mean agreement score = 1.61, SD = 0.96).

All facilitators (N = 8) strongly agreed/agreed with the following statements: participating in the SELC project was a beneficial professional development experience; they enjoyed working with other facilitators and would like to do so in the future; and they would recommend this collaborative model to support future parallel facilitation efforts. Seven facilitators strongly agreed/agreed that working with other facilitators helped develop facilitation skills, while one remained neutral on this topic.

Undergraduate students (N = 11) were asked to describe what factors, if any, supported their participation in SELC discussions. Using the CoP codes described in Table 2, nine out of eleven students identified community-related factors which supported participation. These included: general group engagement, validation of contributions from other group members, and value/meaning in interpersonal group interactions (e.g., student-student, student-faculty, and student-facilitator). For example, one student said: “Everyone was very welcoming overall, and I could really tell that what I said was valued. All the members echoed others’ statements, which made it feel very welcoming.” Two students identified practice-related factors which supported participation, including prior work on equity-minded data and goal setting in local SELC meetings. Finally, one student identified domain-related factors, but indicated that these factors in fact inhibited their participation in SELC discussions: “I think sometimes I didn’t feel knowledgeable enough to talk about, for example recommendations especially because I did not have a good hold on what is going on already.”

Participants (N = 57) were asked whether their SELC achieved the primary thing they hoped it would achieve prior to the project. More than half (56%) of respondents indicated either “Yes” or “Not yet (but it will be!)”; 31% indicated either “Impossible to tell at this point” or “Not sure how I would measure this”; and 10% indicated “No.” We note that compared to these overall percentages, fewer faculty and students indicated “Yes”/“Not yet (but it will be!)” (52% and 53%, respectively) than facilitators and institutional researchers (76% and 71%, respectively). Furthermore, more faculty and students indicated “Impossible to tell at this point”/“Not sure how I would measure this” (both 38%) than facilitators and institutional researchers (13% and 14%, respectively).

Participants (N = 58) were also asked how, if at all, they planned to improve equity outcomes in STEM courses after the SELC experience. We binned 80% of responses (the remaining responses were left blank or indicated that they were not sure) using the codes in Table 2. 36% of responses focused on future changes inside the classroom, whereas 39% of responses focused on changes outside of the classroom, such as through institutional programming or departmental/institutional level interventions. Finally, 22% of responses focused on increasing data access/usage, e.g., making raw data more widely available or helping others interpret data. Overall, responses indicated an intent toward future equity-minded action, which we take as an additional data point indicating participants found value in the SELC experience.

Finally, while the overall goal of the SELC project was to increase equity-mindedness among participants and in their local contexts, we were limited in our capacity to measure the extent to which equity-mindedness itself increased. Equity-mindedness can be measured along multiple dimensions. Our available survey data focused only on participants’ student-deficit mindset. There were small signs that student-deficit thinking may have decreased by the end of the SELC project (Figure 3). However, in the future, we would recommend employing interviews to explore the project mechanisms that might have facilitated changes in equity-mindedness.

Figure 3. Distribution of Pre- and Post-Survey Responses on Equity-Mindedness. Given the numeric coding of the Likert scale, a higher mean agreement score means participants disagreed more with the statement. For example, in the post-survey, more participants disagreed that the responsibility for different student outcomes lies with student effort.

Note. Survey question: “To what extent do you agree with the following statements? The responsibility for different student outcomes among different groups of students lies with…”

Likert scale, 1 = Strongly Agree, 5 = Strongly Disagree

Research Question 3: How Did SELC Participation Impact the Institutions?

There are early signs the SELC experience will lead to longer-term institutional change. To address the perceived feasibility and influence of SELC-related equity work in local contexts, participants were asked to rate the perceived interest and receptiveness of SELC presentation audiences, and the perceived local influence of the SELC project (Figure 4). Due to the timing of the post-survey relative to the SELC presentations, 30 participants rated perceived interest and receptiveness, while 48 rated perceived local influence.

Figure 4. Perceived Feasibility and Influence of SELC-Related Equity Work in Local Contexts

Note. Survey items:

A: The audience(s) for my local SELC presentation was/were receptive to conversations on equity in STEM education

B: The audience(s) for my local SELC presentation was/were interested in the equity issues my SELC presented to them

C: The audience(s) for my local SELC presentation was/were receptive to the actions my SELC recommended for equity in STEM education

D: The audience(s) for my local SELC presentation is/are in positions where they can implement the actions my SELC recommended for equity in STEM education

E: The SELC project influenced other equity efforts on my campus and/or beyond my campus

Likert scale, 1 = Strongly agree, 5 = Strongly Disagree

In addition, audience members (N = 25) for SELC presentations were surveyed (separately) about their motivation for making equity-minded change in local contexts, and how SELC equity data might contribute to motivation (Figure 5).

Figure 5. Audience Member Motivation for Making Equity-Minded Change

Note. Survey questions:

A: How likely are you to make equity-minded changes in your sphere of influence (e.g., course, department, college) related to this presentation? (Likert scale, 1 = Not likely, 5 = Extremely likely)

B: Did you find the equity data shared in the presentation useful for motivating you to make equity-minded change? (Likert scale, 1 = Not at all useful, 5 = Extremely useful)

Research Question 4: How Did the Presence of Student Voices Impact Participants’ Personal SELC Project Expectations/Outcomes?

The authentic inclusion of students in this project had a substantial positive impact on the achievement of personal project expectations/outcomes, although a small number of respondents also described some limitations of student voices. First, we give an overview of personal project expectations/outcomes, which included practicing equity-minded strategies, building community, and learning about equity. Following that, we present participant feedback on how the inclusion of students impacted those practice-, domain-, and community-related project expectations/outcomes.

Participants (N = 57) were asked what they primarily hoped their SELC would achieve when they started the SELC experience. Overall, 63% of respondents identified practice-related outcomes, such as the development and/or implementation of specific SELC work products and work processes, and the implementation of that work in wider contexts. A further 25% of respondents identified domain-related outcomes, such as increasing understanding/awareness of equity-mindedness in STEM and increasing understanding of the importance/utility of data in addressing STEM equity issues. Finally, 19% of respondents identified community-related outcomes. These responses typically highlighted collaborative elements of SELCs and the value/meaning derived from them, such as collaboration/making connections, learning about equity in STEM from others, and authenticity through community. In the words of one respondent: “I hoped for our team to coalesce and develop a true collaboration.”

The authentic inclusion of students in this project substantially impacted achievement of personal project expectations/outcomes related to practice, domain, and community (Table 3). Faculty, facilitators, and institutional researchers (N = 43) were asked to describe the ways in which the presence of student voices impacted their SELC. Responses overwhelmingly (88%) identified positive impacts of student voices on the SELC projects. Respondents described the student-faculty collaboration in the SELCs as a useful microcosm of the teaching-learning environment and noted that the students’ perspectives on equity issues gave vital depth and context to the data. In addition, a widely shared sentiment was that student voices uniquely impacted the SELC projects in these ways. In the words of one facilitator: “The student voice was what set this program apart from others and I think that is one of the strongest parts.”

Table 3. Thematic Distribution of Positive Impact of Student Voices on Participants’ Personal Project Expectations/Outcomes

Community-Related (49% of responses)
Impact described on outcome: Value/meaning derived from student input in group/interpersonal interactions, including in the context of future equity work.
Representative quote: “It’s important to think of the teaching and learning environment as a community that works together for a common goal, and I think it’s good to be able to directly interact with the students about their experiences.”

Practice-Related (37% of responses)
Impact described on outcome: Specific contributions students made to work processes (e.g., discussions which led to actionable products) and organizational elements of the SELCs.
Representative quote: “Students were by far the best prepared in our meetings… they infused all other team members with a sense of responsibility that… elevated the quality of our conversations.”

Domain-Related (14% of responses)
Impact described on outcome: Increased awareness (e.g., through understanding, enhancing/reinforcing perspectives, providing insight) of STEM equity (including STEM equity literature) and STEM equity issues in teaching and learning.
Representative quote: “[Students] reminded us that some of the recommendations in the literature do not help, and that there is more to an (in)equitable student experience than just grades.”

Finally, a small number of participants (N = 3) highlighted practice-related drawbacks of including student voices, stemming from the uneven range of disciplinary and institutional knowledge across students and faculty.

Discussion

Findings From the Survey Data

Our pre- and post-survey data allowed us to address Research Questions 1, 2, and 4 directly and to provide initial indicators on Research Question 3. We were successful in increasing individual access to equity data (RQ1). In part, this follows mechanically from making the course equity reports a key expectation of the project that each institution committed to in advance. However, any institutional researcher will attest that providing useful data is not so simple. These reports were made possible by resources that supported the institutional researchers in developing them, including definitions of key variables, clearly written analysis code (Farrar et al., 2023), a Slack channel for asking questions and learning from each other, and funding to make their time available for this project.

Participating in this SELC experience was valuable for participants from all three spheres (faculty, student, staff) (RQ2). Participants overwhelmingly indicated the experience was worth the time and effort invested and will inform their future decision-making around equity work. Facilitators in particular found participating in the SELC project to be a beneficial professional development experience and recommended this collaborative cross-institution model for future projects. Looking ahead, participants intend to bring equity-minded activity into their classrooms and beyond, and to increase equity data access for others.

The SELC project’s impact on institutions (RQ3) is a longer-term change effort that will need to be measured several times after the end of the project. Since campus presentations were the most immediate mechanism for this change, we surveyed participants about how audience members responded to their presentations. Participants generally believed their audience members were very receptive to conversations on equity in STEM education and were interested in the equity issues presented. Participants also believed their audience members were in positions where they could make the changes recommended by the SELCs. While participants believed their audience members were receptive to the recommended actions, this inclination toward action was perceived less strongly than the other points. Most participants believed the SELC project impacted other equity efforts on their campus and beyond, but we do not yet have the data to specify the shape or extent of this impact.

Finally, involvement of students was a vital feature of the program (RQ4). When we asked the instructor, facilitator, and institutional researcher participants how student voices impacted their SELCs, we found the involvement of students was overwhelmingly linked to successful completion of participants’ initial goals and outcomes. Student SELC members increased the sense of community in the SELCs; contributed to productive teamwork, which then set the SELCs up for developing products and expanding equity work; and provided essential context in discussions of equity issues. Even though we do not yet have long-term data to determine whether structural change resulted from this project, we are confident that having students and instructors working together on shared goals is key. The enthusiasm and motivation generated through multi-role collaboration is promising for the sustained work necessary for tackling systemic inequities.

Reflections on Running the SELC Program

Beyond the analysis of the participant response data, we share some key insights from the development and implementation of the SELC project across these institutions. Two hard-won insights may provide value to others seeking to make similar changes on their campuses:

  1. Institutional barriers can prevent access to “sensitive” data, including student outcome data disaggregated by demographic group.

  2. An in-person launch event prior to sustained local learning communities is valuable.

The SELC institutions all made a strong commitment to promote equity using data. Yet even within this context, the institutions varied considerably in how accessible the data were. Different institutions’ administrative structures would periodically block access to certain types of data or restrict who could see equity reports and in what formats. For example, in one or two institutions SELC students received more limited access to the equity reports than the instructors, which made it more difficult for them to participate as full SELC members. Within the more restrictive institutions, the concerns varied; stated apprehensions arose from three areas: student privacy, FERPA restrictions, and potential reputational damage from leaked reports. However, because the data were aggregated, student privacy and FERPA concerns should have been minimal, suggesting that institutional privacy was the core concern. Overall, it is difficult to address systemic inequities if there is no way to disaggregate and display inequitable outcomes, or if key parties (e.g., students) are not permitted access. Institutions with greater data access can use their influence to persuade more restrictive institutions to release equivalent data (“institutional peer pressure”), helping to promote a higher education landscape that is more receptive to talking about institutional data. We should note that institutional data always remained exclusively within the domain of the employees and students of a given institution; equity reports were not shared between institutions.

We briefly described our Launch Event in Table 1. When planning the project, we recognized the importance of gathering people together in a common space to build relationships and trust. The event brought together participants from across the different campuses to create a common experience and dedicated time to engage with the equity work. This included providing a foundational layer of knowledge so that all participants felt well-equipped to engage in challenging conversations, but the event was also designed to help build personal connections. Gathering participants together by role helped them find others with similar perspectives and allowed the organizers to address common concerns about the project. This was perhaps most striking in the case of the students, who rapidly built connections and continued to maintain and use them throughout the year. Many facilitators also benefited greatly from connecting with each other during the launch event. These and other positive reflections were shared by participants in subsequent project meetings, and participants continued to refer back to positive experiences at the launch event throughout the project. They shared how the team norms they established for their SELCs during that event continued to inform their SELC processes throughout the academic year. Additionally, they gained an understanding of the full scope of the project, helping them see that their SELC work was not being done in isolation but was part of a much larger effort. Witnessing and clearly belonging to this broader community of practice was an early, positive experience for participants and helped them maintain their commitment to the project.

Limitations

The field setting of this project is helpful in that it demonstrates the real-world practicality of our approach, but it inevitably makes causal connections between the project structure and outcomes difficult to establish. Survey responses may have been affected by social desirability bias (Furnham, 1986) and/or selective response bias (Heckman, 1990), although the high response rates make selective response bias a minor concern. Relatedly, it is reasonable to assume that those who participated in the SELC project were already inclined toward learning about and working toward equity in STEM; the findings may therefore not generalize to a broader population of faculty, students, and staff. Furthermore, the long-term impacts, in particular on institution-level policies, are as yet unknown. Future research could determine how learning community members integrate equity-informed teaching practices into their classrooms, impact departmental or curricular change, and sustain this type of collaborative work. For our student participants in particular, future research could determine whether they continued to advocate for more equitable STEM environments and whether they continued this advocacy work in graduate school or their careers.

Conclusion

In summary, the SELC project was a multi-institutional collaboration that leveraged multidisciplinary and multi-role learning communities to motivate equity-minded change. Key to the project was including student participants in the SELCs, requiring that each SELC create course equity reports, involving trained facilitators in designing activities to guide SELCs in developing equity-minded action plans, and providing structured opportunities for collaboration across roles and institutions (SEISMIC, n.d.). Our SELC participants have many plans for future work to promote equitable outcomes for students in STEM, such as expanding access to equity reports and running future iterations of the SELC project, and we expect our courses, departments, and institutions will be impacted for years to come by these motivated individuals.

For those interested in structuring similar professional development work, we make three recommendations. First, provide sustained opportunities for instructors and students to work together on shared goals. Go beyond one-on-one partnerships to team-based efforts that leverage student expertise, instructor engagement, and university resources. In the words of one of our SELC participants, “Our students brought eye-opening perspectives and wisdom… I will be asking to include student voices on any committee I serve on in the future that is related to course transformation or equity.” Second, make access to and use of equity data a central component of the project. Our publicly accessible R code is available as a starting point for developing course equity reports and increasing data access on campus (Farrar et al., 2023). We hope to publish further guidance, attentive to institutional constraints, for institutional researchers implementing course equity reports on campuses new to accessing and sharing equity data. University administrators will also find this forthcoming publication a valuable resource as they advocate for equity reports on their campuses. Third, find ways to expand the community so that participants can connect with others in similar roles but different disciplinary or institutional contexts.

ORCiDs

Nita A Tarchinski: 0000-0002-5886-2303

Thomas Matthew Colclough: 0000-0002-9228-4404

Ashley Atkinson: 0009-0000-0507-0903

Emily Bonem: 0000-0002-1919-3143

Nathan Emery: 0000-0002-9766-8044

Victoria Farrar: 0000-0002-7892-1542

Madeleine Gonin: 0009-0006-1353-5671

Daniel Guberman: 0000-0002-2669-3197

Eleanor Louson: 0000-0002-1454-4688

Matthew Mahavongtrakul: 0000-0003-3048-3456

Timothy McKay: 0000-0001-9036-6150

Marco Molinaro: 0000-0001-8922-6978

Meryl Motika: 0000-0002-1071-9196

Lizette Alda Muñoz Rojas: 0000-0002-9937-6360

Hurshal Pol: 0009-0009-9298-4011

Kem Saichaie: 0000-0002-1658-7358

Kelsey Smart: 0009-0008-2839-7501

Megan Stowe: 0009-0009-6614-4959

Natasha T Turman: 0000-0001-6490-041X

Jenna Marie Thomas Vest: 0009-0007-7275-1080

Ryan Sweeder: 0000-0002-5488-4927

Biographies

Nita A. Tarchinski is a Project Manager of the national Sloan Equity and Inclusion in STEM Introductory Courses (SEISMIC) collaboration and a Grant-Writer for undergraduate education initiatives in the College of Literature, Science, and the Arts at the University of Michigan.

Thomas M. Colclough is a Postdoctoral Scholar in the Center for Knowledge, Technology, and Society at the University of California, Irvine.

Ashley E. Atkinson is the Promotion and Outreach Coordinator for C-SPIRIT, a National Science Foundation-funded global center focused on plant resilience, within the Plant Resilience Institute and Department of Biochemistry & Molecular Biology at Michigan State University.

Emily Bonem is the Assistant Director for Scholarship of Teaching and Learning at the Center for Instructional Excellence at Purdue University.

Nate Emery is the Associate Director of STEM Education at the Center for Innovative Teaching, Research, and Learning at the University of California Santa Barbara.

Victoria S. Farrar is a National Science Foundation Postdoctoral Fellow at the University of Illinois at Urbana-Champaign, and previously was a postdoctoral researcher in Neurobiology, Physiology and Behavior at the University of California Davis. Farrar completed the work for this project while at UC Davis.

Madeleine Gonin is the Assistant Director for the Center for Innovative Teaching & Learning (CITL) at Indiana University’s Bloomington campus.

Daniel Guberman is the Assistant Director for Inclusive Pedagogy in the Center for Instructional Excellence at Purdue University.

Eleanor Louson is an Educator Developer at Michigan State University’s Center for Teaching & Learning Innovation (CTLI) and is a member of the Science & Society teaching faculty in Lyman Briggs College, MSU’s undergraduate residential STEM college.

Matthew Mahavongtrakul is the Program Director of Educational Development at the University of California, Irvine’s Division of Teaching Excellence and Innovation (DTEI).

Timothy A. McKay is the Associate Dean for Undergraduate Education in the College of Literature, Science, and the Arts, and Arthur F. Thurnau Professor of Physics, Astronomy, and Education at the University of Michigan.

Marco Molinaro is the Executive Director for Educational Effectiveness and Analytics at the Teaching and Learning Transformation Center at the University of Maryland, College Park. He was formerly the Assistant Vice Provost and Founding Director for the Center for Educational Effectiveness (CEE) at the University of California, Davis. He completed work for this project at both locations.

Meryl Motika is the Divisional Assessment and Research Analyst for Student Affairs at the University of California, Berkeley. She previously served as Associate Director for Educational Analytics at UC Davis.

Lizette Muñoz Rojas is a Senior Teaching and Learning Consultant at the University of Pittsburgh’s University Center for Teaching and Learning and the Program Manager for the Graduate Student Teaching Initiative.

Hurshal Pol is a Biomedical Health Sciences student at Purdue University, with a minor in Human Rights Studies. She serves as a Student Pedagogy Advocate through the Center for Instructional Excellence, which led her to join the SEISMIC Collaboration and serve as a Student Consultant for the Purdue SELC.

Kem Saichaie is the inaugural Executive Director of the Teaching and Learning Center (TLC) at the University of California, Los Angeles, and was previously the Executive Director of the Center for Educational Effectiveness (CEE) at the University of California, Davis. Saichaie completed the work for this project while at UC Davis.

Kelsey Smart is a student at Purdue University majoring in Speech, Language, and Hearing Sciences, Sociology, and Linguistics. She works as a Student Pedagogy Advocate for Purdue’s teaching center, and this position introduced her to the SEISMIC Collaboration and led to her position as a Student Consultant for the Purdue SELC.

Megan Stowe is an assistant director in the Foundational Course Initiative at the University of Michigan’s Center for Research on Learning and Teaching, where she aids faculty in transforming large scale introductory courses into engaging, equitable, and inclusive learning experiences for all students.

Natasha T. Turman is the Director of the Women in Science and Engineering Residence Program, an academic living and learning community within the College of Literature, Science, and the Arts at the University of Michigan, Ann Arbor.

Jenna Vest is currently an Associate Research Scientist in the Assessment and Qualifications division at Pearson. She contributed to the SELC project while serving as a research analyst for the Center for Educational Effectiveness at University of California at Davis.

Ryan D. Sweeder is a Professor of Chemistry and Associate Dean for Research and Faculty Affairs at Lyman Briggs College, an undergraduate residential STEM college at Michigan State University.

Acknowledgments

We would like to thank our SELC participants for their enthusiastic engagement in this project. We would also like to acknowledge Nikeeetha Farfan D’Souza for supporting development of the SELC curriculum, Matthew Steinwachs for playing a central role in developing the SELC course equity reports, and Heather Rypkema for sharing her code for U-M equity reports, which was an important foundation for the development of our SELC course equity reports. This material is based upon work supported by the National Science Foundation under Grant Numbers 2215398 and 2215689.

Conflict of Interest Statement

The authors have no conflict of interest.

Data Availability

The data reported in this manuscript are publicly available at ICPSR: https://www.openicpsr.org/openicpsr/project/208258/version/V1/view.

References

AAAS. (2011). Vision and change in undergraduate biology education: A call to action. American Association for the Advancement of Science.

Abdurrahman, F. N., Turpen, C., & Sachmpazidi, D. (2022). A case study of cultural change: Learning to partner with students. Proceedings of the Physics Education Research Conference (PERC), 24–29.

Almeida, K. H. (2022). Disaggregated General Chemistry Grades Reveal Differential Success among BIPOC Students in Partial Flipped Team Learning Classrooms. Journal of Chemical Education, 99(1), 259–267.  http://doi.org/10.1021/acs.jchemed.1c00401

Asai, D. J. (2020). Race matters. Cell, 181(4), 754–757.  http://doi.org/10.1016/j.cell.2020.03.044

Bensimon, E. M. (2006). Learning equity-mindedness: Equality in educational outcomes. The Academic Workplace, 1(17), 2–21.

Bhatti, H. A. (2021). Toward “inclusifying” the underrepresented minority in STEM education research. Journal of Microbiology & Biology Education, 22(3), e00202-21.

Bovill, C. (2017). A Framework to Explore Roles Within Student-Staff Partnerships in Higher Education: Which Students Are Partners, When, and in What Ways? International Journal for Students as Partners, 1(1), Article 1.  http://doi.org/10.15173/ijsap.v1i1.3062

Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.  http://doi.org/10.1191/1478088706qp063oa

Canning, E. A., Muenks, K., Green, D. J., & Murphy, M. C. (2019). STEM faculty who believe ability is fixed have larger racial achievement gaps and inspire less student motivation in their classes. Science Advances, 5(2), eaau4734.

Carlisle, D. L., & Weaver, G. C. (2018). STEM education centers: Catalyzing the improvement of undergraduate STEM education. International Journal of STEM Education, 5(1), 47.  http://doi.org/10.1186/s40594-018-0143-2

Castle, S. D., Byrd, W. C., Koester, B. P., Pearson, M. I., Bonem, E., Caporale, N., Cwik, S., et al. (2024). Systemic advantage has a meaningful impact on student outcomes in introductory STEM courses at six research universities. International Journal of STEM Education, 11(14), 1–20.  http://doi.org/10.1186/s40594-024-00474-7

Castro, E. L. (2014). “Underprepared” and “at-risk”: Disrupting deficit discourses in undergraduate STEM recruitment and retention programming. Journal of Student Affairs Research and Practice, 51(4), 407–419.

CEE. (2024). Know your students. Center for Educational Effectiveness. https://cee.ucdavis.edu/know-your-students

Center for Urban Education. (n.d.). CUE’s racial equity tools. Center for Urban Education. Retrieved December 27, 2023, from https://www.cue-tools.usc.edu/

Colclough, T., Howitz, W. J., Mann, D., Kearns, K., & Hoffmann, D. S. (2023). Meanings of community: Educational developers experience care, satisfying contributions, and belonging in a collaboration across institutions. To Improve the Academy: A Journal of Educational Development, 42(2): 9.  http://doi.org/10.3998/tia.2637

Cook-Sather, A., Matthews, K. E., Ntem, A., & Leathwick, S. (2018). What we talk about when we talk about students as partners. International Journal for Students as Partners, 2(2), 1–9.  http://doi.org/10.15173/ijsap.v2i2.3790

Crenshaw, K. (2005). Mapping the margins: Intersectionality, identity politics, and violence against women of color (1994). In Violence against women: Classic papers (pp. 282–313). Pearson Education.

Creswell, J. W., & Poth, C. N. (2016). Qualitative inquiry and research design: Choosing among five approaches. Sage.

CRLT. (n.d.). Foundational Course Initiative. Center for Research on Learning & Teaching. Retrieved December 31, 2023, from https://crlt.umich.edu/fci

Eddy, S. L., & Brownell, S. E. (2016). Beneath the numbers: A review of gender disparities in undergraduate education across science, technology, engineering, and math disciplines. Physical Review Physics Education Research, 12(2), 020106.

Farrar, V., Steinwachs, M., Tarchinski, N. A., & Thomas, J. (2023). SEISMIC equity measures [GitHub]. SEISMIC. https://github.com/centerforbiophotonics/SEISMIC-equity-measures

Fiorini, S., Tarchinski, N., Pearson, M., Valdivia Medinaceli, M., Matz, R. L., Lucien, J., Lee, H. R., et al. (2023). Major curricula as structures for disciplinary acculturation that contribute to student minoritization. Front. Educ., 8, 1–16.  http://doi.org/10.3389/feduc.2023.1176876

Fischer, C., Witherspoon, E., Nguyen, H., Feng, Y., Fiorini, S., Vincent-Ruz, P., Mead, C., et al. (2023). Advanced placement course credit and undergraduate student success in gateway science courses. Journal of Research in Science Teaching, 60(2), 304–329.  http://doi.org/10.1002/tea.21799

Furnham, A. (1986). Response bias, social desirability and dissimulation. Personality and Individual Differences, 7(3), 385–400.  http://doi.org/10.1016/0191-8869(86)90014-0

Garcia, S. B., & Guerra, P. L. (2004). Deconstructing deficit thinking: Working with educators to create more equitable learning environments. Education and Urban Society, 36(2), 150–168.

Hatfield, N., Brown, N., & Topaz, C. M. (2022). Do introductory courses disproportionately drive minoritized students out of STEM pathways? PNAS Nexus, 1(4), 1–10.  http://doi.org/10.1093/pnasnexus/pgac167

Heckman, J. (1990). Varieties of Selection Bias. American Economic Review, 80(2), 313–318.

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984.  http://doi.org/10.1002/tea.20439

Henderson, C., & Dancy, M. H. (2007). Barriers to the use of research-based instructional strategies: The influence of both individual and situational characteristics. Physical Review Special Topics-Physics Education Research, 3(2), 1–14.

Henderson, C., Finkelstein, N., & Beach, A. (2010). Beyond Dissemination in College Science Teaching: An Introduction to Four Core Change Strategies. Journal of College Science Teaching, 39(5), 18–25.  http://doi.org/10.2505/3/jcst10_039_05

Ireland, D. T., Freeman, K. E., Winston-Proctor, C. E., DeLaine, K. D., McDonald Lowe, S., & Woodson, K. M. (2018). (Un) Hidden figures: A synthesis of research examining the intersectional experiences of black women and girls in STEM education. Review of Research in Education, 42(1), 226–254.

Kezar, A., & Gehrke, S. (2015). Communities of transformation and their work scaling STEM reform. Pullias Center for Higher Education.

Kezar, A., & Gehrke, S. (2017). Sustaining Communities of Practice Focused on STEM Reform. The Journal of Higher Education, 88(3), 323–349.

Kirkpatrick, D., & Kirkpatrick, J. (2006). Evaluating training programs: The four levels. Berrett-Koehler Publishers.

Krippendorff, K. (2018). Content analysis: An introduction to its methodology. Sage.

Laursen, S. (2019). Levers for change: An assessment of progress on changing STEM instruction (p. 200). AAAS. https://aaas-iuse.org/resource/levers-for-change-an-assessment-of-progress-on-changing-stem-instruction/

Lundy, L. (2007). ‘Voice’ is not enough: Conceptualising Article 12 of the United Nations Convention on the Rights of the Child. British Educational Research Journal, 33(6), 927–942.

Mack, K. M., Winter, K., & Soto, M. (2019). Culturally responsive strategies for reforming STEM higher education: Turning the TIDES on inequity. Emerald Group Publishing.

Matthews, K. E. (2017). Five propositions for genuine students as partners practice. International Journal for Students as Partners, 1(2).

Matthews, K. E., & Cook-Sather, A. (2021). Engaging students as partners in assessment and enhancement processes. In M. Shah, J. T. E. Richardson, A. Pabel, & B. Oliver (Eds.), Assessing and enhancing student experiences in higher education. Palgrave Macmillan, Cham.  http://doi.org/10.1007/978-3-030-80889-1_5

Matthews, K. E., Mercer-Mapstone, L., Dvorakova, S. L., Acai, A., Cook-Sather, A., Felten, P., Healey, M., Healey, R. L., & Marquis, E. (2019). Enhancing outcomes and reducing inhibitors to the engagement of students and staff in learning and teaching partnerships: Implications for academic development. International Journal for Academic Development, 24(3), 246–259.  http://doi.org/10.1080/1360144X.2018.1545233

Matz, R. L., Koester, B. P., Fiorini, S., Grom, G., Shepard, L., Stangor, C. G., Weiner, B., et al. (2017). Patterns of gendered performance differences in large introductory courses at five research universities. AERA Open, 3(4), 1–12.  http://doi.org/10.1177/2332858417743754

McNair, T. B., Bensimon, E. M., & Malcom-Piqueux, L. (2020). From equity talk to equity walk: Expanding practitioner knowledge for racial justice in higher education. John Wiley & Sons.

Mercer-Mapstone, L., & Abbot, S. (2020). The Power of Partnership: Students, Staff, and Faculty Revolutionizing Higher Education. Elon University Center for Engaged Learning. https://www.centerforengagedlearning.org/books/power-of-partnership/

Michaels, K., & Milner, J. (2021). Powered by publics learning memo: The big ten academic alliance cluster exploring foundational course DFW rates, equity gaps, and progress to degree (APLU’s Powered by Publics, pp. 1–5). https://www.aplu.org/wp-content/uploads/powered-by-publics-learning-memo-the-big-ten-academic-alliance-cluster.pdf

Narayanan, D., & Abbot, S. (2020). Increasing the participation of underrepresented minorities in STEM classes through student-instructor partnerships. In L. Mercer-Mapstone & S. Abbot (Eds.), The power of partnership: Students, faculty, and staff revolutionizing higher education (pp. 181–195). Elon University Center for Engaged Learning.  http://doi.org/10.36284/celelon.oa2

Ngai, C., Corbo, J. C., Falkenberg, K. L., Geanious, C., Pawlak, A., Pilgrim, M. E., Quan, G. M., et al. (2020). Facilitating change in higher education: The departmental action team model (1st ed.). Glitter Cannon Press. www.dat-project.org

Oh, S.-Y. (2014). Learning to navigate quickly and successfully: The benefits of working with a student consultant. Teaching and Learning Together in Higher Education, 1(11), 5.

Patton, M. Q. (1990). Qualitative evaluation and research methods. Sage.

Quan, G. M., Corbo, J. C., Finkelstein, N. D., Pawlak, A., Falkenberg, K., Geanious, C., Ngai, C., et al. (2019). Designing for institutional transformation: Six principles for department-level interventions. Physical Review Physics Education Research, 15(1), 010141.

Quan, G. M., Corbo, J. C., Wise, S., & Ngai, C. (2021). Unpacking challenges in student-faculty partnerships on Departmental Action Teams. 2021 Physics Education Research Conference Proceedings, 353–358. https://www.per-central.org/items/detail.cfm?ID=15780

Reinholz, D. L., Corbo, J. C., Dancy, M., & Finkelstein, N. (2017). Departmental action teams: Supporting faculty learning through departmental change. Learning Communities Journal, 9, 5–32.

Riegle-Crumb, C., King, B., & Irizarry, Y. (2019). Does STEM stand out? Examining racial/ethnic gaps in persistence across postsecondary fields. Educational Researcher, 48(3), 133–144. http://doi.org/10.3102/0013189X19831006

Ronayne Sohr, E., Gupta, A., Johnson, B. J., & Quan, G. M. (2020). Examining the dynamics of decision making when designing curriculum in partnership with students: How should we proceed? Physical Review Physics Education Research, 16(2), 020157. http://doi.org/10.1103/PhysRevPhysEducRes.16.020157

Russo-Tait, T. (2023). Science faculty conceptions of equity and their association to teaching practices. Science Education, 107(2), 427–458. http://doi.org/10.1002/sce.21781

Saldaña, J. (2015). The coding manual for qualitative researchers (2nd ed.). Sage.

SEISMIC. (n.d.). Home. SEISMIC Collaboration. Retrieved January 17, 2024, from https://www.seismicproject.org/

Smith, D., Finkelstein, N., Horii, C. V., Miller, E., & York, T. (2024). Levers for change: Enacting a national agenda for undergraduate STEM education. AAAS.

Stains, M., Harshman, J., Barker, M. K., Chasteen, S. V., Cole, R., DeChenne-Peters, S. E., Eagan, M., et al. (2018). Anatomy of STEM teaching in North American universities. Science, 359(6383), 1468–1470. http://doi.org/10.1126/science.aap8892

Valencia, R. R. (1997). Conceptualizing the notion of deficit thinking. In The evolution of deficit thinking: Educational thought and practice (pp. 1–12). RoutledgeFalmer.

Valencia, R. R. (2010). Dismantling contemporary deficit thinking: Educational thought and practice. Routledge.

Weatherton, M., & Schussler, E. E. (2021). Success for all? A call to re-examine how student success is defined in higher education. CBE—Life Sciences Education, 20(1), es3. http://doi.org/10.1187/cbe.20-09-0223

Wenger, E. (1998). Communities of practice: Learning, meaning, and identity. Cambridge University Press.

Wenger, E. (2009). A social theory of learning. In Contemporary theories of learning: Learning theorists … in their own words (pp. 209–218). Routledge.

Wenger, E. (2010). Communities of practice and social learning systems: The career of a concept. In Social learning systems and communities of practice (pp. 179–198). Springer.

Whitcomb, K. M., Cwik, S., & Singh, C. (2021). Not all disadvantages are equal: Racial/ethnic minority students have largest disadvantage among demographic groups in both STEM and non-STEM GPA. AERA Open, 7. http://doi.org/10.1177/23328584211059823