Introduction
Much recent research in higher education has supported the view that adopting instructional practices, like active learning and inclusive teaching, in lieu of more traditional lecture-based models benefits all students (e.g., Freeman et al., 2014; Theobald et al., 2020). Despite this research, however, the lecture model remains the dominant teaching strategy. Although many instructors may continue to use traditional lecturing because of barriers to faculty pedagogical change (e.g., lack of training, time, and incentives) (Brownell & Tanner, 2012), many others think they are incorporating these techniques more than they actually are (e.g., Sheridan & Smith, 2020). These findings suggest that many instructors are unable to reflect accurately on their teaching techniques on their own. Yet reflective teaching ability is a precursor to instructors being able to implement meaningful teaching change (Henderson et al., 2011; McPartlan et al., 2022). Thus, it is important to identify strategies that promote higher education instructors’ reflective thinking abilities.
Centers for Teaching and Learning (CTLs) offer many types of professional development (PD) for instructors that may affect reflective thinking. For example, individual teaching consultations, communities of practice (CoP) (Adams et al., 2023), and unstructured classroom observations are often used, possibly because “they can be so flexibly employed to … confront an instructor about discrepancies between goals and practice” (Wright, 2023, p. 107). Classroom observations, which invite an educational developer or trained faculty to observe classroom dynamics, may prove particularly effective at promoting reflection on teaching when coupled with individual or group meetings to review the findings. Indeed, it has been argued that the addition of observation data into a consultation carries more benefits than a consultation alone because it offers a “better picture of teaching behaviors to be discussed in the consultation session” (Penny & Coe, 2004, p. 237). Some researchers even argue that to have the greatest impact on teaching, it is important to pair the classroom observation with another approach, such as reflection or collaboration, because “how people think about and use teaching-related data is a complex and idiosyncratic process” (Wright, 2023, p. 78).
Although CTLs typically provide unstructured classroom observations along with follow-up consultation meetings, our personal experiences and observations indicate that very few have documented the use of structured observation protocols as part of their repertoire. Structured protocols differ from unstructured protocols in that they have been specifically validated in their ability to quantify certain instructional practices in the classroom. For example, the Classroom Observation Protocol for Undergraduate STEM (COPUS) (Smith et al., 2013) and Decibel Analysis for Research in Teaching (DART) (Owens et al., 2017) quantify the amount of time that active and non-active learning strategies are being used during instruction (e.g., lecturing vs. group work), while the Protocol for Advancing Inclusive Teaching Efforts (PAITE) (Addy et al., 2022) has observers identify the use of specific inclusive teaching practices (e.g., providing diverse examples). Structured observation protocols have an advantage over unstructured protocols in that they are standardized, validated by research, and low cost to implement; however, they were primarily developed as measures of teacher change in research rather than as components of a PD opportunity led by CTLs.
How Have Structured Classroom Observation Protocols Been Used as PD?
Despite the possibility of combining structured classroom observation results with meetings with CTL staff, few have done so in higher education settings. It is more common for structured classroom observation protocols to be used as research tools to evaluate the effectiveness of PD, rather than as a tool to drive reflection on teaching and, ultimately, pedagogical change (e.g., Ebert-May et al., 2015; Tomkin et al., 2019; Wheeler & Bach, 2021).
For example, Ebert-May et al. (2015) used the Reformed Teaching Observation Protocol (RTOP) to examine the extent to which postdoctoral fellows implemented evidence-based pedagogies after completing a two-year PD program. They found that the instructors demonstrated increased student engagement and reformed teaching per the RTOP; however, these classroom observation results were used as an outcome assessment tool rather than a method for prompting reflection on teaching. Other researchers have similarly used structured observation data to assess teaching change. Tomkin et al. (2019) examined the effectiveness of CoPs on STEM instructors’ use of active learning in their large-enrollment courses. They used COPUS to quantify the amount of active learning across instructors who did versus did not participate in CoPs. They found that those in the CoPs were more likely to employ student-centric practices (e.g., asking questions) than those who did not. Finally, Wheeler and Bach (2021) explored the impacts of three educational development programs—a weeklong course design institute, a new-faculty learning community, and a STEM learning community—on instructional practices (as measured by COPUS and coding course syllabi) and student performance outcomes (as measured by student grades). They found that courses taught by new-faculty learning community participants had significantly more active learning and more learning-focused syllabi compared to courses taught by new faculty who had not engaged in any interventions. While these results present promising methods for promoting teacher change, the classroom observation data were once again used to assess outcomes rather than as a means to promote the reflection on teaching practices that could inspire such change.
While most studies on teacher improvement in higher education have not leveraged structured observations to support reflection on instruction, there are some notable examples. Jackson et al. (2022) used the Practical Observation Rubric to Assess Active Learning (PORTAAL) (Eddy et al., 2015) to explore the impact of a two-year STEM CoP on instructor use of evidence-based teaching practices (EBTPs). PORTAAL was used as both an assessment and a PD tool, as pre-post PORTAAL scores were compared to examine the effectiveness of the CoP program, and PORTAAL results were distributed to CoP members at regular intervals to promote discussion about use of the EBTPs. Other studies have used COPUS; Johnson et al. (2024) recently reported on their successful use of COPUS to prompt higher education mathematics instructors to incorporate more effective teaching practices into their courses. Reisner et al. (2020) emphasized the importance of reviewing classroom data for promoting teaching change by developing a guide for how faculty should review and interpret their COPUS results after being observed.
The Present Study: Combining Structured Observations with Meetings with CTL Staff
Though a growing body of research demonstrates the benefits of structured observations, used alone or alongside additional PD opportunities (e.g., meetings with CTL staff), for measuring teaching change, these studies have not emphasized how an instructor can use the data to reflect upon their teaching practices in a meeting with CTL staff. Thus, our study sought to develop a PD opportunity that combined individual or group meetings with structured observation data, a program we call “data-informed professional development” (DIPD). The goal of the DIPD was to support instructors in developing their reflective teaching skills, which has been shown to create instructional change in undergraduate STEM courses (Henderson et al., 2011).
To develop the DIPD, we kept in mind “key features of [PD] programs that promote faculty to incorporate more EBTPs: long-term support of faculty lasting at least a semester … and efforts focused on encouraging faculty to be reflective about their teaching” (Jackson et al., 2022, p. 2). We asked faculty to be involved for at least one term (quarter or semester, depending on the institution), as this time frame would be long enough to observe reflection on teaching (measuring actual changes in teaching would take much longer). Their involvement included completing a teaching reflection, having two class sessions observed with COPUS (a commonly used structured observation protocol for undergraduate STEM courses), participating in at least one individual or group meeting facilitated by CTL staff, completing an exit interview (our main source of qualitative data), and having a final opportunity to update their original teaching reflection. To encourage reflection on teaching practices, we structured the CTL-facilitated meetings such that the facilitators asked instructors questions about their structured observation data. These meetings occurred either as a one-time, individual consultation typical of CTL meetings or as a series of two to three small group meetings with the CTL staff and one to three other instructors participating in the program. As there are pros and cons to each format (i.e., individual meetings require fewer resources and are more time efficient, but multiple group meetings provide opportunities for instructors to compare their experiences and build relationships with each other over time), we sought to compare the reflections from individual and group meetings. Given that our group sessions were smaller and less intensive than traditional CoPs, we have opted to refer to these as “small group meetings,” rather than CoPs, though they were inspired by the CoP work discussed above.
Our DIPD was implemented at three different research-intensive universities in the Western United States. We asked the following exploratory research questions to obtain a sense of how well the program worked to prompt instructor reflection on their teaching practices (i.e., to demonstrate proof-of-concept):
How did participation in the DIPD program prompt instructors to reflect on their current teaching practices?
Which factors of the DIPD program contributed most to these reflections?
Theoretical Framework
The theoretical framework guiding our research was Teacher-Centered Systemic Reform (TCSR) (Woodbury & Gess-Newsome, 2002; Gess-Newsome et al., 2003). TCSR recognizes that multiple factors influence teaching and teacher change: personal factors (e.g., years of teaching experience), teachers’ thinking (e.g., knowledge of teaching practices; dissatisfaction with current teaching practices), and contextual factors (classroom and school contexts) (Woodbury & Gess-Newsome, 2002). We chose this framework because of its focus on teachers’ thinking, the primary variable we sought to affect via participation in the DIPD. Many have argued that what teachers do is influenced by what they think, so teacher change efforts should be geared toward changing their thinking about their teaching (e.g., Clark & Peterson, 1986; Cooney & Shealy, 1997). Critically, TCSR argues that teachers’ thinking directly influences their practice, so identifying changes in teachers’ thinking can eventually lead to changes in teaching practices.
Several studies have used TCSR to create and evaluate the effectiveness of PD opportunities. For example, Stains et al. (2015) examined how the teacher attitudes, beliefs, and practices (i.e., teachers’ thinking) addressed in a PD program affected new chemistry professors’ likelihood of adopting and sustaining use of EBTPs in their courses. They found that the PD opportunity was successful in affecting these outcomes in the short term, but such implementation was not always sustained. Similarly, Idsardi et al. (2023) used the TCSR model to study the relationships between STEM faculty members’ conceptions of how students learn and the instructional practices they used in their classrooms. They concluded that the most effective instructional change strategies would address not only teaching practices but also instructors’ personal characteristics and the contextual factors within which instructors are embedded.
Unlike these prior studies, we did not use TCSR in the development of the DIPD program but rather to make sense of the qualitative results from instructors’ post-DIPD exit interviews. Specifically, we paid attention to instances of instructors mentioning dissatisfaction with their current teaching practices as identified in the COPUS data. Additionally, we looked for instructor mentions of personal and contextual factors that would shape their response to the DIPD, as these cannot be separated from the DIPD experience.
Method
This study was approved by the Institutional Review Board at each of the three campuses (University 1: #1219, University 2: #2020-3, University 3: #HS-21-132). Given the different institutional contexts (e.g., CTLs having/not having prior experience with using COPUS data to facilitate meetings), there were minor differences in how the DIPD was carried out at each campus. Those differences are described in detail in Appendix A; however, below we describe the commonalities shared across the three institutions.
Institutions and Participants
Instructors from three public, research-intensive, large-enrollment, Minority-Serving Institutions with dedicated CTLs were recruited to participate for a single academic term. When selecting participants, we sought broad representation of instructors across disciplines, appointment types, and years of teaching experience. Most instructors were teaching large-enrollment introductory courses. We recruited a total of 18 instructors; however, two withdrew before the interview stage, resulting in a final sample of 16. Instructors included non-tenure-track lecturers and tenure-track professors of teaching who taught mostly life and physical sciences and had 1–27 years of teaching experience; most had previously worked with their campus’s CTL prior to participation in the study (Table 1). Given the substantial time commitment required, instructors received an Amazon gift card as compensation: $300 for participating in one individual meeting or $500 for the series of small group meetings.
Table 1. Instructor and Course Demographics
Campus | Name (Pseudonym) | Title/Rank | Discipline | Years of teaching experience | Prior experience with COPUS | CTL-facilitated meeting structure |
Uni 1 | Sierra | Associate Professor of Teaching | Computer Science | 16 | Yes | Individual meeting |
Uni 1 | Eileen | Assistant Professor of Teaching | Biomedical Engineering | 4 | Yes | Individual meeting |
Uni 1 | Morgan | Associate Professor of Teaching | Psychological Science | 14 | No | Individual meeting |
Uni 1 | Kaitlyn | Lecturer | Economics | 12 | Yes | Individual meeting |
Uni 1 | Melissa | Lecturer | Cognitive Science | 25 | Yes | Small group meetings |
Uni 1 | Tiffany | Professor of Teaching | Chemistry | 27 | Yes | Small group meetings |
Uni 1 | Robert | Assistant Professor of Teaching | Computer Science | 9 | Yes | Small group meetings |
Uni 2 | Daphne | Continuing Lecturer | Chemistry and Biochemistry | 16 | Yes | Individual meeting |
Uni 2 | Kailash | Continuing Lecturer | Molecular and Cell Biology | 14 | Yes | Individual meeting |
Uni 2 | Monica | Assistant Professor of Teaching | Life and Environmental Sciences | 10 | Yes | Small group meetings |
Uni 2 | Jarnila | Lecturer | Molecular and Cell Biology | 1 | Yes | Small group meetings |
Uni 2 | Lindsay | Lecturer | Molecular and Cell Biology | 1 | Yes | Small group meetings |
Uni 3 | Belinda | Lecturer | Computer Science & Engineering | 8 | No | Individual meeting |
Uni 3 | Emelie | Associate Professor of Teaching | Education | 15 | No | Small group meetings |
Uni 3 | Serafina | Assistant Professor of Teaching | Hispanic Studies | 2 | No | Individual meeting |
Uni 3 | Joseph | Associate Professor of Teaching | Chemistry | 6 | No | Small group meetings |
Note. Uni stands for university. Lecturer indicates a non-tenure-track lecturer.
Data-Informed Professional Development (DIPD)
To prepare for the CTL meeting(s), participants 1) completed a brief narrative teaching reflection about their teaching practices and 2) had their large lecture courses observed twice by trained observers from the CTLs using the COPUS protocol as described in Smith et al. (2013). They then participated in a meeting facilitated by their respective campus’s CTL staff, which took one of two forms: half of the 16 participants received an individual meeting, and the other half took part in a series of two small group meetings with one to three other instructors, facilitated by CTL staff. All meetings centered on the instructors’ COPUS data from their courses. Then, each instructor participated in a semi-structured exit interview. Finally, they were given the option to revise their pre-study narrative teaching reflection (i.e., post-meeting teaching reflection).
Part 1: Pre-Meeting Teaching Reflection
Before the meetings with CTL staff, participants completed a two-question teaching reflection in which they were asked to describe 1) their approach to teaching the observed course in person and why they chose that approach (500 words) and, given that this study took place during the transition back from online instruction, 2) any changes to their teaching that occurred due to the COVID-19 pandemic (200–300 words). Participants completed this teaching reflection before their first classroom observation.
Part 2: Classroom Observations
Before the meetings with CTL staff, each participant had their class observed in person twice for the entire duration of each class period, following the COPUS protocol (Smith et al., 2013). COPUS categorizes classroom activities in two-minute time intervals by “what the instructor is doing” (e.g., lecturing, answering student questions) and “what the students are doing” (e.g., listening quietly, completing worksheets). Observers at all three campuses completed a three-hour training led by one of the authors or by CTL staff at that campus, following the training outlined in Smith et al. (2013), which included reviewing the same videos for reliability. Participating instructors were allowed to choose their observation days to avoid having exams or student presentations (with limited instructor activity) observed. Most observations were conducted within two to three weeks of each other, but all occurred within a single academic term.
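To make the structure of these data concrete, the minimal sketch below illustrates how interval-level COPUS codes are commonly summarized into the percentage-of-intervals values that instructors receive. The code abbreviations are drawn from Smith et al. (2013), but the data and function here are hypothetical illustrations, not the CTLs’ actual analysis scripts.

```python
from collections import Counter

# Hypothetical COPUS records: one set of codes per two-minute interval
# (multiple codes can co-occur within an interval). Abbreviations follow
# Smith et al. (2013): instructor "Lec" = lecturing, "AnQ" = answering
# questions, "MG" = moving through class guiding student work; student
# "L" = listening, "SQ" = asking a question, "WG" = group worksheet work.
instructor_intervals = [{"Lec"}, {"Lec"}, {"Lec", "AnQ"}, {"MG"}, {"Lec"}]
student_intervals = [{"L"}, {"L"}, {"L", "SQ"}, {"WG"}, {"L"}]

def percent_of_intervals(intervals):
    """Return the percentage of two-minute intervals in which each code appears."""
    counts = Counter(code for interval in intervals for code in interval)
    return {code: 100 * n / len(intervals) for code, n in counts.items()}

print(percent_of_intervals(instructor_intervals))
# {'Lec': 80.0, 'AnQ': 20.0, 'MG': 20.0}
print(percent_of_intervals(student_intervals))
# {'L': 80.0, 'SQ': 20.0, 'WG': 20.0}
```

Summaries of this kind (e.g., the percentage of intervals spent lecturing versus in group work) are what participants later reviewed in the CTL-facilitated meetings.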
Part 3: Individual Meeting versus Small Group Meetings
We randomly assigned instructors to either an individual meeting (akin to a typical CTL interaction) or a series of two small group meetings with one to three other faculty.
Individual Meeting. These were 50–60-minute consultations with one instructor, led by members of the CTL staff. Instructors were emailed their COPUS data in advance and were encouraged to review their data ahead of the meeting but were not given specific instructions about how to do so, as review of the data would occur during the meeting. Providing the data in this way mimicked how the CTLs involved in this project typically shared information from COPUS observations outside of this study. In this way, we were able to ask our instructor participants questions about how well they were able to interpret and reflect on their data both with and without CTL staff support. During the sessions, the facilitators provided an overview of the COPUS data, and discussion was guided by four questions developed by the CTL at University 2:
Based on these data, what do you think you are doing that is working well to help students learn best?
Did anything surprise you about these data?
What questions do you have after seeing these data?
What is something you would like to explore further?
Instructors were also encouraged to discuss their teaching practices identified in the COPUS results, as well as anything else they wanted to mention related to their teaching.
Small Group Meetings. CTL staff facilitated a series of two 60–90-minute meetings where groups of two to three instructors were given their COPUS results and asked to review these data before the first session. The first session proceeded in much the same way as the individual meetings outlined above, though the additional faculty participants in the group contributed to the conversation. During the second meeting, participants revisited ideas from the first session and answered follow-up questions that differed, depending on the nature of the first group session (e.g., University 2: “How will you share your evaluation of this data with your students in a way they will understand?”; University 3: “Was there anything about the data you wanted to revisit?”).
Part 4: Semi-Structured Exit Interview
Instructors then participated in a one-hour, 11-question semi-structured exit interview. The questions were designed to gain an understanding of each instructor’s general propensity to consider making changes to their teaching outside of the study, their specific response to the COPUS results from their classroom, their feelings regarding the meeting component of the study, and whether they were considering making changes to their teaching as a result of participating in this study (i.e., their reflection on their data from the structured observation protocol). They were also asked for suggestions on improving the DIPD design, such as how their CTL might scale up the opportunity to offer it to more instructors (see Appendix B for a summary of these suggestions). See Appendix C for the full set of interview questions. These interviews were recorded and transcribed for data analysis.
Part 5: Post-Meeting Teaching Reflection
Instructors were given the opportunity to revise their original teaching reflections upon completion of the study; however, the rate of revision (University 1 = 3/7; University 2 = 5/5; University 3 = 1/4) and extent of revision varied across campuses.
Data Analysis
We primarily analyzed the content of the semi-structured interviews, supplemented with a general review of the narrative teaching reflections. We used inductive coding (Fereday & Muir-Cochrane, 2006; Miles et al., 2020) to analyze the interviews for all faculty participants. Initially, we all reviewed the same set of transcripts, which included participants randomly selected from each campus. On this first pass, we identified common ideas in the transcripts, then met to discuss those ideas and began building the codebook. This process was repeated iteratively to arrive at the final set of overarching categories and component codes aligned with our two research questions. Once the final set of categories and codes had been decided on, one author from each campus (Authors 1, 3, and 9) reviewed and coded all transcripts from their home campus separately. In a final series of meetings, we discussed any ambiguous data, came to consensus, and made final revisions to the codebook as appropriate.
Reliability and Validity
Given the scale of this study (multiple campuses and disciplines), it was critical to ensure that our analysis of the qualitative data was reliable and produced valid results (Creswell & Miller, 2000). We took several measures across the entirety of the study to achieve this, taking the perspective that establishing reliability is “a process, not a product” (Syed & Nelson, 2015, p. 384).
For reliability, we checked the transcripts to ensure that they did not contain obvious transcription mistakes, and coders compared results derived independently at each institution, meeting to come to consensus. For validity (i.e., building trustworthiness in the data), we used multiple strategies during data collection and analysis. First, we clarified the experiences and biases the researchers brought to the study by writing and examining our positionality statement (see Appendix D). Second, we ensured that all observers using the COPUS protocol were sufficiently and similarly trained via cross-campus collaboration. Third, we ensured that the meetings were implemented by CTL staff members who were knowledgeable in the use of COPUS data to facilitate teaching discussions. The only exception was at University 3, where this had not been done before; however, the team at University 3 consulted with the University 1 and 2 teams (including using their meeting materials) to ensure that they implemented the meetings similarly. Fourth, when coding the qualitative data, we first reviewed interviews from all three campuses as a team, ensuring that similar themes and codes were identified across the campuses. As we proceeded through the iterative process of developing the codebook together, the coder for each campus was a member of the research team from that campus. This was important because it meant that the data were ultimately coded by the team member with the most contextual knowledge about the campus, the CTL, and the participants.
Results
We first discuss the extent to which participating in the DIPD impacted instructors’ reflections on their teaching practices (RQ1) and then what factors contributed the most to these reflections (RQ2). It is important to note that our results are written to specifically address each research question. Thus, some codes from our codebook are not discussed below (see Appendix E for the full codebook). Codes discussed in-text are included in parentheses and italicized.
Research Question 1: How did participation in the DIPD prompt instructors to reflect on their current teaching practices?
Most Instructors Reflected on Methods to Boost Student Engagement
Promisingly, instructors across all three universities identified that participation in the DIPD made them reflect on ways to change their teaching. Specifically, they noted lower levels of student engagement associated with their current teaching practices and expressed a desire to make changes to address this problem (prompted new ideas and reflection, improve student engagement). For example, Joseph from University 3 stated, “I think the COPUS data … supported the need for additional … interaction with my students in lecture,” highlighting how the observation data reflected the lack of student engagement instructors had noticed during their classes (the observation data aligned with expectations). Additionally, the DIPD helped instructors identify what changes to make to address the engagement issues highlighted in the observation data (felt supported): “[Redacted] gave me some great recommendations of what things I can do in the class to kind of bring up the engagement and have the … discussion increase some” (Belinda, University 3).
Though these instructors had noticed lower levels of student engagement in their courses prior to the study, seeing their COPUS data and participating in the DIPD reinforced their observations and provided a platform for them to discuss ways to move forward with addressing student engagement challenges.
Some instructors in the study specifically identified decreased levels of student engagement as an ongoing challenge stemming from the disruption of the COVID-19 pandemic. For example, Morgan from University 1 noted (improve student engagement):
Yeah. I think some changes I would make … related to this program … [are] more post pandemic related changes. But I think there probably are more opportunities for students to talk to one another, I think. That’s especially important now that students haven’t been in person … there’s this, I think real sense of anonymity from being online for so long. And I don’t think that was there before. So I wasn’t as worried I think about students interacting with each other. But now I think it’s valuable. The question is what activities can I have them do that are useful and not just an excuse for them to talk to each other and me to buy time?
Similarly, Lindsay from University 2, an instructor relatively new to COPUS and working with her CTL, highlighted her interest in connecting with her students and creating more opportunities for them to engage in group activities in the wake of the COVID-19 pandemic (improve student engagement):
I don’t know if it’s just … [that] we’ve been in that virtual environment and there’s this space between the [instructor] and the students, but this semester seems to be harder to connect with the students. And while the summer is going to be online again, I still want to find ways that I can make that connection … to … have them engage in … this group activity, even if we’re in different cities.
For these instructors, discussion of the observation data during the meeting offered an opportunity to surface and discuss these salient, often pandemic-related concerns. Instructors also showed readiness to experiment with their teaching practices and enact new ones to better engage their students moving forward. We interpret this desire to increase student engagement as a means to improve student learning, though it is possible they wanted to do so for other reasons (e.g., support student community building).
Small Tweaks, Rather Than Large-Scale Changes
While participation in the DIPD helped instructors reflect on making changes to their courses, the changes they considered were small tweaks rather than large-scale changes (prompted new ideas and reflection). For example, Belinda from University 3 said, “Smaller, smaller lecture bits. So like five-minute lectures and then moving on … Even if I just do breaks like that in-between bigger lectures, that’s going to be, that’s going to help increase those engagement levels.”
Belinda noted that instead of overhauling the entire lecture course, she could instead break up the longer lecture into smaller chunks to achieve her goal of increasing student engagement (improve student engagement).
Similarly, Daphne from University 2 also expressed a desire to make small changes to her teaching to improve student engagement by slightly adjusting how she implemented her in-class worksheet activity. She said:
A lot of times it’s a worksheet and that works very well, but maybe there could be some sort of variety in it so that it’s done in some other way … It might still end up being kind of a worksheet, but done in a different context because the way I’m doing it right now, most people are doing it individually. Some people group up and maybe I want to say one day per chapter, it’s definitely group. Something like that.
In this quote, Daphne described how she might ask the students to work in small groups versus individually on her worksheet activities throughout the term. Her participation in the DIPD program prompted new ideas and reflection for how she might work with her students in the classroom.
These desires for small changes were likely influenced by constraints around implementing teaching change. Lindsay from University 2 described some of the challenges to making large-scale changes to her teaching (class constraints):
The biggest challenge I see is large classrooms. … I’ll be teaching in the fall; we’ll have 200 plus students in it. And I still want to implement those things, but managing 200 students, being able to separate them into groups, interact with all the groups, make sure that you’re not just picking on the few in the front, I know that’s going to be a challenge …
Similarly, Joseph from University 3 explained that making large-scale changes takes time (timing challenges):
… [it] made me remember that if I want to make significant changes, it’s something that I can’t do it the last minute. I can’t be like, Oh, I have to teach in an hour. Let me think about how I’m going to switch this up. It does require more planning ahead of time.
Together, these instructors highlighted their desire to make small changes to their teaching to increase student engagement based on their participation in the DIPD, demonstrating that the program was effective for prompting reflection on teaching and sparking new ideas. Although the DIPD may not have led directly to immediate change in teaching (as we did not measure this as part of the study), it provided an important platform to reflect on teaching practices—a necessary first step to implementing that change.
Research Question 2: Which factors of the DIPD program contributed most to these reflections?
Data-Driven Observations Alone Are Not Sufficient for Prompting Change
While the DIPD was effective for promoting reflection on teaching practices, instructors described the meetings facilitated by the CTL staff as the most critical component of the program. Providing instructors with the data from the classroom observations alone (via email prior to the meeting[s]) was not sufficient to prompt reflection on teaching for a variety of reasons (e.g., challenges with interpretation). For example, Serafina from University 3, who has a background in instructional design and evidence-based pedagogy, described how difficult it was to make sense of what the data reflected about her class without guidance from CTL staff. She said, “… it was nice to see the COPUS thing, but it was also overwhelming to see because there were so many codes, so much stuff that it was difficult to really see the big picture sometimes.”
In addition to helping instructors work through challenges with interpreting their data, the meetings allowed instructors to get different perspectives on their teaching practices that they would not have gotten on their own (felt supported). At University 2, undergraduate interns working for the CTL collected the COPUS data, and then instructors met with CTL staff and the interns to reflect on the assessment results, affirm practices that were working, and discuss potential changes. While the interns provided the student perspective on the classroom observation data, CTL staff offered EBTPs to respond to the findings. Jarnila at University 2 explained:
[Author 6] is very, very knowledgeable, and [Author 2] is very, very knowledgeable. And so getting to hear their perspectives is enlightening and really, really helpful. Aside from that, having the undergrad perspective was wonderful. And I feel like having more conversations with undergrads in these types of settings is enlightening. Because even though I was an undergrad student not too long ago, and I know how the students feel, I’m now on the other side.
Importantly, instructors described the meetings as being helpful to them, regardless of the format in which they were administered. Whether it was a one-time, individual meeting or a series of small group meetings, they prompted reflection on the observation data while providing a space for clarification, feedback, and co-exploration of the results. Kaitlyn at University 1, who participated in an individual consultation, described her thoughts during the meeting as she reflected on her observation data under guidance of the CTL facilitator:
So as we talked, I was thinking, okay, how can I use this information? … I was surprised that I spent … a lot of time explaining the iClick [sic] answers. And I didn’t feel like I was doing it that much. So that was kind of a surprise.
Her experience reveals the process of evaluation and meaning-making that can be aided by discussion with an experienced facilitator, especially in the face of surprising results (surprised & agree and helped clarify COPUS).
Additional Benefits of Small Group Meetings
Though both meeting formats prompted instructors to consider changes to their teaching, our participants reported that the multiple small group meetings (e.g., with other faculty, undergraduates, and CTL staff) were particularly impactful. Instructors appreciated how the small group meetings served as a platform connecting them with peers to discuss teaching practices. For example, Lindsay from University 2 explained how much she loved the group meetings because she typically did not have much interaction with other instructors to think and talk about teaching and learning (felt supported and feeling of equity & inclusion for instructors):
I loved the group activity. That was incredibly helpful because my first semester I didn’t really interact with any other faculty. … And so I didn’t get a lot of feedback on how do I engage with the students? So having that professional development felt amazing of getting to interact with the two others I was with … and getting their feedback on how their first year’s been going.
Lindsay self-identified as faculty in her first year of teaching at the university and described how much she appreciated receiving feedback from other first-year colleagues. Notably, the informal, co-created context of the meeting community supported her agency to reflect on and make changes to her own teaching practices.
Even experienced instructors identified the benefit of the small group format. Joseph from University 3, with six years of teaching experience, said:
[it] was … a benefit to have … a fellow instructor there who’s, you know, who’s doing the day-to-day teaching. Having them there to sort of discuss the data with is very nice because we can compare and contrast our own sessions and see how, how our COPUS data was different … But I think it’s really nice to have the other … instructors in the room. Because instructors bring a different perspective than an instructional designer would.
Melissa from University 1, who has over 25 years of teaching experience, also shared this perspective, stating that, “I’m always going to choose the learning communities. I’m always going to learn more from people who are currently doing this in the classroom with different subjects. You just get a lot of really good ideas.”
Overall, while participants described the individual meetings as helpful, when prompted to compare an individual meeting with group meetings, they described added benefits of the small group format, even if they had not participated in the multiple meetings themselves. Specifically, participants described how the small group meetings gave instructors a platform through which they could gain different perspectives on various teaching challenges.
Benefits Beyond Reflecting on Teaching Change: Community Building
Beyond reflecting on ways to make changes to their courses, participants felt that there was an important element of community-building that occurred through the meetings (feeling of equity & inclusion for instructors), and that this community could support them as they considered making changes to their teaching. For instance, Serafina from University 3 said, “I don’t have a lot of interaction with people because I don’t have a community like most people that came before. I’m constantly reaching out to these opportunities because I’m like, I need to meet people.”
Having such formalized opportunities to meet like-minded colleagues and discuss teaching was very important for this former instructional designer turned faculty member.
The role of the CTLs in promoting community emerged in multiple places during the study. This is reflected in the following response from Sierra at University 1, who participated in the individual meetings:
Yeah, so I think for me, the meeting with [Redacted], the questions about the COPUS were not a major component, actually … what was more valuable to me was being able to share some of my experiences with her, in terms of teaching, and hearing from her, “What you’re telling me is very common, I hear that a lot from a lot of faculty across the disciplines.” There is a lot of value in sort of having that ground truth for teachers, in terms of struggle points.
This instructor’s experience demonstrated the value of the meetings for facilitating the sharing of information about teaching strategies, which often does not occur naturally outside of CTL contexts (felt supported). Though the meeting was in the individual format for this instructor, they were still able to feel a sense of community from hearing that other instructors on campus shared the same challenges.
As a new faculty member, Jarnila from University 2 mentioned the small group meetings as playing a substantial role in supporting the planned changes to her teaching practices by connecting her with a community of experienced colleagues. She described how this community created a unique space to discuss the nuances of teaching large-enrollment classes (feeling of equity & inclusion for instructors):
… I think what stood out the most is that there are people that care and there are conversations about professional development. So, whether whatever it is we talked about was great and helpful, it contributed to my professional development, and I will use it and implement it, including, the COPUS feedback that I got and everything else. But really, what really, I felt was very impactful is that this program is there, and there are people other than me, [Author 9], [M], and the few others that are there, care about it, and they’re doing something. And I feel like I’m very enthusiastic about it because of that aspect…
Jarnila highlighted the importance of the facilitators and participants as critical players in her teaching professional development journey.
Though CTL-led PD opportunities are typically focused on promoting measurable outcomes in the classroom in terms of student learning and engagement, it is equally important to consider the impact of such PD on instructors’ reflective processes. Our DIPD provided a formalized space for like-minded instructors to come together and discuss common teaching challenges; importantly, the DIPD was implemented at research-intensive universities, where such conversations and communities are often rare. Such a community, grounded in shared data, can give instructors the support they need to engage in the challenging work of improving their instruction and their students’ learning outcomes.
Discussion
In this exploratory study, instructor interview data provided insight into how a DIPD opportunity could prompt even experienced instructors to reflect on their teaching practices. Though there have been many studies that have used structured observation protocols, like COPUS, to measure teaching change (e.g., Tomkin et al., 2019; Wheeler & Bach, 2021), there have been relatively few studies examining the potential for these observation data to prompt reflection on teaching, an important precursor for teaching change (for exceptions, see Johnson et al., 2024; Wood et al., 2024).
We found that providing the data from the structured observation protocol helped some instructors come to important insights about their teaching (e.g., when they were surprised by the data but ultimately agreed with it). More importantly, our results indicated that the meetings, in which instructors could review and discuss their classroom observation data with trusted and knowledgeable colleagues and CTL staff, were the most beneficial component of the experience. It is important to note that the meetings were not delivered identically at all three universities in the study, given pre-existing differences in teaching support on those campuses (e.g., familiarity with/usage of structured classroom observations, existing teaching PD programs). However, despite these differences, the themes that emerged from our data were consistently identified at all three campuses, suggesting that the DIPD was similarly impactful regardless of the specific execution of the program. Thus, our multi-institutional study provides an example of how a PD program can begin with core ideas (i.e., use structured protocols to observe classrooms and discuss the observation data afterward) but be tweaked to fit the institutional context and still have instructors emerge with similar experiences and desires to change their teaching practices.
Regarding Research Question 1—how the DIPD program prompted instructors to reflect on their current teaching practices—our results suggested that the DIPD program did prompt such reflection, particularly with regard to boosting student engagement. This desire to change could result from dissatisfaction with how things were going in their classrooms, which was highlighted when instructors reviewed their own COPUS data and noticed areas for improvement (e.g., too much lecturing). This is an example of the teachers’ thinking that TCSR argues directly influences reform in teaching practices (Gess-Newsome et al., 2003; Woodbury & Gess-Newsome, 2002). The focus on student engagement makes sense in the context of using COPUS, as it was designed to help instructors notice when they are (or are not) using interactive strategies (Smith et al., 2013).
Additionally, we found that our instructors described wanting to make small tweaks, not large-scale changes. Using the elements of the TCSR framework, we interpret these comments as likely being due to personal factors (e.g., how long they’ve been teaching the course, how “solid” they feel the course is in its current state, evaluation of practicality) and contextual factors (e.g., classroom contexts like course size and student preparedness) (Woodbury & Gess-Newsome, 2002). For example, our participants were generally more experienced instructors (personal factors) who described the types of changes that were feasible to make in their own teaching context. We found that having experienced instructors participate in our DIPD program helped affirm their expectations about their teaching practices but also helped them raise important questions about ways that they might improve or change their classroom practices. Though the changes our instructors considered were small and incremental rather than dramatic restructuring, researchers promoting the implementation of EBTPs encourage the adoption of small tweaks that can have large impacts on teaching and learning (e.g., Darby & Lang, 2019; He, 2021; Lang, 2021). Our findings are also in line with a recent systematic review of innovation in teacher education research, which categorized innovations in the field and found that most innovations were classified as “incrementation,” or pushing the field in the direction it was already heading (as contrasted with large-scale teaching innovations like redefinition and redirection) (Ellis et al., 2023).
Our participants described how class size and timing challenges impacted their desire to make slow and gradual changes to their teaching practices. This is echoed by previous quantitative studies that have found that contextual factors, including class size, impacted the likelihood of an instructor implementing active learning strategies (e.g., Denaro et al., 2022; Gess-Newsome et al., 2003). Also, these instructors had just gone through major course redesigns because of the COVID-19 pandemic and were likely recovering from the challenges associated with major instructional shifts (an example of a contextual factor that influences teacher change) (Woodbury & Gess-Newsome, 2002). Though we deliberately avoided the inclusion of quotes related to changes already made to their courses because of the pandemic (vs. ideas for changes because of the DIPD), most of our participants did mention the pandemic, indicating that it had a large impact on the ways they were thinking about teaching their courses. This is unsurprising, as the pandemic dramatically changed the nature of teaching (Brunetto et al., 2022) and influenced instructor attitudes about making changes in teaching (e.g., Lee et al., 2022).
Regarding Research Question 2—which factors contributed the most to instructors’ reflections on their teaching—our results suggested that the meeting component of the DIPD experience was necessary to prompt reflection on teaching practices. We found that even experienced instructors in our sample struggled to make sense of their observation data before the CTL-facilitated meetings; thus, the data alone were not sufficient for prompting reflection on teaching. By working with CTL staff, instructors gained an opportunity to engage in intentional dialogue with experts in the educational development field and/or with undergraduates who bring essential insights from their student role. The meetings thus offered instructors many important opportunities to understand their class dynamics, diagnose issues through guided re-examination of the observation data, and gain other perspectives on their teaching behaviors. Such findings align with previous work suggesting that teachers’ experiences with PD are an influential factor in changing teacher thinking (Woodbury, 2000). Interactions with CTL staff encourage deeper reflection on teaching goals and methods and allow staff to offer feedback based on their classroom observations (Knapper & Piccinin, 1999).
Instructors in both the individual and small group meetings (when prompted to comment on whether an individual or a group format might have worked better) highlighted the additional benefits that come from working with small groups of instructors in addition to CTL staff. Though the small group meetings were more time-intensive than a single individual consultation with a CTL staff member, they had more value to our participants because they provided space for crosstalk between instructors to share alternative teaching practices and provided additional time to process the findings and reflect on teaching. This finding makes sense in the context of prior research on CoPs, from which our small group meetings took inspiration (e.g., Tomkin et al., 2019).
The impact of the meeting component of the DIPD is notable given that teaching is often conceptualized and practiced as a private, individual activity, undertaken in what Shulman (1993) calls “pedagogical solitude” (p. 6). It is unusual for most instructors to invite outsiders—colleagues, students, or others—to enter their classrooms as observers and to discuss what happens there. The meetings in our DIPD offered a particularly exciting way to step out of the pedagogical solitude to build community, reflection, and exploration. Although unexpected at the beginning of the study, our instructors identified an important theme of community building as a particularly impactful component of the DIPD. Even though participating instructors had different data, teaching backgrounds, and institutional contexts to respond to, they found a community of other instructors invested in improving their teaching through the DIPD, which was a powerful motivator for promoting reflection on teaching.
Limitations and Future Directions
Though our study revealed important themes related to how DIPD can prompt reflection on teaching, it is not without its limitations. First, most of our instructors had previous PD interactions with their campus’s CTL. Thus, they were likely already reflective about their teaching practices prior to their involvement in the study. However, the fact that they were still able to identify areas for improvement in their teaching is promising. Perhaps the new format of the DIPD, which relies on standardized data gathered from structured and validated observational protocols, may draw in instructors who typically do not engage with CTLs but who may be interested in reviewing such data for their own courses. The opportunity to work with like-minded colleagues and hear student perspectives about teaching in the small group format may also entice instructors to participate.
Additionally, because this study only lasted for a single term, it was not possible to measure whether the DIPD had an impact on instructional practice in the instructors’ courses following their participation in the study. However, because measuring teacher change was beyond the scope of the current project, we encourage future work in this area, as it will be a critical next step for understanding the impact of structured observational protocols as a PD tool to prompt instructor change.
Conclusion
We have presented evidence that combining a structured classroom observation protocol with individual or small group meetings can positively impact teachers’ reflection on their teaching practices, which is a critical first step in promoting pedagogical change. We found that providing instructors with data from the observation protocols alone, without support from the follow-up meeting, was not sufficient for driving reflection on teaching. The meeting was critical for helping instructors review their data and identify actionable points of change. Importantly, while the DIPD was grounded in similar evidence-based teaching practices at each campus, it took different forms that fit each of the three study campuses. Yet the outcomes were largely the same—instructors wanted to make small changes to their teaching methods to increase student engagement. They additionally expressed that several small group meetings composed of CTL staff and other faculty were particularly beneficial compared to an individual meeting. Beyond simply being able to generate more ideas and hear multiple perspectives in such small groups, instructors felt that there was an important element of community building that emerged from participating in the DIPD because they were able to connect with other like-minded, teaching-focused individuals. Such findings are important because group meetings require fewer CTL resources per instructor served than typical one-on-one consultations. Overall, our DIPD demonstrated the power of providing instructors not only with standardized data about their teaching but also with opportunities to discuss those data, particularly with other instructors. Our work supports the claim of Shulman (1993), who called for “open[ing] classroom doors to supportive communities of conversation and evaluation” and “treat[ing] teaching as community property” (p. 6), rather than a solitary practice.
ORCIDs
Annie S. Ditta: 0000-0002-7126-034X
Adriana Signorini: 0000-0003-4817-6737
Mathew Williams: 0000-0001-9164-2487
Eric Johns: 0009-0003-0659-783X
Andrea Aebersold: 0009-0001-7370-0767
James Zimmerman: 0009-0002-9537-4330
Samantha Eastman: 0000-0003-4353-6692
Brian K. Sato: 0000-0003-1489-0705
Petra Kranzfelder: 0000-0003-4146-7929
Biographies
Annie S. Ditta is an Associate Professor of Teaching in Psychology at the University of California, Riverside. Dr. Ditta’s research investigates the cognitive underpinnings of student learning. The ultimate goals of her research are threefold: 1) to help students develop their critical and creative thinking skills, 2) to increase motivation to learn, and 3) to design better methods of instruction for large lecture courses at the university level.
Adriana Signorini is the Educational Assessment Coordinator at the Center for Engaged Teaching and Learning at the University of California, Merced. Dr. Signorini provides pedagogical support through the Students Assessing Teaching and Learning (SATAL) program. SATAL is a student-faculty pedagogical partnership program that supports faculty with formative assessment activities, including collecting, analyzing, and integrating student feedback into course development.
Mathew Williams is the Director of Instructional Technology at California State University, Northridge, where he oversees the University’s digital learning environments, professional development programs, and faculty support services. Dr. Williams’ research examines how individual and institutional factors interact to shape instructional decisions, student engagement, and learning outcomes.
Eric Johns is a member of the Academic Studio Faculty at the New Orleans Center for the Creative Arts (NOCCA), where he develops and delivers creative, project-based learning world history curriculum to students from across the arts. He was previously a postdoctoral researcher with the XCITE Center for Teaching and Learning at UC Riverside.
Andrea Aebersold is the Executive Director of the Center for Teaching and Learning at Montana State University Billings. Dr. Aebersold oversees professional and instructional development for faculty and researches equity in STEM for faculty and students, active learning pedagogy, and pedagogical wellness.
James Zimmerman is a Professor of Teaching in Physics and the Director of the Center for Engaged Teaching and Learning. In his wide-ranging academic career as both faculty member and administrator, Dr. Zimmerman has been recognized as a leader who can build consensus, motivate, and operationalize campus-wide plans. He enjoys the challenge of aligning constituents toward the essential institutional goals of developing, managing, and resourcing exceptional and innovative undergraduate learning environments that benefit diverse student populations.
Samantha Eastman is a Senior Learning Experience Designer with the XCITE Center for Teaching and Learning at UC Riverside, where she assists faculty in enhancing their courses through digital pedagogies and active learning strategies. She is a mobile-learning researcher with an interest in SoTL action research as praxis. Her interests also include enacting pedagogies of care to create more equitable and inclusive learning environments.
Brian K. Sato is a Professor of Teaching in Molecular Biology & Biochemistry and Associate Dean of the Division of Teaching Excellence and Innovation at the University of California, Irvine. Dr. Sato’s research focuses on creating more equitable and inclusive STEM higher education environments, including identifying impacts of teaching-focused faculty, as well as evaluating the effectiveness of programs and policies at an institutional level. Dr. Sato also oversees instructional professional development activities for faculty and future faculty.
Petra Kranzfelder is an Assistant Teaching Professor in Molecular, Cellular, and Developmental Biology at the University of California, Santa Barbara. Dr. Kranzfelder’s research centers on the implementation of evidence-based teaching practices to engage all students, particularly those from marginalized communities. She uses findings from this research to improve how she teaches undergraduate biology and graduate teaching assistant training courses.
Acknowledgments
We would like to acknowledge the contributions of the CTL staff and the instructor participants at each campus, without whom this work would not have been possible. Funding support came from the National Science Foundation (NSF DUE 1821724) and Howard Hughes Medical Institute Inclusive Excellence (HHMI IE # GT 11066).
Conflict of Interest Statement
The authors have no conflicts of interest.
Data Availability
The data reported in this manuscript are available on request by contacting the corresponding authors.
References
Adams, S., Tesene, M., Gay, K., Brokos, M., McGuire, A., Rettler-Pagel, T., & Swindell, A. (2023). Communities of practice in higher education: A playbook for centering equity, digital learning, and continuous improvement. Every Learner Everywhere. https://www.everylearnereverywhere.org/resources/communities-of-practice-in-higher-education/
Addy, T. M., Younas, H., Cetin, P., Cham, F., Rizk, M., Nwankpa, C., & Borzone, M. (2022). The development of the protocol for advancing inclusive teaching efforts (PAITE). Journal of Educational Research and Practice, 12(0), 65–93.
Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: Lack of training, time, incentives, and… tensions with professional identity? CBE—Life Sciences Education, 11(4), 339–346.
Brunetto, D., Bernardi, G., & Andrà, C. (2022). Teaching as a system: COVID-19 as a lens into teacher change. Educational Studies in Mathematics, 110, 65–81. http://doi.org/10.1007/s10649-021-10107-3
Clark, C. M., & Peterson, P. L. (1986). Teachers’ thought processes. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 255–296). Macmillan.
Cooney, T. J., & Shealy, B. E. (1997). On understanding the structure of teachers’ beliefs and their relationship to change. In E. Fennema & B. S. Nelson (Eds.), Mathematics teachers in transition (pp. 87–109). Lawrence Erlbaum.
Creswell, J. W., & Miller, D. L. (2000). Determining validity in qualitative inquiry. Theory Into Practice, 39(3), 124–130.
Darby, F., & Lang, J. M. (2019). Small teaching online: Applying learning science in online classes. John Wiley & Sons.
Denaro, K., Kranzfelder, P., Owens, M. T., Sato, B., Zuckerman, A. L., Hardesty, R. A., Signorini, A., Aebersold, A., Verma, M., & Lo, S. M. (2022). Predicting implementation of active learning by tenure-track teaching faculty using robust cluster analysis. International Journal of STEM Education, 9, 49. http://doi.org/10.1186/s40594-022-00365-9
Ebert-May, D., Derting, T. L., Henkel, T. P., Middlemis Maher, J., Momsen, J. L., Arnold, B., & Passmore, H. A. (2015). Breaking the cycle: Future faculty begin teaching with learner-centered strategies after professional development. CBE—Life Sciences Education, 14(2), ar22.
Eddy, S. L., Converse, M., & Wenderoth, M. P. (2015). PORTAAL: A classroom observation tool assessing evidence-based teaching practices for active learning in large science, technology, engineering, and mathematics classes. CBE—Life Sciences Education, 14(2), ar23.
Ellis, V., Correia, C., Turvey, K., Childs, A., Andon, N., Harrison, C., … & Hayati, N. (2023). Redefinition/redirection and incremental change: A systematic review of innovation in teacher education research. Teaching and Teacher Education, 121, Article 103918.
Fereday, J., & Muir-Cochrane, E. (2006). Demonstrating rigor using thematic analysis: A hybrid approach of inductive and deductive coding and theme development. International Journal of Qualitative Methods, 5(1), 80–92. http://doi.org/10.1177/160940690600500107
Freeman, S., Eddy, S. L., McDonough, M., Smith, M. K., Okoroafor, N., Jordt, H., & Wenderoth, M. P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences, 111(23), 8410–8415. http://doi.org/10.1073/pnas.1319030111
Gess-Newsome, J., Southerland, S. A., Johnston, A., & Woodbury, S. (2003). Educational reform, personal practical theories, and dissatisfaction: The anatomy of change in college science teaching. American Educational Research Journal, 40(3), 731–767.
He, Y. (2021). Point of view: STEM teaching reform: Incremental pathways. Journal of College Science Teaching, 51(1), 3–4.
Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984.
Idsardi, R. C., Luft, J. A., Wingfield, J. L., Whitt, B., Barriga, P. A., & Lang, J. D. (2023). Relationships between undergraduate instructors’ conceptions of how students learn and their instructional practices. Journal of Research in Science Teaching, 60(9), 2076–2110.
Jackson, M. A., Moon, S., Doherty, J. H., & Wenderoth, M. P. (2022). Which evidence-based teaching practices change over time? Results from a university-wide STEM faculty development program. International Journal of STEM Education, 9(1), 22.
Johnson, P. B., Holtzman, N., & Fernandez, E. (2024). Classroom observational data: A professional development tool for introductory college mathematics instruction. International Journal of Mathematical Education in Science and Technology, 1–13.
Knapper, C., & Piccinin, S. (1999). Consulting about teaching: An overview. New Directions for Teaching and Learning, 79, 3–7.
Lang, J. M. (2021). Small teaching: Everyday lessons from the science of learning. John Wiley & Sons.
Lee, K., Fanguy, M., Bligh, B., & Lu, X. S. (2022). Adoption of online teaching during the COVID-19 pandemic: A systematic analysis of changes in university teaching activity. Educational Review, 74(3), 460–483. http://doi.org/10.1080/00131911.2021.1978401
McPartlan, P., Thoman, D. B., Poe, J., Herrera, F. A., & Smith, J. L. (2022). Appealing to faculty gatekeepers: Motivational processes for intentions to adopt an evidence-based intervention. BioScience, 72(7), 664–672. http://doi.org/10.1093/biosci/biac029
Miles, M. B., Huberman, A. M., & Saldaña, J. (2020). Qualitative data analysis (4th ed.). SAGE Publications.
Owens, M. T., Seidel, S. B., Wong, M., Bejines, T. E., Lietz, S., Perez, J. R., … & Tanner, K. D. (2017). Classroom sound can be used to classify teaching practices in college science courses. Proceedings of the National Academy of Sciences, 114(12), 3085–3090.
Penny, A. R., & Coe, R. (2004). Effectiveness of consultation on student ratings feedback: A meta-analysis. Review of Educational Research, 74(2), 215–253.
Reisner, B. A., Pate, C. L., Kinkaid, M. M., Paunovic, D. M., Pratt, J. M., Stewart, J. L., … & Smith, S. R. (2020). I’ve been given COPUS (Classroom Observation Protocol for Undergraduate STEM) data on my chemistry class… now what? Journal of Chemical Education, 97(4), 1181–1189.
Sheridan, B. J., & Smith, B. (2020). How often does active learning actually occur? Perception versus reality. AEA Papers and Proceedings, 110, 304–308. http://doi.org/10.1257/pandp.20201053
Shulman, L. S. (1993). Teaching as community property: Putting an end to pedagogical solitude. Change, 25(6), 6–7.
Smith, M. K., Jones, F. H., Gilbert, S. L., & Wieman, C. E. (2013). The Classroom Observation Protocol for Undergraduate STEM (COPUS): A new instrument to characterize university STEM classroom practices. CBE—Life Sciences Education, 12(4), 618–627.
Stains, M., Pilarz, M., & Chakraverty, D. (2015). Short and long-term impacts of the Cottrell scholars collaborative new faculty workshop. Journal of Chemical Education, 92(9), 1466–1476.
Syed, M., & Nelson, S. C. (2015). Guidelines for establishing reliability when coding narrative data. Emerging Adulthood, 3(6), 375–387. http://doi.org/10.1177/2167696815587648
Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Nicole Arroyo, E., Behling, S., Chambwe, N., Cintrón, D. L., Cooper, J. D., Dunster, G., Grummer, J. A., Hennessey, K., Hsiao, J., Iranon, N., Jones, L., Jordt, H., Keller, M., Lacey, M. E., Littlefield, C. E., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences of the United States of America, 117(12), 6476–6483. http://doi.org/10.1073/pnas.1916903117
Tomkin, J. H., Beilstein, S. O., Morphew, J. W., & Herman, G. L. (2019). Evidence that communities of practice are associated with active learning in large STEM lectures. International Journal of STEM Education, 6(1), 1–15.
Wheeler, L. B., & Bach, D. (2021). Understanding the impact of educational development interventions on classroom instruction and student success. International Journal for Academic Development, 26(1), 24–40.
Wood, A. K., Christie, H., MacKay, J. R., & Kinnear, G. (2024). Using data about classroom practices to stimulate significant conversations and aid reflection. International Journal for Academic Development, 29(1), 114–127.
Woodbury, S. (2000). A model of the influence of teacher thinking and contexts on teacher change as conceptual change in mathematics education reform. Paper presented at the annual meeting of the American Educational Research Association (pp. 3–59). ERIC Clearinghouse on Assessment and Evaluation.
Woodbury, S., & Gess-Newsome, J. (2002). Overcoming the paradox of change without difference: A model of change in the arena of fundamental school reform. Educational Policy, 16(5), 763–782. http://doi.org/10.1177/089590402237312
Wright, M. (2023). Centers for teaching and learning: The new landscape in higher education. Johns Hopkins University Press.
Appendices A–E
All Appendices can be found at the following link: https://drive.google.com/drive/folders/1-kJ_IbFLzw4Nr3tGIdremNnM7lg2s44_?usp=sharing