During Spring 2020, instructors needed immediate, scaled support for translating their face-to-face courses into a remote format. In response, like many universities, we met this need by developing and facilitating a remotely conducted course design institute (Debelius et al., 2021; Kaldor et al., 2021). This shift in modality—from our traditional, face-to-face institute to a remotely held one—resulted in the possibility of collecting a different kind of data, which afforded a new way to measure impact. For example, participants shared their thinking during asynchronous discussion forums and activities, in the chat during synchronous Zoom sessions, and in live-edited collaborative documents. These data preserved the immediacy, candor, and emotional response that occurred during participant interactions, adding a new layer of richness. This allowed for an inductive, qualitative analysis approach that focused on participants’ words: how they described their ideas, beliefs, and intentions toward teaching their courses. To complement this, we gathered Likert survey data at the start and end of the institute, providing quantitative data on participants’ changes in confidence. Leveraging these data in this study, we ask: What were the experiences of faculty who participated in the institute that relate to changes in their attitudes, perceptions, or pedagogic approaches?

In taking this mixed methods approach, we hoped to gain insights into shifts in participants’ thinking. Because beliefs about teaching drive behaviors (Bauch, 1984; Marra, 2005; Muijs & Reynolds, 2002), changes reflected in participant thinking could indicate potential shifts in their teaching practices. More tangibly, we can use this analysis to identify implications for the design of future educational development programming. Given that intensive, extended faculty development programs are more impactful than single, brief sessions (Henderson et al., 2011; Prebble et al., 2004), our center regularly offers multi-day institutes, semester-long fellows programs, and multi-part workshop series. Because these programs require significant time and resources, optimizing these efforts for maximum effectiveness is essential. An analysis like the one performed in this study can provide themes that reflect what did and did not work well in the institute, which can then guide changes in future program design.

These data were collected at a unique moment, as instructors were forced to shift their teaching to a remote format and our participants were navigating the difficulties of balancing work and personal lives, health scares, and uncertainty in ways they had never done before. As a result, some of the themes identified in this work may reflect amplified emotions tied to this experience and the impact of collective isolation and extreme social distancing. As we publish these results several years later, the idea of a return to a pre-pandemic “normal” feels even more unlikely, as many of our ideas about teaching, learning, work, and life have been deeply disrupted. Long-held assumptions about what a classroom looks like, how learning happens, and what is important or necessary are being challenged by students, faculty, and society at large. We also recognize that external stressors are always present. Certain stressors may not be as visible because their burden is concentrated within a subset of the faculty, and others may bring impacts over time felt by all (e.g., racism, systemic traumas, changing demands for higher education degrees, evolving demographics and background of the student body, increased use of artificial intelligence and other software in teaching, and even climate change). However, our research question, exploring participant experiences that potentially lead to changes, spans beyond the pandemic context of 2020. Therefore, as we developed our implications from themes arising in the data, we deliberately sought comparisons with existing studies of similar course design institutes, with learning theory, and with the learning sciences.

Below, we start with a brief review of studies that measure the effectiveness or impact of faculty development programs, particularly course design institutes. Following this review, we describe the course design institute: its conception, design, implementation, and participants. We then describe the data sources and types, as well as our approach to qualitative and quantitative analyses. Finally, we explain our findings with particular focus on thematic categories and offer implications for future programming.


Studies of faculty development suggest that intensive and extended interventions, rather than shorter interventions such as one-off workshops, are needed to positively impact instructors’ conceptions about the nature of teaching and learning, the most important indicator of actual change in course design (Henderson et al., 2011; Prebble et al., 2004). However, most existing research into effective features of educational development programs provides weak evidence for identifying what works, what does not, and why (Amundsen & Wilson, 2012; Henderson et al., 2011; Levinson-Rose & Menges, 1981; Prebble et al., 2004; Stes et al., 2010; Weimer & Lenze, 1991; Wheeler, 2021). Consequently, there are repeated calls for “more rigorous research designs but also more qualitative research, a better theoretical and conceptual grounding of educational development practice, and a more detailed description of practice, so that each new study can build more explicitly on previous ones” (Amundsen & Wilson, 2012).

Course design institutes are standard offerings of centers for teaching and learning (Lee, 2010). These relatively intensive programs typically span multiple days or weeks, providing both time and space for reflection in a community of learners. Existing studies of these programs have used a variety of methodologies including quantitative analysis of survey data and thematic analysis of open response questions, interviews, focus groups, and materials such as syllabi, assignments, and student course evaluations (Favre et al., 2021; Johnson et al., 2017; Palmer et al., 2016; Wheeler, 2021). These studies have found a positive impact on participants’ metacognition, self-efficacy, and learner-centered beliefs and ultimately varied but increased implementation of student-centered instructional practices. Moreover, these examples demonstrate that a rigorous collection of varied data sources enables researchers to holistically assess the impact of an individual program.

A rich set of data sources also provides an opportunity for researchers to use qualitative analysis to elevate participants’ thoughts, feelings, and voices. Grounded theory methods remain as faithful as possible to the raw data, using systematic constant comparisons to inductively discover interpretations (Glaser & Strauss, 1967) and asking the researcher to become aware of their own “preconceptions, values, and beliefs” in order to “search out and understand the world of others” (Hutchinson, 1986). The approach is useful for uncovering both the experiences of participants in an activity and the structures that shape those experiences (Conrad, 1982; Hutchinson, 1986; Marshall & Rossman, 1995). In educational research, it has become more common to draw on the methods of open coding and constant comparison from grounded theory, as it has been shown to be a valuable technique to understand the impact of pedagogical experiences on participants (e.g., Brantmeier et al., 2017; Buckley, 2019; Fetherston & Kelly, 2007; Kennedy & Lingard, 2006).

In this study, we aim to add to the existing body of studies on course design institutes by adopting a rigorous grounded theory analysis approach. We iteratively examine a rich set of artifacts of participants’ experiences in a course design institute to better understand changes in their attitudes, perceptions, or pedagogic approaches.

Description of Our Course Design Institute

Two campus units, the Center for the Enhancement of Learning and Teaching and Educational Technology Services, collaboratively developed an online institute on remote teaching in May 2020 with the goal of helping faculty translate their existing face-to-face courses into inclusive, student-centered online courses. The institute, conducted entirely online, combined asynchronous content, regular synchronous sessions, peer feedback components, and facilitated support. The institute was offered as a five-week program twice in the summer of 2020 and as an intensive, five-day program once in January 2021.

In designing the institute, we drew on motivation and learning theories in an extended framework that included opportunities for participants to experience pedagogical principles, apply them to their own teaching, reflect on the outcomes, and receive feedback (Borda et al., 2020; Ebert-May et al., 2011; Henderson et al., 2011; Sunal et al., 2001; Wlodkowski & Ginsberg, 2017). We also drew on research demonstrating that optimal learning environments tend to the social and emotional experiences of the learner by affirming their cultural beliefs and values and by recognizing their unique work and perspectives as important, relevant, and valuable. These environments make space for the learners to bring their “whole-selves,” including their broader roles and multiple identities (Cavanagh, 2016; Immordino-Yang, 2015; Immordino-Yang et al., 2018). Based on this literature, the principles of learner-centeredness, inclusive teaching practices, and modeling an authentic course experience grounded and guided the institute’s design.

The institute was structured into a linear set of modules (Table 1). Each included an introductory synchronous meeting, asynchronous activities and materials, and a final synchronous meeting for peer feedback. Synchronous Zoom meetings included real-time, interactive elements in which participants submitted responses verbally and in writing (e.g., virtual chat prompts, whiteboard activities, collaborative Google Docs). In between meetings, participants engaged with asynchronous content in Canvas in the form of videos, readings, and activities. Each module culminated in assignments prompting participants to draft their own course components and to implement those components into their own course Canvas site.

Table 1.

The Sequence of Modules in the Course Design Institute

Module 0: Introduction to the course
  Assignment: Pre-institute survey; upload copy of course syllabus

Module 1: Creating Inclusive and Student-Centered Courses
  Activity: Discussion board and pre-course survey
  Assignment: Draft a welcome video and pre-course survey for their own course; create pre-course survey using survey software

Module 2: Backward Design and Alignment
  Activity: Start building learning objectives
  Assignment: Draft goals and learning objectives and revise them using a checklist; record the welcome video

Module 3: Inclusive Assessments, Assignments, and Activities
  Activity: Reading quiz on inclusive and sustainable assessment practices
  Assignment: Identify and design an assessment; create the assignment using the Canvas Assignments tool

Module 4: Day-to-Day Instruction
  Activity: Questionnaire on course alignment
  Assignment: Draft a course module

Module 5: Course Production and Overview of Canvas Tools
  Activity: Use Canvas to build the learning module drafted in Module 4
  Assignment: Post-institute survey

All faculty instructors at our institution were invited to participate. The institute enrolled three cohorts of participants: 108 in June, 97 in July, and 12 in January for a total of 200 unique individuals (some participants enrolled in the course multiple times). Roughly 50% of the participants came from disciplines across the arts and sciences, 10% from engineering, and 5% from fine arts. A further 20% of participants were composed of professional faculty teaching in the health sciences (medical, dental, veterinary, nutrition), with the remaining participants representing a wide array of professional programs in civics, law, business, and international relations, as well as several faculty with administrative roles.

Our research was conducted under the University SBER IRB; our protocols, STUDY00001228 and STUDY00001229, were approved under expedited review.

Data Sources

Survey Data

Participants were asked to complete a pre-institute survey when they first registered for the institute and a similar survey upon completion of the final module: 181 unique individuals responded to the pre-institute survey (a completion rate of 91%), and 75 responded to the post-institute survey (a completion rate of 38%). This resulted in 72 matched pre-/post-institute survey responses for analysis, representing approximately one-third of participants.

In the survey, faculty rated on a scale of 1–5 their confidence in various course design and remote teaching skills (see Table 2 for a list of questions). The pre-institute survey included an open-ended question asking participants to describe what they “hoped to gain” through their participation. The post-institute survey asked participants to describe what they “took away” from the institute and, if they were not able to complete the entire institute, to describe any barriers to completion.

Table 2.

Average Likert Response Values for All Respondents at the Beginning (Pre) and End (Post) of the Institute

Please rate, on a scale of 1–5 (1: Not at All Confident, 2: Slightly Confident, 3: Somewhat Confident, 4: Fairly Confident, 5: Completely Confident), your confidence in your current ability to:



1. Describe the necessity of using a student-centered perspective in developing online courses.

2. Identify some specific ways to establish an online learning community through instructor presence, engagement, and communication.

3. Describe how to approach online course design with equity and inclusive lenses.

4. Apply concepts from Universal Design for Learning, including using multiple approaches and technologies to meet students’ diverse learning needs.

5. Prioritize knowledge and competencies that are most important for students to retain long term and adapt face-to-face goals and objectives accordingly for the online learning platform.

6. Define alignment and why it is essential to a student-centered online experience.

7. Describe the importance of assessments, assignments, and activities in relation to students’ achievement of the course goals and objectives.

8. Incorporate low-stakes and inclusive assessments that can guide student learning online.

9. Develop assessments that can help students develop self-monitoring skills.

10. Identify creative final assessments that allow students to demonstrate their achievement of the course’s most important learning outcomes.

11. Develop a modular structure of your online course, with a clear pathway for students to follow as they experience the course.

12. Make intentional decisions about when to use synchronous vs. asynchronous learning.

13. Translate face-to-face learning activities online by considering intended goals and adapting from a menu of pedagogical and technological alternatives.

    Note. These values are not significantly different from those of the 72 participants with matched pre- and post-institute survey responses.

Additionally, 59 responses were collected during an anonymous feedback survey at the institute’s midpoint.

Participant Asynchronous Activity in Canvas

Participants submitted work in the form of asynchronous activities integrated into the modules, including:

  • Responses to a model pre-course survey

  • Responses to a reading quiz

  • Responses to a series of questions about their course alignment

Participant Responses From Synchronous Activities

Participants shared their ideas in a variety of ways during synchronous activities, including:

  • Responses to questions in the chat or in a virtual whiteboard

  • Collaborative anonymous responses on shared Google documents

Data Anonymization

Data collected during the mid-institute feedback survey and collaborative Google documents were anonymous. For the pre- and post-institute surveys, which were administered using the software Qualtrics, we downloaded responses, removed identifying information (such as computer IP addresses), and replaced each participant’s name with a unique random identifying number. We used these identifiers to pair pre- and post-institute survey responses for each participant so that we could measure any change in their responses. For Zoom chat transcripts and Canvas-based surveys, quizzes, and assignments, we downloaded data and removed participant names prior to analysis.
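The pairing procedure described above can be sketched in a few lines of Python. This is a hypothetical illustration rather than our actual processing script; the column names (`name`, `ip_address`, `q1`) and the function name are placeholders:

```python
import numpy as np
import pandas as pd

def anonymize_and_pair(pre, post, pii_cols=("name", "email", "ip_address")):
    """Replace names with random IDs, drop identifying columns,
    and pair each participant's pre- and post-survey responses.
    Column names are hypothetical placeholders."""
    rng = np.random.default_rng()
    # Assign one random identifier per unique name across both surveys
    names = pd.unique(pd.concat([pre["name"], post["name"]]))
    lookup = dict(zip(names, rng.permutation(len(names))))

    def strip(df):
        out = df.copy()
        out["participant_id"] = out["name"].map(lookup)
        # Remove identifying columns only after assigning the random ID
        return out.drop(columns=[c for c in pii_cols if c in out.columns])

    # An inner join keeps only participants who completed both surveys
    return strip(pre).merge(strip(post), on="participant_id",
                            suffixes=("_pre", "_post"))
```

The inner join is the step that reduces the full respondent pools to the matched pairs used for measuring change.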


We compared the responses to the 13 Likert scale questions that appeared identically in the pre- and post-institute surveys. While paired t tests are commonly used to assess the significance of paired responses, we note that this test assumes a normal distribution of a continuous variable. Given that Likert responses are discrete, ordinal, and non-interval—for example, the difference between 2 (slightly confident) and 3 (somewhat confident) might not equal the difference between 3 (somewhat confident) and 4 (fairly confident)—we recognize that a non-parametric test is a more mathematically valid and reliable measure (Roberson et al., 1995). Therefore, we ran the comparison between participants’ pre- and post-institute survey responses using both a paired t test and a non-parametric sign test. Any significant measures of change in confidence related to the key outcomes of the institute were then considered during the final step of our coding process in Stage 3 as we developed each of our themes and implications.
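The two tests can be sketched as follows. This is an illustrative Python sketch, not our analysis code; SciPy supplies the paired t test directly, and the exact sign test can be expressed as a binomial test on the non-tied pairs:

```python
from scipy import stats

def compare_paired_likert(pre, post, alpha=0.01):
    """Paired t test and exact sign test on matched Likert ratings.

    `pre` and `post` are equal-length sequences of 1-5 ratings from
    the same participants, in the same order. Illustrative sketch only.
    """
    # Paired t test (assumes interval-scaled, normally distributed differences)
    t_stat, t_p = stats.ttest_rel(post, pre)

    # Sign test: under the null hypothesis of no shift, increases and
    # decreases are equally likely among the non-tied pairs
    diffs = [b - a for a, b in zip(pre, post)]
    n_up = sum(d > 0 for d in diffs)
    n_nonzero = sum(d != 0 for d in diffs)
    sign_p = stats.binomtest(n_up, n_nonzero, 0.5).pvalue

    return t_p, sign_p, (t_p < alpha and sign_p < alpha)
```

Reporting both p-values lets the parametric and non-parametric results be checked against each other rather than relying on the t test alone.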

For the qualitative data, we engaged in a three-stage coding process to inductively seek out what the data reveal about participants’ experiences of the institute. In Stage 1, we engaged in thematic content analysis, isolating each data source as granularly as possible (i.e., a single cohort’s responses to a single question within a survey or activity) and coding it according to any emergent themes that we noticed. This resulted in a set of themes for each data source, with some themes varying across cohorts. In Stage 2 we engaged in constant comparison analysis in which we looked within and across data sources to either find new emerging themes or confirm those identified in Stage 1. This resulted in five major themes, which were defined in single words or short phrases. Finally, in Stage 3 we explored connections between the themes described in Stage 2, the results of our quantitative analysis, and our original research question. This resulted in deeper descriptions of each theme, how each theme related to participants’ experiences of the institute, and how the themes holistically might impact future educational development programming and areas for future research.

We took several measures to ensure our analysis was as valid as possible and to reduce researcher bias. As we engaged with these three stages, we aimed to focus on and draw from participants’ verbatim responses to the prompts. During all analyses, we wrote memos to document observations, aberrations, and general thought processes. The memos were used during triangulation, which occurred after each stage. In addition, a colleague with expertise in qualitative analysis engaged in the initial process of triangulation and again in discussions of final emergent themes. The authors engaged independently in each analysis stage and each triangulation conversation. Triangulation conversations involved aggregating and comparing our discovered themes; checking any potential biases, assumptions, or over-interpretations we brought to the analysis; resolving disagreements; and establishing agreement before embarking on the next phase. These conversations also involved discussion of individual stories as well as overarching stories reflected in the participants’ responses.


Changes in Confidence as Revealed by the Pre- and Post-Institute Survey Responses

We analyzed the responses to the pre- and post-institute survey Likert questions to measure changes in faculty confidence. For each Likert question, we first measured the average pre- and post-institute response values (with a sample size of 181 in the pre-institute survey and 75 in the post-institute survey). Average pre- and post-institute values for all respondents are shown in Table 2. The pre-institute survey responses did not significantly differ between the June (93 participants) and July (75 participants) cohorts, but the January cohort (13 participants) had on average slightly higher confidence upon entering the institute. The post-institute responses did not significantly differ between cohorts. The average confidence for the matched sample of 72 participants increased for all 13 items. Using a paired t test and a sign test, we confirmed that the confidence gains were significant (α ≤ 0.01) for all items. We note that the size of the gain in confidence is inversely correlated with the pre-institute survey response values (R² = 0.8), meaning that the highest gains were seen in those values starting lowest. Because the survey items were aligned with the institute objectives, these confidence gains provide evidence that the institute objectives were achieved. We consider these data in our grounded theory analyses.
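The inverse relationship between baseline confidence and gain amounts to a simple correlation across the 13 items. A minimal sketch, using hypothetical item means rather than our survey data:

```python
import numpy as np

def gain_vs_baseline(pre_means, post_means):
    """Correlate per-item baseline confidence with per-item gains.

    Returns (r, r_squared); a negative r with a high r_squared means
    the items that started lowest gained the most. Illustrative only.
    """
    pre = np.asarray(pre_means, dtype=float)
    gains = np.asarray(post_means, dtype=float) - pre
    r = np.corrcoef(pre, gains)[0, 1]
    return r, r ** 2
```

For example, calling this on a set of item means in which low-baseline items gain the most yields a negative r with an r-squared near 1.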

Thematic Analysis

Using our methodology of constant comparison and triangulation, the data ultimately revealed five themes (see Table 3 for a summary). These themes arose from the participants’ words and were supported by the qualitative and quantitative data collected throughout the institute. Some themes showed overlap (e.g., though we identified “Emotions” as its own theme, emotions also arose in the theme of “Technology”). We also note that the survey confidence gains and takeaways articulated by participants in the post-institute survey aligned with this set of themes. Below, we describe each theme in more detail, using direct quotations as support.

Table 3.

Summary of Themes

Pedagogical Knowledge

Participants’ words reflected awareness of and interest in the pedagogical practices introduced in the institute. They usually spoke in broad theoretical terms about their intent to incorporate pedagogical concepts but at times referred to specific student-centered strategies.

Student Perspective

Participants showed awareness of their role as students in the course. This helped participants gain valuable insights into how student-centered course design positively impacts learning and motivated them to include specific student-centered practices in their intended course redesign. It also led to their expressing empathy for students.

Community and Connection

Through their experience as learners in the institute, participants discovered firsthand the importance of connection and community in a remotely taught course, which further solidified their commitment to fostering these elements in their own courses.


Technology

Both essential and time-demanding, technology created frustrating barriers. It also enabled participants to experience excitement and connections within the institute and to plan for their own courses.

Emotions

Though emotions were mixed throughout the institute and occurred at micro, meso, and macro levels, there was a general pattern as the program progressed: overall, participants went from feeling overwhelmed and frustrated to grateful, reassured, and even optimistic.

1. Pedagogical Knowledge

“[Students] will respond to each other’s writing, provide peer review, reflect on the opinions and reviews of their peers, and work collaboratively on a group project.”

Gaining pedagogical knowledge was a key priority of faculty as they began the institute. In the pre-institute survey, faculty indicated their hope that their participation would lead them to gain understanding and skills around online course design (e.g., how to design effective online assessments), course climate/environment (e.g., how to maintain instructor presence), and technical proficiency (e.g., how to build and organize a course in Canvas). Soon after the institute commenced, maintaining an interactive, engaging, and community-like learning environment emerged as a commonly stated pedagogical priority.

Although participants regularly acknowledged the importance of various pedagogical principles such as alignment among course components, inclusive assessment, and course organization, they less frequently explicitly identified how they might apply these experiences to their own teaching. Sometimes participants expressed liking or being interested in a concept without clear intent to use it; for example, “I, too, will think intentionally about sustainability and inclusivity [in assessment design].” Moreover, when participants were prompted to create specific plans for their course, their ideas reflected fairly conventional strategies (e.g., instructional videos, reading assignments, breakout rooms, discussion boards, group projects) rather than particularly innovative or creative ones. To some extent, this pattern was not surprising: applying new and creative pedagogical techniques can take a good deal of processing time, which was not provided in the institute. Participants had not designed their entire course by the end of the institute, nor was that our expectation.

In the instances when participants did articulate their intent to use a teaching strategy, many were focused on student-centeredness. For example, one aimed to alleviate isolation among students by “pairing students up in groups of two or three for accountability and socializing.” Participants considered ways to ensure their course felt relevant to students; one described thinking that “one thing that will keep them engaged is to emphasize how art making directly relates to the personal experience—what is happening now, Covid, BLM, elections, staying home, isolation.” Participants also described their intent to incorporate some specific pedagogical strategies such as adding scaffolding to assessments via peer review, collaborative projects, milestones, and checkpoints. The post-institute survey showed gains in participant confidence related to pedagogical knowledge: confidence in identifying specific ways to establish community, applying Universal Design for Learning, defining course alignment, incorporating low-stakes and inclusive assessments, and developing assessments that help students develop self-monitoring skills all increased significantly over the course of the institute.

2. Student Perspective

“Experiencing the course as a ‘student’ has been extremely enlightening.”

As soon as participants began to engage with the modules and attend synchronous sessions, their perspective shifted to include that of being a student in the “course.” We intentionally guided participants into this role by asking each to reflect on their experience as a student and its impact on their own course design intentions. In these reflections, participants connected specific interactions and activities to their learning; for example, “This reading … helped me reframe how I think about … ” They noticed when the course facilitators explained the rationale behind certain institute activities. One participant said, “I appreciate the intentionality of telling us what you’re doing and showing the steps you’re taking (mechanically) to do the logistical things.”

Participants appreciated the student-centered focus of the institute and its design. Moreover, they connected their experiences as students to changes in their thinking about students and their approach to teaching. For example, one participant recognized that their own experience as a student in the institute led to them “feeling a lot of empathy for students who are overwhelmed and over-extended.” Participants also expressed awareness of the impact a particular activity had on them as a student, stating how they would adjust their own approach to teaching because of their experience. These connections were not always made explicitly; frequently participants acknowledged their experiences as students separately from identifying changes they would make in their teaching. Despite this, and in combination with the survey analysis showing a significant increase in participants’ confidence in their ability to “describe the necessity of using a student-centered perspective in developing online courses,” the data overall show that taking on the student perspective was an important component of the participants’ experience in the institute.

3. Community and Connection

“Being with a cohort of other educators”

Most participants entered the course design institute looking to create the same kind of community and connection for students in remote courses that they had been able to achieve in person. They also came in mourning a loss of connection, saying they “miss[ed] the interaction with real humans” in their teaching and more generally. Notably, they did not identify finding connection among colleagues as a priority for what they were hoping to gain from the institute. As the institute progressed, they started to acknowledge the possibilities that online learning affords around promoting connection and community—for their future students but also for themselves as institute participants. One participant reported “feeling less isolated” and realizing that “we are all dealing with very similar issues; wanting the best learning experience for our students and trying to adapt to the new reality as best we can.” Participants particularly appreciated time in breakout rooms to “connect with other educators to share concerns, teaching strategies, opinions, exchange ideas, resources.”

Once we facilitators modeled techniques for building community and connection, participants saw how they could use similar methods in their own courses. The post-institute survey indicated a significant increase in participants’ confidence in their ability to identify specific ways to establish community. Participants also identified ways to break from traditional assessments to make them more collaborative and ways to foster interactions in the virtual classroom. By the end of the institute, they identified one of their biggest takeaways as the helpfulness of “being with a cohort of other educators.”

4. Technology

“[I] continue to hate the word module.”

Because the institute was held remotely, technology underpinned the experiences of the participants. At the beginning, participants indicated their motivation to increase their technological skills, awareness, and comfort level; they also expressed concern that technology would “impede” their ability to teach. By the end, faculty were explicit in their intent to use many of the technological tools they saw modeled during the institute. Many also mentioned increased confidence, skills, and ideas for technologies they wanted to use in their courses. Responses to the post-institute survey showed a significant increase in confidence in their ability to “translate face-to-face learning activities online by considering intended goals and adapting from a menu of pedagogical and technological alternatives.”

While there were many positive outcomes for participants as they gained proficiency with technology, during the institute they had varied experiences. Though most indicated at the beginning that they felt they were doing “pretty well” with online learning, once the institute was underway they described technology as a consistent “hindrance” to their participation, voicing concerns that their “technology skills may prove insufficient for the task[s]” they were being asked to accomplish. Technology was identified as a frequent barrier to completing institute assignments; at the same time, it was seen as the source of meaningful experiences of connection and excitement around their own course planning.

We also noticed that technology was emotionally activating for participants and that these emotions fluctuated depending on their sense of stress or pressure. For example, sometimes participants expressed joy as they discovered new affordances of various technologies; one participant observed, “Through Zoom, I actually CAN feel connected to other people in the course.” However, they also frequently expressed negatively emotionally charged perceptions of technology. As the institute progressed, participants expressed overwhelm or frustration with the steep learning curve and amount of time required to gain proficiency with technological tools. For example, one participant explained, “The video assignment was too much for me to put together without giving up the entire weekend, and I couldn’t do that right now and maintain sanity.” Some participants felt too much of the institute was dedicated to technological aspects and others too little. Given the immense task of translating courses into a remote format in a short period of time alongside various other stressful contextual factors, participants’ interaction with technology was overall an emotional experience.

5. Emotions

“Daunted,” “Relieved,” “Thankful”

Learning is not just cognitive; it is social and emotional. Participants expressed many emotions over the course of the institute, sometimes prompted and sometimes unprompted. The strong presence and expression of emotion in the institute were likely shaped by the external context of participation, namely, the beginning of the pandemic and all the loss that accompanied it. Emotions were expressed at micro, meso, and macro levels. At a micro level, emotions related directly to the experience of participating in the institute. At a meso level, emotions related to the prospect of teaching remotely in the fall. At a macro level, emotions related to the greater context of life during the pandemic.

Generally, the summer cohorts expressed stronger, more distinctive negative emotions than the January cohort. They also showed a clearer progression toward positive emotions over the course of the institute. At the beginning of the institute, feeling overwhelmed was a common theme, as voiced by the participant who felt “daunted,” along with stress, mournfulness, frustration, and distraction. Many of these negative emotions were mentioned in relation to the intersection of personal and professional responsibilities; as one participant noted, “It’s been a difficult week with family issues, so I haven’t had as much time and attention for the work as I’d have liked.” By the midpoint of the program, positive emotions such as excitement and enthusiasm became somewhat more prevalent, sometimes signaled through emphasis such as exclamation points or all caps. By the end of the program, positive emotions were expressed most strongly. Many participants communicated feelings of gratitude, as shown by the participant who said, “I cannot thank you enough and hope you can build in islands of peace and rest for yourselves.” They also identified feelings of reassurance, relief, optimism, and being supported. Even given this general progression, positive and negative emotions were expressed at all stages. At the end, when reflecting on barriers that prevented them from completing aspects of the program, participants described lingering negative emotions such as fear, stress, overwhelm, and burnout.


The major themes drawn from the experiences of the participants can help inform the design and execution of future educational development programming, particularly in the realm of course design support for faculty. Based on these themes, we offer four suggestions for future educational development programs. The suggestions are not mutually exclusive (the evidence and examples associated with them sometimes span more than one suggestion), but we hope they offer concrete ideas for change.

  1. Educational development programs should model the student experience. Programs should integrate opportunities for participants to act as students, to experience curricular features and teaching methods, and to reflect on their learning. Because the institute was designed as a course, complete with a syllabus and assignments, with participants acting as students, faculty gained a direct lens into the experiences their own students might have. For example, participants appreciated the sense of connection and community they gained as students and expressed a desire to cultivate the same in their own courses. Similarly, participants recognized that any confusion or frustration they experienced due to organizational flaws or technology failures could also arise in their own courses, prompting empathy for students and a commitment to proactively addressing these issues. This result is consistent with the developmental nature of transformational learning, in that participants experienced being students through the lens of the “values, beliefs, and assumptions” that shaped their meaning making (Merriam, 2004). Moreover, the institute offered an experiential learning opportunity in which participants learned by doing (Kolb, 1984): they learned what it was like to be a student in a remote course by being a student in a remote course, and they were asked to create authentic materials for their own courses. The reflective aspect of this process was an important step in which participants identified ways they planned to apply what they had learned. Indeed, participants were likely to state their intention to use a specific activity or method they themselves had experienced as students in the institute.

  2. Educational development programs should address community and connection. Programs should explicitly support instructors in fostering community and connection among students. When surveyed at the beginning, participants said that maintaining connection and community among students was a priority in the shift to remote instruction. Certainly, the importance of social connection in the learning process cannot be overstated: learning takes place in specific social contexts, learners learn by observing other learners’ behaviors, and students’ sense of connectedness and belonging in a classroom influences their motivation and learning behaviors (Bandura & Walters, 1977; Walton et al., 2012). Nor is this limited to the student context: faculty development has been found to be an important opportunity for community building among instructors (Eib & Miller, 2006), and course design programs in particular can increase instructors’ feelings of connection to the institutional community (Favre et al., 2021). We suggest programs address how instructors can build a true sense of connection among students while intentionally designing opportunities for connection among institute participants.

  3. Educational development programs should incorporate time for synthesis. Even after five weeks, most participants did not articulate many commitments or clear intentions to implement specific strategies related to major pedagogical concepts and themes (e.g., inclusive assessment, student-centered teaching). Rather, they stated their appreciation for the importance of those concepts or offered more abstract intentions. We believe this was because we did not provide explicit time for participants to process, synthesize, and apply the concepts and strategies discussed in the institute. Processing time is necessary for learners to situate new knowledge within their existing frameworks; moreover, continued practice recalling and applying new knowledge over time is critical for deep learning (Ambrose et al., 2010; Cepeda et al., 2006; Erickson & Kruschke, 1998). Dedicating time to fostering participants’ metacognitive skills, including practicing, planning, monitoring, and self-assessing their teaching, is one important way course design institutes can help faculty become more expert teachers (Johnson et al., 2017). To ensure participants leave with clear, concrete intentions for next steps, time must be set aside for them to identify those intentions.

  4. Educational development programs should periodically prompt faculty to identify and reflect on their emotions. Research has shown that a learner’s emotions shape their learning experience in any context, as the two are inextricably linked (Cavanagh, 2016; Immordino-Yang, 2015). During our institute, emotions ran high as both participants and facilitators struggled to navigate the uncertainties, losses, and grief caused by the pandemic. Though the emotions expressed were often difficult ones, explicitly identifying them helped build a sense of connection among participants and facilitators. When these emotions surfaced, facilitators responded quickly by being transparent and making adjustments where possible. Given the important role of student emotion, we modeled techniques for getting in touch with emotions in the (virtual) classroom so that faculty could do the same in their own teaching. This implication aligns with previous findings that affective skills, including “compassion, empathy, and listening,” are “crucial components” of educational development during the pandemic (McGowan & Bessette, 2020). Moreover, though not addressed in the institute, instructors should be aware of trauma-informed teaching practices so they are prepared to respond with sensitivity and authenticity depending on the context and severity of student emotions.

Future Studies

This study explored the experiences of participants within an institute, but more research is needed to understand the impact of these experiences on faculty practices. In the years following these institutes, participants have referenced, in informal interactions with us, ways their experiences changed their practice. However, a systematic study would be useful to characterize the specific actions that result from faculty participation in institutes. Such a study could first ask participants to submit teaching artifacts (e.g., syllabi, assignments, exams, slides, lecture outlines) and then follow up during subsequent semesters to analyze the ways these artifacts evolved. Past participants could be interviewed to explore the motivations and perspectives behind teaching changes. As specific actions are identified, we could then investigate their alignment with the themes arising from our analysis of faculty experiences within the institute.

In future studies, methods for data collection and analysis will depend on program modality. In this study, we were able to easily capture and document participants’ thinking and contributions because of the tools we used to enable a fully remote modality. This allowed us to take a unique approach to studying this educational development program: a deep analysis of participants’ experiences based on the text-rich data they submitted. Moving forward, we are uncertain how often similar analysis approaches will apply, as educational developers are still grappling with choices of modality more than two years after the onset of the pandemic. We are experimenting with hybrid (combined face-to-face and remote) programming, which seems more accessible to participants while maintaining some of the impromptu connections and other benefits of fully face-to-face programming. Regardless of modality, programs can continue to use shared document tools (e.g., Google Docs, virtual whiteboards) such as those we leveraged during the remote institute, as they offer flexibility and more equitable participation. It may therefore be possible to capture similar text-rich data in future programming across modalities.


This study explored faculty’s experiences in a course design institute by attending to their words and responses as captured through surveys, activities, chat logs, and shared documents. The data revealed five themes (pedagogical knowledge, student perspective, community and connection, technology, and emotions) that relate to changes in faculty attitudes, perceptions, and pedagogical approaches. Though some of these may have been more visible because data were captured in the first year of the COVID-19 pandemic, these themes and their implications span beyond the context of a crisis.

Because this study focused on the faculty’s words to understand their experiences, our analysis revealed themes that were not explicitly factored into the initial design of the institute. Ultimately, we discovered the usefulness of this technique for analyzing the impact of professional development programming. Additionally, this technique elucidated specific recommendations for the design of future programs: modeling the student experience, addressing community and connection, building in time for synthesis and commitment, and prompting participants to identify and reflect on their emotions.


We want to give a heartfelt thank you to Dana Grossman Leeman for her guidance throughout the development of this article, continued conversations about qualitative methods, and independent triangulation of our themes and analysis. We also wish to thank our colleagues Ryan Rideau, Annie Soisson, and Dana Grossman Leeman and staff at Tufts Educational Technology Services who collaborated on the development of the Institute.


Carie Cardamone, Associate Director, Tufts University Center for the Enhancement of Learning and Teaching, brings her passion to make science education both inclusive and exciting to her work supporting faculty in STEM disciplines and in the professional schools. She builds on her experience as an astronomer and education researcher and is interested in the ways that assessments can be used to support student learning and advance equity in the curriculum.

Heather Dwyer is currently Associate Director for Teaching, Learning & Inclusion at Tufts University’s Center for the Enhancement of Learning and Teaching. Dr. Dwyer’s professional interests include evidence-based teaching, inclusive teaching, and measuring the impact of CTL efforts. Her article “A Mandatory Diversity Workshop for Faculty: Does It Work?” was published in 2020. She earned her doctorate in Ecology at the University of California, Davis.


Ambrose, S. A., Bridges, M. W., DiPietro, M., Lovett, M. C., & Norman, M. K. (2010). How learning works: Seven research-based principles for smart teaching. John Wiley & Sons.

Amundsen, C., & Wilson, M. (2012). Are we asking the right questions? A conceptual review of the educational development literature in higher education. Review of Educational Research, 82(1), 90–126. https://doi.org/10.3102/0034654312438409

Bandura, A., & Walters, R. H. (1977). Social learning theory (Vol. 1). Prentice-Hall.

Bauch, P. A. (1984, April). The impact of teachers’ instructional beliefs on their teaching: Implications for research and practice [Paper presentation]. Annual meeting of the American Educational Research Association, New Orleans, United States. https://eric.ed.gov/?id=ED252954

Borda, E., Schumacher, E., Hanley, D., Geary, E., Warren, S., Ipsen, C., & Stredicke, L. (2020). Initial implementation of active learning strategies in large, lecture STEM courses: Lessons learned from a multi-institutional, interdisciplinary STEM faculty development program. International Journal of STEM Education, 7(1), 4. https://doi.org/10.1186/s40594-020-0203-2

Brantmeier, E. M., Molloy, C., & Byrne, J. (2017). Writing renewal retreats: The scholarly writer, contemplative practice, and scholarly productivity. To Improve the Academy, 36(2). https://doi.org/10.3998/tia.17063888.0036.205

Buckley, J. B. (2019). A grounded theory of education for sustainability in the postsecondary classroom. The Review of Higher Education, 42(3), 965–989. https://doi.org/10.1353/rhe.2019.0026

Cavanagh, S. R. (2016). The spark of learning: Energizing the college classroom with the science of emotion. West Virginia University Press.

Cepeda, N. J., Pashler, H., Vul, E., Wixted, J. T., & Rohrer, D. (2006). Distributed practice in verbal recall tasks: A review and quantitative synthesis. Psychological Bulletin, 132(3), 354–380. https://doi.org/10.1037/0033-2909.132.3.354

Conrad, C. F. (1982). Grounded theory: An alternative approach to research in higher education. The Review of Higher Education, 5(4), 239–249. https://doi.org/10.1353/rhe.1982.0010

Debelius, M., McGowan, S., Maciel, A., Reid, C., & Eason, A. (2021). “Things are different now”: A student, staff, and faculty course design institute collaboration. In T. N. Thurston, K. Lundstrom, & C. González (Eds.), Resilient pedagogy: Practical teaching strategies to overcome distance, disruption, and distraction (pp. 272–288). Utah State University. https://doi.org/10.26079/a516-fb24

Ebert-May, D., Derting, T. L., Hodder, J., Momsen, J. L., Long, T. M., & Jardeleza, S. E. (2011). What we say is not what we do: Effective evaluation of faculty professional development programs. BioScience, 61(7), 550–558. https://doi.org/10.1525/bio.2011.61.7.9

Eib, B. J., & Miller, P. (2006). Faculty development as community building—An approach to professional development that supports Communities of Practice for Online Teaching. The International Review of Research in Open and Distributed Learning, 7(2). https://doi.org/10.19173/irrodl.v7i2.299

Erickson, M. A., & Kruschke, J. K. (1998). Rules and exemplars in category learning. Journal of Experimental Psychology: General, 127(2), 107–140. https://doi.org/10.1037/0096-3445.127.2.107

Favre, D. E., Bach, D., & Wheeler, L. B. (2021). Measuring institutional transformation: A multifaceted assessment of a new faculty development program. Journal of Research in Innovative Teaching & Learning, 14(3), 378–398.

Fetherston, B., & Kelly, R. (2007). Conflict resolution and transformative pedagogy: A grounded theory research project on learning in higher education. Journal of Transformative Education, 5(3), 262–285. https://doi.org/10.1177/1541344607308899

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Aldine Transaction.

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984. https://doi.org/10.1002/tea.20439

Hutchinson, S. A. (1986). Education and grounded theory. Journal of Thought, 21(3), 50–68.

Immordino-Yang, M. H. (2015). Emotions, learning, and the brain: Exploring the educational implications of affective neuroscience. W. W. Norton & Company.

Immordino-Yang, M. H., Darling-Hammond, L., & Krone, C. (2018). The brain basis for integrated social, emotional, and academic development [Research brief]. The Aspen Institute National Commission on Social, Emotional, and Academic Development. https://www.aspeninstitute.org/publications/the-brain-basis-for-integrated-social-emotional-and-academic-development/

Johnson, T. A., Holt, S. A., Sanders, M., Bernhagen, L., Plank, K., Rohdieck, S. V., & Kalish, A. (2017). Metacognition by design: How a course design experience can increase metacognition in faculty. To Improve the Academy, 36(2), 117–127. https://doi.org/10.1002/tia2.20057

Kaldor, E., Streifer, A., Fournier, K., Vecchione, M., Kane, M., & Moore, C. S. (2021). Researching the resilient CDI: Pandemic lessons and opportunities [Conference presentation]. #POD21: Evolving Beyond Crisis—Connecting to the Future, 46th Annual POD Network Conference. https://podconference2021.sched.com/event/p8mj/researching-the-resilient-cdi-pandemic-lessons-and-opportunities

Kennedy, T. J. T., & Lingard, L. A. (2006). Making sense of grounded theory in medical education. Medical Education, 40(2), 101–108. https://doi.org/10.1111/j.1365-2929.2005.02378.x

Kolb, D. A. (1984). Experiential learning: Experience as the source of learning and development. Prentice-Hall.

Lee, V. S. (2010). Program types and prototypes. In K. J. Gillespie, D. L. Robertson, & Associates (Eds.), A guide to faculty development (2nd ed., pp. 21–34). Jossey-Bass.

Levinson-Rose, J., & Menges, R. J. (1981). Improving college teaching: A critical review of research. Review of Educational Research, 51(3), 403–434. https://doi.org/10.3102/00346543051003403

Marra, R. (2005). Teacher beliefs: The impact of the design of constructivist learning environments on instructor epistemologies. Learning Environments Research, 8(2), 135–155. https://doi.org/10.1007/s10984-005-7249-4

Marshall, C., & Rossman, G. B. (1995). Designing qualitative research. Sage Publications.

McGowan, S., & Bessette, L. S. (2020). Affective labor and faculty development: COVID-19 and dealing with the emotional fallout. Journal on Centers for Teaching and Learning, 12. https://hcommons.org/deposits/item/hc:34215/

Merriam, S. B. (2004). The role of cognitive development in Mezirow’s transformational learning theory. Adult Education Quarterly, 55(1), 60–68. https://doi.org/10.1177/0741713604268891

Muijs, D., & Reynolds, D. (2002). Teachers’ beliefs and behaviors: What really matters? Journal of Classroom Interaction, 37(2), 3–15.

Palmer, M. S., Streifer, A. C., & Williams-Duncan, S. (2016). Systematic assessment of a high-impact course design institute. To Improve the Academy, 35(2), 339–361. https://doi.org/10.1002/tia2.20041

Prebble, T., Hargraves, H., Leach, L., Naidoo, K., Suddaby, G., & Zepke, N. (2004). Impact of student support services and academic development programmes on student outcomes in undergraduate tertiary study: A synthesis of the research [Research report]. Ministry of Education, New Zealand.

Roberson, P. K., Shema, S. J., Mundfrom, D. J., & Holmes, T. M. (1995). Analysis of paired Likert data: How to evaluate change and preference questions. Family Medicine, 27(10), 671–675.

Stes, A., Min-Leliveld, M., Gijbels, D., & Van Petegem, P. (2010). The impact of instructional development in higher education: The state-of-the-art of the research. Educational Research Review, 5(1), 25–49. https://doi.org/10.1016/j.edurev.2009.07.001

Sunal, D. W., Hodges, J., Sunal, C. S., Whitaker, K. W., Freeman, L. M., Edwards, L., Johnston, R. A., & Odell, M. (2001). Teaching science in higher education: Faculty professional development and barriers to change. School Science and Mathematics, 101(5), 246–257. https://doi.org/10.1111/j.1949-8594.2001.tb18027.x

Walton, G. M., Cohen, G. L., Cwir, D., & Spencer, S. J. (2012). Mere belonging: The power of social connections. Journal of Personality and Social Psychology, 102(3), 513–532. https://doi.org/10.1037/a0025731

Weimer, M., & Lenze, L. F. (1991). Instructional interventions: A review of the literature on efforts to improve instruction. In J. C. Smart (Ed.), Higher education: Handbook of theory and research (Vol. 7, pp. 294–333). Agathon Press.

Wheeler, L. B. (2021). Supporting STEM faculty of large enrollment undergraduate courses: A mixed methods study of impact. International Journal for the Scholarship of Teaching and Learning, 15(1). https://doi.org/10.20429/ijsotl.2021.150107

Wlodkowski, R. J., & Ginsberg, M. B. (2017). Enhancing adult motivation to learn: A comprehensive guide for teaching all adults. John Wiley & Sons.