Despite robust scholarly evidence that inclusive teaching practices produce more equitable learning outcomes for students (e.g., Dewsbury et al., 2022; Theobald et al., 2020), widespread uptake of these practices by instructors remains low. Systemic and individual barriers prevent instructors from changing their pedagogical practice, including lack of time, training, and incentives as well as fear of negative student evaluations if they try new teaching methods (Addy et al., 2021b; Henderson et al., 2011). Instructors may also grapple with tensions between teaching and their research productivity and professional identity (Brownell & Tanner, 2012). Departmental cultures may act as barriers to implementing inclusive practices if instructors perceive that time spent on teaching is not valued or that lecture is the normative pedagogical approach (Shadle et al., 2017; Sturtevant & Wheeler, 2019). Many centers for teaching and learning (CTLs) aim to address these challenges by developing long-term programming to train instructors (Palmer et al., 2016; Addy et al., 2023); however, the effort that goes into reflecting and iterating to improve inclusive teaching practices may not be sustainable if no departmental or institutional structures acknowledge or incentivize teaching development efforts.

To address the barriers to advancing pedagogical change, we collected data in partnership with academic departments—from multiple sources at different levels (students, courses, department)—that describe current teaching practices and learning experiences as well as contextual factors that influence the implementation of inclusive and equitable teaching practices. We developed tools to identify and describe inclusive and equitable teaching occurring in the department, including a rubric to analyze course syllabi. The use of course syllabi as a primary data source allowed us to leverage a universal teaching artifact that provided abundant information about course-level teaching and course design practices without asking for much time or effort from faculty members. Data from course syllabi allowed us to recognize inclusive teaching practices that were already being implemented in the department to highlight exemplars and adopt an adaptive, contextual approach to advance inclusive and equitable teaching. To that end, we leveraged these data to guide collaborative discussions with department partners to contextualize findings, interpret results, unpack teaching implications, and inform changes in departmental policies and instructor pedagogy. The methodology of the project as a whole is described in more detail in Soicher et al. (2024).

In this paper, we describe the development and implementation of a syllabus analysis tool: a rubric to identify and describe inclusive and equitable pedagogical practices observed in course syllabi. We designed the rubric to qualitatively assess course-level teaching and course design practices in focal courses within each department. This approach yielded insights into teaching practices within individual courses and patterns across courses within a department, allowing us to tailor our approach to the distinct disciplinary norms, resources, and needs of academic departments. Shifting teaching practices across the department may help individual instructors sustain and continue to develop inclusive teaching practices (Ngai et al., 2020; Reinholz & Apkarian, 2018).

Course Syllabi as Indicators of Inclusive and Equitable Teaching

The syllabus is an essential teaching artifact of academic courses in higher education. Typically, instructors develop the syllabus as a roadmap for the course, indicating what students will know and be able to do at the end of the class. It also describes the learning activities, teaching methods, and assessments students will encounter along the way. Writing the syllabus prompts the instructor to make decisions about their priorities for student learning, the sequencing of topics and activities, the weighting of different assessments, course policies, and options for students if unforeseen circumstances arise during the course. For students, the syllabus shapes their perceptions of the instructor (Harnish & Bridges, 2011; Wheeler et al., 2019) as well as their expectations for engagement and success in the course (Richmond et al., 2016). Accordingly, educational developers often design programming to support instructors’ course design decisions and their revision of syllabi to center student learning during course design institutes (Palmer et al., 2014). Additionally, syllabi are frequently used as evidence of teaching quality in faculty job applications and tenure and promotion materials, as well as evidence of course and learning outcomes quality in accreditation efforts.

Given the prevalence of and ease of access to course syllabi, we decided to analyze syllabi as a primary data source in our project to access rich information about inclusive and equitable teaching that can be readily compared across disciplines and teaching contexts. As noted by Bers et al. (2000), “course syllabi are unobtrusive but powerful indicators of what takes place in classrooms” (p. 7). Educational developers have used syllabi to guide priorities in CTL work and to evaluate the impacts of CTL programming and support. CTL staff have reviewed syllabi to inform programming and support for faculty to advance learner-centered instruction (Cullen & Harris, 2009). Others have consulted syllabi to identify faculty and departments with shared interests (e.g., use of high-impact practices) and to initiate collaborations to support the development of those interests (Stanny et al., 2015). CTL staff have also reviewed syllabi to evaluate the outcomes of their programming, including Course Design Institutes (Palmer et al., 2016) and consultations (Hershock et al., 2022). These formal efforts to qualitatively analyze course syllabi have focused on identifying evidence of learner-focused practices emphasizing what and how students will learn in the course, in comparison to content- or instructor-centered approaches emphasizing delivery of content. The syllabus rubric we developed builds on these previous tools used to analyze course syllabi (e.g., Palmer et al., 2014), with the added goal of capturing evidence of inclusive and equitable course design and teaching practices.

In contrast to inclusive and equitable teaching initiatives that focus on content and curricula (i.e., what instructors teach), we aimed to capture a set of equity-minded strategies and practices (i.e., how instructors teach) that can be observed in course syllabi. We define inclusive and equitable teaching as strategies and course design practices intended to cultivate a learning environment in which students have equal access to learning and feel valued and supported in their learning. In more concrete terms, these practices involve communicating with transparency and purpose, designing active and structured learning experiences, engaging with diversity, and providing opportunities to build relationships with instructors and other students (Artze-Vega et al., 2023).

In this paper, we describe our process for developing and testing the rubric and how we applied it to analyze course syllabi. We designed the rubric, which comprises 22 criteria, to measure indicators of inclusive and equitable teaching within the context of a single course. We share outcomes of the process, such as the impact of collaboration on this project, as well as the findings, including how we facilitated data-informed discussions within departments regarding the strengths, challenges, and observed inequities in student experiences. We also highlight potential applications of the syllabus rubric as a useful tool: for instructors to reflect and iterate on the elements of inclusive and equitable teaching in their courses; for department administrators to identify patterns of inclusive and equitable teaching across the curriculum; and for educational developers to engage in data-informed consultations or tailored programming with instructors and departments. We offer this syllabus rubric and analytic process to educational developers and department leaders who may wish to engage with instructors in a formative assessment of course design and inclusive and equitable teaching practices.

Process

Institutional Context

Massachusetts Institute of Technology (MIT) is a private, STEM-focused research university with more than 2000 faculty and teaching staff, 1500 postdocs, 7300 graduate students, and 4500 undergraduate students. In 2024–2025,1 women accounted for 48% of the undergraduate and 42% of the graduate student population. Fifty-seven percent of undergraduates and 21% of graduate students self-identified as members of one or more US minority groups. International students accounted for 12% of the undergraduate and 40% of the graduate student population. In the undergraduate class of 2028, 24% of students were Pell Grant eligible and 20% were the first generation in their family to attend college.

Across MIT, there are 30 academic departments across five schools and one college. MIT is a decentralized institution, with much of the decision-making authority occurring at the departmental level, as illustrated in recent university-wide strategic action plans that emphasize local plans at the department level. Our center for teaching and learning (the Teaching + Learning Lab) is a centralized resource for teaching support at the institution with staff who are experts in teaching, learning, research, and evaluation.

At MIT, instructors have significant flexibility in the format and content of their syllabi, except for certain policies pertaining to exam timing, supplemental class scheduling, assignment and test scheduling in relation to the last week of classes, and grades. Who writes the syllabus also varies across the university, ranging from individual instructors for smaller courses to course leads for large or multi-section courses with teaching teams. Notably, in some cases the instructor who teaches a section of the course and interacts with students may not be the author of the syllabus.

Developing and Testing the Syllabus Rubric

Two members of the research team (RCT and LBG) developed the rubric by generating a list of potential dimensions based on scholarly literature, drafting descriptions of scale levels for each criterion, and then testing the syllabus rubric on a set of course syllabi from different disciplines.

In the first step of developing the syllabus rubric, we used a deductive approach (Galman, 2013) to compile a list of inclusive and equitable course design and teaching practices based on scholarly literature2 that would be broadly applicable across different disciplines and teaching contexts, prioritizing the teaching strategies and practices over the content being taught. We consulted overviews of inclusive teaching practices (Addy et al., 2021a; Center for Research on Learning and Teaching at U-M, 2021; Dewsbury & Brame, 2019; Salehi et al., 2021) as well as scholarly literature on:

We then refined this list to select course design and teaching practices relevant to course syllabi, developing a long list of potential criteria for the syllabus rubric.

While we developed most potential criteria based on the scholarly literature, we recognized that some qualities of inclusive and equitable teaching could be more abstract or not found in existing rubrics. This applied to the principle of critical engagement with difference, which involves acknowledging students’ different identities and experiences. To address this limitation, we also took an inductive approach (Galman, 2013) to generate criteria by reviewing three learner-centered test syllabi from different institutions and disciplines (natural science, engineering, social science/humanities). Through this process, we developed four criteria that aligned with this principle: affirming student diversity, diversity among ideas, variety in assessments, and opportunities for autonomy and choice in assessments.

To draft descriptions of all potential criteria, we evaluated existing rubrics for learner-centered syllabi. Our goal was to identify criteria that illustrate inclusive and equitable teaching practices, such as: alignment between intended learning outcomes and assessments; qualities of formative and summative assessments; course policies and their rationale (Hershock et al., 2022; Palmer et al., 2014); emphasis on classroom community; opportunities for peer collaboration and instructor support; and communication of the relevance of coursework (Cullen & Harris, 2009). At this stage, we drafted descriptions for each criterion at each level of the scale, ranging from 0 (not present/assessable) to 3 (present and clear). We then discussed and iterated on the observable characteristics to clearly differentiate between the levels of the scale.

Once we had a working draft of the syllabus rubric, we (RCT and LBG) separately tested it to evaluate three syllabi from different disciplines.3 These test syllabi were drawn from different institutions and disciplines (natural science, engineering, social science/humanities) and used by our CTL in our programming on syllabus construction. We noted any criteria that were difficult to evaluate or that overlapped significantly with other criteria. Following our independent assessments, we compared ratings and discussed each criterion, with the aim of adapting the rubric to home in on concrete, observable characteristics for each level of the scale and of reaching consensus on criteria we rated differently. At this stage, we continued to refine the working draft of the syllabus rubric. In some cases, we merged multiple criteria to simplify coding while retaining important distinctions between principles of inclusive and equitable teaching. For example, we combined a criterion that focused on belonging (e.g., encouraging and inviting students to engage with instructors) with one that focused on structure (e.g., specific methods to engage with instructors beyond office hours) because both addressed student-instructor engagement. Our goal in this process was to prioritize and maintain measurable characteristics of inclusive and equitable teaching principles that could provide actionable feedback on how and where to revise syllabi.

The Inclusive and Equitable Syllabus Rubric

The syllabus rubric consists of 22 criteria, categorized into four principles based on the organizing framework from the Center for Research on Learning and Teaching at University of Michigan (2021), which also aligns with the way we present and discuss inclusive and equitable teaching in our CTL programming:

  • Transparency involves clearly communicating why students are learning course concepts and skills, how learning activities and assessments help them learn, and what students need to do to succeed. It also involves explaining the reasoning underlying course policies. [10 criteria]

  • Academic Belonging refers to students’ sense of being accepted and valued in academic classrooms by building relationships with peers and with the instructor(s) and teaching staff and by recognizing the relevance and value of their coursework. [3 criteria]

  • Structure (sometimes referred to as Structured Interactions) describes a systematic approach to designing course elements and activities to help students learn and apply concepts and skills as well as protocols or processes that support equitable opportunities for students to participate and interact in the classroom. [5 criteria]

  • Critical Engagement of Difference acknowledges and responds to students’ different identities, experiences, perspectives, strengths, and needs by affirming the value of diversity, recognizing different perspectives and ways to solve problems, and allowing for different pathways for successful learning in the course. [4 criteria]

Each criterion was rated along four levels: not present (0), present but unclear (1), present and somewhat clear (2), and present and clear (3). Each criterion was accompanied by a detailed, qualitative description of observable characteristics at each level (see Table 1 for examples of selected criteria and their descriptive levels). See the Appendix for a list and brief description of all criteria.

Table 1. Representative Examples of Criteria and Descriptive Levels from the Syllabus Rubric

Levels: 3 = Present and clear; 2 = Present and somewhat clear; 1 = Present but implied or unclear.

Learning tips and resources (Transparency)
  • Level 3: Clearly describes actionable tips and concrete strategies on how to succeed in the course, which may also include external tools and campus resources to support students’ development of the skills and knowledge in the course
  • Level 2: Describes general tips, strategies, or resources to support success in the course, but they are not linked to the development of the skills and knowledge in the course (e.g., learning outcomes, assessments)
  • Level 1: Lists resources but does not describe the relevance of resources or additional tips or strategies for success

Grading criteria (Transparency)
  • Level 3: Clearly articulates how assessments will be graded and makes available detailed criteria, rubrics, or annotated examples
  • Level 2: Articulates how assessments will be graded with brief criteria or rubrics
  • Level 1: Articulates how assessments will be graded without criteria or rubrics. May mention generic criteria outlined in the course catalogue

Peer support and collaboration (Belonging)
  • Level 3: Includes learning activities that provide opportunities for students to interact as peers, identify shared interests in course content, and collaborate; acknowledges the value of students learning from one another and/or provides opportunities for students to reflect on their collaboration
  • Level 2: Includes some learning activities that provide opportunities for students to interact as peers and few options to collaborate and some reflections on the value of collaboration
  • Level 1: Includes some learning activities that provide opportunities for students to interact as peers but no options to collaborate

Embedded opportunities for practice (Structure)
  • Level 3: Describes frequent and consistent class activities and formative assessments that allow students to practice with key concepts and skills that are clearly aligned with learning outcomes and/or the summative assessments
  • Level 2: Describes some class activities and formative assessments that allow students to practice with concepts and skills; alignment with learning outcomes and/or summative assessments is unclear
  • Level 1: Describes few class activities and formative assessments; it is unclear whether or how students will gain practice with concepts and skills

Autonomy in assessments (Engagement with Difference)
  • Level 3: Builds in multiple opportunities for student choice in assessments (e.g., multiple options for topics or modalities for assignments, optional opportunities for instructor or peer feedback on drafts)
  • Level 2: Includes and describes opportunities for student choice that are either restricted, such as choosing a topic from a list, or isolated to a single major assessment
  • Level 1: Implies opportunities for student choice on a single assessment but does not fully explain
  • Note: Each criterion also included a level 0, which indicated that the criterion was not present or assessable. For example, for the criterion Learning tips and resources, the description of level 0 was “does not mention additional supporting resources.” The full syllabus rubric is available at https://tll.mit.edu/inclusive-equitable-syllabus-rubric/.
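For readers who may wish to record ratings programmatically (e.g., across many syllabi and coders), the sketch below illustrates one possible way to represent the rubric’s structure in code. It is an illustration only, not part of the project’s tooling; the criterion names shown are drawn from Table 1, and the full rubric contains 22 criteria.

```python
# Illustrative only: one way to encode the rubric's structure (principles,
# criteria, and the 0-3 scale) for programmatic use. Criterion names mirror
# Table 1; the full rubric contains 22 criteria across four principles.
from dataclasses import dataclass, field

SCALE = {
    0: "Not present/assessable",
    1: "Present but unclear",
    2: "Present and somewhat clear",
    3: "Present and clear",
}

@dataclass
class Criterion:
    name: str
    principle: str  # Transparency, Academic Belonging, Structure, Critical Engagement of Difference
    level_descriptions: dict[int, str] = field(default_factory=dict)

rubric = [
    Criterion("Learning tips and resources", "Transparency"),
    Criterion("Grading criteria", "Transparency"),
    Criterion("Peer support and collaboration", "Academic Belonging"),
    Criterion("Embedded opportunities for practice", "Structure"),
    Criterion("Autonomy in assessments", "Critical Engagement of Difference"),
]

# A rating of one syllabus is then a mapping from criterion name to a score on the scale.
ratings = {"Grading criteria": 3, "Autonomy in assessments": 1}
```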

There is conceptual overlap among some criteria across the four principles. For example, the criterion “opportunities for student engagement with instructor” is categorized under the principle of belonging based on the literature indicating the importance of students feeling connected to instructors to support their sense of belonging (Polmear et al., 2024; Rainey et al., 2018). However, the qualitative description of the criterion also reflects concrete, defined opportunities for students to interact with instructors (e.g., required meetings with instructors during class time or using Piazza or Slack to ask questions), aligning with the principle of structure.

We also decided to integrate criteria that align with the U-M principle of flexibility into those of structure and critical engagement with difference. We based this decision on the conceptual overlap of these criteria with other principles and a desire for parsimony. For example, building in opportunities for student choice in assessments provides students with flexibility in how they demonstrate their learning while also showing responsiveness to students’ different strengths and needs, aligning with the principle of critical engagement with difference. Conceptual overlap and criteria that align with flexibility are noted in the Appendix and in the full syllabus rubric posted on the website. This decision was also based, in part, on institutional and disciplinary considerations. As a STEM-focused institution, we considered that some STEM disciplines emphasize technical accuracy, precision, and minimal tolerance for numerical error (Villanueva et al., 2018). We anticipated that some faculty within STEM disciplines might question the applicability of criteria associated with the label of flexibility. Based on preliminary feedback from faculty colleagues in STEM disciplines, we found that integrating flexible teaching practices into other principles allowed us to more effectively promote inclusive practices that involve adapting to student needs. We note that this decision fits our institutional context; in other institutions, disciplines, and/or educational contexts, the framework may be adapted to include flexibility as a stand-alone principle.

Syllabus Analysis

In the first two years of the project, we used the rubric to analyze 17 syllabi from focal courses across three undergraduate departments and 12 syllabi from core courses in one graduate degree program. The four departments we partnered with represent different academic disciplines (one science, two engineering, and one social science).

Collecting Syllabi. To identify focal courses in each department or program, we guided a discussion with departmental leadership about the curriculum. We defined focal courses as those that are routinely offered by the department and may include (1) large enrollment courses and/or introductory courses that drive student interest in the major, (2) core classes that most or all students majoring within the department are required to take, and (3) courses that leadership feels might be critical to the culture of inclusion and equity in the department (e.g., courses that draw students with highly varied levels of preparation, prior knowledge, backgrounds, or interests). We pulled a list of potential courses from the department website and course schedule and asked departmental leadership for feedback and additions.

To collect syllabi for coding, a member of the research team contacted departmental administrators, course leads, and individual instructors to request syllabi from focal courses, with permission from instructors. Next, a research team member not involved in future rounds of coding collected class-level attributes from the syllabi. This included department, student level (e.g., lower/upper division), type of class (e.g., lecture, lab), class size and enrollment, credit hours, etc. This team member also deidentified the syllabi by redacting any text identifying instructors and/or teaching assistants and staff to reduce the likelihood of bias affecting our ratings. Once syllabi were deidentified, we used Dedoose software (2024) to conduct two additional rounds of coding: descriptive and evaluative.

Descriptive Coding. The goal of descriptive coding was to capture relevant syllabus excerpts and surrounding context for later evaluation. Two coders were assigned to each criterion. The primary coder took an initial pass through all syllabi, focusing on identifying evidence for one criterion at a time, and flagging excerpts of the syllabi that were relevant to that criterion. Subsequently, the secondary coder went through the syllabi and flagged (1) excerpts that they believed the primary coder had missed and (2) excerpts flagged by the primary coder that they believed were irrelevant to the criterion. The primary coder then reviewed any flagged excerpts to accept suggestions by the second coder or note discrepancies that should be discussed. In general, we encouraged coders to err towards flagging more excerpts to be considered in the evaluative coding stage; accordingly, most suggestions by the second coder were accepted, with discussions only occurring when the two coders appeared to have different interpretations of the rubric that indicated a need to clarify language in criterion descriptions.
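To make the reconciliation step concrete, the sketch below tracks one hypothetical criterion’s excerpts through the two coding passes. The excerpt identifiers and decisions are invented, and the project’s actual coding was carried out in Dedoose rather than in code.

```python
# Hypothetical bookkeeping for the two-pass descriptive coding of one criterion
# in one syllabus (the project used Dedoose; this is only an illustration).
primary_flags = {"exc_01", "exc_02", "exc_05"}   # excerpts flagged by the primary coder
secondary_additions = {"exc_07"}                 # excerpts the secondary coder thought were missed
secondary_challenges = {"exc_01"}                # primary excerpts the secondary coder judged irrelevant

# The primary coder reviews the suggestions and accepts or rejects each one;
# anything not resolved is noted as a discrepancy to discuss.
accepted_additions = {"exc_07"}
accepted_removals = set()                        # the challenge to exc_01 was not accepted

needs_discussion = (secondary_challenges - accepted_removals) | (secondary_additions - accepted_additions)
final_flags = (primary_flags | accepted_additions) - accepted_removals
print(sorted(final_flags), sorted(needs_discussion))
```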

Evaluative Coding. The goal of evaluative coding was to review the excerpts identified during descriptive coding and determine an overall score for each syllabus on each criterion. Two coders were assigned to each criterion and worked simultaneously but independently to assign ratings for one criterion at a time across all the syllabi. Each coder reviewed all instances of evidence flagged within one syllabus for a given criterion and assigned a rating of 0, 1, 2, or 3 based on the cumulative evidence:

  • A score of 0 indicated that there was no evidence present in the syllabus for a given criterion.

  • A score of 1 indicated that evidence of the criterion was present but unclear.

  • A score of 2 indicated that evidence of the criterion was present and somewhat clear.

  • A score of 3 indicated that evidence of the criterion was present and clear. Overall, scores of 3 were reserved for exemplary demonstrations of each criterion.

In the first year of the project, the inter-rater reliability for evaluative coding prior to resolving discrepancies was high (Cohen’s kappa between 0.80 and 0.89; McHugh, 2012). In the second year of the project, we trained two new coders to assist with coding. The inter-rater reliability was more variable but still in the acceptable range (0.59 to 0.90).

After all syllabi were evaluated by both coders independently for a given criterion, the coders met to compare and discuss their ratings and resolve any discrepancies. If they could not reach an agreement, a third coder made the final decision. Coders noted any clarifications, guidelines, or rules they applied during discussions in the syllabus rubric. The coders also recorded any criteria that were difficult to evaluate or cases where there was minimal variability in the ratings. This process was repeated until all syllabi had scores for all criteria.
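For readers replicating this agreement check, the following is a minimal sketch, with invented scores rather than project data, of how pre-consensus inter-rater reliability for a single criterion could be computed and how disagreements could be surfaced for the consensus discussion.

```python
# A minimal sketch (invented scores, not project data) of checking inter-rater
# agreement for one criterion before discrepancies are resolved.
from sklearn.metrics import cohen_kappa_score

coder_a = [3, 2, 2, 0, 1, 3, 2, 1, 0, 2, 3, 1]   # one 0-3 rating per syllabus
coder_b = [3, 2, 1, 0, 1, 3, 2, 1, 0, 2, 2, 1]

kappa = cohen_kappa_score(coder_a, coder_b)
print(f"Cohen's kappa: {kappa:.2f}")

# Syllabi where the two coders disagree are the cases discussed at the
# consensus meeting; a third coder adjudicates if agreement cannot be reached.
disagreements = [i for i, (a, b) in enumerate(zip(coder_a, coder_b)) if a != b]
print(f"Syllabi to discuss (by index): {disagreements}")
```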

Editing of Syllabus Rubric during Coding. During the descriptive and evaluative coding phases, we adjusted the criteria descriptions to improve clarity and, by extension, the consistency of rating. We also refined the distinctions between the four rating levels for each criterion based on qualitative and quantitative differences in observable characteristics (see Table 1), such as:

  • The clarity and level of detail in the descriptions in the syllabus,

  • The frequency and consistency of the criterion, and/or

  • The presence of an additional quality in higher levels.

In one case, we developed a new criterion to measure an element of structure that appeared in syllabi but was not captured in the existing criteria (at the time, 21) of the rubric. This addition occurred during the project’s second year4 when we observed that many syllabi described class activities and formative assessments that allowed students to practice with concepts and skills. In response to our observation, we created the criterion “embedded opportunities for practice.” If we modified the rubric during or after coding syllabi, we repeated the descriptive and evaluative coding phases to apply the modified criteria.

Discussion and Recommendations

We developed and applied a syllabus rubric to advance inclusive and equitable teaching and used the findings to inform both departmental change efforts and centralized CTL programming. After synthesizing findings from our syllabus analysis with other data sources (student survey, instructor interviews), we engaged in dialogue with department leaders and instructors to contextualize findings, interpret results, and unpack teaching implications. In this section, we discuss the outcomes from the syllabus analysis, including how departmental discussions helped guide changes in departmental policies and instructor pedagogy, as well as the value of the process for graduate student training. Finally, we share potential applications and recommendations for educational developers and department leaders looking to conduct formative assessments of inclusive and equitable teaching and course design practices.

Data-sharing Discussions with Departments

A core element of our project was to engage department partners in collaborative meaning-making by holding data-driven discussions about how to advance inclusive and equitable teaching in their departments and teaching contexts. We tailored each departmental discussion by considering both the themes that emerged from the data and the potential for actionable policy and/or pedagogical changes. We took this adaptive, contextual approach after consulting with departmental leadership to understand disciplinary norms, priorities, and the departmental and teaching context. For example, prior to the start of a new semester, we worked with a curriculum specialist in one department to present findings at their faculty meeting and to identify recommendations for syllabus revisions based on our findings. The curriculum specialist then shared exemplars from departmental faculty and templates for syllabus sections that aligned with our recommendations, such as clear policies and pathways for late work submissions, other learner-centered policies (academic integrity, mental health, inclusion), and assessment descriptions that aligned with intended learning outcomes. This approach allowed faculty to see, copy, and adapt exemplars and templates from their colleagues as they revised their syllabi for the upcoming semester.

In another department, we presented major findings and facilitated a discussion with faculty as they constructed and revised shared departmental policies for the next academic year. We drew on data from our syllabus analysis to summarize how class participation was framed in formal communication channels. We also presented data we collected on student perceptions of class participation from a survey distributed to student majors (one of the other data sources collected in the project as a whole). Then, we unpacked data-driven implications for both syllabus revision and teaching practices. We posed questions to foster discussion with faculty about participation as a disciplinary skill relevant to students’ future careers. In the context of the discussion, we highlighted the need for increased transparency with students about the importance of participation, along with structured support to provide multiple ways for students to participate and to help them develop the skills to share their ideas aloud.

Together, these examples demonstrate how we leveraged department-specific data and collaborative discussions with department partners to adapt to distinct departmental contexts and tailor our support to each department, ultimately advancing inclusive and equitable teaching.

Collaboration with Graduate Students

Our collaborative efforts to develop the syllabus rubric and to qualitatively analyze course syllabi provided a fruitful opportunity to train graduate students in educational research methods and build their expertise in the scholarship on inclusive and equitable pedagogy while positioning them as equal partners in educational development. Through a structured fellowship program that involved a commitment of ten hours a week for an academic year, trained graduate fellows played a central role in shaping the rubric and conducting the analysis of syllabi. Inspired by programs that position students as pedagogical partners (Cook-Sather et al., 2021), we worked closely with graduate student fellows during each of the two academic years of the project. One graduate fellow collaborated on the development of the syllabus rubric and both fellows analyzed syllabi and reported findings from the syllabus analysis. The process of developing the rubric and analyzing syllabus data required us to translate principles and concepts of inclusive and equitable pedagogy from scholarly research into specific, measurable characteristics observed in course syllabi. Together, we reckoned with nuances in how we defined, interpreted, and measured each criterion in the rubric as we discussed discrepancies during the coding and analysis of syllabi. These experiences reinforced graduate students’ conceptual understanding of inclusive and equitable teaching principles and practices as well as their skills in synthesizing academic research for application in educational settings.

The collaborative process between members of the CTL and graduate students reinforced the value of integrating and prioritizing student perspectives in advancing inclusive pedagogical practices. Collaborative discussions during the syllabus analysis allowed us to consider perspectives rooted in both the student and instructor experiences. Staff members of the CTL involved in the syllabus analysis were likely to view the syllabi from an instructor’s perspective, based on their extensive experience teaching in higher education and working with teaching faculty. Moreover, as we evaluated syllabi from outside our disciplines, the collaborative coding process with colleagues with different disciplinary training helped us recognize when our own disciplinary contexts and biases informed our perceptions of the syllabi. Collectively, the varied perspectives we brought to the syllabus analysis allowed us to recognize and question when we used our role in the classroom or disciplinary perspectives to inform what we prioritized in our discussions and reports for departmental partners. By collaborating with graduate students and with colleagues from different disciplines, we were able to consider and integrate multiple perspectives during the coding and the reporting of strengths and recommendations on the syllabus analysis.

Applications and Recommendations

Course syllabi are an easily accessed source of rich data on course design and the instructor’s intent to implement inclusive and equitable teaching practices. Given the utility and accessibility of course syllabi, the rubric we developed can be applied at various levels to identify individual, departmental, and school- or university-wide patterns in course design and inclusive pedagogy. In our project, we developed the syllabus rubric as a tool to identify indicators of inclusive and equitable teaching across focal courses in an academic department. We synthesized data from the syllabus analysis with results from a student survey and instructor interviews. These data from multiple sources informed our discussions with departments about how to advance inclusive and equitable teaching across the curriculum and teaching contexts.

In our application of the syllabus rubric, we gathered and synthesized data at the department level to focus on a “key unit of change” (Reinholz & Apkarian, 2018, p. 1) at our decentralized institution. Scholarly literature on change theories points to academic departments as key units of change based on the relative coherence of departmental cultures (e.g., relative emphasis on teaching vs. research, value of teaching innovation) and structures (e.g., teaching loads, processes for making teaching assignments, promotion and tenure processes; Reinholz et al., 2019). However, the scope of the analysis and the level of data reporting may vary across institutional contexts and departmental structures. At more centralized institutions, aggregating data across schools or even campuses may be a more relevant and compelling level of analysis. To evaluate the appropriate scope for data collection and analysis, researchers and educational developers may consider the coherence of teaching-related structures, educational mission and values, and teaching cultures within and across departments and other academic units.
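As one illustration of the department-level synthesis described above, the sketch below aggregates rubric scores across a department’s focal courses; the data layout, department and criterion names, and score values are placeholders, not the project’s actual records.

```python
# Illustrative only: rolling rubric scores up to the department level to
# surface relative strengths and gaps for formative discussion.
import pandas as pd

scores = pd.DataFrame({
    "department": ["Dept A"] * 4 + ["Dept B"] * 4,
    "course":     ["A1", "A1", "A2", "A2", "B1", "B1", "B2", "B2"],
    "criterion":  ["Grading criteria", "Autonomy in assessments"] * 4,
    "score":      [3, 1, 2, 0, 2, 2, 3, 1],
})

# Mean score per criterion within each department; low means point to
# criteria worth highlighting in data-sharing discussions with partners.
summary = (scores
           .groupby(["department", "criterion"])["score"]
           .mean()
           .unstack("criterion"))
print(summary.round(2))
```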

Going beyond the applications in the current project, we envision the inclusive and equitable syllabus rubric as a useful tool for:

  • instructors to reflect and iterate on the elements of inclusive and equitable teaching in their courses,5

  • educational developers to engage in data-informed consultations,

  • department leadership to identify patterns of inclusive and equitable teaching across the curriculum,

  • disciplinary organizations to highlight patterns and trends in course design and inclusive pedagogy, and

  • CTLs to identify needs for programming and support with instructors and departments.

We encourage educational developers to borrow or adapt the syllabus rubric to their institutional contexts. Importantly, the syllabus rubric was designed as a tool for formative feedback to inform next steps for teaching development and for change efforts at the departmental level, in contrast to a tool that provides a summative evaluation of the quality of teaching and/or course design. Accordingly, the applications we recommend emphasize the syllabus rubric as a source of ongoing formative feedback to guide teaching development.

Limitations of the Syllabus Analysis

There are two primary limitations associated with using the rubric to analyze syllabi. The first has to do with the inherent limitations of the syllabus itself: it is an indirect communication from the instructor (or author of the syllabus) rather than a direct reflection of the actual teaching happening in the course. For example, descriptions of class sessions may reflect an instructor’s intent to implement active learning activities, while the actual class sessions may shift to focus on content coverage through lecture. Moreover, there may be variation in who writes the syllabus and how much autonomy they have over the learning outcomes, assessments, and policies. Some institutions may require instructors to include standardized policies (at the university or department level). For example, an instructor may include a strict department-wide policy on attendance on the syllabus but choose not to enforce that policy in practice. In such cases, these elements of the syllabus may not reflect the instructor’s intent and/or how they carry out policies or inclusive and equitable teaching practices in the classroom. Given these limitations, data gathered from the syllabus can prompt further questions and discussion to understand the context and unpack implications that advance the use of inclusive and equitable teaching practices.

Additionally, a syllabus is not always the primary method of communicating course-related information. This information may be presented verbally in class or stored on the course learning management system (LMS). These limitations are part of why the project integrates data from both students and instructors prior to drawing strong conclusions. During discussions with departments, we acknowledged the variety of communication channels beyond the syllabus and treated absent or sparsely detailed criteria as an opportunity to be curious rather than to make assumptions.

A second limitation is that the syllabus rubric itself may not capture all indicators of inclusive and equitable teaching, especially considering the multifaceted definitions of inclusive and equitable teaching in the scholarly literature. We expect the syllabus rubric will continue to develop as its use expands to different types of courses and programs and as needs change across institutional contexts. We experienced this evolution of the rubric during both the initial development and the second year of using the rubric in a new academic program. Specifically, we noted many structured opportunities for practice in the syllabi that were not being captured by the rubric but that were important, evidence-based components of inclusive and equitable teaching. After discussion within the team, we decided that a new criterion needed to be added to the rubric: embedded opportunities for practice. Drawing on Addy et al.’s (2021a) description of the syllabus as a “living constitution,” this syllabus rubric “invites participation, allows for evolution (or amendment), and accommodates a community’s changing needs” (p. 50). We encourage adopters of our syllabus analysis tool to adapt the rubric to work in their context and to contact us with ideas and modifications to support ongoing iteration and refinement of the rubric.

One potential area for expansion of the syllabus rubric is to engage more explicitly with practices based in critical pedagogy (Freire, 2000; hooks, 1994) and other pedagogical approaches with an anti-oppressive lens.6 Consistent with these educational philosophies, the existing rubric includes criteria that emphasize the participatory (e.g., embedded practice opportunities, autonomy in assessments) and relational elements (e.g., student-instructor engagement, peer support and collaboration, participation structure and variety) in the syllabus. However, additional criteria could be added to emphasize opportunities for students to provide input or make choices about learning activities, assessments, and/or course materials (i.e., co-creation in teaching and learning; Bovill, 2020). Moreover, future iterations of the rubric could incorporate undergraduate student perspectives to inform revisions to existing criteria, create additional criteria, or provide instructors and departments with targeted feedback on syllabi. The current rubric incorporated input from a graduate student as an equal partner in its development; however, feedback from undergraduate student focus groups or pedagogical partners (Cook-Sather et al., 2021) would be a valuable next step in the iteration and refinement of the syllabus analysis tool.

Additional Considerations

One essential consideration in the analysis and reporting of data from course syllabi is the variability of contexts across individual instructors, departments, disciplines, and institutions. Accordingly, our team prioritized conversations with members of the department to understand those contexts and use them to inform our approach to analysis and reporting. For example, we grappled with how best to share data with departments as we felt a tension between more comprehensive reports versus more accessible, brief summaries of the data we collected. As researchers, we wanted to be thorough and detailed in how we reported data. As educational developers, we wanted to ease reading and processing of the reported data to help faculty with limited time recognize the main points and implications of the data for advancing inclusive and equitable pedagogy.

In the pilot year of the project, we provided comprehensive written reports to departments that included scores on each syllabus criterion as well as strengths and recommendations based on patterns observed across departmental syllabi. This approach was time consuming and resulted in reports more than ten pages in length for just the syllabus analysis. In the next round of the project, we consulted with department leadership about the appropriate audiences, level of detail, and purpose of the reports. For example, in one department we led leadership through a comprehensive presentation of the major findings and later presented to faculty about specific teaching recommendations informed by the data. A primary strength of the project was its tailored approach to departmental norms, which also informed how we presented and discussed the findings from the syllabus analysis.

Syllabi are a primary data source in our comprehensive project aimed at facilitating departmental change efforts, enabling access to rich information about inclusive and equitable teaching that can be readily compared across disciplines and teaching contexts. We hope that other educational developers, researchers, and readers across roles will adapt the syllabus rubric as a tool for formative assessment to advance inclusive and equitable teaching in their contexts.

Notes

  1. These data were drawn from key facts about the university community and admissions data in June 2025.
  2. During the process of drafting descriptions, we decided to exclude two criteria that align with inclusive and equitable teaching but were difficult or time-consuming to assess. We excluded a criterion focused on the tone of the syllabus due to challenges in drafting additional observable characteristics that captured the complexities of tone in written text for each of the three levels of the scale. Given our emphasis on teaching strategies and practices rather than content, we also deemed a criterion aimed to capture the diversity of content and scholarly contributors to the discipline (e.g., diversity of scholars in assigned readings, highlighted research, guest speakers, etc.) untenable given the time and disciplinary expertise necessary for a comprehensive analysis.
  3. The same three test syllabi were used to generate criteria for the principle of critical engagement with difference using an inductive approach and test a working draft of the rubric.
  4. We were evaluating syllabi from a different academic program than those evaluated in the first year of the project.
  5. We adapted the rubric to create a checklist for instructors (with examples from MIT syllabi illustrating checklist criteria) to examine and revise their syllabi to include inclusive and equitable teaching and course design practices. This checklist is available at https://tll.mit.edu/teaching-resources/course-design/syllabus-checklist-landing/.
  6. We thank reviewers of an earlier version of the manuscript for suggesting possible extensions of the rubric to address co-creation and practices based in critical pedagogy.

ORCiDs

Ruthann C. Thomas: 0009-0002-0170-3616

Raechel N. Soicher: 0000-0002-2142-625X

Luna BuGhanem: 0009-0008-7218-9064

Amanda R. Baker: 0000-0001-9388-5730

Biographies

Ruthann C. Thomas is an Associate Director of Teaching and Learning in the Teaching + Learning Lab at Massachusetts Institute of Technology. Ruthann designs high-impact initiatives and provides programming and support to advance inclusive, evidence-based teaching practices for MIT faculty, instructors, postdocs, and graduate students. Her work is anchored in deep commitments to active and inclusive pedagogy and learning sciences as well as a love of learning and a belief in the transformative power of teaching.

Raechel N. Soicher is an Associate Director of Research and Evaluation in the Teaching + Learning Lab at Massachusetts Institute of Technology. Raechel partners with faculty, staff, and students to conduct research and evaluation projects at MIT. Her primary research interests leverage her applied psychology background to examine the complex contexts in which teaching and learning practices are successfully implemented.

Luna BuGhanem is an architect-researcher and former Graduate Research Fellow at the Teaching + Learning Lab at the Massachusetts Institute of Technology. Based between New York City and Beirut, she explores the relationship between the built environment, migration, and material culture across scales. Her practice is grounded in critical spatial research and a teaching approach committed to engaging students through interdisciplinary, place-based inquiry.

Amanda R. Baker is an Associate Director of Research and Evaluation in the Teaching + Learning Lab at the Massachusetts Institute of Technology. She works with members of the MIT community to improve the student experience through rigorous educational research and evaluation. As a learning scientist, she is particularly interested in using research to understand and promote students’ deep engagement in high-impact learning environments.

Acknowledgments

We thank Janet Rankin for striking an ideal balance between autonomy and support of our team as we developed and implemented this complex project and for her feedback on drafts of this manuscript. Hannah LeBlanc and Nathalie Vladis helped with the syllabus analysis in the second year and provided thoughtful feedback to help us iterate on the criteria and descriptions. To RCT’s writing group: Beth Lisi, Cindy Blackwell, and Lindsay Doukopoulos, thank you for the community, accountability, feedback, and support as I drafted the manuscript. We appreciate our department and program partners for opening themselves up to the project and for their commitment to advancing inclusive and equitable teaching.

Conflict of Interest Statement

The authors have no conflict of interest.

References

Addy, T. M., Dube, D., Mitchell, K. A., & SoRelle, M. (2021a). How do they design an inclusive course? In What inclusive instructors do: Principles and practices for excellence in college teaching (pp. 47–72). Taylor & Francis.  http://doi.org/10.4324/9781003448655-5

Addy, T. M., Leary, A., Rudenga, K., Sandoval, C. L., & Dewsbury, B. (2023, November 16–19). Intensive programs for inclusive teaching: Three models [Conference presentation]. 48th Annual POD Network Conference, Pittsburgh, PA, United States. https://podnetwork.org/48th-annual-conference/pittsburgh-program-details/

Addy, T. M., Reeves, P. M., Dube, D., & Mitchell, K. A. (2021b). What really matters for instructors implementing inclusive teaching approaches. To Improve the Academy, 40(1), 1–48.  http://doi.org/10.3998/tia.182

Artze-Vega, I., Darby, F., Dewsbury, B., & Imad, M. (2023). The Norton guide to equity-minded teaching. Norton & Company.

Ballen, C. J., Wieman, C., Salehi, S., Searle, J. B., & Zamudio, K. R. (2017). Enhancing diversity in undergraduate science: Self-efficacy drives performance gains with active learning. CBE-Life Sciences Education, 16(4), 56.  http://doi.org/10.1187/cbe.16-12-0344

Bers, T. H., Davis, B. D., & Taylor, B. (2000). The use of syllabi in assessments: Unobtrusive indicators and tools for faculty development. Assessment Update, 12(3), 4.

Bovill, C. (2020). Co-creation in learning and teaching: The case for a whole-class approach in higher education. Higher Education, 79, 1023–1037.  http://doi.org/10.1007/s10734-019-00453-w

Brownell, S. E., & Tanner, K. D. (2012). Barriers to faculty pedagogical change: Lack of training, time, incentives, and…tensions with professional identity? CBE-Life Sciences Education, 11(4), 339–346.  http://doi.org/10.1187/cbe.12-09-0163

Canning, E. A., Harackiewicz, J. M., Priniski, S. J., Hecht, C. A., Tibbetts, Y., & Hyde, J. S. (2018). Improving performance and retention in introductory biology with a utility-value intervention. Journal of Educational Psychology, 110(6), 834–849.  http://doi.org/10.1037/edu0000244

CAST (2024). Universal Design for Learning Guidelines version 3.0. Retrieved from https://udlguidelines.cast.org. Accessed 17 December 2024.

Center for Research on Learning & Teaching. (2021). The research basis for equity-focused teaching. University of Michigan CRLT. Retrieved from https://crlt.umich.edu/equity-focused-teaching/research-basis. Accessed 17 December 2024.

Cook-Sather, A., Addy, T. M., DeVault, A., & Litvitskiy, N. (2021). Where are the students in efforts for inclusive excellence? Two approaches to positioning students as critical partners for inclusive pedagogical practices. To Improve the Academy: A Journal of Educational Development, 40(1).  http://doi.org/10.3998/tia.961

Cullen, R., & Harris, M. (2009). Assessing learner-centeredness through course syllabi. Assessment & Evaluation in Higher Education, 34(1), 115–125.  http://doi.org/10.1080/02602930801956018

Dedoose. (2024). Cloud application for managing, analyzing, and presenting qualitative and mixed method research data (9.2.005) [Computer software]. SocioCultural Research Consultants, LLC.

Dewsbury, B., & Brame, C. J. (2019). Inclusive teaching. CBE-Life Sciences Education, 18(2), fe2.  http://doi.org/10.1187/cbe.19-01-0021

Dewsbury, B., Swanson, H. J., Moseman-Valtierra, S., & Caulkins, J. (2022). Inclusive and active pedagogies reduce academic outcome gaps and improve long-term performance. PLoS ONE, 17(6), e0268620.  http://doi.org/10.1371/journal.pone.0268620

Eccles, J. S., & Wigfield, A. (2002). Motivational beliefs, values, and goals. Annual Review of Psychology, 53(1), 109–132.  http://doi.org/10.1146/annurev.psych.53.100901.135153

Eddy, S. L., & Hogan, K. A. (2014). Getting under the hood: How and for whom does increasing course structure work? CBE-Life Sciences Education, 13(3), 453–468.  http://doi.org/10.1187/cbe.14-03-0050

Fink, L. D. (2013). Creating significant learning experiences. John Wiley & Sons.

Freire, P. (2000). Pedagogy of the Oppressed. Translated by Myra Bergman Ramos. Rev. Ed. 1973. New York: Continuum

Galman, S. C. (2013). The good, the bad, and the data: Shane the lone ethnographer’s basic guide to qualitative data analysis. Routledge.

Harackiewicz, J. M., Canning, E. A., Tibbetts, Y., Priniski, S. J., & Hyde, J. S. (2016). Closing achievement gaps with a utility-value intervention: Disentangling race and social class. Journal of Personality and Social Psychology, 111(5), 745.  http://doi.org/10.1037/pspp0000075

Harnish, R. J., & Bridges, K. R. (2011). Effect of syllabus tone: Students’ perceptions of instructor and course. Social Psychology of Education: An International Journal, 14(3), 319–330.  http://doi.org/10.1007/s11218-011-9152-4

Henderson, C., Beach, A., & Finkelstein, N. (2011). Facilitating change in undergraduate STEM instructional practices: An analytic review of the literature. Journal of Research in Science Teaching, 48(8), 952–984.  http://doi.org/10.1002/tea.20439

Hershock, C., Harrell, J., Le Blanc, S., Rodriguez, M., Stimson, J., Walsh, K. P., & Weiss, E. D. (2022). Data-driven iterative refinements to educational development services: Directly measuring the impacts of consultations on course and syllabus design. To Improve the Academy: A Journal of Educational Development, 41(2).  http://doi.org/10.3998/tia.926

hooks, b. (1994). Teaching to transgress. Routledge.

McHugh, M. L. (2012). Interrater reliability: The kappa statistic. Biochemia Medica, 22(3), 276–82.

Ngai, C., Corbo, J. C., Falkenberg, K., Geanious, C., Pawlak, A., Pilgrim, M. E., Quan, G. M., Reinholz, D. L., Smith, C., & Wise, S. B. (2020). Facilitating change in higher education: The Departmental Action Team Model. Glitter Cannon Press. https://www.dat-project.org/resources/facilitating-change/

Palmer, M. S., Bach, D. J., & Streifer, A. (2014). Measuring the promise: A learning-focused syllabus rubric. To Improve the Academy, 33(1).  http://doi.org/10.3998/tia.17063888.0033.103

Palmer, M.S., Streifer, A.C., & Williams-Duncan, S. (2016). Systematic assessment of a high-impact course design institute. To Improve the Academy, 35, 339–361.  http://doi.org/10.1002/tia2.20041

Polmear, M., Hunsu, N. J., Simmons, D. R., Olaogun, O. P., & Lu, L. (2024). Belonging in engineering: Exploring the predictive relevance of social interaction and individual factors on undergraduate students’ belonging in engineering. Journal of Engineering Education, 113(3), 555–575.  http://doi.org/10.1002/jee.20599

Rainey, K., Dancy, M., Mickelson, R., Stearns, E., & Moller, S. (2018). Race and gender differences in how sense of belonging influences decisions to major in STEM. International Journal of STEM Education, 5(1), 1–10.  http://doi.org/10.1186/s40594-018-0115-6

Reinholz, D. L., & Apkarian, N. (2018). Four frames for systemic change in STEM departments. International Journal of STEM Education, 5(1), 3.  http://doi.org/10.1186/s40594-018-0103-x

Reinholz, D. L., Ngai, C., Quan, G., Pilgrim, M. E., Corbo, J. C., & Finkelstein, N. (2019). Fostering sustainable improvements in science education: An analysis through four frames. Science Education.  http://doi.org/10.1002/sce.21526.

Richmond, A. S., Slattery, J. M., Mitchell, N., Morgan, R. K., & Becknell, J. (2016). Can a learner-centered syllabus change students’ perceptions of student–professor rapport and master teacher behaviors? Scholarship of Teaching and Learning in Psychology, 2(3), 159–168.  http://doi.org/10.1037/stl0000066

Salehi, S., Ballen, C. J., Trujillo, G., & Wieman, C. (2021). Inclusive instructional practices: Course design, implementation, and discourse. Frontiers in Education, 6.  http://doi.org/10.3389/feduc.2021.602639

Shadle, S. E., Marker, A., & Earl, B. (2017). Faculty drivers and barriers: Laying the groundwork for undergraduate STEM education reform in academic departments. International Journal of STEM Education, 4(1), 8.  http://doi.org/10.1186/s40594-017-0062-7

Soicher, R. N., Baker, A. R., & Thomas, R. C. (2024). A mixed-methods research design to advance inclusive and equitable teaching. Innovative Higher Education, 49, 1105–1125.  http://doi.org/10.1007/s10755-024-09741-5

Stanny, C., Gonzalez, M., & McGowan, B. (2015). Assessing the culture of teaching and learning through a syllabus review. Assessment & Evaluation in Higher Education, 40(7), 898–913.  http://doi.org/10.1080/02602938.2014.956684

Sturtevant, H., & Wheeler, L. (2019). The STEM Faculty Instructional Barriers and Identity Survey (FIBIS): Development and exploratory results. International Journal of STEM Education, 6(1), 35.  http://doi.org/10.1186/s40594-019-0185-0

Tanner, K. D. (2013). Structure matters: Twenty-one teaching strategies to promote student engagement and cultivate classroom equity. CBE—Life Sciences Education, 12, 322–331.

Theobald, E. J., Hill, M. J., Tran, E., Agrawal, S., Arroyo, E. N., Behling, S., Chambwe, N., Cintrón, D. L., Cooper, J. D., Dunster, G., Grummer, J. A., Hennessey, K., Hsiao, J., Iranon, N., Jones, L., Jordt, H., Keller, M., Lacey, M. E., Littlefield, C. E., … Freeman, S. (2020). Active learning narrows achievement gaps for underrepresented students in undergraduate science, technology, engineering, and math. Proceedings of the National Academy of Sciences, 117(12), 6476–6483.  http://doi.org/10.1073/pnas.1916903117

Transparency in Learning & Teaching Project (2024). Transparent Methods. https://www.tilthighered.com/transparent-methods. Accessed 17 December 2024.

Walton, G. M., & Cohen, G. L. (2011). A brief social-belonging intervention improves academic and health outcomes of minority students. Science, 331(6023), 1447–1451.  http://doi.org/10.1126/science.1198364

Wheeler, L. B., Palmer, M., & Aneece, I. (2019). Students’ perceptions of course syllabi: The role of syllabi in motivating students. International Journal for the Scholarship of Teaching and Learning, 13(3).  http://doi.org/10.20429/ijsotl.2019.130307

Wiggins, G., & McTighe, J. (2005). Understanding by design (Expanded 2nd ed.). Association for Supervision and Curriculum Development.

Winkelmes, M., Bernacki, M., Butler, J., Zochowski, M., Golanics, J., & Weavil, K. H. (2016). A teaching intervention that increases underserved college students’ success. Peer Review, 18(1), 31–36. Retrieved from https://www.proquest.com/scholarly-journals/teaching-intervention-that-increases-underserved/docview/1805184428/se-2

Appendix

Syllabus Rubric Criteria, Organized by Principle of Inclusive Teaching

Principle and Criterion – General Description of Criterion

Transparency
Intended learning outcomes – Clearly states specific, measurable, and realistic course-level learning goals. Goals reflect levels of learning that are appropriate to the subject and the expected student level.
Description of learning activities – Describes in detail what students will do to prepare for and engage in a class session, making connections to the typical instructional methods used in class (e.g., lecture, in-class polling, problem solving, discussion).
Description of assessment tasks – Clearly describes in detail what students will do to complete major assessments.
Purpose of learning – Clearly describes the purpose of learning activities and major assessments, with explicit reference to gained skills and/or connections within the course.
Learning tips and resources – Clearly describes actionable tips and concrete strategies for how to succeed in the course, which may include external tools and campus resources to support students’ development of skills and knowledge in the course.
Grading criteria – Articulates how assessments will be graded and provides criteria, rubrics, or annotated examples.
How to participate – Clearly and concretely explains how students will participate in class.
Purpose of participation – Clearly explains why student participation is important, linking participation with successful attainment of learning outcomes and/or its relevance in the discipline, career, or students’ lives.
Instructor help + contact – Explains how and when instructor(s)/TA(s) are available to students, how to reach out to them, and the purpose of office hours.
Rationale for course policies – Clearly explains the rationale for all course expectations, logistics (subject requirements, prerequisites, deadlines), and policies in constructive terms, highlighting the reasons for policies rather than the consequences.

Belonging
Relevance of coursework – Emphasizes or provides opportunities to reflect on the relevance of coursework to career and life-oriented contexts, explaining how concepts and skills gained in the course connect in these contexts.
Peer support and collaborationᵃ – Includes learning activities that provide opportunities for students to interact as peers, identify shared interests in subject content, and collaborate. Describes the value of students learning from one another and/or provides opportunities for students to reflect on their collaboration.
Student-instructor engagementᵃ – Clearly describes frequent, specific, and easily accessible structures of engagement with instructor(s) (beyond office hours) and encourages and invites students to engage with instructor(s).

Structure
Alignment: ILOs x assessments – Explicitly links the student knowledge and skills measured in assessments to learning objectives.
Embedded practice opportunities – Describes class activities and formative assessments that allow students to practice with concepts and skills that align with learning outcomes and/or summative assessments.
Feedback + revisionᵃ – Employs low-stakes, formative assessments that provide students with feedback and opportunities to revise and/or reflect on learning.
Participation structure + varietyᵇ – Presents structures to support equitable participation, describing multiple, specific ways to participate.
Policies’ structure and supportᵇ – Provides clear policies, information on accommodations, and pathways to secure them if students need to be absent, turn in work late, leave class early, etc. Explains how policies are designed to support student learning when unforeseen circumstances arise.

Critical engagement with difference
Affirming student diversity – Explicitly acknowledges and affirms students’ different identities, experiences, strengths, and needs and describes diversity as an asset in the classroom; articulates that the instructor will be responsive to diversity by seeking feedback and responding to students’ needs.
Affirming diverse perspectives – Explicitly acknowledges and affirms the value of considering and/or sharing different viewpoints; provides opportunities for students to build skills to critically engage with varied viewpoints.
Variety of assessmentsᵇ – Employs a variety of assessments that draw on different skills, allowing students to showcase their learning.
Autonomy in assessmentsᵇ – Builds in multiple opportunities for student choice in assessments (e.g., different options for topics or modalities for assignments, optional opportunities for instructor or peer feedback on drafts).
  • Note: Each of the 22 criteria in the rubric included four levels used to assign scores: not present (0), present but unclear (1), present and somewhat clear (2), and present and clear (3), with each level including a detailed, qualitative description of observable characteristics. Table 1 includes selected examples of criteria from the syllabus rubric with qualitative descriptions. The full syllabus rubric, including these qualitative descriptions, is available at our CTL website (link will be added to accepted manuscript). A brief illustrative sketch of this scoring scheme appears after these notes.

    ᵃ These criteria have conceptual overlap with the principle of structure, as they represent concrete, structured pathways, built into the design of the course and/or class sessions, for students to interact with either peers or instructors.

    ᵇ These criteria could also be categorized under flexibility, as they show how an instructor can respond and adapt to students’ varied needs, strengths, and circumstances.
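
For readers who wish to tabulate scores, the following is a minimal, hypothetical sketch (in Python) of how the scoring scheme described in the note above could be represented: criteria grouped by principle, each scored on the 0–3 levels, with per-principle means computed for a single syllabus. The criterion names and grouping follow the appendix table (abbreviated here); the code is illustrative only and is not the analysis procedure used in the study.

```python
# Illustrative sketch only (not the study's analysis code): one way to record
# syllabus rubric scores and summarize them by principle of inclusive teaching.
# Scoring levels follow the table note: 0 = not present, 1 = present but unclear,
# 2 = present and somewhat clear, 3 = present and clear.

from statistics import mean

# Criteria grouped by principle (abbreviated; see the appendix table for the full list).
RUBRIC = {
    "Transparency": [
        "Intended learning outcomes",
        "Description of learning activities",
        "Grading criteria",
        # ... remaining Transparency criteria
    ],
    "Belonging": [
        "Relevance of coursework",
        "Peer support and collaboration",
        "Student-instructor engagement",
    ],
    "Structure": [
        "Alignment: ILOs x assessments",
        "Embedded practice opportunities",
        "Feedback + revision",
        "Participation structure + variety",
        "Policies' structure and support",
    ],
    "Critical engagement with difference": [
        "Affirming student diversity",
        "Affirming diverse perspectives",
        "Variety of assessments",
        "Autonomy in assessments",
    ],
}

VALID_LEVELS = {0, 1, 2, 3}


def summarize_by_principle(scores):
    """Return the mean rubric score per principle for one syllabus.

    `scores` maps criterion name -> level (0-3); criteria without a score
    are skipped in the summary.
    """
    summary = {}
    for principle, criteria in RUBRIC.items():
        rated = [scores[c] for c in criteria if c in scores]
        if any(level not in VALID_LEVELS for level in rated):
            raise ValueError(f"Scores for {principle} must be integers 0-3")
        if rated:
            summary[principle] = mean(rated)
    return summary


# Example: partial scores for a hypothetical syllabus
example_scores = {
    "Intended learning outcomes": 3,
    "Peer support and collaboration": 1,
    "Affirming student diversity": 0,
}
print(summarize_by_principle(example_scores))
# {'Transparency': 3, 'Belonging': 1, 'Critical engagement with difference': 0}
```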