Promoting discipline-based education research (DBER) has become a concerted goal among the National Academies (National Academy of Engineering [NAE], 2018; National Research Council [NRC], 2012) and the National Science Foundation (NSF). In 2018, NAE lauded DBER, writing, “DBER has arrived at insights about how students learn science and engineering and how to design instructional strategies that build on these insights to improve students’ conceptual knowledge and attitudes about learning” (p. xii). Thus, by developing an awareness of DBER, faculty members become better equipped to apply evidence-based teaching strategies in alignment with, and to ultimately help students attain, course learning objectives. Given the critical role of DBER in promoting student learning, it is essential for science, technology, engineering, and mathematics (STEM) faculty members to develop DBER literacy such that they are better prepared to apply contemporary and emergent educational research to inform their teaching praxis.
DBER involves “basic and applied research” (NRC, 2012, p. 2). DBER literacy may be developed via collaborations between disciplinary experts and educational experts or disciplinary experts who have DBER literacy (Wieman, 2017). Goals of DBER vary, which can make developing DBER literacy challenging. For example, DBER goals may include seeking to “understand how people learn the concepts, practices, and ways of thinking of science and engineering”; “understand the nature and development of expertise in a discipline”; or “identify approaches to make science and engineering education broad and inclusive” (NRC, 2012). In short, DBER constitutes bringing theoretical knowledge from educational research to advance learning while accounting for the contextual features and nuances that are important to a discipline. DBER literacy involves a level of adeptness at engaging with and applying DBER in one’s teaching.
In this project, we explore the impact of an internal grant program on faculty members’ development of DBER literacy. We place specific focus on DBER rather than scholarship of teaching and learning (SoTL). While DBER and SoTL share an ultimate goal of improving student learning, there are a few ways in which they differ. SoTL aims to enhance teaching quality using findings from research on learning and thus purposefully uses extant research designs or theory (Shulman, 2000). However, the goal of SoTL is generally not advancing theory. In contrast, DBER explores research questions, hypotheses, and ways of thinking relevant to teaching and learning in a particular discipline and implements those findings beyond single classrooms and programs. This results in original, generalizable, and mechanistic insights or theories into educational processes and their effects (Center for Science, Mathematics and Computer Education, n.d.; Dolan et al., 2018).
For this study, we developed, implemented, and evaluated the impact of a STEM Education Innovation & Research Institute (SEIRI) Seed Grant (SSG) program on STEM faculty members’ development of DBER literacy at Indiana University–Purdue University Indianapolis (IUPUI). In this manuscript, we summarize the goals and outcomes of multiple years of the SSG program. We address three research questions via the theoretical lenses of social network theory and situated learning theory:
- RQ1: What factors facilitated DBER literacy among STEM faculty who participated in the SSG program?
- RQ2: In what ways did the SSG program contribute to faculty members’ understanding of DBER?
- RQ3: What other outcomes resulted from faculty members’ engagement in the SSG program?
Background & Motivation
Benefits of Small Grant Awards on STEM Faculty Development
Small grants are beneficial to STEM faculty as they provide an incentive to ask research questions in the context of their pedagogical interventions. A competitive submission process can promote high-quality pedagogy and projects that include evidence-based methods to assess success of existing or novel interventions (Carlisle & Weaver, 2018).
The 2019 American Association for the Advancement of Science (AAAS) report Levers for Change (Laursen, 2019) discusses how institutional structures and cultures can work as levers driving changes in STEM undergraduate teaching and learning. The report identifies shortfalls in strategies for adoption of research-based instructional strategies and recommends that a positive change will require faculty development activities, institutional support, departmental changes, and rewards. The SSG initiative directly or indirectly helps address these suggested efforts through activities such as regular team member meetings, assessment-based consultations and guidance, and funding support. The initiative fosters collaborations within and across STEM departments, disciplines, and career stages. Thus, faculty at all levels have the potential to benefit from this program.
Most STEM faculty, though involved with teaching, have limited familiarity with reading or implementing DBER-based research practices owing to the lack of a community or incentives aligned with evidence-based pedagogical reform (Cox, 2001; Mulnix, 2016). The SSG program facilitates faculty engagement in DBER and requires faculty members to design and evaluate their evidence-based teaching strategies in innovative ways, thereby enabling them to simultaneously improve their courses in alignment with extant learning theories or recent evidence. SSG awardees become familiar with DBER by proposing and engaging in projects they created. These projects involve changes in their own courses or a course series in their departments. Most awards foster collaborations among a team of colleagues with shared interests, creating a cohort-like mentality. Thus, we posit that the SSG program promotes deep learning among participants, but here the learning outcome is DBER literacy (Guskey & Yoon, 2009).
Existing Models of Small Grants Across Academic Institutions
STEM education reform in academic departments is often incentivized with small grants. These change efforts can occur at the individual, faculty team, or departmental level. The efforts described below motivated our design of the SSG program.
The Peer-Led Team Learning (PLTL) project received two national dissemination grants (NSF Award Nos. 0231349 and 0941978). A key activity of these awards was a small grants program (i.e., Workshop Project Associate or WPA) that provided funds, technical expertise, and compensation to potential PLTL adopters. The adopters were encouraged to adapt the PLTL model to meet the needs of their student populations (Gafney & Varma-Nelson, 2008). Similarly, the Science Education Initiative (SEI) implemented a model of change wherein departments were funded to hire DBER-expert faculty partners, who supported their evidence-based teaching efforts. This effort eventually resulted in positive shifts in faculty attitudes and norms across departments (Wieman, 2017).
Another example of small grants fostering STEM pedagogy is the Curriculum Enhancement Grant (CEG; https://ctl.iupui.edu/Programs2/CEG) program designed and implemented by Varma-Nelson in 2009. The grants provide technical and instructional support to implement projects focused on improving teaching and student learning. The CEGs aim to increase awareness about SoTL among faculty. Types of projects supported include improving individual undergraduate or graduate courses or transforming courses from face-to-face to online or hybrid formats. This program continues to be popular with faculty, as evidenced by several submissions to the program each year.1
One SSG program goal was to help faculty members develop competitive proposals for external funding. We modeled the SSG program after NSF’s Improving Undergraduate STEM Education (IUSE) program, particularly the IUSE “Request for Proposals” guidelines, “Reviewer Template,” and the annual report form (see Appendices A–B, D.1–D.2). Faculty members’ track records of external grant funding were a distal outcome that we report as an additional measure of faculty development and of the success of the SSGs.
Motivation for the SSG Program
Theory of Change
We hypothesized that the SSG program would promote individual growth based on social network theory and the theory of Communities of Practice (CoPs).
Analyses of social networks have shown that building informal relationships across units such as departments or schools within a university enables peers to influence each other’s attitudes or choices in positive ways (Daly, 2010; Kezar, 2001). Social networks thus facilitate change via a range of mechanisms, such as the diffusion of innovations model (Macdonald et al., 2019; Rogers, 2003), wherein “early adopters” prove the merit of an initiative (such as engaging in DBER), which encourages others to eventually come on board. Such diffusion, however, is contingent upon myriad factors, such as establishing communication systems, facilitating knowledge transfer, promoting adaptation to changing environments, shaping attitudes, increasing support for solving problems, and creating systems of accountability (Kezar, 2001; Larson & Dearing, 2008; Wenger, 1998a, 1998b).
CoPs composed of STEM faculty members are one way to realize institutional reform (Gehrke & Kezar, 2017). CoP theory frames learning as a “joint enterprise [emphasis added] as understood and continually renegotiated by its members” (Wenger, 1998a, p. 2), and the ultimate objective of a CoP is for its “members to engage in a collective process of learning” (p. 4). Two primary goals of CoPs include (1) “transferring/exchanging knowledge” and (2) “networking” (Gehrke & Kezar, 2017, p. 808). Success factors for promoting learning in CoPs include (1) effective communicative and structural organization, (2) ample and “optimized” networking and interaction opportunities, (3) development of supportive “infrastructure,” (4) developing and sharing of tools to support growth, (5) establishing “specific objectives and strategies,” and (6) providing “organizational support.” CoPs have been utilized in STEM education and have been shown to be effective as participants gain knowledge of a new realm (such as DBER) by immersing themselves in knowledge exchange or networking discussions with other faculty (Pelletreau et al., 2018).
Taken together, analyses of social networks and CoPs informed the design of the SSG program. Faculty members progress together like a cohort of new learners to strengthen their DBER literacy, and the SSG CoP provides a social network to support learning about the theories and practices of DBER. Faculty participants (1) find meaning by implementing pedagogical innovations they designed to achieve a particular objective; (2) practice engaging in DBER by evaluating project findings and discussing emergent trends with support from the community; and (3) start identifying themselves as scholars of teaching and learning. The CoP (Wenger, 1998a, 1998b) promotes the sharing and celebration of accomplishments from DBER-based curricular refinements and course practices and encourages extending lessons learned into novel opportunities, such as new dissemination venues and external grant submissions.
In the following sections, we provide additional details of the SSG program features, including how they align with our theoretical framework.
Overview of the Program
Like existing models of small grants, SSGs are awarded by a STEM research institute at a large Midwestern university. The hosting institute aspires to promote research and implementation of potentially transformative projects by STEM faculty within the graduate and undergraduate curriculum. The SSG program is a key initiative of the hosting institute and seeks these ultimate outcomes through faculty members’ development of DBER literacy. The SSG program offers a low-stakes opportunity for STEM faculty to engage in DBER, often for the first time, with scaffolded support. Like the NSF IUSE program, the SSG program requires faculty members to provide evidence for the effectiveness of their innovation using appropriate assessment and evaluation methods. Within IUPUI, the SSGs are unique in scope in comparison to other awards, as they are aimed at collecting preliminary data in preparation for eventually applying for external funding.
SSG Program Features That Promote Change
As detailed next, all the activities faculty members participated in were designed in alignment with our theoretical framework and provide potential evidence for the development of DBER literacy. The activities scaffold faculty development so that participants gain skill and confidence in engaging in DBER.
Information sessions
STEM faculty learn about SSGs via information sessions that cover topics such as proposal format, examples of innovative STEM pedagogy-based projects, budget templates, and potential sources of external funding for future applications. Applicants are required to incorporate each of these components and to design a sustainable project plan as they frame their proposals (see Appendices A and B). We encourage faculty to use evidence-based pedagogical practices as they draft their research plan. To elaborate, we require that each proposal situate the pedagogy in their study within previous research and literature. Applicants must also include information about specific assessment tools aligned with their project goals. This lends insight into various measures of STEM-based competencies and allows for broadened testing of the assessments in new contexts.
SSG participation-related outcomes
We received 12 applications in 2017 and eight applications each in 2018 and 2019. Following a rigorous review process using a reviewer template (Appendix B), a reviewer panel comprising past SSG awardees shortlisted successful applications. We funded seven projects in the first year and five in each of the following years, for 17 awards total. Table 1 provides an overview of the range of proposal topics, schools, and departments represented by the SSG 2018 recipients. Similarly, SSG 2017 and 2019 received and funded proposals across each of these STEM schools and departments. See the SEIRI website (https://seiri.indianapolis.iu.edu) for more details.
| SSG 2018 # | School | Department | Title | Faculty reviewer department |
|---|---|---|---|---|
| SSG201806 | Science | Psychology | Peer Assistant Role Models in a Graduate Computer Science Course | Chemistry, Physics, Engineering, Technology |
| SSG201807 | Science | Chemistry & Chemical Biology | Indianapolis Metropolitan Area Chemist Community Outreach Program (“I M A” Chemist Program) | |
| SSG201802 | Engineering & Technology | Computer and Information Technology | Integrating Disciplinary International Collaborative Experiences (DICE) Into the Undergraduate STEM Curriculum | |
| SSG201804 | Engineering & Technology | Mechanical and Energy Engineering | Extracurricular Projects to Enhance the Current Engineering Educational Paradigm | |
| SSG201808 | Engineering & Technology | Engineering Technology | Writing Support in STEM Education | |
Methods
Data Collection
We collected and report on three types of data for this study: (1) small group interviews, (2) observations of CoP meetings, and (3) annual project reports.2
Small group interviews
The first author conducted two 90-minute faculty small group interview sessions (see Appendix E.1 for the interview questionnaire), each with 12 participants from the SSG 2017 awardee cohort. The data analysis section below describes our interview analysis process.
Observations of Community of Practice meetings
The SSG faculty members met twice each semester at a time of mutual convenience to allow maximum representation of the SSG awardee cohort. Project personnel were required to attend these meetings so they could share lessons learned with other members of the cohort; each project had to be represented by at least one team member. We covered a range of topics during the meetings, such as the critical components of a pedagogy (Appendix C). We facilitated informal conversations on topics such as “critical components when designing an assessment” for DBER projects and the “basis of designing a good logic model for DBER projects,” with exercises for faculty to construct their own project logic models. Other topics included discussion of seminal research articles about assessment design as well as talks by past SSG recipients who demonstrated good project implementation strategies.
Annual project reports
At the end of the first year, SSG teams submit an annual report based on a provided template. The template captures attainment of project goals relative to the proposed timeline, overall project impact, instructional strategies implemented in the first year, and plans for the following year. Information regarding assessment, dissemination, and plans for external funding is solicited in the final report (Appendices D.1 and D.2).
Data Analysis
We used thematic analysis (Braun & Clarke, 2006) to code the faculty group interviews. Our approach included both deductive (i.e., theory-driven) and inductive (i.e., exploratory) elements. From a deductive standpoint, our coding was informed by the deliverables considered as part of the SSGs (such as faculty professional advancement and increased familiarity with DBER). Two coders (Author 1 and a graduate student) independently coded the interview transcripts and then met to discuss codes and resolve any discrepancies. After two rounds of modifications to the coding scheme, an intercoder reliability of 90% was attained using percent agreement, calculated by dividing the number of agreements by the total number of codes. This analysis yielded nine themes, three per research question (see Appendix E.2 for interview codes). Author 1 shared these themes with Authors 2 and 3 for review and critique. We triangulated each theme against our observations of the faculty meetings and the aspects, outcomes, or experiences reported in the annual reports.
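As a minimal illustration of the percent agreement calculation described above, the following Python sketch divides the number of coder agreements by the total number of coded excerpts. The code labels are hypothetical examples for illustration only; this is not the study’s coding scheme or analysis script.

```python
# Illustrative sketch only: two coders' code assignments for the same excerpts.
# The labels below are hypothetical, not the study's actual coding scheme.
coder_1 = ["intrinsic motivation", "peer dialogue", "institutional support",
           "professional development", "departmental change"]
coder_2 = ["intrinsic motivation", "peer dialogue", "fueling motivation",
           "professional development", "departmental change"]

# Percent agreement = (number of agreements / total number of codes) * 100
agreements = sum(a == b for a, b in zip(coder_1, coder_2))
percent_agreement = agreements / len(coder_1) * 100
print(f"Percent agreement: {percent_agreement:.0f}%")  # 80% in this toy example
```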
Results
We addressed three research questions (RQs) in this study. With respect to each research question, the themes highlight (1) factors that bolstered STEM faculty members’ engagement in DBER and development of DBER literacy, (2) ways the SSG program promoted faculty members’ DBER literacy, and (3) auxiliary outcomes of the SSG program. We generated nine themes total, or three themes for each research question. We share results of each RQ in turn.
RQ1: What factors facilitated DBER literacy among STEM faculty who participated in the SSG program?
RQ1 explores the factors that facilitated the development of DBER literacy among STEM faculty members who participated in the SSG program.
| Theme | Description |
|---|---|
| 1.1. Intrinsic motivation | Individual’s intrinsic motivations to engage in DBER or realize curricular change provided a foundation for success. |
| 1.2. Peer dialogue and engagement | Conversing and engaging with peers (i.e., other SSG recipients) via a CoP facilitated DBER literacy gains. |
| 1.3. Institutional DBER support | DBER experts available through the hosting institute and the CoP provided meaningful support to facilitate DBER growth. |
Theme 1.1: Intrinsic motivation
Participants came to the SSG program with a range of motivations, and they learned of the program via information sessions broadcast through multiple communication channels. Most participants reported pursuing SSG funding for a mix of reasons; through our data collection, we identified many intrinsic motivations coupled with extrinsic motivation in the form of project funds.
Intrinsically motivated participants aspired to learn more about a specific pedagogy, although participants were at variable levels of experience applying select pedagogies. For example, the mechanical engineering department was interested in testing PLTL and viewed the SSG program as an opportunity to test the model on their own student population while more purposefully exploring student impact. The School of Engineering and Technology has now introduced the PLTL program in several courses (https://et.iupui.edu/students/pltl/). Other participants were interested in trying an innovation they designed based on their local teaching experiences. These participants were intrinsically motivated, with a goal of alleviating difficulties students experienced in learning certain concepts. While these participants shared intrinsic motivations, the support afforded by the SSGs facilitated translation of their motivation to practice. As one participant (Assistant Professor, Department of Computer Information Technology) stated:
I had a long-term goal for almost a decade of introducing [a] greater computational component in every single course that we teach. So, the goal was to have every course have maybe close to 25% component that was computational. The only way to make this successful was to have every single person, every single faculty member in the department be committed and involved with this so that one could transform the entire curriculum. And that requires some financial support, which is where SEIRI and the SSG program came in.

Similarly, other participants saw the SSG program as an opportunity to improve teaching and learning within their program. These participants felt the SSG program afforded them the opportunity to implement and test a curricular innovation that could realize sustainable change at the departmental level. For example, one participant (Assistant Professor, Health Informatics) discussed how the SSG could collectively involve faculty and students in conversations about how best to change curriculum:
One of the main issues with the [departmental] program was that [it is] a feeder program to do a graduate program and the students do not have enough required technical skills … so I wanted to design an experiment that could allow us to measure how students did in an online classroom setting with use of technology.
Theme 1.2: Peer dialogue and engagement
Faculty members found engaging with peers via structured interactions was useful in promoting their DBER literacy. As one participant stated, “Every time I come here [faculty meeting], I learn a different [pedagogical] concept from [a] different [breakout] team.” Some participants felt hearing how peers communicated DBER findings enhanced their own understanding of DBER. As one participant stated, “It’s helpful seeing how people [SSG peers] communicated more pedagogically linked types of work.” Many participants suggested the conversations they had within the CoP fueled future conversations. As one participant (Senior Lecturer, Mechanical Engineering) stated:
I did learn something from the informal interaction that happened during, before, and after the breakout groups. Or just well, or simply seeing them and having and hearing something and then begin to have a conversation. I learned something about what he [a separate SSG awardee] was doing. And so that promoted a conversation at a later date or before the next meeting or something like that.

This same participant compared the SSG conversations with their departmental seminars:
I went to all of my department seminars for the last year, and I compared what I got from those to what I got out of the SEIRI discussions that we had. Alright, I, in that case, I might have gotten a lot more out of the SEIRI discussions.
Theme 1.3: Institutional DBER support
While participants were motivated to engage in the SSG program and benefited from peer conversations, most participants felt engaging in the research portion of the SSGs was a challenge. These challenges were primarily grounded in participants’ lack of familiarity with DBER. Thus, participants suggested the support from STEM education researchers via SEIRI was critical to their learning and success. As one participant (Associate Professor, Department of Chemistry & Chemical Biology) stated:
I’ve continued to feel that I know absolutely nothing about the science of teaching, whereas I would feel like at least hopefully in my new area of research, I might know something. The hurdle that I’ve identified is my own confidence that I know what I’m doing. And sort of my desire to be partnered with someone who does know what they’re doing because of what I guess my feeling for doing more work in the educational field… . I certainly still feel very much like an outsider … who maybe, you know, understands a few phrases, has a little bit of language right on the edges with, let you go to a seminar and maybe appreciate the education-based seminar. But you don’t necessarily feel like you can stand alone and do anything.

This participant suggested the support offered by SEIRI helped them overcome their lack of confidence, a barrier that otherwise may have inhibited their pursuit of any DBER-related work. Other participants expressed similar gratitude for SSG support from SEIRI, suggesting that it provided an overall positive experience. As one participant stated, “I just want to say that you were fantastic. All the support you gave us. Thank you. That was really excellent.” Another interviewee (Professor, Department of Physics) in the same session echoed this sentiment, stating:
I will just double that and say you guys are just amazing. I mean my experiment that I am running would have been impossible without your support, and it is not often that you come across that in one’s life, you know. So, kudos to you guys for putting it together and helping the way you did.
RQ2: In what ways did the SSG program contribute to faculty members’ understanding of DBER?
While results of RQ1 suggest factors that helped ensure SSG participants developed DBER literacy, RQ2 focuses on the ways in which faculty came to understand DBER by participating in the SSGs.
| Theme | Description |
|---|---|
| 2.1. DBER vs. STEM research | Contrasting STEM research and DBER provides a familiar modality for understanding DBER. |
| 2.2. DBER and teaching | Engaging in DBER led to improved teaching, which translated to student learning benefits. |
| 2.3. DBER and collaboration | Engaging in the SSG program improved abilities to engage peers within and beyond the department in curricular change and innovation. |
Theme 2.1: DBER vs. STEM research
Faculty members contrasted traditional STEM research with their evolving knowledge of DBER. After participating in SSGs, they felt the two research modalities were similar in approach, but the context was distinct. Specifically, faculty research is embedded in a certain STEM area and typically does not connect to teaching and learning. DBER findings, however, carry implications for course teaching. Despite these contextual differences, DBER and STEM research both have a research question and methodology, as indicated by this participant (Assistant Professor, Department of Chemistry & Chemical Biology):
The activities the students are doing are just like research that would be done in my lab or [by] a colleague in a related aspect of chemistry. But the goals of the project were not really the goals of doing chemistry; they were learning how to have students do chemistry. And all of the thought considerations, analysis, trying to think about what data we get and how to analyze it, what it means, and that’s all completely different from anything else I do [referring to own research].

With regard to methodological distinctions, faculty members suggested STEM and STEM education differ in their approach to assessment. For example, one faculty member (Associate Professor, Department of Chemistry & Chemical Biology) stated:
We all now maybe understand more about the importance of assessment. We may understand, some of you may understand very broadly with some of the kinds of assessment … but it’s still so different from disciplinary research. I think, I may recognize that there is a project that I might want to do in bench chemical research, where I can do everything and I need collaborators, but I feel like I understand what might need to be done and know what I need to communicate to my collaborators and know whether they’re doing what they need to do. They may have to provide all of the fine tuning. Because they’re the experts there. But I still feel like in educational research, maybe I have an idea, but I have no clue of what [the] appropriate analysis evaluation of it is. And [I] don’t really feel like I can drive things very much because I have a piece of an idea, but I know there’s this big bad part, and I’m not sure how much it would take to change that dynamic.

Faculty members discussed validity and generalizability considerations of DBER vis-à-vis STEM research. Faculty members indicated that, unlike STEM research, DBER findings are more socially constrained. More specifically, faculty members felt that DBER findings may not be as broadly generalizable as their STEM research findings:
It is similar on a sort of 30,000 feet superficial level. There are similarities like coming up with the hypothesis and the device or instrument to collect the data. You have to analyze the data. However, in physics we believe that if I come up with some general observations and phenomena, that that applies no matter what, you know, that electron is on the moon, its properties are the same. But for SSGs, anything that I discovered here, is that true for students at IUPUI versus students at say Harvard versus students at a community college? The answer is probably no. (Professor, Department of Physics)

A challenge with contrasting DBER and STEM research that faculty discussed was that the import of the DBER work on promotion and tenure varies by position. Participants suggested DBER work would fall under the traditional promotion and tenure categories of “teaching” or “service” rather than research. As one participant stated, “[SSG] probably goes under the bucket of either teaching or service, but not research.” Moreover, some faculty expressed promotion and tenure benefits resulting from DBER engagement despite its lack of institutional recognition relative to STEM research. As one assistant professor from the Department of Computer Information Technology stated,
My tenure and promotion is heavily based on research, but also I need the teaching part to be satisfactory. And [with the SSG funding] I got the chance to look into new pedagogy and the methodology, and we want to publish the results to … FIE [Frontiers in Education].

While some faculty perceived the SSG to be beneficial for promotion and tenure, others suggested it was both a blessing and a curse. As one participant stated, “SSGs do not really count as research or teaching advancement. But it could still go under the bucket of teaching and/or service. However, ironically, not continuing the new pedagogy might affect faculty teaching evaluations.” In short, this faculty member expressed that there was a departmental shift in student expectations for pedagogy resultant from the SSG, thus raising the bar for what students expect of their learning environment.
Theme 2.2: DBER and teaching
Participants indicated that by engaging in DBER and reflecting on their own teaching, they helped promote student learning gains. In many instances, faculty discussed how engaging in the SSGs helped them hone their specific teaching skills and better empathize with students.
One faculty member suggested that, by engaging in the SSG program as a student themselves, they could better empathize with students learning incidental or unintentional aspects of a course. As this faculty member said of their SSG progress, “Sometimes you expect this should work, but [it] doesn’t work as I expected.” This faculty member struggled to realize their initial SSG student learning outcomes, but they gained new insights. In parallel, they compared their SSG assessment to course assessment data, stating, “Something surprisingly comes out based on student feedback. They learn more things in that aspect, instead of what we expect them [the original outcomes] to be.”
Given the challenges of engaging in DBER, faculty found themselves learning from their challenges. Another faculty member (Associate Professor, Department of Chemistry & Chemical Biology) stated:
The experiment we decided to do in [the] beginning was a … complete failure, and we learned that during the first time we ran the lab. One realization that is somewhat of a surprise is that we proposed to do something that would be hard to do with a lot of lab sections and we recognized that up front.

Other faculty members discussed unexpected successes. One participant (Clinical Assistant Professor of Law) discussed that many engineering students were pursuing careers in a domain related to their SSG program:
I knew that the career path of a patent examiner is a very good career path. It is one of the, one of the best things our students can do. And I was surprised to see how many students took that on and were very successful at becoming patent examiners at the patent office. And from what I can tell, they’re very happy at their positions at the patent office. So that was a surprise. I just didn’t know how that was going to shape out.
Theme 2.3: DBER and collaboration
Engaging in DBER via the SSG program provided faculty with the ability and encouragement to engage peers, including within and beyond their department, in realizing curricular change. Thus, the DBER engagement that the SSG program fostered led to novel collaborations. These collaborations were especially salient within departments wherein SSG recipients leveraged their SSG scope of work to drive conversations and enhance the departmental social network. As a participant (Assistant Professor, Department of Computer Information Technology) stated:
The idea was to have a computational component and at a minimum 25% of every course… . That was happening in other courses, but not all faculty members were equally comfortable with this in terms of implementation. They were on board with the idea but did not know how [to] do it. How do you create and assess assignments that are computational in nature and incorporate it into your classroom, lectures, and recitations and so on? So, the three of us would help them get up to speed and create, help create assignments.

Many participants felt their departmental peers would be resistant to widespread curricular change. Yet faculty members found themselves engaging with more departmental peers and were surprised by how cooperative those peers were. As an SSG participant (Professor, Department of Physics) stated:
Since ours was a department-wide effort, it required every single faculty member to be 100% on board. The fact that they were … surprised me! I think there was absolutely no resistance anywhere. But everybody was on board. It was really, frankly, shocking. You know, as a past department chairperson, [I] never got this level of cooperation for anything. So that was a surprise.

Other participants discussed how the SSG program initiated conversations beyond the institution. For example, one participant (Professor, Department of Biology) discussed the role of applying learnings to inform how STEM laboratories are designed broadly across the institution:
I think maybe one thing that [SSGs are] going to stimulate … is that our lab courses don’t get revised very often in this department, generally. It really opens the question, why do we teach lab courses? What do we want students to get out of the lab course? Are they there to learn a technique or are they there to see [a] demonstration of a particular phenomenon/reaction that they’ve learned about? … [T]he project that SSG funded really kind of changes what the student gets out of the lab… . So, what does the lab do? Is the lab there to let them see what researchers like to make them enthusiastic about it? Or is it there to make them technically adept at doing something in particular? And that’s a big question because labs do need a facelift.
RQ3: What other outcomes resulted from faculty members’ engagement in the SSG program?
To understand the benefits SSGs carried beyond gains in DBER literacy, we explored other outcomes resulting from faculty engagement in the SSGs. These findings revealed ways in which faculty members benefited professionally.
| Theme | Description |
|---|---|
| 3.1. Fueling motivation | The SSG program enhanced and maintained faculty motivation to engage in future curricular change. |
| 3.2. Professional development | Engaging in the SSGs led to other aspects of individual faculty member’s professional development. |
| 3.3. Departmental change | Engaging in the SSG program led to conversations with peers and chairs in the department, which supported curricular change. |
Theme 3.1: Fueling motivation
While many faculty members participated in the SSG due to their intrinsic motivations, engaging in the SSG program fueled future motivations (and, ostensibly, confidence) to engage in DBER and curricular change. Faculty members expressed motivations to develop and implement innovative instructional activities in the future. One participant (Lecturer, Mechanical Engineering) suggested they would develop a new leadership program in alignment with the PLTL format they tested in their curricula. Another participant from a chemistry department intended to redesign their labs, stating, “[The SSG] didn’t cause me to teach something that I had never taught before. But clearly, I didn’t teach something differently that I have taught before.” The mechanical engineering lecturer participant also expressed interest in transferring learning from one subdiscipline to another:
This methodology is now standard in three courses, in the ME [mechanical engineering] department, one course in the BME [biomedical engineering] department. We’ve done it a couple of times in electrical engineering, and I think that’s continuing. So, you know, part of the goal here was to expand the reach of peer-led team learning. I think that’s occurred.

This participant expressed the desire to scale up their pedagogical innovation by bringing it to new disciplines:
I think what I found interesting there was the fact that people implement PLTL for the students in the classroom. But you get this benefit of the leaders and their improvement… . For these small number of leaders and mentors, the improvement is sometimes quite deep. And it changes their whole trajectory of their education. And that would not have occurred without SSG.
Theme 3.2: Professional development
Participation in the SSGs positively influenced faculty members’ professional development, particularly within their departments. For example, one lecturer suggested that by improving their teaching, the SSG program enhanced their promotability. As they stated, “The dean and the department look at my focus on teaching. I think that this grant contributed greatly to their understanding of what I’m trying to do and was extremely positive.” Most participants agreed that the SSG program led to recognition at department faculty meetings and invitations to give departmental talks. Yet the benefits of this recognition for tenure varied by participants’ positions. For example, while many participants were able to pursue manuscripts and external funding, they felt that these credits were “hard to measure.” More specifically, a participant (Associate Professor, Chemistry and Chemical Biology) stated:
If that [SSG project] gets [externally] funded, we get credit for that. But how that credit works is very hard to measure, because to make a very direct and blunt point, I know for me to get promoted, I need another federal grant. If this gets federally funded, I won’t get promoted because it won’t get counted.

Other participants discussed how the SSG program enhanced their visibility among external networks, such as disciplinary professionals. As one participant (Professor, Department of Physics) stated:
[This project helps in getting] visibility, in the sense that, after we did this work, we became part of a bigger network of physicists who are doing similar things, which then led to giving invited talks at conferences… . We had an invitation this week to speak at the American Physical Society annual meeting, which is our biggest professional society.
Theme 3.3: Departmental change
Participants indicated that engaging in the SSGs led to additional discussion within the department. Some participants suggested these conversations led to department chairs reconsidering what constitutes good criteria for promotion and tenure. Specifically, discussions in some STEM departments purposefully considered how DBER might be better prioritized across the department and institution. As a mechanical engineering lecturer participant stated:
[T]he place of educational research within the promotion and tenure track has been discussed. I don’t know that I can make a definitive statement after that about the outcome, but I would say that it’s been discussed as a something that needs to be considered.

Irrespective of tenure, faculty members felt the SSG project directed their department chair’s attention to aspects of the curriculum that needed to be modified. As a participant (Assistant Professor, Department of Computer Information Technology) stated:
At the beginning, [the department chair] was encouraging us to do this renovation to our curriculum because it’s a starting point. No one explored this kind of collaboration [data science and informatics course-based collaboration] before. And obviously, we did two rounds of experiment, plus this year we did a third round again. Every round we’ll modify the process to try to make it better.
Discussion
This study suggests that a 2-year experience conducting STEM education research can help faculty members become more knowledgeable and confident in conducting DBER and more aware of potential improvements to their own and other departmental courses. Our findings suggest faculty members enjoy networking with department colleagues and institutional administrators to improve their curriculum. In this discussion, we map themes and strategies for motivating DBER and provide other suggestions for promoting DBER literacy.
Mapping Themes
This work was exploratory, but in Figure 1 we postulate how the themes interrelate. We triangulated findings to provide a model of key learning components that promote DBER literacy and, in turn, how these learnings contributed to other auxiliary outcomes. We postulate that the three learning components contributed to DBER-related outcomes that, in turn, facilitated auxiliary outcomes. Within each category, we do not make claims regarding relationships between themes, but we posit that the themes are mutually reinforcing. For example, considering the key learning components, we postulate that institutions that value and support DBER will likely empower their faculty members to act upon these values by building networks among such faculty members and promoting peer engagement. Should one or more of these learning components be missing, the subsequent DBER literacy outcomes may not come to fruition.
While our analysis was exploratory and largely inductive, the findings align with extant research on CoPs in STEM education. For example, Gehrke and Kezar (2017) identified key strategies for realizing STEM reform via CoPs. Their suggestion for providing adequate support aligns with the institutional DBER support learning component, their suggestion for involving multiple individuals aligns with the peer dialogue and engagement learning component, and their suggestion of engaging members to promote “mastery” of a content domain aligns with the intrinsic motivation learning component. Gehrke and Kezar also discuss the import of identifying “key leaders,” which aligns with the professional development auxiliary outcome, particularly as some SSG awardees develop DBER expertise and become fellows at a local STEM center (p. 825).
Motivation
Intrinsic motivation, peer dialogue and engagement, and institutional DBER support provided a critical foundation for the SSG program. These learning components fueled DBER literacy and, in turn, auxiliary outcomes, such as faculty members’ motivation to continue SSG activities beyond the grant funding period. Thus, while faculty members brought intrinsic motivations, the overall program seemed to fuel motivation for future DBER-related work. We draw attention to the importance of building on faculty members’ extant motivations within institutions by developing communities to discuss, share, and potentially generate new aspirations. For instance, by promoting peer dialogue and engagement and supporting this with institutional resources, faculty members discovered aspects of interventions that were effective and expressed plans to continue those elements in their courses. As one faculty member (Clinical Assistant Professor of Law) stated:
We have surveys that we have put together. And it … helps me as an instructor to find out how students have improved their knowledge of intellectual property and other aspects of what I teach. When I first started this project, I didn’t know how to really to navigate it. And now I have a roadmap to do so.

Intrinsic motivation and recognition from others are key aspects of promoting science identity (Carlone & Johnson, 2007). To continue fueling motivations, institutions ought to focus on recognition such as highlighting awardees through annual symposiums, inviting awardees to serve on reviewer panels for subsequent programs, and inviting them as “showcase speakers” at informational or CoP sessions.
Promoting DBER Literacy
The goal of SSGs was not to convert STEM faculty into DBER experts, but we hoped faculty would develop DBER literacy, including knowledge of discipline-based education research methods such as instrument design, the ability to connect methods and instrumentation to theoretical frameworks, the ability to attain external funding in STEM education, the ability to prepare and submit peer-reviewed papers, and the motivation to stay updated on current discipline-based education literature. We found advances in each of these areas. Faculty believed they improved their knowledge of pedagogical methods tied to appropriate assessment strategies. At the time of this writing, four of the 12 SSGs awarded in 2017 and 2018 had received external funding from NSF and the Indiana state department.
However, despite increased DBER literacy, faculty continue to struggle with engaging in DBER. Most notably, faculty continued to find it challenging to design and articulate theoretical frameworks. This is further substantiated in the interviews, in which faculty expressed apprehension about the differences between the research practices of their own STEM discipline and STEM education research in their discipline. Thus, despite becoming familiar with DBER, faculty members expressed a continued need for support from DBER experts to engage in future DBER work.
Building on the prior findings, we found the development of a social network provided a critical scaffold to support growth. Faculty gained skills and confidence in engaging others in DBER-related conversations, thus creating a larger network of scholars interested in the SSG program. While faculty members were not situating themselves as DBER experts, they expressed greater appreciation for the trials and tribulations experienced by many DBER researchers and expressed a desire to continue engaging with others, thus fostering a culture welcoming of DBER at IUPUI.
Transferability of SSG to Other Institutions
Our SSG-related findings might be useful to others interested in building similar programs. SEIRI had six staff members with expertise in curriculum development, evaluation, and research utilizing qualitative, quantitative, and mixed methods. While the program was offered to STEM faculty at all levels (tenure track, lecturers, and clinical ranks), over 70% of engagement in the program stemmed from mid-career to senior faculty (associate professors, professors, and senior lecturers). This was particularly rewarding given the challenge of engaging mid-career faculty in traditional faculty development activities such as workshops and lectures addressing evidence-based teaching and learning (Mathews, 2014).
As outlined on our website (https://seiri.indianapolis.iu.edu/fundingopportunities/ssg/), the SSG program provides seed funding to faculty members to develop, implement, and evaluate the impact of pedagogical innovations across multiple STEM courses. Projects are eligible for an award of up to $15,000 for individual or faculty team projects and up to $30,000 for department-wide projects. Even without a formal requirement to conduct DBER, faculty gain experience developing evidence-based classroom practices and can secure external funding to improve their own and departmental course offerings. To secure external funding, they might need to work with an evaluator and collaborate with scholars with DBER expertise.
The following factors were beneficial for SSG’s successful implementation and offer specific guidance for others to follow:
Support from administrators: At IUPUI, the Vice Chancellor of Research and the Executive Vice Chancellor and CEO of Academic Affairs provided funding, space, and encouragement to form SEIRI and to fund the SSG program and other programs offered by the institute. Deans of STEM schools also contributed some funding, encouraging the chairs to support the institute and providing the atmosphere for SSGs to thrive. This was done by inviting the institute director to share the SSG program at annual faculty convocations, encouraging chairs to incentivize faculty to apply for SSG grants, and, upon successful completion of their SSG projects, providing support and encouragement to apply for external (typically NSF) funding.
Dedicated personnel: Institute personnel coordinated the SSG program. They collaborated with SSG recipients to discuss their successes and challenges each month, shared the hurdles and how they overcame them, and hosted a symposium each year to celebrate faculty successes. This symposium showcased the projects that secured external funding wherein faculty shared their journey toward development of their projects. During the early years of the program, the first author was responsible for overseeing and directing institute personnel involved in the SSG program.
Access to DBER experts: An essential feature of the SSG program, we posit, was access, via SEIRI staff, to a variety of professionals who can conduct qualitative, quantitative, and mixed methods research. SSG participants could engage with dedicated personnel as needed and with peer faculty during the CoP meetings.
Requiring CoP participation: The SSG solicitation distinctly identified faculty awardee responsibilities, including attendance at all programming offered by the Summer Institute. Other resources within the community itself included access to dedicated consultants (research associates, postdoctoral fellows, and faculty associated with SEIRI), IRB submission guidance from peers and staff, and miscellaneous professional development meetings, such as sessions on developing proposals for an external funding agency of choice.
Future Directions
Our research findings integrate faculty feedback from the survey and interviews and the accomplishments of SSG recipients (e.g., external funding). Taken together, the data present strong evidence that our SSG program effectively provides STEM faculty with firsthand experience of discipline-based education research. After 3 years of successful implementation, we now introduce modifications to our proposal solicitation. As per the new guidelines, we now accept proposals of three types: (1) individual, (2) faculty team, and (3) department wide.
The department-wide proposal type was added after two successful SSGs included investigators across all levels within a department, including the department head. That department-wide proposals are effective is corroborated by past research as well. Corbo et al. (2016) elaborated that STEM grants given at the department level are often successful because faculty members working together can bring sustained change in alignment with departmental vision. Projects carried out at the department level require systems thinking, as teaching, research, and service are perceived as integrated and not separate (Corbo et al., 2016). Corbo et al. further stated,
The department develops the capacities of individual members through training and team learning and aligns rewards and incentives with desired outcomes (including learning outcomes). Department members reflect on their actions, are willing to revise their assumptions, and are open to attending to events in new ways. These practices lead to continued learning, and as a whole, the department becomes better at learning how to learn.

At SEIRI, the aim of department-wide proposals is to provide faculty with experience responding to external calls for department-level impact, such as the NSF program solicitation Revolutionizing Engineering Departments (RED) (National Science Foundation, 2024). We recognize that a comparable program does not exist for other STEM departments; nonetheless, RED may help engineering and other STEM faculty at IUPUI frame department-wide proposals for SSG funding and can serve as a potential external funding target.
Biographies
Annwesa Dasgupta is Project Manager of Digital Learning & Innovation at Boston University.
Justin L. Hess is Assistant Professor in the School of Engineering Education at Purdue University.
Pratibha Varma-Nelson is Professor of Chemistry and Founding Executive Director in the STEM Education Innovation & Research Institute (SEIRI) at Indiana University Indianapolis*.
*Please note that Indiana University–Purdue University Indianapolis (IUPUI) has changed its name to Indiana University (IU) Indianapolis. This research was carried out under the university name IUPUI.
Acknowledgments
We would like to acknowledge the support of all participating IUPUI faculty focus group members. We would further like to acknowledge all supporters of the SEIRI Seed Grant program.
Conflict of Interest Statement
The authors have no conflict of interest.
Data Availability
The data reported in this manuscript are available in the article’s Supplemental Materials. Note: Data included in the manuscript are subject to the journal’s word count limit.
Notes
1. See Curriculum Enhancement Grants at https://ctl.iupui.edu/programs2/CEG/Past-Awardees.
2. All research procedures were reviewed and approved by the Institutional Review Board (IRB protocol #1909964846).
References
Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101.
Carlisle, D. L., & Weaver, G. C. (2018). STEM education centers: Catalyzing the improvement of undergraduate STEM education. International Journal of STEM Education, 5, 1–21. https://doi.org/10.1186/s40594-018-0143-2
Carlone, H. B., & Johnson, A. (2007). Understanding the science experiences of successful women of color: Science identity as an analytic lens. Journal of Research in Science Teaching, 44(8), 1187–1218. https://doi.org/10.1002/tea.20237
Center for Science, Mathematics, and Computer Education. (n.d.). Discipline-Based Education Research. University of Nebraska–Lincoln. https://scimath.unl.edu/discipline-based-education-research/
Corbo, J. C., Reinholz, D. L., Dancy, M. H., Deetz, S., & Finkelstein, N. (2016). Framework for transforming departmental culture to support educational innovation. Physical Review Physics Education Research, 12(1), 010113. https://doi.org/10.1103/PhysRevPhysEducRes.12.010113
Cox, M. D. (2001). Faculty learning communities: Change agents for transforming institutions into learning organizations. To Improve the Academy, 19(1), 69–93. https://doi.org/10.1002/j.2334-4822.2001.tb00525.x
Daly, A. J. (Ed.). (2010). Social network theory and educational change. Harvard Education Press.
Dolan, E. L., Elliott, S. L., Henderson, C., Curran-Everett, D., St. John, K., & Ortiz, P. A. (2018). Evaluating discipline-based education research for promotion and tenure. Innovative Higher Education, 43, 31–39. https://doi.org/10.1007/s10755-017-9406-y
Gafney, L., & Varma-Nelson, P. (2008). Peer-led team learning: Evaluation, dissemination, and institutionalization of a college level initiative. Springer Science & Business Media.
Gehrke, S., & Kezar, A. (2017). The roles of STEM faculty communities of practice in institutional and departmental reform in higher education. American Educational Research Journal, 54(5), 803–833. https://doi.org/10.3102/0002831217706736
Guskey, T. R., & Yoon, K. S. (2009). What works in professional development? Phi Delta Kappan, 90(7), 495–500. https://doi.org/10.1177/003172170909000709
Kezar, A. J. (2001). Understanding and facilitating organizational change in the 21st century: Recent research and conceptualizations. ASHE-ERIC Higher Education Reports.
Larson, R. S., & Dearing, J. W. (2008). Design research and the diffusion of innovations. In A. E. Kelly, R. A. Lesh, & J. Y. Baek (Eds.), Handbook of design research methods in education (pp. 511–534). Routledge.
Laursen, S. (2019). Levers for change: An assessment of progress on changing STEM instruction. American Association for the Advancement of Science.
Macdonald, R. H., Beane, R. J., Baer, E. M. D., Eddy, P. L., Emerson, N. R., Hodder, J., Iverson, E. R., McDaris, J. R., O’Connell, K., & Ormand, C. J. (2019). Accelerating change: The power of faculty change agents to promote diversity and inclusive teaching practices. Journal of Geoscience Education, 67(4), 330–339. https://doi.org/10.1080/10899995.2019.1624679
Mathews, K. (2014). Perspectives on midcareer faculty and advice for supporting them. The Collaborative on Academic Careers in Higher Education.
Mulnix, A. B. (2016). STEM faculty as learners in pedagogical reform and the role of research articles as professional development opportunities. CBE—Life Sciences Education, 15(4), es8. https://doi.org/10.1187/cbe.15-12-0251
National Academy of Engineering. (2018). National Academy of Engineering annual report. https://www.nae.edu/215419/NAE-Annual-Report-2018
National Research Council. (2012). Discipline-based education research: Understanding and improving learning in undergraduate science and engineering. National Academies Press.
National Science Foundation. (2024). IUSE/Professional Formation of Engineers: Revolutionizing Engineering Departments (IUSE/PFE: RED). https://new.nsf.gov/funding/opportunities/iuseprofessional-formation-engineers/nsf24-564/solicitation
Pelletreau, K. N., Knight, J. K., Lemons, P. P., McCourt, J. S., Merrill, J. E., Nehm, R. H., Prevost, L. B., Urban-Lurain, M., & Smith, M. K. (2018). A faculty professional development model that improves student learning, encourages active-learning instructional practices, and works for faculty at multiple institutions. CBE—Life Sciences Education, 17(2), es5. https://doi.org/10.1187/cbe.17-12-0260
Rogers, E. M. (2003). Diffusion of innovations (5th ed.). Simon and Schuster.
Shulman, L. S. (2000). From Minsk to Pinsk: Why a scholarship of teaching and learning? Journal of the Scholarship of Teaching and Learning, 1(1), 48–53.
Wenger, E. (1998a). Communities of practice: Learning as a social system. Systems Thinker, 9(5), 2–3.
Wenger, E. (1998b). Communities of practice: Learning, meaning, and identity. Cambridge University Press.
Wieman, C. (2017). Improving how universities teach science: Lessons from the science education initiative. Harvard University Press.
Appendix A: 2017 STEM Education Innovation & Research Institute Seed Grants Request for Proposals
Submission Deadline: May 15, 2017
I. Purpose
The STEM Education Innovation and Research Institute (SEIRI) at IUPUI is pleased to announce the 2017 SEIRI Seed Grant (SSG). The goal of this competition is to facilitate and support STEM education innovation and research by growing the body of Discipline-Based Education researchers at IUPUI. Specifically, this opportunity provides faculty within science, technology, engineering, and mathematics (STEM) departments with funding to develop, implement, and evaluate the impact of pedagogical innovations across multiple IUPUI STEM courses. As a long-term goal, this grant is intended to enable faculty competitiveness for external funding with agencies such as the National Science Foundation (NSF), Spencer Foundation, and the National Institutes of Health (NIH), or other internal funding such as the IUCRG. As such, we strongly encourage that interested STEM faculty partner with an educational research or design expert within fields related to the learning sciences, such as (but not limited to) IUPUI’s Department of Psychology or School of Education.
II. Scope
The SSG supports a wide range of curricular innovations with a targeted focus on potentially transformative projects that will be implemented in the undergraduate curriculum at IUPUI and that provide opportunities for novel Discipline-Based Educational Research. Proposals should include an innovative aspect as well as a plan to evaluate the impact of this curricular innovation. Possible projects include, but are not limited to, the following:
Developing and exploring any novel and potentially transformative approach to teaching science, technology, engineering, mathematics, or informatics to undergraduate STEM students at IUPUI
Situating a rigorously tested pedagogical approach that has been proven to work in one context within another context
Utilizing technology within or beyond the STEM classroom in novel ways
Fostering interdisciplinary collaborations among student teams from within and outside of STEM or across STEM disciplines
III. Eligibility
The Principal Investigator (PI) must be a full-time IUPUI faculty member within the School of Science, the School of Engineering and Technology, or the School of Informatics and Computing (tenured, tenure track, or non-tenure track).
Anyone fitting the above definition may serve as Co-PI.
Co-PI(s) may also include faculty members from the School of Education or any other school, as long as this individual can be justified as an educational researcher or learning sciences expert.
Other part-time or adjunct faculty within the School of Science or the School of Engineering and Technology may be included on proposals but may not serve as a PI.
IV. Funding Levels
We will fund up to $150,000 for 18 to 24 months.
Teams can apply for up to $30,000.
V. Submission Deadline
Submit all application materials by 11:59 PM EST on May 15, 2017.
Late submissions will not be considered.
VI. Application Process
Direct your SSG-related questions to seiri@iupui.edu, call 317–278–0168, or visit SEIRI at Room 1123 in the University Library.
SEIRI will hold an information session prior to the submission deadline. To find dates and to register, check the SEIRI web page (https://seiri.iupui.edu/). This session will provide information about the SSG, including eligibility, guidelines, proposal writing expectations, and post-award expectations.
-
Submit the proposal and a letter of support from your department or program chair at [Qualtrics survey link] by the deadline.
VII. Support for Awardees
Programmatic consultations, including framing programmatic objectives, developing an assessment or evaluation plan, developing a plan to disseminate findings, and preparing your IRB materials.
We strongly encourage you to include a learning sciences or other related educational researcher to help with the assessment and evaluation component.
VIII. Awardees’ Obligations
Complete and submit annual project reports by 11:59 PM EST on July 31.
Agree to work with a facilitator from SEIRI who will serve as a consultant and will monitor the progress of the project so it is completed within the timeline proposed.
Participate in the programming offered by SEIRI related to the SSG, including:
Attendance at the SSG information session for all awardees, August 2017
A presentation in the SEIRI Speaker Series, Spring 2018
The research team must have or receive IRB approval for their project.
Awardees must include explicit plans for securing external funding, and they are required to submit a proposal extending the SSG-funded study within 24 months of receiving SSG funds.
Awardees must acknowledge SEIRI support in any presentation or publication of work resulting from this support.
IX. Review Criteria
Funding decisions will result from peer review of proposals in response to this call. Reviews for submitted proposals will be based on:
The potential student impact of the project
The sustainability and long-term impact of the project
The potential for the project to advance knowledge
The appropriateness of the budget
The qualifications of the team to conduct the work
The probability of the project leading to external funding
X. Review Process
A panel consisting of STEM faculty and SEIRI personnel will review proposals.
During the review or award process, questions may arise regarding budget or other aspects of the proposal. SEIRI reserves the right to negotiate changes in budget requests or other project features.
Applicants will be notified of award decisions no later than July 14, 2017.
XI. Proposal Features
The proposal must include all sections listed below. Use 10- to 12-point font with 1ʺ margins, single-spaced. By the submission deadline, upload your proposal as a single document to:
[Qualtrics survey link]
Section 1. Project Summary (1-page maximum)
Project Title
List all investigators, including full name, department, rank/title, and e-mail address.
Optional: Other personnel (collaborators, technicians, graduate students, postdocs)
List potential sources of future funding (NSF, NIH, etc.).
Abstract: Describe the project in lay terms, articulate the project objective, specify what makes this project innovative, and describe your assessment or evaluation plan to ascertain student impact (this will be listed on SEIRI’s website if the project is funded).
Project Scope
Framing: Specify your overarching objectives; identify and describe sub-goals or specific aims and how these align with the overarching objectives; identify how the proposed innovation will meet those goals.
Participants: Identify your target participants; approximate how many participants will be impacted during the grant period and beyond the SSG duration; identify if and how this innovation will continue to benefit later student cohorts.
Recruitment: Specify how participants will be identified and contacted.
Rationale and Literature Review
Describe if and how this project aligns with IUPUI’s strategic initiatives.
Describe if and how this project aligns with national initiatives.
Identify educational research that has been conducted in this space.
Identify what prior work your team has done in this space.
Novelty
Describe what aspects of this project are novel.
Articulate how this project is distinct from or builds upon existing interventions.
Assessment and Evaluation Plan
Address how the sub-goals or specific aims will be measured.
Indicate how you will monitor the effectiveness of the project as it evolves.
Specify the assessment and evaluation methods you will use.
Identify what curricular changes you envision your project leading to at IUPUI.
Specify how you will know the extent to which those changes are realized.
Role of Key Personnel
Specify the expectations and obligations of all project personnel, including qualifications of the investigators with respect to their specified roles.
Dissemination Plan
Describe how the findings and products will be disseminated within IUPUI.
If appropriate, indicate how your investigation will inform the scholarship of STEM education throughout the larger academic community.
Broader Impacts
Frame the broader impacts of the innovation with respect to NSF’s language.
References (these will not count against the 8-page maximum)
Use a timeline to depict the schedule for your project. The timeline should include start and finish dates for your project as well as the dates or timeframe during which various project tasks will occur.
The 2017 SSG project period is August 2017 to July 2019, so the project timeline should be within that timeframe.
Complete the budget using a template offered by the Office of Research Administration: https://research.iu.edu/training/ora-training-videos/budget-templates.html
Specify how each budgeted item will contribute to the research plan. For instance, if faculty salary is a line item, please indicate who will be paid, at what level of effort or time, and that the salary is consistent with HR rules. If “Software” is a line item, please indicate categories of supplies and costs (e.g., “quantitative analysis software”) and their project use. If travel is on the grant, specify the time and location of travel.
List professional preparation, current and prior academic appointments, and relevant products, such as publications and/or presentations.
A letter of support from each investigator’s department chair must be uploaded with your application. Your chair should indicate that you have the time to perform the project and that this proposal represents original work that is not funded by another agency.
Appendix B: Review Process of the SSG Proposals
The SSG proposals were subjected to a rigorous review process, modeled after the NSF review process and based on the reviewer template provided below. The review panels were formed from SEIRI staff and previous SSG recipients. The advantage of this arrangement is twofold: the PIs receive informed feedback from reviewers who have been through the process themselves, and the reviewers are able to apply what they learned from participating, further strengthening their own DBER literacy.
Faculty awardees from the first year (SSG 2017) reviewed SSG 2018 (Year 2) proposal submissions. The review process was facilitated by a SEIRI member (Author 1). Faculty reviewers were assigned to two panels—namely, Science and Engineering & Technology (E&T). The Science panel reviewed submissions from various departments in the School of Science: Biology, Psychology, Library & Information Science, and Chemistry & Chemical Biology. Similarly, the E&T panel reviewed proposals from various engineering and technology departments: Biomedical Engineering, Mechanical Engineering, Computer and Information Technology, and Engineering Technology.
Each proposal was reviewed by three or four faculty members. To avoid conflicts of interest, an effort was made to ensure that no faculty member reviewed a proposal from their home department. A review template with guiding questions was provided (see below), and faculty were given three weeks to submit their reviews electronically to the SEIRI facilitator (Author 1).
Each reviewer panel participated in a face-to-face meeting along with two SEIRI members, who conducted the review discussions, and a scribe, who was responsible for note-taking on a shared screen. The panel meetings lasted 2.5 hours, with 30 minutes assigned to each project review.
After each panel session, each project was classified into one of three categories—Highly Competitive (HC), Moderately Competitive (MC), and Non-Competitive (NC)—with some projects placed between two categories. Three projects categorized as “Non-Competitive” were declined for funding; three projects categorized as MC-NC or HC-MC were sent back to the PIs with additional questions decided by the panel; and the remaining two projects were accepted without further questions.
Once the reviewers’ questions had been addressed, the project teams were selected as SSG awardees, and a subsequent awardee session was held to introduce all awardees to one another and to the SEIRI team. This session also informed faculty about the IRB application process and introduced the SSG team member assigned to work with them as a consultant and to monitor project progress over the 2-year award duration (i.e., 2018–2020 for SSG 2018 awardees). Faculty are supported throughout the 2-year SSG award period through multiple reports and meetings.
Reviewer Template
Proposal #:
Title:
Final Score (4 = Excellent, 3 = Very good, 2 = Good, 1 = Fair):
Summary:
The potential student impact of the project
The potential for the project to advance knowledge
The appropriateness of the budget, including faculty time commitment
The qualifications of the team to conduct the work
The sustainability and long-term impact of the project
The probability of the project leading to external funding
Appendix C: Faculty Meeting Assignment
Critical Components of Pedagogical Method
What are some of the most essential aspects that define your pedagogical method? Please cite relevant literature as you identify these aspects. (Below is an example of this activity.)
Critical Components for Peer-Led Team Learning (PLTL)
As an example, for the peer-led team learning (PLTL) model, the identified critical components are as listed below.
The organizational arrangements, including the size of the group, space, time, noise level, teaching resources, and the like, promote learning.
The workshop materials are appropriately challenging, integrated with the other course components, and intended to encourage active learning and to work well in collaborative learning groups.
The peer leaders are students who have successfully completed the course. They are well trained and closely supervised, with attention to their knowledge of the workshop problems, teaching/learning strategies, and leadership skills for small groups.
The faculty teaching the courses are closely involved with the PLTL workshops and the peer leaders.
The PLTL workshop sessions are integral to the course, coordinated with other elements.
The institution, at the highest levels of administration and pedagogy, and at the department levels, encourages innovative teaching and provides sufficient logistical and financial support.
The students are trained to solve problems in the PLTL format.
Appendix D.1: SSG Annual Report Form
Due July 31, 11:59 p.m. EST
Submit your reports via the following link:
Qualtrics survey link
Section 1. Project Information
Principal Investigator.
Project Title.
Section 2. Project Overview
What were your project goals?
To what extent have you achieved each of your project goals? Please explain.
Has the scope of your project changed in any way in comparison to how it was originally proposed? If yes, how?
How and from whom did you find support when you encountered challenges?
Section 3. Project Impact
How many students have participated in your project so far? What was the nature of their participation?
How would you characterize the demographics of your student population impacted in the past year?
How many students do you anticipate participating in the next year of your project?
How does your project connect to other university initiatives (e.g., RISE, Grand Challenges, HIPs, other grant programs)? Please describe.
Section 4. Instruction/Pedagogy
What were the intended learning outcomes (e.g., content knowledge, professional skills)?
Do you anticipate modifying these in the future? If yes, please explain.
What curricular materials did you use or develop?
In the following table, first list three to five key instructional strategies of your project. Second, describe how well you feel you implemented each of these strategies. Finally, describe how effective you felt each of these strategies was.
Instructional strategy | How well do you feel you implemented this strategy? | How effective do you feel this strategy was?
---|---|---
Section 5. Assessment
How have the data that you have collected informed what you reported in the above table?
How did you measure student outcomes?
Have you used, or will you use, assessment data to inform your future project implementation?
Section 6. Dissemination
Have you disseminated the results of your project this year? If yes, please describe how and where.
Where and how do you plan to share your results in the next year?
Section 7. Funding
Have you applied or are you planning to apply for external funding? If yes, where?
To what extent do you expect to work with SEIRI to secure future funding (e.g., as educational researchers, evaluators, or consultants)?
Do you have any additional comments regarding your experiences in the SSG?
Appendix D.2: SSG Final Report Form
Due Friday, July 31, 11:59 p.m. EST
Submit your reports via the following link:
Qualtrics survey link
Final Report Questions
As a result of the project, do you feel better prepared to engage in education evaluation or research? If so, in what ways?
What are the plans for continuation of this project beyond the SSG award period?
Have others (e.g., faculty members) expressed any interest in your project? If yes, how?
You already shared a COVID-based report for your SSG. Are there any additional SSG changes pertaining to COVID you would like to mention?
SSG-Specific Feedback
The SSG initiative should:
give more grants with smaller awards
give fewer grants with larger awards
not change anything; the present funding level is reasonable.
-
The core goals of the SSGs are:
to provide faculty members in science, technology, engineering, and mathematics (STEM) departments with funding to develop, implement, and evaluate the impact of pedagogical innovations across multiple IUPUI STEM courses.
to promote Discipline-Based Education Research at IUPUI.
to enable faculty competitiveness for external funding with agencies such as the National Science Foundation (NSF), Spencer Foundation, and the National Institutes of Health (NIH), or other internal funding such as the Indiana University Curriculum Research Grant (IUCRG).
What improvements could be made to the SSG initiative to better achieve these goals?
If offered again, what aspects of the SSG initiative would you suggest modifying?
Appendix E.1: Faculty (SSG 2017) Group Interview Questions (Conducted in Spring 2020)
What was your motivation to apply for the SEIRI Seed Grant (SSG)?
How would you describe the relevance of your SSG participation to your professional advancement?
Did you receive any reactions from your department chair? If so, please elaborate.
Did you gain any form of professional credit from implementing your SSG proposal-related goals (e.g., portfolio development, tenure-track goals, course release)?
Did you have any opportunity to contribute to teaching beyond the regular responsibilities of your position?
What pedagogical innovations will you continue to use from the SSG experience?
How do you see the science/engineering research pursued in your discipline as similar to or different from the DBER research you conducted for SSGs?
Were there any surprises/unexpected facets during the SSG implementation?
Did you learn something from the other SSG faculty?
Appendix E.2: Faculty Group Interview Themes and Codes
Inductive coding of faculty group interviews yielded nine themes that offered insight into our three research questions. The nine themes with underlying codes (bullet points) and representative quotes are presented below.
Codes that do not include a quote are themselves verbatim faculty responses. For instance, under theme “I. Motivation to apply for SSGs,” the code “To understand a specific pedagogy (like PLTL)” does not include a quote because it was a verbatim faculty response to the prompt, What was your motivation to apply for the SEIRI Seed Grant (SSG)?
Motivation to apply for SSGs
-
To involve more faculty and students to think about a solution to a pedagogical problem
“One of the main issues with the program was that as a feeder program to do a graduate program and the students do not have enough technical skills which are required and so I wanted to design an experiment that could allow us to measure how students did in an online classroom setting with use of technology.”
To understand a specific pedagogy (like PLTL)
The monetary incentive
-
To implement and test a curricular innovation at a department level
“I had a long-term goal for almost a decade of introducing [a] greater computational component in every single course that we teach. So, the goal was to have every course have maybe close to 25% component that was computational. The only way to make this successful was to have every single person, every single faculty member in the department be committed and involved with this so that one could transform the entire curriculum. And that requires some financial support, which is where SEIRI and the SSG program came in. That’s how we got involved.”
-
SSG and professional advancement
-
Benefits related to teaching for tenure-track faculty
“My tenure and promotion is heavily based on research, but also I need the teaching part to be satisfactory. And [with the SSG funding] I got the chance to look into new pedagogy and the methodology, and we want to publish the results to the actual FIE. That will give us preliminary data.”
-
Challenge for tenured faculty (blessing vs. curse)
“SSGs do not really count as research or teaching advancement. But it could still go under the bucket of teaching and/or service. However, ironically, not continuing the new pedagogy might affect faculty teaching evaluations. It [SSG] probably goes under the bucket of either teaching or service, but not research.”
-
Helped a lecturer with a teaching focus in significant ways
“When I think about my focus on teaching and how the dean and the department looks at my focus on teaching, I think that this grant contributed greatly to their understanding of what I’m trying to do and was extremely positive.”
Helped build groundwork for larger professional goal
-
Increased visibility among network of disciplinary professionals
“[This project helps in getting] visibility, in the sense that, after we did this work, we became part of a bigger network of physicists who are doing similar things, which then led to giving invited talks at conferences; actually we had an invitation this week to speak at the American Physical Society annual meeting, which is our biggest professional society.”
-
Department chair reaction
Recognition at department faculty meetings and invitations to give department talks
-
Implementation of an SSG project brought the department chair’s attention to aspects of curriculum that need to be modified.
“At the beginning, [the department chair] was encouraging us to do this renovation to our curriculum because it’s a starting point. No one explored this kind of collaboration [data science and informatics course-based collaboration] before. And obviously, we did two rounds of experiment, plus this year we did a third round again. Every round we’ll modify the process to try to make it better.”
-
Professional credit
-
Manuscripts and external funding were listed as credit, but they do not count as much toward promotion of faculty in STEM departments.
“If that [SSG project] gets [externally] funded, we get credit for that. But how that credit works is very hard to measure, because to make a very direct and blunt point, I know for me to get promoted, I need another federal grant. If this gets federally funded, I won’t get promoted because it won’t get counted.”
-
Ongoing discussions in certain STEM departments about how STEM education research can be incorporated within promotion and tenure criteria
“That the place of educational research within the promotion and tenure track has been discussed. I don’t know that I can make a definitive statement after that about the outcome, but I would say that it’s been discussed as a something that needs to be considered.”
-
Research in your STEM discipline vs DBER research
-
Different in context but similar in approach
Faculty research is embedded in a particular STEM area and has no direct DBER implications; DBER implications arise within their course teaching. However, DBER and STEM research involve similar processes of posing a research question and following a methodology.
“The activities the students are doing are just like research that would be done in my lab or [by] a colleague in a related aspect of chemistry. But the goals of the project were really not the goals of doing chemistry; they were learning how to have students do chemistry. And all of the thought considerations, analysis, trying to think about what data we get and how to analyze it, what it means, and that’s all completely different from anything else I do [referring to own research].”
-
Differences in familiarity; faculty require support from a STEM education researcher
“I’ve continued to feel that I know absolutely nothing about the science of teaching, whereas I would feel like at least hopefully in my new area of research, I might know something. The hurdle that I’ve identified is my own confidence that I know what I’m doing. And sort of my desire to be partnered with someone who does know what they’re doing … [as a result] of what I guess my feeling for doing more work in the educational field… . I certainly still feel very much like an outsider, an outsider who maybe, you know, understands a few phrases, has a little bit of language right on the edges with, let you go to a seminar and maybe appreciate the education-based seminar. But you don’t necessarily feel like you can stand alone and do anything.”
-
Assessment-based differences
“So we all now maybe understand more about the importance of assessment. We may understand, some of you may understand very broadly with some of the kinds of assessment … but it’s still so different from disciplinary research. I think, I may recognize that there is a project that I might want to do in bench chemical research, where I can do everything and I need collaborators, but I feel like I understand what might need to be done and know what I need to communicate to my collaborators and know whether they’re doing what they need to do. They may have to provide all of the fine tuning. Because they’re the experts there. But I still feel like in educational research, maybe I have an idea, but I have no clue of what [the] appropriate analysis evaluation of it is. And [I] don’t really feel like I can drive things very much because I have a piece of an idea, but I know there’s this big bad part, and I’m not sure how much it would take to change that dynamic.”
-
DBER research findings are subject specific, not generalizable like STEM research findings.
“It is similar on a sort of 30,000 feet superficial level. There are similarities like coming up with the hypothesis and the device or instrument to collect the data. You have to analyze the data [which is also a similarity]. However, in physics we believe that if I come up with some general observations and phenomena, that that applies no matter what, you know, that electron is on the moon, its properties are the same. But for SSGs, anything that I discovered here, is that true for students at IUPUI versus students at say Harvard versus students at a community college? The answer is probably no.”
-
Surprising/unexpected facets of SSG implementation
-
Dealing with students is a dynamic process.
“Sometimes you expect this should work [referring to SSG goals], but [it] doesn’t work as I expected. Something surprisingly comes out based on student feedback. They learn more things in that aspect, instead of what we expect them [the original outcomes] to be.”
-
Lessons from failed implementations
“The experiment we decided to do in [the] beginning was a fail. It was a complete failure, and we learned that during the first time we ran the lab. One realization that is somewhat of a surprise is that we proposed to do something that would be hard to do with a lot of lab sections and we recognized that up front.”
-
Unexpected success from first-time implementation trials
“I knew that the career path of a patent examiner is a very good career path. It is one of the, one of the best things our students can do. And I was surprised to see how many students took that on and were very successful at becoming patent examiners at the patent office. And from what I can tell, they’re very happy at their positions at the patent office. So that was a surprise. I just didn’t know how that was going to shape out, but it was definitely a surprise.”
-
Faculty more cooperative than expected
“Since ours was a department-wide effort, it required every single faculty member to be 100% on board. The fact that they were … surprised me! I think there was absolutely no resistance anywhere. But everybody was on board. It was really, frankly, shocking. You know, as a past department chairperson, [I] never got this level of cooperation for anything. So that was a surprise.”
-
Benefits of being in other SSG recipients’ company during faculty meetings
-
Informal interactions are useful, sometimes more than department seminars.
“I did learn something from the informal interaction that happened during, before, and after the breakout groups. Or just well, or simply seeing them and having and hearing something and then begin to have a conversation. I learned something about what he [a separate SSG awardee] was doing. And so that promoted a conversation at a later date or before the next meeting or something like that.”
“Let’s say I went to all of my department seminars for the last year, and I compared what I got from those to what I got out of the SEIRI discussions that we had. Alright, I, in that case, I might have gotten a lot more out of the SEIRI discussions.”
-
Gained knowledge about pedagogical communication
“It’s helpful seeing how people [SSG peers] communicated more pedagogically linked types of work.”
“Every time I come here [faculty meeting], I learn a different [pedagogical] concept from [a] different [breakout] team.”
-
SSG-based challenges to be addressed
Faculty see a need for more specific proposal-writing workshops in the future.
-
SSG support appreciated
“While you are recording, I just want to say that you were fantastic. All the support you gave us. Thank you. That was really excellent.”
“I will just double that and say you guys are just amazing. I mean my experiment that I am running would have been impossible without your support, and it is not often that you come across that in one’s life, you know. So, kudos to you guys for putting it together and helping the way you did.”
-
Teaching beyond regular responsibilities
Faculty were able to
develop new programs (for example, a leadership program with the peer-led team learning format)
-
redesign labs
“[The SSG] didn’t cause me to teach something that I had never taught before. But clearly, I didn’t teach something differently that I have taught before.”
-
design courses that are being transferred to other disciplines
“This methodology is now standard in three courses, in the ME [mechanical engineering] department, one course in the BME [biomedical engineering] department. We’ve done it a couple of times in electrical engineering, and I think that’s continuing. So, you know, part of the goal here was to expand the reach of peer-led team learning. I think that’s occurred.”
-
expand the application of a pedagogical method to new disciplines
“I think what I found interesting there was the fact that people implement PLTL for the students in the classroom. But you get this benefit of the leaders and their improvement… . For these small number of leaders and mentors, the improvement is sometimes quite deep. And it changes their whole trajectory of their education. And that would not have occurred without SSG.”
-
stimulate reconsideration of the goals of existing STEM laboratories
“I think maybe one thing that [SSGs are] going to stimulate, and there’s been some discussion about it, is that our lab courses don’t get revised very often in this department, generally. It really opens the question, why do we teach lab courses? What do we want students to get out of the lab course? Are they there to learn a technique or are they there to see [a] demonstration of a particular phenomenon/reaction that they’ve learned about? Or what are they there to do? Because the project that SSG funded really kind of changes what the student gets out of the lab. They’re likely going to see less reactions than they will have seen in class… . They’re not necessarily going to learn the standard or an array of organic techniques. So, what does the lab do? Is the lab there to let them see what researchers like to make them enthusiastic about it? Or is it there to make them technically adept at doing something in particular? And that’s a big question because labs do need a facelift.”
-
form collaborations with new instructors in the department who teach related courses to include the pedagogical innovation component
“The idea was to have a computational component and at a minimum 25% of every course… . That was happening in other courses, but not all faculty members were equally comfortable with this in terms of implementation. They were on board with the idea but did not know how [to] do it. How do you create and assess assignments that are computational in nature and incorporate it into your classroom, lectures, and recitations and so on? So, the three of us would help them get up to speed and create, help create assignments.”
SSG activities are sustained beyond the funding timeline
-
Faculty discovered aspects of the course that are effective and plan to continue those elements in their courses.
“We have surveys that we have put together. And it helps us, helps me as an instructor to find out how students have improved their knowledge of intellectual property and other aspects of what I teach. When I first started this project, I didn’t know how to really to navigate it. And now I have a roadmap to do so. And so, I think it more or less is, to me, … like a flashlight that’s shining light ahead more than anything else.”