Introduction
Every time a new technology is introduced that affects the media’s organization, production, and distribution, journals see a surge of semi-structured interview studies,4 where researchers ask a selection of industry representatives a predetermined set of questions about how technologies like algorithms,5 metrics,6 or chatbots7 affect some aspect of professional practice. While the semi-structured interview is a valuable tool for most qualitative researchers and is a great method for soliciting perceptions, opinions, and experiences of actors in the media industries,8 it still leaves the challenge of understanding how the technology actually works.9
The whiteboard method introduced in this article remedies this limitation by (1) allowing the researcher to engage in a knowledge-producing partnership with industry representatives, where the researcher can ask unprepared, seemingly uninformed questions; (2) ushering informants into a co-creative “teaching mode,” whereby the informant and the researcher can address whatever they see as relevant; and through this interaction (3) arriving at a representation that constitutes a shared understanding that remains faithful to the informant’s perspective. While semi-structured interviews excel at the exchange of information between the researcher and informant, the whiteboard method places understanding the use of technology at the center of the exchange. The researcher here assumes an active student position, whereby the expert can explain industry and technology processes with minimal diversion from the aim, allowing for deep, detailed reflection and constant clarifications between the researcher and informant to ensure that a common understanding is reached. This is particularly useful for research where the aim is to understand a phenomenon we know little about, particularly if the technology is a novelty for the research community. This approach is furthermore especially appropriate in shaping industry encounters, whereby the whiteboard is used to invite informants to share insights that we might otherwise not (know to) ask about. In addition to providing a new method for gaining “new, valuable knowledge that otherwise would have remained unknown,”10 the whiteboard method offers a way to set a relaxed tone in industry encounters, enabling all participants to introduce more subjective, colloquial, and ad hoc aspects to the research setting. Similar to the “structured industry workshops,” the whiteboard method thus facilitates learning how practitioners engage and exchange knowledge.
In this article, we first present the challenge of understanding technological developments in the media industries by drawing on sociotechnical theory and the concept of technological literacy. Extending technological literacy beyond its frequent use in educational research,11 we argue that the idea of technological fluency is essential for media scholars to capture and understand how technology is shaping and shaped by media institutions and practices. The ensuing question here is how to create a fertile learning environment to build and enhance such technological literacy in the media research community. In response to this challenge, we introduce our novel method for building technological literacy as a joint teaching and learning process involving partnerships and the active co-creation of knowledge12 between technology experts from the media industry and researchers. The meanings of co-creative and collaborative practices in qualitative research are both varied and overlapping, often described as “messy.”13 Under the broad umbrella of participatory coproduction, co-creative methods span from action research with citizens and community members, to facilitating informants with the tools to create knowledge.14 We align ourselves with the latter approach. We understand co-creation in research as providing the space for informants to articulate the knowledge of their expertise with technology. In the words of Phillips (2011),15 “academic researchers co-produce knowledge with people with lived experience of the research topic (co-researchers) rather than carrying out research on, or communicating the results to, them” (p. 4, emphasis in original). In this sense, the proposition of the whiteboard method includes co-creative or collaborative practices that go beyond extracting knowledge from participants and instead allow them to take control of the knowledge production process, at least during the data collection process.
After outlining the theoretical foundation for our research approach, we describe the advantages and disadvantages of the established methods on which whiteboarding is built – the semi-structured elite interview, visualizations, and field notes. Next, we present the unique components of the method, starting at the level of informant recruitment, through the process of getting informants into “teaching mode,” to how to generate data with the method. We then provide two brief case study examples where the method has been successfully applied to understand technology applications in the media industries – newsroom data lakes and modular video distribution. Finally, we discuss how the method contributes to technological literacy within the sociotechnical framework before considering limitations and ethical implications.
The Methodological Challenge of Researching Media Technologies
The sociotechnical perspective understands technology as neither fully deterministic nor fully socially constructed. Instead, technology is understood as developing within social and organizational contexts, whereby the norms, routines, and cultures on the social side interact with the mechanical and computational aspects on the technical side.16 Technical standards may therefore influence practice,17 but institutional factors also shape technology acquisition and use. Beyond understanding the role of technology in the media,18 the application of sociotechnical theory also presupposes an understanding of how the engineering and design of technology work “under the hood.” In other words, sociotechnical theory relates to the concept of technological literacy. Simply put, to capture and interpret how technical standards and protocols shape practices within the media industries,19 media scholars need a certain level of technological knowledge.
The use of the term “literacy” implies a form of education about technology that is not restricted to mechanical skills or narrow forms of functional competence.20 While functional understanding of how to use a technology relevant to a task could certainly be part of technological literacy, the understanding of how technology works through analysis of its structures21 is more relevant to our application of the concept. According to Garmire and Pearson,22 technological literacy contains three dimensions: (1) knowledge (factual knowledge and conceptual understanding), (2) critical thinking and decision-making (approach to technological issues), and (3) capabilities (the use of technology to solve a problem or task). In the context of our research, the knowledge and critical thinking dimensions are particularly relevant. Similar to news workers facing challenges to understand and engage with technology that potentially compromises journalistic norms and values,23 media scholars, particularly those coming from the humanities and social sciences,24 also face “intelligibility” issues25 regarding emerging technologies. Media industries and their professionals constantly engage with and adjust to the introduction of new technology or new technical media workers, such as editorial technologists, who strive to reconcile media’s practices, norms, and values with technology’s internal mechanics. While such technical profiles have typically not been the focus of media researchers, these “strangers”26 can offer key insights into technology that improve our overall literacy. As Lischka and colleagues (2023) claim,27 these editorial technologists should be regarded as an important “community of accountable engineers of sociotechnical change” (p. 1043).
Each iteration of such technology development tends to spur a flurry of research into what this means for media practitioners and their audiences.28 For example, historical research on newspapers has studied how innovations in printing press technology influenced newsroom practices,29 satellite technology has inspired research on the globalization of media culture,30 and the introduction of the World Wide Web gave rise to a whole field of digital journalism research.31 Most of these studies are conducted by social scientists whose technical knowledge is too limited to provide insight into the actual workings of the technology itself. Expert interviews with technologists and engineers can yield significant value in this regard, providing information about specific experiences of technology usage. Many of the studies covered here circumvented this inherent limitation by using semi-structured interviews.32 However, the data obtained through this method primarily constitutes industry and practitioner perceptions of how a technology works and how it influences practice. The whiteboard approach rather complements the semi-structured interview by contributing to technological literacy through a collaborative process.
To build technological literacy, an alternative and arguably more effective approach frames researchers as learners who develop technological knowledge and critical thinking skills alongside industry representatives. While the semi-structured interview method mirrors traditional teacher-centered learning models, where knowledge is transmitted from an active teacher to a passive student, contemporary learning theories highlight the advantages of student-active and collaborative learning.33 These theories advocate for a joint learning partnership between teachers and students34 or (value) co-creation.35 In this context, we perceive technological literacy as value co-created by researchers and industry informants through a reciprocal knowledge-producing process.
Methodological Components
The whiteboard method combines a variety of qualitative methods used to interact and co-create value with experts in the media industries. Features of the interview method provide the procedure by which experts are engaged, visualizations provide the means by which interactions are carried out, photographs constitute the collected data, and field notes provide context for the analysis. These elements all contribute to the development of technological literacy.
Whiteboards
When we refer to whiteboards, we mean the white, marker-friendly version of the traditional blackboard. While we refer here to the analog whiteboard, whiteboards are also discussed in the methodology literature as an interactive application used in digital classrooms.36 Referred to as interactive whiteboards (IWBs), this strand of research considers the integration of technology to enhance pedagogy and student learning outcomes.37 In pedagogical research, whiteboard applications are described as appropriate for sensemaking exercises, to allow students to share their understanding of concepts to reach common agreement. A key advantage of whiteboarding is thus that private mental models can be translated into visuals that in turn become shared conceptual models.38 We argue here that the analog whiteboard can serve the same purpose, and that it additionally enables a pedagogical setting, whereby the informant is invited to enter a co-creative “teaching mode.”
Engaging Expert Informants
Whiteboarding can be used as an alternative or supplement to the semi-structured expert interview. The primary data collected using the whiteboard method consists of a series of pictures taken throughout the process, documenting each iteration of the visualization as it develops. Thus, the method does not require audio recording. There are two main reasons for the emphasis on documenting drawings on the whiteboard instead of audio. First, the conversation should be spontaneous and unguarded, allowing for simplifications, misunderstandings, digressions, and corrections. While voice recording has the advantage of accuracy, the presence of a recorder can also be inhibiting,39 involving issues of data capture, discomfort, and invasion of privacy.40 In our experience, expert informants may also caution their statements as they are “on the record,” creating an odd difference of tone and information during the recorded sessions and the conversations taking place before and after the recording. We believe this may limit what information can be solicited and our ability to use all the material from the session. Second, as this type of interaction consists of a lot of drawing and pointing, much of the recorded data would be relatively useless41 out of the visual context of the whiteboard. While voice recording can help build rapport with informants as the interviewer does not need to look down at their notes all the time,42 the whiteboard method presents an even greater advantage, as it invites interaction through gesturing and moving about. For projects in which recording is necessary, we suggest audiovisual solutions that can record the progress made on the whiteboard.
While the semi-structured interview is a great method to explore participants’ thoughts, feelings, and beliefs about a particular topic, it also has its limitations.43 Not only does the procedure collect large amounts of data, most of which is not used,44 but it also puts the researcher in a relatively restrained position, sometimes requiring feigned ignorance, with relatively limited space for spontaneity in the interviewee response. In the case of interviewing experts, this method furthermore usually only collects strategic discourse.45 As experts and elites are used to being asked about their opinions and perceptions, informants may try to portray themselves or their organization in a good light. Interviews can thus introduce biased self-reporting.46 Most importantly perhaps, while the semi-structured interview is not as rigid as the structured format, it is still guided by a set of preplanned questions.47 Of course, semi-structured interviews allow for open-ended responses from participants for more in-depth information, but they rely on the questions to keep informants on track. The whiteboard method in essence only has one (initial) question: “can you explain to me how x works?”
The whiteboard method thus draws upon many of the core features of the elite interview. Questions about technology usually involve only a handful of respondents available to media researchers48 – technical experts operating mostly at the backend of media production, distribution, and regulation. Where the whiteboard method departs from the elite interview is on the aspect of timing. Whereas Lilleker49 suggests that elite interviewing should occur at the very end of the research process, as you need detailed knowledge about the topic and the informant to demonstrate credibility to the interviewee (and to enable the researcher to challenge false information), the whiteboard method essentially allows conversations to take place at any point in the research process, but typically near the beginning, as it forms part of the learning process itself.
Visualizations
Visual methods are often referred to as co-creative methods,50 whereby participants become coproducers of meaning.51 Such research settings often involve think-aloud processes where participants engage in an exercise to solve some problem.52 In such cases, the researcher may participate in the production of meaning.53 Here, it is important to clarify what kinds of reproduction of the visuals are acceptable to informants and in what contexts and settings they will be used.54 It is also important to remember that translating images to verbal expression or vice versa can change the meaning and that there is a limit to the method, as not all thoughts will necessarily be expressed in such sessions.55
When visualizations are mentioned in the methods literature, most refer to the use of pictures, photos, or images as frozen objects meant to engage informants in a certain process of deliberation or reflection.56 In most cases, these visualizations are chosen by the researcher to engage participants in some co-construction of meaning.57 Hence, by using visualizations, certain analytical decisions are already made, which in turn influences the data analysis process.58 The researcher thus has a formative role in setting the parameters of the research.59 The social meanings of the modes of representation used and the cultural conventions they convey,60 therefore, need to be considered carefully, as not every participant can be encouraged to participate in such processes.61
Like established visualization methods, whiteboarding invites co-creation of meaning but has a further advantage in that it can be initiated without the use of preselected pictures or images that can frame the context or introduce bias that can shape the direction of the conversation or limit the perspectives emerging from participants. However, even when whiteboarding starts with a predefined model or sketch, the result is a unique visualization with every encounter, as the representations are drawn from the mind of the informant, providing key insight into the workings and understandings of technology in media production. For analytical purposes, the researcher documents the process and evolution of the drawing with several iterative photographs, until the informant arrives at the final version of the visualization. These incremental pictures help align the completed whiteboard drawing with the field notes and understand the experts’ step-by-step explanations.
Field Notes
Field notes are usually used for contextual information, to capture meaning beyond what is spoken or observed in the research setting.62 These notes can be taken during the research process or immediately following the interview or observation. There is thus normally a loss of data associated with field notes,63 as a key drawback of this method is accuracy,64 not least since notetaking should not disrupt the flow of conversation. Nevertheless, field notes are good for context, particularly in secondary analysis.65 Moreover, field notes protect participants from too much exposure.66 As with other kinds of identifying data, privacy protection should be applied to field notes.67
With the whiteboard method, field notes can be taken throughout the session. An advantage here is having more than one researcher in the room, as it allows for more impressions to be recorded. As there is no interview guide, field notes can also be taken more freely, often to note down questions that emerge during the whiteboarding session. On review, these notes chart how technological literacy is attained during co-creation. A key step in collecting field data with the whiteboard method is post-session notes, where the researcher or researchers summarize their understanding of the informants, visualizations, and technologies that emerged during the session. These notes can provide key insights valuable for the analytical process, particularly in comparative studies.
The Methodological Procedure
The whiteboard method consists of three distinct steps. In the following, we outline how to recruit informants, how to get them into a co-creative “teaching mode,” and how to generate data from un-recorded, collaborative sessions with industry experts.
Identifying and Recruiting Informants
Start by developing a list of possible informants. This can be a challenge in itself, as media industry professionals are typically difficult to access.68 Engineers and technicians seldom present the company publicly and thus leave little trace in industry coverage. Identifying the right function in the company can be facilitated by talking to existing contacts in the organization and then pursuing their leads. Further informants can be snowballed once whiteboarding sessions are underway. Invitations should explicitly state that meetings will not be recorded. This should increase the rate of success, as it lowers the pressure of performance on the informant. Consider the level of detail provided in the invitation. Co-creative sessions work better if informants are not overly prepared, entering with fixed assumptions or strategic goals in mind.69 Furthermore, long emails can be a deterrent for busy industry professionals.
As whiteboard sessions aim for co-creation, the process should not be limited to one-on-one situations like the typical interview. One-on-one whiteboard sessions can work well, but typically these assume the form of an across-the-table-type conversation. One-on-one whiteboarding can thus be likened more to a mentoring session, whereas a “teaching mode” is more easily attained with multiple people in the room. When approached, informants should therefore be invited to bring to the session whoever they want from the organization. This can help expand the pool of expertise reached, including expertise that could otherwise be missed. Having more people in the room can also be an advantage as it can introduce more perspectives and even provide insight into tribal terminology that opens up further inquiry, thus enhancing technological literacy. Using the whiteboard method to understand media technologies will necessarily constitute a strategic or convenience sample. Informants should be sampled until the pool of available experts is exhausted. While we also suggest aiming for data saturation, since each session can be quite different, judging when data saturation is reached is challenging. Instead of considering data saturation in terms of number of sessions, we understand data saturation as “the phase of qualitative data analysis in which the researcher has continued sampling and analyzing data until no new data appear and all concepts of the theory are well-developed” (p. 1123),70 allowing the researcher to stop data collection. In the case of the whiteboard method, because each expert will introduce context-based information, the process may have reached saturation even if new context data continues to emerge. We believe that once similar categories routinely appear in the drawings of participants, we have reached data saturation.
Getting Informants into Co-creative “Teaching Mode”
A whiteboarding exercise will be most fruitful if participants are not given too much information up front.71 There are clear advantages to meeting relatively unprepared informants, as it presents an opportunity to frame the conversation, as opposed to settings where experts come prepared with already formulated thoughts, strategies, and priorities.72 The drawback of such cold-start situations is that they require the researcher to quickly and engagingly define the problem, establish the purpose, and clarify expectations. This cold start is, however, necessary to initiate the whiteboarding, whereby the problem is defined by making initial drawings on the board. What works well in this setting is sketching out assumptions (e.g., this is how we think the technology works) or asking about relationships (e.g., how do signals travel from x to y).
Whiteboarding thus starts with the researcher in “teaching mode,” standing next to the whiteboard with a marker in hand, drawing the initial processes or relationships. As the goal here is to get informants to come to the board, it is advantageous if the first sketch is overly simplified or even wrong, as it will more easily prompt informants into “teaching mode,” to correct the mistakes made. The aim is thus to get informants to take over the teaching role and use markers to explain how the technology works in different situations, settings, and relationships, or to use the eraser to correct mistakes. When informants are successfully brought to the board, the researcher can sit back down and assume “student mode.” From this position, the researcher can ask follow-up questions continuously to clarify issues and develop the model further. Getting informants to the board will be easy in some cases and difficult in others. Not all informants will be comfortable on the board. When this happens, the researcher can stay at the board and draw up relationships as the informant responds to questions, making sure to correct mistakes and add features as the conversation progresses. Alternatively, to encourage reluctant informants to come to the board, the researcher can simply hand over the marker, signaling a transition of roles.
Data Generation
The data collected with the whiteboard method consists primarily of pictures of the illustrations and writings on the whiteboard, field notes taken during and after the sessions, and auxiliary data. Pictures should be taken throughout the entire process, to capture the iterative process by which consensus or a common understanding is attained. This is necessary so that the researcher can backtrack the evolution of the final visualization and avoid relying solely on a final, potentially chaotic picture. In cases where informants pull out premade PowerPoint slides to illustrate a point, the researcher could take pictures of the slides. Often such slides will represent established visualizations of relationships and concepts accepted within the industry or the organization, and thus present valuable data in their own right, as they reflect standardization. Informants may also refer to documents, websites, reports, and other sources, either during the session itself or during post-session email exchanges or follow-up conversations, constituting auxiliary data.
At the start of the session, the researcher should make sure consent to picture-taking is granted. To protect source anonymity, only the visualizations should be captured, leaving informants out of the frame. The image files should be stored securely in line with data protection practices. One of the key advantages of the picture material is that these can be translated into more formal representations, enhancing anonymity. Depending on the research questions, whiteboard pictures can be formalized as models, relationships, timelines, or processes representing actors, tech stacks, distribution routes, or value chains that can be systemized in a number of ways and be used for a variety of purposes, depending on the aims of the research.
Notes should be taken throughout the process. Notes are not only useful for analysis, but they can also be useful in the sessions themselves, as they allow the researcher to return to an issue for clarification, particularly when issues grow in complexity. This is where it is useful to have more than one researcher in the room, as steering the conversation requires a high degree of engagement, leaving little room for notes. Multiple note-takers also provide more nuance to the data. Note-taking can of course also be conducted during single-researcher situations, but this obviously puts more demand on the researcher. Notes should be “cleaned” immediately after the session73 and stored securely for later analysis.
As whiteboarding does not require recordings but only relies on pictures and notes as data, the sessions should be followed by an immediate debrief by the research team to discuss main impressions and takeaways. Here, it can be useful to compare with prior cases, to establish similarities or divergences. Furthermore, it provides opportunities to discuss what type of questions led to engagement or controversy, what questions were met with shrugs or indifference, and what type of questions the informants did not understand. The debrief is also useful for summarizing how the session went, the mood during different parts of the conversation, dynamics and hierarchies in the room, and potential points of particular candor or secrecy. The researcher should take additional notes during the debrief to use for later analysis.
The conversational and co-creative design of the whiteboard method means that the interaction with informants typically extends beyond the meeting itself. Informants may send additional information, or they may provide more nuance or clarification through follow-up interaction. This process is akin to collaborative notetaking or member-checking,74 whereby informants are given the opportunity to add to the material. While the time elapsed between the initial meeting and this feedback stage can constitute a burden on the informant, such a process of revisiting the material nevertheless provides agency to informants, extending co-creation beyond the actual meeting itself.
Applications of the Method
There are a number of questions about technology application in the media to which the whiteboard method could apply. It could be used at a macro level to map data flows within media companies, or to outline the development of communication technology over time. At an engineering level, whiteboarding can be useful for describing how a particular technology works, such as the difference between 4G and 5G technology in mobile communication; how Bluetooth enables the Internet of Things; what blockchain is; or what a Large Language Model does. At the level of media production, whiteboarding can be used to understand how programmatic advertising integrates with online monetization strategies; how news media content intersects with social media platforms; or how personalization algorithms and recommender systems work on a website. Basically, anything that can be modeled can be investigated using the whiteboard method. We turn now to two cases where we have used the whiteboard method to understand media technologies: newsroom data flows and modular video distribution. In both these cases, we had little a priori knowledge of how the technology actually works.
Data Flows in News Organizations
This project aimed to investigate how news organizations use data and how it flows during production processes. We knew, from experience, that rank-and-file journalists and editors are often unaware of these processes, and we needed a better overview of the flows of data from expert informants. We started by identifying and recruiting people in charge of the technical infrastructure of the news organization to participate in the study. We were thus able to access the people who buy the software or third-party services that generate, analyze, or distribute data for the entire enterprise. We presented the informants with a rudimentary model showing audience analytics software linked to the newsroom, and we asked how data “travel” across the organization. In a sense, we nudged the informants into a co-creative “teaching mode” by handing them a marker and letting them draw the data flows. As each informant proceeded to draw, not only did they show us the flows, but they also showed us different types of data, how various access parameters enabled different kinds of users to access the data, and, what we believed was most important, the technical systems that allowed for these data flows to occur.
The resulting whiteboard models displayed three distinct clusters of systems of interlocked services that used data in one way or another and then passed it along to the next system (see Figure 1). One of the informants color-coded the groups of news workers who would interact with different systems (from journalists and editors to marketing and business intelligence). In under an hour, an entire ecosystem of software materialized on the whiteboard. In our field notes, we recorded the function of each system, what type of data it used, and who used it. What emerged from these sessions is that most of these systems contribute to what the informants referred to as a “data lake,” where all the data is stored, and most of the systems both contribute data to the lake and exploit data from it. Hence, the field notes and the visualizations generated through these whiteboard sessions improved our technological literacy regarding the role and function of data in the newsroom.
The main contribution of this research was realizing how dependent news organizations are on the various software services that inform the news operation as a whole. Most importantly, this kind of insight would not have been possible through conventional interviews alone, because we had no prior idea of the technical infrastructure “under the hood.” Nor would it have been possible had we approached the usual participants of our research (journalists and editors) rather than the data workers.
Modular Video Distribution
In this project, we were primarily interested in understanding how content delivery networks (CDNs) work in video distribution. The issue emerged during relatively informal conversations with industry stakeholders representing the telecom sector, who described CDNs as an essential component in broadcasting and one that constituted a new link in the value chain of distribution. To understand what CDNs are, how they work, and the role they play in traditional broadcasting, we approached existing contacts in our national market, who led us to the relevant peering and distribution experts in the public service broadcasting, telecom, and regulatory sectors. Most of these informants were engineers who were not used to talking to social scientists. We therefore used the concept of the value chain to get started, the aim being to get informants to enter the “teaching mode.” We drew up a basic value chain consisting of content producers, platforms, distribution, hardware, and end-users. We then asked informants to identify where in the chain CDNs could be located. As conversations progressed, new elements were added to the chain while others were erased, creating a modular model in which video signals travel through multiple technologies, with CDNs turning out to be important in all of them (see Figure 2).
The main contribution of this research was the modularity that was discovered by asking questions about CDNs.75 These insights would not have been possible without the firm technical understanding of the different modes of transmission; the impact of server capacity, scalability, and traffic load in video streaming; and the architecture of CDNs that informants provided during these sessions. The whiteboard method thus led us to insights we would not otherwise have thought to ask about. The issues only emerged because we came to the meetings relatively uninformed, enabled experts to explain in detail by placing them in the position of teacher, and allowed ourselves to ask uninformed questions.
Discussion
As demonstrated by the examples, the whiteboard method provided insight into technical foundations of media production that would otherwise have remained unknown.76 Through the steps of the whiteboard method, we were thus able to attain a level of technological fluency that lent legitimacy to the research and enhanced our contribution to technological literacy within the field.77 This was possible in large part because the method enabled us to create a space for knowledge transfer and co-creation, giving informants the room to define technological understandings. This suggests that the method can serve both as an exploratory device in the early stages of research, to learn the nuances of a technology we knew only in theory, and as an explanatory device, to clarify the internal mechanisms of a technology we have seen in the field. In that sense, the method is a particularly good stepping stone to new avenues of research.
From a sociotechnical perspective,78 we were also able to better understand how technology influences practice79 and the ways in which institutional contexts shape the use of technology.80 The sociotechnical framework thus helps build a technological literacy that is not restricted to functional competences.81 The method furthermore strengthens the application of sociotechnical theory in media industries research: the intended and unintended uses of technology are not alone in shaping technology adoption; company, market, and industry contexts also matter, knowledge that would have been difficult to gain outside the avenue of technological literacy.
Using the whiteboard method to overcome “intelligibility” issues regarding emerging technologies,82 the procedure strengthens factual and conceptual knowledge by having the technology explained by experts.83 Moreover, using the whiteboard to co-create a common understanding that makes sense to the researcher while also reflecting the informant’s expert domain fosters critical thinking, which strengthens technological literacy, particularly for unpacking technological black boxes.84 This in turn can enhance capabilities and decision-making in media industries research, as it enables evaluations and recommendations regarding technology procurement, utilization, and regulation. The whiteboard method thus adds to the co-creation process85 a tangible and collaborative tool for arriving at a common understanding, lowering the bar for social science researchers to contribute to technology understanding within academic research. The method is not only practical in that it uses accessible devices such as markers and boards, notebooks and phone cameras, sidestepping intrusive recording and arduous transcription (thus also saving cost and time); whiteboarding arguably also allows researchers to get more from informants, explore unknown territories, and encounter the industry relatively uninformed.
Limitations and Ethical Considerations
Like every method, whiteboarding has its limitations. First, with the problem of not knowing a technology comes the issue of not knowing whom to ask. With an open invitation to the point of contact to include whomever they think is relevant from the organization, the researcher has little control over who will be in the room. Second, while the method saves the time of transcription, it can be time-consuming both to locate the right informants and to schedule a meeting that suits everyone’s calendar. An advantage here, however, is that the sessions can be conducted at any time during the research process, rather than at the end as with expert interviews.86 Third, while this method allows researchers to seek clarification about a particular technology, the technology may turn out to be so complex that it remains unclear how it works even after a whiteboarding session. Here, our recommendation would be to use the first session to calibrate the level of complexity sought and then adjust the level of detail requested from informants. This is where the “teaching mode” shines, as it allows for simplification and repetition without the researcher coming across as clueless.
Ethical considerations should also be accounted for when using the whiteboard method. First, images are representations87 that will inevitably include culturally shaped understandings of things such as hierarchies and power dynamics, also pertaining to “cold” technologies. Representations of technological processes can thus reflect the organizational contexts and professional norms where the technology is used. Second, informants always come with preconceptions depending on their role, background, and identity; the same applies to the researcher. Third, anonymity must be granted should informants wish to remain anonymous, and researchers should be mindful that anonymized informants may still be identifiable in concentrated industries. Re-citing or member checking is therefore crucial to respecting the privacy of research participants.
Conclusions
The whiteboard method is introduced here as an avenue for researching emerging technologies in the media industries. The advantage of the whiteboard method is that it enables co-creation in arriving at a shared understanding with informants about how a technology works. Furthermore, it fosters an open setting for knowledge sharing, providing new insights engendered by asking unprepared and uninformed questions as issues emerge. This co-creative process is enabled by the whiteboard, used to bring informants into “teaching mode,” the result of which is a dataset consisting of iterative pictures of co-creation as it evolves, arriving finally at a representation that remains true to the perspective of informants. The method is particularly useful for researchers with a background in the social sciences or humanities who have little formal training in engineering or coding. The contribution of the whiteboard method is thus to expand the methodological toolbox available to researchers as the media industries face new technological challenges, strengthening technological literacy in the field.
The whiteboard method is useful for learning “how things work.” Further uses of the method could include research on back-end technological systems and the paths that data travels from the newsroom to the user. In the newsroom and at the management level, the whiteboard method could be used to map out corporate infrastructures such as content management systems, dashboards, and tech stacks, as well as third-party reliance on websites and social media. The method could also be useful in understanding the integration of artificial intelligence, automation, and personalization during various stages of production and distribution. Further research could also test the application of the method beyond technological literacy to use whiteboarding sessions to map out industry relationships and historical processes.
Notes
- Helle Sjøvaag is Professor of Journalism at the University of Stavanger, Norway. Her research focuses on media economics, digital infrastructures, and datafication and regulation of the communication industries. Her most recent book is The Markets for News: Enduring Structures in the Age of Business Model Disruptions (Routledge, 2022). ⮭
- Ragnhild Kr. Olsen is Associate Professor in the Department of Journalism and Media Studies at Oslo Metropolitan University. Her research interests are digitization and value creation in journalism, particularly in local journalism, as well as media innovation, audience perspectives in editorial priorities, and platformization of news media. ⮭
- Raul Ferrer-Conill is Professor of Journalism in the Department of Media and Social Sciences at the University of Stavanger, Norway. His research focuses on digital journalism, media engagement, and sociotechnical processes of datafication. He is currently Chair of the Journalism Studies Division of NordMedia. Ferrer-Conill received the Bob Franklin Journal Article Award in 2021 and 2024. ⮭
- Steen Steensen and Oscar Westlund, What Is Digital Journalism Studies? (Taylor & Francis, 2021). ⮭
- Taina Bucher, “ ‘Machines Don’t Have Instincts’: Articulating the Computational in Journalism,” New Media & Society 19, no. 6 (2017): 918–33. https://doi.org/10.1177/1461444815624182 ⮭
- Caitlin Petre, The Traffic Factories: Metrics at Chartbeat, Gawker Media, and the New York Times (Tow Center for Digital Journalism, 2015). Columbia Journalism Review. ⮭
- Valerie Belair-Gagnon, Seth C. Lewis, and Colin Agur, “Failure to Launch: Competing Institutional Logics, Intrapreneurship, and the Case of Chatbots,” Journal of Computer-Mediated Communication 25, no. 4 (2020): 291–306. https://doi.org/10.1093/jcmc/zmaa008 ⮭
- Zvi Reich and Aviv Barnoy, “Reconstructing Production Practices Through Interviewing,” in The SAGE Handbook of Digital Journalism, ed. Tamara Witschge, Alfred Hermida, David Domingo, and Chris W. Anderson (Sage, 2016), 477–93. ⮭
- Taina Bucher, “Neither Black Nor Box: Ways of Knowing Algorithms,” in Innovative Methods in Media and Communication Research, ed. Sebastian Kubitschko and Anne Kaun (Palgrave Macmillan, 2016), 81–98. ⮭
- Steensen and Westlund, What Is Digital Journalism Studies? 101. ⮭
- Eugene Judson, “Improving Technology Literacy: Does It Open Doors to Traditional Content?” Educational Technology Research and Development 58 (2010): 271−84. https://doi.org/10.1007/s11423-009-9135-8 ⮭
- Alison Cook-Sather, “Listening to Equity-seeking Perspectives: How Students’ Experiences of Pedagogical Partnership Can Inform Wider Discussions of Student Success,” Higher Education Research and Development 37, no. 5 (2018): 923–36. https://doi.org/10.1080/07294360.2018.1457629 ⮭
- Tania Pearce, Myfanwy Maple, Kathy McKay, Anthony Shakeshaft, and Sarah Wayland, “Co-creation of New Knowledge: Good Fortune or Good Management?” Research Involvement and Engagement 8, no. 1 (2022): 65. https://doi.org/10.1186/s40900-022-00394-2 ⮭
- María Soledad Ramírez Montoya and Francisco José García-Peñalvo, “Co-creation and Open Innovation: Systematic Literature Review,” Comunicar: Revista Científica de Comunicación y Educación 26, no. 54 (2018): 9–18. https://doi.org/10.3916/C54-2018-01 ⮭
- Louise Phillips, Embracing the Messy Complexities of Co-Creation: A Dialogic Approach to Participatory Qualitative Inquiry (Taylor & Francis, 2024). ⮭
- Paul N. Edwards, Geoffrey C. Bowker, Steven J. Jackson, and R. Williams, “Introduction: An Agenda for Infrastructure Studies,” Journal of the Association for Information Systems 10, no. 5 (2009): 365–74. AISeL. ⮭
- Geoffrey C. Bowker and Susan Leigh Star, “Invisible Mediators of Action: Classification and the Ubiquity of Standards,” Mind, Culture, and Activity 7 nos. 1–2 (2000): 147–63. ⮭
- Seth C. Lewis and Oscar Westlund, “Actors, Actants, Audiences, and Activities in Cross-media News Work: A Matrix and a Research Agenda,” Digital Journalism 3 no. 1 (2015): 19–37. https://doi.org/10.1080/21670811.2014.927986 ⮭
- Eddy Borges-Rey, “Data Journalism as a Platform: Architecture, Agents, Protocols,” in The Routledge Handbook of Developments in Digital Journalism Studies, ed. Scott Eldridge II and Bob Franklin (Routledge, 2018), 284–95. ⮭
- David Buckingham, “Creative Visual Methods in Media Research: Possibilities, Problems and Proposals,” Media, Culture & Society 31, no. 4 (2009): 633–52. https://doi.org/10.1177/0163443709335280 ⮭
- David Richard Moore, “Technology Literacy: The Extension of Cognition,” International Journal of Technology and Design Education 21 (2011): 185–93. Springer. ⮭
- Elsa Garmire and Greg Pearson, eds., Tech Tally: Approaches to Assessing Technological Literacy (National Academies Press, 2006). ⮭
- Bronwyn Jones, Rhianne Jones, and Ewa Luger, “AI ‘Everywhere and Nowhere’: Addressing the AI Intelligibility Problem in Public Service Journalism,” Digital Journalism 10, no. 10 (2022): 1731–55. https://doi.org/10.1080/21670811.2022.2145328 ⮭
- Michael Karlsson and Helle Sjøvaag, “Introduction: Research Methods in an Age of Digital Journalism,” in Rethinking Research Methods in an Age of Digital Journalism, ed. Michael Karlsson and Helle Sjøvaag (Routledge, 2018), 1–7. ⮭
- Jones et al., “AI ‘Everywhere and Nowhere’.” ⮭
- Avery E. Holton and Valerie Belair−Gagnon, “Strangers to the Game? Interlopers, Intralopers, and Shifting News Production,” Media and Communication 6, no. 4. (2018): 70–8. https://doi.org/10.17645/mac.v6i4.1490 ⮭
- Juliane A. Lischka, Nadja Schaetz, and Anna−Lena Oltersdorf, “Editorial Technologists as Engineers of Journalism’s Future: Exploring the Professional Community of Computational Journalism,” Digital Journalism 11, no. 6 (2023): 1026–44. https://doi.org/10.1080/21670811.2021.1995456 ⮭
- Steen Steensen, Anna M. Grøndahl Larsen, Yngve B. Hågvar, and Birgitte Kjos Fonn, “What Does Digital Journalism Studies Look Like?” in Definitions of Digital Journalism (Studies), ed. Scott A. Eldridge II, Krissy Hess, Edson C. Tandoc Jr., and Oscar Westlund (Routledge, 2021), 7–29. ⮭
- Claes Thorén, Pär J. Ågerfalk, and Mats Edenius, “Through the Printing Press: An Account of Open Practices in the Swedish Newspaper Industry,” Journal of the Association for Information Systems 15, no. 11 (2014): 779–804. http://doi.org/10.17705/1jais.00379 ⮭
- Lisa Parks and James Schwoch, ed., Down to Earth: Satellite Technologies, Industries, and Cultures (Rutgers University Press, 2012). ⮭
- Michael Karlsson and Christer Clerwall, “Patterns and Origins in the Evolution of Multimedia on Broadsheet and Tabloid News Sites: Swedish Online News 2005–2010,” Journalism Studies 13 no 4 (2012): 550–65. https://doi.org/10.1080/1461670X.2011.639571 ⮭
- Carl C. Okafor, Raul Ferrer-Conill and Helle Sjøvaag, “Dimensions of data quality in smart cities datafication,” The Information Society 41, no. 5 (2025): 315–326. https://doi.org/10.1080/01972243.2025.2523249. ⮭
- Kristin Børte, Katrine Nesje, and Sølvi Lillejord, “Barriers to Student Active Learning in Higher Education,” Teaching in Higher Education 28, no. 3 (2020): 597–615. https://doi.org/10.1080/13562517.2020.1839746 ⮭
- Alison Cook−Sather, Catherine Bovill, and Peter Felten, Engaging Students as Partners in Learning and Teaching: A Guide for Faculty (John Wiley & Sons, 2014). ⮭
- Mollie Dollinger, Jason Lodge, and Hamish Coates, “Co-creation in Higher Education: Towards a Conceptual Model,” Journal of Marketing for Higher Education 28, no. 2 (2018): 210–31. https://doi.org/10.1080/08841241.2018.1466756 ⮭
- Colleen Megowan-Romanowicz, “Whiteboarding: A Tool for Moving Classroom Discourse from Answer-making to Sense-making,” The Physics Teacher 54, no. 2 (2016): 83–6. https://doi.org/10.1119/1.4940170 ⮭
- Derek Glover, David Miller, Doug Averis, and Victoria Door, “The Interactive Whiteboard: A Literature Survey,” Technology, Pedagogy and Education 14, no. 2 (2005): 155–70. https://doi.org/10.1080/14759390500200199 ⮭
- Megowan-Romanowicz, “Whiteboarding.” ⮭
- Darren G. Lilleker, “Interviewing the Political Elite: Navigating a Potential Minefield,” Politics 23, no. 3 (2003): 207–14. https://doi.org/10.1111/1467-9256.00198 ⮭
- Peter Williams, “The ‘Collaborative Personal Statement’: A More Inclusive Method of Data-Gathering Than Audio Recording Interviews with Vulnerable People,” European Journal of Special Needs Education 35, no. 4 (2020): 466–81. ⮭
- Julia Phillippi and Jana Lauderdale, “A Guide to Field Notes for Qualitative Research: Context and Conversation,” Qualitative Health Research 28, no. 3 (2018): 381–88. https://doi.org/10.1177/1049732317697102 ⮭
- Virginia Braun and Victoria Clarke, Successful Qualitative Research: A Practical Guide for Beginners (Sage, 2013). ⮭
- Anna Potter, “Managing Productive Academia/Industry Relations: The Interview as Research Method,” Media Practice and Education 19, no. 2 (2018): 159–72. https://doi.org/10.1080/25741136.2018.1464716 ⮭
- Phillippi and Lauderdale, “A Guide to Field Notes for Qualitative Research.” ⮭
- David Craig, “Breaking into Hollywood: Studying Media Producers,” in The Routledge Companion to Media Industries, ed. Paul McDonald (Routledge, 2022), 478–89. ⮭
- Elizabeth Dubois and Heather Ford, “Qualitative Political Communication| Trace Interviews: An Actor-Centered Approach,” International Journal of Communication 9 (2015): 267–91. Ijoc.org ⮭
- Svend Brinkmann, “Unstructured and Semi-structured Interviewing,” in The Oxford Handbook of Qualitative Research, ed. Patricia Leavy, 2nd ed. (Oxford University Press, 2014), 277–99. ⮭
- Christian Herzog and Christopher Ali, “Elite Interviewing in Media and Communications Policy Research,” International Journal of Media & Cultural Politics 11, no. 1 (2015): 37–54. Intellect Ltd. ⮭
- Lilleker, “Interviewing the Political Elite.” ⮭
- Ken Anderson et al., “Numbers Have Qualities Too: Experiences with Ethno-mining,” Ethnographic Praxis in Industry Conference Proceedings no. 1 (2009): 123–40. Wiley.com; Dubois and Ford, “Qualitative Political Communication| Trace Interviews.” ⮭
- Christina Tatham-Fashanu, “Enhancing Participatory Research with Young Children Through Comic-illustrated Ethnographic Field Notes,” Qualitative Research 23, no. 6 (2022). https://doi.org/10.1177/146879412211101 ⮭
- Werner Wirth, Susanne Wolf, Ursina Mögerle, and Saskia Böcking, “Measuring the Subjective Experience of Presence with Think-Aloud Method: Theory, Instruments, Implications,” Proceedings of the Seventh Annual International Workshop on Presence (2004): 351–58. ⮭
- Tatham-Fashanu, “Enhancing Participatory Research with Young Children Through Comic-illustrated Ethnographic Field Notes.” ⮭
- Helen Pain, “A Literature Review to Evaluate the Choice and Use of Visual Methods,” International Journal of Qualitative Methods 11, no. 4 (2012): 303–19. https://doi.org/10.1177/16094069120110040 ⮭
- Wirth et al., “Measuring the Subjective Experience of Presence with Think-Aloud Method.” ⮭
- Luc Pauwels, “Visual Sociology Reframed: An Analytical Synthesis and Discussion of Visual Methods in Social and Cultural Research,” Sociological Methods & Research 38, no. 4 (2010): 545–81. https://doi.org/10.1177/004912411036623 ⮭
- Anderson et al., “Numbers Have Qualities Too.” ⮭
- Dubois and Ford, “Qualitative Political Communication| Trace Interviews.” ⮭
- Buckingham, “Creative Visual Methods in Media Research.” ⮭
- Pauwels, “Visual Sociology Reframed.” ⮭
- Pain, “A Literature Review to Evaluate the Choice and Use of Visual Methods.” ⮭
- Phillippi and Lauderdale, “A Guide to Field Notes for Qualitative Research.” ⮭
- Williams, “The ‘Collaborative Personal Statement’.” ⮭
- Lilleker, “Interviewing the Political Elite.” ⮭
- Phillippi and Lauderdale, “A Guide to Field Notes for Qualitative Research.” ⮭
- Williams, “The ‘Collaborative Personal Statement’.” ⮭
- Phillippi and Lauderdale, “A Guide to Field Notes for Qualitative Research.” ⮭
- Potter, “Managing Productive Academia/Industry Relations.” ⮭
- Craig, “Breaking into Hollywood.” ⮭
- Janice M. Morse, “Theoretical Saturation,” Encyclopedia of Social Science Research Methods 3 (2004): 1122–3. ⮭
- Anderson et al., “Numbers Have Qualities Too.” ⮭
- Dubois and Ford, “Qualitative Political Communication Trace Interviews.” ⮭
- Lilleker, “Interviewing the Political Elite.” ⮭
- Williams, “The ‘Collaborative Personal Statement’.” ⮭
- Helle Sjøvaag, Ragnhild Kr. Olsen, and Raul Ferrer-Conill, “Delivering content: Modular broadcasting technology and the role of content delivery networks,” Telecommunications Policy 48, no. 4 (2024), https://doi.org/10.1016/j.telpol.2024.102738. ⮭
- Steensen and Westlund, “What Is Digital Journalism Studies?” ⮭
- Sjøvaag, Olsen and Ferrer-Conill, “Delivering content.” ⮭
- Edwards et al., “Introduction: An Agenda for Infrastructure Studies.” ⮭
- Bowker and Star, “Invisible Mediators of Action.” ⮭
- Lewis and Westlund, “Actors, Actants, Audiences, and Activities in Cross-media News Work.” ⮭
- Buckingham, “Creative Visual Methods in Media Research.” ⮭
- Jones et al., “AI ‘Everywhere and Nowhere’.” ⮭
- Garmire and Pearson, Tech Tally: Approaches to Assessing Technological Literacy. ⮭
- Bucher, “Neither Black Nor Box: Ways of Knowing Algorithms.” ⮭
- Dollinger et al., “Co-creation in Higher Education.” ⮭
- Lilleker, “Interviewing the Political Elite.” ⮭
- Dubois and Ford, “Qualitative Political Communication| Trace Interviews.” ⮭
Bibliography
Anderson, Ken, Dawn Nafus, Tye Rattenbury, and R. Ryan Aipperspach. “Numbers Have Qualities Too: Experiences with Ethno-mining.” Ethnographic Praxis in Industry Conference Proceedings no. 1 (2009): 123–40. Wiley.com.
Belair-Gagnon, Valerie, Seth C. Lewis, and Colin Agur. “Failure to Launch: Competing Institutional Logics, Intrapreneurship, and the Case of Chatbots.” Journal of Computer-Mediated Communication 25, no. 4 (2020): 291–306. https://doi.org/10.1093/jcmc/zmaa008
Borges-Rey, Eddy. “Data Journalism as a Platform: Architecture, Agents, Protocols.” In The Routledge Handbook of Developments in Digital Journalism Studies, edited by Scott Eldridge II and Bob Franklin, 284–95. Routledge, 2018.
Børte, Kristin, Katrine Nesje, and Sølvi Lillejord. “Barriers to Student Active Learning in Higher Education.” Teaching in Higher Education 28, no. 3 (2020): 597–615. https://doi.org/10.1080/13562517.2020.1839746
Bowker, Geoffrey C., and Susan Leigh Star. “Invisible Mediators of Action: Classification and the Ubiquity of Standards.” Mind, Culture, and Activity 7, nos. 1–2 (2000): 147–63. https://doi.org/10.1080/10749039.2000.9677652
Braun, Virginia and Victoria Clarke. Successful Qualitative Research: A Practical Guide for Beginners. Sage, 2013.
Brinkmann, Svend. “Unstructured and Semi-structured Interviewing.” In The Oxford Handbook of Qualitative Research Second Edition, edited by Patricia Leavy, 277–99. Oxford University Press, 2014.
Bucher, Taina. “Neither Black Nor Box: Ways of Knowing Algorithms.” In Innovative Methods in Media and Communication Research, edited by Sebastian Kubitschko and Anne Kaun, 81–98. Palgrave Macmillan, 2016.
Bucher, Taina. “ ‘Machines Don’t Have Instincts’: Articulating the Computational in Journalism.” New Media & Society 19, no. 6 (2017): 918–33. https://doi.org/10.1177/1461444815624182
Buckingham, David. “Creative Visual Methods in Media Research: Possibilities, Problems and Proposals.” Media, Culture & Society 31, no. 4 (2009): 633–52. https://doi.org/10.1177/0163443709335280
Buckingham, David. “Defining Digital Literacy: What Do Young People Need to Know About Digital Media?” Nordic Journal of Digital Literacy 10 (2015): 21–35. Idunn.no.
Cook-Sather, Alison, Catherine Bovill, and Peter Felten. Engaging Students as Partners in Learning and Teaching: A Guide for Faculty. John Wiley & Sons, 2014.
Cook-Sather, Alison. “Listening to Equity-seeking Perspectives: How Students’ Experiences of Pedagogical Partnership Can Inform Wider Discussions of Student Success.” Higher Education Research and Development 37, no. 5 (2018): 923–36. https://doi.org/10.1080/07294360.2018.1457629
Craig, David. “Breaking into Hollywood: Studying Media Producers.” In The Routledge Companion to Media Industries, edited by Paul McDonald, 478–89. Routledge, 2022.
Dubois, Elizabeth, and Heather Ford. “Qualitative Political Communication| Trace Interviews: An Actor-centered Approach.” International Journal of Communication 9 (2015): 267–91. Ijoc.org
Dollinger, Mollie, Jason Lodge, and Hamish Coates. “Co-creation in Higher Education: Towards a Conceptual Model.” Journal of Marketing for Higher Education 28, no. 2 (2018): 210–31. https://doi.org/10.1080/08841241.2018.1466756
Edwards, Paul N., Geoffrey C. Bowker, Steven J. Jackson, and R. Williams. “Introduction: An Agenda for Infrastructure Studies.” Journal of the Association for Information Systems 10, no. 5 (2009): 365–74. AISeL.
Garmire, Elsa, and Greg Pearson, eds. Tech Tally: Approaches to Assessing Technological Literacy. National Academies Press, 2006.
Glover, Derek, David Miller, Doug Averis, and Victoria Door. “The Interactive Whiteboard: A Literature Survey.” Technology, Pedagogy and Education 14, no. 2 (2005): 155–70. https://doi.org/10.1080/14759390500200199
Herzog, Christian, and Christopher Ali. “Elite Interviewing in Media and Communications Policy Research.” International Journal of Media & Cultural Politics 11, no. 1 (2015): 37–54. Intellect Ltd.
Holton, Avery E., and Valerie Belair-Gagnon. “Strangers to the Game? Interlopers, Intralopers, and Shifting News Production.” Media and Communication 6, no. 4 (2018): 70–8. https://doi.org/10.17645/mac.v6i4.1490
Jones, Bronwyn, Rhianne Jones, and Ewa Luger. “AI ‘Everywhere and Nowhere’: Addressing the AI Intelligibility Problem in Public Service Journalism.” Digital Journalism 10, no. 10 (2022): 1731–55. https://doi.org/10.1080/21670811.2022.2145328
Judson, Eugene. “Improving Technology Literacy: Does It Open Doors to Traditional Content?” Educational Technology Research and Development 58 (2010): 271–84. https://doi.org/10.1007/s11423-009-9135-8
Lischka, Juliane A., Nadja Schaetz, and Anna-Lena Oltersdorf. “Editorial Technologists as Engineers of Journalism’s Future: Exploring the Professional Community of Computational Journalism.” Digital Journalism 11, no. 6 (2023): 1026–44. https://doi.org/10.1080/21670811.2021.1995456
Karlsson, Michael, and Christer Clerwall. “Patterns and Origins in the Evolution of Multimedia on Broadsheet and Tabloid News Sites: Swedish Online News 2005–2010.” Journalism Studies 13, no. 4 (2012): 550–65. https://doi.org/10.1080/1461670X.2011.639571
Morse, Janice M. “Theoretical Saturation.” Encyclopedia of Social Science Research Methods 3 (2004): 1122–3.
Konrad, Kornelia, and Knud Böhle. “Socio-technical Futures and the Governance of Innovation Processes: An Introduction to the Special Issue.” Futures 109 (2019): 101–7. https://doi.org/10.1016/j.futures.2019.03.003
Lewis, Seth C., and Oscar Westlund. “Actors, Actants, Audiences, and Activities in Cross-media News Work: A Matrix and a Research Agenda.” Digital Journalism 3, no. 1 (2015): 19–37. https://doi.org/10.1080/21670811.2014.927986
Lilleker, Darren G. “Interviewing the Political Elite: Navigating a Potential Minefield.” Politics 23, no. 3 (2003): 207–14. https://doi.org/10.1111/1467-9256.00198
Megowan-Romanowicz, Colleen. “Whiteboarding: A Tool for Moving Classroom Discourse from Answer-making to Sense-making.” The Physics Teacher 54, no. 2 (2016): 83–6. https://doi.org/10.1119/1.4940170
Moore, David R. “Technology Literacy: The Extension of Cognition.” International Journal of Technology and Design Education 21 (2011): 185–93. Springer.
Pain, Helen. “A Literature Review to Evaluate the Choice and Use of Visual Methods.” International Journal of Qualitative Methods 11, no. 4 (2012): 303–19. https://doi.org/10.1177/16094069120110040
Parks, Lisa, and James Schwoch, eds. Down to Earth: Satellite Technologies, Industries, and Cultures. Rutgers University Press, 2012.
Pauwels, Luc. “Visual Sociology Reframed: An Analytical Synthesis and Discussion of Visual Methods in Social and Cultural Research.” Sociological Methods & Research 38, no. 4 (2010): 545–81. https://doi.org/10.1177/004912411036623
Pearce, Tania, Myfanwy Maple, Kathy McKay, Anthony Shakeshaft, and Sarah Wayland. “Co-creation of New Knowledge: Good Fortune or Good Management?” Research Involvement and Engagement 8, no. 1 (2022): 65. https://doi.org/10.1186/s40900-022-00394-2
Petre, Caitlin. The Traffic Factories: Metrics at Chartbeat, Gawker Media, and the New York Times. Tow Center for Digital Journalism / Columbia Journalism Review, 2015. https://www.cjr.org/tow_center_reports/the_traffic_factories_metrics_at_chartbeat_gawker_media_and_the_new_york_times.php
Phillippi, Julia, and Jana Lauderdale. “A Guide to Field Notes for Qualitative Research: Context and Conversation.” Qualitative Health Research 28, no. 3 (2018): 381–8. https://doi.org/10.1177/1049732317697102
Phillips, Louise. Embracing the Messy Complexities of Co-Creation: A Dialogic Approach to Participatory Qualitative Inquiry. Taylor & Francis, 2024.
Potter, Anna. “Managing Productive Academia/Industry Relations: The Interview as Research Method.” Media Practice and Education 19, no. 2 (2018): 159–72. https://doi.org/10.1080/25741136.2018.1464716
Ramírez Montoya, María Soledad, and Francisco José García-Peñalvo. “Co-creation and Open Innovation: Systematic Literature Review.” Comunicar: Revista Científica de Comunicación y Educación 26, no. 54 (2018): 9–18. https://doi.org/10.3916/C54-2018-01
Redvall, Eva Novrup, and Inge Ejbye Sørensen. “Structured Industry Workshops as Methodology: Researching National Screen Agencies and Policies.” Media Industries 8, no. 1 (2021): 47–66. https://doi.org/10.3998/mij.94
Reich, Zvi, and Aviv Barnoy. “Reconstructing Production Practices through Interviewing.” In The SAGE Handbook of Digital Journalism, edited by Tamara Witschge, Alfred Hermida, David Domingo, and Chris W. Anderson, 477–93. Sage, 2016.
Steensen, Steen, and Oscar Westlund. What Is Digital Journalism Studies? Taylor & Francis, 2021.
Steensen, Steen, Anna M. Grøndahl Larsen, Yngve B. Hågvar, and Birgitte Kjos Fonn, “What Does Digital Journalism Studies Look Like?” In Definitions of Digital Journalism (Studies), edited by Scott A. Eldridge II, Krissy Hess, Edson C. Tandoc Jr., and Oscar Westlund, 7–29. Routledge, 2021.
Tatham-Fashanu, Christina. “Enhancing Participatory Research with Young Children Through Comic-illustrated Ethnographic Field Notes.” Qualitative Research 23, no. 6 (2022). https://doi.org/10.1177/146879412211101
Thorén, Claes, Pär J. Ågerfalk, and Mats Edenius. “Through the Printing Press: An Account of Open Practices in the Swedish Newspaper Industry.” Journal of the Association for Information Systems 15, no. 11 (2014): 779–804. http://doi.org/10.17705/1jais.00379
Williams, Peter. “The ‘Collaborative Personal Statement’: A More Inclusive Method of Data-gathering Than Audio Recording Interviews with Vulnerable People.” European Journal of Special Needs Education 35, no. 4 (2020): 466–81. https://doi.org/10.1080/08856257.2019.1706256
Wirth, Werner, Susanne Wolf, Ursina Mögerle, and Saskia Böcking. “Measuring the Subjective Experience of Presence with Think-aloud Method: Theory, Instruments, Implications.” Proceedings of the Seventh Annual International Workshop on Presence (2004): 351–58.


