This lesson plan outlines an interactive workshop on how algorithms and algorithmic bias influence the information and choices that are presented, or not presented, to individuals in online environments. The workshop also explores the implications of algorithmic bias in our everyday lives and society and its disproportionate impact on historically marginalized groups. The workshop is founded on the understanding that technology is not neutral and that algorithmic awareness and more critical use of digital technologies are essential starting points for counteracting the negative effects of algorithmic bias and advocating for more ethical information systems.
This workshop addresses several literacies and competencies that are relevant to various information and digital literacy frameworks. These literacies and competencies include the ability to:
Recognize that technologies and algorithms are not neutral and often perpetuate and exacerbate social inequities in everyday life and society more broadly.
Recognize contexts and scenarios in which algorithmic bias influences the information and choices presented to different individuals and groups in different online environments, as well as the potential impacts this may have on those individuals and groups.
Apply algorithmic awareness to recognizing and counteracting the negative effects of algorithmic bias, including through critical evaluation of search results, use and comparison of various search tools, and modification of privacy settings within digital tools.
Recognize algorithmic bias as a systemic issue that calls for approaching algorithmic awareness and justice as both individual and collective efforts.
For individuals using the ACRL Framework for Information Literacy for Higher Education, these literacies and competencies can be tied to many of its conceptual frames and to those frames’ related knowledge practices and dispositions. These frames include “Authority Is Constructed and Contextual,” “Information Has Value,” and “Searching as Strategic Exploration.” (See Appendix 1: ACRL Framework for more details.)
This workshop is intended for any adult audience in the United States and Canada. Though it is most likely to be offered in higher education contexts, the workshop is also relevant to the general public and can be adapted for international audiences. The workshop can also be adapted for high school or middle/junior high school students.
Though there is no absolute limit to the number of workshop participants, the workshop facilitator should consider their intended audience and the level of interactivity they hope to have. Smaller groups tend to lend themselves to greater interactivity than do larger groups, though this is not always the case.
The workshop was originally created as a synchronous online workshop, conducted via Zoom and using a shared Google Doc. It has been offered since spring 2020 as part of the Rowan University Libraries workshop program, which is open to all of the Rowan University community (students, faculty, staff, and alumni) and the general public. Because the topics of algorithmic bias and algorithmic awareness affect virtually everyone in their personal, professional, and academic lives, the workshop material is relevant to all disciplines and professions as well as to the general public.
This workshop was designed by Andrea Baer. Rowan University Libraries’ Workshop Planning Committee and library faculty and staff have promoted the workshop program.
A collaborative writing tool, such as Google Docs or Etherpad.
An internet-connected computer for each participant.
For online workshops, an online conference program like Zoom, WebEx, or Google Meet.
The choice to record or not record an online workshop: Due to the interactive nature of the workshop, the workshop creator has chosen not to record past workshops. Though some workshop registrants who could not attend the synchronous session have been interested in receiving a recording, the workshop creator believes that participants engage more actively because the workshop is not recorded. Not recording the session also appears to have incentivized individuals to attend the live session. Other workshop facilitators may choose to record a session depending on their audience, priorities, and other contextual factors.
Use of collaborative digital writing tools: Whether facilitating the workshop online or in person, the workshop presenter uses a shared Google Doc as a tool for introducing resources and activities and inviting participant engagement. So that attendees can actively engage, it is recommended that they have access to an internet-connected computer throughout the workshop. In both online and in-person workshops, the facilitator allows time for participants to respond to discussion prompts first by adding content to the Google Doc and then through synchronous discussion.
For those who prefer not to use Google Docs, an alternative collaborative writing tool, such as Etherpad, can be used instead. The choice of a collaborative writing tool could also be a point for discussion. The workshop facilitator has used Google Docs because of the ease with which content can be created and shared. However, it is worth considering the tradeoffs that come with using any given digital tool (e.g., privacy and security) and how the tools one chooses to use may vary depending on the purpose and context.
The workshop is relevant across and beyond academic disciplines, and the choice of content and activities can be modified for different audiences and teaching contexts (e.g., by using examples that are relevant to a specific community, profession, or area of study). The workshop template includes additional activities and resources that can be integrated into an expanded workshop or can replace the featured activities.
The workshop can be expanded with an additional session focusing more closely on algorithmic bias manifested within searches. Such a workshop can include a closer evaluation of search results and search result rankings; a comparison of search results and rankings in different digital platforms or when using different personalization and privacy settings within the same platform; and a discussion about possible reasons for differences in search results and ranking. Such a workshop can incorporate some of the activities that are referenced in the workshop template (for example, see the sections “Other Activities to Try” and “More Food for Thought”).
The outcomes below mirror many of the literacies and competencies that are listed in the “Literacies & Competencies” section above. The learning outcomes describe more concrete and observable behaviors through which workshop participants can demonstrate their learning.
Reflect on the roles of algorithms and algorithmic bias in our everyday lives and more broadly in our society.
Identify contexts and scenarios in which algorithmic bias may influence the information or the choices presented to you in a given online environment.
Identify and apply simple strategies for recognizing and counteracting the negative effects of algorithmic bias, including modifying personalization settings, using privacy tools, and practicing “click restraint” when reviewing search results.
Recognize algorithmic bias as a systemic issue that calls for approaching algorithmic awareness and justice as both individual and collective efforts.
Possible outcomes for an expanded workshop: If this workshop is expanded to include a focus on algorithmic bias in search results (see “Curricular Context” above), additional outcomes will include the ability to:
Evaluate the relevance and credibility of search results, including search result rankings.
Compare search results and rankings for a search done in different search engines (e.g., Google, Bing, DuckDuckGo).
Compare search results and rankings for a search that is done when logged into a Google account and when logged out of that Google account.
Articulate possible reasons for search result rankings for a given search.
Identify privacy settings within different browsers and search engines (e.g., Chrome and Firefox browsers, Google and DuckDuckGo search engines).
1. Create a new collaborative document that includes the contents from the workshop template (see Materials section below).
Be sure to include the Creative Commons license attribution, which is in the footer of the Google Doc.
Modify the copied template for the specifics of the given workshop (e.g., date, any desired modifications to content or activities).
If using Google Docs, click the Share button in the upper right of the document to give workshop participants editing privileges to the shared document. Under “Get link,” change the settings to “Anyone with the link” with “Editor” privileges.
The workshop facilitator may want to consider or discuss with workshop participants whether to share the collaborative document outside of the group or to keep it internal to the group. Different groups may have different preferences, depending on their individual or collective goals.
Recommendation: After the workshop facilitator updates the workshop template to fit their workshop preferences, it is recommended that they keep an unshared version of the workshop template for their own files and make another copy of the document that will be shared with participants and to which participants will have editing privileges. (The private master copy of the workshop template can then be accessed if workshop participants inadvertently make unwanted changes to the shared document.)
2. The workshop facilitator should recommend that participants have access to a computer during the workshop to engage in activities fully and contribute to the shared document. Contributing to the shared document may be more difficult if accessing it with a smaller device like a tablet or smartphone.
Recommendation: If conducting the workshop in person, the facilitator should display the shared document on a presentation screen or monitor. If conducting the workshop virtually, the facilitator may benefit from having two computer monitors: one on which to view the online conference platform (e.g., Zoom) and one on which to view the shared document.
Participant learning can be assessed by observing their contributions to the shared document and discussions throughout the workshop. The workshop facilitator might further assess learning through a post-workshop survey that includes short, open-ended prompts like the following:
Please share one valuable thing that you learned from this workshop.
What relevance does algorithmic awareness have to your life?
Please share 1-2 strategies you can use to counteract the negative effects of algorithmic bias on your and/or others’ online experiences.
What remaining questions do you have about algorithmic bias awareness?
What aspect of the workshop contributed most to your learning?
What would you change about the workshop?
Because this workshop was designed to be optional and stand-alone rather than instruction embedded into a course or program, assessment of participant learning occurs primarily within the confines of the workshop itself. Participants also receive a follow-up survey, through which they can provide feedback on their workshop experience.
This workshop has evolved over time with modifications made in light of participants’ engagement and discussion, the facilitator’s ongoing reflection on the materials and activities, and the facilitator’s continued learning about algorithmic awareness and algorithmic justice. During and after each workshop, the facilitator has revised the template based on the success of each session and what they observe as ways to strengthen activities and materials.
The workshop structure is intentionally flexible to allow time for richer participant discussion and interaction. When preparing for the workshop, facilitators might make notes about activities that are most essential for their audience and activities that might be omitted to allow for further discussion during another activity. Time estimates for activities can also be modified in light of the facilitator’s and audience’s priorities.
The workshop was developed largely to support users in more critically evaluating online information, search result rankings, and digital tools. While the original workshop focused mainly on search, the workshop has evolved to address the wide-reaching impact of algorithms and algorithmic (in)justice on everyday life. Within the time constraints of this workshop, the goal is not for participants to understand algorithms and digital tools with the technical expertise of a computer programmer but rather to recognize the “black box” nature of most algorithms, to challenge the notion of digital tools as neutral, and to reflect critically on the information and the choices that individuals encounter through use of these tools. With greater awareness of algorithmic bias, participants will hopefully be better positioned to make more conscious and informed choices about how they use digital tools and how they evaluate and use the information that they access through those tools. Workshop attendees have appreciated the wider-lens view that the workshop has developed and have often commented at the end of the workshop that they had not previously realized the degree of algorithms’ influence on everyday life and social inequities, both for themselves and for different communities. (Read the entire reflection in Appendix 2: Reflection.)
Please see the Reflection section of this document (above) for more guidance on introducing content and facilitating activities.
A typical workshop lasts 60 minutes. The workshop template includes activity time estimates for a workshop that can be modified based on the workshop audience and purpose. A workshop could easily be expanded to 75 or 90 minutes, since the workshop template includes optional activities that could be done during or outside of the workshop and because some activities and discussions could be expanded on.
Welcome participants as they enter the workshop space. Inform them that the workshop will start at the designated time and provide the link to the shared document that will be used throughout the session. (Before the workshop, participants should have been asked to ensure that they will have computer access during the workshop so that they can fully engage in activities and contribute to the shared document.)
At the start of the workshop, introduce yourself, state the workshop title, remind participants that the group will use the shared document throughout the session to reflect and share ideas, and reshare the shared document link. If using Google Docs as the shared document platform, inform participants that if they are logged into their Google accounts while accessing the shared document, their names will appear in the document as they type. If they prefer to add to the document anonymously, they should log out of their accounts or access the shared document in a browser in which they are not logged into Google. Also explain that during most activities, attendees will initially have time to contribute thoughts to the shared document and thereafter, time allowing, will be able to share further thoughts by speaking to the group.
If a group is small and time allows, the facilitator might invite participants to briefly introduce themselves and explain their interest in the workshop topic. These introductions can be particularly valuable in public workshops in which participants may have a wider range of knowledge and reasons for attending than is generally the case in course-integrated workshops.
Introduce the workshop learning outcomes (included in the workshop template).
Introduce the Grounding Principles (also in the workshop template). Explain that these principles reflect how all participants can expect to engage and learn together at this workshop. Acknowledge that the roles that algorithms play in our lives affect us in intersecting but also different ways, that discussing algorithmic bias issues is sometimes uncomfortable, and that this is an environment for reflection and mutual respect and care.
Introduce participants to the definitions of bias and of algorithms that are in the shared document. Then ask them to answer the question, “How would you define or describe algorithmic bias?” (under the “Defining Algorithmic Bias” activity). Acknowledge that algorithmic bias is a complex and multi-layered concept; there are no single, all-encompassing definitions, and people’s different explanations will shed light on different aspects of the concept. After people have added initial thoughts, open discussion about participants’ definitions. Highlight salient points that will be built on throughout the session.
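For facilitators who want a concrete, discussion-friendly illustration of how a seemingly neutral rule can produce biased outcomes, the following minimal Python sketch may help. It is a hypothetical toy model (not any real search engine's algorithm): a ranker that orders pages purely by past clicks, combined with users who always click the top result. A small initial advantage, perhaps from paid promotion, compounds into a locked-in top position regardless of relevance or accuracy.

```python
# Toy model of a "rich get richer" feedback loop in ranking.
# Assumption: this is an illustrative sketch, not a real system.

def rank_by_clicks(clicks):
    """Return page ids ordered by past clicks (most-clicked first)."""
    return sorted(clicks, key=clicks.get, reverse=True)

def simulate(clicks, rounds):
    """Each round, the top-ranked page receives the next click."""
    for _ in range(rounds):
        top = rank_by_clicks(clicks)[0]
        clicks[top] += 1
    return rank_by_clicks(clicks)

# page_b starts with a tiny head start (e.g., from early promotion).
clicks = {"page_a": 10, "page_b": 11, "page_c": 9}
print(simulate(clicks, rounds=100))  # page_b stays on top forever
```

The point for discussion is that no step in the code mentions race, gender, or any group at all, yet the system still entrenches whatever advantage existed at the start; bias can emerge from the feedback loop itself, not only from biased intent.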
Ask participants to complete the poll (in the shared document) and immediately thereafter add their reasoning for their ranking (in the “Discussion: Google & Neutrality” section). After attendees have added their initial thoughts, open live discussion. Emphasize during the conversation that although this activity focuses on Google, algorithms and algorithmic bias are relevant to all online search experiences. Stressing this point will help in transitioning to the next activity.
Ask attendees to reflect on the question “What are some ways that algorithmic bias influences your life, whether large or small?” Then invite people to comment on their responses within the shared document, while also noting that the group will revisit this question toward the end of the workshop.
This workshop section includes several short videos, each accompanied by a reflection/discussion prompt. Before showing each video clip, introduce the related prompts. Explain that participants can be thinking about those prompts as they watch the video clips and that they will also have time after each clip to respond to prompts within the shared document. As with previous activities, allow for live discussion after each video viewing and initial writing as time permits.
This is a good introduction to how algorithms often mirror back and perpetuate racism and social inequities more broadly.
Prompt: Do Noble’s thoughts on or experiences with search engines like Google challenge or affirm any of your own previous views or experiences? How or how not?
After watching the video, the group might reflect again on the definitions of algorithmic bias that they articulated earlier, and how certain aspects of those definitions are evident in Noble’s presentation.
(Watch up to the 1:30 mark.)
Before showing this video, ask individuals if they have heard of search engine optimization (SEO), and, if so, how they would explain it. Convey that SEO essentially involves strategies for website owners and businesses to improve their search result rankings, so that their site is more likely to be found by people.
Prompt: Let’s watch about 1.5 minutes of this video about SEO. As you watch, consider:
What are some key points the speaker makes about SEO?
Who is the speaker’s intended audience?
During the ensuing discussion, draw out who the intended audience of the video is (business owners and organizations seeking to improve their search result rankings) and the speaker’s representation of SEO’s purposes and effects.
(Watch up to the 1:25 mark.)
This documentary trailer contrasts starkly with the previous video on search engine optimization. It emphasizes the addictive and manipulative qualities of many digital environments, in particular social media platforms that personalize the information presented to users. Ask participants to consider connections and differences between the content and tone of the two videos.
Prompt: Please respond to one or both of the following questions by adding your thoughts below:
What key points are made here?
How does this video compare with the previous one?
After viewing the video, highlight the contrasting tones between this and the previous video clips, as well as concepts that are particularly relevant to algorithmic bias (e.g., the effects of online personalization, including benefits and drawbacks; small and large ways that algorithms affect both individuals and groups; who benefits and who is harmed by algorithms and by addictive qualities of many digital technologies).
This short video introduces the practice of click restraint, a way to bring greater algorithmic awareness and critical thinking to engagement with online search results. Click restraint involves skimming a page of search results before choosing which links to click on. The video stresses that the first results on a search results page aren’t necessarily the most relevant or useful; often they appear at the top because of marketing and search engine optimization. Through click restraint, internet users can evaluate search results more critically and with greater algorithmic awareness.
Optional Video: “The Era of Blind Faith in Data Must End” (TED Talk, Cathy O'Neil)
(Watch up to the 3:35 mark, or more if time allows.)
Data scientist and mathematician Cathy O'Neil challenges the notion that technology is neutral, demonstrating how algorithms drive decisions that have real-life consequences (e.g., who gets a job, a raise, a layoff, or a mortgage) and how these decisions often rest on false assumptions. This talk illustrates how algorithmic bias and algorithmic justice extend far beyond search result rankings.
Prompt: Feel free to add your thoughts or reactions to the video clip below. Did anything surprise you or stand out to you?
This section of the workshop focuses on small, concrete steps that individuals can take to reduce the degree to which they are affected by personalization on different online platforms. While acknowledging that algorithmic bias is pervasive and systemic and therefore cannot be escaped or quickly fixed, the workshop facilitator can present the activities below as examples of how, with increased algorithmic awareness, individuals can also take some proactive steps in their online experiences.
Part 1: Google Account Settings: Look at your institutional or personal Google account settings. Go to: Manage my Google account > Data and personalization > Manage your data and personalization
Group discussion: What options do you have for limiting how much information is stored about you? What information about you does Google have stored?
Part 2: Chrome Browser Settings: In the Chrome browser open the Settings. What options do you have for limiting the information stored about you?
The activities below can be integrated into an extended workshop or can be recommended as things to do outside of the workshop.
Try searching in Google when logged into your Google account. Then try searching Google in incognito mode. Do you notice a difference?
Do a search in Google and in an alternative search engine like DuckDuckGo. How do the results compare?
This section of the shared Google Doc includes several online tools and resources that can help individuals be more critical consumers of online resources and platforms and reduce the number of online trackers that store personal user information and influence online personalization. It is important when introducing these tools to emphasize that they are not guarantees that one’s personal information will not be tracked, but that they do prevent a good amount of online tracking. Also note that online privacy is an ever-changing issue, so the best privacy tools may change over time.
In this final activity, participants are asked to reflect on their thoughts at the start of the workshop and on how their views have developed over the workshop duration.
At the beginning of today's session, you considered what role algorithms play in your life, whether large or small. Would you add to or modify your earlier thoughts?
What can you do to counteract algorithmic bias?
Mention these resources as items of possible interest for those who would like to learn more.
Check out the TED Talk “The Moral Bias Behind Your Search Results” by Andreas Ekström. Does this complicate how you think about the concept of algorithmic bias?
Watch the short video “The Miseducation of Dylann Roof” (Southern Poverty Law Center). What argument is the speaker making? What connections or differences do you see between this video and Safiya Umoja Noble’s arguments in the introduction to her book Algorithms of Oppression?
If time allows, invite participants to share further thoughts or to pose questions.
Algorithmic Bias & Search Systems LibGuide (Rowan University)
Noble, Safiya Umoja. Algorithms of Oppression: How Search Engines Reinforce Racism. New York University Press, 2018. (Ebook)
Heilweil, Rebecca. "Why Algorithms Can Be Racist and Sexist." Vox, February 18, 2020.
Young, Nicholas T. "I Know Some Algorithms Are Biased - because I Created One." Scientific American, January 31, 2020.
"Weapons of Math Destruction Outlines Dangers of Relying on Data Analytics" (interview with author and mathematician Cathy O'Neil). All Things Considered, NPR, September 12, 2016.