21 Ways to Incorporate Anti-Racism in your Subject Area

All Subject Areas

Provide inclusive imagery and representation of diverse communities within your OER. From 'Looking for Images that Reflect Diversity, Equity, and Inclusion? Ask the Community' by Heather Blicher, CCCOER Blog, provided under CC BY 4.0

Arts, Audio-Visual Technology and Communications

Address issues of representation and the portrayal of BIPOC communities and individuals in the media. Here are some OER resources to use:

Education & Training

Discuss the problems with school disciplinary actions and how Black and Brown students are disproportionately disciplined (Government Accountability Office [GAO] report, public domain), and how the rise of School Resource Officers (SROs) has contributed to the school-to-prison pipeline (Wikipedia page, CC BY-SA 3.0).

How we're priming some kids for college – and others for prison, a 2015 TED Talk by Alice Goffman, provided under CC BY-NC-ND 4.0

Health Sciences

Call out the prevalence of, and problems with, race-based medicine in health science courses.

The Problem with Race-Based Medicine, a 2015 TED Talk by Dorothy Roberts, provided under CC BY-NC-ND 4.0

Discuss the social determinants of health. The federal government has several resources available to share with students about social determinants of health. These are in the public domain and thus can be readily incorporated in your OER work. See Healthy People 2030 from the U.S. Department of Health and Human Services and Know What Affects Health from the Centers for Disease Control and Prevention.

Healthy People 2030, U.S. Department of Health and Human Services, Office of Disease Prevention and Health Promotion. In the public domain.

Human Services

In Barber/Cosmetology programs, make sure to cover methods for styling Black and natural hair. This training is too often left out of curricula, and its omission contributes to structural inequities.

A Celebration of Natural Hair, a TEDx Talk by Cheyenne Cochrane, provided under CC BY-NC-ND 4.0

Information Technology

Discuss how artificial intelligence and machine learning algorithms can reinforce social biases, for example when a model is trained on data that reflects a history of racist policies and practices, and how such algorithms are then used to reinforce those policies. Also point out the lack of representation within the industry and the resulting failures in IT/data products (e.g., filters that do not recognize Black faces). See the GAO report (public domain) Consumer Protection: Congress Should Consider Enhancing Protections Around Scores Used to Rank Consumers (2022): "The risks that consumer scores can pose include potential bias and adverse effects, and the scores generally lack transparency. The data used to create scores may contain racial biases—for example, one study found Black patients were assigned lower risk scores than White patients with the same health care needs, predicting less of a need for a care management program."

Additional Resources:

Check out #WOCinTech Chat for free photos of women and non-binary people of color working in the tech field, licensed CC BY; see #WOCinTech Chat or wocintechchat.com

Law, Criminal Justice, Corrections & Security

Call out police violence against Black and Brown communities and systems of mass incarceration.

Mapping Police Violence, a TEDx Talk by Samuel Sinyangwe, provided under CC BY-NC-ND 4.0

View the TED Talks on 'Truths about the US prison system', provided under CC BY-NC-ND 4.0

Government Accountability Office (GAO, public domain) reports on racial profiling issues in the 1990s, finding that Black people:

As recently as 2017, GAO noted that these racial and ethnic biases in justice and policing persist, disproportionately affecting Black and Hispanic people.

License


Guide to OER and Antiracism Copyright © by WTCS OER Network is licensed under a Creative Commons Attribution 4.0 International License, except where otherwise noted.
