This is a review of Chapter 1 of “Data Feminism” by Catherine D’Ignazio and Lauren F. Klein (2020), written as part of the Collaborative Book Review Project with a great group of HASTAC Scholars. To learn more about the project and see previous collaborative book reviews, check out the collection here; to learn why we chose Data Feminism for this project, you can read a previous blog post of mine here.
“Principle: Examine Power” delves into the intricate web of power dynamics and inequalities that permeate both the tangible and digital realms, using data science as its focal point. Through the lens of data feminism, the authors critically examine how power structures shape, and are shaped by, data practices. This exploration is grounded in a compelling narrative drawn from a range of real-world scenarios, including the poignant story of Serena Williams’ childbirth complications, which serves as a powerful vehicle for discussing racial disparities in healthcare.
At the heart of this chapter is a methodical dissection of the “matrix of domination,” a conceptual framework that elucidates the multifaceted and intersecting nature of oppression. The authors adeptly use this framework to unravel the complexities of power across four domains: the structural, disciplinary, hegemonic, and interpersonal. This approach illuminates the entrenched biases and injustices in data collection and interpretation: biases that cause data to be unevenly reported or valued, and biases in framing and hypothesis formation that shape how data are interpreted. It challenges the reader to confront the often invisible forces of privilege and oppression that underpin data science, and it is particularly effective in highlighting the stark realities of racial and gender disparities in healthcare, as exemplified by the discussion of maternal mortality rates among Black women. By weaving together personal stories, statistical data, and critical analysis, the chapter brings to the fore the dire consequences of systemic inequalities and the urgent need for a more equitable and inclusive approach to data science.
The most interesting part of the chapter is its deep dive into the intersecting biases within data science, illuminated through the story of Serena Williams and her childbirth complications. This stands out because it encapsulates the broader themes of the chapter—racial and gender biases in healthcare—and ties them into a compelling argument about the systemic nature of those biases. Williams’ experience is a powerful example of how even globally recognized, affluent individuals are not immune to the biases embedded within healthcare systems. Despite her fame and resources, Williams faced life-threatening complications during childbirth, a scenario that disproportionately affects Black women in the United States. The story is particularly poignant because it bridges the gap between abstract data and real-world consequences, humanizing the statistics and making the discussion of racial disparities in maternal mortality rates both relatable and urgent.
Furthermore, the subsequent discussion of how Williams’ experience was mobilized into a broader conversation about healthcare disparities among Black women introduces an engaging narrative on the power of storytelling in data science. The chapter illustrates how personal stories can catalyze public awareness and policy change, emphasizing that power dynamics and biases must be examined not just through numbers but through the lived experiences of individuals. Even the raw data of an analysis are a product of context and the complexities of embodiment, themes the book takes up in Chapter 2, on the principle of elevating emotion and embodiment, and in Chapter 6, on context. This thread is interesting because it exposes the reader to the layered nature of bias and showcases the potential for advocacy and change. It highlights the necessity of recognizing and addressing the intersection of racism, sexism, and celebrity within healthcare, pushing for a reevaluation of how data science can be used to challenge and dismantle systemic inequalities. This blend of personal narrative, statistical analysis, and social critique encapsulates the essence of data feminism, making it a compelling and thought-provoking part of the chapter.
There have been many recent public discussions addressing racial and gender biases in healthcare and technology, including analyses and initiatives aimed at combating these entrenched disparities. For instance, pervasive racial bias in healthcare technologies and medical devices has been spotlighted, with structural racism identified as a fundamental driver of the health inequities suffered by racial and ethnic minorities in the United States (3). The COVID-19 pandemic and broader social injustice issues have pushed health disparities into the spotlight, emphasizing the need for systemic reform. Efforts like HIMSS’ Global Health Equity Week aim to harness the power of information and technology to eliminate health disparities (3). However, challenges remain, such as racial bias in electronic health records and medical devices that are not tested on diverse populations before market launch. Additionally, the impact of racial bias in healthcare algorithms has been scrutinized, revealing how millions of Black people are affected by biases in these systems (2). Such biases can lead to discriminatory impacts based on race and ethnicity, underscoring the necessity of investigating and reforming the algorithmic software used to determine patient care.
These discussions underscore a growing awareness and concern regarding the impact of racial and gender biases in healthcare and technology. They call for a multifaceted approach to address these issues, including reforming global health ecosystems, implementing more inclusive testing protocols for medical devices, and investigating algorithmic biases to ensure equitable healthcare outcomes for all.
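What such an investigation of algorithmic bias can look like is worth sketching concretely. The study behind Ledford’s report found that a widely used risk-scoring algorithm relied on past healthcare costs as a proxy for health needs, so patients with less access to care appeared “healthier” than they were (2). The short Python simulation below is a hypothetical illustration of that proxy effect, not a reconstruction of the actual system; the group labels, distributions, and the 40% access gap are all invented for the sake of the example:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical population: two groups with identical underlying health
# needs, but group B historically generates lower healthcare costs
# because of an (invented) 40% access gap.
n = 100_000
group = rng.integers(0, 2, size=n)               # 0 = group A, 1 = group B
need = rng.gamma(shape=2.0, scale=1.0, size=n)   # true health need, same distribution for both

access = np.where(group == 1, 0.6, 1.0)          # group B spends less at equal need
cost = need * access * rng.lognormal(0.0, 0.3, size=n)

# A risk score that uses cost as a proxy for need (here, simply cost itself).
score = cost

# Audit: select the top 20% by score for extra care, then compare groups.
cutoff = np.quantile(score, 0.80)
selected = score >= cutoff
for g, label in [(0, "group A"), (1, "group B")]:
    mask = group == g
    share = selected[mask].mean()
    mean_need = need[selected & mask].mean()
    print(f"{label}: {share:.1%} selected; mean true need of those selected: {mean_need:.2f}")

# Expected pattern: group B is selected far less often despite identical
# need, and selected B members are sicker on average -- the cost proxy
# has translated the access gap into apparently lower risk.
```

The audit signature this sketch produces is the one investigators look for: the group facing the access gap is selected for care less often despite identical underlying need, and those of its members who are selected are, on average, sicker.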
The chapter does not shy away from critiquing the status quo of data science, where the interests and goals of dominant groups often overshadow those of marginalized communities. Through examples ranging from algorithmic bias in job screening processes to the exploitative practices of data collection by corporations, the authors expose the ethical dilemmas and social injustices perpetuated by unchecked power imbalances in the digital age.
However, the chapter is not merely a critique; it is also a call to action. The authors advocate a critical examination of power relations and propose a paradigm shift: approaching data science through the lens of intersectional feminist thought. This challenges traditional, patriarchal, and often exclusionary perspectives by highlighting the importance of considering gender, race, and class in data analysis and representation. The shift calls for more equitable and inclusive data practices that recognize and address biases, aiming to use data for social justice and equity. By integrating feminist principles, it demands a transformation in how data is collected, analyzed, and applied, prioritizing ethics, inclusivity, and the dismantling of power imbalances. In offering this vision of a more inclusive and responsible data science, the authors provide a roadmap for challenging and transforming the oppressive structures that pervade our data-driven world.
In conclusion, “Principle: Examine Power” is a thought-provoking and illuminating chapter that combines theoretical insights with real-world examples to unpack the complexities of power dynamics in data science. It serves as a crucial reminder of the ethical responsibilities of data scientists and of the potential of data feminism to foster a more just and equitable digital society. The authors’ rigorous analysis and compelling narratives make a persuasive case for the need to critically examine and challenge the power structures that shape our data and, by extension, our world.
1. D’Ignazio, C., & Klein, L. F. (2020). The Power Chapter. In Data Feminism. MIT Press. https://data-feminism.mitpress.mit.edu/pub/vi8obxh7/release/4
2. Ledford, H. (2019). Millions of black people affected by racial bias in health-care algorithms. Nature, 574(7780), 608–609. https://doi.org/10.1038/d41586-019-03228-6
3. Chartis. (2022, October 28). Racial bias deeply rooted in healthcare technology and medical devices. https://www.chartis.com/insights/racial-bias-deeply-rooted-healthcare-technology-and-medical-devices