In the festive spirit of the holidays, my little sister gifted me an Apple Watch, a gesture rooted in her support for my renewed passion for running after a challenging year marked by the loss of our father amid the COVID-19 pandemic. Yet as the wrapping paper crumpled and excitement filled the air, little did I know that the sleek wearable device was not designed to serve people with darker skin tones like mine.
Amid the holiday shopping frenzy, where millions are spending billions on gifts, wearable tech devices like the Apple Watch may seem like the perfect present, blending style and functionality. Yet, a deeper dilemma emerges as technology continues to fail people of color, delivering inaccurate readings, particularly for individuals with darker skin tones. This issue extends beyond wearable devices to seemingly neutral everyday technologies, including soap dispensers, automatic hand sanitizer stations, camera recognition software, heart rate monitors, and even self-driving cars.
The problem is that many high-tech gadgets, including pulse oximeters, deliver skewed readings because they are built on biased data and algorithms. Pulse oximeters, essential for measuring oxygen levels in the blood, often exhibit racial disparities, with studies indicating lower accuracy for Black patients than for white patients. This raises concerns about medical errors and mistreatment driven by the false estimates these clinical tools generate.
Even seemingly commonplace devices like thermometers, including forehead thermometers, are not exempt from racial bias, potentially delivering less accurate readings for people with darker skin. The widespread reliance on such devices in households, day care centers, and medical settings makes inaccuracies rooted in racial bias an especially serious problem.
Addressing these issues becomes crucial as racial bias in medical technology not only poses a risk to accurate health assessments but also contributes to broader disparities in the treatment of people of color. The lack of standardized improvements in pulse oximeters and the slow response from regulatory bodies, such as the Food and Drug Administration, further exacerbate the problem.
Legal options for individuals who discover the inaccuracy of medical devices remain limited. While some companies face lawsuits over inaccurate estimates and breaches of information, the FDA's response to racial inaccuracies has been insufficient. Improved pulse oximeters that add extra wavelengths to the light beam do exist, but their limited adoption and lack of standardization across hospital systems prevent the benefits from reaching patients widely.
The implications of racial bias extend beyond the realm of medical technology, permeating various aspects of society. From disparities in pain medication administration for Black children to differences in cardiac procedures for Black men and higher mortality rates during childbirth for Black women, the consequences are profound. The conversation goes beyond technological flaws; it delves into broader issues of colorism, medical mistrust, and the societal impact of skin pigmentation.
In conclusion, the revelations about racial disparities in wearable tech and medical devices underscore the urgent need for inclusive innovation. As we navigate a technologically advanced era, companies must acknowledge and rectify these issues. That means not only refining existing technologies but also fostering diversity in product development teams, implementing robust testing protocols, and actively seeking feedback from users with diverse skin tones. Only through a collective effort can the tech industry ensure that innovations are accessible and accurate for everyone, regardless of skin color, contributing to a more equitable technological landscape.