Bias is often categorized as either a fixable error or a structural inequity. This either-or language should be reframed as a both-and situation. As knowledge construction workers and insight architects, we struggle to see and address the full spectrum of biases. The power and ease with which inequities scale in our digital systems undermines our ability to achieve business goals and maintain client loyalty. During this fireside chat, discover the "bias wheel," a more practical guardrail for navigating this spectrum. We will also discuss the disparate impacts of bias, including questioning our trust in, and the trustworthiness of, our data, algorithms, systems, and platforms.
Trained as a computer scientist and an educator-scholar by profession, Dr. Brandeis Marshall teaches, speaks, and writes about the impact of data practices on technology and society. She focuses on education and scholarship that amplify Black women thriving in data and tech careers. She founded DataedX Group (pronounced data-ed-x), which, at its core, counteracts automated oppression efforts with culturally responsive instruction and strategies. The group strengthens your data equity resolve -- the ability to acknowledge and advance an anti-discriminatory narrative centering non-white people and communities throughout the data life cycle.
Why data? Because data fuels the algorithms that make the tech, and that tech is perpetuating harm to Black women. We can end this relentless cycle with culturally responsible data education and the intentional dissemination of processes.
Dr. Marshall holds a Ph.D. and a Master of Science in Computer Science from Rensselaer Polytechnic Institute and a Bachelor of Science in Computer Science from the University of Rochester. Currently, she is a Stanford PACS Practitioner Fellow and a Partner Research Fellow at Siegel Family Endowment. She is on sabbatical leave from Spelman College, where she is a Full Professor of Computer Science.