
#Health Equity

Professional data equity experts on how they practice

“Bias in research has real-world pains that disproportionately affect people that look like me. Dark-skinned people. Women… Having the equity lens to know that people respond to different research methods differently — it’s important. It’s expensive. It’s time-consuming. But it’s necessary… to ensure that we receive accurate results, that we treat all groups the same, we don’t lump folks together, and that we’re better able to interpret the data.” -Sian Lewis

At the center of Harmony Labs is research. Our research uses the Narrative Observatory, our data infrastructure for tracking U.S. media consumption, to provide strategists and changemakers with the insights necessary to create effective, constructive stories. Since late 2021, we’ve been hard at work building an audience-narrative architecture for the Robert Wood Johnson Foundation’s (RWJF) efforts to advance health equity in the United States.

This work requires an unflinching commitment to data equity in research design and modeling, and to implementing standards of practice that clearly and thoroughly engage with what data equity should be. But barriers like a lack of incentive, a lack of resources, and a lack of concrete steps have historically kept data equity out of most research best practices, and rectifying harmful narratives is often rooted in rectifying harmful research design. Janay Cody, PhD and Sian Lewis, MS, both data scientists and equity consultants as well as members of the Harmony Labs Advisory Board, are currently working as equity advisors for our project with RWJF. In a recent conversation, Science Director Riki Conrey, PhD spoke with both Dr. Cody and Lewis to discuss:

What should data equity actually look like? How can we, and other researchers, enact practices that hold us accountable?

We hope the insights from their conversation can act as a resource and conversation starter for all researchers, changemakers, advocates, and storytellers.

“Data equity sits right in the middle … of strategy and science.” -Janay Cody, PhD

Data equity is:

  • Being conscious of outcomes — research and the tools created from it must, by design, afford outcomes that uplift and empower communities, rather than disparage or drive existing, harmful stereotypes.
  • Instituted throughout the lifecycle of any data product — from actual practitioners doing the work, to collection, to interpretation, to analysis.
  • Seeking “sameness” in measurement — which may not always mean sameness in method, so that concepts are captured equitably across different groups or cultural contexts.
  • Interrogating assumptions in real-world settings — asking whether models are appropriate, or whether existing parametric assumptions are actually relevant or usable.
  • Informing research design — having the right kinds of conversations across an array of perspectives to ensure the right questions are being asked of the data.
  • Building the work into a project’s budget — respecting time, effort, and resources.
  • Practical — real choices have to be made in designing the work, and trade-offs will necessarily occur.
  • Participatory action research — partnering with the communities that are actually going to be affected by the data.

“I actually [talk] to the folks who are affected by the models that I’m building many times during the process. It is a part of my work. I literally build it into all my plans. It’s a part of the budget. It’s an essential part of it. I don’t think that folks have that lens at all. And I wish that they did. I think that a lot of the equity issues we went into would be alleviated if people took the time and spent the resources to do this.” -Sian Lewis

Data equity is not:

  • Diversity. Diversity is representation, but equity is a measurable improvement in outcomes.
  • Exclusively linked to the end product and how it affects people — data equity is instituted from the ground up.
  • An afterthought or “filter” to lay across data measurement.

“Data equity is not diversity. Diversity is making sure there is representation of people from different backgrounds. That’s great. That is not equity. Equity is ensuring a measurable improvement in outcome, such that typical characteristics that we care about, like race and gender and age, don’t statistically determine particular outcomes.” -Janay Cody, PhD

Modeling

Algorithms can be biased by design. They are simplifications of the world and use historical data to predict the future — essentially, mathematical stereotypes. Delivering models with an equity lens is hard, but possible, and often rooted in clarity and documentation.

  • Be clear about the purpose of the model and what it should be specifically designed to do.
  • Trade-offs will be necessary — improving accuracy for groups that typically go unseen can mean a decline in accuracy for the groups that are seen the most.
  • Documentation from start to finish — don’t gatekeep your models or silo your expertise; make the instructions clear and easily accessible.
  • Be iterative — models should be updated and checked regularly over time.
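One concrete way to act on these principles is to report a model’s accuracy per subgroup rather than as a single average, so disparities aren’t hidden in the aggregate. The sketch below is illustrative only: the field names (`group`, `label`, `prediction`) and the toy data are assumptions, not part of Harmony Labs’ actual pipeline.

```python
# Illustrative sketch: auditing a model's accuracy per subgroup.
# Field names ("group", "label", "prediction") are assumed for this example.

def accuracy_by_group(rows):
    """Return accuracy for each subgroup so disparities are visible, not averaged away."""
    totals, correct = {}, {}
    for row in rows:
        g = row["group"]
        totals[g] = totals.get(g, 0) + 1
        if row["label"] == row["prediction"]:
            correct[g] = correct.get(g, 0) + 1
    return {g: correct.get(g, 0) / totals[g] for g in totals}

# Toy data: the model is right half the time for group A, always for group B.
rows = [
    {"group": "A", "label": 1, "prediction": 1},
    {"group": "A", "label": 0, "prediction": 1},
    {"group": "B", "label": 1, "prediction": 1},
    {"group": "B", "label": 1, "prediction": 1},
]
print(accuracy_by_group(rows))  # {'A': 0.5, 'B': 1.0}
```

An overall accuracy of 75% here would look acceptable; the per-group breakdown is what surfaces the disparity — and, documented over time, what makes the iterative re-checking the list above calls for possible.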

“[Data equity] is … empowering because it finally gives a voice to people. I like to describe myself as the wind. You don’t see me, you just feel my effects… My goal is to give the evidence to the people who need it to actually say the thing that they need to say, make the claim, change the conversation… My goal is not to be seen. It is to be felt through the insights that I provide that are useful for people who need to accomplish a task. That’s very different than the way we think about research and academia and the way we were trained.” -Janay Cody, PhD

Making Design Decisions

Below, Dr. Cody walks through a recent example in which the structure of an experiment was questioned and restructured after applying an equity lens to the research design decisions.

“Oftentimes groups have to balance wanting to demonstrate their impact with actually wanting to make one. In digital campaigns, one of the ways in which they target is through zip codes, and for Black folks, and I’m thinking of Black folks in Pennsylvania, 3% of the zip codes encompassed the majority of that population. When it comes to designing a test, an experiment, to be able to demonstrate your impact in an election or at the end of a campaign, that’s very difficult to do with your zip code, the possible universe of zip codes being [shrunk] so much. So a lot of the things that I’ve had to do is talk about how we’re going to design an experiment. You can create your target universe based upon having a Black population, for example, that is, [one] that makes up at least 30% of the total population of the zip code, where there is a minimum of at least 5,000 respondents, and then you would find treatment and control that way so that you are not taking out most of your sample for an experiment. … There’s this other possibility of making the decision to not to do an RCT and instead choosing an alternative method, like a quasi experiment where you construct your treatment group strategically instead of randomly, and then you construct a comparison group through either propensity score matching, regression discontinuity… if time is on your side, you do … different estimations. But it’s the very real design question of, ‘Do I do an experiment at all, as I’m trying to balance: I want to make an impact in a Black community, but I also need to be able to demonstrate it to my funders and to people who want to support the scaling up of this particular initiative.’”
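The eligibility rule Dr. Cody describes — keep zip codes where the Black population is at least 30% of the total and there are at least 5,000 respondents, then assign treatment and control within that universe — can be sketched as a simple filter. The zip codes and population figures below are hypothetical, invented purely to illustrate the rule.

```python
# Illustrative sketch of the zip-code eligibility rule described above:
# Black population >= 30% of total AND at least 5,000 respondents.
# All zip codes and figures here are hypothetical.

def eligible_universe(zip_codes, min_share=0.30, min_respondents=5000):
    """Filter zip codes into the target universe before assigning treatment/control."""
    return [
        z for z in zip_codes
        if z["black_pop"] / z["total_pop"] >= min_share
        and z["respondents"] >= min_respondents
    ]

zips = [
    {"zip": "19143", "black_pop": 40000, "total_pop": 65000, "respondents": 9000},
    {"zip": "19103", "black_pop": 2000, "total_pop": 22000, "respondents": 6000},  # share too low
    {"zip": "19139", "black_pop": 30000, "total_pop": 42000, "respondents": 3000},  # too few respondents
]
print([z["zip"] for z in eligible_universe(zips)])  # ['19143']
```

Randomizing within this pre-filtered universe, rather than across all zip codes, is what keeps the experiment from discarding most of the population it is meant to serve.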

Though far from comprehensive, these foundational principles of data equity in research design and modeling, along with an understanding of what data equity is not, have informed and continue to inform our work with RWJF. Watch this space for more updates on this project and others.


Dr. Janay Cody is a behavioral data scientist and data equity advisor. Dr. Cody empowers results-driven leaders with equitable data policies, inclusive research, and measurement strategies to manage their impact.

Dr. Cody is highly regarded for the analytic solutions she created for philanthropies including the Kettering Foundation; government research teams including the Administrative Office of the Courts on behalf of the Judicial Council of California and the National Assessment of Educational Progress; and data and tech companies including Catalist, Movement Labs, Harmony Labs, and Analyst Institute.

Dr. Cody received her PhD in political science from the University of Notre Dame du Lac.


Sian Lewis, MS is a data scientist and researcher who serves as Head of Research at A/B Partners (www.abpartners.co). She also works as a political data scientist, applying predictive analytics to Democratic political campaigns. Her research focuses on narratives, inoculation, and persuasion. She formerly served as the data science and analytics team manager for the Department of Defense Healthcare Management System Modernization (DHMSM) project at Booz Allen Hamilton. Her recent accolades include the 2021 Society of Women Engineers Spark award, the 2021 Black Engineer of the Year Modern Day Technology Leader Award, the 2020 Women of Color in STEM award, and the 2019 DC FemTech award.
