The great equalizer: How data sharing can help communities overcome social inequities

Listen on SPOTIFY | APPLE | GOOGLE

Episode 1 | The Great Equalizer: How data sharing can help communities overcome social inequities

Guest: Jameila “Meme” Styles, Founder and President of MEASURE; 2022 Just Tech Fellow, Social Science Research Council; data activist

Meme Styles and host Tim Zonca discuss how better data sharing — and more of it — will give community organizations and civic leaders a common language for solving complex problems like health, education, criminal justice, and economic inequities.

Show Notes

Learn more about MEASURE, its mission, and impact. Then, get involved.

Transcript

Tim Zonca 00:02

Greetings from the team at Vendia, and welcome to Circles of Trust — a podcast for leaders across all industries, committed to speeding up innovation at scale and making a profound positive impact on business and the world. I’m your host, Tim Zonca. And I’m thrilled to kick off our first season with Meme Styles. Meme is a data activist. She’s Founder and President of MEASURE. She’s a member of the Social Science Research Council, and a 2022 Just Tech Fellow. Meme, welcome to the show. It’s great to have you.

Meme Styles 01:02

It’s good to be here, Tim.

Tim Zonca 01:04

So when we first met, you said you’re an activist looking to disrupt research. What does that mean to you?

Meme Styles 01:10

Yeah, I want to put a point on what you said here. I’m a data activist. True to the core, 100% in heart and also in operation, the way that I move. You know, when I said that I’m looking to disrupt traditional research it’s said with the understanding that traditional research is often, I would say, conducted in a very academic, very white, male-centered and -led, very institutional framework. What MEASURE does is that we are reclaiming the narrative about how and why systems of oppression are experienced by powerful Black, Brown, and Indigenous communities. And the way that we’re doing that is we are disrupting traditional research by centering the stories and the lived experiences of people that have experienced different forms of oppression, of neurodivergent people, veterans, LGBTQIA+ stories. And so what this means, though, Tim, is that it, it calls for a decolonization and a reframing of what research is, and how it’s defined, and how it’s accepted, and how one is looked at as a credible researcher. So in my mind, those that are most impacted by the systems or by inequity are the ones that should lead the research, should develop the research protocol, analyze the data. So when we think about, you know, what has been around for a long time, community participatory action research, there’s a spectrum to that. And we are looking to radically move the needle on that spectrum to having communities truly be the ones who are leading the work. In doing so it creates a more accurate and inclusive account of that actual narrative.

Tim Zonca 03:34

It sounds like, the way you describe that, you have across the spectrum maybe the academic, institutional, traditional research, and then this community-focused research. Can you talk a bit more about that picture that you paint? Or maybe the tension across those, or just the variance across that spectrum?

Meme Styles 03:57

Yeah. In talking about it, I can’t help but point to why it’s necessary. And so when I think about how traditional research has created harm, and I know that, we’ll talk a little bit more about this, but, you know, the Tuskegee experiment; labeling the want and the need for an enslaved Black body, a person, to escape the plantation was actually a medical diagnosis called Drapetomania. So, there has been a history of us being left out of traditional research, and it’s been used as a weapon against us. And so I would say that the need, first of all, is so that we can reclaim power through research, to be honest, because there’s so much power in the pen. And I mean, you can literally create systems and totally break down systems just within the power of the pen. And so we are using that power in order to create new avenues of research that are actually led by people. And what’s happening is that it’s challenging the dominant narrative of history by imagining new futures through research.

Tim Zonca 05:33

Even in just that kind of explanation or context, you use terms like power, and I’ve heard you say data is power. You said it has the ability to inform and transform, but it can be incredibly harmful. So tell me, tell me more about that.

Meme Styles 05:54

Yeah. And I think it’s something that people are starting to truly recognize. With the increased datafication of our bodies and our movement through society every single day, we’re realizing that Big Brother’s truly out there, that surveillance is everywhere. It is. You’re in your car, many cars have LoJack(TM). I mean, there are so many cameras out there that are watching every single move; our neighborhoods are now adopting technologies of, you know, cameras and surveillance everywhere. And so it’s long been recognized that data can be biased, right? Insofar as the data reflects the perspectives and the values and the understandings of those that have collected and analyzed data forever, right? So people that collected and analyzed it forever are just perpetuating the same biases. For example, in California, they were seeing that there was a database that was saying that “these people” were probably criminals, probably in gangs, right, like a gang member database. And you had children that were on this database, right? Like, I mean, come on, there has to be some barometer of ethics, morals, and anti-racism when we think about the collection of data. Data is power, and when I say data is power, that’s what I mean: Power can be used for good or evil. You know? And so, I feel like, from what we’ve experienced so far, data for me has been used to reinforce and perpetuate attitudes and beliefs about race, crime rates, education outcomes or achievement. You know, the understanding about, well, work that I’m really doing right now in our organization, is the experiences of Black girls and being adultified. You know, we saw a report that came out of Georgetown University back in 2017, that showed, through data, that Black girls are looked at as less innocent, needing less love, needing less protection. And so, you know, when we think about the power of data to disrupt the status quo or to give agency back, or not even back, but to give agency to communities that have historically been oppressed, that’s where I see the power. And, for me, the power of good is incredible when I see it happen. When I see organizations using data in order to tell their story, they’re getting funded to do the work that they’re great at. They are able to demonstrate to their community the progress that they’re making within their community because now they have the data. They are able to go to City Council and speak up against a ridiculous rule or law that does nothing more than continue the harm done on their children or their families. They’re able to go to the city, to the school, the school board meetings and have a real conversation about what’s happening. It’s not just their anecdotal experience, right? (Which I say very lightly because I believe that lived experience is data as well.) Now they have the numbers, they have the reports in order to confirm it. And unfortunately they have to confirm it.

Tim Zonca 10:03

Do you have any favorite examples, or examples that really jumped out to you, where you’ve seen that data being exposed and used for good? Something that’s representative of, really, you know, what you’re doing every day and what your organization’s mission is? And you want to see more of that? I’d love to hear what example or two always come to mind when you think of that.

Meme Styles 10:26

Yeah, I have several. And it’s definitely a good hair day for me today, so I’m going to start with that. The Crown Act, right? So The Crown Act is a law that wants to ensure that you are not discriminated [against] because of the way that you wear your hair, right? We saw through data that, especially, Black women were being discriminated against because of their hair. We also saw mass media, where you saw kids getting their dreads cut off, etc. So, there’s knowing the data, and Dove really did a great job at centering the data to be honest, which created this new law, right? What we did recently here in the city of Austin was that we worked with the city in order to really explicitly show how that data was impacting people. And how, through the Innocence Initiative, this initiative that we’re working on, how important it is to protect Black hair, and textured hair, and people being able to go into work with maybe no hair at all, and not face the harms of discrimination. And so, what that resulted in was a new law here in the City of Austin, passing The Crown Act locally. And we’re hoping to see that happen everywhere. There’s no reason why, you know, it should just be Austin. Austin can be the first city, but it definitely should be everywhere. And so that’s what we’re really hoping to see. And that’s the power of data. That’s us, you know, Black women, being explicitly shown the data and the disparity and being able to then go to City Council and say, “This is a problem. We want to change this.” And that’s exactly what happened.

Tim Zonca 12:37

Yeah, that’s a great example. Any others?

Meme Styles 12:42

Definitely. I have another one here. And this one, this one is more so about birthing outcomes, right. And I know that I’m kind of sticking with the work that MEASURE has done. But really, this is what we do every day, we love to do this work that we hate to do, you know? And it’s really about Black women being less likely to survive childbirth. That should not be anything that I have to worry about as a Black woman, or have to worry about for my daughter’s health, you know? And so what we ended up doing was we partnered with a collaborative called the Maternal Health Collective, or Maternal Health Collaborative here, locally, just to learn what the stories were, what the experiences of BIPOC women with birthing outcomes were. And what we learned from this research was that Black, Brown, Indigenous women were experiencing racism. They had to switch doctors more often. There was this level of fear in birthing. And so through that data, we were able to, you know, provide that back to these organizations that are doing this work every single day. And with that research and report, they were able to get the funding needed in order to be sustainable and in order to really meet the needs of their community through direct service.

Tim Zonca 14:19

Thank you. Thank you for going through both of those. I think, especially at the onset, as we talk about data and research, regardless of whether it’s academic-driven or community-driven, those stories that kind of personify the “so what” of it all are just impactful and powerful to hear. Well, what about sharing that information and the access to that information? One of the things that you told me, I think the first time we met, is that you work with organizations that traditionally haven’t shared, or don’t share, or aren’t really good at sharing data. So can you tell us about what kind of organizations you work with? What kind of data do they have? And why does sharing that [data] matter?

Meme Styles 15:09

Yeah, definitely. I mean, first of all, sharing data can help communities — and all of us — truly start to understand where those social inequities are occurring. In many cases, ignorance is bliss, right? And so as long as we don’t see it, as long as we’re not the ones experiencing it, it doesn’t really matter. And so what we do at MEASURE is that we’re tapping on those big institutions to share data because, when they do share, those disparities are front-and-center. Right? And so organizations like education agencies… we’ve been asking education agencies to share the data about the disciplinary outcomes that Black and Brown kids experience and face every single day, and to disaggregate that data. And that’s the problem: In many cases, you can find data, but it’s not disaggregated. And so, remember this term, when data is not disaggregated, then people disappear. In many cases, Black girls disappear because you’re looking at datasets. “Oh, yes, these are our disciplinary outcomes.” But you’re not really sharing how each group of people is being impacted by those outcomes. Another group that we’re always looking to gather more data from is cities, really at the local level, right? Like, I’m really interested, you know, in the work that’s happening in Pflugerville, Texas. We partnered with a group of concerned community members to kind of identify some of the issues that were perpetuating racism within the city. And so what ended up coming from those meetings was an Equity Commission for the City. So the City actually created an Equity Commission because of the work that we did. But there’s more that needs to be done. The data about how many Black and Brown people are getting or gaining City contracts, we need to know that. The data about how much money is actually being spent to disrupt some of the inequalities that we see, you know, with Black children or Brown children within the City, we need to see that. And so it’s really about, again, using the knowledge that we have at MEASURE in order to put out FOIA [Freedom of Information Act] requests, you know, to get that information. And then it’s also this other piece about deep diving into our own community to draw out those stories and those experiences that can really provide a good understanding and a better understanding about what’s actually happening.
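To make the disaggregation point above concrete, here is a minimal sketch (not from the conversation) of the difference between an aggregated and a disaggregated view of disciplinary records. The dataset and the column names (outcome, race, gender) are purely hypothetical and used only for illustration.

```python
import pandas as pd

# Purely hypothetical disciplinary-outcome records; column names are illustrative only.
records = pd.DataFrame({
    "outcome": ["suspension", "warning", "suspension", "suspension", "warning", "suspension"],
    "race":    ["Black", "Black", "white", "Black", "white", "Latino"],
    "gender":  ["F", "F", "M", "M", "F", "F"],
})

# Aggregated view: totals only, with no sense of who is affected.
print(records["outcome"].value_counts())

# Disaggregated view: the same records broken out by race and gender,
# which is where group-level disparities become visible.
print(records.groupby(["race", "gender"])["outcome"].value_counts())
```

The aggregated count looks complete, but only the disaggregated view shows whether a particular group, for example Black girls, bears a disproportionate share of those outcomes.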

Tim Zonca 18:20

In a study, like the examples that you just went through, do you find that in most cases, or, you know, all cases, the data is there? And it’s really about, you know, accessing it, disaggregating it, sharing it with the right people, making it actionable? Or are there times where you dig and you’re like, “Well, we have a hypothesis, but we actually don’t have the right information.” And that’s part of the project or initiative.

Meme Styles 18:48

Yeah, there’s both. But to be honest with you, I think it’s still kind of new, right, for organizations and the collection of data. And in the collection of data, well, okay, for one example, take the police department, right? Barack Obama had an incredible initiative, the Police Data Initiative back in 2015, I believe it was, where he challenged police departments to start making their data publicly available. And we saw very quickly that there were so many police departments that were not prepared to do that. And the data that was provided, in many cases, was incredibly messy. And so I think that it takes people like me, those data activists, to push in and to ask for data for us to begin a conversation about it. [To begin a conversation] in order for those datasets to be better, useful tools, so that we can then find those inequities and so that we can better solve for those inequities. But yeah, I definitely feel like it is, it’s now becoming … I always say this: Data is now sexy. You know, it wasn’t before. It just wasn’t. And now it is. And people are wanting to tap into those, you know, those Tableaus and those Microsoft APIs and share their data-driven work and everything else. But again, it wasn’t always like that. And so it’s going to take us some time for everyone to kind of, you know, reach a point where data truly is not just for the internal organization, but for everyone. That’s when data becomes a utility, right? Data should be available, it should be something that we’re able to leverage and able to use in order to make the society better. And so I just, I feel like we’re getting there.

Tim Zonca 21:12

And in the example you used with, like, the police department, saying, you know, they weren’t prepared to share the data… it was kind of a mess. Is that the most common reason organizations don’t share information, or are there other impediments? Or just reasons that, you know, they’re sitting on some set of information, and it’s just hard to get out the door, they can’t make it accessible?

Meme Styles 21:38

Yeah, I think there’s several, several reasons. Right. One reason, of course, is I think that when you share data, that means that you are making yourself vulnerable, right, to the people. And some organizations just don’t want to be vulnerable, and they don’t want to be totally 100% accountable to the people. That’s just, that’s one experience that I have. And I know that to be true. And I think, secondly, again, organizations are kind of racing to have the capacity to do data well, whereas they had not done it well before. And so they may have had terrible records management systems in the past. There are organizations out there that are working to do police records management systems better. There’s one that comes to mind right now, Mark43. We’ve been able to work with them, you know, early on in MEASURE’s work. And they’re thinking deeply about how do we do data better, so that it’s more transparent to our community, but also really great for evidence-based decision making at the police department level. And so, I think that those are some of the reasons why. But I also… I always say this: Equity can be measured. Right? And when you begin to measure equity, that’s where some organizations are not ready. Right? They haven’t done the work. They haven’t really, truly thought about what it means to be diverse, or inclusionary, or to create a place of belonging. And so I think that there can be a hesitancy about sharing data that truly illuminates what’s happening. And I’ve personally worked with organizations where, you know, we’ve learned from the people about how they were operating. And it came to the point of like, “Oh, wait, hold on, we don’t want to share that just yet.” Yeah. Yeah. But I mean, you work with MEASURE, so we’re going to share it. At the end of the day, I just feel like it might be a fear. I think there’s a fear.

Tim Zonca 24:04

Oh, interesting. Yeah. …So maybe just to shift a little bit. I think, up until now, you’ve been kind of painting a picture of your vision and kind of what you see happening, especially in the data sharing space. But what about your vision for MEASURE, in particular? What do you see as MEASURE’s role in the world?

Meme Styles 24:41

Yeah, so I’m going to give you this: It’s kind of a framework of inquiry that I’ve been thinking about, as a researcher with the Social Science Research Council. So they’ve selected me as a fellow for the next two years, and what they did was they put us all up on the mountain. They took us up to the mountain in the Catskills to just really think, to have an opportunity to think before we dove straight into this fellowship. And when I was up on that mountain, what I came up with was this framework of inquiry: I said to myself, “You know, perhaps a measure of equity is the absence of fear. And perhaps a measure of equity is the absence of pain.” And then, just a couple of days ago, I was on another panel, and I was listening to the stories of Black women and Brown women, of Indigenous women, and [I was] thinking to myself, and also listening to the stories of white women. And then I came up with this understanding that perhaps a measure of equity is the presence of joy. Meaning that, it’s us all working together. To get through the mud together. We’ve been through so much as a country, right, as a society. We’ve experienced so much institutional racism, so much structural racism, so much oppression in every form, and at every level, right? But maybe, when we’re thinking about imagining this world, we have to start with “perhaps.” Because “perhaps” really holds space for uncertainty. But it also truly holds space for possibility. And I think that’s my vision for MEASURE. So MEASURE is really taking on this notion of possibility. This position of possibility. And so, through our work, we’re taking on systems of generational oppression and redistributing the power of research to our community so that we’re able to really redefine this narrative that we’ve all been in together.

Tim Zonca 27:03

You know, it’s interesting, I hear that “perhaps” in the suggestion of this notion of possibility. But also, I’m really intrigued by what it seems like part of your thinking kind of resulted in. It seems like you started with this idea of the absence of something, like the absence of fear. And then it seemed like, there was this idea of what gets replaced with this? And you said, well, the possibility or potential for joy. So tell me more about that tension.

Meme Styles 27:37

That was a journey, Tim. You’re making me dig deep here because that was a true journey in order to get from this absence of fear to the presence of joy. And I’m still, I’m still trudging through that, to be honest with you. But where I’m at right now in that journey is, truly, that I want to be able to trust that my 17-year-old son, who is an almost six-foot-something Black boy, can drive his car to school and come back and not be harassed by the police. I don’t want to be afraid to have him do that. I want to be able to know that my 23-year-old daughter, when she is ready, hopefully 10 years from now, to go have a baby, that she can go to the hospital as a Black girl with dreads and be believed about her pain. Right? I want to know that my mother-in-law, who needs access to healthcare, is going to get that access to healthcare. And so that, for me, is where I was at with that first statement: It’s really holding space for the absence. How likely is it that my son is going to come home safely? How likely is it that my daughter is going to give birth and be okay with birthing? How likely is it? Is my mother-in-law going to get the healthcare access that she needs? But then…this joy piece, right? This joy piece is… I believe it’s owned by us collectively. Perhaps a measure of equity is the presence of joy: That means that it takes all of us, together. In order to get there, joy happens through pain. Right? Joy happens through pain. We’ve been through so much, again, as a country and as a society, that we need to get to this “perhaps joy” at the end. And we can only do it together. Now, mind you, the absence of fear and the absence of pain is not on me as a Black woman to solve for. Right? I don’t own that. That is the system. That is those that are perpetuating racism within their organizations and in their spaces. That is, you know, these stories that are being passed down generationally about who has more power and who does not. That’s who owns that, right? But there is an opportunity for me to engage in this anti-racist world and imagining this new utopic society where we all are truly included. And that is, by us working together, to get to that place of joy.

Tim Zonca 30:56

Thank you. Thank you for sharing that part of the journey that it sounds like you’re currently on as you think about where that role of joy comes in. Especially, I like the framing with “perhaps,” and then kind of thinking about a measurement of that. It’s really, I don’t know, it’s fascinating. I haven’t thought enough about it. But I’m excited to just be exposed to that idea, to be able to kind of consider that now. Well, you know, you just walked through painting a picture of, to use your word, “utopia.” So if you could name at least a tangible set of the kinds of things that MEASURE is really trying to deliver on over the course of the next year or so to unlock something as it relates to sharing this precious and potentially powerful information, what would you love it to be?

Meme Styles 31:59

Well, first of all, let me just say there’s an incredible group of Black women on the East Coast, and they’ve developed what’s called the “Building Utopia” deck. This is a game that one can play, and it really centers on, you know, imagining new futures for Black people. And it’s not just for Black people, it’s for everyone, right? And so that’s where, you know, we’re really drawing upon my identification as an Afrofuturist, as a person who really draws from generations and generations of my ancestors who have brought me to this place right now — today, sitting with you. And me also, at the same time, holding space for seven to eight generations from now, for that Black girl that comes from me, you know, and the legacy that I hope to create for that Black girl. And so that’s that first part that I just want to address for me right now. I, personally, would love to gain a deeper understanding about how artificial intelligence can be used in a way that does not create a deeper pit of racial disparities, right? Like, that’s where my mind is really excited. And my heart is excited about it right now because we know that, as AI rapidly increases, it’s so incredibly important right now (it was actually so incredibly important, like years ago) to engage in it so that it does not cause additional harm to the BIPOC community. And so, in one way, I know that I can engage right now, and the way that I am engaging right now is we’re solidifying a partnership with an incredible company here in Austin called KUNGFU.AI. And they’re really truly holding space for, like, you know, we need to create and innovate in a way that does not widen these gaps. Right? And so that’s where I’m at right now: How do we train AI technologies to not widen these gaps and then maybe even potentially start to close some?

Tim Zonca 34:46

Is the main concern that AI would perpetuate some of the kinds of biases that are just embedded in the data?

Meme Styles 34:56

Yeah, again, data in, data out, right? So if we don’t engage, if people that look like me don’t engage in the work of developing the systems, then we’re going to be left out. And then we’re probably going to also be harmed. We’ve already seen it happen, right? There have been multiple stories and multiple examples of how, you know, Black faces or Black hands are not seen by these different types of technologies as they’re developed. You know, AI can now create art, right? And so, some of that art might be harmful if we are not the ones that are helping to create these algorithms and truly engaging in the development of these technologies. Ruha Benjamin is one that said that we the people need to be the ones that are creating technology. We need to truly engage in these conversations and then also in the developing. That’s where I’m most excited right now.

Tim Zonca 36:11

And then what about as it relates to some of the work that you have, you know, maybe even a little bit nearer term that you’re working on over the course of the next few months/half a year with MEASURE? What are you most worried about? And then what are you most excited about in that more near-term horizon?

Meme Styles 36:33

Okay, so I think, and I guess it also kind of ties back to some of that work, the work that we’re doing with KUNGFU.AI around AI technology. So what I’m most thinking about is we’re creating a social impact platform right now [at] MEASURE. And this social impact platform is where powerful Black, Brown and Indigenous leaders are then connected. They’re able to report their impact through what we call community impact metrics. They’re able to find funding and team up as a force-multiplier for change. And so that’s really where a lot of my thinking is going every single day — the creation of this technology. I’m asking myself questions like, “How do I create technology that’s safe?” And I’m starting with that, right? How do I create technology that is going to see a Black face, whereas technology in the past may have not seen — even, like, literally seen — a Black face to fit inside of the picture? How do I create technology that is not going to create additional surveillance that doesn’t need to be there? Right? That could potentially create harm. And so, to me, this is the most important work for me, right now, for my organization. This is the most important work. I’m super excited for organizations and companies that are recognizing this and that are coming behind us, you know, to support this work: companies like Vendia, organizations like the Social Science Research Council, that believe that people of color need to be the ones developing code, [that] we need to be the ones that are, you know, in control of the algorithm and of the code, right? And so, yeah, that’s where I’m at right now. This platform, truly, is being positioned to be an antiracist social platform, where we can be intentionally exposed to funding, not passively. Remember: Black-led social change organizations get 1% of philanthropic funding. 1%. We are that other 1% that nobody talks about, right? It’s truly going to be a place where people that look like me can measure their progress. And where we have a database now, where we can say this is how much we’ve done in the past year: Fund us. Right? It’s going to be a place where we can, you know, truly champion one another’s goals. We do that in real life already. We are a people that, you know, even if we didn’t have the funding, we’re going to do the work. But we could do the work even better if we had the resources that we need in order to do it well. So that’s what I’m very, very excited about. I keep using that word, but I think I love to use that word. And what’s truly bringing me joy right now is the development of this new technology. And I’m also hopeful about how I might think about or imagine AI technology being incorporated into it. And I say that to you, Tim, as an extreme critic of artificial intelligence and technology as a whole. Just to be honest with you, I have been hurt by technology. And that’s the truth. And so I’m using that pain in order to get to that joy.

Tim Zonca 40:56

That’s a powerful way to start to wrap up this conversation, which is to hear some of the pain you just suggested came from technology, and then the 180 of “I’m working on a technology platform.” I mean, it’s really tremendous. And what about maybe just some final thoughts, you know, especially as people listen to you and say, “Hey, how can I help? How can I get involved?” Or what do you want to ask the people who have listened to this?

Meme Styles 41:33

I mean, first of all, MEASURE is always looking for brilliant people that are just interested in being an advocate for change. And you can always go to our website: It’s wemeasure.org. We’re always looking for folks that are, again, just interested in data activism, you know, wanting to use the power of data for change. And it might be super project-based, like being able to provide three or four hours out of your month to analyze a dataset. And that is always incredibly needed. But I think the whole, overarching ask for me would be, you know, trust Black women in the space. Trust women of color in the space. We know what we need. We may not know exactly how to code it, right? But we know what it needs to look like. And we have the experience. And because of MEASURE and other organizations like mine, we have the data to prove it. And so, yeah. Trust Black women in this space of technology and development because we’re pretty phenomenal.

Tim Zonca 43:05

Meme, thank you so much. I mean, thank you for your time is, I think, the obvious one, but just thanks for sharing your vision and your expertise with us. It was really a wonderful way to spend time, so I thank you very much. And to our listeners: Thank you.

Meme Styles 43:19

Thank you, Tim.

Tim Zonca 43:22

Thanks so much to our guest, Meme Styles, for all the real talk on data sharing. I really appreciate it. And thank you, too, for listening in. If you’re interested in learning more about the various organizations, products, or research mentioned in this episode, visit vendia.com/resources/circlesoftrust for all the links. When you’re ready to keep the conversation going, download or stream all of our episodes on Spotify, Apple Music, and top streaming services. And if you have a point of view on the challenges, power, or potential of real-time data sharing and want to be a guest on the show, email us at [email protected] or DM @VendiaHQ on Twitter and mention Circles of Trust. Thanks again for joining us on Circles of Trust. And if you like what you hear, please take a moment to drop us a few stars and a favorable review, or share Circles of Trust with your colleagues and network. Until next time.

