New Look at Data: Question of Ethics and Inclusivity with Brandeis Marshall

Written by Cait Sarazin

Brandeis Marshall, PhD is a renowned computer science scholar, educator, and the founder of DataedX (pronounced Data-Ed-X), an edtech company devoted to ensuring data competency and promoting inclusivity within the data community. 

Our Senior Content Manager, Cait Sarazin, met with Brandeis to discuss her revolutionary contributions to the data ethics conversation and the importance of amplifying Black voices in the tech space. 

Cait: How have you been misrepresented in the tech space? 

Brandeis: People assume I belong to a particular box. I’m in a box that is female, a box that is Black, American, a higher education teacher, and so on. People often address me as a member of only one of those boxes at a time instead of all of me: a multifaceted, multidimensional person.

I’m a speaker, an educator, and a scholar, but it seems as though people throughout my career have only looked at me as the “diversity person,” as if I don’t conduct my own research or talk about other things. People pigeonhole you, and that’s all they really want to know about you.

That’s incredibly frustrating. How do you engage with people and speak out in a way that asserts your individuality as Brandeis, not just a Black woman in tech or a Black woman at a college? How do you attest to the importance of your work? 

Hopefully, my reputation precedes me, so I try to produce good work and let it speak for itself. I also do my best to work alongside a lot of collaborative teams, whether it’s in programming, scholarship, or any other kind of event that I put on. As an instructor, I try my best to convey the practical realities of being in computing and how computing could harm different populations.

On that note, I weave in ethics, as well as the concepts of diversity, equity, and inclusion. In my research, I draw attention to how the lens of computing may look different for marginalized groups and those who are underestimated. I try to zoom in on how data, and the interpretation of that data, move through the system for someone who is marginalized.

In particular, my Black Twitter work might seem frivolous because I’m gathering tweets about the Oscars, but it’s not: it’s looking at a large organization that exists to promote a certain caliber of work. If the work the organization promotes is too homogeneous, it marginalizes others. My question is, how do we best quantify and qualify that? No matter what I invest my time in, I seek a holistic lens that allows people to examine the intersection of race, class, gender, and computing. Hopefully, my methods impact somebody, whether they are listening, taking one of my classes, or reading my work.

Definitely. Making sure data and AI don’t come from a homogeneous perspective is also an important conversation to have. Can you elaborate on that, as well as how it relates to your work with DataedX and some of the other projects you’ve been a part of?

I thread in the ethics of computing work throughout everything I do. I’m always looking for the literal pros and cons of tech, no matter what I investigate. This includes my Black Twitter work, where a couple of former students and I were gathering tweets about the Oscars. We noticed the prevalence of mainstream ideals and the minimal representation of Black+POC narratives. That raised a lot of questions about intellectual property (on the tech ethics side) and construction of sentiment analysis methods (on the tech side). I’m still exploring how best to approach it. 

That’s why a lot of the collaborative work in this space that I participate in is crucial. Bringing in people from other fields, like sociology and anthropology, can help with understanding semantics and the interpretation of those semantics. If the semantics are wrong, then the sentiment will be wrong, and I think that is the core of the issue. But how do I do that responsibly? Again, it goes back to the ethics question, as well as whether or not there is homogeneity in the construction, design, and delivery of different algorithms (be it AI, machine learning, predictive analytics, or anything in between).
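To make the semantics point concrete, here is a minimal, hypothetical sketch of lexicon-based sentiment scoring. The lexicons, weights, and sample tweet are all invented for illustration (this is not code from Brandeis’s project); the point is that the same text scores differently depending on whose vocabulary the lexicon encodes.

```python
# Hypothetical illustration: one tweet scored against two sentiment lexicons.
# All words and weights below are invented for the example.

def sentiment_score(text, lexicon):
    """Sum the lexicon weights of each token; unknown tokens score 0."""
    tokens = text.lower().split()
    return sum(lexicon.get(token, 0) for token in tokens)

# A generic lexicon that misses community-specific usage.
generic_lexicon = {"snubbed": -1, "robbed": -1, "win": 1}

# The same lexicon extended with terms as the community actually uses them.
context_lexicon = {"snubbed": -1, "robbed": -1, "win": 1,
                   "slayed": 2, "iconic": 1}

tweet = "she slayed that performance iconic and still got snubbed"

print(sentiment_score(tweet, generic_lexicon))  # -1: reads as negative
print(sentiment_score(tweet, context_lexicon))  #  2: reads as positive
```

If a lexicon does not encode a community’s semantics, the sentiment it reports flips, which is exactly the failure mode described above.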

My research focuses on data engineering; I call it my “tech jam.” I enjoy examining how you collect and organize data while highlighting a host of procedural and ethical problems. The data science field has become a natural extension; it aligns with my continual curiosity about how data and tech impact the communities they’re intended to serve. When moving through the data work process, these issues compound. I want to explore how we are supposed to make those choices and discuss biases within the system, as well as answer the question: where does technology’s usefulness stop, and where do more humanistic views need to intercede?
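As a rough, hypothetical illustration of how those issues compound through the data work process (the records and threshold are invented, not drawn from her research), a single collection-time filter can quietly bias every downstream summary:

```python
# Hypothetical sketch: a seemingly neutral collection choice compounds downstream.
# All data and thresholds are invented for illustration.

records = [
    {"group": "A", "followers": 9000,  "engagement": 0.80},
    {"group": "A", "followers": 12000, "engagement": 0.70},
    {"group": "B", "followers": 3000,  "engagement": 0.90},
    {"group": "B", "followers": 2500,  "engagement": 0.95},
]

# Step 1 (collection): keep only "influential" accounts -- a design choice,
# not a neutral fact, and one that can exclude whole communities.
collected = [r for r in records if r["followers"] >= 5000]

# Step 2 (aggregation): the summary statistic inherits the collection bias.
avg_engagement = sum(r["engagement"] for r in collected) / len(collected)

print([r["group"] for r in collected])  # ['A', 'A'] -- group B has vanished
print(round(avg_engagement, 2))         # 0.75 -- B's higher engagement is invisible
```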

These questions were part of the impetus for my business partner and me to build DataedX. We’re approaching data science instruction in a different way. We’re developing paths for individuals, small groups, and businesses to advance their data science capabilities. Our DataedX Club is launching soon for individuals to get started, stay informed, and share insights with others about the data science space. We also design customized webinars, fireside chats, and workshops.

Our work showcases how people, data and tech interact. That’s a conversation we need to be having more on the ethics side — it’s not just tech up until a certain point, and then it becomes a people concern. I think there is a relationship, a hybrid approach we need to take moving forward. Hopefully, we can get more tech people to be aware of that and understand it, and I aim to be part of that conversation. 

I think that’s really interesting. We discussed how people put you in boxes, and now algorithms are doing the same thing. How do you incorporate intersectionality into such a complex algorithm? 

The question is, how do I be in tech yet shine a light through all of its holes from the inside out, and then work to fix those holes? I’m still on a path of discovery. Once I identify a gap, sometimes I find an entire body of work in another subject area, or I turn around and see another gap.

I’m figuring out how to handle intersectionality when it comes to tech and data work. Two approaches I’m taking: learning more and talking to lots of people to get different points of view. Understanding where tech can be good and where it can induce harm evolves at the pace of tech. It’s also important to remain open-minded and curious about how data moves through our digital systems. I don’t know how often people are thinking about that. The industry likes to focus on optimization, making things faster and more efficient, but I don’t know if faster is better. I really question that notion.

Faster might ultimately mean more compounded errors and irreparable damage. Unfortunately, we’re already almost in this place because systems that have existed for decades have gone unchecked and unregulated. 

GDPR is one stab at it, but there are challenges with that, too. How do you enforce GDPR? What are effective ways to do so? There’s much yet to be done in this space. I believe it’s a matter of ensuring that the people in the tech world who are developing this software are more conscious of how what they are building impacts the rest of society, and why.

I agree with you. Technology is moving so fast, but we’re not really thinking about things like neutrality and intersectionality. You mentioned that this is still a learning process for you — that sometimes you discover one hole only to find another immediately — so how do you deal with the knowledge that you’re going to get it wrong sometimes? 

I think you have to stay humble. I think one of the perpetuating stereotypes of someone in computing, besides being male and white, is that you’re somehow unconcerned about the human condition and believe you’re always right. 

That’s not my posture at all. I want to have a conversation and know if I did implement something incorrectly, and what the ramifications are. That’s a different part of the culture in computing. As computing becomes more pervasive, though, I see a growing number of people entering the industry from backgrounds other than math: people who have not traditionally participated in computing. They’re driving the conversation in the right direction.

So, for me, it’s essential to be humble and talk with people about being open. I believe the fact that I teach and train folks is beneficial because I’m always trying to hone my communication skills: sometimes I communicate concepts well to some people and not so well to others. Paying attention to how people digest content is a good indicator of whether you’ve done something right.

That’s the reason why I ran this independent introductory data science program. I worked with a great group of Black women interested in data science who wanted to learn more in a safe space. I do my best to provide that safe space because breaking into data work is not easy. The field is still very male, very white, and mostly middle-aged. So, when Black women say “Oh, I want to do analytics,” or “I want to do AI,” or “I’ve been doing this stuff for X number of years,” they’re still invisible, not counted, and fighting to be seen and heard. There’s a lot about staying humble in doing this work.

That was really well-put. So, tying this all together: what trends do you predict will be important in the next year?

This is a tough one. We’re in the midst of a public health pandemic and economic uncertainty, and on the cusp of making meaningful progress in dismantling U.S. systemic racism. The reckoning spans political, economic, medical, racial, sexual orientation, gender identity, and social class boundaries. These disparities, and the role of tech in them, are more than trending topics and aren’t being swept under the rug any longer.

What I fear is the ramping up of disinformation: the intentional dissemination of misleading information to perpetuate particular untruths for the specific benefit of a weaponizer. Disinformation comes primarily in two forms: blatant (overt) and implicit (nuanced). The blatant disinformation messaging can be fact-checked and called out. The nuanced disinformation messaging, however, is the most dangerous. It blurs the lines between truths and falsehoods for people.

Eroding the identification of what’s true and what’s false leads to deeper polarization. If weaponization campaigns are successful, they will further entrench our disparities through suppression, oppression, manipulation, and misuse of data. Computing, as a field, has to evolve in response to our new society of tech awareness. There should be more regulation and thoughtfulness put into how technology impacts those who are part of the system.

I don’t know if computing will respond by ushering in a cultural shift to help mitigate disparities in tech. I’m hopeful that it will be a trend that turns into a tradition. We’ll see.

——————————————————

 

Brandeis Marshall, PhD is a computer science scholar, educator and founder of DataedX.

Instagram: @bhmarshall 

Twitter: @csdoctorsister 

Website: http://brandeismarshall.com