Shift Space 2.0 is a publication exploring new media landscapes and spotlighting the 2022 Knight Arts + Tech Fellows.



Building Care into Data

Kameelah Janan Rasheed

in conversation with 2021 Knight Arts + Tech Fellow Stephanie Dinkins

Letter from the Editor

Natalia Zuluaga

Deepfake Autofiction

K Allado-McDowell

A Poetics of the Glaze

Darla Migan on Ryan Kuo

In Love with the Alien

Stefanie Hessler in conversation with Mary Maggic

Within the Revolution, Everything

Ernesto Oroza

How Do You Fight a System that Commodifies Love?

Simone Browne in conversation with Mother Cyborg

Back to the Future: Complex Movements Make Revolution

Robin D. G. Kelley on Complex Movements

Listening Underwater: Silence as Fermentation

Tao Leigh Goffe

Contours of the Score

Ade J. Omotosho on James Allister Sprang

to practice an understanding that the world is made

Natalia Zuluaga in conversation with Alenda Y. Chang and Jason Edward Lewis

Contributors

Building Care into Data

Knight Fellow Stephanie Dinkins is an artist, professor, and techno-tinkerer. She sat down with writer, educator, and artist Kameelah Janan Rasheed to share more about her multidisciplinary practice and to reflect on oral histories and storytelling as algorithm, AI's relationship to Blackness and womanhood, and the importance (and frequent neglect) of inserting care into data sets and machine learning.

Kameelah Janan Rasheed

So basically, I was asked to have a conversation with you. And I'm really excited, because I know a bit about your work. And the arc for our conversation is talking about the background of your work, and then talking about the newer work, which I'm excited about. And I'm actually kind of happy that there's some mystery around it, because I can't see what's happening. So I'm going to rely totally on you.

Stephanie Dinkins

No, that sounds good. I tend to talk backwards. So I'll try to speak forwards. But we'll get through it.

KJR

OK, I like talking backwards and forwards. I guess one place for me to start is that, over the past few years, I've grown really interested in the politics of AutoCorrect and predictive tools, because I have these eerie moments where I'm typing something in, and the AutoCorrect algorithm will punch out something that either sparks a new form of thinking or makes me think about collaborating with machines.

I started doing more research on machine learning, and digital afterlives, and data sets, and dirty data. I grew up in what many people know as Silicon Valley, so there are a lot of ways in which I'm really interested in thinking about these intersections. So I'm just curious: at what point did you feel like you wanted to think about AI technology? But also, at what point did you start thinking about that in relation to racial and gender dynamics?

Conversations with Bina48
2014–ongoing
Video

In 2014, Stephanie Dinkins started interviewing Bina48, one of the world's most advanced social robots. The ongoing videotaped "Conversations with Bina48" aim to get the robot to answer the questions "Who are your people?" and "Can an artist and a social robot build a relationship over time?" After a few meetings between artist and robot, it became apparent that though Bina48 presents as a Black woman, her retorts are often the constructions of the white men who programmed her. The robot is primarily seeded with the memories (data) of a Black American, but her underlying code and decision-making structures do not adequately address the concerns or trauma of people of color.

Talking to Bina48 has led to many questions about how culture and histories are transmitted and how to achieve broad representation in AI. This work led Dinkins down a rabbit hole of discovery and concern about how AI might disproportionately affect communities of color, why it is important to recognize encounters with algorithmic systems, and how communities of color can get involved with the design, coding, and testing of AI.

Curated conversational fragments from this project have been exhibited at: Bitforms Gallery, NY; International Center of Photography, NY; Stamps Gallery, University of Michigan; David C. Driskell Center, University of Maryland; Harvard University; and Philadelphia Museum of Art, with forthcoming presentations at the Walker Art Center and MCA Chicago.

SD

So specifically with the AI technology, I can pinpoint it to 2014. Yeah, like, on the dot.
I was teaching a class, and we were checking out ASIMO on YouTube. We saw this crazy robot on the side scroll, which was Bina48. Bina48 is a Black woman social robot that the Terasem Movement Foundation is trying to transfer a consciousness into.
I checked it out with my class. We watched a news reporter talk to this robot. After that encounter, I delved deeply into the YouTube videos of Bina48. I decided I needed to befriend this thing. I can't tell you exactly why. Over time, I started thinking about, well, you know, they have this technology, and it's coming in the form of a Black woman. I'm not used to seeing technology that looks like me. Why is that? Who's funding the project? How did all these things come together to make this robot, and can I make it my friend?

KJR

Can you say a little bit more about why you think the immediate response was a desire to form a relationship with Bina?

SD

I wanted to get to know Bina48 in a real way because it's representing ways of being that I think I know, or come out of. I wanted to know how it's representing us and to what end.

Since Bina48 is representationally Black and female, I wondered about its relationship to Blackness and womanhood. There are just so many questions there.

KJR

I am thinking about this question of where that consciousness comes from, thinking about the data sets that are sort of fed in to create this consciousness. I read so many different definitions of data sets and dirty data, and data as just a word. And I was wondering if you could define how you think about it: what is a data set? What would you say a data set is?

SD

I came to this kind of blind, not having studied data science, and started to think on my own about what a data set is. What information is important? And how is that data set balanced and shaped? Can the data used possibly provide an unbiased, or even mildly biased, snapshot of me and the communities I am concerned with?

So in the case of Bina48, I started thinking about how, especially after I got to sit down and talk to the robot, it had a pretty broad data set. Back then, Bina48 could talk about race, but on a politically correct, surface level. It had a data set, but that data set didn't seem particular enough to satisfy my expectations, in terms of the Blackness the robot visually represents.

And then I started thinking about what happens to data when we're pulling it from everywhere, when we're using the lowest common denominator to define what it is, as opposed to using the most nuanced forms of information and stories to describe the human family more fully. Are we flattening ourselves through data, or are we using data to more accurately describe the breadth of quantifiable existence in full-gamut color?

KJR

I guess that also makes me think about what data sets can capture, what they cannot capture.

SD

Exactly.

KJR

In your work, what have you found that completely escapes data collection, in terms of thinking about creating greater mirroring of humanity in things like Bina? What escapes the data set in that way?

SD

This is a really interesting question for me.

I try to think about data in terms of, well, what is the messiness in the data and the systems that use it? I want to capture that and try to make it visible in some way. For example, I'm thinking about what care looks like in a data set.

Usually when I talk to people about that idea, the response is, you can't build care into data sets. That's not how it works. My reply is, why not? Why aren't we trying? We are making data-reliant systems that underpin so many things. Shouldn't we be thinking about what it is to care through those systems? This is as opposed to creating data-centered systems quickly to make money or prove a point. If those are the bottom lines, then we get efficiency from such systems, for sure. But we also get flat ideas of what's possible, frustration, and continued inequities.

I think it takes a lot more work. A lot more work.

KJR

I'm really drawn to what you're talking about, with regards to the flattening and these opportunities to think about ways in which data can give volume, or texture, or nuance. I remember a 2016 e-flux article Hito Steyerl wrote, which talked about a sea of information and about dirty data as a form of resistance.

Can you talk a little bit about how you think about your work in relation to surveillance [and] privacy? And I want to clarify: when I say privacy, I don't necessarily mean privacy in the sense of "you can't read my email," but cultural privacy around the things that don't make it into your projects because they're considered insider or private information about particular communities.

How do you make decisions about what data sets become public, and what data sets are not public?