Episode Two: (In)Visibility

Episode 2 | February 08, 2021 | 00:40:49
Mirror with a Memory

Show Notes

Episode Two explores the benefits and disadvantages of going unseen by surveillance technologies. We examine notions of visibility and invisibility in the context of AI imaging systems with author and professor Simone Browne, artist Sondra Perry, and artist and academic Mimi Onuoha.


Episode Transcript

Speaker 0 00:00:03 Be positive. You're going to be positive. You are a great, unique human being. You are a great, unique human being. You are a great, unique human being. Speaker 1 00:00:22 Welcome back to Mirror with a Memory, a podcast from Carnegie Museum of Art's Hillman Photography Initiative, exploring the intersection of photography, surveillance, and artificial intelligence. I'm your host, Martine Syms. What you just heard was an excerpt from a feature film I made called Incense Sweaters & Ice. That was Fay Victor in the role of Queen. Speaker 1 00:00:53 Incense Sweaters & Ice follows a young woman, Girl is the name of the character, as she makes a reverse migration from Los Angeles to Clarksdale, Mississippi. And there are sort of two other characters orbiting her: one, WB, who only appears through text message, and another, Queen, who is a kind of internal amalgamation of outside voices. A lot of what I was interested in was the many gazes we're subject to, and how to have some agency within that, and also how being photographed, or taking photos of yourself, or documenting what you're doing, is a way of situating yourself in the world. The character Girl is a traveling nurse, so in the film we see her going to the hospital, working with clients, on a date. I was really interested in the idea that everything you do, everything that's filmed, is a performance. And so visually I wanted to try and represent all the possible cameras that might see her during a day, from her front-facing camera, to her being in the background of someone else's camera, to the hospital entry and exit, and also this character WB, where there's a kind of first-person view. Speaker 1 00:02:30 And so we're switching through a few different viewpoints. Speaker 0 00:02:34 Do you think about things in a different way, but I can't really concentrate on them. Speaker 1 00:02:37 Like this moment, when WB, short for "white boy," is filming the character at her apartment while she's reading a book on her couch. Speaker 0 00:02:45 But it's like, it's, it's not really even about quiet. It's the fact that you're filming me, like, I can't... like, I'm very aware of you being here. Like, that's the, that's the issue for me right now. So just, um, stop. That'd be great. Speaker 2 00:03:14 There are many, I guess, influences on the project, but my interest visually was trying to show how there are different gazes, basically, and how sometimes you don't want to be seen, depending on who is recording you. I was kind of interested in showing, I guess, the different intimacies of the different gazes, basically. And that's a big part of my own understanding of photography, of film. I have a longstanding interest in orphan media and sort of people's home videos and snapshot photography, and so all of those scenes were playing a part in this. And I was reading, as I was working on the film, which premiered in an exhibition at the Museum of Modern Art, about this "time cinema," as I was terming it, which is my understanding of surveillance and the different responses that black women specifically have to it. Because I feel, over the last few years, though there have been other times when this has happened and the character of it feels a bit different to me, people have adopted the fashion, the body, the language of black women, and there's more visibility of black women themselves.
I'm always wary of visibility and of representation being, like, what we want, what I want, because it's so easily co-opted, instrumentalized. Representation won't solve everything; there are a lot of problems associated with it. Speaker 2 00:05:05 In this episode, we're talking about visibility and invisibility, how it works and how it doesn't work in AI and surveillance imaging systems, which also gets at something we touched on in our previous episode about biometrics: the benefits and the disadvantages of going unseen. To dig into this, I spoke to two people I admire. Sondra Perry: I'm a video and installation artist, and I'm interested primarily in what our bodies are doing in relationship to the technologies that we find ourselves working with on a daily basis, and in mobilizing the potential of representational technologies, whether those be avatars or algorithmic structures. I'm interested in what blackness is doing in those types of spaces, how Speaker 3 00:05:58 it shows up, how black people find themselves in digital space. My name is Simone Browne, and I'm a teacher, writer, and researcher. I look at surveillance, try to examine it and understand its operation in black people's lives. Speaker 1 00:06:15 In 2015, Simone published a book on the subject called Dark Matters: On the Surveillance of Blackness. Speaker 3 00:06:19 In Dark Matters, I set out to put surveillance studies into conversation with black studies, and to really kind of think about what happens to surveillance studies when we center black feminist theory, when we center the experiences of black people, and how we can try to understand the kind of precursive formations of surveillance prior to 9/11 or some other, you know, spectacular event that ratcheted up a lot of the surveillance infrastructure that we see now. And I try to draw a longer history of biometric technology by looking at the practice of branding enslaved people as a form of identification or a form of punishment, if you just think of biometrics simply as, you know, "bio" of the body and "metrics" as a form of measurement, to see how some of the same type of racial logics play into the current ways that biometrics are researched and developed, where it is assumed that bodies can be identified in a stable manner. Speaker 3 00:07:27 And I guess the key thing that came out of that was this notion of, you know, certain technologies, when it comes to things like facial recognition, place the white body as the kind of prototypical body that these technologies are designed to work for. And so what happens for those that fail to enroll, whether that is when you go to a bathroom and you try to use the touchless faucets, or, you know, your face might not be recognized by an automated camera, to see that there's something about, like, escaping that capture that can give us some lessons about another way around or through these technologies. Speaker 1 00:08:11 What I keep thinking about is a realization I had very, very late, which was: in public restrooms, they have these automatic faucets, and those things, like, never work for me, like, ever. And I remember when they started to be so dominant, like in every bathroom you'd be in, and I'd always be trying to get it to work and see people next to me having no problem and be like, what's wrong with me? Why can't I, why can't mine work? Why does this never work for me? And it was only, like, years later that I realized part of that was because those technologies have a hard time reading dark skin.
And it was really this, like, "oh" moment, like, that's what's been happening this whole time. And I feel like Simone's work is answering some of those questions about, like, all the ways that something as common and repetitive and necessary as bathrooms, which we already know have been a site of racial conflict in our country with segregation, Speaker 1 00:09:16 for example. She sort of uses that technology that we encounter all the time to talk about all the ways that it might miss me as a person, as someone who's interacting with it, and why that is, and how it is racist in its ideology, and the history of those ideologies. And so we might think of something like touchless technology or biometrics as being contemporary, but they have these longstanding precedents that bring us to where we are. And I feel like her book Dark Matters was really informative for me and just illuminating, in the same way that when I finally understood why I could never use those faucets, I was like, oh God, it all makes so much sense. Dark Matters Speaker 4 00:10:14 has been a touchstone for Sondra, too. Yeah. I, um, became familiar with Simone's work through her, like, groundbreaking book, uh, Dark Matters: On the Surveillance of Blackness. And it became really foundational for a lot of the thinking I was doing at the time, that time being 2018, while Sondra was working toward her solo show at the Serpentine Gallery in London, which explored technology, identity, and the black American experience. I was able to invite Simone to come to the exhibition, and we did a closed-door workshop with other black artists and academics in London, and it was one of the best professional moments of my life, I have to say. The exhibition was called Typhoon coming on, which used a Turner painting called The Slave Ship as a foundational work. The outside of the gallery was projection-mapped with a CGI-rendered ocean in the color purple, with the bodies and the ships and the fish of the Turner painting removed. The entire exhibition was kind of a meditation on how exhausting living in a capitalist world is, and what we can do to put a kink in the gears of a productivity that refuses the rest and the recuperation of the bodies of black people, and trying to find ways through the artwork to make it less efficient, less efficient in its readings, less efficient in its ability to make work easy, to make working easy. Speaker 3 00:11:55 The workshop that we did with, uh, Simone and Neil White was kind of based on Simone's work, working through these types of surveillance modes and talking about those specific mechanisms that allow us to be seen. Speaker 2 00:12:09 To be seen. I'm thinking a lot right now in terms of: there is this unrest, it seems, and exhaustion, basically, with, like, the surveilled image and its failures. And I'm curious if there's anything about the possibilities of that. I guess, to be more specific, I'm thinking about all the footage we see of black death, which often is coming from some sort of surveillance footage or from somebody recording an interaction. And there's a relationship between that and older lynching photography, uh, but the sheer, like, amount of it that we're seeing, and then the repetition of it being used as some sort of evidence and then the person not getting convicted. There's, like, that on one hand for me, when I think about, like, racialized surveillance right now. And then on the other hand, I think there's, like, this enormous amount of joy and pleasure and fun in digital space.
I think, the way that blackness operates, there's a lot of, like, linguistic fun and imagery, and just, like... well, I think of something like Black Twitter, or I think of Daquan memes. I'm just curious, Simone, if you could kind of talk about that spectrum, I guess, as it relates to your research and your work. Speaker 3 00:13:36 <inaudible> Pandemic time is such an exhausting time, right? There's no time, the days just blending into day and night. For me, being unable to breathe because of COVID, seeing breath being taken away again and again through these videos, there's something about the repetition of that. I don't necessarily think that these videos are going to make, I want to say many people, certain people, white folks, the state, whoever, recognize black people's right to humanity, to life, to livingness. And so when you talk about how there are also moments of pleasure, moments of what Katherine McKittrick calls black livingness, there's something about, like, black joyfulness that is, will be, and I think always was that moment of putting a kind of wrench in the system, a kink in the system. And I think that that kind of shows the emancipatory possibilities of this kind of creative innovation, like Daquan, or the ways that people are organizing for rest, for joy. Speaker 3 00:14:40 Like, you know, The Nap Ministry by Tricia Hersey is always reminding us, as Sondra does, that productivity is painful; we need to rest. But I don't think any type of liberation could come from these videos of black people dying, of white people calling the cops on black people for picnicking, shopping, whatever it is. There can be no liberation that comes from those things. Having it on tape, having it on video, doom-scrolling it, is not making it stop. And so I don't really have an answer, except that we have to have kind of, like, an ethical relation to our consumption and our distribution of these things. Like, we don't have to share; we can still honor, still work towards something that looks like justice, but not be part of these same types of practices that, as Martine just reminded us, are very similar to the kind of sharing of lynching artifacts and photographs. And when I say artifacts, I mean parts and pieces of people's bodies, clothes, these types of things that were part of the making of white identity, white supremacy, and community formation. I can't say it was a long time ago, because it is still happening. Speaker 2 00:15:53 There's definitely, I agree, there's nothing liberatory about that, but at the same time I'm encountering that. I guess there's part of, like, the feeds I use, I think, community formation, was that the term that you just used? Yeah. I guess I'm curious, because a lot of, like, the social media that I consume is primarily black culture and black cultural production. And sometimes it's not even, you know, it's just, like, deep YouTube videos, but that's where I spend a lot of my time online. I see that as another type of community formation, in addition to sort of these family structures. And Sondra, I see resonances with both of those things in your work. So I'm curious about, like, your own image production, capture, and distribution as it relates to that. Speaker 4 00:16:40 That's a great question. I think it's something I've been mulling over for a while: how I capture images and what they do after they leave, you know, my studio, AKA my local connection on the internet, and get disseminated.
I've been feeling really kind of, like, protective over the people that I image. I guess I always am, but even more so now that there's a clear kind of cultural understanding about, like, how companies like Twitter and Instagram work and how they build their, like, profit margins, which is through the collection of data from people who use these free platforms but wind up doing a lot of labor. I had a, uh, kind of, like, a Finsta Instagram account for, for like two months. Speaker 2 00:17:36 Finsta as in fake Insta, so an Instagram account that's private and thus less censored than someone's official presence. Speaker 4 00:17:45 I just kind of missed, like, a connection. And all I watched was, like, makeup tutorials and, like, home decor Instagram accounts. I realized that I missed it a lot, but I was spending way too much time on it, so I deleted it a couple of weeks ago. But the reason why I deleted my social media in the first place, a couple of years ago, is because there was a real clear understanding that these amazing activists and people who were doing organizing, doing this, like, really amazing work of, of creating community through media production, even in all of the amazing things that they were making and producing and building, were being exploited by these platforms. And the fact that, like, the same kind of community building was happening on the other side, through, like, these QAnon conspiracy theories and the rise of the alt-right, um, that's been happening since, like, 2015 or something like that. Speaker 4 00:18:47 It was just, like, hard to figure out what to do in those spaces, what they're useful for. It becomes really complicated. You know, I've been really conflicted about what that means, and then what it means to upload, to contribute, because imaging inside of these spaces can be really regulatory and interesting. And there are possibilities to, like, really mess with algorithms. I was thinking, like, wouldn't it be interesting if every time that you uploaded a selfie, you moved your nose around or something, a centimeter to the left and then a centimeter to the right, and then, like, moved your lips, to see how that factors into the algorithms that are being used to identify faces? I think that would be a really cool way to, like, engage in being on one of those platforms that requires a kind of rigidness of identity in order to profit off of that data. But I, yeah, I've just been thinking about what it, what it means, in the same way that, like, using those ancestry tests not only gives away your DNA signature but your entire family's DNA signature. You know, I've been thinking about what it is to even upload images of people onto the internet. And I don't, I just, I've just been kind of mulling it, you know? Speaker 5 00:20:08 No, I don't, I don't really know. My name is Mimi Onuoha. I am an artist, Speaker 6 00:20:18 and most of my work is looking at the implications of a world that has to be made to fit the form of data. And so I find myself asking questions like: what are the reasons that our world is being made to fit these forms of data, and who benefits and who loses from that, and in what situations? When I think about that particular nexus, that nexus being our endeavor here, the intersection of photography, surveillance, and AI, there is this question that just comes to mind. And that question really is: what does it mean to be seen? And I use "seen" as opposed to "known" very intentionally here. What does it mean to be seen? Who do you, who do we?
And I, I mean, like, you, whoever is listening to this: who do you want to be seen by? Who do you not want to be seen by? In what situations does being seen mean that you are also known? Does being seen mean that you count in some way? Or does being seen mean that you have been, uh, like, flattened and reduced? Speaker 2 00:21:18 In 2017, Mimi started making a body of work addressing these very questions. Speaker 6 00:21:24 Us, Aggregated is a whole series that really has to do with Google's reverse image search algorithms, and it is playing with those reverse image search algorithms to think about power and community and, really, identity. They're all a little different, but all of them deal with images from my own personal family's collection, images that haven't been put online, that have always only been offline in our own kind of archives, and taking those images and then putting them through Google's reverse image search algorithms and seeing which images it says are classified as similar. In a lot of different versions of the piece, what I'm trying to do is show this image from my own family's archive, and then all of these other images from these other people or different groups who are similar to us, and thinking about: what does it mean that they're similar to us? Speaker 6 00:22:10 And also, what does it mean that this is a grouping, a grouping of people who are similar, who will never meet, who will never know that they have been grouped as similar, who almost have no stakes in it? In particular, in one case, I have this framed cluster with, I think, 16 different images. And at the center of it is a photo of my mom. And it's a photo of my mom in the late eighties or early nineties, and she's standing in this house, she's wearing blue jeans, she's wearing this, like, tight black shirt and red heels. She was phenomenal. She looks so good. And she's standing in this house and she's kind of looking out at the camera, and all the photos around her are photos that have been algorithmically classified by those reverse image search algorithms as similar, and they range. And I should say also, I'm Nigerian. Speaker 6 00:22:56 My mom was a black woman; we're Nigerian, Igbo. And so there's my mom in the center, and around her are these women from various different settings and racial and ethnic backgrounds, which I think is intentional on Google's part. And that's everything from, like, a woman in what looks like a prom dress sitting against a wall, to this child who is standing in what looks like a nightgown, to another woman, who looks kind of like a model, who's standing wearing this white dress. Most of them are standing, a few are sitting, but all of them have in common that they are a woman standing on her own against a background that feels kind of golden-hued. And so there's this kind of similarity to the feel of a lot of them, even though there are, as I said, slight differences. To me, this one is very much about how it makes me feel. There's something odd. It's like this back and forth of: I see it. I do see the similarity. I see why all these people would be grouped similarly. And also, I think it's weird that I, that I see it, because I, I know how this happened, and I know that they're not actually similar at all, that it's just, it's just the images.
It is a strange feeling of comfort and dissonance at the same time. Speaker 2 00:24:06 The project doesn't prove or reveal anything specifically about how Google is training these algorithms, except that the training is active and ongoing, and that it's wrong to think of these algorithms as unchangeable or fixed. Speaker 6 00:24:20 I know that Google runs its reverse image search algorithms through filters, and it certainly goes through a lot of different processes. Some of them are ones that are obvious but that, like, you wouldn't think about, because they don't want to return anything that would be seen as, like, inappropriate or explicit. So there, there are filters. And then also, Google does this thing that's a bit different than some other historical reverse image search algorithms, where it's not just, like, looking at the pixels purely; it's first reading the image and tagging it with something, and then returning images that are tagged in a similar way, as well as have some kind of, like, image analysis done on them. But the truth is, I don't know, I don't know 100%. I have code that I was running on it, and I did it multiple times. Speaker 6 00:25:03 And sometimes the way that those images were tagged changed, and clearly it was, like, the result of something that had changed over on Google's servers, whatever update they had just pushed forward, and I was belatedly stumbling upon it in a way that most people wouldn't, and, you know, in a way that most of us, when we live our lives, we're not quite as aware of how these algorithms that we use, or that are classifying us, are changing in the background. Knowing that so many proprietary companies hold disproportionate power in this space, I am doing this work of trying to, like, almost reverse-engineer what the terms are and what is valued and what isn't valued and how these different things are working. For me, in that piece, I think a lot of it is this uncertainty; there's so much uncertainty in it. There's so much unawareness that is built into it, whether it's the people in the photographs themselves, who I don't know, who don't know that they've been grouped as similar. Speaker 6 00:25:52 And I find that interesting, not just as a way of thinking about power and access, as always, but also just as, like I said, almost like a dialogue, almost like a very intimate dialogue that I'm the only one engaging in. To classify is always just a fraught act, and yet it is an act that we are constantly doing. We're constantly classifying ourselves, we're constantly classifying one another, constantly classifying the world. It is tempting to think that passing off that task of classification to machines, to objects, to algorithms, actually relieves us of the tensions of it. And I think that what is really noteworthy about this particular moment is that it seems like there is a wider mass understanding of some of that: that, oh, actually, in passing off these decisions, we haven't passed off any, any of the other tensions associated with them. Speaker 7 00:26:49 Here's more of my conversation with Sondra and Simone. Speaker 2 00:26:53 Some of what we're talking about reminded me, I guess, of your piece It's in the Game. I really love this film, and I've taught it a few times; it always gets a great response. And I think one thing that people really latch on to is the afterlife of images and of one's own image, and how you're talking about this kind of distributed persona.
And maybe your use of avatars speaks to that a bit, but could you describe this piece, and, I guess, maybe how you made it, your process, rather? Speaker 4 00:27:28 Uh, the, the piece It's in the Game is structured around the story of my brother, who played Division I basketball in college for Georgia Southern University for about two years. And he was involved in a class action lawsuit, with thousands of other players, against the NCAA and EA Sports, who used the images of basketball players without their permission in March Madness video games. It was settled in 2015 for, I think, $60 million. Speaker 7 00:28:04 Category A: if the claim administrator determined that you were on the roster of a team that appeared in a qualifying video game... Speaker 4 00:28:11 It was really interesting, because my brother and I are really, really close right now, but I had a lot of resentment against him, because I thought that his, like, college experience was just super easy, 'cause he was a college athlete. And I remember happening upon the story about the lawsuit and asking him if he had known about it. And he had; he had already signed up to be a part of the class action lawsuit. And we just got into a long conversation about what our experiences were with higher education. And I had no idea that he was struggling. I had no idea that he'd go to bed hungry. College athletes aren't allowed to be compensated for their labor; they just get scholarships for education. That's the, that's the justification that they give: that, um, college athletes get a free education, so they shouldn't be compensated for their physical labor, um, playing these sports and making these colleges billions of dollars per year. Speaker 4 00:29:10 The work consists of a lot of things. You know, I was really interested in likeness, and that's what the lawsuit was called, like, the likeness lawsuit, and my brother and I are twins, so I thought maybe our first instances of likeness were each other. And so it revolves around our relationship, and then his relationship to the game that images him in some really specific and then nonspecific ways: like, they get skin tone correct, you know, but not necessarily all the facial features. And that's interesting too, because in the lawsuit, if the game was produced later in the decade, then you got more money from the lawsuit, because the imaging was closer to reality. If you were imaged in a game that was earlier in the decade, you got less money, because the technology couldn't image the players as closely, which is really fascinating and really makes no sense, because if you think about the system of exploitation, like, it really has nothing to do with, like, how you were imaged; it has to do with how your body was used. I made an avatar of my brother, who's, like, talking to himself in the piece. Speaker 8 00:30:18 Six-seven, 220 pounds, athletic, can run the floor, can shoot a little bit, play defense. Speaker 4 00:30:25 And, uh, used, like, a couple of versions of "You Are Everything," "you are everything, and everything is you," sung by The Stylistics: like, "Today I saw somebody who looked just like you, they walk like you do, I thought it was you." I was thinking about this piece for a couple of years, working on it off and on, bringing my brother from Pensacola, Florida, to go through the game.
And there was something that just wasn't clicking until I found out that the Metropolitan Museum of Art and the British Museum were making 3D photo scans of pieces in their collections and putting them on these web platforms that allow you to download them and kind of alter them or 3D-print them. And it started to make sense to bring those stories together, because I was thinking about these objects that exist in these encyclopedic museums and how their digital being was kind of being co-opted from an open-source internet understanding. Speaker 4 00:31:29 And I thought that that was really wild, because for so many of the objects in those institutions, there's a lot of conversation about repatriation happening. So what happens when the objects stagnate inside of that space, but the digital rendering of the thing can get downloaded by, you know, a teenager and completely digitally manipulated? Like, what happens to those objects? What happens to their being? And I saw something similar happening with not just my brother but all of the players that were dealing with this lawsuit. You know, in a lot of ways, it, like, brings up more questions than it does answers, but there is something around, of course, exploitation, around the remnants of colonialism, which aren't necessarily remnants, because they're still very much present, and kind of, like, the haunting of digital remnants, you know, what that means in relationship to the body. If there's a haunting, then that means that there was a physical presence; at some time, there was something real in the world that existed. And in these cases they still exist. So it's like the haunting and the body exist at the same time, which is really wild. Speaker 2 00:32:40 Yeah, it's crazy. I mean, there's so much with the student-athlete controversy, as well as just, like, professional athletes in general, and a lot of maybe parallels in language, or people invoking the language of slavery in terms of their own exploitation. I guess I'm listening to this and I'm thinking about the ownership of images online, because when I'm just thinking about, like, athletes or certain bodies, certain body types, this algorithmic gaze, there's always a kind of, like, sorting and valuation that comes into play. And I think one thing I really appreciate about It's in the Game is these kinds of poetic descriptions. But my favorite part of the piece is when your brother is, like, going through the players and talking about each of them. To me, that's my favorite part. Speaker 8 00:33:32 One of my favorite players is Julian Allen, six-four, from Connecticut. He was just an all-around great player. Speaker 2 00:33:38 I love basketball. I'm a Clippers fan, for those who are wondering; I don't support the Lakers, I never have. So I like athletes. I love hearing athletes talk about embodiment, basically. They wouldn't talk about it that way; that's kind of a nerdy way of saying it. But that moment, to me, in the piece It's in the Game, it's a lot about questions around embodiment and these implications of digital copies of selfhood. And so I feel like, with EA Sports and the NCAA basically using, without their permission, the likenesses of all these college athletes, what I love about that moment is it kind of gets to all that. It gets to how they are using these people, their bodies; they're literally using their bodies and the way they move. So they're not only taking their likeness in a 2D, flat way, but, you know, these games are very realistic.
So Sandy's, like, playing the game, like, going through the stats, and the players are in that, like, ready-player mode where it's kind of, like, bouncing around, like, hovering. What I like about it is there's a slip between him talking about the player, like, as he's playing the game, and the player as his teammate, and as he's kind of cycling through them, yes, you become aware of the extent to which their bodies were stolen. Speaker 8 00:35:07 He was really athletic. Um, just cracked the heart. Hardworking, never took a play off. Antonio Hanson, we actually played in the same junior college conference together. He went to Independence; I went to Barton County. So I knew him before I actually signed to Georgia. So we talked briefly. Speaker 4 00:35:42 I'm curious Speaker 2 00:35:43 just to hear you talk, Simone, about that relationship between the valuation and kind of ownership on a distributed platform. Like, I guess, you know, images are so dispersed right now, we can't use the same rubrics, but at the same time, those rubrics are applied to us. So, you know, I love when there are slips and when that changes. Speaker 4 00:36:08 It got me thinking of Sondra's piece that uses an avatar. I know, Sondra, when you, you told me about this piece, and the way that, you know, the avatar could not contend with or conform to your own bodily ways of being. Speaker 2 00:36:24 This is from a work of Sondra's called Graft and Ash for a Three Monitor Workstation, which takes the form of three screens affixed to an exercise bike that viewers are invited to ride. This is the moment from the film component of the piece that Simone is referring to. We see Sondra's avatar on screen, first against a blue chroma-key background, the kind that is usually used to composite in images or special effects; the background becomes this red, bloody, fleshy sea as the avatar says: Speaker 1 00:36:56 Hi there, nice to meet you. We are the second version of ourselves that we know of. We were made with Sondra's image, one of them captured with a Sony RX100 under fluorescent lights at her studio in Houston, Texas, on April 15, 2016. We were rendered to the fullest of Sondra's ability, but she could not replicate her fatness in the software that was used to make us. Sondra's body type was not an accessible preexisting template. Speaker 2 00:37:39 Back to Simone and Sondra. Speaker 4 00:37:40 I think with both of those pieces, there are so many resonances there of what it means to, like, represent black life, black bodies, within the limits of these technologies. With the avatar, I think that, like, technologies are always telling us how they're made, why they're made, and who made them. When it comes to, like, these body types that I had access to in a free program that I was using, I could have had a body that looked closer to mine, with teeth that were crooked, all of that stuff, but I, I had to pay for the add-on. And so it reminds me that normativity is, is embedded in every single thing that we encounter, including imaging technologies, of course, which are created in order for people to, like, create idealized versions of themselves. So it tells me about what the ideal looks like for these programmers and for society in general: it's not a fat body, it's not a body with crooked teeth, you know, it's, it's not any of those things. Speaker 4 00:38:43 And so it was really necessary for the avatar, who is not me, who looks like me, kind of, but it's not me. I mean, she really did try. And so it's humorous in its ridiculousness.
Um, but it's also really kind of, like, jarring to, to know that your body doesn't even exist in digital space. It's like being kind of erased all of the time, but, you know, that's also something that, as a black woman, I know, like, pretty instinctually, you know, that that's going to be the case. Yeah, completely. That's kind of made me think about your work, Simone, as well, and what you're researching or thinking about now about blackness in digital space and how it fits or doesn't fit. I'm hopeful that there might continue to be something good or liberatory about remaining unrecognized or unintelligible or unseen in these spaces, and by spaces, I mean coming under, like, the white gaze. And so I always come back to that question, like, what happens when blackness enters not only questions of surveillance but questions of justice? These things have always been important and foundational shaping questions for everyone trying to live in this world, in a black life that has, you know, joyfulness, fulfillness. That's what I want to continue to Speaker 1 00:40:11 look at. Thank you for listening to Mirror with a Memory. Next we consider evidence with Lynn Hershman Leeson and American Artist. Mirror with a Memory is a production of the Hillman Photography Initiative at Carnegie Museum of Art in Pittsburgh. For more information on the ideas and individuals featured in this episode, please visit cmoa.org/podcast.

Other Episodes

Episode 6

March 08, 2021 00:46:23

Episode Six: Power

Do we have the power to refuse mass surveillance? In our final episode, we speak with Forensic Architecture founder Eyal Weizman, who explains how...


Episode 4

February 22, 2021 00:39:17

Episode Four: Storytelling

In Episode Four, we talk about the algorithmic potential of storytelling. Artists Stephanie Dinkins and Stan Douglas discuss how they use the language of...


Episode 1

February 01, 2021 00:41:06

Episode One: Biometrics

Photography has been used as a tool to record our bodies from the creation of the first mugshots in the late 19th century to...
