Episode Three: Evidence

Mirror with a Memory

February 15, 2021 | 00:40:44


Show Notes

If we know that it is impossible for a photograph to be objective, then why do we rely so heavily on photography as evidence? In Episode Three, we speak with artists Lynn Hershman Leeson and American Artist to consider how AI can complicate our relationship to pictures we would otherwise think of as visual “proof.”


Episode Transcript

Speaker 0 00:00:05 Welcome back to Mirror with a Memory, a podcast exploring the intersection of photography, surveillance, and artificial intelligence. I'm your host, Martine Syms. You've probably seen a deepfake video by now. The viral ones are usually of a famous person or a politician doing things they didn't really do, or saying things they didn't really say, like this deepfake video of Kim Kardashian by the artist and researcher Barnaby Francis, who works under the pseudonym Bill Posters: "When there's so many haters, I really don't care, because their data has made me rich beyond my wildest dreams." I know you can't see it, but trust me, the effect is uncanny. The vocal impression is synced with Kardashian's mouth, face, and mannerisms. Deepfakes are big in pornography now, too, and in disinformation campaigns and various types of attempted fraud. It's essentially a mapping process. A well-trained AI reads two videos: one of the celebrity or politician or whoever, one of the deepfaker mouthing out what they want that person to say.

Speaker 0 00:01:12 The AI tracks the motion of the lips and face in both, looking for patterns, then digitally combines them. There are full-body versions of this too, which result in things like Bernie Sanders dancing to Lady Gaga, though these tend to be less convincing. There are a few different ways to fake the vocal component: an analog recording, like Jordan Peele's spot-on impersonation of Obama, or, more recently, an AI that collects data from your voice and reconfigures it to create entirely new sentences. The videos themselves can be wonky. Sometimes you can see the image separate as the algorithm struggles to keep up, but some of them are quite good, and the technology is only improving.
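The mapping process Syms describes can be sketched in a few lines. What follows is a toy illustration only, with invented landmark data: real deepfake pipelines use learned face detectors and neural renderers, not the single least-squares affine fit used here.

```python
# Toy illustration of the "mapping" idea behind face-reenactment deepfakes:
# track a driving actor's facial landmarks, express each frame as motion
# relative to a neutral pose, and replay that motion on a target face.
# All data here is synthetic.
import numpy as np

rng = np.random.default_rng(0)

# (68, 2) landmark sets: a "neutral" driver face and a "neutral" target face.
driver_neutral = rng.uniform(0, 1, size=(68, 2))
target_neutral = driver_neutral * 1.3 + np.array([0.2, 0.1])  # different geometry

# Least-squares affine transform A, b with  target ~= driver @ A + b
X = np.hstack([driver_neutral, np.ones((68, 1))])
params, *_ = np.linalg.lstsq(X, target_neutral, rcond=None)
A, b = params[:2], params[2]

def retarget(driver_frame):
    """Map one frame of tracked driver landmarks onto the target face."""
    motion = driver_frame - driver_neutral   # per-landmark displacement
    return target_neutral + (motion @ A)     # replay motion in target space

# Fake a short clip: the driver's mouth landmarks (indices 48-67) open.
clip = []
for t in range(5):
    frame = driver_neutral.copy()
    frame[48:68, 1] += 0.01 * t              # mouth drifts open over time
    clip.append(retarget(frame))

print(np.round(clip[-1][48] - target_neutral[48], 3))  # the target mouth moved too
```

The point of the sketch is only the logic of reenactment: track the driving actor's motion as displacements from a neutral pose, then replay those displacements on the target's geometry.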
Part of the problem, the threat, is that people trust videos, and they trust images, even when they know they shouldn't, even when they know that photography has been manipulated since the earliest days of the medium. So there's a paradox we want to look at here when it comes to thinking about photographic evidence: we know these problems, yet we still rely on images and videos to prove that something's real. And increasingly, corporations and governments are relying on artificial intelligence in these ways too. But before we layer in the digital, let's start with the original deepfake, by an artist who was thinking about so many of these things long before anyone else.

Speaker 1 00:02:39 Lynn Hershman Leeson. I'm an artist and a filmmaker, and I work in many media. I use a lot of computer-based technologies as well. I like to think of working at the edge of things, kind of in the blur, where you don't really have disciplines or definitions but porous edges. And I think it's the porous edges that create what an identity is, the things that really aren't obvious.

Speaker 0 00:03:07 I'm constantly in awe, honestly, of everything that you've made and been working with for so long, and it's very inspiring to me. I feel like I'm a real Lynn nerd. I've read all the books, I've watched most of the films, and they just constantly bring up issues.

Speaker 1 00:03:28 That's very kind of you. I feel like I'm inspired by your work as well, and next time I need to know something about myself, I'll know who to ask. But I'm very happy, because for years, for decades, my work was really unknown, and in fact was not even seen until about six years ago. Most of it was never seen, and it just needed Martine's generation and the millennials to be born. So it's like it's been waiting their whole life for you.

Speaker 0 00:04:00 That's what it feels like, honestly, because it really did feel like it was speaking directly to the concerns of my lifetime and my generation.

Speaker 2 00:04:08 <inaudible>

Speaker 1 00:04:15 To me, surveillance is the essential underlying core, kind of the spine, of what one works with when they're dealing with identity. It's often the leftovers, the discarded things, that tell the most. So artifacts and discards, I think, are evidence that is really crucial in understanding who we are and what our true identity is, rather than the one that we project to the outer world.

Speaker 0 00:04:41 That seems like a good place to start talking about Roberta Breitmore. Maybe just explain: who was she? How did she come to be? And how did the project form around this idea of evidence?

Speaker 1 00:04:55 Most of my important work, or I should say my most important work, comes out of the cusp of disaster. I had had an exhibition at a museum, my first museum exhibition, where I used sound and media, and my exhibition, almost in the middle of the night, was evicted. It was thrown out because what I was working with wasn't art. And I thought, well, who needs a museum? And I went out and rented a room in a cheap hotel and set up many of the things that were in the exhibition, which were really the evidence of a woman who could have lived there but didn't. So it was all the artifacts, from her glasses to what she read to the clothing she bought to discarded bills and records of her life. And the room was open twenty-four hours a day; you just got a key and went in and checked it out. And as I was sitting in the room, I was thinking, well, what if you could create somebody, a person who would exist in real time and in real life, but was fictional? And so she emerged from that project. And what I did first was to think of what the archetype of a single blonde woman living in the seventies in San Francisco would be like, and then adapted her life to what I wanted to propose. Her first entrance into the world was 1972, and she was exorcised in 1979.

Speaker 0 00:06:25 Like, literally exorcised, in a crypt in Italy, in an act akin to a sort of ritualistic cremation.

Speaker 1 00:06:33 And this character, Roberta, existed first as a performance, with myself, because nobody else would do it, and eventually there were three multiples that would go out and interact with the world: for instance, taking ads out for roommates, going out to meet people in public, where I had a surveillance photographer and video, and everything was recorded. All of her meetings were recorded. Everything about her and who she was became a way of understanding her. And only after she was exorcised could you go through these hundreds of documents of information that let you re-perform, just by going through the material, the experience of that archetypal person and culture.

Speaker 0 00:07:20 And what did you learn at the end of Roberta's life about that persona, that woman, a typical woman in San Francisco in the seventies?

Speaker 1 00:07:31 Roberta was a mirror for culture, and she reflected the society she lived in. I should say also that, you know, she saw a psychiatrist, she had a driver's license, she had credit cards. And if I had done her six years later, I would have been arrested for fraud. But this was pre-computers, so I was able to get away with it.

Speaker 0 00:07:48 There's a way that this evidence, the video, the photography, creates her reality,
like then she becomes this real person, in a way.

Speaker 1 00:08:00 Again, you know, what is fiction? What is real? I mean, media creates a story and a narrative that often serves another purpose than truth. And I think sometimes you have to use these fictions in order to get to the underlying truth of what's going on.

Yeah, absolutely. To stand in for reality, essentially.

That's what I'm interested in, is that blur. You know, was she real? If she represented what was going on, maybe she was more real than an actual person who didn't keep the records.

Speaker 0 00:08:38 You were talking about how identity is kind of formed in this blur, and I'm thinking about how that's expanded, in a way, with all this data that's created. Though it's not really our identity, there is an intersection there. So I guess I'm thinking

Speaker 3 00:08:56 about the surveillance you used with Roberta to create an identity, and how that relates to your more recent concerns, how it's grown or how it's changed.

Speaker 1 00:09:06 Surveillance is a kind of witnessing of an event happening, in various forms. So it could be an analysis of the discards of somebody's life that usually aren't paid attention to, or it could be photographic or video records of an event actually in progress and in process, as it happens. These are things that are collected and that I analyze afterwards to see what was really there, just like when I shoot a film: I find that the real truth is outside the time that you see cut. Modern surveillance, to me, is more invisible, which makes it even more dangerous and more perverse, because there are things like algorithms that are traps and threats: biometrics, the history of our DNA, the tracking of your choices online, the access people have to your digital identity, how your digital identity is sold, how it could get you in trouble, how your identity could be false and keep you from getting an education,

Speaker 1 00:10:09 or keep you from getting a loan or buying a house, because the information under your digital profile is something that isn't real. And since you don't know it exists, you can't even fight it or argue for yourself. So that's why it's so dangerous, I think, this kind of perversion of control of your identity and who you are by access to your online profile, which people eagerly provide in the simplest way: just by putting in your email, you get all this massive amount of information about who you are, what your choices are, even who your mother's friends are. So these are more sophisticated ways of culling identity in the present, much more sophisticated than existed in Roberta's time. All these things that people, normal people, don't have access to or the ability to analyze become the witness to who we are at the deepest.

Speaker 3 00:11:13 My name is American Artist, and I'm an artist and educator as well. What I like to say about my practice is that I'm exploring Black labor and visibility within networked life. And what that's looked like for me is studying the migration of Black Americans within the United States, the history of race in America, but also looking at things like the history of Silicon Valley as a continuation of forms of settler colonialism within the United States, and thinking of these different practices of captivity, as well as exploitation, and thinking of how those have a legacy in contemporary American life.
What's really compelling to me about the Roberta Breitmore project is that it sort of only exists through evidence, that the only way she could really create this person was through these IDs, these photos, these other elements of evidence. And that's really interesting to me because a lot of the evidence that's created around us, as we circulate, is often out of our control and maybe depicts us in ways that we don't want to identify with. But in that process, that's sort of how identity becomes recognizable: through these forms of evidence.

Speaker 0 00:12:40 In a gesture engaging the evidentiary in similarly active ways, American Artist legally changed their name to American Artist in 2013.

Speaker 3 00:12:50 When I changed my name, what was interesting to me was, when I went to this hearing and they asked me why I wanted to do it, really their only concern was that I wasn't evading some sort of legal repercussions, you know, that I wasn't changing my identity to escape something. And besides that, they really couldn't care less why, or what I was changing my name to. It was interesting to me how my capacity or desire to exit or escape this system might have been read in that one moment. But for me it was about many other things, one of them being how to manifest a reality of what it means to be an artist by simply stating it. You know, if I say I'm an American artist, then suddenly I'm an American artist. But also, in the context of, let's say, digital systems, the way that it's not really recognized as a name has been a continuing aspect of that. You know, my SEO was kind of nonexistent for a while because of that.

Speaker 0 00:13:56 As in search engine optimization: how you or your website is ranked in search engines like Google, based on factors like domain age, website security, how many times you link to others, how many times others link to you.

Speaker 3 00:14:12 I think that that has been valuable to me in being able to have some amount of anonymity within the context of digital space, while at the same time, in real life, it's the opposite. It's an extremely unique signifier, you know; as a name, it's not something that's easy to forget.

Speaker 0 00:14:36 The Cambridge Analytica scandal broke in 2018. This was the revelation that Facebook had handed over all kinds of personal data to a political consulting firm for advertising purposes, all without anyone's consent. As we've all learned more about how this works, how the data we give to the internet is harvested, how the internet itself never forgets, more and more of us are reconsidering how we live our digital lives. We asked our friends, colleagues, and members of the general public how they're navigating all of this. Here's what they had to say.

Speaker 4 00:15:11 So, for the question "Tell us about your online presence. If I searched for you, what might I find?": honestly, very little. And I think that's also by choice.

Speaker 0 00:15:21 My approach to posting online content has changed in the past five to ten years because of what we are finding out about Facebook day to day, how they're collecting information from us, and how it's very difficult to limit the type of information that they collect. My approach to posting online has definitely changed, probably mostly in that I'm not posting things that are as personal.
I used to use Facebook, and I used to just put up a Facebook album anytime I went anywhere. You know, any time I had fun with my friends, I would take a million pictures and I would upload them all. I would say now my approach is a little bit more curatorial. I'm certainly more cautious about posting photographs that might be construed as too intimate, or that could be leveraged against me in some capacity that I might not find out about until later. I think I'm cautious particularly as a woman, but just cautious about what I wear or what I'm holding, if I'm photographed with alcohol. And so I tend to post wholesome content of family and friends and some jokey material, but generally speaking, my life is much less broadcast than it was prior.

Speaker 5 00:16:41 Because all my social media and websites and everything is public, I'm just under the assumption that everyone, or rather anyone, is accessing my images online. And I don't necessarily have a problem with that, especially since I'm in a field where publicizing your content is key to growth.

Speaker 4 00:17:03 It's a track record of who you are as a person, in a way, what you post online. So for that reason, I feel like I'm always lying when I'm posting photos online. I never want to be too happy, because there's already a lot of that. And I can't be too truthful, because maybe the truth will hurt some of my close friends. So that's how I kind of see it. You kind of have to, in my opinion, blur the lines of what you post and what you want to share about yourself.

Speaker 0 00:17:38 Enter the dignity image, a term and concept American Artist coined in 2016. They had a solo exhibition of photographic work built around this concept at the Museum of the African Diaspora in San Francisco in 2019.

Speaker 3 00:17:52 So before I called it the dignity image, I was thinking about these images that we sort of have for ourselves that we never circulate online. And the reason I kind of came to that was because I was doing this performance, an online performance, where I wasn't publishing images on my social media, but rather was posting these same blued-out images and redacted captions. And I did that for about a year. It was a critique of social media and how we're commodified in the realm of social media, but also asking those around me, friends and family, to not rely on this projected image of who I am through social media, but rather to insist on an authentic experience of me, if such a thing exists. What happened during that time was that I was scrolling through my phone often, looking through my camera roll as a stand-in for my social media, in a weird way, sort of like a surrogate experience.

Speaker 3 00:18:58 And I would return to some images more than others, images that were important to me that I wouldn't share online. And I just thought about how the personal images that we save have a significance outside of this system of valuation and attention economy that we're expected to participate in, meaning that in this moment in time, it feels like if an image isn't fit for social media, then it has no value. But obviously that's not true. That's where this term dignity image came from: trying to ascribe some kind of intrinsic value to images that we choose to save for ourselves, that we don't circulate, that we don't submit to this commodified experience. There is an amount of political resistance in choosing not to share an image like that, and that's really what I wanted to focus on. During that process,
I asked some people around me to share their dignity images with me. Oftentimes, you know, those ended up being photographs of or with deceased family members, or photos where people are embarrassed of how they look, or photos that they took to remember something, but not for, you know, a public display, merely just to remember that it happened. A lot of these ways of creating value in images are really outside of what we associate with something like social media.

Speaker 0 00:20:31 Still, social media's influence persists. Here's an excerpt of an interview from a video work American Artist made in 2016 called Prosthetic Knowledge of the Dignity Image: "When I'm taking a picture, Facebook's always at the back of my mind. I can't seem to get it out, even though I might not put any of those pictures online. It's always at the back of my mind because it's something that people around me are doing, and the thought is just inevitable."

Speaker 3 00:21:05 Part of what I wanted people to understand is how their information is handed over in these contexts, and also how we're expected to perform, and how it shifts our ability to see ourselves or perform in a certain kind of way. I think that the way that we see ourselves through images, or maybe you could say evidence, that we're creating with our own cameras and devices is different than how we're categorized or produced as data points through social media or through digital systems. And I do think that it gets a little bit muddied when we're interfacing with digital services. So often you then start to reproduce what is shown to you, and what is shown to you is not purely an image of what you put out there; it's also impacted by what these data systems want you to buy or want you to see.

Speaker 3 00:22:08 So I think, you know, what autonomy might look like now is much more complicated, and it's also much more complicated to resist or refuse to be complicit in systems like this. And that's something that I'm really struggling with, because I feel like it's a conversation that me and my peers have been having for a long time: how to have some resistance toward how technologies and digital systems compromise us, or hold us captive, or, you know, have authority over what we create. But it just feels like it's quickly becoming so thoroughly compromised that it's hard to think of what those sorts of resistance look like. This points toward a much bigger problem as we move from the idea of evidence as proof of self or proof of identity to evidence in the more judicial sense, and the dangerous ways that states and corporations are using photography, surveillance, and artificial intelligence to produce it.

Speaker 3 00:23:09 There's definitely a continuation between photography, surveillance, and artificial intelligence, in that, for one thing, photography has always been used to produce evidence, and thinking about how that evidence then goes to support the presence of an individual or to assert or frame a certain identity, but also thinking of the history of photography in relationship to criminology and surveillance, thinking of things like phrenology or scientific racism, as that's been used in creating materials such as books or media that depict, for example, Black people as, you know, biologically inferior.
And so that's something that has always been reproduced through things like photography and surveillance, and thinking about how artificial intelligence then maps onto that: it's sort of allowed for these systems of surveillance and photography to operate with this sort of fake autonomy, autonomously reproducing a lot of the values that were already inherent within them. And that's something that has been a result of artificial intelligence becoming so broadly used, because it's become much cheaper and easier to produce these types of systems and to rely on them. And we also want to allow these artificial intelligence systems to make decisions that formerly would have been made by people. Not that they wouldn't have made the same mistakes when people were doing it, but now those same biases are sort of hard-coded into the machines that we're interfacing with.

Speaker 1 00:25:00 Here's me and Lynn Hershman Leeson again, speaking about a recent piece of hers called Shadow Stalker. It was commissioned by The Shed in New York for a recent show called Manual Override, which included my work as well. It returns to this idea about surveillance and evidence. Can you just introduce it, especially in this intersection that we're talking about between photography, surveillance, and AI?

Shadow Stalker is an interactive installation and film that uses projections and algorithms to make visible systems of racial profiling that are increasingly being used by police departments worldwide. You know, it's about the shadows that we create on the internet, about information that follows and reflects the histories that we have, like our zip code, and all the consequences that happen from our online life. And this is also evidence that's providing statistics and material that become a foundation for the faulty logic that limits our options in the future. When you enter the room, you see a ten-minute film. The first half is narrated by Tessa Thompson, and it really explains some of the dilemmas of predictive policing.

Speaker 0 00:26:21 Officers are given maps with red squares indicating where crime will occur. The algorithms are based on past crimes. Police patrol the 500-by-500-foot square locations looking for suspects. Most often they're in low-income districts. But do these little red squares that are supposed to predict crime actually reflect our reality?

Speaker 1 00:26:46 Jaguar is the character from the deep web who again gives further warnings about how to protect yourself.

Speaker 0 00:26:54 Don't become addicted to deletions and contaminated history. Insist on being legible. Own your profile. Take hold of your avatar. Honor your shadow; hold it tight. It contains your future and your past, and like DNA, history refuses to evaporate.

Speaker 1 00:27:26 When you put your email into the system as you enter the installation, it does a data-mining search that brings up all the information about the individual. Then the surveillance camera takes your picture, and it creates a shadow of your profile, and inside of that is inserted all the information about you, which moves with you as you walk through the room, with a connected projector, a special projector that my team found that allows this to happen. And the individual's shadow follows them in the room, with all of the information that's growing inside of them.

Speaker 6 00:28:05 It's a way of visualizing the way people are invisibly tracked.

Speaker 1 00:28:10 Absolutely.
Mainly to let them know that this is going on. I think that when people see their own information inside their moving shadow, it has a different resonance than if you just say it's happening.

Speaker 6 00:28:25 Yeah, absolutely. I mean, I've been so, so angry the last few months. I'm laughing about it only because it's insane. And I've really been thinking a lot about how to channel my anger, or what I can do with it rather than just be consumed by it. These technologies, in terms of their methods of control, have been apparent to me, but I guess I've only realized recently how many people aren't aware of how this is used, you know, how they're not aware of how they're being tracked in this way.

Speaker 0 00:29:07 American Artist has studied and made work about predictive policing technologies too.

Speaker 3 00:29:42 So for the exhibition My Blue Window, this was at the Queens Museum, I created this installation where, when you first walk in, you're confronted with this large blue curtain. This navy blue fabric was meant to reproduce the fabric associated with police, and this ideology that's really become prominent within police, of blue life and Blue Lives Matter. And so you enter into the installation and you go around this curtain into this space where you watch this large film projection, and you sit on these stadium bleachers, kind of like bleachers you would have at a sporting event, and you watch this dash-cam footage from a police car.

Speaker 3 00:29:59 In the film, there's this heads-up display that seems to be on the windshield of the police cruiser. It begins by showing this amorphous blob that's kind of undulating, and above that there's this text that says "forecasting." What it's doing, supposedly, is generating predictions around where crime is going to take place. After that happens, a map appears, and it's a map of this area of Brooklyn, Flatbush, Brownsville, and that sort of area, which is heavily policed, and that's where this video takes place. And so in the video, you see this map, you see the hotspots generated on the map, and this is sort of based on existing predictive policing software. It's not an exact replica, but it takes qualities of different softwares and combines them in order to generate this interface. As the police officer drives around, the map sort of reorients, and they're trying to go to these places where supposedly crime is going to take place. In the meantime, as they're driving around, these bright blue boxes that look kind of like facial tracking are blinking onto people that you see in the streets as the car drives by. So you suppose that these people have been identified as potential criminals, and it's really meant to show how an actual police vision might operate. A lot of the people that end up being seen in the video are Black people that live within that area, sort of incidentally, because that's a lot of who lives there, and it's not really a coincidence. Something about predictive policing as a strategy is that it's using geography as a stand-in for race: saying we're determining places where a crime might happen based on a specific zone where things happen, and using that to dispatch police, is not acknowledging the fact that obviously people live in specific places, and these geographies are not incidental to the type of people that are being policed, ultimately reproducing a lot of the biases and prejudices that we already associate with policing as it is today.

Speaker 7 00:32:18 <inaudible>
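The "red squares" Thompson narrates and the forecasting map in My Blue Window point at the same basic mechanic: divide the city into a grid, count past incident reports per cell, and send patrols to the cells with the highest counts. Here is a minimal sketch of that mechanic, with made-up numbers; actual predictive policing products are proprietary and more elaborate, so treat this as an illustration of the logic, not any vendor's algorithm.

```python
# Minimal sketch of grid-based crime "forecasting" as described above:
# divide a city into 500x500-foot cells, count past incidents per cell,
# and flag the top cells for patrol. The incident data is invented.
import numpy as np

rng = np.random.default_rng(1)
GRID = (10, 10)                        # city as a 10x10 grid of cells

# Past incident reports: (row, col) of each recorded incident.
# Note these are *reports*, i.e. places police already looked.
past_reports = rng.integers(0, 10, size=(200, 2))

counts = np.zeros(GRID, dtype=int)
for r, c in past_reports:
    counts[r, c] += 1

# "Red squares": the k cells with the most past reports.
k = 5
hot = np.argsort(counts.ravel())[::-1][:k]
for cell in hot:
    r, c = divmod(cell, GRID[1])
    print(f"patrol cell ({r},{c}): {counts[r, c]} past reports")
```

Note what the flagged cells actually are: the places with the most past reports. That is how geography can become a proxy for whoever was policed before.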
Speaker 3 00:32:23 Thinking about this aesthetic of speculative fiction and sci-fi, it was important for me to think about this story that's almost, you know, cliché in thinking about predictive policing, which is Minority Report. That's also where the name of the application, the phone app that's in the exhibition, comes from. The title of the app is 1956/2054. The first date is when the book was written and published by Philip K. Dick, and the second date is when the story actually takes place. It was interesting to me to think about these two dates in relationship to predictive policing, and that the actual, materialized software is in effect right now, which is somewhere between those two years. It was also compelling for me to think, in the context of this story, about the way that the minority report actually functions. The minority report is this small dissenting opinion within the prediction: there's a larger probability of something happening, but there's always this smaller probability of it not happening, and that's what's considered the minority report. I thought that was important because every prediction is probabilistic; it's never a hundred percent true or accurate. It's always just a guess. And so there's always a minority report embedded in any sort of prediction, and whether or not we choose to see it, or whether we choose to believe that our predictions are a hundred percent accurate, is really determining what our future possibilities are going to look like.

Speaker 8 00:34:01 <inaudible>

Speaker 3 00:34:05 In this book Carceral Capitalism, Jackie Wang talks about this idea of a crisis of legitimacy, one of police feeling like their relevance is being taken away, like they're losing ground, and that happening in the face of widespread protests around police brutality and calls that the police are racist. So what she proposes is that part of the significance of the reliance on these data-based systems, these predictive policing systems that use data science, is that they are meant to be objective. They're meant to allow the police to say, we're not racist, because we're using scientific tools, we're using data technology. And the fact of the matter is that that is not objective in any way. In fact, because of how it's developed, because of how information is fed into it, it really just reproduces, and kind of creates a feedback loop of, a lot of the biases and racial biases that are already expected from the police. So this sort of algorithmic vision is really a continuation of that biological vision, or of the body-worn camera, as it's been called. I just want to make that continuation really clear, you know, that we're not really entering into a new era of anything, but really just the most recent and maybe the most nuanced version of something that has always been present and operated in different ways.
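Wang's feedback-loop argument, and the "minority report" buried in every forecast, can both be made concrete in a small simulation. In the toy model below (all numbers invented), every cell has exactly the same true incident rate, but incidents are only recorded where patrols are sent, and patrols are sent where recorded incidents are highest. The dispatch rule also ignores the minority report, the probability that nothing happens in a flagged cell.

```python
# Toy feedback-loop simulation in the spirit of Wang's argument:
# identical true crime rates everywhere, but reports only come from
# patrolled cells, and patrols go where past reports are highest.
import numpy as np

rng = np.random.default_rng(2)
n_cells, rounds, patrols = 10, 50, 3
true_rate = 0.3                          # same in every cell, by construction
reports = np.ones(n_cells)               # tiny uniform prior on past reports

for _ in range(rounds):
    # Dispatch: patrol the cells with the most recorded reports.
    patrolled = np.argsort(reports)[::-1][:patrols]
    # Incidents occur everywhere at the same rate, but are only
    # *recorded* where an officer is present to observe them.
    incidents = rng.random(n_cells) < true_rate
    reports[patrolled] += incidents[patrolled]

pred = reports / reports.sum()           # "risk" the system would forecast
for i in np.argsort(pred)[::-1][:3]:
    # The "minority report": every forecast still carries the chance
    # that nothing happens in that cell on a given day.
    print(f"cell {i}: forecast share {pred[i]:.2f}, "
          f"minority report {1 - true_rate:.2f}")
```

After fifty rounds, the forecast concentrates on the handful of cells that happened to be patrolled first, even though, by construction, no cell was ever riskier than another.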
Speaker 6 00:35:38 Have your feelings towards AI changed over time? I know mine certainly have, and you've been working with it for over a decade longer than I have.

Speaker 1 00:35:56 I don't think it's artificial. It doesn't interest me in the same way that it did when we were inventing it; it's turned into something else. I think, for me, my interpretation of it is that it's being used in a negative way, of controlling a culture, rather than expanding on creative potential. So, I mean, there are different ways of using it, for patterning, I guess, but I'm not all that interested in exploring what it is, because it usually just points out some of the disastrous problems and also reflects the biases of programmers, and doesn't really help us to create a world of preserving the planet or individuals, or expanding their spans of potential. I think that all of this is going to change, because I think we're going through a period of correction right now, where everything has erupted and we have a chance to revise all of our thinking. And maybe AI will go along with that, where we could use the potential of it in a way that is more enriched.

Speaker 6 00:37:08 And what is the potential that you see? What are those possibilities? Obviously it's gone in one direction, mostly owned by corporate interests and the state, or influenced by them. But what are the other things that brought you to it, that you were interested in exploring, and that you think maybe we'll begin to explore more of?

Speaker 1 00:37:34 I'm thinking about maybe a global connectivity, ways to solve together the problems that we've created, that your generation has inherited. I think that we have to think about the symbiosis of life rather than a separation of the wealthy from the poor and the dominance of certain corporations and individuals over others. But as far as agency, I think we have to understand that we have it in the first place. Roberta and a lot of my work were born out of the difficulties of an environment, and created a way to survive beyond that, or at least to see beyond the immediate difficulties of what culture and society were doing to us. But now we live in a global culture, and I think we do have to invent identities and agencies that will allow us to kind of sidestep the inheritance we've been placed into, and people younger can create their own world, one that's going to be based on consciousness and connectivity and generosity, and create a place of survival that goes beyond the rational into one of creative spirit, that allows us to be human in the way I think that humanity has as an ideal.

Speaker 6 00:38:54 I think bearing witness is a really great way of describing your practice. I'm just thinking now, I loved how you said you're in the blur. I think that's a beautiful way of describing what you're doing.

Speaker 1 00:39:05 When you're young and have a deep trauma, in order to survive you remove yourself from the trauma as it's happening and look down at it as if it's someone else's. And I think that I adapted those early trauma survival skills to looking at the trauma of culture, and maybe even predicting the culture because of the patterns I could see in it. In a way, it turned that early trauma into an advantage, in order to survive, and also to survive a shifting world. I think it's kind of a test of where we are. What we need, really, as a culture is to understand when surveillance is taking place, what the evidence is, how it could be used against us, and to turn it into something that's more liberated, that can be used to defy the technology that exists by creating another one that's better.
I think to be an artist is to be an optimist, because you have to believe that you could, not necessarily change things, but that you have a vision that can inform the world of things that you see that can't be seen in any other way. So I think that's survival: being optimistic, no matter what field you're in.

Speaker 6 00:40:23 Mirror with a Memory is a production of the Hillman Photography Initiative at Carnegie Museum of Art, Pittsburgh. For more information on the ideas and individuals featured in this episode, please visit cmoa.org/podcast.

Other Episodes

Episode 0

January 22, 2021 00:01:25

Podcast Trailer

The Mirror with a Memory podcast focuses on different facets of the conversation around artificial intelligence and photography—from biometrics and racial bias to the...

Episode 4

February 22, 2021 00:39:17

Episode Four: Storytelling

In Episode Four, we talk about the algorithmic potential of storytelling. Artists Stephanie Dinkins and Stan Douglas discuss how they use the language of...

Episode 5

March 01, 2021 00:42:07

Episode Five: Land

What is the environmental impact of AI on our planet, and what colonial impulses does this technology enable? Episode Five zooms out and up...
