Episode Transcript
Speaker 1 00:00:03 Hi, I'm going to assume you have a smartphone. I'm also going to assume that you're listening to this podcast on your smartphone. And if it's a newer phone, I'm going to assume that you unlocked it using your face, waiting for it to recognize you. What do you make of that technology? Do you think it's useful or scary or a little bit of both? Do you know how it works? You look at the camera and the camera looks back at you. The camera, the phone, the company that made the phone, the people who made the company, they all claim to see you. They see you through a tiny lens and beams of infrared light, and thousands of little dots that chart the shadows and contours of your face. Their ability to see you is how you access your stuff, how you connect with the world. But what happens when they don't see you or can't see you, what happens when they get it wrong? The people who make the phones will say your image is safe, but what if it isn't? And what about all the other cameras out there? What do they see and what do they capture? And where do those captured images go? Digital images are faster, easier. Our cameras are smaller and sometimes hidden.
Speaker 0 00:01:13 Our images are fed into datasets that power all sorts of things that we'll discuss. But before we get started, it's important to remember that when it comes to the photographic image itself, none of this is new. As always, we have to consider what lies beyond the frame. Welcome to Mirror with a Memory. Mirror with a Memory is a podcast from the Carnegie Museum of Art's Hillman Photography Initiative, exploring the intersection of photography, surveillance, and artificial intelligence. In each episode, we'll look at a different facet of this intersection and talk about it with artists and other thinkers and experts exploring this terrain, from biometric surveillance to ideas around visibility, opacity, borders and migration, the environment, and the state. We'll talk about deepfakes, real fakes, algorithmic violence, algorithmic justice, the very nature of truth, and much, much more. We're also making a case for artists as important contributors to these dialogues.
Speaker 0 00:02:18 Aren't just using these technologies in their work. They're deepening our understanding of how these technologies function and malfunction what they reveal and what they obscure, where we might need more vigilance and where there might be a bit of hope. I'm Martine Syms, your host, as I've been explaining it to people. The podcast is about this intersection of photography, surveillance and artificial intelligence, but it's also about investigating how algorithms have changed our concept of the image and how they've changed, what an image is and how artificial intelligence and machine learning are changing our understanding of photography, but also how we take photos as well.
Speaker 0 00:03:11 I'm an artist, and my work deals with popular mythology. I use a combination of broken samples, strange loops, and, most recently, artificial intelligence to think about how various influences, cultural, economic, sociological, psychological, bear down on a person. In terms of the intersection of photography, surveillance, and artificial intelligence: I started making moving-image work specifically with the internet in mind as its distribution point. And this was before YouTube or Instagram or social media, sort of Web 2.0 as it was called 15 years ago. And I was really interested in that as a kind of cinematic space. And I guess I've continued to think a lot about how computers see, but more how they've changed the way we see.
Speaker 0 00:04:13 An image is constructed. And it's always constructed, even when it's a kind of documentary image. And I think how you parse what's in the image has a lot of parallels to the way that artificial intelligence works. And in terms of authenticity, when I look at pictures, I don't think, that one's real, that one's not real, because from the earliest experiments with the medium there were always interventions. In fact, in the earliest experiments it was so much more complicated to produce the image, so most of the time it was constructed. And I think that representation always has problems, and a big part of my research is that it in some ways failed, or fails. I think that's when it's most interesting to me. But it's also very seductive, and that's partially why I use it: the mirror quality. That's what people are drawn to. They see themselves, they see a reflection of what they're doing, and there's a lot of space for manipulation and joy within that.
Speaker 0 00:05:35 For this episode, I talked to my friend Zach Blas about biometrics, the technology that powers things like facial recognition and other kinds of tracking and security systems, something once only seen in sci-fi movies that many of us now interact with every day. In his artwork and academic research, Zach has explored biometric technology and its implications for years. But before we get started, I want to define some of the terms we're going to be using, like artificial intelligence and machine learning and algorithms and datasets, all of which are related. Most simply, artificial intelligence is a computer program or machine that has been trained to do something that previously required human cognition, like playing a game or recognizing someone or making decisions or assessing information. An algorithm is the formula or recipe that makes the AI work, and machine learning is the process of training the AI to use that algorithm correctly. Engineers typically do this using a dataset: a collection of images or sounds or information, anything that you feed into the machine for it to learn from. It's sort of like studying flashcards when you're learning a new language, which is a simile that starts to get at some of the inherent problems in all of this. If you're only looking at a finite set of flashcards to learn that language, you may start to wonder: who chose those flashcards, and what might they omit? AI can't really ponder these philosophical questions of who and what gets left out of machine learning, at least not yet. But we can.
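Since we've just defined dataset, algorithm, and machine learning, here is a tiny sketch of that flashcards simile in code. Everything in it is invented for illustration: the "dataset" is four labeled cards, the "algorithm" is nearest-neighbor matching, and "training" is nothing more than memorizing the cards.

```python
# Toy "flashcard" learner: a dataset of (features, label) cards and a
# nearest-neighbor rule standing in for the trained AI. Illustrative only.

def distance(a, b):
    """Squared Euclidean distance between two feature vectors."""
    return sum((x - y) ** 2 for x, y in zip(a, b))

# The dataset: the flashcards the machine studies.
flashcards = [
    ((0.9, 0.1), "cat"),
    ((0.8, 0.2), "cat"),
    ((0.1, 0.9), "dog"),
    ((0.2, 0.8), "dog"),
]

def classify(features):
    """The 'AI': label a new example by its closest flashcard."""
    _, label = min(flashcards, key=lambda card: distance(card[0], features))
    return label

print(classify((0.85, 0.15)))  # -> "cat"
# The catch the episode raises: an input unlike every card is still
# forced into one of the labels someone chose to put on the cards.
print(classify((0.5, 0.5)))
```

Okay, here's Zach.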
Speaker 3 00:07:20 Hi, my name is Zach Blas, and I'm an artist, writer, and filmmaker living in London. My work broadly explores technology and power; I've had a long interest in this, for about 15 years at this point, and I usually approach it from queer and feminist perspectives. Over the years, I've taken a particular interest in biometrics, specifically facial recognition and, more recently, artificial intelligence. On the one hand, I'm interested in the questions and problematics around surveillance and all that that poses. But I also have another interest around AI and biometrics, which is more about how it activates certain kinds of desires within us and how that impacts how we might see the future unfolding. I'm curious, Zach, if you could talk to us about what biometric technology is, but also, how did you get interested in it? I actually got interested in biometrics through drag.
Speaker 3 00:08:29 When I lived in Los Angeles in my twenties, I went to a lot of drag shows, a lot. I participated in a lot of queer nightlife, and I think one of the elements of queer nightlife that really jumped out at me artistically, that I became very fixated on, was the face, and how the face was something that was always kind of made over, performed, constructed, shifted, stretched, kind of everything you could imagine. And I found that incredibly fascinating and captivating, you know, on an aesthetic level, on a personal level, emotional, but also politically and intellectually. It kind of made all my senses tingle. And because of my interest in emerging technologies and how that plays out socially and politically, maybe around 2010 I started stumbling upon more and more journalistic writing about biometrics and facial recognition, and that just immediately collided for me.
Speaker 3 00:09:31 And I found that such an interesting technology that was beginning to develop, and obviously had been developing quite robustly since 9/11, but the premise of the technology was very anti-queer, in that it couldn't hold the performative aesthetic ambiguities and excesses that I think marked the queer nightlife I was part of in my twenties. Now I can backtrack and say, okay, what is a biometric technology? Well, a biometric technology, very basically, is some kind of digital technology that analyzes a part or aspect of a body in order to identify and verify the subject. So this could be analyzing one's iris. It could be analyzing hand geometry. It could be analyzing the face. Early biometrics very much stayed at the surface of the body. So you would have a digital technology that would calculate something from the surface of the body, whether that's the face, the hands, the eyes.
Speaker 3 00:10:39 And then, based on that calculation, the technology would supposedly be able to reveal a core, unique, individualizing identifier about a person. As biometric technologies have begun to develop, they don't only exist at the surface of the body anymore. So, for instance, you have behavioral biometric technologies, which would be something like gait detection, how someone walks. There are also soft biometrics that have continued to develop over time, which would be technologies that attempt to look at the body and determine gender, age, race, et cetera. And now we even have biometric technologies that go inside the body, in a sense. So, for instance, there are biometric technologies that look at DNA. But again, if you had to extract a very simple definition, I think it would be: a digital technology that uses computational, algorithmic calculation to identify and verify.
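To make "identify and verify" concrete, here is a minimal, hedged sketch of the verification half, assuming a face has already been reduced to a few numeric measurements. The numbers and the threshold are invented; real systems use far richer features, but the shape of the decision is the same: distance to a stored template, compared against a tolerance.

```python
# Toy biometric verification: is this live reading "close enough"
# to the enrolled template? All values here are made up.
import math

enrolled_template = [62.0, 41.5, 33.2]  # hypothetical facial measurements
THRESHOLD = 2.5                          # tolerated variation (a design choice)

def verify(live_reading, template=enrolled_template):
    """Accept the claimed identity if the reading sits within THRESHOLD."""
    return math.dist(live_reading, template) <= THRESHOLD

print(verify([62.3, 41.1, 33.5]))  # True: recognized
print(verify([70.0, 38.0, 30.0]))  # False: falls out of recognition
```

Notice that the threshold is where politics hides in the math: set it tight and legitimate users get rejected; set it loose and impostors get through.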
Speaker 2 00:11:51 Female age 33,
Speaker 4 00:11:55 Race, Caucasian, and I think background in Russian. Okay. Would you remove your shoes please? And stand against the wall?
Speaker 0 00:12:06 Martha Rosler's Vital Statistics of a Citizen, Simply Obtained, a video work from 1977, opens with a woman, played by Rosler, as she is questioned, arranged, and measured by a male examiner and his male assistant, both in white lab coats. Rosler narrates:
Speaker 4 00:12:26 Her body grows accustomed to certain prescribed poses, certain characteristic gestures, certain constraints and pressures of clothing. Her mind learns to think of her body as something different than herself. It learns to think, perhaps without awareness, of her body as having parts. These parts have to be judged. The self has already learned to attach value to itself, to see itself as a whole entity with an external vision. She sees herself from outside with the anxious eyes of the judge, who has within her the critical standards of the ones who judge. I need to remind you about scrutiny, about the scientific study of human beings, visions of the self, about the excruciating look at the self from outside, as though it were a thing divorced from the inner self.
Speaker 0 00:13:18 Three more assistants join, women this time, standing passively as the examination continues and the subject takes off her clothes.
Speaker 4 00:13:27 How one learns to manufacture oneself as a product, how one learns to see oneself as a being in a state of culture, as opposed to being in a state of nature, how to measure oneself by the degree of artifice: the remanufacture of the look of the external self to simulate an idealized version of the natural. How anxiety is built into these works, how ambiguity, ambivalence, uncertainty are meant to accompany every attempt to see ourselves, to see herself as others see her. This is a work about how to think about yourself. It is work about how she's forced to think about herself, how she learns to scrutinize herself, to see herself as a map, a terrain, a product constantly recreating itself inch by inch, remanufactured, programmed, reprogrammed, controlled; the survival mechanism in which one learns to utilize every possible method of feedback to reassert control.
Speaker 0 00:14:34 The examination continues.
Speaker 4 00:14:36 Hand length, nine and three-eighths. Head <inaudible>. Height, 67 and a quarter.
Speaker 0 00:14:44 The woman gets dressed again. She doesn't put on the shirt and slacks she arrived in; she's given a black cocktail dress, then a wedding gown and a veil. She's experienced a kind of reprogramming over the course of the examination: her biometrics evaluated, her examiners redress her to conform to what they believe those biometrics say about who she is and what she wants. Later in the piece, we see a sequence of archival images used to measure women and children for clothing patterns. The images depict white bodies only; they read like a catalog of parts. Speaking about Vital Statistics years later, Rosler said that neither photography nor science nor data gathering were the villains of the piece; rather, it was a critique of the social practices that determine the formation of the categories woman and other. Which brings us to the origins of this idea that a person's outer shell can be used to determine so much about them.
Speaker 0 00:15:47 This is at the heart of biometrics, whether the imaging is happening through our iPhones or in a 19th-century photography studio like that of the French police officer and photographer Alphonse Bertillon. Bertillon created a system of photography that led others to establish pseudosciences like phrenology and physiognomy, which assigned facial features and measurements to one's likelihood to commit a crime. It's a dangerous idea that we can trace back to the transatlantic slave trade and forward into the Nazi regime, apartheid in South Africa, and the predictive policing technologies used all over the world today. But like Rosler says, we have to ask ourselves if it's photography or data gathering that's the villain here, or if it's the corporate and socio-political practices and conditions in which they're made and used. Okay, back to me and Zach.
Speaker 3 00:16:46 When I began doing this research, which again was around 2010, and I completed my PhD in 2014, AI had not really integrated with biometrics. So when I was doing all this research, facial recognition actually was quite different. With earlier biometric technologies, you would have preset characteristics or measurements that you're looking for. So it's, in a sense, having an algorithm that is locked down, you know, not dynamic, that is used to analyze people, as opposed to having some kind of algorithm that is constantly training on datasets, constantly refining based on whatever kind of faces it's being fed. And then, of course, how effectively the algorithm operates is based on the training set.
Speaker 0 00:17:40 Yeah. That's the part, I guess, when I think about what biometrics often miss: it's in the training a lot of the time. Yeah, it's true.
Speaker 3 00:17:50 I mean, it's interesting artistically too, because one of the problems, usually, is when you train an algorithm and it gets calibrated towards whiteness, then it's going to have a difficult time successfully identifying or verifying anything that skews from that.
Speaker 5 00:18:07 It's really that the dataset is, like, everything, what you train your machine learning off of, you know. And I've been thinking about it, in some ways, as a pedagogical thing: how am I going to teach my AI baby what the world is, what it needs to know about? And what you're just describing, which, yes, is this kind of documented phenomenon of failure to recognize non-whiteness, basically, is also just bringing me to this question of visibility and invisibility.
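What Zach and Martine are describing can be demonstrated with a toy experiment: the same learning rule, fed a training set skewed toward one group, makes more mistakes on the group it rarely saw. All of the "faces," groups, and numbers below are synthetic, invented purely to show the mechanism.

```python
# Toy demonstration of training-set skew: 95 examples of group A,
# 5 of group B, then per-group accuracy of a nearest-neighbor rule.
import random

random.seed(0)

def make_face(group):
    """Fake 4-number 'face features'; the groups cluster in different places."""
    center = 0.3 if group == "A" else 0.7
    return [random.gauss(center, 0.25) for _ in range(4)]

train = [(make_face("A"), "A") for _ in range(95)] + \
        [(make_face("B"), "B") for _ in range(5)]

def predict(x):
    """Label x by its nearest training example."""
    return min(train, key=lambda t: sum((a - b) ** 2 for a, b in zip(t[0], x)))[1]

for group in ("A", "B"):
    tests = [make_face(group) for _ in range(300)]
    accuracy = sum(predict(t) == group for t in tests) / len(tests)
    print(f"group {group}: accuracy {accuracy:.2f}")
# Typically the well-represented group A scores noticeably higher:
# the model is "calibrated toward" whatever filled its training set.
```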
Speaker 3 00:18:41 There is a lot to be said about that. I mean, I really like this essay from the early nineties by this quite eccentric information scientist who used to teach at UCLA; his name is Phil Agre. And he wrote this essay called "Surveillance and Capture," in which he presents a polemic and says surveillance is actually an outdated model from the 19th century, and what we're living in today could be more aptly termed capture. And for Agre, capture is primarily informatic, not visual. I think that's an important distinction: computers are not first and foremost visual machines; computers are informatic machines, and that information has to be mediated in some way to generate an image. And so there's this idea of attending to informatics, and also this idea of a grammar, which I really like, this word: that capture, in a sense, is about grammars.
Speaker 3 00:19:36 So we can think about that with a facial recognition algorithm: what is the informatic grammar that a particular algorithm is taught, or, you know, speaks? So then it becomes this question of what potentially falls outside of the grammar and can't be captured. So, right, if you have a grammar that's based on, let's say, white faces, then whatever is not perceived or interpreted as a white face is not captured by that algorithm. So I think what I'm trying to get at is that the word capture evokes this dynamic of: is one recognized, is one not recognized? And it has to do with, basically, how the algorithm is constructed and written. Now, I think this idea of capture is a really interesting framework for people that are interested in a variety of minoritarian struggles, whether those are anti-racist or queer feminist. Minorities, broadly speaking, are the ones that often fall out of recognition from these biometric devices.
Speaker 3 00:20:47 So, right, it could be not being able to successfully recognize a transgender person. There are examples of working-class people damaging their fingerprints and therefore not being able to authenticate. I mean, it can go on and on and on. I guess one of the political questions is: what does it mean to be recognized by a biometric technology, and what does it mean to fall out of recognition from a biometric technology? Being recognized by a biometric technology provides a lot of, you could say, benefits: being a citizen of a state, being able to travel internationally. And not being able to be biometrically recognized can present a numerous set of very complicated, very real political problems, whether that is happening at a border when someone is not able to be authenticated, or elsewhere. So I guess what I'm trying to emphasize is that I always want to be very careful about not having a kind of romance, a kind of superficial romance, of falling out of recognition from the machine. It's really complicated, and there can be a lot at stake. But this is why I think of it as a paradox: you could be a minoritarian person, and you are falling out of recognition from the machine, and that creates a set of political problems for you. But at the same time, you can have a more, we could say, radical or utopic desire to still just get outside, to get out from this paradigm of recognition.
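Agre's "grammar of capture" is a theoretical claim, but one way to make it concrete in code is to model the grammar as a parsing rule: whatever matches is recorded; whatever doesn't match never enters the record at all. The rule below is a deliberately crude invention, chosen to echo Zach's example of a grammar built around white faces.

```python
# Sketch of informatic "capture": only events that parse under the
# system's grammar are recorded. The grammar here is an invented toy.
import re

GRAMMAR = re.compile(r"face\((?P<skin>light), (?P<eyes>open|closed)\)")

observations = [
    "face(light, open)",
    "face(dark, open)",    # real, but outside the grammar
    "face(light, closed)",
]

# Whatever fails to match is never rejected, it is simply unrecorded.
captured = [m.groupdict() for e in observations if (m := GRAMMAR.match(e))]

print(captured)
# [{'skin': 'light', 'eyes': 'open'}, {'skin': 'light', 'eyes': 'closed'}]
# "face(dark, open)" happened, but the system has no sentence for it,
# so it was never captured.
```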
Speaker 5 00:22:22 Yeah. There are a lot of reasons why I love the way you're making a distinction between using capture, first of all, as vocabulary, and opacity. Opacity is something I've been thinking about a lot right now. And I think it gets at what you're talking about with recognition: there are all these problems that can be created when you're not recognized, but there are also a lot of benefits. This is a very analog misrecognition, but an earlier work that I made, called Notes on Gesture, after an essay by Giorgio Agamben, was basically saying that some of our notions or knowledge of embodiment moved towards the cinema, and that was part of modernity and part of political consciousness: we start to learn how to move in a distributed way through film. And I was thinking about it in relationship to Vine and viral videos and memes, GIFs especially, and the way that, in these kinds of short, looping films at the time, there were a lot of Black women being used as reactions. Anyway, so I made this piece, and the person performing in it is a friend of mine, an artist named Diamond Stingily.
Speaker 2 00:23:46 Needs, check yourself, check yourself, check yourself.
Speaker 5 00:23:50 The show opened where this piece premiered, and people just assumed that it was me; in the reviews, it said it was a self-portrait. I don't know, there was just, like, a mistaken identity. And that wasn't an intentional part of the work, but it sort of spoke to what I was thinking about.
Speaker 5 00:24:12 And for me, personally, around that time, maybe in the year prior, like 2014, I had gotten off all social media, so there wasn't another way people could verify who I was. So everybody just thought I was Diamond, or thought Diamond was me. And she says people still come up to her thinking that she's me. And I enjoy this kind of invisibility it allowed, or, like, opacity, because I think there's a trap of visibility and of recognition, of being recognized, and there are a lot of benefits and pleasure I get from those slippages. It just cracked me up; it still does. A lot of people just don't know what I look like, or they don't know what I sound like, because I'm often using performers, which to me is just part of a film practice I'm interested in. But because of the way my work is distributed, it works in another way: some of the assumptions people bring to it get, I don't know, displayed, shown.
Speaker 5 00:25:24 Yeah, exactly. That's a really beautiful example. I think with opacity there's a generosity and a capriciousness to that term that really beautifully does the work, because opacity doesn't mean one is hiding; it doesn't necessarily mean one is literally invisible. It's more that one has somehow stepped out of the dyad of: here's how we make someone intelligible, and here's how they completely disappear out of that. It's a very non-binary term, which I find really attractive. And I guess we can also just cite the source, since we're talking about it. Opacity as a philosophical concept was developed by the philosopher and poet Édouard Glissant, who wrote from Martinique in the Caribbean. And one of the things that I love that Glissant says is that opacity is the step beyond difference. And the way that I interpret that is: difference is squarely within the terrain of identity politics, because you can only be different if you can identify what you're different from.
Speaker 2 00:26:42 Okay. My name is <inaudible>
Speaker 6 00:26:45 And I'm a professor of cinema and comparative literature at New York university in New York. I am originally from Mali and my work in the last 10 years has focused generally broadly on kind of black, radical thought African diaspora studies individuals like, uh, Edward song, Angela Davis, uh, and they rereading of classic texts of the Negritude movement.
Speaker 1 00:27:21 Édouard Glissant was a French poet and philosopher born in Martinique in 1928. He died in 2011 and is best known for his work on what he called the poetics of relation: how we can and must connect to one another and to ourselves in a diasporic and post-colonial world. I'll let Manthia Diawara, a self-professed Glissantian, tell you more.
Speaker 6 00:27:45 Glissant is hugely relevant to this conversation of representation, of reality, representation of transparency, legitimacy, which image counts and which image doesn't count. And what's important about Glissant is that he is second generation to the Négritude movement. And Négritude is like the Harlem Renaissance, in a way: they had decided to stop talking about Black people as a social problem, but to talk about Black people through their culture, through their creativity. So, Black is beautiful, if you want, and then Africa becomes the center of the Négritude movement. If you want to be authentically Black, you have to come to Africa, you have to find Africa. You know, those are return narratives, and all the things that go along with that, all the way to Afrocentricity. Glissant's argument is: we are Caribbean; we are not just African, we are also Indian and white, and this mixture of cultures makes us richer, more complex, more beautiful. But yet here we are, hating ourselves because we don't perceive ourselves as African enough.
Speaker 6 00:29:03 So this was a critique of the Négritude movement, and of course that was huge, and a lot of debate came around that. So let me go faster: Glissant started this movement called creolization. Creolization is really the definition of what I just said: we are creole societies; identities come from the confluence of several cultures, from Europe, from Africa, and from America. And so identity is unpredictable. It's not biological, because culturally we keep changing, because of the confluence, the contact zone between all these different people from these different origins. So creolization was started by Glissant through that concept. After that, he came up with a theory that he calls the poetics of relation. Glissant said, with technology, whatever you say about modernity, good things or bad things, the world has become connected. The whole world has become connected. And because the whole world is connected, we have to learn beyond the continental way of knowing the world, beyond the filiation that says: I'm the son of the Sonni king, we are his descendants. He said, no, you are related also to the person you meet in America; you are related also to the person you meet in Japan, and so on. So this new reality of the world that we're in, where the past, the present, and the future all come together and live side by side, and ten years later they're destroying each other: Glissant calls it the new world, the new reality.
Speaker 1 00:30:47 Yeah, it's an exciting idea that none of us are fixed, that all of us are constantly becoming, and it runs exactly counter to how biometrics are supposed to work
Speaker 6 00:31:00 Metrics. They compile datas. And of course, you talk to somebody who's a decent audience. Definition may be prejudice against biometrics, but they're compile data. So they go from very simplified notions of life. And then they try to reach complexity with that. And most of the time, as they try to reach complexity with that, and here comes my bias, the human is actually removed out of it. And we're all reduced to types. No complexity actually, whereas design begins with complexity and multiplicity. He says, yeah, I don't know everything about myself, let alone knowing everything about you. Whereas that's what the English and the French used to do. I notice, I notice, I notice, therefore, I understand you, therefore, I own you. So the point is that every relationship begins, you know, passage in chaos that we attempt to create some order around. So when we meet, what is very important to Lisa is that we'll not know the result of what is going to happen from our meeting, because if we did that, then it's like biometrics. Then there is no point in meeting do some basically is interested in this moment of encounter, which is a moment of cares, which is a moment of doubt of shaking of trembling now so much what result out of it, but how that can begin to cement a relation between two people, two cultures and the whole world. So <inaudible>
Speaker 7 00:32:42 <inaudible> Here's Zach again.
Speaker 3 00:32:49 The first work I made around biometric facial recognition is called Facial Weaponization Suite, and this work was made between 2011 and 2014. It's a series of masks, a video, some photographs, but the core of this work was actually leading a series of public workshops in different places in the U.S. and Mexico. They were like consciousness-raising workshops, because, again, at that point in time people really didn't know much about facial recognition. And so we would come together as a group, read material together, and we would also try to study biometrics in the local context. So at some point in these workshops, I would end up scanning everyone's face, and then I would aggregate that data in 3D modeling software, and then I would work with a group of fabricators in Los Angeles to produce these masks.
Speaker 3 00:33:45 Now, the takeaway, you know, is that these masks ended up looking highly abstract, because each is, in a sense, a collective face. So the masks would, of course, not let you be biometrically recognized as a human face. And so there is this kind of practical takeaway, and I think the work really got popularly spun in that way. But for me as an artist, I always want the work to have some kind of conceptual excess that can't be so easily instrumentalized, because I feel like if it could just be instrumentalized, then I should just go do activist work; I want the art to have something else. And for me, it was about the masks making this demand of opacity with their presence. After that work, I wanted to make a work that addressed more squarely the violence, the kind of structural violence, that biometric facial recognition enacts and participates in.
Speaker 3 00:34:46 So I made this work called Face Cages. And this idea of the face cage came from a feminist communication scholar named Shoshana Magnet, who once described biometrics as a cage of information. I thought that was really interesting, because at once it links biometrics to the prison-industrial complex, but it also makes you think about the caging of materiality, of embodiment: a biometric reading can be a frozen, static reading, but a body can continue to change and evolve and develop over time. I worked with three other queer artists, Paul Mpagi Sepuya, Elle Mehrmand, and micha cárdenas; we were all in California at the time. And we scanned our faces with a biometric machine and took the grid of landmark plotting points from the reading, which kind of creates this mesh over the face. Those were turned into these metal objects. And what was really interesting was that when these were put on our faces, which should have fit perfectly, because that's the whole promise of a biometric reading, they were actually quite painful and difficult to wear. They didn't fit quite right. So the work was really about performing this kind of incongruence between digital abstraction and actual embodiment.
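Both works begin from the same primitive Zach mentions: a scanner reduces each face to a grid of landmark points. Below is a hedged sketch of the Facial Weaponization Suite-style aggregation step, under the assumption that every participant's scan yields the same number of 3D landmarks; the shapes and values are invented stand-ins. Averaging point by point produces a "collective face" that belongs to no one, which is part of why the resulting masks read as abstract and defeat recognition.

```python
# Sketch: average several face scans into one "collective face".
# Hypothetical shapes: 12 participants x 68 landmarks x (x, y, z).
import numpy as np

rng = np.random.default_rng(1)
scans = rng.normal(loc=0.0, scale=1.0, size=(12, 68, 3))  # stand-in scan data

collective_face = scans.mean(axis=0)  # point-wise average over participants

print(collective_face.shape)  # (68, 3): one averaged landmark mesh
# Individual variation cancels in the mean, so the collective face is
# flatter and smoother than any single face, i.e. visually abstract:
print(np.abs(scans).mean(), np.abs(collective_face).mean())
```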
Speaker 3 00:36:13 To put it quite simply, biometrics are the world's number one border security technology. So biometric technologies are becoming the default to police and secure borders. And, you know, I think another important reason to think about biometrics from the framework of capture, thinking about the robustness of what they represent, is how more and more countries around the world, it seems like every year, make biometric identification a requirement for having a national identity document. And just to give you an example: when I moved to the United Kingdom in 2015, the visas had just changed to a different scheme, and the visas here are called biometric residence permits. And I had to go to a building in Manhattan where the Department of Homeland Security is based, because the United Kingdom basically outsources all of its biometric aggregation needs to a third-party private company.
Speaker 3 00:37:19 Anyway, I had to give all of these biometrics before I could even enter the United Kingdom. And then, my first year here, the very first time I left the UK, I went to Italy with my partner. And when we were coming back, the border agent told my partner: your biometrics are missing. It was just such an absurd moment. And they actually detained us. And, you know, I'm just thinking: okay, we're a white American couple, I'm living in the UK being sponsored by a university as a professor. I mean, that is a lot of privilege accrued, and yet still there was this moment of biometrics not aligning, this moment of: we don't believe anything you say. That's all to say that the authority, the supposed truth-telling, that biometrics does has gained so much momentum around the world.
Speaker 5 00:38:15 Yeah. Well, it also just reminds me of what I try to tell people when, let's say, a piece of mine is glitching: no technology works perfectly. I'm always like, does your iPhone ever freeze? It's the same thing. Which is crazy, just listening to that story: even if the biometric data was there, there are always just going to be errors.
Speaker 3 00:38:40 I think that's a really important point, Martine, because just always drawing people's attention to the fact that these technologies break and don't work perfectly is something that always needs to be done. So many people encounter these technologies first in sci-fi movies or television, and there is a kind of fantasy about them working seamlessly. And of course, when you experiment with them, or you experience them, this can be very far from the case. So it's good to always remind oneself that these technologies aren't perfect; they have their limitations; they break. And actually, the good in that is that these technologies are up for grabs, in a sense: their futures aren't already written, including how these technologies have to be used.
Speaker 2 00:39:33 <inaudible>
Speaker 6 00:39:40 The image, even though it has this connotation of being real, is poetic in a sense. The whole point is that if you go into philosophy, to me, let your philosophy be poetic. If you go into physics, to me, let there be some poetry in it. And basically that was his approach to the image, in photography also, and in the digital system. So far, we think it is going to come and destroy the human, when maybe it will do something else. Maybe there is something unpredictable about it, beyond, you know, creating telephone applications and data and so on. Maybe there is something else.
Speaker 2 00:40:20 <inaudible>
Speaker 0 00:40:27 Thank you for listening to our very first episode of Mirror with a Memory. In the next episode, we'll continue exploring this idea of visibility and invisibility with Simone Browne, Sondra Perry, and Mimi Onuoha. Mirror with a Memory is a production of the Hillman Photography Initiative at Carnegie Museum of Art, Pittsburgh. For more information on the ideas and individuals featured in this episode, please visit cmoa.org/podcasts.
Speaker 2 00:40:59 <inaudible>.