Episode Six: Power

Mirror with a Memory
March 08, 2021 | 00:46:23


Show Notes

Do we have the power to refuse mass surveillance? In our final episode, we speak with Forensic Architecture founder Eyal Weizman, who explains how artists, activists, and researchers can use the tools of photography, surveillance, and AI to hold corporations, governments, and other institutions accountable.


Episode Transcript

Speaker 0 00:00:05 Okay.

Speaker 1 00:00:06 Hi again, and welcome back to Mirror with a Memory, a podcast exploring the intersection of photography, surveillance, and artificial intelligence, and the significant ways in which artists are contributing to that space. I'm your host, Martine. Since this is our sixth and final episode, we can start by stating the obvious: there are lots and lots of nefarious ways in which photography, surveillance, and artificial intelligence can converge to give governments and corporations entirely too much power. We've talked about this as it relates to biometrics, policing, privacy, and the environment. But power dynamics can change quickly, and even those without power can become empowered. That's what we ultimately want to talk about as we bring this conversation to a close. Let's start with the public and their fears around the places in their lives, and in our world, where photography, surveillance, and artificial intelligence intersect. We surveyed members of our community for their personal takes. They had a lot to say.

Speaker 2 00:01:12 Who do I think is accessing my photographs online? Oh man. Uh, the government. I mean, obviously, you know, dudes. You know, you've got my first name, you can just Google me. They're already weird enough on Instagram. So yeah, the government and weirdos.

Speaker 1 00:01:32 I think surveillance technology is super creepy. The tools that I'm most worried about are honestly the things that I have in my home. You know, I have a phone and a computer, and that's as far as I'll go. I'm never going to get, like, an Alexa or something. Those things scare me so much. I think it's so creepy. I don't have grand fears around surveillance technology, but I do have a concern that there are conclusions drawn and profiles made about someone that probably wouldn't be fully accurate.
Speaker 2 00:02:07 My biggest fear is just the fact that very soon, you know, you're literally going to be on camera 24/7, 365, like, literally constantly surveilled. There are cameras in your house, there are cameras in stores, there are cameras on your computer.

Speaker 0 00:02:24 That'd be living a science fiction story. It won't be science fiction. It will be science truth. We will be tracked to the point where people will know where we're going and when we're coming. They'll even know our patterns, to the point where we're going to be expected next. I think we're already there.

Speaker 1 00:02:41 We all assume that Big Brother is watching, so to speak, and knows everything, but they might know just enough to paint a very poor picture, and one that you would not like, of yourself, in isolation from everything else that makes up a human being and a life. We have very little control over what is seen

Speaker 3 00:03:00 and what is heard, and then what is collected on us. And that's almost scarier than someone or some entity being able to get the whole picture. I would say my biggest concern about surveillance technology is the idea of the photograph, the idea of the camera. And I use that in the expansive sense, where a camera can be an experimental vessel to capture data. And my biggest concern with surveillance technology is, like, you don't get context. You get a stoppage of time, and you get that frame of time, and you can only provide an analysis of that frame of time without knowing what happened before and after.

Speaker 4 00:03:48 In our last episode, we heard from a writer and scholar named Arthur Holland Michel. I am a senior fellow at the Carnegie Council for Ethics in International Affairs. Let's stay on the dark side for a moment with Arthur and zoom out and up again, because Arthur's focus on aerial surveillance is an important consideration in any conversation about power and tech.
As Arthur mentioned in our previous episode, the idea of flight and a bird's-eye view has enticed humans for centuries. But when it comes to surveillance, and some of the concerns we just heard about from the public, aerial photography and its increasing sophistication and availability add new layers of intrusion. Add in AI, and it gets even more complicated. When I was writing the book Eyes in the Sky, I was looking specifically at a subset of aerial surveillance technologies called wide-area motion imagery, in contrast to a traditional aerial surveillance camera that operates sort of like a telescope.

Speaker 4 00:04:54 These cameras can watch a whole city at once. They are absolutely massive. And you would think that that is a tremendous surveillant power, that any organization wishing to watch, and by extension control, the object of their interest would definitely want to have. It's a no-brainer. But there is a challenge that arises with collecting so much information. And I should say, aerial information is data rich. Someone needs to analyze all of that information. You need human eyes to actually look at every single pixel to determine what is going on on the ground. Obviously, if you have a camera large enough to watch an entire city, you need a lot of people to do that, a lot of eyeballs on the image, which isn't practical, not even for an organization as large and well-funded as the Department of Defense. And so the instinct has been to turn to artificial intelligence to do the watching for us, to turn to machine vision to generate information from this data.

Speaker 4 00:06:16 That's significant, because if you crack that technological challenge of replacing human eyeballs with computers that do not get tired, that do not get stressed, that do not get distracted, you are in a sense unlocking the whole Pandora's box of aerial surveillance. We live in a world where a lot of data is already collected about us and on us.
And so even though we are constantly generating data, we're able to enjoy a certain degree of anonymity, because there are simply not enough eyeballs to track our every move. If that act of tracking is outsourced to a computer, then really anybody who has the power and interest in seeing your every move could, in theory, just click a button and do exactly that. If you think about that superpower in the hands of, say, an oppressive government, it's not that hard to see how it could be extremely troubling.

Speaker 4 00:07:37 Perhaps the government wants to track a specific political group. All they have to know is the location of a student meeting for a party that they have some opposition to, and from that they can track all of the individuals who leave that party and suddenly get an automatically generated network map of their adversary, something that would have taken a lot of time and resources to achieve in an earlier age. It gets to a broader point, which is that at the moment, the biggest threat to privacy and civil liberties is not so much the collection of data in any format, but the automation of the processing of that data. That's where you can suddenly see surveillance begin happening on a truly industrialized scale, which, in the wrong hands, is a purely nightmarish prospect. AI's role in this type of data collection and data analysis is significant and scary, but

Speaker 1 00:08:56 it can come into play when reversing these power dynamics too, as does a term I learned about through the work of one of our episode two guests, scholar and writer Simone Browne. I first came across the term sousveillance in Simone Browne's book, Dark Matters. My understanding of surveillance is that the sur- means above or over, and it's why we think about it as a kind of state thing, like corporations or the police, or, you know, whatever Evil Corp panopticon. The -veillance part is from the French veiller, to watch, and the Latin vigilare, to keep watch.
So then sousveillance, in contrast: sous- means below, in contrast to the state. So that means, like, peers watching or keeping watch, recording. So that's people around you at a protest, let's say, filming. That's a kind of sousveillance. I'm interested in sousveillance because it is a kind of <inaudible> of images. Typically with surveillance, you know, immediately when you say surveillance, you have an image that pops up. You kind of think of this closed-circuit television. There's an angle that's a part of it. There's a way you see the cameras, and it sort of provides, like, an official record. I'm interested in more subjective experience, the lived experience. Sousveillance kind of gets to that for me. I'm also interested in embodied photography and embodied filmmaking, and sousveillance is often handheld, closer contact. There are certain formal qualities to it.

Speaker 1 00:11:04 Back in episode three, Evidence, we talked to an American artist and educator who explores Black labor and visibility and the history of race in America through their work. Sousveillance is a concept they've thought about a lot as well.

Speaker 5 00:11:20 This notion of sousveillance as the opposite of surveillance, in that it's those without the power looking up and observing the oppressor, to put it really bluntly, or these acts of resistance by observing. I want to also talk a little bit about that idea of dark sousveillance, because it's a really interesting idea, in that it's thinking specifically about race relations and racism and the racial gaze, and adding this other dimension onto this plane of veillance, which is a racialized one. This act of dark sousveillance being a sort of act of resistance against racism, through acts of letting others know, particularly, you know, letting Black people know when there is an imminent threat of captivity, you know, so different means and tools of creating that awareness. So maybe a really obvious example is the Green Book.
That was a book used to tell Black people who were driving where it was safe to stop. You know, so tools such as these were used to sort of undermine that threat, particularly along the lines of the racialized gaze and the potential of captivity. And I think it's a really captivating idea, to think about veillance and all forms of veillance with this additional layer of racialization, and to think about how that complexifies our entire relationship to veillance, in that it's never just a mundane or single-directional thing, you know; there are always these other layers to it.

Speaker 6 00:13:10 Sousveillance can also be a way of truth-seeking by aggregation and consensus, which feels especially important in a moment when the very notion of truth is under assault. To dig into this, we turned to Forensic Architecture, an artist, activist, and technologist group that's at the forefront of using photography, surveillance, and artificial intelligence to formalize and verify collective acts of sousveillance, holding states, corporations, and institutions accountable for their misdeeds. I spoke with Forensic Architecture founder Eyal Weizman about all this.

Speaker 7 00:13:45 My name is Eyal Weizman. I was trained as an architect and gradually drifted into human rights investigations. I now run an agency called Forensic Architecture, and we are based primarily at Goldsmiths, University of London, but also in different places worldwide. And we undertake investigations into state violence, state and corporate violence. I guess I kind of started Forensic Architecture well before the agency was established. I was working as an architect in the human rights movement in Palestine, a network of many closely knit organizations, Palestinian, international, and Israeli, that oppose Israeli colonization in Palestine and the human rights violations that are associated with it. As I said, I did it as an architect.
That means that I was mapping out the Israeli settlement project in the West Bank and Gaza, the way the settlements were built, I noticed, to control and survey and actually limit the Palestinian habitat within those areas.

Speaker 7 00:15:01 So I was looking at human rights violations through architecture, and understood that architecture is not only a discipline for building and designing houses or cities, but actually a field of knowledge and a way in which we can experience the world, and understand it, and represent it. And I wrote human rights reports that effectively criminalized, if you like, the work of Israeli architects building in the settlements or working with the military, and produced evidence. And that was all throughout the early two thousands. Around 2010, other kinds of developments manifested themselves. Firstly, there was all of a sudden a new type of evidence that became available to human rights investigators. And these were images and videos that were posted by the people that experienced violence firsthand, posted on all sorts of social media websites like YouTube and Facebook. And the problem was actually how to use this very raw material, very heartfelt, very experiential material; it was a raw material that had to be synchronized, and had to be located, and had to be verified.

Speaker 7 00:16:26 And slowly I realized that in order to analyze this media flood we were gradually swimming in, we needed architecture to help us do that. So architectural models became the medium within which we could locate those videos and those photographs. Imagine something happens. An incident happens, a police brutality: Israeli police shoot an unarmed Palestinian somewhere, and on the scene there are anything between a dozen and two dozen cameras. Most of them switch on after the incident happened, but still capture important things.
But some of them might have been filming in another direction, or filming something else, at the same time that the shots were fired, and captured either the sound of the shot or, sometimes, rarely, but sometimes it happens, they capture the person being shot or the person doing the shooting. But what one needs to do is to establish relations between those. When we look at each one of those videos separately, there is a limited amount of information that we can draw from it.

Speaker 7 00:17:46 Investigations start when you build relations between videos. Every video is like a doorway to another video. In every video, you would see a detail that could help interpret or understand what exists in the video next to it. So, for example, one video would simply show us people running away from the scene. Another video would show us a Jeep coming in, a military Jeep coming in. Another one would just capture a drone in the sky. Another one would be of the person being hurt, perhaps, on the ground. Another video would be of another bystander trying desperately to call for medical services, or for an ambulance. And you need to build them together into an image of what has happened. Then you need to look at all those perspectives simultaneously and to understand the relation between video one, video two, and video three. And to a certain extent, architecture all of a sudden became like an optical device for us, because the minute we could build a very precise three-dimensional model, we could locate each one of those videos precisely where it was in the model, and precisely the point of view of the camera in play, and show how all the cameras are moving in space in relation to each other, and start navigating from one video to the next. That is to say, architecture becomes an optical instrument, the only way to make sense of the multiplicity and variation within this model.

Speaker 8 00:19:32 So,

Speaker 7 00:19:32 so that is really how we started.
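[Editor's note: Weizman's description of the model as an optical device rests on a standard idea from computer vision: once each video's camera is located and oriented inside the 3D model, any point in the scene can be projected into each frame, which tells you which cameras could have seen the same moment. A minimal sketch, assuming a simple pinhole camera model; the poses, intrinsics, and image size below are illustrative, not Forensic Architecture's actual software.]

```python
import numpy as np

def project(point_w, cam_pos, R, f=1000.0, cx=960.0, cy=540.0):
    """Project a 3D world point into a camera's image plane.

    point_w : (3,) world coordinates of the event (e.g. where a shot landed)
    cam_pos : (3,) camera position in the same world frame
    R       : (3,3) rotation taking world coordinates to camera coordinates
    f, cx, cy : pinhole intrinsics (focal length and principal point, pixels)

    Returns (u, v) pixel coordinates, or None if the point is behind the camera.
    """
    p_cam = R @ (np.asarray(point_w, float) - np.asarray(cam_pos, float))
    if p_cam[2] <= 0:          # behind the image plane: this camera can't see it
        return None
    u = f * p_cam[0] / p_cam[2] + cx
    v = f * p_cam[1] / p_cam[2] + cy
    return (u, v)

def cameras_seeing(point_w, cameras, width=1920, height=1080):
    """Return the ids of cameras whose frame contains the event point."""
    hits = []
    for cam_id, (pos, R) in cameras.items():
        uv = project(point_w, pos, R)
        if uv is not None and 0 <= uv[0] < width and 0 <= uv[1] < height:
            hits.append(cam_id)
    return hits
```

In this toy form, `cameras_seeing` is the "doorway" step: given the moment of interest, it lists which other located videos should also contain it, so an investigator knows which footage to cross-examine next.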
And when we started, around 2010, 2011, you really had three or four videos, if you were lucky, around each incident. Today, when we are working with groups of protesters in Hong Kong, or with groups of protesters in the U.S. in the Black Lives Matter protests, we are sometimes confronted with hundreds of simultaneous videos that are not several seconds long anymore, but could be two hours long. They could be, like, live feeds. Human vision is essential to that, but in order to interpret them, you need help. We cannot employ hundreds of people to look at hundreds of videos every day. We need to automate part of that process. And the way in which we automated it is by training machine-learning classifiers to do the first triage, if you like, the first kind of pass over that material, and tell us: do you see a tear gas canister, or let us know in which frame, amongst all those thousands of videos, you could see a particular tear gas canister, a particular gun, a particular type of vehicle, or somebody being hurt, or somebody shouting, et cetera. And then that would pass on to a human researcher who would actually verify it and stitch it together with other videos.

Speaker 6 00:21:03 Something I was thinking about when I was going through your work is this question of verification, especially in light of, for example, the Breonna Taylor verdict, or many other verdicts before then, where you do have a lot of evidence, let's say, or you have video, or you can verify this information, and it seems to not matter. I was just curious if you could talk about your process of verification, but then also this insistence upon the truth. I loved how you described truth as a common resource, something that we need to understand our place in the world, and as it's under constant assault lately, um, I thought that'd be a good place to start.

Speaker 7 00:21:47 This is really at the heart of our practice.
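[Editor's note: the first-pass triage Weizman describes, classifiers flagging candidate frames so that humans only verify rather than watch everything, can be sketched as a sampling loop. The detector here is left as any callable and stands in for a trained machine-learning classifier; the labels and sampling rate are illustrative assumptions.]

```python
def triage(videos, detect, sample_every=30):
    """First-pass triage over a batch of videos.

    videos       : dict mapping video_id -> sequence of frames
    detect       : callable(frame) -> set of labels found in that frame,
                   e.g. {"tear_gas_canister", "vehicle"}; stands in for a
                   trained classifier
    sample_every : run the detector only on every Nth frame, since
                   adjacent frames are nearly identical

    Returns a list of (video_id, frame_index, label) hits for a human
    researcher to verify and stitch together with other videos.
    """
    hits = []
    for video_id, frames in videos.items():
        for i in range(0, len(frames), sample_every):
            for label in sorted(detect(frames[i])):
                hits.append((video_id, i, label))
    return hits
```

The point of the design is the division of labor Weizman outlines: the machine reduces hundreds of hours of footage to a short list of flagged moments, and the final judgment stays with a human researcher.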
So initially I would start by saying that when I said to colleagues at Goldsmiths that I wanted to establish, or that I'd just established, a forensic institute, people go, like, what, you want to work like the police? Isn't that what we in social movements are almost instinctively against, that kind of authority? And indeed it's completely against my instinct to do that, because, in a way, I grew up in the anti-colonial movement in Palestine, where official truth, capital T, was what the state was saying to us: that there isn't really a colonial project, there isn't really an occupation, it's not an occupation, there are no human rights violations; the Nakba, that is to say, the expulsion of Palestinians from Palestine in 1948, didn't really happen, people just ran away. And I realized, in a sense, that if we are to engage in truth practices, let's say, it's not enough to just contest the facts that the government establishes, but to think about different ways of arriving at the truth, of arriving at facts, because the way in which we are used to receiving facts is really by trusting the institutional authority of agencies that are certified to do that.

Speaker 7 00:23:31 So in a completely paradoxical way, we find ourselves sometimes very similar to those so-called post-truthers, you know, those people that dismiss experts as fake news and so on, in an almost dangerous way. Similarly, we are saying, you know, we go against established truth. We go against vested interests. We go against the mainstream institutional authority to dispense facts. We go against the police. We go against the court, the legal system. We go against other state agencies, et cetera. We actually establish an alternative way of truth production that is open, that is a bit more democratic, that is not relying on expertise, but relying on the hard work of demonstrating how verification works.
So I think the question goes towards this sort of methodology that we developed, called open verification, by which we say that to establish truth, we do not go through the main gatekeepers of truth practices, but we work directly with the communities that experienced violence. Their testimonies, their videos, are the main thing we work with, and are the most important and precious evidence that we have, very often, by the way, delivered at great risk and with great difficulty.

Speaker 7 00:25:01 Then, with an open kind of network of volunteers, of people that stand in solidarity with the people that experienced violence, rather than create a closed box, a kind of laboratory with a sort of hygienic relation to evidence, in which the old rituals of privileged access to truth are exercised, we do it in the open, announce that we do it, and integrate as many points of view as we can into it. It's a sort of poly-perspectival approach to truth, in which there is no privileged position to look from, but a wide multiplicity of situated, partial, sometimes partisan perspectives that add up and create the fabric of verification that we need. Because when you are establishing facts, the videos that are truthful, the videos that are not fake, would agree with each other, would have a hinge to another video, and the testimonies would align. And when something is actually being manipulated, it will obviously often kind of fall out of that very fragile network.

Speaker 1 00:26:21 What Eyal is proposing and enacting with his colleagues and collaborators at Forensic Architecture is a kind of aggregation emerging out of machine learning. They're building their own data set from both official state and satellite sources and user-generated photographs and videos captured out in the field.
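[Editor's note: the "hinge" idea, truthful videos corroborating one another while manipulated material falls out of the network, can be illustrated as a simple corroboration graph. This is a toy sketch of the logic only, not the group's actual open-verification tooling; the corroboration test is left as any callable.]

```python
def corroboration_graph(videos, corroborates):
    """Build an undirected graph linking videos that corroborate each other.

    videos       : list of video ids
    corroborates : callable(a, b) -> True if the two videos share a 'hinge',
                   e.g. the same gunshot sound at a consistent offset, or the
                   same vehicle seen from two angles
    """
    edges = {v: set() for v in videos}
    for i, a in enumerate(videos):
        for b in videos[i + 1:]:
            if corroborates(a, b):
                edges[a].add(b)
                edges[b].add(a)
    return edges

def suspects(edges):
    """Videos with no corroborating neighbour: candidates for closer scrutiny."""
    return sorted(v for v, nbrs in edges.items() if not nbrs)
```

Verification here is a property of the network rather than of any single image: a clip earns trust by hinging onto others, and an isolated node is exactly the material that "falls out of that very fragile network."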
Then analyzing that information. When the images and videos

Speaker 6 00:26:44 connect in some way, they can start to verify an event with arguably more and better information than that of official channels. Sometimes you work with a journalistic operation like the New York Times. Sometimes you're working with something like Amnesty International. And there's also, obviously, this very robust art context. I was curious how those different contexts maybe shape both the work, if they do, and the audience that you're speaking to. And then also, since there is this political and humanitarian element to your work, what are the benefits of putting that research in an art context?

Speaker 7 00:27:31 Yeah, this is a really crucial question for us, because we exercise something that we call counter-forensics, meaning both to investigate state crimes, to investigate the investigators, investigate the police force, the secret services, the military, et cetera, but it also means a different methodology, a counter-methodology to the state's one. And when you do investigate the police or investigate the prosecution, it's much harder for us to get into the official forums of accountability, the sort of state forms of accountability like the courts or parliamentary inquiries or anything with a kind of government stamp on it, because sometimes they use the law to exercise violence. This is definitely my experience in Palestine, where the law is weaponized. The law is not neutral. So we need to find alternative fora, and therefore art and cultural institutions are some of the most effective ones. And I think that this sort of imperfect belonging, when you put something that is strange into a forum, is also a great potential, because with it you can open up new ways of seeing, new ways of understanding, and new ways of discussing visual evidence and technology.

Speaker 6 00:29:09 There's a specific Forensic Architecture project.
We wanted to talk about Triple Chaser, a video work and investigation the group showed in the 2019 Whitney Biennial. It responded directly to an activist movement that grew out of the discovery that tear gas grenades produced by a company called Safariland were being used against migrants on the U.S.-Mexico border, among other conflict zones. Safariland was owned by then-Whitney Museum vice chairman Warren B. Kanders, which raised questions about the money he'd been giving to the museum. I was among the 50 participating artists in that biennial who publicly asked for Kanders' removal from the board. A few months later, he resigned.

Speaker 9 00:29:53 <inaudible>.

Speaker 6 00:30:00 I wanted to talk about Triple Chaser specifically and, um, unpack this work as a really incredible use of artificial intelligence, and also as a way of allowing us to see something that is typically unseen or hard to see. So I was just wondering if you could describe Triple Chaser, and then we can

Speaker 7 00:30:26 talk more about it. I guess the work could not really be separated from the entire controversy and the entire set of protest and activism that happened around the dealings of the vice chair of the Whitney board of directors, Warren B. Kanders. We kind of used art and cultural forums as human rights forums, or as kind of alternative accountability forums, and at the Whitney it was the first time that we encountered the actual complicity of members of an institution in human rights violations. So the whole story kind of evolved from the actual activism that was led by groups like Decolonize This Place and many others.

Speaker 6 00:31:19 Decolonize This Place is an activist group based in New York City that organizes in support of indigenous rights, Black liberation, anti-capitalism, de-gentrification, and other causes. They've staged protests at many cultural institutions, including at the 2019 Whitney Biennial.
Speaker 7 00:31:38 And we were invited to do something else, but we realized we had to investigate the institution that invited us. Around the same time, people like Decolonize This Place and others were kind of desperately looking online for evidence of where the munitions manufactured by the horribly named Safariland, the company that Warren B. Kanders owned, which manufactured police equipment including tear gas canisters, were being sold. When you export weapons or lethal ammunition from the U.S., that's all public record; you can find where this material goes. But what is called less-than-lethal munition, you don't really have that on public record. So you need to find photographs of those instances where this material is being deployed worldwide, say in Palestine, say in Mexico, and elsewhere. Wherever it was found, you needed to find it in order to verify that a contract had been signed. But to look for it online, amongst millions and millions and millions of videos and images, would have taken years.

Speaker 7 00:32:55 So we decided to actually train a machine-vision classifier to do the work for us. And usually the way you would do that is like teaching a child how to see. So anyone that has a kid knows: you have, like, a toy train, and you say, ah, this is a train. And then you hold it from a slightly different perspective and say, that's a train, that's a train. And the child kind of understands that the train is not that object from one perspective, but that this train could be seen in the dark and the light, from far and near, from above and below, and it's still a train. So you need to teach a machine how to see, and teaching how to see is all about the relation between sensation and cognition, between sensing and sense-making. And that's a very interesting process, because you start thinking about photosensitivity, about the way in which we aestheticize and understand the world around us.
So we used our toolbox, which is to build an architectural model, and used a 3D model of this particular bit of munition. With it, we were able to generate an almost infinite amount of images and tell the classifier: that's a Triple Chaser, that's a Triple Chaser, this is also a Triple Chaser, here are ten Triple Chasers, and even if you look at it from the other side, still a Triple Chaser, and repeat that process thousands, tens of thousands, hundreds of thousands of times.

Speaker 10 00:34:27 <inaudible>

Speaker 1 00:34:32 The video opens with footage from the Tijuana-San Diego border captured in November 2018, when U.S. border agents fired tear gas at migrants. David Byrne narrates.

Speaker 10 00:34:45 Forensic Architecture is training a machine learning classifier to search for images of tear gas grenades manufactured by the Safariland Group.

Speaker 1 00:34:54 A few minutes in, a seizure warning appears on the screen, then a pile of 3D-modeled tear gas canisters. There must be three dozen of them, strewn and angled in all the various ways they might fall on the ground after they've been used. The canisters flash and reconfigure as background images cycle behind them in bright, abstract patterns, as we watch the machine learn to see these canisters at any possible angle, in any possible environment. What differentiates this data set from other data sets we've talked about today is that this one isn't based on existing images. It's based on synthetic, computer-generated images, because this many photographs of this many

Speaker 6 00:35:38 canisters in this many conflict zones simply don't exist. Forensic Architecture also trained the classifier to recognize what is not a Triple Chaser grenade, things like soup cans, beer cans, fire extinguishers, flashlights, water bottles, batteries. And they trained it to recognize the Triple Chaser in simulated real-world environments. I wanted to ask about open source, because I was going through all of the code.
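[Editor's note: what the video describes is, in machine-learning terms, training on synthetic data with domain randomization: render the known 3D object under randomized pose and scale over randomized backgrounds, mixed with object-free negatives, so the classifier learns the object rather than any single photograph. A toy sketch of the generation step; the rendering callables and parameter ranges are illustrative stand-ins, not the project's actual pipeline.]

```python
import random

def synth_example(render_object, random_background, positive, rng):
    """Composite one labelled training example.

    render_object    : callable(angle, scale) -> object pixels under a
                       randomized pose (stands in for rendering the 3D
                       canister model)
    random_background: callable(rng) -> a randomized backdrop
    positive         : if False, emit a background-only negative example
                       (soup cans, water bottles, etc. in the real project)
    """
    background = random_background(rng)
    if not positive:
        return background, 0
    angle = rng.uniform(0.0, 360.0)   # seen from any side, still the same object
    scale = rng.uniform(0.5, 2.0)     # near or far
    return (background, render_object(angle, scale)), 1

def synth_dataset(n, render_object, random_background, seed=0):
    """Generate n examples, roughly half positive, half negative."""
    rng = random.Random(seed)
    return [synth_example(render_object, random_background,
                          positive=(i % 2 == 0), rng=rng)
            for i in range(n)]
```

Because every example is generated, the labels are free and exhaustive, which is exactly why this approach works when, as the narration says, enough real photographs "simply don't exist." The negative class matters as much as the positive one: without it, the classifier would flag every cylindrical object as a canister.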
You know, it's very available on your website. It's really generous. And I guess I wanted to talk about that in relationship to the collaborative way that you made the work, and how that fits in with it as a kind of activism, because it had real effects. Sometimes, on my cynical days, if I'm working within an art context, and obviously there are constraints, there are whatever realities to where that money is coming from as well, I think: who do I work with, how do I respond to it? And I just think it's a really interesting way of both using artificial intelligence with resources, but also how that is all open source. It's all available. People can read about it. And then there was a sort of collaboration, both artistically but also politically.

Speaker 7 00:37:00 Yeah, I think it's absolutely right. Initially, in order to get those models done, we needed collaboration from activists in Tijuana and in Palestine. So my very good friend Emily Jacir runs an art space, which I think very likely is the most tear-gassed art space in the world, just by the wall in Bethlehem. She had a pile of all sorts of canisters, and remotely, on the phone, she was kind of showing: is that the one? Is that the one? And finally we found the right one, and she measured it for us, she photographed it for us. And from those videos, we were able to produce a photogrammetry model, which is a sort of three-dimensional rendering from videos. So already the collaboration started there. So it's never just the piece of work that is projected. It's everything. It's the piece of work.

Speaker 7 00:37:59 It's the news that we continuously updated. It's the relation with Decolonize This Place and with our colleagues in the museum. It's the decisions that were taken, the conversations. And I think that all of that indeed contributed, by no means was it the only thing, but it contributed to the decision of Warren Kanders to resign.
And it still resonates, because it kind of challenges the very foundations of the way in which art is produced, the way in which art is displayed, the way in which it is asked to be radical but still polite to its context. So that politeness, we all, I think you too, we gave up on it. I mean, that was not a polite participation.

Speaker 6 00:38:53 Forensic Architecture's approach to surveillance imagery, to satellite imagery, to machine vision, all in service of identifying and verifying state violence, all with the goal of holding its perpetrators accountable, gets at something foundational about any technology, especially photography.

Speaker 7 00:39:14 Every camera has its politics. Every camera has a point of view. Every camera has its history: body cams, and police dash cams, and activist cameras, and satellite images. Each one of them you need to think of in relation to the history of that practice. There's always a set of relations that go into the photographic event, meaning your relation to a photograph is not only what is registered in it, but how it was produced. What is the contract or agreement, implicit or explicit, between you and the person that was taking it? What kind of bonds does it make? What kind of exchange of knowledge does working with photography entail? So for us, we would like to think of photography as always a collective practice, that is to say, every relation between the people taking it, the people analyzing it, the people advocating with it, the people litigating with it, creating those kinds of commons, aesthetic commons,

Speaker 7 00:40:26 photographic commons, if you like, that make us understand that the photographic event is not only pressing the shutter, pressing the button, and taking the photograph. To work with photography is all those relations, social and political, the activism, the litigation, that go into them. And this is what we have to offer into that.
I think that interdisciplinarity is a very weak term to explain the process of open verification, where each individual person or group of people that joins that common investigation brings in a very specific, individual, situated perspective: what they could do, what they could expand, what they've seen, what is the unique experience that they bring. And that is what enriches a reality, not reducing it to each one of those, not to the technique and not only to the experience. Bringing those together is, I think, the strength of the collaboration. Speaker 1 00:41:34 This idea of open and transparent collaboration at the intersection of photography, surveillance, and artificial intelligence is exciting. And it addresses some of the core inequities that artists and researchers are trying to bring awareness to and, if possible, correct. In Episode Two, we spoke to the artist and researcher Mimi Onuoha about visibility and invisibility and everything we don't know about how we're seen online. Here she is again with a final thought. Speaker 11 00:42:06 When it comes to data collection in general, you always have these two different groups, or two different entities. There is the group that is doing the collecting, that wants a thing to exist, and then there's the group that makes up the collected. And when those two groups are the same is when you have the fewest issues, because that means that the group that is actually doing the collection will also have a very clear understanding of what it means to be collected. And so the decisions that they make will be different depending on that. When those groups are different, when the group that is collecting something is different from the group that makes up the collected, that is when you start to have all these questions that emerge around power and access and so on.
So I think that, for me, that's how I try to map these things out, really in terms of relationships of power. And I find myself often wondering, well, what does it take? I don't think that it always has to be a zero-sum game. It's not necessarily that this group will only have this agency and control and that group will only have that. It's not always like that. But what does it take to change that? Or what does it take to disrupt that in some way? It seems to me like that is the place to focus on. Speaker 1 00:43:22 Sometimes, aside from its artistic merits or personal merits, I feel like, why am I using all this technology? I could make my life much simpler, never dealing with machine learning or artificial intelligence again. I could just take a photo and not be concerned with that. I knew there was this relationship between institutions, corporations, states, and artistic work. But talking with everybody, I guess I saw more of how it's useful and how it is kind of a space of progress and hope. Hope always feels like a weird word to me, but optimism: I find a sense of optimism in the fact that these things are in flux and that we can change them. We dictate how these technologies get used, and the internet, for years, but especially right now, has kind of given people a voice to say, we don't like that. Speaker 1 00:44:24 Or to hold corporations, institutions, the state accountable in some ways. The way we use the technology, our actions around the technology, shape what it is and what it will become. So we have this power to say, no, we don't like that, or we'd rather do this. You know, I always think about how, with Twitter, there was no reply function when it was first built; that was added in because that's how people used the tool. And so we can determine how these tools look, feel, and work, what's acceptable and what's unacceptable, based on our actual use of them.
And that's why I want people to use them more, you know, to take some ownership over them. I'm excited about that. I've been holding onto the idea that we can dream a new future, and that not being able to do that is a lack of imagination. And it just requires a lot of work, obviously, and activity. But first and foremost, we have to be able to see it and envision it and dream it and imagine it, and then we can work towards that. So I think this kind of radical imagining that we talked about is inspiring to me and is kind of keeping me alive right now. Speaker 1 00:45:52 Thank you for listening to Mirror with a Memory, a production of the Hillman Photography Initiative at Carnegie Museum of Art in Pittsburgh. For more information on the ideas and individuals featured in this episode and the rest of the series, please visit cmoa.org/podcast.
