The longstanding political and financial incentive to indiscernibly spoof representation has been empowered by synthetic opinions, images, news, and their automated generation. Private social media platforms openly facilitate this hijack by setting terms of engagement while allowing capital to manipulate visibility, faking relevance and consensus.
This dynamic emboldens the vectorialist class, flooding discourse and drowning out legible signals in a state of hyperreality. The result is a new iconoclasm through instrumental overload, marking a broader semiotic shift in which meaning collapses, symbols confuse, and sentiment overrides fact, creating conditions where spectacle replaces discourse, stigmas shape communities, and unchecked power flourishes.
Read full transcript (generated by Whisper)
Welcome Constant. We need to switch on. Thank you. If you want to fundamentally change society, you first have to break it. And it's only when you break it that you can remould the pieces into your vision of a new society. If you're collecting data on people you're profiling, then that gives you more insight that you can use to know how to segment the population. Shall we play a game? Which side do you want? To keep their messaging about issues that you care about, and language and imagery that they're likely to engage with. Talk to us about that. Yeah, so… Gleaning that intelligence on opponents suddenly opens a door onto an altogether different form of data gathering. We have relationships with specialist organizations that do that kind of work. You know who the opposition is, you know their secrets, you know their tactics. Contact information, phone numbers, instant messaging screen names, anything you want to tell, interests, what books you like, movies, anything. And most importantly, who your friends are. And then you can browse around and see who people's friends are, and just check out people's online identities, and see how people portray themselves, and just find some interesting information about people.
The two fundamental human drivers when it comes to taking information on board effectively are hopes and fears. And many of those are unspoken, even unconscious. You didn't know that that was a fear until you saw something that just evoked that reaction from you. And our job is to drop the bucket further down the well than anybody else, to understand what are those really deep-seated underlying fears. Fake news. Fake news. Fake news. Here's an interesting fact about fake news. It does not have a universally accepted definition. Manipulation of content is not so much curation for the benefit of the user experience, but rather censorship to produce and manipulate emotional responses. It has to happen without anyone thinking, hey, that's propaganda. Because the moment you think that's propaganda, then the next question is, who put that out? So, my name is Constant Dullaart. That's Dutch. But "Dullaart" also works. My microphone seems to be terribly loud. Is it okay? But I feel like I have to whisper. And I always like to speak up a little bit when addressing people, keeping them awake during strange topics. But anyway, Constant Dullaart: many people speaking the English language think it actually means boring art, dull art, all the time.
It's a chosen name. That's why I'm showing my passport. So, a lot of people write it like this. But that's, yeah, that's not the way to write it. You have to write it with two A's. So, a lot of mistakes even happened on this occasion. But now you get punished. You have to write it with three A's. And if it gets worse, you have to write it with four or five A's. Anyway, so I'll talk about spoofing. But I will also talk about a few things in which agency was added to media. And we'll start with this moment where boomers were applauding an eraser. So, please. If we have some really great questions, we'd like to answer them. But before we do, Bill and Randy are going to run through some really quick demos, which you'll be able to see up on that screen, to give you a little feeling for the kinds of things that we can do. I'm going to show you the MacPaint graphics editor that runs on Macintosh and lets you edit images. Sometimes conveying your message goes better with pictures than with words.
What you see when you open up MacPaint is three palettes. A palette of current tools, a palette of patterns, and a palette of border thicknesses. To use them, you point with the mouse, click on a tool and press, stretch out a line here, and release when you've got the line. Or choose a different thickness and do the same. There are structured tools along here that let you make hollow and filled objects, like a filled rectangle here. Choose a pattern to fill it with. And here, make a filled oval. You can erase parts of the image using this little thing that looks like a brick. It's supposed to be an eraser, but that's the best I could do. And if you just scrub with it, it kind of eats away. Okay, so this is significant. As you get bored with scrubbing a lot, you can just click. Because we're used to a lot of these functions. And actually, Bill Atkinson also developed… I'm sorry for walking around so much, but I'm trying to keep energy in my body. Is it possible to take it slightly down? Um… So Bill Atkinson also made HyperCard. And HyperCard was a precursor to the browser.
It was actually one of the first examples of connected hypermedia. He also notably thought of it while he was on an acid trip. And then he had to wait until he could use his fingers again so he could program it. But, anyway, I thought that was funny and remarkable. So he made MacPaint, but he also made the marching ants, which tied into the frequency of the monitor. But he also advocated for a function button. Within programming, there was always already a way to go a step back, but he wanted a function key for that. To just go one step back. And Jobs didn't want an extra key. And then he said, like, can we have an interchangeable function key that, in combination with something else, could function? And now, at least, it's ingrained in my body. It's removed from touch screens, of course. But it's still ingrained in my body, to go between Control-Z, Control-C, Control-V. And it's these kinds of strange things that were designed not that long ago, when people were actually applauding the use of a graphic eraser on a computer.
It actually goes further. …peel off a copy. And, in fact, feature and option will peel off many copies. So, with the copying and the making of the line of the object, that really blew the mind of the audience at the time. So, that was 1984. You see how he's like a rock star on stage. Originally a neuroscientist, he really came to fruition at Apple. I went to visit him. Also, remarkably, a nice reference to James. If you're still watching, James. Vanessa, hi. Naveen. And… Yeah. I went to visit him as a kind of pilgrimage, because I wanted to see the person that made Undo. Implemented Undo. And I made these drawings, like pixel by pixel from the video. And I gifted them to him. I gifted this plate to him, so he could make more prints of it. And these types of drawings. And the significance. And the… You see, I don't know if he's… He seems to think I'm a remarkable person, looking at that picture. But he put these things in his treasure cabinet, where he keeps his keepsakes. And I saw all these patents there.
And I saw this kind of strange object there. And I asked what it was. And he told me the story that when he left Apple, he had all these patents running to make a small device. And he said, you know, you can have it anywhere. You can have it in your pocket. And you can send messages from one device to the other device. So, this was the early 2000s. And he had prototypes from AT&T and Sony, all these devices there ready, like ones that I didn't recognize. But they never caught on; the companies didn't support it completely. And… And he was able to do this remarkable product placement. Not here. Who's not here? The groom. What's happening? Big's not here. But we're 25 minutes late. Well, did anybody call him? Well, give me a phone. Somebody give me a phone. Oh. All right. Here, sweetie. So… Ah. Master volume goes up. Now I get it. So, this was the product placement for it. Right? Like this kind of agency that was given to Carrie Bradshaw, and she wasn't able to use it. This kind of magic tool that wasn't available to her.
But other people already had it. Had access to it. Already had this agency. It was this introduction of agency with this kind of mystique around it. And I thought this was kind of interesting to realize. And I think I can relate that to how Bill Atkinson related to this, when he had already made this precursor. And he tried to do it. Apple failed. He made his own startup to do it. And the thing was, the object that I saw in his treasure cabinet: he had seen the advertisement in 2008. And he had cut out the screenshot from this image. And he had put it on a piece of wood, to carry around in his pocket for the four months that he had to wait until the iPhone actually came to his house, until he would actually be able to obtain the actual iPhone. And he said he walked around with that piece of wood in his pocket, as if it were an actual phone, to feel what that agency would be like. To have that much agency with you all the time. So, there's this person that was a neuroscientist.
Developed all these kinds of things. But still had to have that kind of physical relationship. That kind of totem, almost. That kind of effigy of that potential and that agency. So, let's think of that in terms of the amount of time we need before agency enters our lives. Anyway. Thanks, Bill, for sharing your stories. Yeah. So, of course, there are many thoughts around tools. For example, Marshall McLuhan: first we shape our tools, and thereafter they shape us. Or maybe it was this person: we make our tools and then they shape us. Or maybe it was even Winston Churchill saying we make our buildings and then our buildings shape us. You know, we don't really know. I wanted to respond also quickly, because I found an old record I made, related to the talk yesterday from Grégory Chatonsky, who mentioned form, and missing form, and things like shape and color. So, I found this early work I made. It's called the Fibonacci Resolution. Consider the pattern. Consider the opacity. Consider the texture. Consider the scale. Consider the scaly. Consider the volume. Consider the density.
Consider the storage. Of course, this was generated with an AI last week, but I am using it as a teaching tool because it's just… When I bike to my studio, I listen to this, and I generated about 5,000 versions of the track. So then I just listen to all the considerations I have to make before I finally dare to finish a piece or a talk. That's why I'm nervous now. So I wanted to talk further about music, talking about MIDI. These are the people that were behind MIDI. Actually, same kind of era. So in 1983, MIDI was released. MIDI was actually an interesting precursor to what is now the open-source movement, because it was an industrial language that was released so that multiple companies could use an interchangeable format. At the same time, for example, Philips lost the video format wars and then regained some traction with compact discs and so on. But these people were smart, and they actually released this technology. But the interesting part is that there was a change in music at the same time, right? So it influenced the cultural output. And when MIDI was applied strictly, just using the functions that were available, across all the instruments, the music turned cold, turned really efficient, landed exactly on the beat. And so what happened was that that wasn't the best use of it, right? It was this cold, mechanical application of the technology. So what happened is that humanize functions were added, alongside the quantize functions. And in principle, if you translate that, it's a programmed inconsistency. A programmed inconsistency that wasn't part of the innate capacities of the technical language, applied so that humans would perceive the result as more attractive. Anyway, more tools, of course. To the man who only has a hammer, everything he encounters begins to look like a nail, Maslow said. But Mark Twain also said it. And Bernard Baruch said it. Even Barack Obama said it. But it's nice, because I found it all on a quote website. Even Charlie Munger said it. He made a lot of money. So if all you have is an iPhone, you know, everything starts to look like what? Anyway, that was the question.
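The quantize/humanize pair described above can be sketched in a few lines. This is an illustrative sketch, not any particular sequencer's implementation; the grid size (a quarter beat) and the jitter range (in beats) are assumptions chosen for the example:

```python
import random

def quantize(times, grid=0.25):
    """Snap note onset times (in beats) to the nearest grid point - machine-exact."""
    return [round(t / grid) * grid for t in times]

def humanize(times, jitter=0.02, seed=None):
    """Re-introduce small random timing offsets: a 'programmed inconsistency'."""
    rng = random.Random(seed)
    return [t + rng.uniform(-jitter, jitter) for t in times]

played = [0.01, 0.27, 0.49, 0.76]   # a slightly loose human performance
strict = quantize(played)           # exactly on the beat: [0.0, 0.25, 0.5, 0.75]
loose = humanize(strict, seed=1)    # back off the grid, but only slightly
```

Quantize makes the timing perfectly regular; humanize then adds back the inconsistency the technology had removed, which is exactly the inversion the talk points at.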
So. Some other. In the process of making hypertext transfer protocol usage searchable, an initial technique used to analyze relevance was based on link popularity, developed by Inktomi. From 1996, Inktomi was a key infrastructure provider on the early web, powering search engines for Yahoo, MSN, and others with scalable distributed technology. It also offered caching and traffic management software to ISPs and portals. Though largely invisible to users, Inktomi helped shape the performance and architecture of early internet services. Sorry, I used Werner Herzog's voice there. Felt like doing that. How many times something was linked to from other pages would have a significance. It would rank as higher quality content through quantifiable interactions on record. Google named the process PageRank, for example. So it's not only that pages are ranked: it's named after the web page, but also after Larry Page, the founder. So, this is a process of spoofing validity, right? If you knew that process of ranking a page, making it more popular according to site visits and according to how many pages linked to it, then very quickly afterwards commercial services popped up to deliver just that, to spoof that relevance.
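The ranking logic being spoofed can be illustrated with a toy version of the PageRank iteration. The damping factor and the five-page example web are assumptions for the sketch; it only shows the mechanism the link farms exploited, how many pages pointing at one target inflate its score:

```python
def pagerank(links, damping=0.85, iters=50):
    """Iteratively estimate PageRank for a dict mapping page -> list of outlinks.
    Every link target must itself be a key of `links`."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iters):
        # each page keeps a base share, plus the rank flowing in through links
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# A 'link farm': three throwaway pages all pointing at one target.
web = {"target": ["a"], "a": ["target"],
       "farm1": ["target"], "farm2": ["target"], "farm3": ["target"]}
ranks = pagerank(web)
# 'target' ends up ranked far above the farm pages that link to it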
Link farms emerged as commercial services offering to promote websites, crafting and spoofing relevance for money, also called spamdexing, later just referred to as search engine optimization, and later black hat SEO. So, that brings us to the kind of net art tradition of copying that kind of relevance, and even domain squatting. And I'm still in the awkward privilege of owning HamburgerBahnhof.com. [unintelligible] I'm really happy to get a chuckle from the audience. Also this beautiful website. So, I think, as many others, when I'm answering this, I'm in doubt currently. I do feel like I'm already so controlled and manipulated that I don't really feel fully capable of clicking this box. As in, I do it because it puts me forward on my path, where I need to go.
But I am, of course, in an existential crisis. And that existential crisis also started with collecting bots, around 2014. And I was looking at these bots, the Instagram bots, and I was looking at their peculiarities. These kinds of inconsistencies. So, not just the human inconsistencies, but actually these mechanical inconsistencies, right? So, Instagram had the square format, and they forced it at that time. And the images that were automatically uploaded were automatically cropped. So, the faces were taken off. So, I think if you're reminiscing about your time in the army, you wouldn't crop off people's heads. It seems like this is also missing the crucial part of the image. This as well. It also made me wonder if you can snowboard a marathon. I think it's quite difficult. I think they mean that they've been snowboarding for a really long time, instead of the particular distance a marathon is, downhill. John Allen writes in his profile, and this is very inspirational. John Allen writes, "I'll prove to the world that I would ecome some hin in porant to they ned someday." So, of course, this was a time… we don't have that anymore.
They became treasures, these kinds of strange artifacts that we don't have anymore. So, I'm happy that I collected them. Not only because I know John Allen's motto, but also because I understood how they were spinning text. And now people are automatically adjusting content. They were scraping content and then pushing it back, right? And then they were inducing mistakes and changing stuff around so it would be less easily retrievable. Darset writes, I love so much Aston Eber. I think there's a certain musician who's in trouble currently. So, when I bought all these bots, and I bought them also just for my own account, I felt like I'd just gone through retail therapy, like I'd bought new shoes. I really thought people were looking at me as a completely different new person. As if my words had a larger validity, as if what I was doing was more important. And I thought, I wish that for other people, you know. I think other people should also be able to see that.
Other people should also have that feeling. And I saw a lot of competition, in the art world particularly, where people were adjusting their content so they would get more followers, and they were taking selfies with really important people. And then I thought, you know, they should feel less unequal. They should feel more equal. So for example, Klaus Biesenbach. I thought, let's round it up to a hundred thousand. And it's not that I'm not a fan of Biesenbach. And then there was this up-and-coming artist, Jeff Koons, and he only had close to 11,000 followers. And I thought he should have that same kind of privilege, right? And gave him 100,000 followers. Jerry Saltz was a critic. He got 100,000 followers too. Ai Weiwei, you know, he was posting all the way from the other side of the firewall, had to make a lot of effort. So I thought, and actually this curator was really close. He already had a lot. So actually, in the end, everybody had to be equalized to Obrist. Yeah, so, yep. Anyway, but this is still current, right? So this was a scroll of all the accounts you can buy. And this hasn't changed. That market is still very, very active. Okay. And we can believe that all of this stuff isn't going on, but that kind of undercutting of these value systems that we have of quantified social capital, you know, still remains, right? We have this system in which we think, ah, there's this number, and that number is quantified social capital, and that means relevance. And in the end, the incentive to spoof that is tied to a transaction, a financial transaction, right? So in the end, that is this deep undermining of that whole belief system, in which we think of a direct value being relevant. Actually, the Instagram work was a commission for the Jeu de Paume, by the way. Funnily enough, wanted to reference that. And then afterwards, I thought I should maybe go more dramatic and start with…
I started my own Facebook army. And I had thousands of accounts, based on the names of the actual Hessian mercenaries that fought in the American Revolution. I got all their names and registered them one by one, using existing industry, to have my own army and to feel what that would be like. So these are some of the accounts. Aston Iber showed up there again, by the way. And these accounts would post parts of an essay on identity that was written by my good friend, Robert Sarkowski. Or Sarkowski, sorry. And this is what they were posting. And it was kind of interesting, because if you see it on the right, there were these suggested friends. And I think in the end, that was the main method of the work. Going around and being introduced. So there were these scarily sorted suggested friends, and people were kind of shocked. But yeah, that was 2017. And we were using methods like this, bots that would actually automate a lot of the social behavior. If you look to the left-hand side of the window, you will see General Info and Facebook.
When you're done setting the general information, select Facebook on the left to set more information about each identity. Here you can set the current city, hometown, relationship status, biography, languages, website, and preferences for your identities. In the current city and hometown fields, you can use tokenized text or a random location. For this example, we will set the hometown and city to a random location. Open up the settings option to define the location. You can set the location by continent, country, state, and city. As you can see, FriendBomber can create identities of people anywhere in the world, as geographically broad as a continent or as specific as a city. The software will allow you to filter by population. This is great if you want to focus on, or away from, densely populated areas. When you're setting the location format, you should always use the city name, state code format when you are using a US location. Click test it to make sure the location is set to your needs. Anyway, to me, this is interesting. For example, here it just says: the find-in-Google-Places option will just take Google Places to register and affiliate with certain areas that other identities have affiliated with. So, in the idea of that technology to just create identities, with this general approach, with low-cost, kind of home-made bots, in 2017, in the research, I met people that were running 4 million Twitter accounts from their living room. This type of technology would specialize into a much deeper realm of what kind of interest there would be, what kind of political interest there would be, and combine that with nudge theory, surrounding other people and pushing them in a certain direction. Of course, this was a breeding ground for propaganda. And in the end, what was interesting, tying this into the MIDI story, was that in these bots, we actually had to program an inconsistency.
We had to program a timed inconsistency to make the bots more human. So that the bot would actually go onto Facebook and wouldn't be detected as a bot, by using inconsistency in timing and in where it would click. And within the development of headless browsers, of people making bots, or agents… [unintelligible] But tying it into this idea of emulating and spoofing what is human, what is valid to be human, what is real: the incentive is just so much larger than what we can battle with our technical means. That is a continuous war that will not be won. In the end, the incentive to spoof will always be there. I mean, maybe it's related to having children. But in the end, we are spoofing that validity continuously. This was one of the things that was needed to spoof these identities too.
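The timed inconsistency described above can be sketched as randomized delays and click jitter. This is a minimal illustrative sketch, not the actual software used in the project; the base delay, the spread of the exponential tail, and the jitter radius are all assumptions for the example:

```python
import random

def human_delay(base=1.2, spread=0.8, rng=random):
    """Sample a click-to-click delay in seconds: a fixed 'thinking time'
    plus an exponential tail, so the intervals are never machine-regular."""
    return base + rng.expovariate(1.0 / spread)

def jitter_click(x, y, radius=3.0, rng=random):
    """Offset a target coordinate so repeated clicks never hit one exact pixel."""
    return (x + rng.uniform(-radius, radius), y + rng.uniform(-radius, radius))

rng = random.Random(42)                       # seeded for reproducibility
delays = [human_delay(rng=rng) for _ in range(5)]
# every sampled delay differs, and all exceed the base thinking time
```

The point mirrors the MIDI story: the bot is made believable by deliberately degrading its mechanical precision.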
These were the SIM cards. I collected them, and I actually found a lot of these SIM cards being reused for the gold value, which is very difficult to get out. I would not recommend that. And this is a quick reading, which is hard for me to read, or hard for me to listen to, because I'm not that good at public reading, but I'll still do it. Semiotic tears. So, this is clearly a very beautiful image that a lot of people needed to interact with and comment on. However strong illusions, ambiguities, confusions on the thresholds. And Vinicude83 says, and The Temptation, Rick Dance says together: mirror images may be the experimentum crucis. It will dispel any doubt. Just reproduce a mirror in a photograph, in a motion picture or television shot, or in a painting. These images of mirror images do not work as mirror images. There is no imprint or icon of a mirror other than a mirror. The latter, in the world of signs, becomes a shadow of its former self. Derision, caricature, memory. You can make a portrait, either a photograph or a painting, and assert that it is realistic, truer than the original.
With mirrors, there is no truer image than the original. A catapric, catoptric, sorry, this was a word that was pretty new to me. An element capable of reflecting a semiotic element existing independently of it, without modifying it, cannot in turn be reflected by it. The semiotic element can only be realized by it, making its genus a pure content concept. These two universes, of which the former is the threshold to the latter, have no connecting points, the extreme cases of distorting mirrors being in fact catastrophe points. There comes a time when one must choose which side one is on. The catoptric universe is a reality which can give the impression of virtuality, whereas the semiotic universe is a virtuality which can give the impression of reality. So this is what Umberto Eco wrote back in 1986, and it is now active in its own posts. And finally, since we are really interested in automated content bots and this sort of anthropomorphism of technology, we really liked this beautiful example from Blade Runner, the replicant Roy Batty, who delivers this very beautiful speech that we'd like to play for you guys.
I've seen things you people wouldn't believe. Yeah, so I cut the sound here, because it's actually interesting to consider bots seeing things that we can't believe as humans, right? And then we can consider this whole latent space idea: the mechanized idea of seeing something and interpreting the space between known concepts, where being able to recognize what is between these known concepts becomes latent space. Moments will be lost in time. Like, tears in rain. So, we decided this needed some amplification, so we decided to tweet this statement and have it purely retweeted by bots. So we're now on 11,300, oh, 13,000, Jesus Christ. And we wanted to think of the value when it's deleted. Disappearing, like, retweets in the rain. Yep. And with that Facebook army, the importance there was that I basically crafted a weapon. I didn't dare to shoot it. I didn't know, actually; when I made it, Trump wasn't yet confirmed to be the Republican candidate. Bernie was still in the race, you know? We were thinking, do we give them to Bernie, so we support Bernie?
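The latent-space idea mentioned above, recognizing what lies between known concepts, can be sketched as interpolation between embedding vectors. The three-dimensional "concept" vectors here are toy values, purely illustrative, not real model embeddings:

```python
def lerp(a, b, t):
    """Linearly interpolate between two embedding vectors a and b at position t."""
    return [ai + t * (bi - ai) for ai, bi in zip(a, b)]

cat = [1.0, 0.0, 0.5]   # toy 'concept' embedding
dog = [0.0, 1.0, 0.5]   # another toy concept
between = [lerp(cat, dog, t) for t in (0.0, 0.25, 0.5, 0.75, 1.0)]
# the midpoint is a valid coordinate that corresponds to neither known concept:
# a point 'between known concepts' in the latent space
```

In a generative model, decoding such an in-between point yields an image or text no human ever produced, which is the sense in which the bots "see things we wouldn't believe."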
Or do we give them to Trump, and then later undercut it and say, oh, this was fake? And in the end, we couldn't make our minds up. Or I couldn't make my mind up; the team of people I got together to debate this couldn't make their minds up. But in the end, people shot the weapon anyway, right? I had just built that same weapon, and other people were firing it at will. So I did write more of these poems. Like the poem I showed before, which, of course, relates to this kind of semiotic relationship, right? Do we have that same relationship to something? Do we have that same relationship to signs already, through social media, and now through this kind of privatization of social media? Do we still believe a retweet signifies agreement? Do we still believe these kinds of things? And actually, when it becomes that transaction, and when that validity can be spoofed, you know, after a while you can own, and you can change, that kind of semiotic relationship. So this is just saying that the use of AI doing that is currently only an amplification of a process already happening.
I did make kind of tribal references to all the accounts I could buy. So I would make these flags of different Instagram accounts. And you could buy male or female accounts, Russian, for example, or Ukrainian. The project starting in 2014, it was very much influenced by how that war started. And when the actual full-scale invasion happened, the flags were hanging in the Tretyakov Gallery in Moscow. And then we had this absurd symbolic moment with my symbolic work, which refers to all these propaganda wars and to this hijack of social capital. They had to actually… And I had to blur their faces, because they could potentially be implicated in something that would be against the regime, in a state-run gallery. And in the end, I don't know, maybe I don't even want to show it, but in the end, they had to, of course, remove the Ukrainian flag. I did, as a reference, start a thing called Common.Garden. And Common.Garden is a website that everybody can use, and you can meet each other on it. You can have informal kinds of Zoom meetings without using large corporations.
It's something that runs on servers of ours. And you can use it as a Zoom alternative. You can present portfolios. People in Nuremberg are using it, but also universities all around. And Common.Garden is actually another reference to my name being boring, dull, as in making dull art, making dull technology, making common-or-garden, ordinary technology. So, kind of these discrete relationships again. And knowing me, I don't know. I'm hosting it. We're responsible for it with a couple of friends. And that's how we're running it. And we're running a foundation called Distant Gallery on it. I put this on there as a reference also. But now I'm a part of this group also, !Mediengruppe Bitnik. I'm proud of that. And Sokrovsky's there. And Ed Dekker. And basically, this is the format. So, you would see the audience in the back, and people are presenting over there. And people can design their own setting of how the interaction works. I did want to… I will hurry up.
Also, I have to say, because the talks have been super interesting and very inspiring, I'm extra nervous, because I want to respond so much. And I want to pack all these things in. So, one thing that I thought was crucial is that we're talking about this technology in multiple applications, multiple areas. Are we talking about social responsibility, the responsibility of society, how it deals with this technology? Or are we dealing with how the technology works, and the cultural effects of that technology? Are we dealing with its potential danger? Or are we dealing with… Anyway, these things sometimes get convoluted. So, I just wanted to refer… This is a story that's known to a lot of people. The Thurn und Taxis family basically developed a postal system and turned into one of the most important aristocratic families in Europe under the Habsburg Empire at that time. And they were running it for well over three centuries, as you see here in this quick reference I found. But when they were bought out… In the end, they kind of dwindled, because other technologies were already slowly entering the field.
But they had this kind of monopoly, right? They had this monopoly on this way of communicating, and they made their money out of that. And with that necessity proven, and their monopoly over it, there was actually a moment, in 1867, when they were bought out by the Prussians, and given three million thaler for it. And it's interesting, I asked ChatGPT what the equivalent would be now, and that calculation was interesting: purchasing power would be about 90 million, because it was delivered in gold (or, no, silver) and translated between the two. But then labor value would be much higher. And if it's calculated according to the GDP share of what was invested in the country, the assumption is that it's probably about 50 million. But what does it mean that an amount like that could buy control over a communication system, with all the incentives that come with it? What is that type of investment?
What is that type of investment, related to how it has been done in history already, to reclaim that type of societal responsibility over an agency that was added to culture? Brief break, sorry, brief break from my voice. Although we know the landscape of quantified social capital, as long as it's been corrupted by created and automated social interactions, we can nostalgically look at a participatory mass medium before it was under private control and transformed into the outlet of one of the richest people in the world. [unintelligible] This engagement would suggest an alignment between your own opinions and the algorithm, rendering your own selection methods deterministic. Although I do agree we are ourselves mostly driven by controllable variables, our culturally conditioned circumstances that enable us to resist this programming have been diminished, and we have found ourselves in a hyperreality where bold opinions resonate further, in a space filled with the sound of the breaking in between representation and moral values from post-colonial social contracts. One can no longer be bored at family gatherings, be bored while consuming content as a group, waiting for the rush of cultural alignment. We are the ones who sold the world, to be addicted to the minimum viable dopamine rush delivered by personally catered media, in the cage of the exemplary narrow past. And now we don't even know if this shit is real. And now we don't even know if this shit is real. And now we don't even know if this shit is real.
And now we don't even know if this shit is real, even if the shit is real. So this was a video generation with the frame rate set too high to comply, so in the end it just spits out all these different frame rates, and I really enjoyed that effect, because it was outside of my control, right? So it introduces this relationship to AI and spoofing images, spoofing that type of valid imagery. And, of course, this hopefully is known by a lot of people, but this is the first layer, the first convnet layer, of analyzing an image. I will go over the five minutes, though, I'm sorry. Slightly. But this, to me, is almost a modernist understanding of what an image could be, right? Straight lines, curves, gradients, opacity: these kinds of values that you can use to understand a lot of the images in these data sets. So important that I made a monument for it. But I always understand it through this example: that it would be like a kind of equalizer, a graphic equalizer for sound.
And you could set all of these, treble, bass, highs, whatever, and position them in a certain area. So let's say it's the color red, it's curves, it's relationships between the different colors. These could be relationships between words, these could be sentiments, a lot of these things. And what if these are variables that you can move along an axis, right? Like saying more red, less red. So if I want a tomato, I would add some red and some more curves; for a horse, I would add some brown and some more straight lines. And if I would make an apple, it would be much closer to the setting that was there for the tomato. But there are variables in between, right? And I think this was always important for me in understanding what latent space is. Can you project this understanding between these lines? Can you actually make that impossibility happen? And one of the technologies within AI is actually to subtract the mean, the average, from the images: to find the most common thing and take it away, so you can find what is different between the different images much faster.
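The two ideas above, moving along axes of a feature space and subtracting the dataset mean, can be sketched in a few lines. This is a hand-made toy, not a learned latent space: the objects, the axes, and all coordinate values are invented for illustration only.

```python
import numpy as np

# Toy "equalizer" coordinates, one axis per property mentioned in the talk:
# [redness, curviness, brownness, straightness]. A real latent space is
# learned, but the geometry of moving along axes and finding the
# "variables in between" works the same way.
tomato = np.array([0.9, 0.9, 0.1, 0.1])  # add some red, add more curves
apple  = np.array([0.8, 0.8, 0.2, 0.2])  # much closer to the tomato setting
horse  = np.array([0.1, 0.3, 0.9, 0.8])  # add some brown, more straight lines

def walk(start, end, steps):
    """Linearly interpolate between two settings: the variables in between."""
    return [(1 - t) * start + t * end for t in np.linspace(0.0, 1.0, steps)]

def subtract_mean(points):
    """Remove the average ("the most common thing") from a set of points,
    so that only what makes each one different remains."""
    points = np.stack(points)
    return points - points.mean(axis=0)

# The apple setting really is closer to the tomato than the horse is:
assert np.linalg.norm(apple - tomato) < np.linalg.norm(horse - tomato)

path = walk(tomato, horse, steps=5)          # 5 settings between the two
centred = subtract_mean([tomato, apple, horse])
```

After mean subtraction the shared component cancels out by construction (`centred` sums to zero along the object axis), which is exactly why the differences between images stand out faster.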
So, with the help of Adam Harvey, we found this color, dull brown. It's still visible at dullbrown.com, if you ever want to use the most boring color in the world; technically, the codes to use it are there. I did make Dull Dream, which I actually restored from the program I made in 2017, where you could upload an image and partially remove its objecthood. You could remove the sense of representation of that object: you could upload an image and it would make it dull, make it less significant. A word play again, of course, but the idea of the representational value of an image becomes kind of important here. We would use saliency analysis to find where the important part of the image is, and we would re-render it to be boring and less significant. These are user images that were uploaded to it. I think this was interesting: we had a discussion about asking ChatGPT to verify a quote. And I got a quote from Tools for , but then it's kind of interesting what it says when I asked, hey, can you find this quote that you actually provided for me?
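As a rough illustration of the Dull Dream idea (saliency analysis, then re-rendering the important part to be less significant), here is a minimal sketch. The actual tool's pipeline is not described in the transcript, so everything here is an assumption: it uses gradient magnitude as a crude stand-in saliency map and pulls salient pixels toward the image mean.

```python
import numpy as np

def dull(image, strength=0.8):
    """Sketch of "partially removing objecthood": estimate where an image
    is visually busy (a crude saliency map from gradient magnitude) and
    pull those pixels toward the global mean, making them less significant.

    image: 2D grayscale array with values in [0, 1].
    strength: how aggressively salient regions are dulled (0 = no change).
    """
    image = np.asarray(image, dtype=np.float64)
    gy, gx = np.gradient(image)                 # local intensity changes
    saliency = np.hypot(gx, gy)                 # "where the important part is"
    if saliency.max() > 0:
        saliency = saliency / saliency.max()    # normalise to [0, 1]
    weight = strength * saliency                # high saliency -> more dulling
    return image * (1 - weight) + image.mean() * weight

# A flat image has no salient region, so it comes back unchanged.
flat = np.full((8, 8), 0.5)
assert np.allclose(dull(flat), flat)
```

A real saliency model would be far more elaborate; the point of the sketch is only the two-step structure of locate, then de-emphasize.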
It says, and it's actually interesting: "What matters is not to fight against the computer as such, but against its monopolization, against the institutional privileges it implies." But then it actually admitted that this is a paraphrase, not a verbatim citation; it actually just made this up. So it was this interpretation of all these data points, right? It was already manifesting between all these data points that were there, getting that type of knowledge and displaying it like that, pretending it's an exact quote. And this is already emblematic of that changing relationship we have to information, this kind of new pathway of communicating with information that we apparently find acceptable: so, deal with this type of interpretation. And what I'm also wondering is whether this is something that is kept in place so we can feel superior by finding these mistakes. I'm actually seriously thinking that sometimes these mistakes are there for us to find, so we can feel good about ourselves being humans. I wanted to highlight, though I'm sorry, I'm going slightly over time.
But I wanted to highlight that we made a data set in 2017. We did acknowledge the labor needed, so we collaboratively made a new data set, in 2017, to train all these typically European objects. And of course European culture doesn't exist, but we tried to emulate it to some extent, for example with blood of . And this was the rendering of blood of . So this is the first one. This is my in 2017. This is a of two years ago. And this is a very . It's too . I wouldn't eat it. This was my in 2017: very painterly, very naïve. I loved it. And I knew, of course, that this technology would change and would be much more photorealistic soon. And this was . You still see all these imperfect artifacts, and I've really learned to love them, especially because I would also get images like this. I'm sorry, my partner always does this when she emphasizes something, she does it in a metal grunt voice, but she was like: it's a louder version of that concept. And this is very . It's extremely, way too, really .
The same with an accordion. Of course you see the beautiful mistakes, but this is very accordion, and this is extremely accordion. I love that. And for the people liking their pastries: this was very croissant, this is extremely croissant, and this is also very croissant. But this is, I think, one of my topics to explain it: look at that dish in the back. Even the dish is croissant. So, in a visual way of understanding, a form of style, a style element, is understood from the objecthood of that representation. It didn't copy an image, it's far from that. It understood a principle, it applied it, and it just randomly applied it to something else too: that type of representational value that we understand. It could be a style element. And in the end, I'm thinking of all these paintings by Van Gogh that I was raised with as a Dutch kid going to the museum. Wasn't he doing that? Yeah, yeah.
It sounds too silly for me to realize, but wasn't he actually doing that? And it wasn't necessarily just the syphilis and the drink. I think I have to cut it there. There's a lot more, but, you know, maybe there's a future in which all of this can still happen and be discussed. [unintelligible] Thank you for this wonderful insight into your work, which made me think that Cambridge Analytica was also one of your projects. No. No. I think what we should briefly talk about, before we hand over to the audience, is this: your early work, and I remember when your army tried to become friends with me also, has always been sort of gleefully playful, and it was a very beautiful way of illustrating a certain vector that was becoming visible on the internet, in the way it was starting to get used, or had been used for a while at that point even, to manipulate power structures. So, from today's perspective, how could these spoofing techniques be used to counteract certain anti-democratic movements? Well, I think this is, of course, problematic, because I couldn't fit all of this in, and this is the problem.
In principle, I would say that there are these variables. I had a really interesting discussion in 2017 with Kyla McDowell at Google Venice Beach, about removing biases from data sets, right? That was the critical point at that time. And this understanding that we can't remove all biases from data sets, because we don't have some kind of supremacist, techno-humanist end goal. And in the end there would be this understanding, or, as Kyla McDowell said really beautifully, that at Google they found themselves "in the manifold." And that understanding of being in the manifold, I think, is also a beautiful analogy to latent space. But to me it was this understanding of: how do we then make these variables modular, so that we can use them? That I, for example, in a very different analogy, would be able to use Google as an elderly Swedish person, or as a frustrated Bangladeshi teenager. How accessible are these different data sets and these different targets? And then, within that type of analysis, because a lot of this GAN stuff is basically an analysis tool for the information that is in the data set: how modular is it, so that I can actually discover and walk around latent space? And I feel there is a very crucial development right now, which is academically also very urgent: to actually study the methods that we have to walk around in latent space, and the tools that we have accessible for academic research
to actually see the failures in early data sets. A lot of data sets are being thrown away, also data sets I worked with; even the data set that I made, I can't find it anymore. I thought that was a symbolic action, but Google is throwing away models, beautiful poetic mistakes that were in those models, and they're throwing them away. And at the same time we're narrowing down this golden path of software interaction with all this AI stuff, and we don't think about, or at least only within the open source community is anyone thinking of, this kind of modular interaction with it. And I think that produces a lot of these funny outcomes. For example, of course, the finger problem was solved in this relationship, but then I made a monument for the finger problem: people that did the special effects for The Matrix and The Hunger Games made a felt, realistic version of the finger problem. So we could call that not eugenics but AIgenics. Any questions from the audience?
So much to talk about, come on! Thank you very much for the talk. So I will out myself: I made a Kazimir Malevich, and Yves Klein blue. [unintelligible] I don't remember how many years it took me to… [unintelligible] Because I'm curious about how much history we can trace just by measuring the in-between time: how long it took for something to become culturally relevant. Because this was one of our utopian premises yesterday. Just by virtue of knowing that the alt-right is currently attacking cultural institutions and universities and so on, we know that there is some potency in working with the production of cultural meaning, regimes of representation, and so on.
So can we trace these moments? I wonder: Malevich, now, this year, probably because he's actually Ukrainian and has now become more historically recognized as such, so there was a little bit more movement. So I kind of get why this was happening. Do you have some examples of that? And what do you make of this kind of proto-art-historical hypothesis, or the possibility of digital art history looking into this periodization of when things become relevant by negatively tracing copyright? Yeah, I think that's a great question.