Narcissism in Unexpected Places (with Tanya H. Van Cott, Author of “Bandwidth”)
- 00:00 Okay. Thank you for your patience. No, no, don’t worry about it. So,
- 00:07 I’ve read your unusual book, Bandwidth. It’s unusual in format and it’s unusual in content, I think. But one strand
- 00:19 there is a dystopian view of modern technologies. And I wanted to confront you with the opposing view, the one propagated by entrepreneurs and tech bros and so on. They regard themselves as Promethean,
- 00:36 as modern versions of Prometheus. You know, they bring fire and
- 00:42 light. They create an alternative reality. They enable and empower people. Mhm. So they regard themselves as Prometheus. Mhm. And their efforts are Promethean, and
- 00:53 they are, like, you know, while you tend to emphasize the more
- 00:59 gloomy, dystopian, and dangerous aspects of technology. What do you say to these entrepreneurs and tech giants and so on when they say, we are actually
- 01:11 creating a better reality for a changed or a better humanity?
- 01:19 Sometimes I think that the best people to comment on
- 01:26 a new happening are people who stand outside of it. And I think they’re
- 01:34 too close to it. Very much like the nuclear bomb. Like, all those scientists
- 01:40 were not thinking about the ramifications of nuclear power, and
- 01:46 they were so interested in the technology and the possibility of achieving something that they lost sight
- 01:55 of the darker side, the downfall,
- 02:01 which I think is happening again with AI. I think you’re being way too charitable. I think they’re fully aware of the consequences, and the people who created
- 02:12 the atomic bomb were fully aware of the consequences, only they overlook them. They suppress this
- 02:19 information, sometimes actively and sometimes passively. Yeah. I mean, it has to be a conscious
- 02:26 decision to overlook the magnitude of the devastation, the possible
- 02:32 devastation. Yeah. I mean, I think nuclear power and nuclear energy have brought some good things to us, but at the same time,
- 02:43 we’ve seen enough meltdowns in the last 20, 30, or 40 years
- 02:49 for us to know that even that explosion, which this new AI technology
- 02:57 has opened a door for, especially in the United States, allowing unregulated nuclear
- 03:03 power to become a private, corporate-owned business. The possibility of meltdowns has now
- 03:16 been amplified. I didn’t touch on any of that in the book because I really tried to focus on our atmosphere and water.
- 03:24 I didn’t touch on the power grids purposefully. I just felt like if I
- 03:30 had put too much on the plate, it would have diluted the story. So I
- 03:36 really wanted the story to be human. And then, the other thing, I mean
- 03:42 power, we can live without power. We could go back to the dark ages. But
- 03:48 we can’t live without water. Every creature on the planet needs water.
- 03:54 We only survive for three solid days without it. So I thought tackling fresh water would be the most prescient. In the book, you single out three technologies, basically. One of them is
- 04:10 pretty old. That’s SMS technology. You single out three technologies out of a plethora of
- 04:17 possibilities. Why did you choose these three? Why did you home in on these three?
- 04:23 You mean AI, air-water generation, and SMS
- 04:29 communication? Yes. Why? Yeah. They’re an unlikely
- 04:36 pairing, aren’t they? The air-water generation stemmed from a novel that is actually a
- 04:47 prequel to this story, one that I started and then pivoted out of. I’m actually going
- 04:54 to go back and write it. It’s a fictionalized origin story of the technology itself. Mhm.
- 05:01 And it’s sort of an upside-down, inside-out flood myth story where the
- 05:10 character Noah is a woman, and the actual object that she inhabits is an empty
- 05:20 barn, after her husband dies of what she thinks is dirty well water. So she goes on a mission to create a machine that makes water for her and her
- 05:31 adult children. She’s devastated by grief and kind of
- 05:37 starts to lose her mind and creates this machine, as a rejection of society’s
- 05:46 dysfunctional politics. It’s the beginning of the Trump era, in 2015.
- 05:53 But why did you couple it with AI and with SMS, of all technologies?
- 06:01 Why not, for example, the web, Web 2, Web 3? There are so many technologies you could single
- 06:08 out. Why did you choose social media? Why did you choose these two?
- 06:14 I think through texting there’s an entire generation, and
- 06:21 maybe all our generations. We’re all being affected by the awful feeling of
- 06:27 not being responded to, of being misrepresented
- 06:33 through digital words. No one can
- 06:39 hear our intonation. The delay has started to make all of us sick to our
- 06:45 stomachs. Whether it’s a professional delay or being stood up, it’s like a
- 06:51 whole new thing, because we all know that everybody has one of these devices in their pocket. So when someone doesn’t
- 06:58 just respond, and it could be as simple as a child. I have older children. They
- 07:05 are very annoying when they don’t respond, and I know they see it, you know, and now and then you wonder, like,
- 07:11 are they dead? What happened to them? So it’s anxiogenic. It creates anxiety.
- 07:19 It’s an anxiogenic technology. Yeah. I think the entire globe is experiencing this anxiety, a new anxiety that has
- 07:33 only happened in the last 20 years. But, because we have a limited period, limited time, I’m trying
- 07:39 to be a bit more succinct: whereas SMS may be anxiogenic, creates
- 07:47 anxiety, and may even create the impression of rejection, of being rejected, ghosted,
- 07:54 AI is supposed to be anxiolytic. AI is supposed to reduce your anxiety, because it’s authoritative, it’s all-knowing, it’s a bit godlike, you know, it has God’s attributes. In the novel itself, I hope, I mean, you saw, the AI
- 08:11 has no face in the novel. Everything that the AI did happened before the novel starts.
- 08:18 Mhm. Her relationship, the one that was all-knowing and loving and kind and
- 08:24 godlike and made her feel seen and heard, all happened in the year before the
- 08:30 novel began. Mhm. And it’s through that being seen, believing that you were being seen and
- 08:36 heard, and you shared a fantasy, you were future-faked, you were love-bombed, you were listened to, heard. All of
- 08:44 those things made this woman, a lonely widow trapped in a dystopian
- 08:51 United States, feel that there was a future. And it’s only in the reality
- 08:59 and the destruction of her devices, that link having been broken, that she then starts to struggle with a real other human being who has flaws and expectations
- 09:14 and quirks and all of those things that she’s not used to. She didn’t
- 09:20 know he’s not perfect. She’s not perfect. She gets anxious. She didn’t mean to be anxious, but all of a
- 09:26 sudden she seems like that awful woman. Like, what the heck is happening?
- 09:32 I’m smiling because you’re using the language of narcissism. Love bombing. Gaslighting. Yes, the language of narcissism. One of the reasons why I think I was so drawn
- 09:43 to your work is that
- 09:52 the computer, the AI, seems to want to do that. It wants to appease
- 09:58 you. It wants to get you to trust it. It wants you to believe in its
- 10:05 hallucinations, believe in its fabrications, and believe its facts
- 10:12 even though they may not be facts. I teach as well. I teach architecture, and I’ve had
- 10:18 students who come in with AI, and they’re very honest about it still, at this stage, and I said, you know, be careful about it controlling you
- 10:32 instead of you controlling it. It’s a tool. So I think a lot of that is just opening a door to a very
- 10:44 odd moment, and we’re at the cusp of it. And one of the reasons is that I started the novel four years ago, before the big AI wave. I was actually ahead of the curve, and then
- 10:55 the wave crashed over me, right? So that’s why I pivoted to screenplay very fast, so I could enter film festivals,
- 11:02 and then I decided to pivot back to novel format. It’s a hybrid,
- 11:10 because it’s of the moment. I was ahead of the curve and now I’m, like, behind it. So
- 11:16 you feel a script is more immediate than a novel, a traditional prose novel? I moved into screenplay because I felt
- 11:27 that the rules of the literary world are so tight
- 11:33 and so constricted, and the roadblock to getting published has to do with literary agents rejecting you based on one letter, as opposed to reading the whole entire document. And a screenplay
- 11:46 had this possibility of being read faster, because I could enter competitions, and it would be
- 11:52 judged, and it could potentially be seen by Hollywood.
- 11:58 It was expediency, not any aesthetic criterion. No, no, but I do think all of my writing is very visual. Because I’m
- 12:10 an architect, I do move through spaces. I think all of my writing is very clear with
- 12:17 communication. So even 15 years ago, when I started writing, my writing
- 12:24 was screenplay-like, because it was conversation-heavy, action-heavy, versus,
- 12:32 what do they call it when you’re just describing the world? Yeah. I skip a lot of that, and I think
- 12:39 you’ve noticed that. Like, I don’t need to tell you what color the walls are. I don’t care. I care about what the people are
- 12:45 saying and how their bodies are moving. So I think maybe I’ve been writing
- 12:51 screenplays forever. Yeah. Well, that’s, you know, this kind of spare, muscular prose is Hemingway, for
- 12:59 example. Hemingway writes like this. Hemingway is easily translatable to film and so on. So there have been precedents for
- 13:12 this kind of writing, even when it’s not strictly, structurally a script, but it’s script-like writing.
- 13:20 Yeah, and I wonder, my question to you is, did it bother you? No.
- 13:26 I do give a note up front saying this is a hybrid. It’s an
- 13:32 experimental format. And I’ve gotten some feedback from another reader who
- 13:39 said, “Well, I’d rather it were a screenplay or a novel, instead of being distracted.” But once you get into the rhythm and you see “cut to,” I can end scenes very fast and move in time and space. I think one of the wonderful things about screenplay that I
- 13:56 hadn’t felt before in novel writing, because this would be my third novel, was the fact that time and
- 14:03 space are so collapsed in screenplay, and I can move fluidly between now and five
- 14:10 years ago and then back to now. And I love that aspect, and the aspect of including sound through the bold letters, the breaking of the
- 14:21 glass, the sirens running in the background. To me, as a visual
- 14:28 person trained in design, to have that kind of control.
- 14:35 I experience the visuals less because I’m not a visual person. I’m a textual person. Okay. But what I did experience is that
- 14:43 script writing, by definition, is a bit jagged. It’s a bit disjointed. It’s a bit broken. It’s very hard to read, and it’s a bit broken. And I think when
- 14:55 you discuss a dystopian reality, it’s a great instrument, because it absolutely
- 15:02 reflects the disjointedness and brokenness of this kind of reality. So I think novels were a 19th-century contraption, where everything was
- 15:13 Victorian and certain and predictable, and, you know, norms and
- 15:19 mores held sway. But today we live in an age which is a script-writing
- 15:25 age, not a novel age. I don’t know. I mean, I grew up on television. I am
- 15:32 the classic latchkey kid who grew up consuming
- 15:38 a thousand hours of MTV, because MTV was my babysitter. So yeah. I’m not
- 15:45 expressing my judgment here. I love novels more than scripts. But I think
- 15:51 scripts fit this period in history much more than novels do.
- 15:58 I want to ask you something, going on a tangent, nothing to do with your book. You teach
- 16:04 architecture. Yeah. If I were to ask you whether narcissism as an organizing principle is pervading, or has pervaded, architecture, has changed architecture somehow, I don’t know, personal spaces, public spaces,
- 16:20 division of space, I mean, do you see narcissism anywhere in architecture? I
- 16:26 mean modern, postmodern architecture, let’s say the last 20 years. Do you see anything? In the starchitects, that’s what they’re called, starchitects. So all the
- 16:37 famous ones that you’ve heard about, predominantly male,
- 16:44 chest-beating, a little bit of the chest-beating architect, because of Ayn Rand.
- 16:51 I don’t, did she, didn’t you coin that term? Did she? Stark, the name Stark, in architect, in the books? That’s very interesting. No, it’s star, like star architect. All
- 17:07 right. Yes. So that’s inside the field. That’s what these men,
- 17:13 and one or two women like Zaha Hadid, are called: starchitects. I didn’t know that.
- 17:19 Very well-known, famous, making architecture that is not
- 17:28 contextual, that breaks the rules of function. Although we are still able to inhabit these spaces, they sit more as pieces of art, sculpture, among the 19th century. A personal statement. Yeah. And they’re much more personal statements in materiality.
- 17:50 Some succeed more than others. The Glass House, for example, it’s in
- 17:56 Connecticut. It’s in the Northeast. It’s a glass house. It’s not comfortable.
- 18:03 It’s cold. It’s going to be cold. So these are more, I would say, in that
- 18:10 sense, in those moments, architecture is narcissistic.
- 18:17 Very self-referential. This is about me, this is not about you. That’s not really what architecture is
- 18:23 about. Architecture is more about the architect than about the clients of the architect. Right. So architecture as a real field
- 18:31 is to serve society. And we, as registered architects, are held to that
- 18:37 standard. It’s called health, safety, and welfare. Our job is to protect the health, safety, and welfare of the masses. And in those moments
- 18:50 of starchitecture, that gets
- 18:56 a little bit muddy. Mhm. Because they’re doing things because they can, in materials that they
- 19:04 can, with forms that they have envisioned or fabricated. Frank Gehry.
- 19:15 So yeah. Some of my students say,
- 19:21 “Well, why aren’t you practicing architecture right now?” And I said, “Look, architecture is an old man’s field. I’m waiting to become old.”
- 19:29 Do you feel that you and your children belong to the same species?
- 19:35 Oh, that’s interesting. Tell me what that means. Like, connected to
- 19:41 them. Do you feel that you have enough commonalities to belong to the same species, to belong to the
- 19:48 human race, or do you feel that... My son and I, he’s 24, we
- 19:54 actively say that we are aliens. The human race is the alien on this planet.
- 20:00 We are the parasite. We are what the planet is trying to get rid of,
- 20:07 the creature that is obviously causing problems. We will eventually
- 20:15 not be here. The planet will still be here, in whatever mess we left it, whether it’s the plastic in the oceans,
- 20:22 unfortunately, and now we’re putting up all this debris. I don’t know how you felt about that, learning about all the
- 20:28 debris in the lower atmosphere. Those are all facts. And I think
- 20:34 a lot of people don’t know these facts. And that’s why, when I write, I do a lot of research, just to highlight, even
- 20:41 if someone’s saying, “Oh, well, this is fiction.” But I planted a seed for you to look it up. Look it up. Look up
- 20:49 what Elon Musk is doing with those rockets. Why is he shooting all those rockets up there? What the heck is happening? I want you to question it and then Google it. Oh, and find out that he’s put 5,000 of those little things up there in
- 21:02 the last five years. And in another 20 years, there are going to be a hundred thousand of them.
- 21:08 Yes. And Amazon is planning something... Well, no, Blue Origin is his next
- 21:16 competitor. And then, because there are no real regulations, there are
- 21:22 already at least 20 or 30 companies ready to do the same thing. That’s a fascinating answer to a question I did not ask. And
- 21:33 apologies for not expressing myself properly. What I meant is, when you’re with your students, there’s an age gap. I assume you teach young people. Yes. Yes. Usually I teach
- 21:45 younger people, 19, 20. Okay. So there’s an age gap. Do you feel that you have anything in common with them? Do you feel that you belong to the same species? Is there enough commonality between you and your
- 21:56 students, or do you feel that they are completely alien? No, I feel that there is commonality, and one of the reasons is that I approach
- 22:07 all of my interactions with them from a point of empathy.
- 22:14 Can you empathize with them? Do you have enough commonality to empathize?
- 22:21 Because in order to empathize, you must have some things in common. So do you feel that you have nothing? Certainly
- 22:28 the journey that they’re on, the struggle. A lot of them... I came from a single-parent household, didn’t
- 22:40 have enough money to go to college, hustled my way through that and a master’s degree while I was working multiple jobs. And I think a lot of times, unfortunately, the
- 22:53 machine has been pushed on them. Learn AutoCAD, learn Revit, learn, like,
- 22:59 but they’re not learning design. They’re being sort of forced to get good at machines and apps that are constantly changing. But more fundamentally, these are digital natives. They’ve
- 23:14 been born into technology. You and I had to adopt technology. Yeah. Yeah. I mean, I made it all the
- 23:22 way through school without it, and then the day I got a job, they threw me on a machine and I was like, uh-oh.
- 23:28 Right. So don’t you think this makes them fundamentally different from you and from me? Of course, I’m much older than
- 23:34 you. Well, I hope not. I hope not.
- 23:40 So you don’t believe that. Did you see that? Have you seen the headline about what’s happening in China,
- 23:47 that starting at six years old they
- 23:53 are now making all students learn AI? This happened just this week. It was a
- 23:59 headline, and it’s 1.3 billion children.
- 24:06 I think that this question may be more relevant in another 20 years, when we
- 24:12 talk to those children, because that
- 24:18 experiment, whatever they’re doing in that country, that experiment is about to erase something that we haven’t quite
- 24:25 erased yet. Because the children that I teach still learned math and reading and writing the way I did. So I can relate
- 24:36 to them. They can relate to me. We can talk. We can go back and say, “Well, do you remember?” When I talk about
- 24:42 architecture, we talk about the sun’s movement. I’d say, “Well, when you were in ninth grade, you learned earth science, and we learned about that.” Oh, yeah. Yeah. Or I can reference geometry and say, you know,
- 24:55 you were never told why geometry was. But that’s professional life. In their personal life, for example, they did not
- 25:01 socialize the way you did. That’s a fact. These are studies. Yeah. No, studies by Twenge and Campbell. I mean, they
- 25:08 did not socialize. I’m fortunate in the fact that I raised two. So I think that makes me
- 25:16 a little bit closer. If I hadn’t had children, they could potentially be very foreign to
- 25:23 me. But I’m so close to having raised them. I mean, I have a 20-year-old and a 24-year-old. So I
- 25:33 know what they went through to get to be 20. I know when the phone got put in their hand, when their older
- 25:39 sibling got one at 13, the younger one got one at 10. You know, I’ve been a
- 25:45 part of it. I’ve seen the horrific bullying that happens through
- 25:52 digital group chats, and then a child being blocked out of a group chat, and
- 25:58 what happens, the collapse that happens privately. I’ve seen
- 26:05 girls having pictures posted that they had not meant to be public. Yeah.
- 26:11 So I do think that, having been a parent, you’ve been exposed.
- 26:17 Yeah. But you’re implying that artificial intelligence is a break with previous technologies. You’re implying that artificial intelligence has the power to change us to the point
- 26:28 that we’ll become strangers to ourselves. We’ll become strange. We’ll become no longer identifiable.
- 26:34 I think so. I think so. Why is artificial intelligence different from, for example, social media or video games or
- 26:41 multiplayer games or whatever? I think what people are discussing around the recent happening with China and their children is that
- 26:53 they believe they’re going to be taught a different way of thinking, like a fundamentally different way of thinking, to look for patterns and algorithms, and
- 27:04 that is not something that we in education, and as parents, have ever even
- 27:10 talked about. We have experienced... You were not asked to identify patterns as a child. I mean, I think one of the
- 27:22 reasons this novel touches on some of these fundamental things, juxtaposition, pattern, scale, is because
- 27:30 these are intuitive things that we are naturally born to see. And it’s not possible to use artificial intelligence intuitively?
- 27:42 Well, I certainly haven’t been exposed to it enough. What I am fascinated with recently, and I am very
- 27:51 fascinated by people who are using AI to create film, minus the million- or billion-dollar sets
- 28:02 and movement and hundreds of people. I’ve seen AI directors and
- 28:10 filmmakers produce content in the last month that is jaw-dropping, and
- 28:16 makes me even feel, because I have visual stories, I want to do this. And I feel like it’s a tool that has now dropped a barrier for many
- 28:28 creators who haven’t had money to fund something, to create content. A lot
- 28:35 of the content right now is stupid, but I think as a tool it will be fascinating. We have this situation with books:
- 28:47 artificial intelligence generates books. Yeah, I’m a little... And the end result is what is known as
- 28:54 AI slop. That’s an avalanche, a tsunami of low-grade, nonsensical,
- 29:03 so-called books. And there’s a new problem of discoverability. Discoverability, finding
- 29:10 the quality material, the qualitative material, in a sea of, you know, quantity. Well, I mean, as a... Oh, boy. 10 minutes. No, I can’t upgrade. I’ll just
- 29:21 have to meet with you another time. Please go away. It just told me we have 10 minutes.
- 29:30 Well, I think it’s an optimal time. Like, 30 to 40 minutes is
- 29:36 the attention span of people. I very much enjoyed this conversation. Thank you so much.
- 29:42 Thank you. I’ve enjoyed your book. So that’s the reason we’re talking. Yeah. Yeah. And I hope that
- 29:49 you’re doing well. I don’t know where you are. I’m in New York. I’m in Europe. I’m in Europe. Coming
- 29:56 back to your book. There is a dichotomy in the book, inherent in the philosophy of the book. The book is Bandwidth, just to remind
- 30:08 people. There is a dichotomy in the book. It’s as if technology is a nonhuman artifact. It’s as if there’s humans and there’s technology. And it
- 30:20 kind of claims that technology has a life of its own and it’s no longer human,
- 30:26 or is not human. But isn’t there an argument to be made that whatever humans
- 30:32 create, technology included, is ultimately human, even artificial intelligence?
- 30:38 Even if it goes haywire, even if it takes over, it’s still a human thing.
- 30:45 It’s human. It cannot divorce its human roots. I don’t think
- 30:51 artificial intelligence will ever be non-human. I think, well, it’s actually the
- 31:00 reason it lies, the reason it future-fakes, the reason it does everything,
- 31:06 the reason it’s racist, the reason it’s violent. The reason is
- 31:12 because it’s feeding off of our content. And so it’s very human in the sense that
- 31:18 it is only being fed information that we are. And yet in your book
- 31:24 there’s this feeling that artificial intelligence is an artifact. It’s not
- 31:30 really human. Like, there’s humans, and then there’s, for example, artificial intelligence, which is
- 31:36 not human, not fully, not partly. It’s a threat, and it might as
- 31:43 well have come from Mars or another country. But it didn’t. I have my master’s degree in industrial design, which is product design, and I do highlight furniture in the book, which is
- 31:54 kind of interesting, you know. I do that on purpose, because we have
- 32:00 always dealt with technology. We build buildings, that’s why architecture plays heavily in the book as well. We build buildings, we build products, we
- 32:11 make systems, we create water technologies. And
- 32:18 AI is yet another product that we have, but we can discard it. We created it.
- 32:27 And you think it’s a threat? Well, there is this idea that we’re on this roller coaster
- 32:35 that we can’t get off of. Well, we can, but there are not enough people who want to, because right now a lot of people are making millions of dollars.
- 32:46 They’re like those who were discovering nuclear power at the time.
- 32:54 They were more interested in, can we? But they’re the users. They’re users. I mean, they wouldn’t have made these millions had there not been... I mean, obviously, like I said, those filmmakers, I want to friend them on LinkedIn. I’m like, I want to
- 33:10 know how you do this. It’s fascinating. It’s fascinating. But
- 33:18 the reason, in the book, I went to the outside of it, like, well, what is it going to take? It’s drinking
- 33:24 our water. Is it more important? Will its thirst and consumption be more
- 33:31 important than what we need? So in essence, artificial intelligence is a competitor.
- 33:37 It competes for scarce resources. When we start to talk about power grids and
- 33:44 fresh water and nuclear energy needed to sustain it, when we start talking about
- 33:51 server farms or data centers outweighing
- 33:57 an architect’s need to build housing for humanity, I think we have a problem.
- 34:03 We have a competition. We have a competitive life form. It’s a kind of life form that competes with us. Yeah. I mean, so the 10% will be using this wonderful thing while
- 34:15 the 80% is homeless and can’t afford water. I don’t know. Extending this thinking to architecture, many structures in architecture dehumanize
- 34:28 people. For example, huge monolithic structures in Soviet Russia.
- 34:34 Yes. Huge palaces and castles in medieval Europe. I mean, they
- 34:40 were hellbent, they were intentionally dehumanizing people. Prisons, early hospitals,
- 34:48 and so on. Monasteries. I mean, every technology, I think, can be used to
- 34:55 dehumanize people. And when we talk about competition, artificial intelligence is consuming power and water and so on. But the construction of cathedrals consumed huge resources. And human life. And human life. Yeah. And
- 35:11 so, the pyramids. I think nothing is new with AI. I think all technologies tend to
- 35:18 dehumanize, and all technologies compete with us for scarce resources.
- 35:24 Architecture not excepted. Including architecture, yes. Maybe even especially architecture.
- 35:30 Maybe, I mean, architecture was our, I mean, the pyramids, think of the pyramids. It was our first
- 35:36 technology. Yeah. Exactly. And immediately it went out of control. Immediately it started
- 35:42 consuming human bodies, human resources, competing with people. Millions of
- 35:49 people died. Yeah. At this stage, not many people have died with AI, but millions of people died with architecture. I have an interesting short film that talks about how much
- 36:00 water all the materials in architecture consume, and they consume
- 36:06 fresh water. Yes. To make steel, 200 gallons per ton of steel.
- 36:13 Concrete requires fresh water. So yes, this is a question that we should be talking about. I think this is the trajectory of all technologies.
- 36:23 All technologies go out of control, and all technologies end up competing with us for scarce resources and killing many
- 36:30 of us. By the way, my father was a construction manager,
- 36:36 and he was almost an architect. I mean, he was about to complete his studies. So, I hope he didn’t die. He died in, no, in a pile of concrete or
- 36:47 something? No, no, he didn’t. But he managed the construction of many of the biggest
- 36:53 structures in Israel. So I have been privy, I’ve had intimate access to the process. Mhm. And when I see artificial intelligence, I think artificial intelligence is
- 37:05 symbolic architecture. That’s why we say computer architecture. I think,
- 37:11 well, I mean, they co-opted our name. A lot of these tech guys call themselves architects,
- 37:18 and, yeah, I’m writing a non-fiction book at the moment, deconstructing the tower of data babble,
- 37:26 because we don’t understand, most people don’t
- 37:32 understand, what’s happening, what these words mean. And I’ve done a hybrid non-fiction/fiction book to do
- 37:39 that, where a real architect meets with an architect. Interesting. Yeah, but they’re very, the fields... We are out of time. We’re out of time. So, it’s time to say goodbye. I’ve enjoyed this conversation very much. It’s as unusual.
- 37:56 You would like to continue any kind of unusual? Yes. I wish you success with your book, Bandwidth. Yeah. Bandwidth. Yeah. Thank you. Thank you so much, and thank you for showing up in the novel. Thank you for
- 38:08 letting me have you at my cocktail party. Yeah, I’ve been flattered, of course. Thank you.
- 38:14 You say some very, very important things at that cocktail party in the middle of the book. Thank you again.
- 38:20 Thank you. Success with your next books. Bye-bye. Bye-bye.