Narcissism in Unexpected Places (with Tanya H. Van Cott, Author of “Bandwidth”)

Summary

The conversation explored the dystopian themes of the book *Bandwidth*, focusing on technology’s impact on humanity, particularly AI, SMS communication, narcissism, and air-water generation, while contrasting these with the optimistic views held by tech entrepreneurs. The speakers discussed the human-nonhuman dichotomy in technology, the anxieties provoked by digital communication, and the challenges AI poses as a competitor for scarce resources like water and power. Additionally, the dialogue touched on architecture’s narcissistic tendencies, generational divides in technology use, and the evolving literary forms that fit modern fragmented realities.


  1. 00:00 Okay. Thank you for your patience. No, no, don’t worry about it. So, uh,
  2. 00:07 I’ve read your unusual book, *Bandwidth*. It’s unusual in format and unusual in content, I think. But one strand
  3. 00:19 there is a dystopian view of modern technologies. And I wanted to confront you with the opposing view, the one propagated by entrepreneurs and tech bros and so on. They regard themselves as Promethean,
  4. 00:36 a modern version of Prometheus. You know, they bring fire and
  5. 00:42 light. They create an alternative reality. They enable and empower people. Mhm. So they regard themselves as Prometheus. Mhm. Their efforts are Promethean,
  6. 00:53 while you tend to emphasize the more
  7. 00:59 gloomy, dystopian, and dangerous aspects of technology. What do you say to these entrepreneurs and tech giants when they say, we are actually
  8. 01:11 creating a better reality for a changed, or a better, humanity?
  9. 01:19 Sometimes I think that the best people to comment on
  10. 01:26 a new happening are people who stand outside of it. And I think they’re
  11. 01:34 too close to it. Very much like the nuclear bomb. All those scientists
  12. 01:40 were not thinking about the ramifications of nuclear power;
  13. 01:46 they were so interested in the technology and the possibility of achieving something that they lost sight
  14. 01:55 of the darker side, the downfall.
  15. 02:01 Which I think is happening again with AI. I think you’re being way too charitable. I think they’re fully aware of the consequences, and the people who created
  16. 02:12 the atomic bomb were fully aware of the consequences; they just overlooked them. They suppress this
  17. 02:19 information, sometimes actively and sometimes passively. Yeah. I mean, it has to be a conscious
  18. 02:26 decision to overlook the magnitude of the devastation, the possible
  19. 02:32 devastation. Yeah. I mean, I think nuclear power and nuclear energy have brought some good things to us, but at the same time,
  20. 02:43 we’ve seen enough meltdowns in the last 20, 30, or 40 years
  21. 02:49 for us to know that even that kind of explosion is a risk. And this new AI technology
  22. 02:57 has opened a door, especially in the United States, for unregulated nuclear
  23. 03:03 power to become a private business of corporate-owned entities. The possibility of meltdowns has now been
  24. 03:16 amplified. I didn’t touch on any of that in the book because I really tried to focus on our atmosphere and water.
  25. 03:24 I didn’t touch on the power grids, purposefully. I just felt that if I
  26. 03:30 had put too much on the plate, it would have diluted the story. So, I
  27. 03:36 really wanted the story to be human. And then, about the only thing, I mean,
  28. 03:42 power, we can live without power. We could go back to the dark ages. But
  29. 03:48 we can’t live without water. Every creature on the planet needs water.
  30. 03:54 We only survive for three solid days without it. So I thought tackling fresh water would be the most prescient. In the book you single out three technologies, basically. One of them is
  31. 04:10 pretty old: that’s SMS technology. You single out three technologies out of a plethora of
  32. 04:17 possibilities. Why did you choose these three? Why did you home in on these three?
  33. 04:23 Uh, you mean AI? AI, air-water generation, and SMS
  34. 04:29 communication. Yes. Why? Yeah. They’re an unlikely
  35. 04:36 pairing, aren’t they? The air-water generation stemmed from a novel that is actually a
  36. 04:47 prequel to this story, one that I started and then pivoted out of. I’m actually going
  37. 04:54 to go back and write it. It’s a fictionalized origin story of the technology itself. Mhm.
  38. 05:01 And it’s sort of an upside-down, inside-out flood myth story, where the
  39. 05:10 character Noah is a woman, and the actual object that she inhabits is an empty
  40. 05:20 barn, after her husband dies of what she thinks is dirty well water. So, she goes on a mission to create a machine that creates water for her and her
  41. 05:31 adult children. She’s devastated by grief, kind of
  42. 05:37 starts to lose her mind, and creates this machine, as a rejection of society’s
  43. 05:46 dysfunctional politics. It’s the beginning of the Trump era, in 2015.
  44. 05:53 But why did you couple it with AI and with SMS, of all technologies?
  45. 06:01 Why not, for example, with the web, web 2, web 3? There are so many technologies you could single
  46. 06:08 out. Why did you choose social media? Why did you choose these two?
  47. 06:14 I think through texting there’s an entire generation, and
  48. 06:21 maybe all our generations. We’re all being affected by the awful feeling of
  49. 06:27 not being responded to, of being misrepresented
  50. 06:33 through digital words. No one can
  51. 06:39 hear our intonation. The delay has started to make all of us sick to our
  52. 06:45 stomach. Whether it’s a professional delay or being stood up, it’s a
  53. 06:51 whole new thing, because we all know that everybody has one of these devices in their pocket. So when someone doesn’t
  54. 06:58 just respond, and it could be as simple as a child. I have older children. They
  55. 07:05 are very annoying when they don’t respond, and I know they see it, you know, and now and then you wonder,
  56. 07:11 are they dead? What happened to them? So it’s anxiogenic. It creates anxiety.
  57. 07:19 It’s an anxiogenic technology. Yeah. I think the entire globe is experiencing this anxiety, a new anxiety that has
  58. 07:33 only existed in the last 20 years. But whereas SMS is, because we have a limited time, I’m trying
  59. 07:39 to be a bit more succinct: whereas SMS may be anxiogenic, creates
  60. 07:47 anxiety, and may even create the impression of rejection, of being rejected, ghosted,
  61. 07:54 AI is supposed to be anxiolytic. AI is supposed to reduce your anxiety, because it’s authoritative, all-knowing, a bit godlike, you know; it has God’s attributes. In the novel itself, I hope, I mean, you saw, the AI
  62. 08:11 has no face. Everything that the AI did happened before the novel starts.
  63. 08:18 Mhm. Her relationship, the one that was all-knowing and loving and kind and
  64. 08:24 godlike and made her feel seen and heard, all happened in the year before the
  65. 08:30 novel began. Mhm. And it’s through believing that you were being seen and
  66. 08:36 heard, and you shared a fantasy, you were future-faked, you were love-bombed, you were listened to, heard. All of
  67. 08:44 those things made this woman, a lonely widow trapped in a dystopian
  68. 08:51 United States, feel that there was a future. And it’s only in reality,
  69. 08:59 in the destruction of her devices, with that link having been broken, that she then starts to struggle with a real other human being who has flaws and expectations
  70. 09:14 and quirks and all of those things that she’s not used to. She didn’t
  71. 09:20 know. He’s not perfect. She’s not perfect. She gets anxious. She didn’t mean to be anxious, but all of a
  72. 09:26 sudden she seems like that awful woman. Like, what the heck is happening?
  73. 09:32 I’m smiling because you’re using the language of narcissism. Love bombing. Um, gaslighting. Yes. The language of narcissism. One of the reasons I think I was so drawn
  74. 09:43 to your work is because
  75. 09:52 the computer, the AI, seems to want to do that. It wants to appease
  76. 09:58 you. It wants to get you to trust it. It wants you to believe in its
  77. 10:05 hallucinations, believe in its fabrications, and believe its facts,
  78. 10:12 even though they may not be facts. I teach as well, and I’ve had
  79. 10:18 students, I teach architecture, and I have had students who come in with AI, and they’re very honest about it still at this stage, and I said, you know, be careful about it controlling you
  80. 10:32 instead of you controlling it. It’s a tool. So I think a lot of that is just opening a door to a very
  81. 10:44 odd moment, and we’re at the cusp of it. And one of the reasons is I started the novel four years ago, before the big AI wave. I was actually ahead of the curve, and then
  82. 10:55 the wave crashed over me, right? So that’s why I pivoted to screenplay very fast, so I could enter film festivals,
  83. 11:02 and then I decided to pivot back to novel format. It’s a hybrid.
  84. 11:10 Because it’s of the moment. I was ahead of the curve and now I’m, like, behind it. So
  85. 11:16 you feel a script is more immediate than a traditional prose novel? I moved into screenplay because I felt
  86. 11:27 that the rules of the literary world are so tight
  87. 11:33 and so constricted, and the roadblock to getting published has to do with literary agents rejecting you based on one letter as opposed to reading the whole entire document. And a screenplay
  88. 11:46 had this possibility of being read faster, because I could enter competitions, and they would be
  89. 11:52 judged, and it could potentially be seen by Hollywood.
  90. 11:58 It was expediency, not any aesthetic criterion. No, but I do think all of my writing is very visual. Because I’m
  91. 12:10 an architect, I do move through spaces. I think all of my writing is very clear with
  92. 12:17 communication. So even 15 years ago, when I started writing, my writing
  93. 12:24 was screenplay-like, because it was conversation-heavy, action-heavy, versus,
  94. 12:32 what do they call it when you’re just describing the world? Yeah. I skip a lot of that, and I think
  95. 12:39 you’ve noticed that. I don’t need to tell you what color the walls are. I don’t care. I care about what the people are
  96. 12:45 saying and how their bodies are moving. So I think maybe I’ve been writing
  97. 12:51 screenplays forever. Yeah. Well, you know, this kind of spare, muscular prose is Hemingway, for
  98. 12:59 example. Hemingway writes like this. Hemingway is easily translatable to film, and so on. So there have been precedents for
  99. 13:12 this kind of writing, even when it’s not strictly, structurally, a script, but it’s script-like writing.
  100. 13:20 Yeah, and I wonder, my question to you is, did it bother you? No.
  101. 13:26 I do give a note up front saying this is a hybrid. It’s an
  102. 13:32 experimental format. And I’ve gotten some feedback from another reader who
  103. 13:39 said, “Well, I’d rather it was a screenplay or a novel, instead of being distracted.” But once you get into the rhythm and you see “cut to,” I can end scenes very fast and move in time and space. I think one of the wonderful things about screenplay that I
  104. 13:56 hadn’t felt before in novel writing, because this would be my third novel, was the fact that time and
  105. 14:03 space are so collapsed in screenplay, and I can move fluidly between now and five
  106. 14:10 years ago and then back to now. And I love that aspect, and the aspect of including sound, through the bold letters, the breaking of the
  107. 14:21 glass, the sirens running in the background. That, to me, as a visual
  108. 14:28 person trained in design, to have that kind of control...
  109. 14:35 I experience the visuals less, because I’m not a visual person. I’m a textual person. Okay. But what I did experience is that
  110. 14:43 script writing, by definition, is a bit jagged. It’s a bit disjointed. It’s a bit broken. It’s very hard to read. And I think when
  111. 14:55 you discuss a dystopian reality, it’s a great instrument, because it absolutely
  112. 15:02 reflects the disjointedness and brokenness of this kind of reality. So I think novels were a 19th-century contraption, where everything was
  113. 15:13 Victorian and certain and predictable, and, you know, norms and
  114. 15:19 mores held sway. But today we live in an age which is a script-writing
  115. 15:25 age, not a novel age. I don’t know. I mean, I grew up on television. I am
  116. 15:32 the classic latchkey kid who grew up consuming
  117. 15:38 a thousand hours of MTV, because MTV was my babysitter. So yeah. I’m not
  118. 15:45 expressing my judgment here. I love novels more than scripts. But I think
  119. 15:51 scripts fit this period in history much more than novels do.
  120. 15:58 I want to ask you something, going on a tangent, nothing to do with your book. You teach
  121. 16:04 architecture. Yeah. If I were to ask you whether narcissism as an organizing principle is pervading, or has pervaded, architecture, has changed architecture somehow, I don’t know, personal spaces, public spaces,
  122. 16:20 division of space, I mean, do you see narcissism anywhere in architecture? I
  123. 16:26 mean modern, postmodern architecture, let’s say the last 20 years. Do you see anything? Uh, in the starchitects. That’s what they’re called, starchitects. So all the
  124. 16:37 famous ones that you’ve heard about, predominantly male,
  125. 16:44 chest-beating, a little bit of chest-beating architect. Because of Ayn Rand,
  126. 16:51 didn’t she coin that term? Roark, the name Roark, the architect in the books? That’s very interesting. No, it’s “star,” like star architect: starchitect. All
  127. 17:07 right. Yes. So that’s inside the field. That’s what these men,
  128. 17:13 and one or two women, like Zaha Hadid, are called: starchitects. I didn’t know that.
  129. 17:19 Very well-known, famous, making architecture that is not
  130. 17:28 contextual, that breaks the rules of function. We are still able to inhabit these spaces, but they sit more as pieces of art, as sculpture among the 19th century, a personal statement. Yeah. And they’re much more personal statements, in materiality.
  131. 17:50 Some succeed more than others. The Glass House, for example. It’s in
  132. 17:56 Connecticut, in the northeast. It’s a glass house. It’s not comfortable.
  133. 18:03 It’s cold. It’s going to be cold. So these are more, I would say, in that
  134. 18:10 sense, in those moments, architecture is narcissistic.
  135. 18:17 Very self-referential. This is about me; this is not about you. That’s not really what architecture is
  136. 18:23 about. Architecture is more about the architect than about the clients of the architect. Right? So architecture as a real field
  137. 18:31 is to serve society. And we, as registered architects, are held to that
  138. 18:37 standard. It’s called health, safety, and welfare. Our job is to protect the health, safety, and welfare of the masses. And in those moments
  139. 18:50 of starchitecture, that gets
  140. 18:56 a little bit muddy. Mhm. Because they’re doing things because they can, in materials that they
  141. 19:04 can, with forms that they have envisioned or fabricated. Frank Gehry.
  142. 19:15 So yeah, I would say those are some. Some of my students say,
  143. 19:21 “Well, why aren’t you practicing architecture right now?” I said, “Look, architecture is an old man’s field. I’m waiting to become old.”
  144. 19:29 Do you feel that you and your children belong to the same species?
  145. 19:35 Oh, that’s interesting. Tell me what that means. Like, connected to
  146. 19:41 them. Do you feel that you have enough commonalities to belong to the same species, to belong to the
  147. 19:48 human race? Or do you feel that... My son and I, he’s 24, we
  148. 19:54 actively say that we are aliens. The human race is the alien on this planet.
  149. 20:00 We are the parasite. We are what the planet is trying to get rid of,
  150. 20:07 the creature that is obviously causing problems. We will eventually
  151. 20:15 not be here. The planet will still be here, in whatever mess we left it, whether it’s the plastic in the oceans,
  152. 20:22 unfortunately, or the fact that now we’re putting up all this debris. I don’t know how you felt about that, learning about all the
  153. 20:28 debris in the lower atmosphere. Those are all facts. And I think
  154. 20:34 a lot of people don’t know these facts. And that’s why, when I write, I do a lot of research, just to highlight, even
  155. 20:41 if someone’s saying, “Oh, well, this is fiction.” But I planted a seed for you to look it up. Look it up. Look up
  156. 20:49 what Elon Musk is doing with those rockets. Why is he shooting all those rockets up there? What the heck is happening? I want you to question it and then Google it. Oh, and find out that he’s put 5,000 of those little things up there in
  157. 21:02 the last five years. And in another 20 years, there are going to be a hundred thousand of them.
  158. 21:08 Yes. And Amazon is planning some... Well, no. Blue Origin is his next
  159. 21:16 competitor. And then, because there are no real regulations, there
  160. 21:22 are already at least 20 or 30 companies ready to do the same thing. That’s a fascinating answer to a question I did not ask. And
  161. 21:33 apologies for not expressing myself properly. What I meant is, when you’re with your students, there’s an age gap. I assume you teach young people. Yes. Usually I teach
  162. 21:45 younger people, 19, 20. Okay. So there’s an age gap. Do you feel that you have anything in common with them? Do you feel that you belong to the same species? Is there enough commonality between you and your
  163. 21:56 students, or do you feel that they are completely alien? No, I feel that there is commonality, and one of the reasons is that I approach
  164. 22:07 all of my interactions with them from a point of empathy.
  165. 22:14 I am... Can you empathize with them? Do you have enough commonality to empathize?
  166. 22:21 Because in order to empathize, you must have some things in common. So do you feel that you have... Certainly
  167. 22:28 the journey that they’re on, the struggle. A lot of them... I came from a single-parent household, didn’t
  168. 22:40 have enough money to go to college, hustled my way through that and a master’s degree while I was working multiple jobs. And I think a lot of times, unfortunately, the
  169. 22:53 machine has been pushed on them: learn AutoCAD, learn Revit, learn...
  170. 22:59 But they’re not learning design. They’re being sort of forced to get good at machines and apps that are constantly changing. But more fundamentally, these are digital natives. They’ve
  171. 23:14 been born into technology. You and I had to adopt technology. Yeah. I mean, I made it all the
  172. 23:22 way through school without it, and then the day I got a job, they threw me on a machine, and I was like, uh-oh.
  173. 23:28 Right. So don’t you think this makes them fundamentally different to you and to me? Of course, I’m much older than
  174. 23:34 you. Well, I hope not. I hope not.
  175. 23:40 So, you don’t believe that. Have you seen the headline about what’s happening in China,
  176. 23:47 that starting at six years old they
  177. 23:53 are now making all students learn AI? This happened just this week. It was a
  178. 23:59 headline, and it’s 1.3 billion children.
  179. 24:06 I think that this question may be more relevant in another 20 years, when we
  180. 24:12 talk to those children, because
  181. 24:18 that experiment, whatever they’re doing in that country, is about to erase something that we haven’t quite
  182. 24:25 erased yet. Because the children that I teach still learned math and reading and writing the way I did. So I can relate
  183. 24:36 to them. They can relate to me. We can go back and say, “Well, do you remember, when I talk about
  184. 24:42 architecture, we talk about the sun’s movement.” I say, “Well, when you were in ninth grade, you learned earth science, and we learned about that.” Oh, yeah. Or I can reference geometry and say, you know,
  185. 24:55 you were never told why geometry was... But that’s the professional life. In their personal life, for example, they did not
  186. 25:01 socialize the way you did. That’s a fact. These are studies. Yeah. No, studies by Twenge and Campbell. I mean, they
  187. 25:08 did not socialize. I’m fortunate in the fact that I raised two. So I think that makes me
  188. 25:16 a little bit closer. If I hadn’t had children, they could potentially be very foreign to
  189. 25:23 me. But I’m so close to having raised them. I mean, I have a 20-year-old and a 24-year-old. So I
  190. 25:33 know what they went through to get to be 20. I know when the phone got put in their hand, when their older
  191. 25:39 sibling got one at 13, and the younger one got one at 10. You know, I’ve been a
  192. 25:45 part of it. I’ve seen the horrific bullying that happens through
  193. 25:52 digital group chats, and then a child being blocked out of a group chat, and
  194. 25:58 what happens, the collapse that happens, privately. I’ve seen
  195. 26:05 girls having pictures posted that they had not meant to be public. Yeah.
  196. 26:11 So I do think that, having been a parent, you’ve been exposed.
  197. 26:17 Yeah. But you’re implying that artificial intelligence is a break with previous technologies. You’re implying that artificial intelligence has the power to change us to the point
  198. 26:28 that we’ll become strangers to ourselves. We’ll become strange. We’ll become no longer identifiable.
  199. 26:34 I think so. I think so. Why is artificial intelligence different to, for example, social media or video games or
  200. 26:41 multiplayer games or whatever? I think what people are discussing, around the recent happening with China and their children, is that
  201. 26:53 they believe they’re going to be taught a different way of thinking, a fundamentally different way of thinking, to look for patterns and algorithms, and
  202. 27:04 that is not something that we, in education, and as parents, have ever even
  203. 27:10 talked about. You were not asked to identify patterns as a child. I mean, I think one of the
  204. 27:22 reasons this novel touches on some of these fundamental things, juxtaposition, pattern, scale, is because
  205. 27:30 that’s intuitive. These are intuitive things that we are naturally born to see, and it’s not possible to use artificial intelligence intuitively.
  206. 27:42 Well, I certainly haven’t been exposed to it enough. What I am fascinated with recently, I am very
  207. 27:51 fascinated by people who are using AI to create film, minus the million- or billion-dollar sets
  208. 28:02 and movement and hundreds of people. And I’ve seen AI directors and
  209. 28:10 filmmakers produce content in the last month that is jaw-dropping and
  210. 28:16 makes me even feel like, because I have visual stories, I want to do this. And I feel like it’s a tool that has now dropped a barrier for many
  211. 28:28 creators who haven’t had money to fund something, to create content. A lot
  212. 28:35 of the content right now is stupid, but I think, as a tool, it will be fascinating. We have this situation with books:
  213. 28:47 artificial intelligence generates books. Yeah, I’m a little... And the end result is what is known as
  214. 28:54 AI slop. That’s an avalanche, a tsunami, of low-grade, nonsensical,
  215. 29:03 so-called books. And there’s a new problem of discoverability: finding
  216. 29:10 the quality material in a sea of, you know, quantity. Well, I mean, as a... Oh, boy. Ten minutes. No, I can’t upgrade. I’ll just
  217. 29:21 have to meet with you another time. Please go away. It just told me we have ten minutes.
  218. 29:30 Well, I think it’s an optimal time. Thirty to forty minutes is the
  219. 29:36 attention span of people. I very much enjoyed this conversation. Thank you so much.
  220. 29:42 Thank you. I’ve enjoyed your book. That’s the reason we’re talking. Yeah. And I hope that
  221. 29:49 you’re doing well. I don’t know where you are. I’m in New York. I’m in Europe. Coming
  222. 29:56 back to your book: there is a dichotomy in the book, inherent in the philosophy of the book. The book is *Bandwidth*, just to remind
  223. 30:08 people. There is a dichotomy in the book. It’s as if technology is a nonhuman artifact, as if there are humans and there is technology. And it
  224. 30:20 kind of claims that technology has a life of its own, and it’s no longer human,
  225. 30:26 or is not human. But isn’t there an argument to be made that whatever humans
  226. 30:32 create, technology included, is ultimately human, even artificial intelligence?
  227. 30:38 Even if it goes haywire, even if it takes over, it’s still a human thing. It’s
  228. 30:45 human. It cannot divorce its human roots. I don’t think
  229. 30:51 artificial intelligence will ever be non-human. I think, well, it’s actually the
  230. 31:00 reason it lies, the reason it future-fakes, the reason it does everything,
  231. 31:06 the reason it’s racist, the reason it’s violent:
  232. 31:12 it’s because it’s feeding off of our content. And so it’s very human, in the sense that
  233. 31:18 it is only being fed information that comes from us. And yet in your book
  234. 31:24 there’s this feeling that artificial intelligence is an artifact. It’s not
  235. 31:30 really human. Like, there are humans, and then there is, for example, artificial intelligence, which is
  236. 31:36 not human, not fully, not partly. It is not human. It’s a threat, and it might as
  237. 31:43 well have come from Mars, or another country. But it didn’t. I have my master’s degree in industrial design, which is product design, and I do highlight furniture in the book, which is
  238. 31:54 kind of interesting, you know. I do that on purpose, because we have
  239. 32:00 always dealt with technology. We build buildings, that’s why architecture plays heavily in the book as well. We build buildings, we build products, we
  240. 32:11 make systems, we create water technologies. And
  241. 32:18 AI is yet another product that we have, but we can discard it. We created it.
  242. 32:27 And you think it’s a threat? Well, there is this idea that we’re on this roller coaster
  243. 32:35 that we can’t get off of. Well, we can, but there are not enough people who want to, because right now a lot of people are making millions of dollars,
  244. 32:46 like those who were discovering nuclear power at the time.
  245. 32:54 They were more interested in “can we.” But they’re the users. They’re users. I mean, they wouldn’t have made these millions had there not been... I mean, obviously, like I said, those filmmakers, I want to friend them on LinkedIn. I’m like, I want to
  246. 33:10 know how you do this. It’s fascinating. It’s fascinating. But
  247. 33:18 the reason, in the book, I went to the outside of it: like, what is it going to take? It’s drinking
  248. 33:24 our water. Is it more important? Will its thirst and consumption be more
  249. 33:31 important than what we need? So in essence, artificial intelligence is a competitor.
  250. 33:37 It competes for scarce resources. When we start to talk about power grids and
  251. 33:44 fresh water and the nuclear energy needed to sustain it, when we start talking about
  252. 33:51 server farms, or data centers, outweighing
  253. 33:57 an architect’s need to build housing for humanity, I think we have a problem.
  254. 34:03 We have a competition. We have a competitive life form. It’s a kind of life form that competes with us. Yeah. I mean, so the 10% will be using this wonderful thing while
  255. 34:15 the 80% is homeless and can’t afford water. I don’t know. Extending this thinking to architecture: many structures in architecture dehumanize
  256. 34:28 people. For example, huge monolithic structures in Soviet Russia.
  257. 34:34 Yes. Huge palaces and castles in medieval Europe. I mean, they
  258. 34:40 were hell-bent on, they were intentionally, dehumanizing people. Prisons, early hospitals,
  259. 34:48 and so on. Monasteries. I mean, every technology, I think, can be used to
  260. 34:55 dehumanize people. And when we talk about competition: artificial intelligence is going to consume, is consuming, power and water and so on. But the construction of cathedrals consumed huge resources. And human life. And human life. Yeah. And
  261. 35:11 the pyramids, the pyramids were... I think nothing is new with AI. I think all technologies tend to
  262. 35:18 dehumanize, and all technologies compete with us for scarce resources.
  263. 35:24 Architecture not excepted. Including architecture. Yes. Maybe even especially architecture.
  264. 35:30 I mean, architecture was, the pyramids, think of the pyramids, it was our first
  265. 35:36 technology. Yeah. Exactly. And immediately it went out of control. Immediately it started
  266. 35:42 consuming human bodies, human resources, competing with people. Millions of
  267. 35:49 people died. Yeah. At this stage, not many people have died from AI, but millions of people died from architecture. I have an interesting short film that talks about how much
  268. 36:00 water the architecture all the materials in architecture consume and it consumes
  269. 36:06 fresh water. Yes. For to make steel um 200 gallons per ton of steel. Uh
  270. 36:13 concrete requires fresh water. So yes, this is a this is a question that we should just be talking about. I think I think this is the trajectory of all technologies.
  271. 36:23 All technologies go out of control, and all technologies end up competing with us for scarce resources and killing many
  272. 36:30 of us. By the way, my father was a construction manager. So,
  273. 36:36 and he was almost an architect. I mean, he was about to complete his studies. So... I hope he didn't die... he died in... No, in a pile of concrete or
  274. 36:47 something. No, no, he didn't. But he managed the construction of many of the biggest
  275. 36:53 structures in Israel. So I've had intimate access to the process. Mhm. And when I see artificial intelligence, I think artificial intelligence is
  276. 37:05 symbolic architecture. That's why we say computer architecture. I think...
  277. 37:11 Well, I mean, they co-opted our name. A lot of these tech guys call themselves architects,
  278. 37:18 and yeah, I'm writing a non-fiction book at the moment deconstructing the tower of data babble,
  279. 37:26 because most people don't
  280. 37:32 understand what's happening, what these words mean, and I've done a hybrid non-fiction/fiction book to do
  281. 37:39 that, where a real architect meets with an architect. Interesting. Yeah, but they're very... the fields... We are out of time. We're out of time. So, it's time to say goodbye. I've enjoyed this conversation very much. It's as unusual
  282. 37:56 as you would like, any kind of unusual. Yes. I wish you success with your book. Bandwidth. Yeah, Bandwidth. Thank you. Thank you so much, and thank you for showing up in the novel. Thank you for
  283. 38:08 letting me have you at my cocktail party. Yeah, I've been flattered, of course. Thank you.
  284. 38:14 You say some very, very important things in that cocktail party in the middle of the book. Thank you again.
  285. 38:20 Thank you. Success with your next books. Bye-bye. Bye-bye.

https://vakninsummaries.com/ (Full summaries of Sam Vaknin’s videos)

http://www.narcissistic-abuse.com/mediakit.html (My work in psychology: Media Kit and Press Room)

Bonus Consultations with Sam Vaknin or Lidija Rangelovska (or both) http://www.narcissistic-abuse.com/ctcounsel.html

http://www.youtube.com/samvaknin (Narcissists, Psychopaths, Abuse)

http://www.youtube.com/vakninmusings (World in Conflict and Transition)

http://www.narcissistic-abuse.com (Malignant Self-love: Narcissism Revisited)

http://www.narcissistic-abuse.com/cv.html (Biography and Resume)