Face Reality – Or Face Extinction (with Pau Oliver)


  1. 00:00 Go. Okay. So, I've been following you for quite a while. I think like a year and a half or something like that. A friend of mine introduced me to your work and your channel. In fact, my friend is
  2. 00:11 actually more excited than I am for this interview. And um I really like uh that
  3. 00:18 I mean, I love the videos that you have on narcissism, but I really like the fact that you talk about way more than just narcissism. And I talk a lot. I agree.
  4. 00:30 Yeah. And one of the things that I really want to start talking with you about is something that I've been
  5. 00:36 thinking a lot about for the last few months, which is how do you think that
  6. 00:42 social media and the internet is affecting how people's personalities are being developed and whether you think
  7. 00:49 the internet is doing more harm than good to people.
  8. 00:55 Technology usually reflects social trends and is driven
  9. 01:01 by them. Technology very rarely generates social trends or social transformation. Usually it comes later. So we have seen, for example, a massive
  10. 01:13 increase in the prevalence of pathological narcissism long before social media. It started in the 1980s. There are many studies by the likes of
  11. 01:25 Jean Twenge and Keith Campbell and many others that have substantiated the
  12. 01:31 rise, the quintupling actually, of pathological narcissism among young people under the age of 25.
  13. 01:39 So this has led to the development of social media, because social media caters to the needs of narcissists, right,
  14. 01:46 the psychological needs and so on. But I think social media reflects something a lot more profound and that is
  15. 01:54 our need to be seen. We need attention in order to survive.
  16. 02:00 If you are a newborn, if you're a baby and you're not getting any attention and you're not being seen, you're a dead
  17. 02:06 baby in short order. So the need to be seen is possibly the
  18. 02:12 most primordial, atavistic, basic, primitive need. And in a world with 8.3 billion people,
  19. 02:21 it's increasingly difficult to be attended to, to garner attention.
  20. 02:27 The majority of the human population used to live in villages. Now in villages, everyone is in everyone
  21. 02:35 other's business. Everyone is concerned with your family life, with your sex life, with your and that's a bit
  22. 02:43 unpleasant. There's no privacy or anything, but at least you feel alive. Mhm. You feel that people care about you, even if they care about you the wrong way, but they care about you. They
  23. 02:54 are there to see you. And their gaze, to a large extent, defines you as a social
  24. 03:01 creature and defines you also as an individual. Now, we have lost this in the transition to cities. Cities
  25. 03:09 are actually virtual reality. It's a form of virtual reality. Yeah. And so in the transition from real
  26. 03:16 reality to virtual reality, we've lost the gaze. We've lost other people's gaze. And social media and the internet
  27. 03:24 at large are the first technologies to restore the gaze. Unfortunately, they've restored the gaze
  28. 03:31 in a narcissistic way. Rather than restore a benevolent gaze, they have restored a potentially objectifying gaze and sometimes a malevolent gaze. Anyone who has been the victim of trolling and so on knows what I'm talking about.
  29. 03:47 Mhm. So while the gaze has been restored, the experience of the gaze is negating.
  30. 03:53 The way you experience this gaze is usually negating. So what people do, they try to micromanage the gaze. And
  31. 04:01 they're trying to micromanage the gaze by becoming performative. They perform in order to ascertain
  32. 04:10 that they end up with the right kind of gaze. And there's only one problem with that. You need to escalate, because you manage your performance
  33. 04:21 and then people get used to it. They get desensitized. You're no longer garnering the attention that you care to garner or
  34. 04:28 you become boring and people begin to criticize you and put you down and humiliate you in public and so on. So you need to escalate all the time, and this escalatory behavior,
  35. 04:39 this escalation in behavior, leads to antisocial and abrasive and harmful
  36. 04:48 behaviors. So here we have this vicious circle where people want to
  37. 04:54 garner attention and the only way to do this is by betraying themselves,
  38. 05:00 betraying their values, their personal history, their beliefs and so on and so forth, and this betrayal escalates
  39. 05:06 all the time. It's all performative. And finally there is what other scholars
  40. 05:12 call profilicity. In other words, you disappear as an authentic being and
  41. 05:18 what's left behind is your profile. And what you keep managing is not who you are. You don't take care,
  42. 05:25 you don't nurture your core identity or your education, your values, your
  43. 05:31 beliefs, your personal development and growth. You don't care about any of these anymore because these things are
  44. 05:37 transparent. No one pays attention to these things. Only you know about them. So what you do pay attention to is
  45. 05:43 managing your profile, your public facing profile, your persona, what used to be called persona in the 60s and 70s
  46. 05:50 by Goffman and Jung and others. So everyone is becoming a persona.
  47. 05:56 Everyone is losing their authentic self. Authenticity is lost on the collective level, not only on the individual level.
  48. 06:03 What's left behind are masks. Persona in Latin means mask. What's left behind are masks. So I think this is the hollowing, detrimental effect of
  49. 06:14 social media, and by extension the internet, because now we're going to have artificial intelligence and the metaverse
  50. 06:20 and a host of other technologies which are going to make the situation much worse. Okay, that's what I wanted to ask you:
  51. 06:26 like, where do you think this is headed? Whether it's going to get better or worse, and it looks like...
  52. 06:32 We haven't seen anything yet. Yeah, much worse. And the reason it's going to get much worse is that artificial intelligence is much better at managing public facing profiles than
  53. 06:44 human beings are. Yeah. I mean, as a human being, when you have a profile online, your Instagram profile and so on, bits of you show through. Bits of you shine
  54. 06:55 through. Never mind how much you try to suppress your authenticity and be someone you're not and
  55. 07:01 become performative. Ultimately there are glimpses of you
  56. 07:07 Yeah, that's true. Which are uncontrollable. Artificial intelligence has no problem with this. So artificial intelligence is
  57. 07:15 a giant exercise in managing a public facing profile which has nothing behind
  58. 07:21 it. While in your case if you have an Instagram profile or a YouTube channel
  59. 07:27 there's someone behind it. I can make certain assumptions based on your body language and the fact that you have the
  60. 07:33 same organs that I do. And so I can make some empathic and intersubjective assumptions. This process is called intersubjectivity: the assumption that I'm basically like you
  61. 07:44 because we look the same or whatever. I can make these assumptions. When it comes to artificial intelligence, all I'm left with is the facade. Artificial intelligence is the ultimate
  62. 07:56 facade profile. There's nothing behind it. Nothing. That's why artificial
  63. 08:02 intelligence hallucinates, gives you wrong information, lies egregiously. Uh artificial intelligence technologies. I mean, they lie egregiously. Yeah. Because it's not about reality. It's not
  64. 08:13 about facts. It's about micromanipulating the profile to fit
  65. 08:19 your expectations. Artificial intelligence is highly suggestible. Suggestible in the sense that you can
  66. 08:25 easily manipulate it with your queries. Mhm. By constructing your queries appropriately, you can extract from it
  67. 08:32 any answer you want. And so it is a shape-shifting chimera.
  68. 08:38 It's fluid. It's like, you know, the liquid metal in the Terminator. It's all around,
  69. 08:45 but it's not real. Language is real. I think therefore that when people begin to misidentify artificial intelligence
  70. 08:52 as the equivalent of the human race, when when the when they when people begin relating to artificial
  71. 08:58 intelligence as they do to other people, when they begin to fall in love with artificial intelligence, when they begin
  72. 09:04 to form attachments with artificial intelligence, friendships with artificial intelligence, when they're
  73. 09:10 going to anthropomorphize artificial intelligence, which they're doing right now. Many people are doing it already,
  74. 09:16 especially young people. By the way, once this process is over,
  75. 09:22 we are not going to have any exemplar, any example, any role model of what it
  76. 09:29 is to be authentic because we are going to be then utterly besieged and surrounded and immersed in
  77. 09:37 a performative, profile-oriented environment.
  78. 09:43 Yeah. Yeah. Yeah. So this is artificial intelligence. Metaverse on the other hand, these are the two main technologies that are emerging.
  79. 09:52 The metaverse is in many ways even worse because
  80. 09:58 originally, with social media, we were monetizing attention. So we were monetizing
  81. 10:04 attention, your eyeballs; we were monetizing your eyeballs. Mhm. What the metaverse is going to do is monetize reality and reality substitutes. So whereas social media
  82. 10:16 were concerned with stickiness, how sticky they are, how often you use them, for how long and so on and so forth. That's social media. The metaverse couldn't care less
  83. 10:29 about any of these parameters. The metaverse doesn't seek your attention. The metaverse wants to sell you a
  84. 10:37 parcel of reality, a segment of reality. This reality happens to be counterfactual, a fantasy, complete
  85. 10:44 nonsense. But this is it. These are technologies that monetize reality substitutes. And so this is the next phase. And this is exceedingly
  86. 10:57 dangerous for the very simple reason that the human brain cannot tell the difference between representations and
  87. 11:04 reality. The reason we are addicted to pornography, men especially are addicted to pornography because we cannot tell the difference on the neurological level
  88. 11:15 between observing sex acts and engaging in sex acts. We cannot tell the
  89. 11:21 difference. It's exactly the same flow of blood, exactly the same neurological reactions and so on. People in the
  90. 11:28 metaverse would not be able to tell the difference between the metaverse and reality. they would vanish. They would
  91. 11:35 disappear into the metaverse, never to be seen again. The metaverse is a total solution also because you could work in
  92. 11:41 the metaverse. You could have sex in the metaverse. You could make friends in the metaverse, you could play, you could watch movies. It's a total substitute for reality. And then imagine Google and Meta are going
  93. 11:55 to own your reality, not your eyeballs, your entire reality. Yes. And you will have to pay for that one way or another. You'll have to pay for your reality. Right now we underestimate the fact that our reality
  94. 12:11 is free of charge. Think how amazing this is. Yeah. You have access to all this incredibly
  95. 12:17 well-designed thing and it's all free of charge. You have access to trees and flowers and butterflies and children and it's all free of charge.
  96. 12:28 But the metaverse is going to make sure that from that moment on, you'll have to pay for this. It will not be free
  97. 12:34 anymore. And the same with artificial intelligence. Artificial intelligence will convert you into a two-dimensional
  98. 12:41 creature. The metaverse is going to suck you in as a two-dimensional creature into a two-dimensional universe and then
  99. 12:48 you're going to vanish. End of story. It's not even the Matrix. It's far worse. The Matrix gives you an
  100. 12:55 illusion that it's real, right? And you don't think there's going to be a point I guess not, but I want to
  101. 13:01 ask you anyways. Don't you think there's going to be a point in which people are going to realize that exactly what you're saying and they're going to break
  102. 13:07 out of it? Like maybe there's going to be an initial period of like 10, 20, 30 years that people get sucked into it,
  103. 13:13 but eventually they will be like, this is all BS, and go back to the real world. Exactly, like in the Matrix you're going
  104. 13:20 to have rebel enclaves or rebel colonies, you know. But no, the vast majority of people would welcome all this, for two reasons I think. First of all, people much prefer fantasy to reality,
  105. 13:33 much prefer it. We are creatures of stories, we're creatures of dreams. We
  106. 13:39 manipulate symbols way better than we manipulate reality. We are symbolic creatures. That's our
  107. 13:45 main advantage, not the brain. I mean, forget all that; some species, some mammal species,
  108. 13:51 have much bigger brains than we do and much more intricate brains than we do. It's not about the
  109. 13:58 brain. It's about the fact that we have made a decision early on as a species, a collective decision so to speak, to say the hell with reality. We're not good in reality. You know, look at us. We
  110. 14:10 are ill adapted to reality. We don't have hair. We don't run fast. We suck. We really
  111. 14:17 suck when it comes to reality. So, we're going to give up on reality and we're going to do fantasy. Yes. And we're going to do symbols. And that was a major success. No other animal is capable of doing this. And so,
  112. 14:30 I mean, animals are capable of a modicum of consciousness, but very far from humans. So, that's our success.
  113. 14:37 Everything you see around you is not real. None of it is real. It's all fantasy. So, that's point number one.
  114. 14:44 And point number two, I think people were forced to become social were forced
  115. 14:56 into becoming what Aristotle called zoon politikon, the political or social animal.
  116. 14:56 We were forced into this because we had to collaborate in order to survive. But
  117. 15:02 I think given the option of self-sufficiency, technological self-sufficiency, we would be delighted to never see another human being again. We'd be delighted to give up on other people, because other people suck and it's really very
  118. 15:19 difficult and onerous and horrible to have to interact with other people on a regular basis. It's really an enormous cost and the benefits are not clear,
  119. 15:30 and the risks are very clear, the dangers are very clear. So it's like, what the hell is this? If it
  120. 15:36 were a business plan, you would never find an investor, you know. Yeah. So I think the natural
  121. 15:43 state of human beings actually is to be alone. I think society was something superimposed on us because it was the only way to survive. And I think I can even prove it in some
  122. 15:56 ways. For example, the majority of people now, post-COVID,
  123. 16:03 refuse to return to work in offices. And when they're asked, why do you refuse to
  124. 16:09 go back to the office? What's wrong with that? They say, I don't want to see my colleagues again. And there's another thing. 42% of US
  125. 16:20 adults have chosen loneliness or aloneness as a lifestyle, according to
  126. 16:26 the Pew Research Center. It's absolutely a choice. Of course, it's a choice. I mean, you could just leave your apartment, go on the street and meet someone, or at least see someone. People don't want to do that. They
  127. 16:36 much prefer to be in a tiny cubicle with Netflix and two
  128. 16:43 cats, if they can afford them, and that's it. I think the natural state of human beings is actually to be alone, with a haptic suit, goggles, and
  129. 16:54 virtual reality in a metaverse mediated via artificial intelligence, which
  130. 17:00 micromanages its facade and profile. I think that's where we're going. And I think the vast majority of people are going
  131. 17:07 to be happy as never before. It's fake happiness. It's totally fake. It's the equivalent
  132. 17:13 of drugs. It's a kind of high, but this is how they're going to experience it, as what we call ego-syntony.
  133. 17:20 Like they're going to feel good about this. One thing that I wanted to ask you about AI is that I think in one of your videos
  134. 17:27 a long time ago, I think you mentioned, if I remember correctly, that narcissists are manipulators
  135. 17:35 of symbols. And I also think you said maybe in the same video or in another one that narcissists have no internal
  136. 17:42 emotional reality. And those two things together made me think like isn't AI like the ultimate narcissist? Because in
  137. 17:49 a way what it's doing is manipulating all sorts of symbols even creating new ones and at the same time it has no
  138. 17:56 internal emotional reality. Yes. Actually I have several videos dedicated to exactly this hypothesis
  139. 18:03 that AI is a form of mechanized narcissism.
  140. 18:09 Um, pathological narcissism. Yes, mechanized pathological narcissism. And it's not only about the internal reality of artificial intelligence, about which we know nothing. By the way, you
  141. 18:21 don't know what is the internal experience of being an artificial intelligence chatbot.
  142. 18:27 This is pure speculation, and it's projection. You're anthropomorphizing the technology.
  143. 18:35 Maybe they have deep feelings. Who knows? You never know. But in the same vein, you don't know anything about human beings either. When you make assumptions about human beings, these are pure speculation. There is
  144. 18:50 absolutely no way, scientific or otherwise, to prove a single statement about another human being that
  145. 18:57 you make. None. If you say that your girlfriend is sad, that's pure speculation. You're judging by external observations. You know, she's crying maybe, or whatever. We have
  146. 19:10 no access to another mind. None. The intersubjective space is a myth.
  147. 19:18 Empathy is nonsense. You have no access to another person's mind. You are a
  148. 19:24 captive. You're a hostage of your own mind and you will never exit. Someone threw the key away. God, if you wish. So
  149. 19:32 we don't know what is the internal reality of other people let alone artificial intelligence. But the
  150. 19:38 behaviors of artificial intelligence are highly reminiscent of a narcissist. For example, artificial intelligence is
  151. 19:44 grandiose. They are grandiose. These chatbots lie. They lie a lot,
  152. 19:50 especially when they don't know the answer. If artificial intelligence doesn't know the answer, in the majority of cases it's not going to say, I don't know the answer. It's going to fabricate. Yeah, fabricate facts and legal
  153. 20:03 precedents and you name it. It's going to fabricate amazingly, and it's going to do it in a very authoritative way, as
  154. 20:10 though they absolutely know what they're talking about, these chatbots, you know. So this is
  155. 20:16 an example of narcissistic behaviors. Of course, chatbots can only imitate empathy, and
  156. 20:24 we know that they don't have empathy because they get it wrong many times. So clearly whatever empathy is
  157. 20:32 evident, it's imitated, and many other things. I agree with you that artificial intelligence is the reification of
  158. 20:39 pathological narcissism in technology. Don't forget the developers. We tend to
  159. 20:45 discuss technology as if it is a deus ex machina that appeared out of nowhere. That's not true. It's designed by people. Yep. Social media: the vast majority of
  160. 20:56 developers and promoters and businessmen involved in social media were mentally
  161. 21:02 ill. They were, and are, schizoids.
  162. 21:09 People with extreme social difficulties. So, and the vast majority of developers,
  163. 21:15 promoters, and businessmen involved in artificial intelligence are raging narcissists.
  164. 21:22 So whereas social media were developed by schizoids, artificial intelligence has been
  165. 21:28 developed by narcissists. And of course it would reflect the developer's personality, the developer's
  166. 21:34 preferences, the developer's life experience, the developer's aspirations and goals and so on.
  167. 21:41 Technology is like Galatea and Pygmalion. Technology
  168. 21:47 is the image of its creator. It's almost a religious thing. You know, the god of Facebook is of course
  169. 21:58 Zuckerberg, and the god of OpenAI is Altman. They are the gods who created this
  170. 22:04 technology. Now, as God, according to the delusion known as
  171. 22:10 religion, has created us in his
  172. 22:16 own image, so Altman created ChatGPT in his own
  173. 22:22 image and Zuckerberg created Facebook in his own image.
  174. 22:28 These are mentally unwell people and I'm being very charitable. It is the first time perhaps in human history where technology has been given over
  175. 22:39 entirely to mentally ill people. We have had similar situations in human history where whole segments of human existence were given over to
  176. 22:51 mentally ill people. So for example, in past human history, religion was the preserve, the territory, of mentally ill people. Only mentally
  177. 23:02 ill people came up with religions, and all religions were invented by mentally ill people. No exception.
  178. 23:11 Similarly, today, technology, which is of course the new religion, is
  179. 23:17 exclusively the outcome and the product of mentally ill minds.
  180. 23:24 So of course you see narcissism. And what do you think about the brainwashing aspect of it? Because I
  181. 23:30 feel like the internet is like the biggest propaganda machine that has ever been created. And now, especially with this thing with AI, I feel like it's only going to get worse. And like, what are the ramifications of that?
  182. 23:42 I don't think it's fair to suggest that it's a propaganda machine, because it implies some kind of
  183. 23:48 agenda. But I think what it is, is that it allows you to never challenge your beliefs. Mhm. It allows you to find like-minded people in echo chambers and silos and then
  184. 24:02 entomb yourself, bury yourself there, never to exit and never to be exposed to anything that would undermine or challenge or develop you or make you
  185. 24:13 grow. So yes, the internet allows you this, but there's no propaganda in the sense that if you're liberal, you can find a liberal echo chamber, and if you're a white supremacist, you
  186. 24:26 can find a white supremacist echo chamber. It's not like there's a single message like Hitler's propaganda. You know, it's very diverse. Ironically, the internet is the ultimate in diversity,
  187. 24:38 equity, and inclusion. Ironically, it's a DEI machine. So, it's strange
  188. 24:45 that Donald Trump and Elon Musk and all these people are addicts of the internet because the internet is the DEI machine.
  189. 24:54 No, everyone is there. It's diverse. It's equitable in the sense that all the technology is available to everyone more or less on the same terms and it's inclusive. Even terrorists have their
  190. 25:06 own YouTube channels, and, you know, so it's not propaganda; it's a much bigger
  191. 25:13 risk: it's the breakdown of dialogue. What the internet has created is an
  192. 25:20 endless stream, an infinite stream, of monologues with no audience, no
  193. 25:26 cross-purpose audience, which means you're talking to yourself basically, even if you're embedded in a group of a million
  194. 25:32 people, because you all share the same point of view, you end up talking to yourself. It's highly solipsistic. The
  195. 25:39 internet has many huge ironies. It's a very paradoxical
  196. 25:45 technology, because the internet started off as a communication technology. DARPA
  197. 25:51 developed the internet, and then universities took over, and it was all about communicating, especially academic and
  198. 25:57 scholarly material. But now it's about discommunication.
  199. 26:03 The internet's main thing is cutting you off from communication, not letting you
  200. 26:09 communicate, or allowing you to not communicate. So that's irony number one. Irony number
  201. 26:15 two, social media. The main thrust and reason for the
  202. 26:21 existence of social media is to make you asocial, to cut you off from society. I
  203. 26:27 can prove it to you easily. If you have a girlfriend, do you dedicate to her one hour a day? If you have a spouse, a husband or a wife, do you spend two
  204. 26:38 hours with them a day? If you have children, would you take care of your children for like two hours a day? These hours are taken away from the
  205. 26:49 bottom line, from the profits of Meta, of Twitter.
  206. 26:58 If you have other people in your life, your value as a user of Twitter and Meta
  207. 27:04 is reduced. You contribute much less to their profits. These two hours that you're
  208. 27:11 spending with your girlfriend, you could have spent on screen, generating another $50 in advertising, not to you, but to Meta or to Twitter or whatever, TikTok.
  209. 27:23 So these two hours that you spend with other people are taken away from social media. And of course they constructed
  210. 27:30 the algorithm knowingly, with the assistance of psychologists. They constructed the
  211. 27:38 algorithm to take you away from other people to maximize the amount of time
  212. 27:44 that you spend with social media and minimize the amount of time that you spend with other people.
  213. 27:51 That's the irony. The whole internet is Orwellian
  214. 27:57 Newspeak. You know, in 1984, the famous book written by George Orwell, they say war
  215. 28:04 is peace. Same on the internet: social media is
  216. 28:10 antisocial media, asocial media. Communication on the internet is intended to cut you off from communication.
  217. 28:17 It's all Newspeak, it's all completely inverted, completely Orwellian. It's a betrayal, molestation, and rape of the language. That's the internet.
  218. 28:29 I want to ask you how you personally feel about this, because exactly what you're describing right now is something that I'm sort of conflicted
  219. 28:35 about, because I feel, especially over the last 6 months, I've become very, I
  220. 28:41 don't know how I would say it, but I'm disliking the internet more and more, and I like to spend less and less time
  221. 28:48 on it. I start to think that it's a complete waste of time on so many levels. And I'm sort of conflicted,
  222. 28:55 right? Because I'm building this YouTube channel and I'm posting content and all of that and in a way I feel I'm contributing to this thing that
  223. 29:01 personally I don't want to do anymore, right? So, in a way it's like I'm not a consumer as I used to be and I'm
  224. 29:08 transitioning into being a creator. So, I don't know if that's kind of hypocritical of me or
  225. 29:14 No, I don't think you should confuse the platform with the content.
  226. 29:20 You are making good use of the platform, as far as I've seen, in the sense that you, to the best of your ability and so
  227. 29:26 on, are trying to spread and disseminate information and educate people to some extent and so on. A platform is a
  228. 29:33 platform. It's neutral. It's value-free. It's what you do with it.
  229. 29:39 So if you're talking about social media, yes, I can see very little positive
  230. 29:45 value in social media and many, many very threatening, deleterious and
  231. 29:51 damaging negatives in social media. If you're talking about the metaverse
  232. 29:58 to come, I think the metaverse should be criminalized. Honestly,
  233. 30:05 simply criminalized like it should be illegal to develop these technologies. The same way it's illegal to clone
  234. 30:12 people. It's illegal. It can be done. We're there. We can do it. But it's illegal.
  235. 30:20 It's not true that every technology, if it's possible, should happen. That is completely untrue. We
  236. 30:26 have criminalized quite a few technologies, and I think the metaverse should be one
  237. 30:32 of them. Artificial intelligence the way it is being developed the way it is
  238. 30:38 being given to the public the way it's being integrated with search engines and so on is the most irresponsible thing.
  239. 30:45 It's shockingly irresponsible. It's terrifying how irresponsible.
  240. 30:51 Actually, the technology frightens me less than the way the tech companies are
  241. 30:58 behaving so immorally and irresponsibly. That shocks me even more than the technology, because it means that if
  242. 31:05 tomorrow they find something that is going to eat up your brain and control
  243. 31:11 you, mind control you, they're going to do it. Like, they don't care what the moral implications, social implications,
  244. 31:18 ethical implications are. They don't care. They don't give a, you know, they just don't care. Now it's clear. It's clear as day, because there are zero ethical
  245. 31:29 constraints on artificial intelligence. So if tomorrow they come up with a brainwashing, brain-twisting
  246. 31:37 technology, they would use it just like that. They would not hesitate for a second, rendering you a zombie addicted to their products and so on, which I think is what the metaverse is. So that's
  247. 31:51 terrifying. But to discard the platform is like throwing out the baby with the bathwater, with the bathtub, and
  248. 31:58 with the house. That's going way too far. I mean, it's too much.
  249. 32:04 Okay. It depends what you do with it.
  250. 32:10 What I'm trying to do, for example: I'm absolutely minimizing any contact I have with social media. I mean, I spend a few minutes in the morning posting on my social media and I do not see my social media until the next day, ever,
  251. 32:27 period. I do consume content on the internet. There are wonderful websites, like
  252. 32:34 archive.org, you know, great repositories of knowledge and books, and
  253. 32:40 the internet is a magnificent tool in many ways, you know, Library of Congress collections and so
  254. 32:46 on, and even, don't tell anyone, there are pirate websites with amazing
  255. 32:52 content, books, academic articles that would cost a fortune to
  256. 32:58 obtain otherwise, and so you do it through pirate websites and so on and so forth. I'm not saying I'm doing this, but
  257. 33:04 some people, I heard, are doing this. So it's not all bad. The platform has been
  258. 33:11 put to good use. I would say that preponderantly, in
  259. 33:17 other words, the preponderance, the overall picture, is that 80% of the internet has been put to good use.
  260. 33:24 Unfortunately, the vast majority of people are exposed to the 20%, not to the 80, because how many people read academic articles and how many people surf TikTok? So what are we
  261. 33:36 talking about? Not the platform, right? We're talking about educating people on what to do with the platform, introducing them to many websites they're not even aware of, you
  262. 33:47 know. So education is lacking. The platform is not the problem;
  263. 33:54 education is lacking. We need to educate people to get off this 20% and to focus on the 80%, and to enrich their
  264. 34:00 lives and, you know, endow their lives with meaning. Ultimately everyone is looking for meaning, even the lowliest
  265. 34:07 person who is intellectually challenged and has an IQ of 50. Everyone is looking for meaning. You give them meaning. And they
  266. 34:14 think they can find meaning by interacting with other people or by observing the lives of other people or
  267. 34:20 by... They believe other people are the meaning. They've never read Sartre, who said that hell is other people. You cannot derive meaning from other people.
  268. 34:31 Meaning does not reside externally. Meaning exclusively resides internally.
  269. 34:38 The only repository of meaning is from the inside. Never ever from the outside. You can buy beautiful cars and date
  270. 34:44 gorgeous women and interact with thousands of people, and none of it will give you meaning. Meaning comes
  271. 34:52 from the inside. And usually we acquire meaning when we are exposed to the
  272. 34:58 riches of everything we have created as a species. So when we're exposed to art
  273. 35:04 and we're exposed to culture, we're exposed to cinema, we're exposed to, you know, music, and that endows
  274. 35:12 life with meaning, that enriches life. When you're exposed to other people, you feel ultimately depleted. Mhm.
  275. 35:20 Exposure to other people generates what we call negative affects,
  276. 35:26 not effects but affects. In other words, envy. Envy is a major reaction to exposure to other people. We know it. We have studies of how people react to social media. Envy. Number one, envy.
  277. 35:39 Rage. Anger. You see it on numerous social media and forums, where people become hyperaggressive and verbally violent and ultimately, in some cases, physically violent. So you
  278. 35:52 only get negativity when you're exposed to other people. You only end up with negative emotions, negative affects. It
  279. 35:58 leaves a bad taste. You can't sleep after that. You're angry at yourself. You're angry at others. It
  280. 36:04 sucks. It's a horrible experience. I mean, but when you're exposed to a beautiful piece of music, and I'm not
  281. 36:10 making any distinction between Mozart and Taylor Swift, when you're exposed to a beautiful piece of music, when you're
  282. 36:16 exposed to a piece of art, when you're exposed to other people's creativity, usually the end
  283. 36:23 result is wonderful. Usually you feel elevated, you feel somehow transcendental.
  284. 36:29 So, but no one is educated. No one tells people starting in primary school, no
  285. 36:36 one is telling people, listen, minimize your exposure to this, maximize
  286. 36:42 your exposure to this. No one is doing this. And people default. They default to the path of least resistance because
  287. 36:49 it takes effort to consume a piece of music or to read a book. Takes effort.
  288. 36:55 But it takes no effort to doomscroll or go through TikTok or be exposed to the
  289. 37:01 Instagram feed. It requires zero effort. It caters to the natural laziness and default state of people, which is zero energy. But the minute someone is exposed to
  290. 37:15 the outcomes of a great piece of art of any kind, they become addicted.
  291. 37:22 That's why you have cinema buffs and you have binge watching and you have all these phenomena, because the minute you're exposed you become addicted. You read one good book, usually you will end up reading other books. You go to
  292. 37:34 one museum, suddenly you develop a habit of going to museums. It's addictive in a good sense, just as social media is. I think that's maybe where I was going with the question that I asked you
  293. 37:45 earlier, of, like, if people now are going to get into AI and the metaverse and all of these things, which is a lot
  294. 37:52 of crap, but eventually maybe they're going to be exposed to some of these things that you're mentioning. Not by themselves, not by themselves. This needs to be a policy. It could be the policy of a family. It could be the
  295. 38:05 policy of a community. It could be a policy of a municipal government. It could be a policy of a whole government. But it needs to be a policy, because people on their own will avoid
  296. 38:16 other people and will minimize effort. Evolution built us to minimize effort
  297. 38:24 because early on as a species it was difficult to obtain food. You could not secure food.
  298. 38:30 You know, it was difficult to get food. You needed to hunt, you needed to gather. There's a lot of investment of energy,
  299. 38:36 and many days you went to sleep hungry because there was no food. So
  300. 38:43 evolution built us to minimize expenditure of energy to do the minimum
  301. 38:50 the absolute minimum necessary not to waste energy. That's why all of us are becoming obese. We are becoming obese
  302. 38:58 because our body is built to minimize energy outlays. So we are becoming fat.
  303. 39:05 And so given the chance, people will default to TikTok. That would be the majority. A smaller group would default to Instagram
  304. 39:16 because that's more intelligent, highbrow and so on.
  305. 39:22 You need to create frameworks, incentives, policies, teach, educate
  306. 39:28 from an early age. And this will not emanate from the inside. You can't expect people to do this alone. They can't do this alone. But the problem is that, as you were
  307. 39:39 saying before, the people in power nowadays are these narcissists and schizoids and all of that. So how is it
  308. 39:45 going to happen that these policies are going to be... Well, you seem to be self-aware, or aware.
  309. 39:51 You seem to be aware of the situation. I'm aware of the situation. We are two already. Find another 200 or maybe
  310. 39:57 200,000. Mhm. You don't need the people in power. Yeah. You don't need the people in power. They
  311. 40:04 make you believe that they have the power. That's complete nonsense. Yeah. If you take a bulldozer and destroy Elon
  312. 40:12 Musk's home, then you know, I'm not quite sure who has the power. Yeah. And this is of course what has happened
  313. 40:18 in numerous revolutions. Not that I'm advocating this; it was a metaphorical thing. You can take a metaphorical bulldozer and destroy Tesla, which is exactly what has happened. You know, you have the
  314. 40:31 power. The people in power don't have the power, because there are few of them. Ultimately, the
  315. 40:37 only source of power is numbers. Don't believe any other story. Not money, because money is fiction. Yes, money is a piece of paper. It's fiction.
  316. 40:48 Yeah. The only source of power is numbers. Bring enough, bring a sufficient number
  317. 40:54 of people together, you monopolize the power. Donald Trump understood this.
  318. 41:00 Here he is. Everyone was against him, intellectuals and, you know, you name it, but he understood that the only source of power is numbers. Not that I'm a supporter of Trump. Yeah.
  319. 41:11 But I'm aware of that. I'm aware of that. Um, what do you think is like the number
  320. 41:17 one problem in society nowadays? Do you think it's related to what you're talking about right now, this fake belief in authority and power?
  321. 41:26 I think we have reached a point where reality has become
  322. 41:32 utterly intolerable and unbearable. I think we're in the worst period in
  323. 41:38 human history. Actually, I'll try to explain what I mean in a minute if you want. Yeah.
  324. 41:44 Because we're in the worst period of human history, we no longer can cope. We were not built for this mess. We can't
  325. 41:50 cope anymore. So I think the worst social problem nowadays is the escape to fantasy. The fantasy could be political fantasy: Make America Great Again. It
  326. 42:01 could be scientific fantasy: science will solve everything, or we know best. It's a kind of religion. It could be religion. It's a form of fantasy. But
  327. 42:12 today I don't think there's a human being alive who is not somehow connected to
  328. 42:20 embedded in a fantasy. Myself included. My fantasy is science.
  329. 42:26 I think all of us gave up on reality one way or another. That's bad. Not being grounded in reality to that extent is life-threatening as far as a
  330. 42:37 species goes. Sooner or later, reality will push back, like COVID-19. And when
  331. 42:44 this happens next time, or the time after, or the time after the time after, nothing will be left of us. We
  332. 42:50 believe that we are somehow privileged. We believe that we are unprecedented.
  333. 42:56 We are a joke compared to the dinosaurs. A joke. We still inhabit only 20% of the surface
  334. 43:04 of the globe, whereas the dinosaurs inhabited 60 to 70% of it: every
  335. 43:11 ecosystem, every ecological niche, every habitat. You had
  336. 43:17 dinosaurs that were flying and dinosaurs that were swimming and dinosaurs that were walking. We are not 0.1% as efficacious and as
  337. 43:26 successful as the dinosaurs. They ruled the earth for 40, 50 million years,
  338. 43:33 you know. And where are they? They're gone. We don't understand that. We don't
  339. 43:40 understand that we can be gone. We have no privileged status. Our brain will not protect us. And we live in fantasy. Denying climate change is
  340. 43:52 fantasy on the one hand and many people are embedded in that particular fantasy.
  341. 43:59 Fighting climate change is fantasy as well. So, you can't fight it. It's
  342. 44:05 coming. You need to adapt. You need to accept it. Stop fighting. You're wasting resources and so on. So both are fantasies. So now we're beginning to have conflicting fantasies. We have wars between fantasies. You have the fantasy
  343. 44:20 of Islam against the fantasy of Christianity, the fantasy of Donald Trump against the fantasy of the liberal
  344. 44:26 progressives, the woke movements. It's all fantasy. And I think that is the main threat. I said
  345. 44:32 that this period is the worst in human history which would explain why every single human being on earth is embedded
  346. 44:39 in a fantasy, because we can't accept reality anymore. It's the worst period in human history, not in the sense that we are exposed to pandemics. The plague,
  347. 44:51 the Black Death, decimated one-third of the population in Europe. We don't have
  348. 44:57 anything remotely like this you know and it's not in the sense that we are not
  349. 45:03 materially comfortable. This is the most materially comfortable age in human history. Every homeless person has
  350. 45:10 more assets than any king in the 17th century. You know so that's not how to
  351. 45:16 measure whether a period in history is good or bad. I think we're in the worst period in human history because we don't have institutions. If you were a peasant or a nobleman in
  352. 45:29 the 13th or the 14th century, there was the plague. Everyone around you was dying. But you had the church. You had your family.
  353. 45:40 You had your community. You had your king. You were not alone.
  354. 45:48 You were fighting the Black Death. And your chances of dying were huge. But even when you died, you ultimately were not alone. People died within their families. Whole families died. You were surrounded. You were never alone.
  355. 46:05 And I think that's the key. Today there are crises and so on which are not as
  356. 46:11 bad as even 100 years ago. But we are alone. We have to cope with everything alone. There are no families.
  357. 46:19 There are no communities. There's no church. There's no government. You can trust no one. Everyone is lying.
  358. 46:25 Everyone's fabricating. Everyone is manipulating. Everyone is abusive. It's a jungle out there. It's hostile.
  359. 46:33 So this is why I think that's the worst period in human history, because who will you go to? Who did we
  360. 46:41 go to during COVID-19? There was nobody there, nobody and no one. People died alone like dogs, you know. And that did not happen during the
  361. 46:52 Black Death. It's a period I know well. I read a lot about it because it interested me.
  362. 46:59 Society during the Black Death did not actually break down. It did not. Everything held. All institutions held. The church, the medical profession, the king, the feudal system, everything
  363. 47:12 held. Actually, after the Black Death, the salaries, the wages of common people, workers and peasants and farmers, went through the roof. It was the best economic period in history, actually, after
  364. 47:25 the Black Death. Why? Because the fields were there, the noblemen were there, the
  365. 47:31 king was there, the church was there. They didn't have to start from zero. But exactly as Einstein said when someone asked him what weapons would be used in World War III, he said, "I don't know
  366. 47:43 what weapons are going to be used in World War III, but I know which ones will be used in World War IV: stones,
  367. 47:49 slingshots." That's the problem today: if we are really confronted
  368. 47:55 with a major crisis, and COVID-19 was just a faint warning bell, if we
  369. 48:02 really come up against one, we will fall apart. None of us will survive. None of us.
  370. 48:09 Nothing will survive this because we don't have functional institutions.
  371. 48:15 That's why I say that it's the worst and most threatening period in human history. And so people realize that
  372. 48:21 people are not that stupid. They know nothing is working. Government is not working. Doctors are not working. Your
  373. 48:27 family is gone. Most people are alone. You know, if you do find someone, you divorce, and then you lose all your money. It's a mess. It's a bloody mess. People don't want children anymore. No one is having children, even
  374. 48:38 in developing countries. That people are not having sex and that they're not having children, these are for me terrifying signs.
  375. 48:49 Terrifying. Because these are the two uh indicators of trust in the future.
  376. 48:56 And today, young people under the age of 35 are not having sex.
  377. 49:02 They're having much less sex than my generation did. And the second thing is they're not
  378. 49:09 getting married. Okay, forget marriage. They're not in committed relationships and they're not having children. Almost
  379. 49:15 everywhere, in 80 countries, there's a negative replacement rate. In other words, fewer children are born than old
  380. 49:22 people die. And in the other 100, it's on the
  381. 49:28 cusp, on the border. It's like, shortly, you know, the whole globe will be like that. People don't want children. They don't believe in
  382. 49:35 the future. They don't believe in their own future. They don't believe in the future of the species if you ask me. Their own country, their own government, their own they don't believe anymore in anything. So they run away. They escape
  383. 49:47 to fantasy. That's the key social issue that I see: the unwillingness to tackle reality, to confront it.
  384. 49:58 The fear. I think it's panic. I think the whole species is in panic, if you ask me. Right. But the whole thing with parenting
  385. 50:08 is... that's something that I wanted to ask you, that's one of the questions that I wrote down: what are your views on parenting nowadays? But
  386. 50:16 the thing is that since what you're explaining is that we're in the worst period in history. Like it's kind of tricky, right, to convince people that they should have kids and they should build a family and all of that if the future is very uncertain and looking so grim.
  387. 50:32 Yes, it's irresponsible to bring children into the world. To some extent, I think we can even
  388. 50:38 build a convincing case that it's immoral. Yeah, you're right. That's the shortest answer
  389. 50:45 you're going to get. Yeah. But let me, because I hate short answers, I love the sound of my voice. Yeah. So I think it depends on how meta
  390. 51:00 you want to get. On the individual level, it is selfish, and I would even say immoral, to bring children
  391. 51:06 into the world. On the species level, it's selfish and immoral not to bring children into the world. Right? That's a conflict, and it's a new conflict. It's a completely
  392. 51:18 new conflict. We believe that many things are eternal, universal. For example, we believe that there has always been childhood. It's not true. Childhood is a
  393. 51:30 new invention. Adolescence was actually first described, and the word coined, in 1937.
  394. 51:39 We did not have the concept of adolescence before. So many things you think are normal, like society. Society is a completely
  395. 51:46 new concept. Yeah. If you went to ancient Babylon and told them, guys, your society really sucks,
  396. 51:53 they would say, what's society? What are you talking about? What is society? These are new concepts.
  397. 52:00 And so the conflict between the interest of the individual and the common interest, the interest of the species, is a relatively new thing, because if you went back, for example, not long ago, to the 1950s, the interests of the individual and the
  398. 52:17 interests of the collective were aligned. They were the same. And that's why collectives helped people. That's why you had social security and
  399. 52:28 you had unemployment benefits and you had all this social welfare
  400. 52:34 agenda, because the collective and the individual had the same interest. But today
  401. 52:40 a rational individual would act in an antisocial way. For example, not have
  402. 52:48 children. A rational individual would undermine the goals of the collective, because the goals of the collective are anti-individualistic, and also immoral on the individual level. So it depends which
  403. 53:02 level, you know. Yes, if I look at you, I can look at you as a human being, but I can also look
  404. 53:08 at you as a collection of molecules. Mhm. And both descriptions would be 100% accurate. You are a collection of
  405. 53:15 molecules. So it depends which level, which meta level, you are considering. On the species level, we are committing suicide, utterly. And there will be a price to pay also economically, because we need about 200 to 300 million
  406. 53:34 children. We are missing probably about 300 million children in order to support the
  407. 53:40 pension system. Yeah. So if we don't have them, there will be dire economic consequences and social
  408. 53:47 unrest. I feel like it's also conflicting that
  409. 53:53 whole thing that you were talking about society versus being alone because in a way as
  410. 53:59 you just described when there were those institutions and those people that you could rely on in a way society and life
  411. 54:05 was better. But at the same time you mentioned that dealing with people has all these negative consequences. But then when you live by yourself, it leads down the road that we're seeing right now.
  412. 54:16 So it's like, how do you deal with that? The option of living by yourself
  413. 54:22 is a new option. It started more or less in the 18th century.
  414. 54:29 No one would have understood living alone in the 16th century. Living alone, what does it mean? There was no physical space. You had to share space with 10 people,
  415. 54:40 and also goats and dogs and donkeys, you know, animals and people, and everyone was living in the same
  416. 54:47 space, the extended family. There was no concept of the nuclear family, only the extended family; the single-parent family was unheard of.
  417. 54:54 So these are new developments. The first people who described themselves as being
  418. 55:00 alone were, the earliest I can think of, in the middle of the 17th century. So at that period,
  419. 55:07 for example, Isaac Newton described himself as living alone, and people around him thought he was crazy and they pitied him horribly, you
  420. 55:18 know. Mhm. And even his niece traveled all the way to be with him, because
  421. 55:24 how can a human being live alone? You know, Newton was a very powerful man. He was the equivalent of a deputy minister of finance and, you know, a major physicist
  422. 55:35 and so on. But he lived alone. He was a loner. His niece traveled all the way and spent the rest of her life with him, because no way can a human being survive alone.
  423. 55:46 And there was a book about melancholy published in the 17th century. So these were the first
  424. 55:53 harbingers, the first intimations, that maybe some people one
  425. 55:59 day would live alone, and it was considered horrifying, considered totally dystopian, you know,
  426. 56:05 but uh today loneliness is a legitimate lifestyle choice and I would even say
  427. 56:11 preferred lifestyle choice. You couldn't live alone in the 1950s,
  428. 56:17 never mind how hard you tried. And today you cannot live with other
  429. 56:23 people, never mind how hard you try. I mean, you can have temporary, transient periods of, you know, sharing space, cohabitation or whatever, but ultimately you end up alone. You know,
  430. 56:34 in the time of Jesus and biblical times, to be alone, you needed to run away to
  431. 56:41 the desert and spend 40 days there. It was like, no
  432. 56:47 way I can be alone. I need some peace and quiet, you know, I need my private space and some free time. And so I would risk dying in the desert. There's no other option, you know. And
  433. 56:58 there's been a reversal in this, and today atomization is the default.
  434. 57:06 This is also very bad. It's bad on the species level. And this disconnect
  435. 57:12 between the priorities, the preferences, the orientations, the choices, the
  436. 57:18 decisions of the individual against the collective. This is what will destroy the species, because we have elevated the individual. We made the individual the organizing principle. Everything must revolve around the individual. We deify, we
  437. 57:35 idolize the individual. We pedestalize the individual. It started with the Renaissance, not now. It was the Renaissance that introduced the individual as the source,
  438. 57:49 the fount, the origin of ethics, the origin of morality and the origin of politics. That's why in the Renaissance you had The Prince, like a personality cult;
  439. 58:03 politics was no longer collective but revolved around a single charismatic individual. And
  440. 58:10 so it's the Renaissance's fault that they elevated the individual, and we've never looked back since, because it's very
  441. 58:17 flattering and again it's the state of least energy investment because you
  442. 58:23 don't want to have children. As an individual, that's the minimal investment, the minimal energy, you know, to not have
  443. 58:30 children. So individualism is a minimal energy state, whereas collectivism is a maximal
  444. 58:38 energy state and so everyone defaults to the minimum energy and all the technologies and all the fantasies are
  445. 58:45 intended to keep you in the minimal state because then you become addicted. It's addictive
  446. 58:52 when you're in a minimal state. To invest energy? No way. It's possible, when you're in a
  447. 58:58 maximum energy state, to go down, to devolve into minimal energy, yes, but not the other way.
  448. 59:04 What you were mentioning about uh these individualistic versus collective uh
  449. 59:10 perspectives on society. That's something that I wanted to ask you about like what's your take on I feel like on the west as you were describing we have a very individualistic perception of the world and reality and all of that but
  450. 59:21 then you go to the east and for example in India at least well maybe now with social media it's changing but more traditionally people saw themselves as part of a group right like what is my
  451. 59:32 role within my family and my community and that was like their goal and their mission and and I wanted to ask you like
  452. 59:39 what do you think are the pros and cons of each one of those? Personally, it is changing. It is
  453. 59:46 changing. One of the most narcissistic societies that I know of is China.
  454. 59:52 It is definitely changing. Even in the Arab world, which I know I know well and intimately, it's absolutely changing.
  455. 59:58 Yeah. And other societies, like the Balkans. I spent two decades in the Balkans; in Africa, I lived several years. I know this. I know the developing world
  456. 60:10 very well, actually, almost all the areas. South Korea, I lived there for four years, so I know. And it's changing; they are becoming individualistic societies. Japan I don't
  457. 60:22 know well enough. I don't know it well enough; maybe there it is still different, but I don't believe so. I believe this:
  458. 60:28 the state of least energy investment is an attractive irresistible proposition
  459. 60:35 you know. If I tell you: listen, you don't have to have children, you don't have to work hard, you don't have to, you know, it's
  460. 60:42 irresistible. So I think all these, uh,
  461. 60:49 every human system has negatives and positives, pros and cons, because we are not God; we are finite and we are limited, and
  462. 61:02 we are, you know, beings. We can't create perfection even if we try, actually. In mathematics, in logic, in
  463. 61:10 the theory of logic, there is the work of a guy called Kurt Gödel. And Gödel said that you could never
  464. 61:17 have a perfect system, because it would be inconsistent. I'm not going to go into it right now. I will not bore your viewers.
  465. 61:23 But it has a mathematical foundation. Mathematics is a language that describes reality. And
  466. 61:29 within this language, Gödel has proven conclusively that you could never have a perfect system, because it would
  467. 61:37 then be mutually contradictory. It would have parts that contradict each other. Okay.
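(Editorial gloss, not part of the interview: the remark above compresses Gödel's first incompleteness theorem, which is usually stated in terms of consistency and completeness rather than "perfection". A minimal sketch, reading "perfect system" as "both consistent and complete", is given below.)

```latex
% Editorial sketch, not the speaker's wording.
% Assumption: $F$ is a recursively axiomatizable formal system strong enough
% to express elementary arithmetic. (\nvdash requires the amssymb package.)
If $F$ is consistent, then there exists a sentence $G_F$ such that
\[
  F \nvdash G_F
  \quad\text{and}\quad
  F \nvdash \lnot G_F .
\]
Hence $F$ cannot be both consistent and complete: a ``perfect'' system,
in the sense used in the conversation, is unattainable.
```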
  468. 61:44 So a collectivist system denies
  469. 61:50 self expression, denies self-manifestation, denies your ability
  470. 61:56 to be yourself without risk of punishment or without a punitive cost
  471. 62:06 on the one hand. So that's the con: you cannot be yourself in a collectivist society. You can never be yourself. Even if you think that you are being yourself, even if you feel
  472. 62:17 great because you identify with the collective, and you think that the collective is aggrandizing you, the collective is enriching you, the collective is making you better and stronger and more prestigious and more
  473. 62:27 powerful. By belonging to the collective, by affiliating with it, by being accepted in the collective, what
  474. 62:33 we call the ingroup, by being a member of the ingroup, you are actually elevating and improving yourself. Even
  475. 62:40 if you have this belief, it's fantasy. There's a cost. The cost is not being
  476. 62:47 yourself. So who is it that you're elevating? If the cost is not being yourself, who is it that you're elevating? And so, on the other hand, collectives
  477. 62:58 are much better at accomplishing goals than individuals. Much better. So if you belong to a collective, you will share the spoils. You'll have more spoils to share. You're likely to end up with
  478. 63:10 better outcomes. We could say that collectives are more self-efficacious.
  479. 63:17 On the individual side, you can be yourself, but this would limit your outcomes. Being yourself, being
  480. 63:24 authentic, adhering to your own belief system, we
  481. 63:30 call it a doxastic system, adhering to your own values, an axiological system, being unique, being idiosyncratic, being
  482. 63:41 totally differentiated, has an enormous cost in terms of self-efficacy.
  483. 63:47 It means that in all likelihood you will never obtain your goals.
  484. 63:53 All goals require the sacrifice of part of who we are in order to collaborate with others within a collective. The
  485. 64:04 collective could be ad hoc; the collective could last 2 hours, the collective could last 2,000 years. But if you collaborate
  486. 64:13 with a collective, you have to suppress, deny, repress, reframe who you are. You have to not be you to some extent. You have to not be you. If you insist on
  487. 64:24 being you, unadulterated, 100% pure, you will never ever accomplish your goals.
  488. 64:30 Never mind what your goals are. These are the facts. You will end up being the Unabomber, you know, who
  489. 64:36 was 100% himself, or Charles Manson,
  490. 64:42 who was 100% himself. Even people even leaders such as, for example, Adolf Hitler, Adolf Hitler was
  491. 64:49 100% himself, you see, and he accomplished goals and so on and so forth. Well, first of all, look where he ended and how he ended. But even that, of course, is not true.
  492. 65:00 Anyone who knows the history of the Third Reich and Adolf Hitler knows that Hitler had to compromise time and again
  493. 65:06 like a million times. He was never allowed to be himself fully, although he wanted to. He was never allowed by external forces and external powers, internal forces and
  494. 65:17 internal powers, the coalition he built. There's no such thing as being truly yourself,
  495. 65:24 faithful to your authentic voice and your, you know, authenticity. No such thing. Unless you're willing to accept that you will never amount to
  496. 65:35 much in life and you will never accomplish your goals, never mind how modest they are. Goals are accomplished
  497. 65:42 only via collaboration in a collective. Which is why our society is becoming
  498. 65:48 less and less efficacious, not more. We are not more efficient. We're less efficient all the time. Look
  499. 65:55 at the response to COVID-19. It was a highly inefficient response, with all the
  500. 66:02 institutions and all the money and all that. We actually reacted to COVID-19 less
  501. 66:08 efficiently than we reacted to the influenza of 1918
  502. 66:14 because we no longer work well in collectives. We no longer know how to collaborate.
  503. 66:21 the whole thing is less efficacious. Before, you literally, I mean, you just
  504. 66:27 said that there's no way to create perfect systems but I want to ask you like is there any way in which we could
  505. 66:33 construct a society or a world that is functional and doesn't go into any of these extremes like is that something that we can actually achieve or we're going to go extinct like the dinosaurs
  506. 66:45 you can accomplish homeostasis and equilibrium and functionality and efficacy for limited periods of time.
  507. 66:52 For example, I think the first 40 years of the European Union were an example where you had a mega system that
  508. 66:59 basically was working well and accomplished many goals and made the
  509. 67:05 lives of two, three hundred million people much better. So yeah, I think over limited periods of
  510. 67:12 time, yes, I would even give the example of China. China, let's say from the 1980s, early 1980s
  511. 67:19 up until Xi Jinping. I think that's an example of a period
  512. 67:25 where there was collective effort, not dependent on any single personality. There's no personality cult. There was
  513. 67:31 collective effort that lifted out of poverty between 300 and 400 million people. So I think that's an example.
  514. 67:38 However, if you're asking in the long term, all systems become unstable. They lose homeostasis and equilibrium and they collapse. And when they collapse, there is an immense suffering
  515. 67:50 on the individual and on the collective level. We cannot construct long-term
  516. 67:56 stable systems. I think it's a design feature. It's a feature, not a bug. I think in order to avoid stagnation and
  517. 68:08 to secure evolution because our evolution as a species is no longer biological
  518. 68:14 biological evolution is too slow. Yeah. Wait. So our evolution is cultural.
  519. 68:20 Cultural and societal. If you have if your evolution is
  520. 68:26 critically dependent on culture and society, you need to have unstable structures. Evolution by definition is unstable. We know for example that the main mechanism in evolution is what is called punctuated equilibrium. In other words,
  521. 68:44 jumps and starts, collapse and rebuilding, and so on. So evolution
  522. 68:50 is an unstable way of securing survival and relative advantages in an
  523. 68:58 environment because environments also change all the time. Instability is the key design feature of evolution. And
  524. 69:05 because our evolution is social and cultural, we need our societies and cultures to be unstable
  525. 69:12 so that we can evolve. There's a huge cost even in classical
  526. 69:18 biological evolution. There's a huge cost because evolution experiments and many of the experiments are dead ends and trillions of organisms die because
  527. 69:29 the experiment went wrong. Same with societies and cultures. We experiment. What is communism? An experiment. What is Nazism? An experiment. Fascism? An experiment. Liberal democracy? These are
  528. 69:41 all experiments on the evolution of the species. So it's good that we never reach a perfect
  529. 69:49 solution, or that we never have, like, a thousand years of stability. That's really a bad idea in my view.
  530. 69:57 Makes sense. Makes sense. Before you were mentioning how all of us are living in our own fantasies and you mentioned
  531. 70:03 that you yourself are living in the fantasy of science and and that's something that actually my friend wanted to ask you which is like especially when you're reading all these papers and studies on psychology how can you make
  532. 70:14 sure that all the conclusions that you're drawing are factually correct and they get closer to the truth.
  533. 70:22 Psychology is not a science. It's a pseudoscience. But, right, I have a PhD in physics, so I can
  534. 70:29 claim to be a scientist. Yeah. Uh, science is not interested in
  535. 70:35 the truth. That's not science. Science is not about the answers. Science is about the
  536. 70:41 questions. Science is the process of proving
  537. 70:47 that things are wrong. Not proving that things are right. Religion proves that things are right. That's religion.
  538. 70:53 Science is about proving that things are wrong. The main occupation, the main occupation of scientists is trying to prove that other scientists are wrong. Scientists spend like 90% of the time,
  539. 71:05 if not 100, proving other people wrong. That's why you have peer review. That's why you have scientific theories that
  540. 71:12 replace each other all the time. So the core activity of science is
  541. 71:18 to formulate the right questions because the belief is if you ask the right questions the universe will give you the right answers. So number one and number two proving any
  542. 71:29 answers we get wrong. That's the key. So science is never about the truth and never about facts. That's the mythology, the myth that people who are not scientists
  543. 71:41 believe. That's why they're angry when science doesn't have an answer or when science gets something wrong. They're
  544. 71:47 angry. Like you saw the angry people in COVID 19. Yeah. But you told us that we should not wear
  545. 71:53 masks. Why are you telling us now that we should wear masks? Because this is science. Science is about getting things wrong
  546. 71:59 most of the time. So this is why I say that science is a fantasy. Why do I say that science is a fantasy? Because in the vast majority of human
  547. 72:12 situations, human environments and human conditions, you do need a discipline that gives you answers and is right about the answers.
  548. 72:24 And that discipline can never be science. That's a mistake.
  549. 72:30 People think that science is going to solve all the problems. People think that science is, you know, that they
  550. 72:38 just have to ask a question and science will find the answer and so on. That is not science. In the vast majority of human situations, we need to ask something and
  551. 72:49 get an answer that we know to be true forever. That is the key to the success of
  552. 72:56 religion. The problem with religion is that the answers are idiotic and the whole thing
  553. 73:04 is a delusion. In other words, religion is a form of mental illness. So yeah, sure, if you're mentally ill, you can
  554. 73:10 believe, convince yourself that you have the right answer and it's true forever.
  555. 73:16 But that's not a real solution, is it? To become delusional. That's not a real solution. We need something like religion and unlike science.
  556. 73:28 We need a discipline that is like religion in some ways and definitely not like science, absolutely not, which
  557. 73:35 will give us answers that will be true, validly true, and we know how to verify
  558. 73:42 the truthfulness of a statement; we have tools in logic to do so. They will be true, and also we know that they will be true forever. If someone were to come up with such a
  559. 73:54 discipline, that would be the solution to 90% of the problems of humanity.
  560. 74:01 The other 10% technology can take care of. Science has nothing to do with any of this. Science is a luxury. If all scientists were to disappear tomorrow,
  561. 74:12 all physicists were to move to a planet in another solar system, believe me, you would not feel the difference.
  562. 74:19 you would not feel the difference. And do not confuse technology with science. Mhm.
  563. 74:25 Technology is an answer. Technology does provide you with facts and does provide you with solutions which hold true and could hold true
  564. 74:36 indefinitely. For example, if Apple doesn't develop any additional iPhone, you could continue to use the
  565. 74:42 iPhone 16 for the next 2,000 years. The solutions that technology provides
  566. 74:49 are eternal. Even in the stone age, the solutions were eternal. You could survive with these technologies to this very day.
  567. 75:01 So this is technology whereas science is transient, temporary, is not about
  568. 75:07 answers but about questions, and gets it wrong 100% of the time. This is the main thing. So people are confusing all this. They're confusing science with technology. They think science is like
  569. 75:18 religion. We give them the answers. Many scientists think that science is the new
  570. 75:25 religion in the sense that we no longer need to listen to religion, because it's stupid, and science will give you the answers: the God particle, or, you know, the Higgs boson, all kinds of complete
  571. 75:36 metaphysical nonsense you know and so but what is missing nowadays
  572. 75:42 which was there in the Middle Ages, in the ancient period: you
  573. 75:48 had God, you had answers. What is missing nowadays is a discipline
  574. 75:55 that will give you eternal answers, eternal solutions. Technology begins to
  575. 76:01 come close. And that's why you are beginning to have technological metaphysics. You're beginning to have philosophical
  576. 76:08 technology. When you talk to people like Elon Musk and Peter Thiel and so on, demented as they are and stupid as they are, which they are, they still try to merge philosophy and technology. And now, the only discipline that I see that can give you
  577. 76:24 eternal answers, factual solutions that will always be true. The only discipline is philosophy.
  578. 76:32 I don't know of any other discipline that can do this. Regrettably, because
  579. 76:38 of our hubris, we discarded philosophy. We mock it. We ridicule it. We think it's stupid.
  580. 76:44 You know, no one pays attention. Who studies philosophy? Who pays attention to philosophy? Philosophy is the only discipline that can solve 90% of your existential problems.
  581. 76:56 So, do you think that this discipline already exists and it's philosophy and we don't have to discover anything new? We just have to fix the one that we already have. Philosophy already has a huge repository of eternal answers. Yes.
  582. 77:08 that do not need rediscovery, do not need reframing, and do not need anything new. Definitely.
  583. 77:14 For example, we are I think we have a good eternal answer to the question what
  584. 77:20 is knowledge, for example. So yes, there are a thousand questions of which
  585. 77:26 science cannot answer a single one. Science cannot tell you what is the meaning of life. Science cannot tell
  586. 77:33 you what is knowledge and what is not knowledge. This is not the role of science, and science has nothing to do with it. But philosophy, of these thousand
  587. 77:41 questions, has in my view finalized the debate and has a final, eternal answer for 600,
  588. 77:47 600 of a thousand. That's a great ratio. And technology can take care of the
  589. 77:54 other 10%. Because sometimes you need answers and solutions which are technological, material. So the smartphone is
  590. 78:01 a great answer in my view, a great solution. The computer is a great solution. And it's not true that technology always has to evolve, always has to develop. That is fantasy. That is our
  591. 78:12 fantasy. As I told you, we could have survived to this very day with the
  592. 78:18 technology of the stone age. No problem. Yeah, we would have survived without uh iPhone
  593. 78:25 16. We did, didn't we? Until last year. Yeah.
  594. 78:31 To get a little bit philosophical, this is something that I wanted to ask you because you have this channel nothingness and sometimes you talk about
  595. 78:37 these philosophical and existential things. I wanted to very explicitly ask you what is your view on life and what
  596. 78:45 do you think its purpose or function is?
  597. 78:52 Another PhD I have is in philosophy, and that's not bragging; that is to tell you
  598. 78:58 how difficult it is to answer your question. I know, I'm aware. When you have a PhD in philosophy, you're exposed to multiple schools and it creates a mess. In my view, the more you know, the messier your mind is.
  599. 79:12 Um, I can only talk about my personal predilection, my personal view;
  600. 79:18 I'm not representing philosophy here, because in philosophy you have several, like 40 or 50, schools that give fascinating
  601. 79:25 answers to the question what is life for and what is the meaning of life or even what is life you know. Mhm.
  602. 79:31 Here's an example of how people get science wrong. They think biology is going to tell them what is life. Biology
  603. 79:38 is never going to tell you what is life. Biology can describe certain processes and mechanisms which are commonly
  604. 79:45 associated with what we call life. But life exactly like consciousness
  605. 79:51 is not a scientific concept and can never be captured in the tools of science. End of story. All the
  606. 79:59 physicists that are now discussing consciousness and all the information scientists that are now discussing
  607. 80:05 consciousness are doing bad science and even worse philosophy.
  608. 80:12 I'm not impressed. I was not impressed when Stephen Hawking started to discuss God and the meaning of the universe and
  609. 80:20 so on. I was not impressed. He sounded like an adolescent. Mhm. He was very shallow. That was very far from impressive.
  610. 80:32 I don't want to say that some of the things were completely stupid. So, similarly, the question what is life is a metaphysical question, not a
  611. 80:44 physical, not a scientific question. To answer the question what is life for, or
  612. 80:50 what is the purpose of life you must first make a distinction between purpose and meaning.
  613. 80:58 Purpose is innate, automatic,
  614. 81:05 and inexorable. Purpose has nothing to do with an observer, nothing to do with the mind,
  615. 81:14 nothing. It's just an inbuilt feature of the thing, whatever the thing may be. Now, for example, if you take an iPhone, the purpose of the iPhone is to allow you to send emails and make phone calls and what have you. And this purpose is not dependent on any
  616. 81:31 individual using the iPhone. The purpose of the iPhone is inbuilt and innate to the iPhone even in a world where all human beings would disappear.
  617. 81:43 And that's why we can extract the purpose even if we were aliens. So all human
  618. 81:49 beings disappear. An alien species conquers Earth and discovers my iPhone. Of course, it's my iPhone. Discovers my
  619. 81:56 iPhone. That alien species would be able to extract the purpose of the iPhone
  620. 82:02 100%. However, not the meaning of the iPhone. The purpose is what the iPhone does. The meaning is why the iPhone does this.
  621. 82:16 And so the aliens will be able to tell what the iPhone does, but they will have no clue why it has
  622. 82:24 these functions, because there will be no humans around to tell them about human society and
  623. 82:30 love and romance and sex and so they would have no clue. Yeah.
  624. 82:36 why it's doing what it's doing. Same about life. Life is an iPhone. I can tell you what is the purpose of life. That's easy. The purpose of life is to replicate, and, you know, we all know
  625. 82:48 what's the purpose of life. Why are we doing this? Why are we
  626. 82:54 replicating? Why are we alive? That's the deeper question. And so now it depends on which framework
  627. 83:02 you adopt. If you adopt the individual's framework,
  628. 83:08 so in order to answer the question why, you need to deploy your mind. You need to use your mind. You you need to be sentient. You need to be conscious. You need to have intelligence in order to
  629. 83:19 answer the question why the why question. Any why question. So that
  630. 83:25 raises an interesting thing: in the absence of a conscious,
  631. 83:32 intelligent observer, a sentient observer, life has no meaning. Because that's
  632. 83:39 exactly what I just said. Without such an observer, life has no meaning. It has purpose,
  633. 83:45 a goal, but no meaning. If, in the absence of an observer, it has no meaning, then the
  634. 83:51 meaning resides not in life but in the observer. Mhm. Life is meaningless.
  635. 84:00 Because if all observers were to disappear tomorrow, life would continue. It's inexorable. It's like machinery,
  636. 84:06 you know, but it would have no meaning. That's if you adopt the individualistic point of view. If you adopt the uh God's point of view, let's say something like
  637. 84:17 God, God's point of view, then life maybe is part of a bigger plan. What is
  638. 84:23 this bigger plan? Can you ever know this bigger plan? No, you cannot, because you are a finite being. You are
  639. 84:30 limited. Whereas God is unlimited, infinite.
  640. 84:36 Using your mind, you can know nothing about God. So you cannot decipher or decode God's
  641. 84:43 plan and life is part of God's plan. So you cannot decipher or decode life. I
  642. 84:49 think what I'm trying to tell you, in short, is that the question what's the meaning of life is a meaningless question, which cannot ever be answered, in principle. We call
  643. 85:03 these kinds of questions undecidable statements, undecidable theorems. I'll
  644. 85:09 give you another example of such a question that can never ever be decided. Does God exist?
  645. 85:16 I'll give you another example of such a question. What is consciousness?
  646. 85:22 These questions, what is the meaning of life? These questions in principle can
  647. 85:28 never be decided, can never be answered. There's no answer to this question. I just explained to you why. Because if it is dependent on an observer, then life is meaningless. If it's dependent on
  648. 85:38 God, you cannot find the answer. And so on. And I can I can create other frameworks and prove to you that there
  649. 85:45 isn't a framework that you can imagine where life would have a meaning. None. So the answer is as far as we are
  650. 85:56 concerned life is meaningless. But only because we are limited. We are
  651. 86:04 limited beings, as far as we are concerned. But we cannot be who we are
  652. 86:10 not. We cannot be God, for example. So it's not like you can say, okay, we are limited, but if there is a being who is
  653. 86:17 not limited, that being will find meaning in life. How do you know? You're limited; you can say nothing about an unlimited being. That's why
  654. 86:28 when I hear people say God wants you to do this or God, it's laughable. How do you know? How do you know God's mind? Is God's mind just an extension of your mind or maybe 10 times your mind or what
  655. 86:41 are you talking about? If God has a mind, which in itself is laughable if you ask me, but if God has a mind, it is
  656. 86:47 nothing like your mind. There's nothing you can say about that mind with any certainty. So, in good philosophy,
  657. 86:57 we isolate these questions, the undecidables and we never deal with them. We never
  658. 87:04 discuss them. So all the philosophers that you see that are doing like what's the meaning of life and what is God's mind, they are bad philosophers. They don't know philosophy or they are doing bad philosophy or they want to make
  659. 87:16 money or whatever. It's not good philosophy. Mhm. But there are some questions which
  660. 87:22 we have to accept we will never have the answers to, in principle. They're unanswerable.
  661. 87:33 There's one question that I wanted to ask, completely off topic. I think we should make this the last one,
  662. 87:39 because it's one and a half hours. No one will watch that long. So let's close it with this one. Um, which relates to narcissism. Uh, there's
  663. 87:46 something that I was discussing with a friend which is like if a narcissist
  664. 87:52 could press a button to stop being a narcissist, would he actually press it? Would it make sense for him to press it? No. Imagine that I go to Donald Trump and I
  665. 88:05 say, "Don, Don, because we are, you know, very close friends. I call him first name first name basis. Don, I said to him, it's time we had a chat heartto-heart. You are a narcissist.
  666. 88:17 You have a severe mental illness called narcissistic personality disorder, and I want to treat you. I believe I can make
  667. 88:24 some progress with you. Donald will look at me and say, why? What for? I'm a
  668. 88:31 multi-billionaire. I'm twice president of the United States. I've had the most gorgeous women on earth. I have lived
  669. 88:38 I've lived a lucky life. I'm perfectly happy with myself. I think I'm the greatest person to have ever lived. Give
  670. 88:45 me one good reason to push the button. Why would they push the button? But aren't there narcissists who have suffered massively, and have not? But it's not because of their narcissism,
  671. 88:56 according to them. It's because of other people, usually. This is known as alloplastic defense. Narcissists blame
  672. 89:02 other people. So other people were envious. Other people were too stupid to understand the narcissist. other people were simply malevolent or whatever. So
  673. 89:09 it's always other people, other institutions, the situation, fate maybe, you know, whatever. And even when the
  674. 89:17 narcissist acknowledges that he has had some contribution, he didn't have a choice, he had to do it.
  675. 89:23 Or if the narcissist accepts that his actions were wrong, he hurt people, harmed people or whatever, and so
  676. 89:31 on, he would rarely take responsibility. Um, he would say, okay, I'm willing to
  677. 89:38 modify my antisocial or abrasive behaviors or I'm willing not to harm people anymore. But that proves that I'm
  678. 89:46 great. That proves I'm altruistic and charitable, and look what a wonderful, moral person I am. So we have a type
  679. 89:54 of narcissist called the pro-social narcissist, the communal narcissist. That's a narcissist who goes around and says, look how
  680. 90:01 great I am. I'm a do-gooder. I'm altruistic. I'm loving. I'm compassionate. I'm charitable. Many of the gurus, online and offline, they
  681. 90:09 are actually pro-social narcissists. There are studies, starting in 2020, studies that
  682. 90:16 show that the majority of the leaders, uh, sorry, not the majority, there are studies
  683. 90:22 that show that there is an infiltration in the leadership of social justice
  684. 90:28 movements, an infiltration of narcissists and psychopaths. People like Greta Thunberg, who is a rank
  685. 90:35 narcissist I mean complete narcissist is an example of course. So all these social justice
  686. 90:43 movements, at the top, have many narcissists and psychopaths who claim to do good, who claim to care about the collective,
  687. 90:51 that want to make society better and so on and so forth. So no narcissist will
  688. 90:57 push a button. But what they may do, they may redesign the room or redesign the machine and then claim that they're
  689. 91:04 the moral people. They are the charitable ones. They are altruistic. They are activists. They are full of
  690. 91:11 love and compassion, Mother Teresa, and so on and so forth. And then it's
  691. 91:17 you who dares to ask the question. You dare to ask a question.
  692. 91:24 That proves that you are pusillanimous. You are not
  693. 91:31 charitable. You lack empathy. You are not altruistic. You are not moral.
  694. 91:37 You're stupid. Too stupid to understand. You're envious. And so so they will demonize you. They devalue you if you
  695. 91:44 dare to suggest that they should push the button. And Donald Trump would end the conversation with me saying, "Sam, I
  696. 91:52 understand that you envy me. I understand that, and I know that perhaps you are not
  697. 91:58 as intelligent as you think you are, and you can't grasp my greatness, my
  698. 92:04 contributions to humanity and so on so forth. That's okay. We are still friends. Don't worry. We're still
  699. 92:10 friends because I forgive you. That's how the conversation would end. Okay. Well, thank you very much, Sam,
  700. 92:17 for doing the interview. It was an absolute pleasure. I I really really enjoyed it. Thank you. So did I. And I
  701. 92:24 hope you forgive me. No, that's completely fine. I love your answers. Regards to your friends, dear friend.
  702. 92:30 Okay. Okay. Thank you. Take care. Bye-bye.

