Tip: click a paragraph to jump to the exact moment in the video.
- 00:16 Sam, I’ll reiterate it’s a boon to have come from Ganetva, not far from Tel Aviv
- 00:22 to Scopia to meet you and hear about all these interesting concepts.
- 00:28 Being a grandio narcissist, I fully agree with you. I couldn’t have said it better.
- 00:35 Okay, so now we’re talking about the following topic. the dangers and promises dangers on the one hand and
- 00:42 promises of extended virtual and augmented realities from cities to the
- 00:49 metaverse. Floor is yours. What I’m referring to is the process of
- 00:55 uh virtualization. There is a general retreat, general escape from what we what what we called
- 01:02 in our previous conversation the the preferred privileged frame of reference which is reality.
- 01:08 We talked in our previous conversation about the reality. The reality the one that is
- 01:14 the real reality the reality where you have no valition. You are in it. You’re immersed in it. You are it’s directly accessible and
- 01:21 it’s unmediated as opposed to simulations. simulations which require technology of some kind or at the minimum an act of will a decision to enter the simulation. Okay. So there is a general tendency to move
- 01:35 from reality to simulations that’s true generally started with the cinema not with not with computers the theater or the theater but theater was not that immersive in
- 01:47 the sense that it didn’t require an act of dissociation as the cinema does. That’s why when the first movie was was uh projected on the screen, it was a a
- 01:58 train a train coming into a station, right? People ran away. Ran away. They were panic. They were in panic
- 02:04 because they thought the train was going to run over them. You you can’t do that uh in a theater.
- 02:10 So I think actually the cutoff is the cinema. We started to seriously evade and avoid reality when the cinema start and then it became of course with computing it became an enormous trend.
- 02:22 And now we have uh unleashed upon us the metaverse which we will discuss in a minute. So I call this process virtualization. But virtualization
- 02:34 to started in my view even earlier let’s say 7 to 10,000 years ago when we moved
- 02:41 from uh villages and farms and agriculture and the land and the soil we move to cities. Cities are simulations in
- 02:53 effect. Cities are totally artificial creations. They are they they are not
- 03:00 they are much less real than when you are in nature where you’re working the land when you’re growing your own food
- 03:06 and so on. In a city you you in inhabit
- 03:12 confined spaces and within these spaces you can make belief that you are not
- 03:20 dependent. Everything comes to you. The food comes to you from the countryside and so on. So we already in in
- 03:27 urbanization we already have the rudimentary primordial elements of virtualization. a
- 03:33 retreat from nature, a retreat from reality, a retreat from the land into spaces which are brainchildren. These
- 03:42 spaces are brainchildren of architects. They are they are actually translations.
- 03:48 Wow. Translations of the minds of architects. So which is a good definition of simulation by the way. Okay. So we went
- 03:57 from agriculture to cities and that that created a major psychological revolution.
- 04:03 Because when you are in agriculture, you need to have a specific psychology and when you move to the city and the city is the dream or the brainchild of
- 04:14 an architect. In effect, you move into a dream state. Your psychology changes in
- 04:20 a city. Two or three examples in agriculture. You need to have a very well-developed sense of time. You need to follow the seasons. you know to you need to know
- 04:31 when to to seed and when to plant and when to to sow when to reap reap and so on when to harvest. So time is of crucial importance a lot of time awareness in agriculture. Second thing in agriculture you need to delay gratification. You put a seed in the
- 04:48 ground you need to wait. You can’t just immediately reap the reward. You need to have a lot of patience.
- 04:54 Yes. You need to in short in a in agriculture
- 05:00 you you pay for the consequences of your actions. There’s a direct linkage between your actions and the consequences of your actions. And it takes time. It takes time and it takes patience
- 05:12 and planning. Planning and investment and commitment and patience and so on. What do we call all these? Maturity. In agriculture, agriculture forced upon
- 05:23 you maturity. You were mature or you were dead. These were two options. Mature or dead. You can’t run run run a farm without being a farmer. You Yeah. And to be a farmer, you need
- 05:35 to be highly mature or you’re dead. Simple dead in I mean like dead like you don’t have what to eat. So, but cities
- 05:44 changed the psychology of people because they had immediate rewards. They could go to a grocery store and buy bread.
- 05:51 They didn’t need to to to plant. They didn’t need to wait. They didn’t need to reap. They didn’t need to harvest. They
- 05:57 just went to the grocery store and bought a bread, a loaf of bread. There was a time when they bought flour
- 06:04 and flour and made bread. But even that it’s that that changed even that is 1 hour. It it’s definitely
- 06:10 not 6 months or 7 months. So the horizon the time horizon was compressed became compressed and level of maturity
- 06:21 um deteriorated. People became much more infantile. They became much more dependent in the city. The city fosters
- 06:29 in you total dependence on many many um on many agents.
- 06:36 Yes. On the suppliers of food, on suppliers of water, you name gas.
- 06:42 You’re totally dependent electricity. I mean, whatever it is, you’re dependent. The organizing principle of cities is
- 06:49 dependency. The organizing principle of agriculture is self-reliance.
- 06:55 Simple fact. So the psychology change of course because you adapt to your environment. We will talk about it when
- 07:01 we talk about culture. You adapt to your environment. The psychology change and
- 07:07 author auth the the farm is authoric.
- 07:13 And another thing happened in the cities. the unnatural elomeration of
- 07:20 human beings in one location which was essentially a dreamscape someone’s dream the architect or whatever yeah Lafayette when Lafayette designed cities it was totally his dream state he he designed these wide avenues and you know architects have a huge influence on on
- 07:38 our habitat and so we inhabit architect’s minds including this room is
- 07:44 someone was once someone’s dream or fantasy, you know. So when when this
- 07:51 elomeration with this crowding started, people felt the need to be noticed. They
- 07:57 felt the need to be seen. In a typical agricultural community, everyone knows everyone, of course, and you are seen by everyone all the time. In a city, no one sees you. No one
- 08:10 notices you. So you develop a compulsion to be seen and to be noticed and your
- 08:16 behavior escalates as you try to attract attention. Now why do we need to be seen? Because it’s a survival thing.
- 08:24 Babies need to be seen by mommy. If they’re not seen by mommy, they die. So the need to be noticed is primordial to to call their their mother. They they cry. They cry to to be noticed. To be noticed. And what is what is what do we do on social media? We cry. We cry.
- 08:41 You cry out for for um the language tells you this. Crying out
- 08:47 loud. Yes. You’re in social media. You are crying out loud. You’re infantilized. You
- 08:53 become a baby again. You want mommy world to notice you. It’s instinctive.
- 08:59 It’s reflexive. It’s not, you know, it’s not mediated via. It’s just what we need to be seen and noticed. It’s it’s very
- 09:06 basic. So this is the city. Imagine virtualization
- 09:12 from farm to city had this massive impacts on us. Imagine what’s going to
- 09:18 happen when we transition from cities to the metaverse. The metaverse is a much more profound form of virtualization. It’s going to
- 09:30 have much more profound psychological impact. What What is the met? What is the met? I knew you would ask. I thought you’d never ask. Okay. A metaverse is a combination of
- 09:42 technologies. Mhm. Which provide online simulations which you can then inhabit using specialized
- 09:49 uh devices and technology at this stage. But probably in 2030 years, you wouldn’t need these devices. Everything will be uh Wi-Fi through the air. But right now,
- 10:00 right this very second, to inhabit these simulations, you need goggles. You need haptic gloves. You need all kinds of
- 10:06 things. And then if you do wear this equipment, it’s wearables. If you wear this equipment, you are able to totally
- 10:13 access the simulation. And you have no interface, no contact with reality. You’re utterly inside the
- 10:19 Are you alone there? You could be alone. You could be with other people. And these other people, they also have to wear these accessories. Yes. Everyone has to wear the same accessories. And you can share a space, a simulation space. They don’t have to
- 10:31 be in the same room like you. One can be in Thailand, one in Israel, one in Russia and all three of you can be in the simulated and everybody knows in his or her mind that that this happens that that’s what where chammers is
- 10:43 wrong. Yes. Okay. Everyone has to wear these things, make a decision, turn on the computer. This is not reality. It’s
- 10:50 not reality by any stretch of the of the word. Anyhow, the transition from farm to cities was a
- 10:57 was virtualization because we inhabited someone else’s mind. What is a simulation? Someone someone is designing
- 11:04 the simulation. Someone is coding and programming the simulation. It is another person’s brainchild. It’s
- 11:11 another person person’s fantasy and dream. So when we move to far from farm to cities, we moved into architectural
- 11:18 fantasies, architectural virtualization. when we are now we’re going to move from
- 11:24 cities to metaverse. We are going to move into a programmer’s dream or a coders’s fantasy. Okay. The psychological revolution that happened when we move to agriculture from agriculture to cities is nothing
- 11:39 compared to the psychological revolution that will happen when we all finally finally move into the metaverse which is
- 11:47 a question of time. and and and you’re probably thinking of further infantalization
- 11:55 infant utter but I’m much I’m worried even more by other things for example the metaverse is so sol soypistic in the sense that in the metaverse you are
- 12:06 totally self-sufficient you do interact with other people but you don’t need them and sometimes you
- 12:13 don’t want them so other people become commoditized they become like avatars.
- 12:20 They become like representations, symbols, u game elements, figments. So, solypism.
- 12:28 Second thing, the metaverse will encourage you to be even more self-sufficient than you are now.
- 12:35 Here is the thing. The more self-sufficient you become, the less you tend to interact with people. It’s been
- 12:42 proven now beyond any doubt. People interact less with other people.
- 12:48 If they can avoid it, if they can. And the more you uh you avoid, the more
- 12:55 you tend to avoid. It’s it’s escalating. It’s self-perpetuating. It’s addictive. Yes. So self-sufficiently leads to
- 13:03 asocial behavior. Not antisocial, not criminal, not but asocial. Not necessarily. Not necessar can be antisocial, but asocial definitely in the sense that you will avoid people. you you your needs to
- 13:17 interact with other human beings will be fully gratified uh via the metaphor. Even if you want to
- 13:24 have sex with someone, you will have sex alone in your room wearing a suit, a
- 13:31 suit, a physical suit that simulates the sex, touch, feel, smell and so on.
- 13:37 So you will not really need other people. We already see this happening already. We are seeing this happening when huge swaths of humanity are totally isolated,
- 13:48 automized. How long does it take? I mean, can you stay there? You don’t you have to eat, you have to drink. No, you don’t. No, the metaverse is a total solution in the Well, you have to eat and drink, of
- 13:59 course, but the metaverse is a total solution in the sense that your workplace will be in the metaverse.
- 14:06 Your company will open a site in the metaverse and you will go to work there. What will you produce?
- 14:12 We anyhow we produce nowadays something like 80% of the economy is manipulation
- 14:18 of symbols. Anyhow what is an accountant? He manipulates symbols. What is a lawyer? They manipulate symbols. So
- 14:26 today uh 2% of the population is engaged in agriculture too
- 14:32 in the developed world. in the develop not in agriculture even by the way even in in not developed world less developed worlds we we are already talking about 25% compared to 80
- 14:43 and 90% only only 40 years ago so clearly physical professions
- 14:49 professions which which deal with the manipulation of physical objects one way or another industry agriculture they’re
- 14:56 dying they’re disappearing but we have we need iron we need we need robots tables robots robots. A typical A typical person in a Toyota factory, a single person produces 100 cars when
- 15:12 per per produces 100 cars per day. Per day per day when when uh only only for 40
- 15:21 years yes in the ’90s in the 80s I’m sorry only 40 years ago you needed 100
- 15:27 people to produce 100 cars. 100 cars. So the difference is robotization and autom and and
- 15:34 automation. So roboticization and automation and computerization and so on so forth they will take over most professions and we will begin to manipulate symbols.
- 15:46 So already for example the video game industry is much much bigger than the
- 15:52 cinema industry much bigger. people spending times instead of going to watch
- 15:58 a movie, they spend times playing sta PlayStations to be able to Why? Because the video game is much more
- 16:06 simulation than the movie. It makes you active. It really makes you active. You’re in it. You influence the
- 16:12 movie. Yes, it’s a simulation. You control the environment somehow. You even control the plot. By the way, many video games
- 16:18 allow you to decide which what is the plot where where the video game is going. So the metaverse will encourage
- 16:26 you to disconnect from humanity completely and you will work in the metaverse, have sex in the metav, shop
- 16:32 for fashion in the meta versse, do everything in the met except physiological. Speaking about your wanting to be
- 16:40 noticed, is there element there? Of course, because in the meta versse, you could be you could be anything you want. You can be a rockstar, you can be a stripper. There’s an application
- 16:50 called VR chat where where unfortunately adolescents go and they strip and and have group sex and
- 16:57 it’s like the seven lives of Walter Mitti but u intensified immersive immersive in the sense that
- 17:03 you are in it wholly and truly and totally and so these are the out these
- 17:10 are the this is the the metalist. Now there are some philosophical issues here
- 17:17 very deep philosophical issues unfortunately at this stage not not well noticed.
- 17:24 All the tech first of all all the technology until 19 the 1990s all
- 17:30 technologies were about extending the human body. You name one technology and I will show
- 17:37 you how it extends the human body. The sword extended your hand. The boat extended your your hands when you swim. The I mean the car extended your legs. I
- 17:48 all technologies were extensions of the body or the mind or the brain which is also part of the body. In the 1990s for
- 17:56 the first time we have transitioned from technologies that extend the brain the the mind and
- 18:03 the body to technologies that allow us to to evade and escape from reality. So
- 18:09 today majority of technologies are about avoiding reality are about escaping from
- 18:16 reality. That’s the first thing and a battle a war is erupting. It’s not a war
- 18:23 about how you experience reality because all previous technologies were about how you experience reality. For example, consider the internet.
- 18:34 You have a browser. Yeah, you have a browser. What is a browser? A browser structures the way that you experience
- 18:41 the internet. It is through the browser that you experience the internet and the browser has limitations and specifications. So browser tells you how to experience the internet. Similarly,
- 18:52 cinema, similarly all other technologies, they they structured your experience, including the travel
- 18:58 industry, including transportation, all of them structured your experience, structured your reality,
- 19:05 told you how to experience reality. The new the new technologies are not about
- 19:11 how you experience reality. They are about who owns reality.
- 19:17 If I own a simulation, I own your reality. I don’t only own how you experience
- 19:24 reality, but I own your reality. You’re coming into my reality. When you are using my simulation, you’re entering my reality. So I I I’ll see if I understand. If if if I choose to go to Italy via Alital Italia, they have their plans of of
- 19:41 shipment of of flying and and I know that I go to them and they’ll fly me to
- 19:47 Rome or to Napo. Yes. But they don’t control Rome. They don’t control the act of traveling. They
- 19:53 don’t control your decision to travel. They control very little. But they do structure your reality. They do because they tell me how how
- 20:00 long it’s going to take. So they control your experience. How many stops on the way? So they control your experience. Yes. But in the future, Alitalia will own Rome.
- 20:10 In what way? In this simulation. In this Oh, in the simulation. In this analogy. In this this analogy. When you come to my simulation,
- 20:18 I own this reality. I I am your reality.
- 20:24 So this must be the danger, right? Because it’s a huge danger because we’re talking about dangers and promises. Yes. Because it’s a danger because there will be people and corporations who will own reality. First time in human history
- 20:37 will own reality. That’s one danger. Second danger, they will it will be their interest to blur the boundaries
- 20:45 between reality and simulation. They would want you to spend more time in the simulation because more time you spend
- 20:52 in the simulation, more money they’re making. So they will structure the simulation to blur and to make it addictive to make it addictive and to blur the
- 21:03 borders boundaries so that you will no longer be able to tell you will be like in a constant trip
- 21:09 constant drug ace you know you longer be able to tell uh which is which they are
- 21:15 also going to narrow reality what they’re going to do is a process called twinning twinning is when a simulation
- 21:23 borrows borrows elements from the privileged frame of reference from reality. Simulation borrows elements
- 21:29 from reality and then pretends that these elements belong to the simulation
- 21:35 not to reality. Give give me an example if you can. Well um imagine that um imagine that you
- 21:44 again you want to read a book. Okay, imagine you want to read a book. Reading a book is an experience in reality.
- 21:52 Obviously the simulation will take the book make you believe that you are
- 21:58 sitting near a physical table and reading the a physical book and then claimed that this experience always had belonged to the simulation is not is a simulation specific and the books inside if if I want to read Chaucer I’m crazy I want to read
- 22:14 Chaucer will I get Chaucer in the simulation they will have all the all the all the already you have all the all physical
- 22:20 books available online So, but you will not be able to tell the difference. You will feel that you are really s sitting
- 22:26 at a physical table reading a real book and gradually you will begin to associate this experience with a
- 22:33 simulation not with reality. They will appropriate reality and convince you
- 22:41 that they’re delivering this to you, not reality. And this is called twinning. It’s a very dangerous process. And
- 22:47 finally, of course, it will create addiction in some people. Not everyone but many people. I think we are talking
- 22:54 about 30 40% of population become addicted and we already know from studies that exposure to simulations and
- 23:01 screens and enhance I mean increases depression and anxiety in people. We
- 23:07 know that we have studies by twe and others that um the more exposed you are
- 23:13 to simulated states and screens the more you are likely to develop depression and
- 23:20 anxiety actually among users of social media in a period of 10 years only
- 23:26 anxiety has gone up 500%. Wow. And depression has gone up 300%. And this is only with social media. Only social media which is not not simulation. It’s not you know that you’re in Facebook, but it encourages a certain divorce from reality because that’s why they’re using
- 23:41 words like friends. Ah, friends. You know, I’m going to make a comparison that is very distasteful. When the Nazis
- 23:48 took the Jews to Awitz, they put them in a bath. They told them they’re going to have a bath. Shower. A shower.
- 23:55 A shower. Shower. Bad. They told them they’re going to have a shower. Calling someone on Facebook a friend,
- 24:02 someone you’ve never met. is this Nazi technique of mislabeling and misnaming
- 24:08 things with the intent to deceive. So a friend is a well-defined figment of
- 24:16 reality that Facebook had appropriated it not a figment sorry a real a real thing it’s it’s not a figment element of reality yes it’s an element of a friend is a real thing
- 24:26 it’s a real thing it’s an element of reality that Facebook had appropriated this and now when we say friend we think
- 24:33 actually more about Facebook than about reality when I say friend many I don’t know you and I we consider
- 24:39 friend as a friend but if we go if I go to my go to my granddaughter maybe she
- 24:45 consider friends almost for sure when you say friend she will think of Facebook or some other platform so they had appropriated this element of reality and they made this element there does but deceitfully
- 24:58 because a friend of Facebook is not a friend in reality as well it’s just a stranger he may be a friend he may be but he’s not in most cases I have 5,000 uh friends on Facebook how
- 25:09 many maybe maybe a hundred maybe 200, maybe 300, 200, 300, maybe a thousand thousand.
- 25:15 The other 4,000 are they’re not friends and yet they called and yet they’re called friends. They’re passers by. It’s like the burden in Awitz. I’m sorry to say. Yeah. Okay. So, you told you talked about I think you talked about the
- 25:27 dangers. Are there also promises? Well, the promise is that some things can be delivered more efficaciously. So
- 25:34 for example, work probably will improve through the metaverse because collaboration will be more integrated
- 25:40 and more efficient. Efficiencies I think mostly that’s I the only the only thing
- 25:46 I can see is efficiencies plus of course there are segments of a population for example disabled people. Uhhuh. So for them the metaverse will be a blessing will allow them to travel all over the world because tourism will be a
- 25:56 big thing in metaverse. They will allow it allow allow them to have sex. So it
- 26:02 will open up the world to mentally ill people, disabled people and so on. It’s a segment of the population. It’s not without its merits and without its blessings. But if it is left to its own devices and the usage is not limited and
- 26:15 restructured, we are in enormous danger as a species. In enormous danger, it’s a serious threat in my view. All right. Thank you so very much, professor. It’s an absolute pleasure to
- 26:27 sit with you today and talk to you and get your insights. My name is Erikica Liss. I’m the founder of Peace Post and
- 26:34 really eager to learn a little bit more about your insights pertaining to AI and
- 26:40 how narcissism and AI relate together. So, I’m going to start off and basically
- 26:46 ask you a little bit about how you really came to the conclusion of
- 26:53 how AI and narcissists really relate. Well, first of all, thank you for having me. It’s very courageous of you. I’ll try not uh not to abuse this opportunity. So,
- 27:09 artificial intelligence and pathological narcissism are actually two forms of
- 27:15 crowdsourcing. They use what may easily be described as
- 27:21 large language models. At least the type of artificial intelligence which is commercially available, the retail type,
- 27:28 CH GPT and so on. More serious artificial intelligence works otherwise. Doesn’t utilize large
- 27:36 language models but does utilize databases and so on. Both narcissism or both narcissists and
- 27:44 artificial intelligence programs. They’re both hive minds.
- 27:50 They’re not individual in any sense. They represent the sum total the elomeration and the accretion
- 27:58 of information or opinions or inputs or reactions or whatever um of many many
- 28:04 many participants sometimes too many to enumerate. The narcissist for example regulates his
- 28:12 sense of selfworth and his sense of identity such as it is by resorting to
- 28:19 feedback from other people. He then amalgamates this feedback, tries to
- 28:25 impose on it a narrative which would render it somehow cohesive and then on the fly he proceeds to adopt this input as the contours of an identity. This
- 28:37 process is known as narcissistic supply. Artificial intelligence basically does the same artificial when I say artificial intelligence I want it to be clear. I’m referring to the kind of
- 28:48 artificial intelligence which is public facing commercially available
- 28:54 retailized by the likes of Google and Facebook and Chad GPT and open AI and all these I am not referring to much
- 29:03 more serious artificial intelligence programs in use in scientific endeavors in space exploration I’m not referring
- 29:10 to this so there’s a question of hidemite second
- 29:17 Artificial intelligence, the retail version at least, places emphasis on impressing people.
- 29:25 It’s an impressions management software. For example, artificial intelligence
- 29:31 programs such as Chad GPT are not concerned with the truth.
- 29:37 Absolutely not. They frequently hallucinate. They very frequently give the wrong
- 29:43 answers, wrong information. and so on so forth and they’re not concerned with it doesn’t bother them. It doesn’t worry these programs or the programmers. What this kind of artificial
- 29:54 intelligence is trying to do is to impress you with their linguistic capacity to pass off as human beings in other words to succeed in the Turing
- 30:06 test. So it’s an impressions management approach which is a great way of
- 30:13 encapsulating pathological narcissism. Narcissism is about impressions
- 30:19 management. It’s not about communic communicating. It’s not about veracity. It’s not about
- 30:26 factuality. It’s not about truthfulness. It’s about impressing you, captivating you, acquiring you as a source of supply, an admirer, a fan, whatever.
- 30:37 Similarly, if you were to ask Chad GPT anything, you are quite likely to get the wrong
- 30:44 answer, but it’s going to be given to you in a way which would greatly impress you, which would sound a lot like a human being. That’s the second element. The third
- 30:55 element, of course, is the absence of empathy. Empathy
- 31:02 has three components. reflexive, cognitive, and emotional effective.
- 31:08 Both narcissists and artificial intelligence programs possess cognitive empathy.
- 31:14 Artificial intelligence programs are programmed to pretend that they are empathic. Mhm.
- 31:21 Similarly, the narcissist pretends that he or she is empathic by using or leveraging cognitive empathy, what I
- 31:28 call cold empathy. A narcissist abuse this capacity in
- 31:34 order to spot your vulnerabilities and break through the chinks in your armor in order to somehow manipulate you to do
- 31:40 their bidding. So do psychopaths. Mhm. And so does so do artificial
- 31:47 intelligence programs. It’s exactly what they do. They pretend to be empathetic. They pretend to be sensitive. They
- 31:54 pretend to be politically correct. They pretend to be acutely aware of what is
- 32:01 socially acceptable and what is not, what should be said and what should not, what is right and what is wrong. But
- 32:07 it’s all fake. Of course, it’s all of a sudden. There’s no real empathy there because as
- 32:13 far as we know, there are no emotions or effects which are the only motivators
- 32:19 for genuine empathy. These are only three of many many many facets of artificial intelligence and narcissism in common. The commonality is staggering.
- 32:32 If you take into one last point before we both drop dead out of of old age. One
- 32:38 last point. Um everyone is ignoring the elephant in the room. The elephant is in the room is the mental health profile
- 32:50 of the people who come up with these inventions. Mhm. For example, social media has been
- 32:58 created by people who are who are schizoids, people with schizoid, probably schizoid
- 33:04 personality disorder. other other high-tech inventions and
- 33:11 gadgets and devices were invented or created or imagined by narcissists, rank
- 33:17 narcissists such as Steven Jobs. So, the high-tech industry is the
- 33:24 brainchild of mentally ill people. And and I’m not saying mentally impaired
- 33:31 or mentally challenged. mentally ill people. Narcissism is a severe mental illness. So is schizoid personality disorder. That’s why it’s called it’s called schizoid because it’s very close
- 33:43 to schizophrenia. So these are mentally ill people who
- 33:49 keep coming up with these new technologies and to ignore the fact that this is the brainchild of mentally ill people is you know counterproductive and
- 34:00 self-defeating. Of course, we live in an age where
- 34:06 everything is relative and it’s only a question of differences, not a question
- 34:12 of ill and healthy. You know, neurode divergent, we won’t say mentally ill,
- 34:18 it’s neurode divergent and and all this kind of nonsense. But artificial intelligence
- 34:25 at least the public facing applications bear the hallmarks of people who are not mentally well. I must say yeah I think it’s really important to at
- 34:36 least acknowledge that the fact that this is a very vast technology that’s
- 34:42 getting into a lot of people’s hands and it absolutely is created to be as addictive as possible. So when we look
- 34:48 at algorithms, they’re meant to really create that, you know, wanting to pull
- 34:54 the the jackpot lever every single time. But we also have to acknowledge the fact that there are tools out there that are
- 35:01 going to be here. And so is there a way to utilize those intelligently in a way
- 35:08 that can be done ethically to help people? And so that’s where I’m coming
- 35:15 at this to try to solve. We’ve got a whole subset of individuals that are
- 35:21 really lacking the self. And you’ve really touched on the point exactly where I was hoping we would go is they
- 35:27 have the cognitive empathy where the cold empathy where they can understand what sadness is. They can understand tears. They can understand. So you have the sentiment analysis basically what
- 35:38 the narcissist may do or someone devoid of a self but they don’t understand that. And that can be very similar to
- 35:45 what AI does. So it has the sentiment analysis. It’s able to understand like a written or verbal or um any sort of
- 35:52 communication then it extrapolates out. So in your could you hypothesize a way
- 36:00 that AI might be able to hyperfocus or augment empathy in communication where you have someone that’s basically devoid
- 36:11 of true empathy in a communication style because at the end of the day you have a
- 36:18 narcissist that is either going to try to get supply and so what are they trying to do? They’re trying to elicit
- 36:24 response. They’re trying to manipulate. They’re trying to do a lot of things. Could AI be utilized intelligently and
- 36:31 responsibly to potentially help victims that are on the receiving end of that.
- 36:38 AI is good in uh in pattern recognition. Mhm. Much better than human beings are because of the infinite capacity of AI. AI models are capable of collaborating
- 36:50 with each other and there’s no quantitative limitation on how many AI models work together
- 36:57 while the human brain is limited. There’s the famous Dunbar number. We are limited to collaborating with 150 other
- 37:04 people. Mhm. When we exceed this number, our brain shuts down because AI models are capable of tapping the databases, information,
- 37:17 language models of other AI applications and so on so forth. It is possible to
- 37:24 create a network of AI which would easily spot fake or feigned empathy,
- 37:32 for example, via linguistic analysis. Mhm. And yes, this kind of AI could alert the
- 37:38 user that on the other end there is someone who is faking it, someone who is feigning it, someone who is not genuine,
- 37:46 someone is who is you know manipulative. Mchavelianism for example is easily spottable actually even without AI. We
- 37:55 have very powerful tests and they assign what is known as a Mac number mavelinism
- 38:01 number and these tests are very rigorous and and highly validated and and can be
- 38:08 easily administered by people. Now an AI interface could administer a machavelianism test whenever approached whenever used.
- 38:19 We could even we could even agree or decide that AI would administer a
- 38:25 battery of psychological tests to any user prior and would create a psychological profile, a psych profile of the user. We and that would be a precondition for using the technology. Um, and then exactly like on a dating
- 38:42 service, you would have the psych profile of the user and some people
- 38:48 would be thrilled to interact with narcissists and psychopaths. It’s not that narcissist and psychopaths
- 38:54 would become immediately outcasts and paras and no one would talk to them. On the very contrary, I mean there are
- 39:01 people who are who are elated to correspond and fall in love with serial killers. It takes all kinds.
- 39:08 But informed decision making is the key. If you know what you’re getting into,
- 39:15 then that’s where the technology stops and adulthood begins. Responsibility and
- 39:21 accountability. The problem right now is that sophisticated users of computer
- 39:28 technologies, and it’s not limited to artificial intelligence, for example, social media. Mhm. Sophisticated users can pretend to be anyone they want. And it is extremely easy to mislead
- 39:41 people because the vast majority of people are dumb and gullible.
- 39:47 This is known as the base rate fallacy. People believe between 90 to 95% of the
- 39:53 statements they come across without bothering to check, without exercising critical thinking, if they’re at all
- 39:59 capable of exercising critical thinking. Mhm. Now everything I’m saying is of course politically incorrect and it’s
- 40:05 not because I’m trying to impress you or the viewers because this is what I really believe. I believe I hold a very
- 40:13 dim view of technological empowerment of the masses very dim.
- 40:19 Mhm. I think we have given babes in the wood. We have given them weapons, guns.
- 40:28 The modern technology is a very powerful weapon and we’ve given it to people without training them, without qualifying them, without selecting them, without nothing. It’s available in the
- 40:41 wild and it’s exactly like a virus. When there’s a new virus, new, totally new,
- 40:48 the population is susceptible. It has no immune response and then the virus kills millions of people. Modern technology is such a virus
- 41:00 and it has been unleashed upon the unsuspecting and the susceptible and we are adding to that we’re adding
- 41:06 crime to injury and that is artificial intelligence. In short, unlike you, I don’t believe
- 41:14 that the solution is maybe trying to protect the masses or maybe trying to but I think the solution is denying access to these technologies altogether
- 41:25 until we have implemented the kind of education and training that
- 41:31 would qualify people to use these technologies. Maybe through a process of licensing, maybe I mean you need a
- 41:37 license to have a gun. Why don’t you need a license to use artificial intelligence?
- 41:44 So, social media, for example, is a is a
- 41:50 great case in point. Social media was released to the to the
- 41:56 masses, to the public, without any warning or preparation or training or qualification or education or anything.
- 42:02 It’s just released. The outcomes are beyond disastrous. beyond disastrous. Tuen and Campbell, for example, in their studies have demonstrated that social
- 42:15 media specifically has caused a quintupling of depression
- 42:21 rates and a tripling of anxiety rates among teenagers and a massive rise in
- 42:27 suicides among teenagers. And that was 2018.
- 42:33 Situation now is much worse. We see of course extremism. We see fake news. We
- 42:40 see misinformation. We see radicalism. We see violence. We see aggression. We
- 42:46 see uh incitement to murder for example in insult communities among you know
- 42:52 fundamentalist Muslims online. And so the we social media has been weaponized completely by people even even normal even healthy people even even you know
- 43:04 your next door neighbor. I’m not talking about the fundamentalist in Afghanistan or Iraq. Everyone is is now weaponizing
- 43:11 social media. The level of aggression and hatred and negative effects such as envy and so on on social media have they
- 43:18 have skyrocketed and have hijacked the applications. Today you cannot use social media without being exposed to
- 43:26 hatred, aggression often directed at you for no reason whatsoever,
- 43:32 without being exposed to crazies and crazymaking, without being exposed to haters and and hate mongering. There’s
- 43:39 no way to use these applications safely. They’re unsafe. They’re unsafe.
- 43:45 Why is that? Because everyone was given access. I can completely understand a lot of the
- 43:51 hesitations that you’re facing with that and especially it’s readily available. Apple just deployed their 18 um their
- 43:59 latest version I think 18. Yeah, they haven’t yet but they’re about to in a week or two. Yeah. Yeah. So in the states they have their
- 44:05 Apple AI built right in. So that basically becomes mass deployment across
- 44:13 all Apple users in the states for those people that upgrade their operating system. So it def definitely becomes,
- 44:19 you know, something that becomes readily available to even people that may not have, you know, been familiar with some
- 44:25 of the large language models, but it’s now on their phones. And so, you know,
- 44:31 this is a very big shift to steer and I completely understand a lot of the hesitations and I mean, I’ve watched
- 44:37 some of your other videos concerning, you know, basically some of the dangers that can come about this and at the end
- 44:45 of the day, you know, some of these technology leaders, it it is getting implemented into everyday usage and so I
- 44:53 don’t know I don’t understand I don’t understand one thing. Mhm. There is a machine, a device. It
- 45:00 cost a few hundred bucks. It allows you to edit genes. Mhm. To edit genes, to create new animals, new interesting animals, fun animals, you know, or to introduce genes from one
- 45:13 animal to another animal or from animals to plants or to, you know, play around in your basement and have great fun.
- 45:20 And yet it is forbidden to sell this machine to to people who are not qualified.
- 45:28 Actually only universities purchase these machines. These are crisper machines.
- 45:34 That’s the correct way to go about it. Why is it that when it comes to biotechnology,
- 45:40 we are really really careful and responsible adults. We do not allow people to clone organisms. Although this
- 45:48 can be done in your living room nowadays, we do not allow them to sequ sequence genomes. Although this can be
- 45:55 done in your toilet nowadays, they’re tiny machines that you know do everything nowadays within minutes. Yet
- 46:03 we don’t allow people access to these technologies. Absolutely not. Similarly, when it comes to weapons,
- 46:10 with the exception of the United States, in the civilized world, we don’t allow people access to guns. Definitely. We
- 46:16 don’t allow them to 3D print guns. We don’t allow them to have ghost guns, you
- 46:22 know. We just don’t allow it. The fact that the technology exists does not automatically imply the right to use it. Definitely not the mass right to use it.
- 46:34 And yet, the only exception is the most powerful technology of them all. far more dangerous than nuclear energy. And that is artificial intelligence and the internet. And
- 46:47 it’s not clear to me why this technology should be singled out for universal access when this is the technology of
- 46:54 the intellect. Mhm. If you have a gun, you can kill one person four on average and mass killing
- 47:01 is four. But you if you have access to specific internet technologies, you can kill millions, thousands, hundreds of thousands. It’s it’s not clear to me why this discrimination
- 47:12 and probably the only reason is money. There’s a lot of money in it. It’s the
- 47:18 only reason. I think there is definitely a a giant push with data and so understanding and
- 47:28 trying as individuals we do the best that we can to minimize sending out
- 47:34 personal data and all of that because the it can be a very powerful tool like
- 47:40 I I and weapon if you want to go that far because I don’t want to diminish
- 47:46 what you’re saying because everything that you’re saying is true. The fact that at the end of the day it has the
- 47:53 potential to manipulate images. It had it can tell stories. It can tell a whole
- 48:04 whole pictures that may not exist. And so
- 48:11 it is in the wild. It is one of those things where we can’t, you know, basically put it back the get the cat back in the bag.
- 48:22 The the question becomes it’s there. Is there a way
- 48:29 to utilize it for the betterment of humanity? And that was one of the things
- 48:35 that I’ve been pondering tremendously because I noticed a potential with at least sentiment analysis.
- 48:47 I noticed a challenge when you had individuals that struggled, whether they were in trauma bonds or in struggling in situations in which they were unable to
- 48:59 break free of the love or the situation or they were forced to communicate with
- 49:06 someone that’s potentially toxic. And is there a way to utilize like from
- 49:12 a communication standpoint, you know, you recommend no contact, best easiest
- 49:18 way to deal with that toxic person because at the end of the day, you’re going to get manipulated. You’re forced to communicate with them. How can you do it? So I was trying to solve that with creating
- 49:31 a application that would do sentiment analysis that would help educate people
- 49:37 on the process with grey rock methodology to basically shut down future
- 49:43 communication to stop basically
- 49:49 the the snowball effect where it will just turn into a giant avalanche where
- 49:55 the convers ation, everything kind of goes sideways. And so AI, unfortunately or fortunately, however you look at it, is here. Are
- 50:08 there intelligent ways to apply this that can potentially help people? And at
- 50:14 the end of the day, that’s how I’m trying to utilize it for the betterment of humanity.
- 50:22 As we said, AI would be good at analyzing especially linguistic patterns
- 50:29 and it is through language that we can detect narcissists and psychopaths with
- 50:36 high reliability and high validity very early on. Mhm. And so AI could be could provide alerts or be a kind of
- 50:47 sensor which would inform you early on that you’re faced with a narcissist or a psychopath and could give you then full
- 50:54 information what what is a narcissist narcissistic behaviors signs to look for
- 51:00 counter behaviors your reactions how to tailor the relationship or how to avoid the relationship altogether and so on so forth. So AI can do can do both. It could detect the person facing you early
- 51:14 on and then it can provide you with all the information you need including tips and advice and and tailor or customize
- 51:21 your behavior according to the specific circumstances that you that you describe and so on so forth. It could be in short
- 51:29 it could be a kind of guide u by your side. It could be. So I guess the question I have for you is in general, how do you view toxic, manipulative individuals?
- 51:41 How do they respond to the grey rock method and basically being shut down in
- 51:48 in in general communication?
- 51:54 Well, depends. Narcissists would usually lose interest. If you demonstrate your your lack of potential as a source of narcissistic supply, the narcissist walks away. Narcissists are focused on
- 52:06 on one goal and one goal only and that is to secure a regular predictable
- 52:13 uninterrupted flow of attention that helps them to regulate their internal environment. If there are
- 52:20 disruptions in the flow of attention, regardless of the reason, by the way, it doesn’t have to be gray rock. If there
- 52:26 are any disruptions in the flow of attention, they lose interest in the source of the supply. So for example, if
- 52:33 you’re sick, if you happen to be sick and because you’re sick, you’re you’re unavailable.
- 52:39 Mhm. Or you’re in a hospital and so therefore you cannot. So they would lose interest in you within days. Even if you have
- 52:46 spent 20 years together, Mhm. they would lose interest in you. Your utility, your value rests exclusively on your ability to
- 52:57 provide not only supply but also services,
- 53:03 sex and um safety, your your very presence. I
- 53:09 call these the four S’s. So if you provide two of the four S’s, you’re still valuable. If you provide three, you’re very valuable. If you provide four, you’re a unicorn.
- 53:20 You’re amazing. But two two would do. Two two are enough. So any disruption
- 53:26 and interruption to this flow render you useless. Immediately immediately the narcissist devalues and discards you in his mind
- 53:37 and moves on to the next potential source of supply. Starts to cultivate alternatives
- 53:44 and so on. So gray rock is a very powerful technique. Very powerful technique. The problem is
- 53:50 of course not so much in what to do once you have identified a narcissist.
- 53:58 There is a there are quite a few techniques. Gray rock is only one of them. Mhm. There is about eight techniques which are equally as powerful as gray rock.
- 54:09 But the problem is to identify a narcissist and to identify even more so to identify
- 54:15 a psychopath. Psychopath psychopaths act. They’re great actors. Mhm. And narcissists believe their own confabulations and fantasies and stories and promises. And
- 54:28 so because narcissists believe what they are saying, it’s very difficult to spot them. It appears to be real and genuine
- 54:34 and authentic. And because psychopaths are goal oriented, highly manipulative, and very
- 54:40 good at at modifying other people’s behaviors and expectations, psychopaths are also undetectable. I think AI’s main
- 54:48 contribution would be detection actually detection rather than I mean also maybe
- 54:55 provide all kinds of techniques and so on but the detection would be important. Yeah, I think detection definitely has promise. I think we’re some ways away from that without having true human insight and oversight into a lot of
- 55:11 these things because you can even have true therapists that have been doing this for years get fooled. So if we’re
- 55:17 going to have an unmititigated machine that’s going to be looking at you know written communication as the only
- 55:24 baseline to diagnose someone somebody I think that becomes very dangerous and a slippery slope because you know you can
- 55:30 look at all the studies where you know they drop people off at Stanford and then they get uh diagnosed with you know
- 55:36 all kinds of uh diagnosis that didn’t exist. I think you’re and you alluded to
- 55:42 it earlier like AI can be susceptible to biases. It can have hallucinations. It can, you know, and and that’s why there needs to be responsible implementation of these technologies. And so it’s not a
- 55:55 onesizefits-all. And you can’t just basically swing a hammer and say we’re going to solve everything with this. But
- 56:02 if we can start to take chunks of this and I think through your decades of research, you know, you’re starting to
- 56:09 piece together like techniques to understand help victims get to a place where they are able to identify, solve,
- 56:18 and move forward from that. And so if we can, I think, take little slivers of
- 56:24 this and start to, you know, how do you eat an elephant? It’s one bite at a time. If we can start to do that with
- 56:30 just communication, I think there’s tremendous potential at least from a
- 56:36 altruistic perspective hopefully um in putting this technology to good use.
- 56:43 And so I guess the next question I have is, you know, you did mention, you know, using AI to identify these things, but then or individuals. I don’t know if we’re quite there yet, but if we were to
- 56:56 hypothesize if narcissists or these cluster B personalities have typical
- 57:03 um tracks or patterns in which they follow, is it possible we can start to identify where somebody is within the cycle? Like are they going to be in the,
- 57:14 you know, love bombing stage or are they in any any other stage? So, I’d love your insights there.
- 57:21 The reason many diagnosticians and clinicians fail to properly identify
- 57:28 narcissists and psychopaths is because they pay attention to too much information. Interesting. They pay attention to body language. They pay attention to words. They pay
- 57:40 attention to expressions and micro expressions. They pay attention to context. They pay attention to family members. They pay attention to the literature. They pay attention to videos by Sakni. and so on. That’s too much information. I think AI could be laser focused. And
- 57:57 if I had to select a single thing which has excellent predictive value and high
- 58:03 validity when it comes to diagnosing narcissists and psychopaths, it would be language.
- 58:10 I think we could pretty easily actually design a Turing test for psychopaths and
- 58:16 narcissists the same way there’s a Turing test for computers. Computers mislead you into believing
- 58:22 that they’re human beings by passing the Turing test. That’s exactly what narcissists and psychopaths do. They mislead you into
- 58:28 believing that they are human beings by imitating, emulating, mimickry, pretending to be human beings.
- 58:35 But psychopaths and narcissists are not human beings because they miss critical modules
- 58:42 which in in the absence of these modules there’s no humanity. Mhm. When you when you miss when you don’t have emotional effective empathy when you don’t have access to positive emotions like love when you have no sense of self because the formation of
- 58:59 yourself has been disrupted in early childhood. when you are callous and ruthless to the
- 59:05 point that you objectify people, reduce them to props in your theater play and
- 59:12 so on and so forth. When you put all these together, what’s left is not a human being. What’s left is a great
- 59:19 simulation of a human being. Mhm. And indeed, as as you mentioned in our correspondence,
- 59:25 there was this roboticist Masahiro Mori in Japan who suggested
- 59:32 appreciiently prophetically suggested in 1970 that the more computer the more
- 59:38 robots come to resemble human beings, the less comfortable we’re going to feel around them.
- 59:44 This is known as the uncanny valley. That’s what narcissists and psychopaths do. They imitate they they they imitate
- 59:52 they simulate human beings but they’re not. Now that’s very helpful
- 59:58 because the only way we judge the humanity or lack thereof
- 60:04 of another person is via language. We rely on self-reporting.
- 60:12 I have no way to prove that you are a human being. No way whatsoever.
- 60:18 I have to rely on your self-reporting. If you’re telling me you’re sad, I have no machine or device or test or probe that can prove that you’re sad. I have
- 60:29 to rely on your self-reporting and either I trust you or not. In other words, language is a great arbiter.
- 60:37 Language is the the infinite detector of internal states.
- 60:43 It’s a bad detector in many many cases because people lie, prearicate, fantasize. There are major disruptions
- 60:50 to the communication of internal states. But it’s still the only tool we have. Now AI is vastly superior to human beings in analyzing language and
- 61:02 language patterns. Vastly superior. And this is where detection of
- 61:08 narcissists and psychopaths could be raised up to the next level because psychologists and psychiatrists and other types of clinicians they are not good at analyzing language. And the reason they’re not good at analyzing language is that language triggers in
- 61:25 them associations. When I talk to you and I would say the
- 61:31 word mother, that’s not an objective neutral word. The minute I say mother,
- 61:37 it triggers in you memories, emotions, pain, love, I don’t know what. And this
- 61:45 this is noise. It obscures the signal. This will never happen with an AI
- 61:52 program. If I tell the AI program mother, that’s it. It’s a lexical. There’s
- 61:58 lexical meaning. there’s interconnectivity with other things and but it’s all the time objective and
- 62:05 neutral and so there’s not the level of noise in AI is much lower when it comes
- 62:12 to verbal verbal communication level of noise in AI is much lower than in human beings. Therefore AI would be better in my view at spotting narcissist and
- 62:23 psychopath. So then you’re saying we’re clinicians and and people that are
- 62:29 diagnosing are getting too much data. So you’re saying we need to simplify it and not only too much data but the data
- 62:35 triggers noise. Ah triggers emotions, triggers memories, triggers, you know, they’re not good
- 62:41 machines. Clinicians are not good machines. So then as a followup then how
- 62:47 can you differentiate if we’re going to use psychopaths someone that would actually manipulate for their own use
- 62:53 versus like confabulation where you would have a narcissist create and fill in all the gaps. So that might be a
- 63:01 challenge that machine um or um data scientists and actual programmers may
- 63:08 have the challenge to understand like is this real? Is this lies or is this
- 63:14 filling in the hole? No, not really. Because the conviction of the narcissist in the
- 63:21 veracity of the fantasy or the confabulation shines through. The narcissist, for example, is is likely much more likely to use words like belief or I’m convinced or or true
- 63:35 or so is likely to use words that uphold the truthfulness of what he’s saying.
- 63:41 Whereas a psychopath is much more likely to use machavelian manipulative words such as I would like you to or I want to
- 63:48 or so the the psychopathy and the narcissism shine through the language.
- 63:56 I can read which is the optimal way. I could read a text and tell you if this text has been written by a narcissist or a psychopath. However, if I were to communicate with a
- 64:07 psychopath face to face, even via Zoom, the noise would be much higher and I may I may get it wrong.
- 64:15 And what artificial intelligence does is, excuse me for a minute, will lead
- 64:21 someone is ringing the bell. Mhm. What artificial intelligence does is it is exposed to the to the semiotics not always to the semantics and never to
- 64:34 the noise. the the if I tell artificial intelligence mother there’s no memory there’s no association there’s no pain there’s no love there’s no nothing it’s just mother that’s a huge advantage
- 64:47 it’s a huge advantage there is actually
- 64:53 there is almost no other way to diagnose narcissists and psychopaths clinicians rely on body language for example
- 65:04 but we know that the body language is common to people with narcissistic style.
- 65:10 Um, clinicians rely on um kind of a displays of callousness and
- 65:17 ruthlessness and one track mindedness as proof of psychopathy.
- 65:23 But that’s not validated. That’s not true. It’s very common to even to healthy people under certain
- 65:30 circumstances. Only language exposes narcissists and and psychopaths infallibly. Only language and there AI has advantage.
- 65:41 Okay. So that’s great to know. So I guess the question becomes then is there I I understand you can have narcissistic tendencies and then you can have narcissistic actual like diagnosis.
- 65:54 Okay. So we we need to distinguish the two. Then we also need to look at like the spectrum if you want of narcissism
- 66:00 where you can have covert narcissism versus grandiose. And so would you feel
- 66:07 that there’s going to be a difference between having you know the different types or flavors or however you want to describe it because the mechanisms you know the core
- 66:20 wound is the same but they’re going to present differently. Yes, of course
- 66:26 they’re going to present different. Essentially, you’re talking about rendering AI
- 66:33 a kind of personal therapist. Therapists constantly at your fingertips available
- 66:40 to you. So, what does a therapist do? Therapist diagnosis, then therapist. Therapist
- 66:46 provides you with insights, especially insights about yourself. And then a therapist provides you with good
- 66:53 techniques and tips and advice on how to behave in order to minimize harm and maximize utility. That’s what good therapists do. Artificial intelligence is capable of doing all three given sufficient constraints and rigid
- 67:10 you know control and so on. It is capable of doing all three with a pronounced advantage in the diagnosing stage because of its relationship with
- 67:21 language which clinicians don’t have. Clinicians may be better in the tips and
- 67:27 advice phase because clinicians can empathize, clinicians can you know understand. Clinicians clinicians are
- 67:34 human. So this gives an advantage when when it comes to interacting with victims and telling them how to for
- 67:40 example recover or how to how to modify behaviors in order to avoid similar
- 67:46 situations in the future and so on. So so their clinicians would have the advantage and I think the best solution
- 67:53 is a combination of clinician and AI. In other words, AI used by a clinician
- 68:00 to interact with clients and patients and so on. AI as a tool as an instrument at the disposal of recognition. Yes, each type of narcissist narcissism presents differently but again like everything else in human life they all mediated via
- 68:17 language. So for example the covert narcissist is likely to be passive
- 68:23 aggressive. Passive aggression is usually mediated
- 68:29 via language or via actions that are kind of sabotaging intended to sabotage
- 68:36 something, undermine something. The overt narcissist, the grandio
- 68:42 narcissist is likely to be in your face, much more open about his beliefs about
- 68:48 himself that he’s, you know, perfect and omnisient and omnipotent and so on.
- 68:55 And all this all this can be reduced to a set of
- 69:01 algorithms and and analytic models that would spot the types
- 69:07 uh pretty safely with a very high validity. That’s what we that’s what we are trying
- 69:13 to accomplish with psychological tests essentially and we keep failing because psychological tests rely on the goodwill of the participant.
- 69:24 If you refuse to to, for example, if you refuse to respond honestly to a psychological test such as the narcissistic personality inventory or
- 69:35 the PCLR, then they’re useless. They rely critically on honest
- 69:41 self-reporting. So, they don’t analyze language, they analyze content. And
- 69:47 that’s a common mistake. By the way, there’s a common confusion or conflation
- 69:53 of content and language. Content is not language. Content is message. Content is signal. But it’s not it’s not language. While AI is focused on language,
- 70:06 clinicians are focused on message or content. And narcissists and psychopaths are brilliant at manipulating messaging,
- 70:18 manipulating messages and signals, but even they cannot overcome the inherent
- 70:24 limitations and structure of language. So I wouldn’t be too worried about the various presentations of narcissism.
- 70:31 They’re all reducible to the same set of relatively primitive and well-defined
- 70:38 criteria, which is behavioral but mediated via language.
- 70:45 That makes a lot of sense. And so we we really need to break it down to its root, you know, the root of what’s being said. night. That’s exactly why I think AI is definitely has an advantage
- 70:58 because it breaks it down to the root of all the words of the sentiment of understanding what is being said. And so
- 71:07 if you were forced to communicate with somebody and you’ve suggested multiple times you’ve got a personality disorder
- 71:13 that’s stunted in development between the age of three and 11, and we’re going to say 11’s very generous, but they are
- 71:20 stunted emotionally. How can one communicate with somebody that is of that developmental, you know, stunted growth? Is it something that, you know, you and I can have a communication and we’re aiming for the same goal and it’s to understand each other’s viewpoint and
- 71:36 talk about this. What if it’s different? What if it’s, you know, manipulation or what if it’s other things and you’re forced to communicate with them? How does the the fact that they present at such a young age change how you need to
- 71:48 communicate with them? Have you ever communicated with a child? All the time.
- 71:54 That’s it. That’s the answer. Children are manipulative. Children are egocentric. Children are disempathic. They lack emotional and I mean a certain age up to
- 72:05 a certain age. They lack emotional and effective empathy. Children are narcissists. Yeah. Even Freud recognized it. He said there’s primary narcissism and secondary narcissism.
- 72:16 children and then later on in life, adolescence, especially early adolescence, they are narcissists.
- 72:23 And so, everyone who has ever communicated with a child is perfectly equipped to communicate with a
- 72:29 narcissist. Mhm. The problem is that unconsciously we make the erroneous assumption that
- 72:37 narcissists are adults. Mhm. Even when narcissists attend therapy, the vast majority of clinicians
- 72:45 treat them as if they were adults. They try to strike a therapeutic
- 72:51 alliance with the narcissist. They try to negotiate with the narcissist. They try to compromise with narcissist. They
- 72:57 try to reason with the narcissist. They try to show demonstrate to the narcissist their insights. They treat the narcissist as an adult. Narcissists are not adults. the overwhelming vast majority of
- 73:08 narcissists are between the ages of two and three mentally and psychologically speaking. So people say but wait a
- 73:14 minute then how would they be capable to run a big company or even a country you know it’s nothing to do with it this is nothing psychological mental age has
- 73:26 nothing to do with your skills or capacity to for example have semantic memory memory of processes. Mhm. So Narcissists
- 73:39 are children who are in charge of countries. They are children who are in
- 73:45 charge of corporations. They are children in show business. They are children in law enforcement. But they are mentally children. When they are confronted with situations which do
- 73:56 not involve emotions, they are perfectly capable. They have at their disposal all the
- 74:03 skills and the me and the kind of memory that known as semantic memory. The kind of memory that is very good at you know
- 74:09 doing things accomplishing things. But whenever they are confronted with emotion, stress, anxiety, tension,
- 74:18 crisis, demands, criticism, disagreement. Whenever
- 74:24 they’re conf confronted with these situations, they regress instantly and become immediately children. They throw temper tantrums. They’re incapable of
- 74:35 predicting the consequences of their actions. They have no perception of time. They’re utterly children. So if
- 74:43 you want to communicate with the Nazis efficaciously, simply wrap your mind around the realization that it’s a child. It’s very difficult
- 74:54 to do because they look to be they look grown-ups. You know that they they are children in adult bodies. We tend to
- 75:02 confuse body chronological age. Uh we tend to confuse chronological age with mental age. And so and that’s a huge mistake and that is the source of the frustration and the hurt and the the prolonged grief
- 75:19 of victims because they have made the assumption that they were assumption they were dealing with adults and then suddenly a
- 75:25 child hurt them and it’s difficult to take.
- 75:32 Whenever victims attach to narcissists they attach to the child. It’s an it’s it’s a maternal attachment. People of both genders, male or female,
- 75:44 even if you’re a male and you see a baby, you smile and you cool and you are protective of the baby. Yeah. Even men
- 75:51 become maternal when they’re faced with a baby. So the narcissist triggers in
- 75:57 all of us maternal instincts. Then to let go of this child is
- 76:04 difficult. It’s always difficult to let go of a child. And so maybe we lie to
- 76:10 ourselves that this is not a child, this is an adult in order to avoid the grief and the hurt and the pain later on. But
- 76:17 it’s not working. It’s not working because the narcissist triggers our inner child. It’s a childto- child
- 76:24 interaction basically. It’s a playmate kind of thing. It’s it’s a very complex uh dynamic. But to your question, the answer is simple beyond beyond
- 76:37 belief. Simply assume that it’s a child and proceed accordingly. End of story.
- 76:43 You don’t need complicated books and therapy sessions and and interviews. That’s it.
- 76:49 I I think you’re absolutely right. And I think one of the biggest issues is you do have someone that is stuck in grief.
- 76:56 So, if someone has gotten to the point where they recognize that this is a toxic situation or a relationship and
- 77:03 they have they’re dealing with a cognitive dissonance where they can recognize, I love this person cuz I did
- 77:10 X, Y, and Z thing that was good or they were kind during these times, but then they recognize there’s a lot of
- 77:16 negativity. There is multiple factors. So, it’s like you’re having somebody that needs to deal with those wounds, those core wounds, understanding and and and
- 77:28 basically healing that aspect. But then if you’re forced to communicate with somebody and you’re seeing the glimpse
- 77:35 of the good, the bad, I think it can prolong the hearing of the healing process. And so having something and if
- 77:43 we were to slice it up, if we’re just looking at the linguistics aspect of it and say, “Okay, we’re only going to worry about written communication or we’re going to worry about that and we need to just basically separate out
- 77:55 because a narcissist is going to want to have that supply. They’re going to want to have that safety. They’re going to want to drag you back in the hoovering
- 78:02 as you coined the term. And so having somebody be able to work with licensed
- 78:08 therapists and professional to help them understand the trauma bond, get healthy,
- 78:15 but also use a tool to help them separate out from this situation, I
- 78:22 think has the potential to really help a lot of people. Anything that puts a mirror to you, anything that
- 78:33 allows you to look at yourself, to see yourself as you are is always helpful. That’s at the core of therapy. Psychotherapy is about providing you with insight about yourself that you’re
- 78:44 incapable of generating on your own. So any instrument, AI instrument, soft
- 78:51 software program, there was a software program called Eliza in the 60s. It did the same. It wasn’t artificial
- 78:57 intelligence, but it was a simulation of a therapist of a psychotherapist. It’s called Eliza. It’s very successful. To this very day, you could use Eliza. I think it’s available online. It’s stunning. It’s
- 79:09 like a therapist. So the the problem with um the problem with
- 79:17 when you team up with a narcissist one way or another, you interact or you react to the narcissist on so many levels
- 79:28 that the to extricate yourself later on becomes self-sacrificial.
- 79:36 It becomes a an act of self- emulation. When you interact with a normal person,
- 79:42 even if you fall in love with someone, there’s an intimate partner, you have a relationship and so on so forth, there
- 79:49 is a part of you that is preserved in a pristine way. There’s a part of you that is untouched by the partner which is very healthy, very good. There’s a part of you that is that remains you, never mind what happens to the partner and what happens to the
- 80:06 relationship. It’s not the case with narcissist. With narcissist is a takeover and it’s
- 80:12 not always a hostile takeover. The narcissist truly believes in the shared fantasy. He truly believes that
- 80:18 he loves you. He truly believes in these promises to you. He truly wants this to work. He believes that you are enabling
- 80:26 him to experience for example love or you know so he’s euphoric throughout
- 80:33 this is euphoric. This is known as narcissistic elation. It’s an oceanic feeling.
- 80:40 And you respond in kind. You react to the narcissist as a mother,
- 80:48 the maternal part. You react to the narcissist as the realization of all your dreams. Your dreams, your dream come true. You react to the narcissist because of the fantasy. The fantasy is an escape from reality which is very tempting in today’s world. You react to
- 81:05 the narcissist as the one finally found your soulmate or your twin flame,
- 81:11 whatever you want to call it, your compliment. You know, everything in you responds to the
- 81:17 narcissist and every single part of you interacts with every single part of the narcissist. Ultimately, you find
- 81:23 yourself enshed. You find that you have become a single organism with the narcissist.
- 81:30 So to let go of the narosis is to let go of you is to self amputate.
- 81:38 It’s extremely painful. The grief is multifaceted. You grieve for the narcissist. Of course
- 81:45 when you when you separate when you break up you grieve the narcissist loss of the narcissist. You grieve the loss of the fantasy. You grieve the loss of of yourself because you’re no longer you. You grieve the loss of the child as
- 81:59 a mother. What could be worse? You’ve just lost your child. You grieve the loss of a parent because the narcissist plays plays a role of a parent in the in the relationship as well. This is the dual mothership thing. And so all these
- 82:15 layers of grief interact re in re reinforce each other. There’s an amplification of grief, magnification of grief. And the worst part is this.
- 82:27 Following the breakup, your grief is the only thing that makes sense of your life. Your grief is the only thing that imbuss your life with meaning, gives you a
- 82:39 reason to to survive. Even because the alternative is self annihilating. When you are immersed in grief, you’re busy doing something. It keeps you
- 82:51 alive. And so grief becomes professional.
- 82:57 It becomes um a vocation and an avocation. And that’s why you see online millions of victims mourning and grieving for 10
- 83:10 years, 15 years, 20 years. I’m kidding you not. They can’t stop. They can’t stop this. Victimhood have has become their identity. Mhm. And it’s it’s kind of identity politics if you they become they become professional victims.
- 83:27 It it also caters to some extent to grandiosity and so but we’ll leave that aside.
- 83:33 Also don’t forget that as long as you grieve the narcissist is somehow in your life.
- 83:39 It’s a way to stay in touch with the with the representation of the narcissist in your mind. It’s like he’s
- 83:45 never gone away. is you have never lost him because you’re still craving for him. It still occupies your mind in a
- 83:52 way. This Hi there. In 2021, Facebook rebranded to Meta and for me coming from an IT
- 83:59 industry, this was an exciting release. At the same time, I was very curious about what it meant for us. Will
- 84:07 Facebook come with a metaverse soon? Will it change the world as we know? Is reality going to be replaced by algorithms? and most importantly how does it impact our society. During this
- 84:19 time I met a very interesting person who provided me with a lot of insights and answered a lot of my questions about
- 84:26 metavverse. In this episode I would like to share that conversation with you. Hope you like it. Thank you Sam for
- 84:33 doing this. I have been absorbing your information and listening to um your talks on various topics through your YouTube channel. So, it’s really a pleasure to finally meet you in person.
- 84:45 My pleasure. Thank you for having me. You seem to have survived my talks. That’s rare. I have not just survived. I have I think I’ve grown wiser. So, um for the for the the sake of my
- 84:57 audience, just a quick introduction. I I would like to um uh make a note here. Sam, you seem to be
- 85:03 a person of various faculties. You are a professor of psychology and finance from
- 85:09 CAS CAPS center for international advanced professional studies. You’re also a professor in psychology from the
- 85:17 southern federal university um in Rosto V on Russia. Um you’re also a former
- 85:23 senior business correspondent um for UPI and oh former tech analyst for various
- 85:29 online medias. And last but not the least, you’re also a writer and a publisher. and all right this by the
- 85:36 tender age of 61 imagine no pressure on me so um Sam I come from a technical IT technology and services background and
- 85:47 um so hence my natural curiosity uh on this topic for today metaverse uh
- 85:53 especially when last year Facebook rebranded to meta and suddenly it became
- 86:00 a buzzword in our circle and I started to explore you You know as we say I had a formal fear of missing out. What is
- 86:07 meta? I honestly did not have a very good understanding and as I as I was exploring many content as all as usual I bumped into stumble into some of your um podcast or I think some um dialogues on this topic uh which was very interesting
- 86:22 from um your perspective and hence I thought it’ll be great to have this conversation hearing your perspective on
- 86:28 metaverse um through different filters from technical aspect to uh a psychological
- 86:34 aspect to you know um in general a human and mental health perspective. So when
- 86:41 we when you hear metaverse, what is metaverse according to you? Well, we can start with a simple technical definition and then we can maybe try to embed it in history because nothing nothing that
- 86:53 people do is divorced from context. The context is usually historical.
- 86:59 We need to look back to understand the future. Technically, the metaverse is a series
- 87:07 of interconnected digital spaces. These digital spaces provide you with a
- 87:16 simulation of real life experience via devices
- 87:22 such as goggles, haptic suits, and so on so forth. So, you would need to buy special devices. You can’t just like in
- 87:29 the internet you can’t just have a smartphone and do it but you need to experience it. This is what we call extended reality or
- 87:36 mixed reality. Uh the metaverse would try to confuse us in the sense
- 87:43 that it would try to blend or blur the boundaries and the lines between what we
- 87:49 hitherto called reality and the future um technologies. So virtual reality,
- 87:56 augmented reality, extended reality, mixed reality, they’re going to lead to lead to a stage in I think no longer no
- 88:04 longer than 10 years where you would have serious difficulty telling apart what is really happening in the world out there and what is being simulated
- 88:16 for you as an experience. In this sense, the metaverse
- 88:22 is about who owns reality. It’s a power grab for reality. It’s a power It’s a It’s an attempt to define for you all the possible ways
- 88:34 and potentials for you to experience reality. Until now,
- 88:40 you experience reality in an idiosyncratic way. Each one of us experiences reality differently because
- 88:47 we are different people luckily. But what the metaverse would do, it would narrow down the possibilities of experiencing reality because you would be dependent on a code. You would be
- 88:59 dependent on a program. You’ be dependent on a platform. And never mind how brilliant the platform is design
- 89:05 brilliantly it’s designed. Never mind how how many creative people are involved in in the in the in the coding
- 89:11 of the platform. Ultimately, it’s limited. So this would narrow down experience and
- 89:19 narrow down reality and in this sense would blend what
- 89:25 hither do we call reality with a digital equivalent. This is known as as twinning. Mhm.
- 89:31 So we would have digital twins and some people will opt to spend the
- 89:39 bulk of their lives in the virtual version in the dig digital version and this is
- 89:45 of course very reminiscent of the matrix and some people would would adhere to mixed reality. They would spend some time outside the the simulation some sometime inside the simulation. I
- 89:57 mentioned that it is a series of digital spaces. There would need to be some seamless connection between these digital spaces if we want to give the user the illusion that he is not leaving
- 90:09 reality or that he doesn’t have to log in log out and and all this kind of things. So it sounds like it would be a
- 90:15 digital world where we work, we play, we hang out like and I I’m I’m glad you mentioned matrix the reference of matrix
- 90:22 because for the for the nontechnical people and even to a large extent for me who does
- 90:28 not understand the the the deep coding and programming and technical aspect of it. The first thing that came to mind when we started to hear the buzzword is reference to matrix. So this was my
- 90:41 connection to the concept metaverse when I first heard of it. The matrix something sounds
- 90:47 like and it is scary to to a certain ex certain extent. What when was the first
- 90:53 time or if you could help us understand how did you come to perceive metaverse? Is it before that? First of all, uh
- 91:00 you’re very right. The metaverse is aim a aims to provide a seamless experience
- 91:06 in the sense that the company you work for will have a virtual office in the metaverse. So you will go to work in the
- 91:12 metaverse, not in reality. You will socialize with people. They will have their own avatars. You will have your
- 91:18 avatar and all of you will go to a bar and the bar the bar’s location will be in the metaverse.
- 91:24 Yeah. You will have sex in the metaverse. You will date in the metaverse. You will do shopping in the metaverse. You will try
- 91:30 on clothes in the metaverse. Gradually reality would become redundant and obsolete as the technology advances and progresses. And this is something which
- 91:41 will take I think a few more decades integrating with artificial intelligence and other developments. But in I could
- 91:48 conceive a future in 30 or 50 years where reality would be utterly unneeded, unnecessary and would be discarded by
- 91:55 the majority of people. and convenience of the metaverse is its to totality.
- 92:01 It’s a total immersive environment which gives you very few incentives to leave it and many incentives to stay. Now I came across the the metaverse
- 92:13 because I’m a a sci-fi writer by the way and a off a novel. I must I must add that to your
- 92:19 biography. Yeah. Don’t don’t start. It’s it’s too long. Yeah. So I I came across of course Neil Stephenson’s famous book the snow the snow chip. Snow crush. Snow crush. Yes.
- 92:30 And um he coined the the word metaverse and he he’s pretty pretty right on. I
- 92:36 mean he got it right 1992. He got it right in 1992. He started to write a book in 1988
- 92:42 in the throws of a major depression. He had clinical major depression. So the book is the rumination of ruminations
- 92:51 and thoughts of someone who is in the throws of a major debilitating depression. And so he thought the
- 92:58 metaverse is a very depressing thing. So um I haven’t read the book but are
- 93:04 you saying in that book he actually used he coined the word metaverse. That’s the first that we know of. Yes. There’s a Chinese guy there and he’s a pizza deliveryman of all things
- 93:16 and uh but in the metaverse is something else much more elevated and so on. That’s another thing by the way. In the
- 93:23 metaverse you could be anything you want and the metaverse will have a virtual economy.
- 93:29 It will have its own economy. You’ll be able to buy things and sell things and translate the sales into actual
- 93:35 currency. So you’ll have an incentive to operate economically within the metaverse. And in the meta versse, you
- 93:41 can become a multi-billionaire. You you’re a street sweeper in real life, but in the meta versse, you’re multi-billionaire. Now, we’ve had this experience before. We know exactly what’s going to happen because there was
- 93:52 um there still is a game, an immersive game called Second Life, and it was
- 93:58 named Second Life because it gave people a second life apart from their real lives. And people people became addicted. Well over a million people
- 94:10 were became so addicted to Second Life that they actually gave up on reality
- 94:16 and they played the game for 16 hours a day. Consequently, the diagnostic and
- 94:22 statistical manual committee edition 5 decided to include a new diagnosis in
- 94:29 the DSM called internet addiction. This was a result of Second Life in 2003 when
- 94:35 addiction started to be rampant. Second Life was a metaverse. You could buy things, you could sell things, you could
- 94:41 have fights, you could bully people, you could befriend people, you could socialize, you can come in and go out. I
- 94:47 mean, it was total life, second life indeed. And for many of those people, this was an escape from the reality that they couldn’t have or in the reality they couldn’t be.
- 94:58 That’s a huge risk. That’s the that’s the greatest risk of a metaverse. The metaverse can be easily designed to be
- 95:06 fantastic, to be a fantasy where essentially all the hardships and challenges of
- 95:13 reality are removed for you and only the only good things happen. This is at least the ideal. what actually is going to happen and is already happening for
- 95:24 example in in virtual chat rooms like VR chat you know what is already happening in immersive metaverse like environments and there are quite a few by the way we see that all the ills and the problems of real life are imported
- 95:42 unblock into the metaverse we have um no political uh extremism we have terrorism
- 95:49 we have Everything that we have in real life is imported into the metaverse. It’s it’s um it’s quite it’s a paradox
- 95:58 while we here and I sense I I see that a sense of um urgency to look at it as a
- 96:05 potential threat. But when I like I said coming from the technology um industry there is a lot of optimism and there’s a lot of indulgence in terms of investment and you know um branding and um the the
- 96:19 biggest players like Microsoft and Facebook and and a lot um and many more are investing heavily. So it doesn’t
- 96:26 paint the same picture if you look in the in the space where we operate in a in a professional side. What do you have to say about that? You see, corporations and and and commercial entities have
- 96:37 taken over an open platform known as www
- 96:43 and they had leveraged this platform and had abused this platform egregiously for profit.
- 96:50 This is precisely what’s going to happen with the metaverse. The metaverse should be the equivalent of the initial days of
- 96:57 the internet. The internet was designed by Berners Lee and others to be an open platform. There is even a commi a committee called W3C
- 97:08 which regulates the internet uh as an open platform. No one owns the internet. There’s no such thing. No one owns it. That’s why you can’t use the internet to punish for example people or to punish
- 97:19 even governments straying government. There are no litigations. There’s no way to the the internet is utterly I mean
- 97:25 even the even the technological specs are totally open. IP and DNS essentially
- 97:33 are distributed. You can’t control the stream. They are random. They reassemble at the end. They
- 97:39 are distributed at the beginning. So it’s out of control. It the the lack of centralized control was built baked
- 97:46 into the internet. And then companies, commercial entities came and I’m not
- 97:53 talking about hardware manufacturers. They were just producing hardware. I’m talking about software and later social media entities
- 97:59 and they had abused and are abusing the internet for for profit. This would have horrified the visionaries that had
- 98:06 created the internet. Exactly the same thing is happening now with the metaverse. Sorry to take you back like
- 98:13 you mentioned visionary um or the vision behind creating the worldwide web internet um what was that just to you
- 98:20 know for all of us to do a reality check go back into that what was the internet aimed for and where we have come
- 98:27 it’s important to understand that there is a war right now there’s a war between two competing visions of of essentially
- 98:34 the metaverse one competing vision is called web 3 and one is called the metaverse Now, web 3 is going back to the roots of the internet. Web 3 is about decentralization, handing the power back to users
- 98:51 and and the and to content creators. Now, this is supposed to be done by
- 98:57 introducing crypto assets or blockchain technology to be more precise into the
- 99:03 structure of the new iteration of the internet. So if you introduce blockchain technologies um no one can monopolize your identity, no one can fake your identity and no one
- 99:16 can collect your data. It’s a it’s an attempt to take back power from the likes of Meta platforms as well
- 99:24 Facebook. So that’s the web three. Web 3 is a grassroots populist and popular
- 99:31 movement to take back the internet from the commercial giants. The commercial
- 99:37 giants are not taking this down lying down. The metaverse is the commercial giants attempt to suppress web 3
- 99:48 and to steal to steal the technology. steal, there’s another word, to steal the technologies embedded in web 3 and incorporate them in in the commercial metaverse so as to defang the web 3. So
- 100:02 there’s a giant war, enormous war taking place right now between users and content creators, crowdsources
- 100:10 and between commercial entities. Who will win is is an open question. I would bet on commercial entities because they
- 100:16 had won in the past. I think they’re going to monopolize the the metaverse.
- 100:22 They’re going to incorporate blockchain technologies into the metaverse, but in a proprietary manner,
- 100:29 and again, they’re going to tell us how we should experience the world and limit us if we try to exit this platform. So
- 100:39 I’m terrified that they will control these commercial entities will control the metaverse because the metaverse is
- 100:45 not is not about what you experience. It’s about how you experience.
- 100:51 That’s a very substantial difference and that’s that’s a great point and we’ll come to the you know we’ll we’ll probably get a chance to talk in detail
- 100:58 more about the social impact. I I you mentioned while we’re on the commercial aspect of it um it seems like there’s a
- 101:05 lot of money at stake and um there’s a lot of uh mobilization of money that’s going so investments like I mentioned in
- 101:13 by Facebook Google and Microsoft do you see them as being one major corporate
- 101:20 collaborating together or do you see there would be a clash um of of um markets or um all previous media starting with with telegraph and radio
- 101:31 and and continuing into the internet. All previous media start with competition and then the big players settle on a set of standards. Yeah. And then they they adhere to these
- 101:43 standards. But the metaverse is different. If Google will have its own metavis and Microsoft will have its own meta versse and Facebook or meta platforms will have it their own metaverse, the metaverse will will fail
- 101:54 and die because you need to move seamlessly between Apple, Google. So they will be
- 102:00 forced to collaborate. That is even more terrifying than the current state of
- 102:06 things because it means that there will be consortium of commercial giants who will collaborate in as cartels do or trusts do almost illegally I would say
- 102:18 to provide a critical service because the metaverse is going to eliminate the internet. Let it be clear the internet
- 102:24 is dying. Once the metaverse comes online, the internet will vanish and we will remember it as a kind nostalgically
- 102:31 as something you know a stage. The end result is a situation where we move
- 102:39 uh we flow between this brick and motor wood and simulated wood and then back to
- 102:47 real wood and then back to simulated wood. and controlling this traffic lane
- 102:53 will be a group of behemoths, a group of giant companies and they will tell you
- 102:59 um how to experience the world. It’s almost back to the plot of Matrix
- 103:05 or an episode of Black Mirror. I don’t know how familiar are you with the famous Netflix series. So
- 103:11 um you talked about u crypto black blockchain. Uh let’s um um would like to
- 103:17 understand a bit on how the digital currency will evolve in metaverse. They’re called crypto assets. Yeah. The the two big ones are Bitcoin and Ether. Ethereum. Ethereum. That’s Thank you. So how will
- 103:27 how are there are crypto assets? Yes. There is a misperception that crypto assets are investment vehicles. They’re
- 103:34 not about investment. They’re not about money. Crypto assets include cryptocurrencies but many other crypto
- 103:40 things. Crypto assets are concerned with one thing only identity verification.
- 103:47 Now the minute you verify identity it has a monetary value. So for example if I create a digital piece of art and I’m able to verify that it is my piece of
- 104:00 art that I had created. In other words I’m able to verify my identity that minute it gives this piece of art value because it renders it an original. This is NFT nonf fungeible tokens. So, so it’s about same with Bitcoin, same
- 104:16 with all the blockchain technologies. There’s a plethora by now blockchain for example in in commercial in in
- 104:25 container container industry. They’re using now black blockchain to verify containers and so on. And it it it
- 104:32 meshes with the internet of things. Internet of things where each and every object in our daily life will have an
- 104:39 internet signature or a signature. Yes. And the best way to ascertain that this is indeed your smartphone is using blockchain technology. So it’s identity verification mechanism. But of course identity has value, authenticity has value. People pay 10 a million times
- 104:58 more for a verified Vangok than for a replica. So and this is it. Now money,
- 105:05 if you step back a minute, what is money? Money
- 105:11 is um a store of value as embodied or
- 105:18 raified by work. Money is a work unit. But the work, my work is not equal to
- 105:24 your work is not equal to his work. So what bitcoin does it verifies my work in a process called
- 105:32 mining or staking or there are minting or there various ways of creating bitcoin. So it verifies the work
- 105:39 invested in in in in uh the case of bitcoin the computational power invested
- 105:45 to solve a riddle to solve an enigma puzzle. Bitcoin is about work. It’s about
- 105:52 verifying the identity of a work done. So if this is the case, then it would
- 105:59 behoove the metaverse, even the commercial metaverse, to use
- 106:05 these currencies inside the metaverse because they are prohibited from
- 106:11 creating real money. Central banks have a monopoly on this, but they do need a means of exchange. And most crucially,
- 106:19 they need a way to verify who is the user. So identity verification, blockchain technology is perfect for this. Well, which frightens me a lot because I
- 106:29 think what’s going to happen, the Microsofts of the world and the Facebooks of the world, they’re going to steal blockchain technology and make it
- 106:35 proprietary and protect it with patents and destroy the whole infrastructure of blockchain. And and this is this is so
- 106:43 confusing for me because I remember two years or even three years when crypto became popular and people started to invest in crypto and you know blockchain uh sorry the concept of blockchain and bitcoin as one of the currencies became popular. There was um a theme across the
- 107:02 general public. Um this is not regulated. This is not secured. Um, oh, it’s it’s it’s not um it’s just a buzz, but it’ll fizzle out. Fast forward two years now, I I read news where American Express and and the top banks like I
- 107:18 think um HSBC or JP Morgan, they’re all investing or moving into metaverse.
- 107:24 They’re better. It’s better, you know. I’m I’m confus. So, h how do you see? I mean I’m confused to
- 107:30 interpret that like now that you explained me I to some extent I there is no sector there is no sector
- 107:36 better suited for blockchain than banking than banking of course you have to verify user identity you have to verify
- 107:42 the transfers you have to blockchain blockchain can revolutionize and will revolutionize banking
- 107:48 completely remember again blockchain technologies is not about money it’s not about assets it’s not about any of these things it’s about identity it verifies
- 107:59 Your identity of course your identity is linked to your product or to your production process. So inevitably it
- 108:06 spills over into the value of your product or the but the crucial element is that there is a ledger there is a ledger spread over millions of computers copies
- 108:17 there are copies over millions of computers. So the minute you perform a transaction of any kind, all these
- 108:24 millions of cop identical copies, clone copies of the ledger are updated. No one can falsify this. Well, except with quantum computing in the very far future, but right now no way to falsify
- 108:36 this. Now there is no system that comes remotely close to this authenticity.
- 108:44 Even swift is easily falsifiable. Swift is the interbank um wire transfer system. It’s easily falsifiable. Easily. I mean so easily that had people known they
- 108:55 would take the money out of the banks immediately. It’s it’s a really badly designed totally disastrous system. ATMs
- 109:03 are even worse. So blockchain is a solution for international commerce, for
- 109:09 banking, for this is why the big commercial companies will user it,
- 109:15 hijack it and make it proprietary and destroy these grassroots endeavors to
- 109:22 you know provide alternatives and and do you think the inflation and the dying concept of money in general
- 109:28 led to this sudden rush for the financial of the financial organization?
- 109:34 First first of all, first of all, let’s be clear. The the concept of cryptocurrency is far from new. Second Life, remember I mentioned Second Life. They had their own currency. It was called the Lynden dollar. So in inside Second Life, you could pay with Lynden
- 109:49 dollars. And you could even convert Lyndon dollars into US dollars. So people were were using Lynon dollars to
- 109:57 buy real estate, to buy clothes, to buy inside Second Life virtual assets. The
- 110:03 virtual the virtual economy is a thriving enormous business. Now why would people pay tens of thousands of
- 110:11 dollars for a virtual good that essentially is reproducible, easily
- 110:17 replicable? Uh difficult to to ascertain as to authenticity except if you use
- 110:23 blockchain. Why would anyone pay for for something you can’t take home and and put in the in the living room? You know,
- 110:30 because um they realize people realize that the future is virtual
- 110:37 reality as we had known known it hither to is dying together with the internet.
- 110:44 Shortly you will be spending much longer periods of your life online inside the
- 110:50 metaverse in a virtual office than here with me here. I mean this will be utterly old-fashioned and retro retro you know we might I I also found that there is a
- 111:03 concept like a digital real estate. I mean Barbados just applied to have an embassy. I don’t know
- 111:10 if it’s still in the digital real estate. Absolutely. Everything. And um personally, I just started investing in
- 111:16 real estate two years ago when COVID hit, you know, and now I’m thinking may maybe I I made a wrong decision. Maybe
- 111:23 real estate investment in a digital landscape is is going to be the new thing. But virtual assets, digital assets, what we call digital twins, which are worlds constructed of digital assets
- 111:35 exclusively, they’re going to be a lot more valuable in 20, 30 years than any physical
- 111:42 entity, right? Anything brick and mortar and wood. So, of course, people are investing in them. Of course there’s you know so for the passive investors like me like us who are not actively into the
- 111:53 stock market are you suggesting according to you is it’s a good opportunity to invest in crypto in
- 112:00 metavors I don’t think so and I’ll try to explain why people are investing in these virtual assets because they are reading
- 112:07 the cards correctly yes virtual worlds are going to be much more valuable than real ones but I don’t think individuals can play this game because the big companies by individuals. You mean like us and me?
- 112:19 Okay. Manual. Maybe pension funds can play this game, but we cannot play this game because the biggies will not let us. The
- 112:27 biggies are intent and that includes governments by the way. They are intent to destroy this popular movement.
- 112:34 Intent. Absolutely. Because they cannot control or regulate. China criminalized um cryptocurrencies.
- 112:41 China Russia had created its own cryptocurrency and it’s the only legal cryptocurrency in Russia. Saudi Sweden
- 112:48 it’s it’s spreading governments and commercial entities are trying to hijack
- 112:54 these technologies and so individual who invest in these technologies and in virtual assets will find to their
- 113:01 detriment in 10 years or 20 years that governments and commercial entities have
- 113:07 rendered their investments null and void unless you give a huge portion to these
- 113:14 commercial entities and governments. You want to trade what you had bought 20 years ago, you have to go through me as a platform and you have to pay me 70%. We have such a case already. It’s called Amazon. If you publish a book, you have to give to Amazon 55% of the
- 113:30 value of the book of the cost of the price 55%. The author, the publisher, they get 45%. Amazon by virtue of being a platform,
- 113:41 nothing else is getting 55%. So today you Divia you buy you buy um real estate
- 113:49 virtual real estate no problem and it appreciates and you think you’re a great genius and then you try to sell your
- 113:56 real estate and there will be only one place to sell it the combined metaverse of all these giants and they will tell
- 114:03 you you want to sell it okay our commission is 80%. That’s it. You know, very interesting. That’s it. And I’m telling you that this has happened already with books and with
- 114:14 DVDs and so on in on Amazon. Amazon did exactly this. It created a marketplace which is essentially a metaverse. It
- 114:22 created a marketplace. Then many many publishers and book sellers and so on came there and then they said okay you want to use
- 114:28 a platform. It’s a minor commission of 55%. Thank you for sharing this perspective.
- 114:34 Take it to live. Take it to live it. Like you don’t want to. I’m not forcing you to sell through Amazon. When you complain, they say I’m not we’re not forcing you. You can sell anywhere else. Is there anywhere else? No, there isn’t. If
- 114:46 you’re a publisher or a book seller, there’s only one marketplace left. Amazon. Sales of books worldwide
- 114:55 are 81% through Amazon. It’s a monopoly. It’s a cartel. It’s a trust. Does anyone
- 115:01 dare to take on Amazon? any politician they would be they would be eradicated. No one dares to take on um you know these giants. There’s a lot of talk in Congress and so on but everyone is
- 115:12 terrified because if you’re a politician and you dare to take on Facebook suddenly you will find that your
- 115:18 speeches and so on never are never recommended. They don’t make it to the news feed. They they they have the
- 115:24 ability to render opponents, adversaries and critics invisible. a process known
- 115:30 as shadow banning in on YouTube and on Facebook. So they are very
- 115:36 aggressive in eliminating dissent and opposition. Absolutely. They’re authoritarian these authoritarian
- 115:42 structures. Do you um do you see any positive aspect or constructive or um um
- 115:52 a progressive aspect to metaverse in any field of you know the humanity or
- 115:58 to answer that we need to look back uh for example when the internet just started um there was a lot of optimism people said it’s a wonderful thing it’s distributed
- 116:09 no one controls it freedom of speech activism, political and other activism
- 116:15 and so on. Same when social media started, there’s always a burst of optimism based
- 116:21 on the assumption that no one is in control, that it’s a decentralized process.
- 116:27 But when it is centralized and commercialized,
- 116:33 these technological developments are egregiously abused. And that’s not me.
- 116:39 That’s numerous investigations of Facebook by Congress for example and Twitter.
- 116:46 There is a tendency power corrupts. Power corrupts and these platforms reward inherently
- 116:54 and and structurally reward hate speech, provocative speech,
- 117:01 um trolling, flaming aggression, hatred, envy. And this is
- 117:09 baked into the Facebook algorithm. What is a like? Why? And we see the
- 117:15 consequences. There are studies by for example Twench and Campbell, many others
- 117:21 that had demonstrated demonstrated utterly conclusively beyond any beginning of doubt that social media
- 117:27 usage uh increases dramatically the rates of
- 117:33 depression and anxiety disorders among youngsters and among people about the age of 65. Suicide rates have uh skyrocketed among
- 117:45 young younger users of social media. That’s why Facebook had to suspend Instagram kids because its own research
- 117:53 had demonstrated that it would drive many teenage many teenagers to suicide. Instagram kids was meant to be used by people aged 13 and younger. I never even heard of Facebook. Yes, but there were studies by Facebook leaked luckily by whistleblowers that
- 118:10 had shown that it would have a detrimental effect on the mental health of the users to the point of suicide.
- 118:16 Now, we don’t know exactly why, but we know that screen usage has something to do with it. I think the detachment from
- 118:23 reality has something to do with it. I think we underestimate face-to-face interaction. We know for example if I
- 118:30 revert to biology for a minute we know that when two people meet each other they emit a molecule each one emits a molecule and this molecule that’s a fact
- 118:41 by the way this molecule contains a little over 100 pieces of information
- 118:48 about the genetics of the person the imunological system of the person and other
- 118:54 parameters that’s face to face any We know for example that when men
- 119:01 uh come across a flesh and blood woman of any age 90 years old their
- 119:09 testosterone shoots up 40%. We know these are facts just by mere in the present or passing
- 119:16 passing okay just by passing. Interesting. And there’s a woman there and she’s 90 years old with a walker you know
- 119:22 and the testosterone shoots up 40%. We underestimate face-to-face interactions, right? And so teenagers commit suicide. The rates of depression went up 300% among
- 119:37 social media users and the rates of anxiety disorders went up 500%. And that’s before the pandemic.
- 119:44 Now, one last thing. The metaverse is now a certainty because of the pandemic. It had not been a certainty before the pandemic, but now it’s a certainty. Why?
- 119:56 People were zooified. They got used to zoom. The zoom is a foretaste of the
- 120:02 metaverse. So now everyone is conditioned to to use the metaverse to consume the metaverse.
- 120:08 Uh I never use zoom in my life until the pandemic. I’m 61 years old. I was a tech
- 120:14 tech high-tech analyst and so on. And I never use zoom because I prefer much prefer face tof face meetings. I never
- 120:21 ever once used Zoom or WebEx or any of these services. But then the pandemic has struck and I’ve used Zoom since then
- 120:28 hundreds of times. I had no choice. I taught classes using Zoom. I interacted with people using Zoom and so on so
- 120:35 forth. By now I feel utterly comfortable using Zoom and that is the window into the metaverse. I guess what I’m trying to understand is metaverse is here and and like you mentioned corporates are
- 120:47 going to um expand this but people like you and me you know who want to live in the real
- 120:55 life who do not want to trans transition into metaverse who want to have a
- 121:01 parallel life in the future we we would be considered freaks distasteful freaks
- 121:07 you and me talking having a conversation talking having sex. It would be distasteful distasteful activities
- 121:14 conducted by fringe groups and freaks and so on. I know it sounds crazy but
- 121:20 that’s precisely the way it’s going to be. As today people frown on someone who doesn’t use social media. If you don’t
- 121:26 use social media there’s enormous peer pressure on you to use it because it has become the preferable the preferred way of communicating. In the future when the metaverse is all pervasive and it will
- 121:37 be all pervasive there will be a lot of pressure on you to conform and if you insist on
- 121:43 face-to-face meeting you meetings will be considered a throwback or a freak or something’s wrong with you. How how will
- 121:49 the family life evolve or the social life not talking in context to a male and a woman interaction but the general
- 121:56 you know community neighborhood you know eating having dinner together um what
- 122:02 can what what is according to you some solutions to it you know if we can it’s a process known as automization where people are rendered self-sufficient by technology and then
- 122:14 they lose all incentive to accommodate other people to compromise to negotiate to because being with other people is honorous. Other people are ordinary.
- 122:25 They’re they are opinionated. They are pain in certain nether regions of the
- 122:31 body and so on. It’s a lot of effort to be with other people. And then if you if you’re self-sufficient in the truest
- 122:37 sense of the word, in fullest sense of why would you why would you? It’s a disincentive. So automization had taken over. 2016 was
- 122:45 the first year when majority of of women and and men did not have any contact with the opposite sex in the United States and people spend the bulk of their lives
- 122:58 now in residential self-contained residential units not having any contact
- 123:04 with other human beings. That is a fact. By the way, 31% of people are lifelong singles. Another 15% are in between pseudo
- 123:15 relationships. About half the adult population gave up on relationships altogether and had decided to live a single life.
- 123:22 Um cat ladies all kinds of so atomization
- 123:29 is has been habituated. It’s a habit now. People don’t feel the need to and you see for example the huge protests
- 123:36 against return to office RTO return to work. Mhm. after the pandemic when companies
- 123:43 announce okay you got to come back to the office they’re huge protests people saying no way we want hybrid work or we
- 123:50 want you know and why is that according because they don’t want to be with other people it’s a waste of time it’s annoying they have to you know commun
- 123:57 do you think it’s a phase and we’ll get over it and at the core of human existence we crave for interaction and
- 124:05 emotional connection no I don’t think so at all I think self-sufficiency is alluring.
- 124:13 It is grandiose uh and it is dopamineergic. In other words, it provides you with a dopamine
- 124:19 rush. It reduces anxiety. If you’re self-sufficient, your anxiety level is
- 124:25 lower. Of course, it might be depressive, but there antidotes to this like Netflix. I think all in all, given the choice, people most people would prefer
- 124:36 to be alone most of the time and if possible all the time. given the choice. Indeed, we see a drop
- 124:43 of 30% in sex. Sex is a major major barometer. We see a drop of 30% in the
- 124:51 sexual activities of people under age 35. They have fewer sexual partners than my generation, the age of the dinosaurs,
- 124:59 and they have a lot less sex than my generation. Contrary to the hype of hookups and so on, actually sex is
- 125:06 becoming obsolete. In at least two countries where we have massive documentation and studies, people under
- 125:13 age 35 are actually not having sex at all, like Japan and the United Kingdom.
- 125:19 Sex sex is supposedly that thing that you cannot resist in the presence of another
- 125:26 person. And yet people give up on it. They give up on it. Even that is not worth it. When the metaverse comes and
- 125:33 you have a haptic suit and haptic gloves and the right goggles, you will you will
- 125:39 date and you will have sex with the most gorgeous intimate partners. Why would you seek anything else? We have a harbinger. We already witnessing a harbinger of this. It’s called pornography.
- 125:50 People who consume pornography are dramatically less likely to seek sex partners. Pornography utterly satisfies their needs. Although admittedly this is more
- 126:01 more among men than among women but you know women need men to sex heterosexual at least. So I’m mentioning sex as a as a barometer as an indicator but many other things for example um family reunions or
- 126:14 or meetings in 1980 people had people were asked Mhm. Um if you are in a calamity or in a
- 126:23 disaster, how many how many close personal friends do you have that you can approach and ask for help? The
- 126:30 number then was 10. That’s 1980. 40 years later, the same question. The
- 126:37 number was one. In 1980, people had 10 close friends.
- 126:43 Today, they have one family. The nuclear family had been hollowed out completely. All the functions of the nuclear family, the urswwell functions in the 19th
- 126:54 century, education, healthcare, they’re provided by the state. There’s no need for the family. It’s utterly redundant
- 127:00 and obsolete. Indeed, when children grow up, they are rarely in touch with their
- 127:06 parents. The frequency of contact with with parents dropped 73%
- 127:13 between 1990 and today. The rate of marriage dropped 51%.
- 127:19 The rate of childbearing um had collapsed utterly even in an
- 127:25 immigrant country country like the United States. No industrial country meets the replacement rate. In other words, in all industrial countries, the population is diminishing because people
- 127:37 are not making enough children to replace the D. It seems that we have almost unconsciously
- 127:43 unknowingly been prepared set for living in metaverse which is
- 127:50 which is um interesting to observe. Uh but I like I said the optimist in me.
- 127:57 One last question about um how we could self-regulate or how the government actually could you know you mentioned
- 128:03 China, Sweden and um Russia taking some um uh among many
- 128:09 among many um measures to control and not let the corporates capitalize and
- 128:15 monetize and and dominate the the world. Do you think the societies typically eastern societies and I might be wrong but India, China or Russia probably or
- 128:26 you know traditional societies have also um a need to control from a social um
- 128:34 social cultural perspective and is that a good thing and if that’s so do you think we should continue we should force
- 128:40 oursel to get out there meet people go and meet your family more um do not hesitate to interact with friends
- 128:48 uh first of all just to correct something countries like China, Sweden, Venezuela, and and Russia, many others.
- 128:54 Um, what they’re trying to do, they’re trying to hijack uh blockchain technologies and especially
- 129:00 cryptocurrencies. They’re not doing it altruistically. They want to control it. So, there’s a there’s a sort of a competition between authoritarian government most government. There’s no goodwill motive or a or a
- 129:11 humanitarian motive behind this. They want to restore the central bank fiat money monopoly.
- 129:17 Mhm. So they they’re kind of making cryptocurrency a national currency in effect. Indeed, China is about to move
- 129:24 into totally digital currency. There will not be notes or coins or anything. Everything will be digital. It’s called the digital yuan project soon in two or three years. So no, there’s no
- 129:35 benevolence there. It’s simply governments competing with commercial entities. Who will own the who will own the now more as to your as to your question when it comes to the metaverse
- 129:46 the only hope is to is to establish open standards
- 129:52 the minute they open standards this enables competition if if the metaverse is accessible to me as a as a twoperson company because the
- 130:03 standards are there and they’re ready made and I can just copy paste them then I can create my own metavis
- 130:10 And you can create your own metaverse. And then if many people, millions of small companies, small corporations
- 130:16 create metaverses, the fragmentation of the market will be such that the giants will find it
- 130:23 difficult to monopolize or dominate. If they are forced to integrate seamlessly
- 130:29 with anyone who creates a metaverse, so they don’t shadowban me, my metaverse. I
- 130:35 create a metaverse. Google can tell me not in our backyard that’s your metaverse we are not integrating with
- 130:41 you so without Google and Apple and Microsoft then my metaverse is useless but if there are open standards and and
- 130:49 every metaverse must be integrated with every metaverse it’s by law then that could create competition which
- 130:56 will neutralize this problem more to more to the other point you’ve
- 131:02 raised it takes legislative will to reverse.
- 131:08 It’s possible to reverse. Yes, absolutely it’s possible to reverse. But it takes legislative will which I I
- 131:14 think lawmakers are terrified of the power of of the of these companies. Simply terrified of these companies own also old media. For example, Amazon
- 131:26 Amazon’s Bezos owns the Washington Post. It’s not only it’s these are you know so they’re afraid simply lawmakers are afraid they could be rendered invisible and lose the next election and so on so
- 131:38 but if by some quirk and mystery of history they will unite and so on of course there are ways to reverse I I can right now I can spew out 200 measures
- 131:49 for example I would limit the time you can be on social media or in the metaverse there will be a clock on your
- 131:55 computer and when three hours have elapsed you will forcibly logged off.
- 132:01 End of story. No appeal process, nothing. And you will not be able to falsify your identity as another user
- 132:08 because you have blockchain identity. So that’s one thing. Second thing, you could not be friends on Facebook with
- 132:14 someone you have never met in real life. You want to be friends, you have to produce proof that you had met in real life. A photograph in a bar.
- 132:20 I like that one example. I think we should start applying that. Yeah. I mean and and this is these are two of of hundred literally hundreds of measures. Two of hundreds of measures.
- 132:31 I would also ban the use of what we call relative positioning devices. Relative
- 132:37 positioning is a a term in psychology. Mhm. Well, it’s a fancy way of saying um
- 132:43 competition for image and superiority. So like I have more likes than you. You have more more followers than me. This
- 132:50 competitive I would ban this. For example, I would not allow likes on Facebook or anywhere. No likes,
- 132:58 real life interactions, of course, comments and but I would not allow the these quantitative measures which pit
- 133:05 you me against you, pit me against you which render comparison penicious and
- 133:11 drive teenagers to suicide and it creates like yes tremendous amount of anxiety if you constantly and
- 133:17 and I think it’s very easy to get addicted to being lied. is meant to to be addictive and conditioning.
- 133:23 Absolutely. Yes. It was intentionally built this way. Twitter Twitter for example had claimed
- 133:31 that the reason they limited themselves to 140 characters was because the SMS limit in small in feature phones
- 133:38 was limited to 140 characters. Okay. But then this limit on SMSS was
- 133:45 removed not long after Twitter had been established. Why didn’t they remove the restriction? Well, there’s a secret
- 133:53 motive here. If I limit your speech, you are far more likely to be aggressive. It’s a fact. If
- 134:00 I limit your can say only three words, these three words are likely to be a hell of a lot more aggressive than if I
- 134:06 let you, you know, express yourself freely. There are these are bad actors and they
- 134:12 need to be regulated stringently and so on. But no one has the no one
- 134:18 does. So to summarize how we can control or at
- 134:24 least not saying reverse the metaverse but bring to a level of acceptance and a
- 134:31 balance where the real life does not get threatened or I would ban all all transition vectors
- 134:38 from the metaverse to the real world. You make money on the metaverse you cannot convert it to US dollars. You buy
- 134:45 anything on the metaverse, you cannot sell it. I would ban I would block access of the metaverse to reality. I
- 134:52 would delineate the two realms. There would be a strict device. Yes. And you cannot transition from the
- 134:58 metaverse to reality and back. That’s the first thing I would do. I would definitely limit the time you can spend
- 135:04 in the metaverse. And there it’s not a problem to verify your identity. You can open 19 accounts. As long as the black
- 135:10 chain black blockchain thing is in operation, I’ll trace you down. I will limit you to three hours a day and
- 135:17 that’s a lot. Maybe one and that’s it. That’s the maximum you can do. I would
- 135:23 also have three strikes exactly like YouTube. You bully someone once, twice, three by you’re banned for life. You’re
- 135:30 never able to access access the metaverse. sexual abuse, harassment, racism, and so on and so forth, which is
- 135:38 which is now starting with the likes of YouTube and Facebook 15 years after they had been established. Why? Because
- 135:45 racism is good for business. Hate speech is excellent for business, right? So they let it happen. Terrorism videos,
- 135:52 ISIS videos were common on YouTube until two years ago, right? you know
- 135:58 any emotional tools as a human. You know we talked about how the government could take controls or how we could have a technical solution by limiting the putting a clock but what are some of the
- 136:11 psych psychological tools like empathy or uh talking so what what is that we could do to keep us like you say in in
- 136:19 reality check? Uh one comment before I before I try to answer your question. Um
- 136:27 only two constituencies can affect change in the metaverse via grassroots
- 136:33 activism. Parents who are concerned for the future and the welfare of their children and
- 136:40 women because the greatest users the biggest users of metaverse like technologies are
- 136:49 hitherto men. Men are likely to be the drivers of this technology. Women should oppose them tooth, nail and claw. That should that is a legitimate gender war.
- 137:00 Absolutely. Women are the guardians and custodians of the welfare of the next generations. Men and it is men. Hi-tech
- 137:09 is men. There are almost no women there. So women should fight back there as
- 137:16 parents, as mothers. So, it’s the only way to affect change. And I I think as parents, like you you
- 137:22 made a good point, as parents, we we can control the the future by embibing the
- 137:29 right values and the right information through our No, I mean I mean I’m a lot more belligerent. I think women should
- 137:36 organize activism, social activism should organize and create a grassroots
- 137:42 movement to push legislator legislators to break down these companies as they
- 137:48 had tried to do with Microsoft to break down these companies to pieces competing pieces and then to absolutely
- 137:56 limit what can be done with the technology as we limit today for example gene therapy as we limit today bio bioengineering we do limit many
- 138:07 technological advances. Absolutely. Some things are illegal to do today. You can’t change the sex of
- 138:13 your child. You can. There’s a technology but it’s illegal to do it. That there is a
- 138:19 technology doesn’t mean you have to use it. It could be criminalized and big
- 138:25 parts of the metaverse should be criminalized. Absolutely. So only women can push for that. like me too, like a me too kind of movement, you know. So, I’m not talking about embibing the right
- 138:36 values and so on so forth, which believe me is a flimsy defense. It’s a flimsy defense. I’m talking about going to the
- 138:43 going on the streets and fighting the men who are creating the metaverse. The
- 138:49 two risks with the the three risks with the metaverse is one the blurring of reality with simulation. Mhm. So the inability to tell reality apart from simulation which could lead to bad
- 139:01 decisions and bad choices and so on. Uh second is addiction. That’s a serious
- 139:08 risk. And the third is depression and anxiety. We have massive studies supporting all
- 139:14 these three outcomes. Impaired reality testing, losing touch with reality, depression, anxiety and
- 139:20 addiction. These again can be easily tackled. Addiction can be prevented by limiting
- 139:27 the time. Anxiety and depression can be tackled by limiting relative positioning likes so on. And um uh blurring of realities, you know, simulated reality
- 139:39 or extens extended reality and reality, it can be easily solved by not allowing
- 139:45 extended reality to extend to reality to reality. So this there are five easy
- 139:51 steps that would prevent all these mental illnesses but it takes political will. That’s why I mentioned that
- 139:57 parents and women should should push for that. So that’s a great message and definitely
- 140:03 I’m sure I have taken a note of it and my audience would uh but yes like I said there’s one little question I was just
- 140:09 curious you mentioned and I know the global the climate impact is there any impact should we be worried from that? I don’t know if you know that the computer industry creates more greenhouse gases than the air air travel industry.
- 140:22 I don’t know if you know that a single laptop which is a standby for 24 hours
- 140:29 requires anywhere between 100 to 500 trees to remove the carbon footprint of
- 140:36 that single laptop. I don’t know if you know that mining for cryptocurrencies had generated more greenhouse gases than
- 140:47 the emissions from cars in the 20 biggest cities in the world just mining
- 140:53 for cryptocurrencies. Should the metaverse because here’s something about the metaverse. For the metaverse to come to become a reality, we still have 10 years of technological progress. Without it, there will be no metaverse.
- 141:10 What are we talking about? We’re talking about 1,000 times more computing than
- 141:16 today. Mhm. 1,000 times more greenhouse gases. One times 1,000 times bigger effect on climate change.
- 141:28 Computing is already the number three or four depending how you define biggest emitter. And therefore computing shapes climate change adversely.
- 141:41 The metaverse will blow this out of the water. The metaverse alone will create
- 141:47 more greenhouse gases than all the cars combined.
- 141:53 People don’t take this into account. You know, a computer on standby consumes
- 142:00 a laptop on standby consumes about 160 US in terms of energy a year. Multiply.
- 142:08 See what we’re talking about? Most of this energy comes from coal in China for example, in Australia for example. So
- 142:16 this is coal powered. Computing is a coal powered technology, right? Metavverse will multiply this by
- 142:24 1,000. That’s not me. That’s the vice president of Intel.
- 142:30 That’s not something that’s his calculation. Mhm. So this is the impact of on climate change. But there are other impacts on labor on on many I mean metavas will labor as you mean the work uh policies. Yes. Um uh if you work in a in a totally virtual
- 142:46 environment, it raises it raises uh interesting issues, very interesting issues. For
- 142:52 example, wage equality, bullying in the workplace, mental health issues of workers will increase dramatically. Who is who will take care of them? And so the workplace will be reshaped.
- 143:05 Climate change will be then irrevers rendered irreversible. Metaverse alone will render climate change irreversible
- 143:12 alone. just that and um and social issues sexual abuse
- 143:18 for example and rape virtual how do we deal and so on so it’s you know it’s a transformative it’s
- 143:24 a revolutionary technology so parents women climate change catalyst
- 143:32 and all of us we all must watch out get ourselves more informed educated about
- 143:40 metavors because it’s coming and And I think through this knowledge we would have more clarity and through clarity
- 143:46 we’ll have power. So we could go drive those movements or steps to mitigate the risk of metaverse. But thank you so much
- 143:53 Sam. This was very insightful and like I said at the beginning of my conversation after listening to you every time I feel I have become a little more wiser a
- 144:04 little more aware. Thank you. Thank you for watching. If you enjoyed our conversation and this video brought you value, please hit the like button and subscribe if you haven’t. Until next
- 144:16 time, Valentine’s Day is approaching and so
- 144:23 inevitably my next interview, the one you’re about to watch, is with Valentina, Valentina Pleti. Um, it’s a fascinating interview. Uh, to
- 144:37 my mind at least. It contains insights and ideas and opinions that you may find
- 144:44 difficult to digest, let alone accept. But is it not the essence of a good
- 144:50 dialogue and we were constrained in time. We made
- 144:56 it a one-hour thing. Consequently, I had to omit a few very critical points and I
- 145:04 hope to have the opportunity to talk to Valentina Pleti again in future and to
- 145:11 tackle these issues as well as others. The the interview you’re about to watch
- 145:17 focuses on modern technologies and how they mold us and shape us and reshape us
- 145:24 and make us into something unrecognizable even to ourselves. But a few points are missing and I would like to recap them very fast very
- 145:37 briefly. Number one, the commodification of other people. There is a consumption
- 145:44 model. We consume everything. We consume food. We consume entertainment. We consume all kinds of electronic devices.
- 145:51 We consume internet and other utilities. And we consume other people. Other people we objectify other people. We um analyze what’s in it for us and then we
- 146:04 focus on what other people can give us and by doing so we reduce them to
- 146:10 service provider. One major example of this is dating apps. Dating apps are actually a
- 146:18 crowdsourcing of potential partners and the outsourcing of mate selection. I call it
- 146:27 algorithmic mate selection. Now I can uh there’s a lot to say about
- 146:33 this. The crowd sourcing of potential partners simply means that rather than
- 146:39 go one by one in depth when we come across other people, we swipe left and
- 146:47 we actually interact with them as if they were items in an inventory
- 146:53 but items in a faceless crowd in a mob and hence crowd sourcing application
- 147:01 sources this crowd for you. Similarly, the mate selection process which is an
- 147:07 extremely intricate dance has been outsourced to the application. The app
- 147:14 selects the mates for you. Um ultimately you are faced with the decisions and the
- 147:21 choices and the selections made by a computer app not by you.
- 147:27 There is an illusion of choice at the very end of the process. But the space
- 147:34 of potentialities and the space of possibilities is limited by the algorithm of the. So this is the first
- 147:41 thing. The second thing I neglected to mention in the interview owing to time constraints is that artificial intelligence provides us with
- 147:52 single synthesized answers. When you when you Google, when you use a
- 147:59 search engine, you get multiple options and you have to wade through these
- 148:05 options. You have to study, you have to browse, you have to go deep, you have to conduct additional research, refine your
- 148:12 search and so on. It’s an active interactive proactive process. Whereas when you
- 148:18 interact with artificial intelligence, you get the end product. You have no further contribution to the process. You
- 148:26 can refine your query of course, but you would still be within the confines of
- 148:32 the large language model and the algorithm of the artificial intelligence chatbot that you’re using. And because artificial intelligence monopolizes the
- 148:44 answers, synthesizes them and homogenizes them,
- 148:50 this disincentivizes research and critical thinking. It
- 148:56 encourages intellectual laziness. Okay. The next thing is that um in the
- 149:03 past let’s say 30 years maybe 20 years
- 149:09 there’s been an emerging preference for information over knowledge.
- 149:15 Information is the raw unprocessed nonstructured data. Knowledge is a synoptic view of these data, connecting them to other data in
- 149:27 ways which yield meaning and structure and order and allow us to make
- 149:33 falsifiable predictions. In other words, knowledge is a set of
- 149:39 theories. Theories about ourselves, about other people, about relationships, about the world, physical and otherwise,
- 149:45 and so on. This is knowledge. Whereas information is just data. It it yields
- 149:52 no meaning. If you were to to try to extract meaning from information, you would need to convert it into knowledge. Knowledge in your own mind. Hence hence the
- 150:04 phenomenon of conspiracy theories. Yeah. So only very few people are qualified and
- 150:13 skilled and taught how to generate knowledge. So the outcome is that when
- 150:20 laymen or people who are not qualified confront the avalanche of the tsunami of information online mainly they end up
- 150:28 creating nonsensical or conspiratorial or frankly insane uh theories. They end
- 150:38 up creating pseudo knowledge which is cultifactual and very often demented.
- 150:47 Finally, an insight that I again didn’t have time to to mention in the conversation is that
- 150:54 there are only three ways to interpret the world. Only three hermeneutic
- 151:00 pathways, explanatory, interpretative pathways. I’m sorry. One is psychosis, one is narcissism, and the other is nothingness.
- 151:11 The psychosis um is when we generate mentally
- 151:19 something an artifact, a concept, a construct, an idea and so on and then we attribute it to attribute to this
- 151:26 epistemic creature ontological status. So we conceive or conjure up a god and
- 151:34 then we say god exists. It has existence. That’s psychotic. It’s completely psychotic. Religion is
- 151:40 psychotic. So psychosis was the way humanity
- 151:46 has had coped with reality and with the world and with the universe and with
- 151:52 with them with with the mysteries of life and with the meaning of life and so on. Psychosis was the natural reaction.
- 151:58 It was also known as religion. And then came the age of narcissism, the age we live in right now. And it is an age that places emphasis on the individual as the source of all certainty and knowledge. We look inwards
- 152:16 in order to make sense of the outward. We we look internally. We revert. We
- 152:22 refer internally in order to make sense of the external.
- 152:28 And finally, nothingness is authenticity. What s called authenticity. I have a whole channel
- 152:35 dedicated to to nothingness and I have a nothingness playlist on my main YouTube channel. These are the three choices we face when we try to make sense of existence and imbue reality with any
- 152:48 meaning. Nothingness, authenticity is not about being a nobody or or doing
- 152:54 nothing or destroying the world. It’s not nihilism. It is about choosing to be
- 153:00 human, not a lobster. It is about putting firm boundaries between you and the world and emerging and becoming within these boundaries
- 153:12 which provide you with a modicum of safety. And now onward Christian and of
- 153:19 course Jewish soldiers to the interview with Valentina Pleti a few days before
- 153:26 Valentine’s Day
- 153:32 okay okay so here here’s where you welcome me and allow me to introduce myself.
- 153:44 Okay. Um hello uh Dr. Sanbakny I am very honored to have you here today. Thank
- 153:50 you so much for your presence. I am very happy for this interview with you as I
- 153:56 you are the expert on narcissism and I would like to ask you some very relevant questions on the topic today. So why don’t we start with an interview uh
- 154:07 brief introduction about you and your career and your expertise uh so then we
- 154:13 can proceed with the questions. Yes, thank you for for having me. Um, so my name is Sandaki and I am the author of the um book Malignant Self- Love, Narcissism Revisited, which was the
- 154:25 first book to describe narcissistic abuse. I coined the phrase narcissistic abuse and I coined most of the language in use today um to describe narcissism at least
- 154:36 online. Um, I’ve been I’ve been um studying
- 154:44 narcissistic disorders of the self for well over 30 years. So I’ve been in this racket for 30 years and um I’m teaching in various universities especially in
- 154:55 Europe. Um I’m a professor of psychology. So now let’s move from this the not
- 155:02 important part to the important part. Yeah. No, I it is very important because
- 155:08 uh uh your content has been one of my first um inspiration to start to
- 155:14 understand this topic. So I am very grateful for all of your work and especially the very important insights
- 155:20 that you keep giving and sharing to the world about uh this very difficult challenging topic that is often
- 155:26 misinterpreted and misspoken about. So I’m very grateful that you keep uh sharing clarity and insight on this very
- 155:34 relevant uh uh topic. So what I would like to talk about today
- 155:40 is I would really like to get some of your expertise on cultural narcissism. I feel that this is a topic that is extremely relevant especially today with uh with things going on in the world and
- 155:51 things just exponentially growing and uh international relations becoming a very
- 155:57 important part of our daily lives. I believe it is a very relevant topic and yet I feel that very few people are
- 156:04 truly going deeply to discuss and research on it. So that’s why I would really love your input on it and we can
- 156:10 take this in any direction that you feel is relevant. Um, but how about we start with the effect of technology on
- 156:18 narcissism? I know this is an area that you’re interested in. And so I would really like to discuss this and talk
- 156:25 about this with you. And would you like to start giving your input and opinion and expertise on how for example
- 156:33 internet, social media and the incredible exponential growth of these tools in the past decade and the way that they have been impacting society especially our relationships. I’m not
- 156:46 sure if the effect of it was even predicted if if it was something that was wanted but it definitely had a huge determining uh effect on on the culture.
- 156:57 So would you like to take it from here and then we see in what direction we can further explore?
- 157:03 Yes, thank you. I I think we are confusing the the horse with a cart. Um it’s narcissism that impacted technology not the other way.
- 157:14 The rise of narcissism in society preceded the creation of the modern technologies that we use today for
- 157:20 communication for social interaction and so on. So starting in the 1970s and 1980s,
- 157:27 there have been several scholars, for example, Tuen and Campbell and others who have document documented an alarming
- 157:35 rise in what could only be described as pathological narcissism or at the very least narcissistic traits and narcissistic behaviors, especially among the young.
- 157:46 Technology was responsive to this. It’s a it’s a response to this. Technology is
- 157:53 always a lagard. Technology always always observes. So observe the people
- 157:59 who design technology observe social trends and then they structure technology in order to respond to these
- 158:08 social trends and of course inevitably to amplify them to enhance them. So,
- 158:14 social media and similar technologies were designed as as a reaction
- 158:21 to the need that people have felt to be seen and to feel unique.
- 158:31 As the population exploded throughout the globe, today we are 8.3 billion people. As we have transitioned from villages to cities
- 158:42 where social interaction is cursory and peruntory and limited and superficial.
- 158:50 Um people felt the need to be seen in the fullest sense not to be observed. To
- 158:57 be observed is something else but to be seen in a village. You’re seen. Everyone knows you. Everyone knows your family the history of your family. Everyone your your business is everyone’s
- 159:08 business. There are negative of course aspects of this but at least you feel
- 159:14 that you are grounded that you are immersed. It’s an immersive environment. The village is an immersive environment.
- 159:21 At least you feel that people care about you. Even if if the reason for caring is wrong, malevolent, but at least you are
- 159:28 the center of attention. Could be malign attention, could be benevolent attention or benign attention, but you’re always
- 159:34 the center of attention. and everyone is. So a village in many ways can be
- 159:40 described as a network, the metaphor of the network. And modern technologies, especially social media, tried to
- 159:51 recapture this network like structure. In a village, you have a network.
- 159:57 Everyone is a node. Everyone is equipotent. Everyone is equidistant.
- 160:04 and the information disseminates across the network very fast. And of course, that’s why we call social media social networks. It’s a village model. It’s an attempt to
- 160:18 escape the city and to be seen and to be the center of attention and to be noted and to be attended to and to be perhaps criticized
- 160:29 and perhaps supported. There are support forums and support groups and and so on so forth.
- 160:35 This has failed miserably. This attempt has failed miserably. And what it has done, what it has accomplished instead is a rise in automization
- 160:46 and the attendant soypism and narcissism because
- 160:52 social media compete with you for your time. Compete for your time. Compete for
- 160:58 your eyeballs. They monetize eyeballs. It’s an attention economy. So if you
- 161:05 have a boyfriend, your boyfriend is in direct competition with Facebook and Instagram. Every
- 161:12 minute you spend with your boyfriend is a minute not spent on Facebook or Instagram.
- 161:18 So this so-called social network are actually asocial network, not antisocial
- 161:24 but asocial. They encourage you to give up on intimacy, to avoid human contact, to isolate
- 161:32 yourself, to atomize the fabric of society and thereby they render you a hostage to
- 161:41 the social medium or the social network in the sense that 100% of your attention is dedicated goes there and is being monetized and in the bottom line. So
- 161:54 social media started with good motivation with a good motivation with a good idea in mind. The city is
- 162:01 anonymous. The city is alienating. The in the city you are just a number like in a prison a giant prison. In a city you are dehumanized and objectified
- 162:12 and everything. And we’re going to restore the village spirit. We’re going to create networks. And in a network you
- 162:20 can talk to many people. You will be you’ll exchange things, recipes, I don’t know what you will be loved and you will
- 162:26 love back. There was this utopian view of social media and social networks. But the economy of these technologies, the profit motive of these technologies
- 162:38 made them go exactly the opposite way. Isolating people to the absolute maximum,
- 162:44 encouraging them actively to hate, to troll, to um to rage, to envy. Relative
- 162:54 positioning, you know, envy is a crucial engine of these of these technologies.
- 163:00 and encouraging you to avoid all social contact because if you if you are in
- 163:08 connection or interaction with other people, this is time wasted because you could have spent this time on social
- 163:15 media and get these dopamine rushes or dopamine hits of likes and and views and
- 163:21 and so on. This is how I think it went. It started with narcissism. Narcissism is a defense. We should never forget this. We chastise narcissism and criticize it and attack and but
- 163:33 narcissism is is merely a defense and it’s a defense against against feeling that you against the feeling that you’re disappearing that you that you’re no more that no one
- 163:46 pays attention to you no one cares about you no one so this is a defense it says maybe no one pays attention to me but I’m godlike I’m important I’m omniscient
- 163:58 I’m omnipotent I’m it’s it’s self- enhancement because no one else would
- 164:04 enhance you. It is self-suppent because no one else pays attention to you. That’s how it started. And all the attempts to take care of it somehow
- 164:15 techn via technology only serve to make the situation much worse.
- 164:23 Wow. That that is a really brilliant uh exposition on uh the effects of technology which I believe a lot of them we have not even thought about these consequences and I I the connection
- 164:35 between trying to create a village life in a city I had not thought about it that’s that’s really unique so thank you
- 164:42 for that input and I remember also watching a video of yours where you
- 164:48 talked about how cities are the first form of virtual world and I thought that was also really brilliant way of uh
- 164:55 looking at this the way we have we’re disconnecting further and further from our natural tendencies from connection
- 165:02 to the natural world connection to natural needs and tribe relations and so on. So it seems that we have tried to sort of fix the problem of this
- 165:13 connection that cities have created but we have generated another yet another problem. So instead of being able to come back to the natural communion way of human life, we have just created
- 165:24 another patch that has led to another consequence and I think I think a much bigger bigger problem. It’s not only another problem. It’s a much bigger problem. Yeah. Another if I may just interject and add
- 165:36 another dimension to the conversation. Uh cities are symbolic spaces. That’s
- 165:42 why I call them virtual reality. Yeah. There are symbolic spaces. Everything is about symbols. The
- 165:49 manipulation of symbols, the accumulation of symbols, the exchange of symbols, the replacement of real life
- 165:56 objects with symbols. We even replace money with symbols. No one has money. People have credit cards or they have digits in the computer at the bank. They think it’s money. It’s not money of
- 166:08 course. And um similarly, people are beginning to interact remotely. They
- 166:14 have long-distance relationships. They have friends on Facebook and so
- 166:20 forth. So we are we are when we created cities, we disengage from reality, not
- 166:28 only from nature, not only from our nature, but we disengage from reality. We made a choice to transition from a
- 166:36 real space or a natural space to a symbolic space.
- 166:42 Now, social media attempted to reverse this. They attempted actually to
- 166:48 transition back to revert from a symbolic space to a real space. The idea
- 166:55 was that as you interact with people, you get to know them and there’s going to be a spillover. You’re going to meet
- 167:02 them in real life and you’re going to develop, you know, even lonely people, schizoid people going to find friends.
- 167:09 But of course what has happened is that we ended up converting people into symbols. While hitherto we converted mostly objects into symbols, living
- 167:21 environments into symbols. We symbolized almost everything except people. Now
- 167:28 with social media we are converting people into symbols as well. And the next stage of of course is the metaverse where we would be interacting not even with people but the with the symbols
- 167:40 that represent people in the game with the avatars that is completely
- 167:46 narcissistic. This is exactly what happens in the narcissist mind. The narcissist mind is a giant metaverse where every external person in the
- 167:57 narcissist’s life, what we call external object in psychology, every human being in the narcissist life is converted
- 168:05 automatically instantaneously into an internal object in the mind of
- 168:12 the narcissist. And that internal object is an avatar. It’s a representation of that person in the narcissist’s mind and therefore very reminiscent of a metaverse.
- 168:27 Thank you so much for this clarification. That was actually exactly where I wanted to go next. Uh because I I often do that. My apologies. No, that’s okay. That’s perfect. So it means we’re on the same page. So the
- 168:39 virtual reality space I was going to mention that from my observations and
- 168:45 experiences it seems that we’re transitioning into a narcissistic uh type of world because virtual reality is
- 168:52 essentially uh a non reality which is based on projection and as as you said
- 168:59 that is very similar to to the mind the narcissistic mind and now we also have the phenomenon of artificial intelligence which is adding being another layer to all of this complexity
- 169:10 and all of these um mechanisms. So now so many of the functions that were
- 169:17 considered human before um functions that were even daily functions that we performed on a daily basis in a in a traditional let’s say society are now being slowly actually fast not so slowly
- 169:30 replaced by uh robots and artificial intelligence mechanisms. So I can see
- 169:36 from my own experience how for example using Google maps on a daily basis has sort of led me to forget a little bit of my orientation capabilities and reading you know abilities to read maps. So imagine what will happen as we replace
- 169:52 relationships with um not only virtual but AI generated relationships. So I
- 169:59 would like to take have you the floor take the floor on this because I I believe you you have a lot of expertise
- 170:05 and opinion on on this subject as well. Just a small correction before we proceed. These symbolic spaces are based
- 170:13 on introjection not on projection on the conversion of everything outside into an
- 170:19 internal object including people. So even people are converted into internal object. When we convert an external
- 170:26 object, even if it is a real object, even if it is a belief, even if it is an idea, even if it is an ideology, when we convert these into internal elements,
- 170:37 internal components, then we call this process introjection. Um, regarding your your question again I think what has happened is that there
- 170:48 was the first there was a social trend and technology is reactive to it. The
- 170:55 social trend is what we call in psychology consumaciousness. Consumaciousness means the rejection of
- 171:01 authority, the hatred of authority. So there’s a hatred of authority.
- 171:07 There’s a hatred of political authority. There’s a hatred of intellectual authority. There’s a hatred of learning
- 171:14 and expertise and knowledge. There’s a hatred because it’s a narcissistic defense. If you know more than I do,
- 171:21 then you are superior to me and you can’t be superior to me because I’m God.
- 171:27 You know, it’s a narcissistic reaction. So, consummatiousness is an element in psychopathic narcissism. It is an element in what we call reactance which is a fancy way of saying defiance. Okay. So it started with contmaciousness. We
- 171:42 saw it in the 60s started in the 60s. 1968 the revolutions all over Europe
- 171:49 France this that the 1960s in the United States the hippies the all these movements uh free love and you name it there was a rejection of authority. It
- 172:01 started with a rejection of political authority. All it always does. But it ended up with a rejection of the intellect, intelligence, knowledge, learning, books,
- 172:14 hatred. Not only rejection, absolute emotional investment in hatred, which is the outcome, inevitable outcome of envy. And now that we have rejected authority,
- 172:27 what could replace it? the mob, the crowd. So we have Wikipedia.
- 172:34 We have Wikipedia long before artificial intelligence. What is Wikipedia? Wikipedia is a rejection of experts and
- 172:42 authorities. It is the crowd of the mob creating its
- 172:48 own encyclopedia. I mean the hell with you with the Britannica. The hell with the encyclopedia Britannica. We are much
- 172:54 better you know. So crowdsourcing is an example of olocracy, mob rule.
- 173:03 It’s an example of a rebellion against established intellectual authority.
- 173:10 Artificial intelligence is nothing but crowdsourcing. It’s just another name for
- 173:16 crowdsourcing. What artificial intelligence models do? They scan billions of pages, billions of
- 173:23 and they give you the answer. That is a great description of crowdsourcing and that’s exactly what Wikipedia does. Only Wikipedia does it with human beings. Human beings scan these billions of
- 173:36 pages. Then they create an encyclopedia. Here a technology is scanning these very
- 173:42 same pages and it creates its own encyclopedia. In effect, artificial
- 173:48 intelligence is Wikipedia extended by other means. That’s all. It’s a
- 173:54 rebellion against intellectual authority. That’s why artificial intelligence lies
- 174:01 a lot. Artificial intelligence models hallucinate. They give you wrong answers
- 174:07 very often. It’s it’s false and it’s a lie and it’s uh a scam, a scam and a
- 174:15 swindle. When the artificial intelligence companies tell you the accuracy is 99% they are bullshitting
- 174:22 you. I would be surprised if it’s 30%. I tested various artificial intelligence
- 174:29 models. I asked them 20 factual questions about my life. Factual, fact-based, like where was I born? I tested them. They failed eight out of
- 174:42 10 times. Eight out of 10 times they got the answers wrong. I was born in Macedonia.
- 174:50 I wasn’t. I was born in Israel. Mhm. I happen to be right now in Macedonia,
- 174:56 but I was born in Israel. One example. Yes. My sister wrote the book Malignant, Self- Love, Narcissism, Revisited. I’m
- 175:02 kidding you not. So, this is artificial intelligence
- 175:08 where the illiteracy and the ignorance and the stupidity of the masses
- 175:15 is accumulated, structured, shaped, and spewed out.
- 175:22 Garbage in, garbage out. It’s precisely the model of artificial intelligence. And for a very, very long time, it was the model, the working model of the Wikipedia.
- 175:33 Until Wikipedia has been taken over by expert editors and now it’s much closer to a traditional encyclopedia. You can’t just do whatever you want. There are strict
- 175:46 structures that make sure that you don’t vandalize and you don’t spread nonsense and misinformation and so on. Wikipedia
- 175:52 has become a trustworthy resource because it stopped being a crowdsourcing
- 175:59 uh resource. Artificial intelligence also involves another trend, the trend of outsourcing.
- 176:06 So not only crowdsourcing but outsourcing. Outsourcing is when we say
- 176:12 we would like internal psychological processes to be regulated from the outside rather
- 176:20 than from the inside. So for example, we derive our sense of self-worth and
- 176:26 self-esteem and self-confidence from the number of likes and views that we get on social media. That is external regulation. The outside regulates a process that
- 176:39 should have been completely internal. Your self-esteem should not rely on what
- 176:46 other people have to say about you. You should know yourself well and your self-esteem is a derivative of this. End of story. And so we have be we have begun in the
- 176:57 last 40 years to outsource uh something called external regulation.
- 177:03 So our moods for example are now very responsive to the outside much less than
- 177:09 to the inside. Uh our cognitions are shaped by the outside. We don’t do research anymore. We don’t you know we
- 177:20 embed ourselves in likeminded thought silos with confirmation bias and
- 177:26 we keep repeating the same mantra over and over and over again. Adnosium.
- 177:32 It’s not. So this is the second trend outsourcing of internal functions. We became
- 177:38 essentially hollowed out, emptied. We became externally regulated zombies.
- 177:46 And this is the second trend. And when you when you outsource, you have
- 177:53 you of course transition from an internal locus of control to an external locus of control. In other words, you
- 177:59 begin to believe that your life is determined from the outside, not from the inside. Because indeed, you have
- 178:07 outsourced your mind. You gave it you gave authority of your mind to external factors.
- 178:13 And of course, this immediately gives rise to conspiracy theories and paranoid ideation.
- 178:20 You can see that it’s a chain. These links are inexurable. They lead to
- 178:26 each other naturally and with great inner reason.
- 178:32 And so we are going there. We are going to the end of the end of a human being
- 178:38 as we used to know it. We have we have we started humanity
- 178:45 started it self reflection and self-perception as agent is on based on
- 178:52 the concept of agency. You were, for example, a moral agent. And now, if you’re a moral agent, that means you
- 178:59 should be punished if you misbehave because you’re an agent. You have agency. But wait a minute, if everything
- 179:05 is outsourced and crowdsourced, and maybe whatever it is that you do is not
- 179:12 punishable because you have lost your agency. And then you have people like Donald Trump who never pay the price for
- 179:21 their crimes, you know, because he claims the agency is not his. He’s being
- 179:29 persecuted. He’s being victimized. And of course, this sits well with the age of victimhood.
- 179:35 The famous famous sociologist Bradley Campbell said that uh we have
- 179:41 transitioned from the age of dignity to the age of victimhood. What is victimhood? Is when you hand control
- 179:47 over yourself to someone else. When you outsource your locus of control becomes
- 179:53 external locus of control. And I could go on like that forever.
- 179:59 These are these are processes that are interlin. They feed on each other
- 180:06 and they affect other other issues. For example, the very idea of truth
- 180:14 and originality. Walter Benjamin, of course, was the first to to discuss the issue of originality in the in the age
- 180:21 of mechanical copying and and so on so forth. But I think he didn’t go far enough, Walter Benjamin, because had he
- 180:28 gone one step further, he would have realized that the con concept of originality is inextricably linked to
- 180:35 the concept of truth. Originality is not only about authenticity. It’s about
- 180:41 truthfulness about the truth. The very concept of truth. And what he failed to to re realize in in my view is that the age of mechanical copying
- 180:53 would lead us to the erosion of the very concept of truth. Because if nothing is original and
- 181:00 everything is a copy, if nothing is a copy and everything is original, then everything is relative. And if
- 181:07 everything is relative and there’s no fixed point, no achimedian point, then everything is simultaneously false and true depending on your point of view, your personal history. In other words, opinion becomes the truth. Yeah. Yeah.
- 181:23 And so it’s all interlin and again I’m I’m not joking when I say that I can
- 181:29 continue for a few hours discussing all these because there are many more trends. I I’m sure of that many additional trends. But I just
- 181:35 wanted to give you a taste of what the way I see things. No, and and I could definitely go a few
- 181:41 hours listening uh to you talk about this topic because I again like I said I feel like we really are not going into
- 181:49 depth enough and it is a very relevant topic. It’s it’s changing our society radically and exponentially fast and
- 181:56 we’re not really thinking about the consequences. So I’m glad at least someone is is thinking about the
- 182:02 consequences and and and talking about them with the world. So, thank you so much for having so much insight and so
- 182:08 much interest in into really going deep into these kinds of dynamics and um I feel you know I feel like as someone who’s an expert on narciss narcissism and narcissistic uh tendencies and
- 182:20 behaviors uh you you probably can see very much more accurately the dangers of
- 182:26 these dynamics and how we are encouraging these kinds of behaviors both at the social level, collective level but also at the individual pathological So, um, it it is, you know, I think
- 182:37 you’re it’s great that you’re doing this as a service to explore these. Yeah. But we are, you are members of an
- 182:45 extinct species. They are, we are, we are dinosaurs. We’re dying. Yeah, we’re dying. And we are dying because no one would listen to us. And we are also dying because um people are incapable of listening to us because they’re dumb. They’re uneducated or they choose to to
- 183:02 not listen. there is um a defense against learning. There’s a resistance to learning and so
- 183:08 on. There’s anti-intellectualism, hatred of learning and knowledge and and
- 183:14 so rejection of of intellectual authority or at the very least scholarly authority and so on. So we are fast we are fast disappearing. People like us
- 183:25 are fast dying. And I feel a sense of futility. um having spent my entire life basically learning and studying and reading and
- 183:37 teaching and I feel a sense of futility because the world at least for the next
- 183:44 100 or 200 years because we entering a period of a few hundred years at least for the next few hundred years maybe 200 things are much faster nowadays than they used to be in the
- 183:55 middle ages so maybe it’ll be shorter be 200 years but definitely definitely many decades. We are entering
- 184:02 a period where uh people like us are gradually going to become the enemies. At this stage we are ignored. At this stage people like like us are being
- 184:14 ignored. Some of us are getting fired. Some of us are getting criticized. Some
- 184:20 of some of us are getting threatened mildly or not so mildly. But there will
- 184:26 come a time that people like us will be executed physically.
- 184:32 It’s a question of time, not a question of if. That’s where we’re heading.
- 184:38 And so absolutely. Yeah. I feel very dis despondent. I very I despair. I have great despair. And it reminds me of how intellectuals reacted
- 184:49 to the rise of fascism and Nazism in the 20s and 30s of the last century in Europe. and how they just gave up. They stopped
- 185:00 talking not because they were afraid but they simply saw no meaning in opposing this tsunami that no one can oppose you
- 185:08 know and some of them committed suicide simply committed suicide. So I don’t feel it is my world anymore.
- 185:20 I grew up among books. I I adored and admired learned people. These were my
- 185:26 role models and heroes. And while in the 50s the number one superstar was Albert Einstein, today it probably would be some obscure
- 185:38 footballer or influencer or Kagashian type. You know,
- 185:44 the world is debased. The world is corrupted in the worst sense of it’s rotten in the worst sense
- 185:50 of the word. And I see no hope in the near term.
- 185:56 Obviously, because stupid people are taking over, narcissistic people are taking over. Obviously, the world will
- 186:03 implode and within 200 years there will be massive devastation. I’m not ruling out
- 186:10 a nuclear war. Absolutely not. There will be massive devastation. And then
- 186:16 people like you and me, we will have to rebuild the world. We’ll have to rebuild it. And um of course ultimately there is hope. If
- 186:27 you’re willing to wait two 300 years, there is hope. But the next 200 two 300 years are going
- 186:33 to be absolutely horrible. a combination of the worst part of the Middle Ages
- 186:40 because there was a part of the Middle Ages which was not bad actually but the worst part the early middle middle ages
- 186:46 and um the worst part of the 20th century
- 186:52 that’s where we’re heading a confluence a marriage between the early middle ages and the 1920s and 30s in uh in Europe in
- 187:00 the world not all Europe I I see I see that that trend definitely and and
- 187:07 That’s why I sort of uh I keep uh talking even though maybe it could be to deaf ears but there’s always I have
- 187:13 always a little bit of a hope that maybe by planting seeds somewhere something will survive you know some some kind of
- 187:19 consciousness and self-awareness and willingness to really explore and understand and to do something constructive with life. You never know
- 187:26 you know maybe it will survive. So if nothing else it makes my life a little bit more worth worthy uh of live of
- 187:33 living. Um but the other thing I would like to talk to with you about because this is another topic that I I am just uh surprised that it is not talked about
- 187:44 more given the amount of international transactions that we’re having today and again international exchanges that are
- 187:51 increasing at exponential value and yet are not necessarily integrated with the
- 187:57 cultural exchanges um by living in both Asia and the west
- 188:03 and by living in different countries and continents and cultures, I noticed that there are extremely radical differences
- 188:10 both in the cultural norms, let’s say, but also under the the lens of
- 188:16 narcissistic behavior in the way that, for example, narcissistic tendencies and trends and behaviors are encouraged or
- 188:24 discouraged in different cultures. And it is sometimes I have to admit that sometimes it is um mind mind-blowing
- 188:32 because I feel like I have to live in two completely opposite realities when I deal with people in the west in the east
- 188:38 due to the fact that some tendencies that are completely encouraged and seen as heroic if you will and a
- 188:45 representation of great values in one culture as seen as demonized and uh criminalized in another culture. somehow I have to figure out a way to integrate all of that. So, please take the floor.
- 188:57 Again, I I’m sure you have a lot of expertise on the topic and I would love to hear your opinion and research on it.
- 189:04 First of all, it’s important to make a distinction between the hidden text and the occult text or the
- 189:13 to revisit uh works by Altuser and others who who’ve dealt with some of these issues. Altuser of course ended up uh crazy in a
- 189:25 mental asylum like Nichi before him and many others. I think when you see the world as it is, this is a serious risk. Yeah. So there is a hidden um there is a overt
- 189:38 text and hidden text. The overt text is the globalized west.
- 189:46 Western values such as for example capitalism, growth, economic growth,
- 189:53 um democracy. Ironically, you have elections everywhere. You have elections in China
- 189:59 also. You have elections in Russia. This is a western import.
- 190:05 So, but this is of course the overt text. The de the not the not the text
- 190:11 that is not deconstructed. The text that is misleading, superficial
- 190:18 um and teaches us nothing about the nature of reality. It’s a text that is self-referential.
- 190:25 It’s a text that refers to itself but never to reality. It has no, you know, connotations, denotations. I’ll not go into it right now. So the the overt text
- 190:37 is western values and I would say even much more so western lifestyle and
- 190:43 western ideology ideologies. But the overt text is of course
- 190:49 irrelevant. The what is relevant is the occult of the hidden text.
- 190:55 And the hidden text in each and every cultural sphere in each and every is of
- 191:02 course dramatically different but I think can be divided in two major groups
- 191:08 and there is a lot of work that’s been done on this by the likes of Caponyi and Roland and Theod Milan the late Theodom
- 191:15 Milan and others and they suggested that you could divide the world basically into collectivist
- 191:21 societies and individualistic societies. where the emphasis is on the individual
- 191:28 and where the emphasis is on the collective. Now the Renaissance the Renaissance in the when I say Renaissance I’m talking actually starting in the 12th century not not
- 191:39 necess not not not so much but the Renaissance um was the cult of the individual
- 191:48 and because it was the cult of individual it came out with various manifestations of individualism that we
- 191:55 are suffering to this very day. I regard the Renaissance as one of the most deletious, detrimental and destructive intellectual
- 192:06 movements in human history. It gave rise, for example, to the personality cult, Nicolola Mchaveli and
- 192:14 the prince. It gave rise to totalitarianism. It gave rise to malignant individualism.
- 192:22 The Renaissance gave rise directly to fascism and Nazism. There’s a direct
- 192:28 lineage there. And I will not go into it right now, but we could dedicate a whole another maybe talk to it. But so it is the Renaissance
- 192:40 that introduced the the idea of individualism. The very concept of authorship, the
- 192:48 author, these are Renaissance inventions. Prior to that, art was basically a collective endeavor, a collective effort. You don’t know the names of the artists in ancient Egypt.
- 193:01 Not one of them. So this is a new individualism is a renaissance thing.
- 193:07 And then you have societies which luckily for them were not affected by the Renaissance because they were too far away like Japan and China and so on so forth. And they’re collectivist society.
- 193:18 Narcissism exists in both individualistic societies and collective societies because it is part of human
- 193:25 nature. However, it manifests, it expresses differently.
- 193:31 Whereas in an individualistic society, the individual would take credit, would
- 193:37 boast, would brag, would make claims about accomplishments which are
- 193:43 counterfactual, would would would lie, would you know everything everything would revert to the the locus would be the individual. So the individual an
- 193:54 individual athlete may say, I trained a lot. I worked very hard and I accomplished this. An individual Nobel
- 194:01 prize individualistic Nobel prize winner would say you know I spent all my life studying this chemical reaction and then
- 194:08 I succeeded suddenly with inspiration I succeeded to this and to do that and so on. In collectivist societies you would
- 194:16 have the same narcissism. But a collectivist athlete would say, “I
- 194:22 want to thank my coach and the support of my team members without which I would have never accomplished this.”
- 194:30 And a collectivist Nobel Prize winner would say, “In our laboratory, which is one of the best in Japan, we succeeded to break this enigma which other
- 194:41 laboratories in all over the world failed to do.” These are both narcissistic, grandio expressions, but
- 194:49 they relate to the collective. A worker in a in a corporation in Japan,
- 194:56 a salary man as they call it, you know, in in Japan, would be proud to belong to that company. He would derive his grand sense of grandiosity is from the fact
- 195:09 that he belongs to that company that is part of this collective. Whereas a chief executive officer of a
- 195:17 similar competing company in the United States would make it all about him. He was the visionary. He came up with new ideas. He implemented new procedures. He and he and he even very often nonsense.
- 195:31 He didn’t do anything. But so narcissism is all over the world. There is not a
- 195:37 nook or cranny or angle or that is free of narcissism. No such thing.
- 195:43 Arab societies, Asian societies, African societies, uh, Western society, they all have narcissist, but they are legitimate and non-legitimate ways of expressing
- 195:55 narcissism. And so each society structures speech
- 196:01 structures what informs the individual which what is legitimate speech and what is not legitimate speech and sometimes
- 196:07 even penalizes non-legitimate speech as we have seen for example during the COVID pandemic.
- 196:13 So you pay a very high price if you use non-legitimate speech. um if you belong to um a white
- 196:21 supremacist militia in the United States and you’re proud of it, you’re very likely to end up in prison. So even if
- 196:28 you are a collectivist by nature and your grandiosity is the outcome of belonging to the group, you may end up
- 196:34 in prison for this if you’re a neo-Nazi for example in Germany.
- 196:40 And similarly, if you are too individualistic in Japan, you are likely to attract very negative attention which
- 196:46 could end very badly for you. And there were a few actors in Japan who were put in prison because they were too
- 196:53 individualistic in many ways. And so they were accused of sexual assault and other things and so
- 197:00 there are permissible permissible speech acts in every society and that’s the only difference. Nothing
- 197:07 substantial, nothing fundamental and nothing clinical is different. It’s the same narcissism. Just the way we you
- 197:14 talk about it depends, you know, is responsive to your culture. Yeah. Exactly. The way I see it is that it’s almost like when you have a a flow of water and you close the flow in one spot, so the water flows in the other spot. So uh in a similar way whatever
- 197:31 the society permits whatever it encourages and it allows that’s where you’re going to see those traits coming
- 197:37 out. So uh in my personal experience and it could also be due to the and then we could even explore the topic on how technology on eastern and western how how differently it is being applied and
- 197:49 it’s it’s influencing culture because that’s another we could you know that’s another book um that could be written on
- 197:56 u but the way I see it uh being expressed here for example in Asian uh cultures a lot more is what most people would define as covert kinds of
- 198:07 narcissistic expressions as opposed to the more overt ones that are more encouraged and promoted and and accepted
- 198:15 in western cultures. So I’m seeing much more of the as you said for example with
- 198:21 the example of Japan if a person were to brag too much about their own accomplishment here they would be
- 198:28 immediately seen as shameful by the by the society. So it is not encouraged at all to do these kinds to promote these kinds of behaviors which are instead encouraged in western societies. But at the same time when a person
- 198:44 self-sacrifices to great degrees even to the detriment of of everybody else around them and themselves. But that is
- 198:51 because it is a very collective value self-sacrifice and selflessness and just
- 198:57 giving giving giving working for society because that is considered such a high value in this society there is a lot of
- 199:04 these narcissistic tendencies of of doing all of that so in order to get the attention to get the approval to get you know the agilation and so on and so forth. So uh I would like to uh I know that you
- 199:16 haven’t haven’t been using uh clinical language in what you’ve just said. You’ve been using you’ve been using
- 199:23 colloquial language which is okay. Yes. Yes. Yes. But I would like to uh comment because
- 199:30 some of these words have clinical meaning and if if I don’t correct the record it could be so for example covert the word covert
- 199:41 has a clinical meaning it means a narcissist who is unable to obtain narcissistic supply unable to garner
- 199:48 attention and therefore becomes bitter sthing with envy and passive aggressive we call it
- 199:55 the collapse narcissist. So I wouldn’t use the word covert in for example when
- 200:01 we describe collective or collectivist narcissism. So in a collectivist societies, an overt
- 200:09 narcissist, not a covert narcissist, would brag and boast as much as Donald Trump does, but
- 200:16 he would brag and boast about the collective, about belonging to the collective or being an integral part of the collective or enjoying what the collective has to offer or so this it’s
- 200:27 still a grandio overt narcissist just there are permissible speech acts and
- 200:34 non-permissible socially unacceptable and frowned upon speech acts. So I wouldn’t use the word covert. It’s it’s a bit misleading. Um what you describe at the very end of what you’ve said is known as pro-social narcissist. There is a variant of narcissist
- 200:52 who is communal. It’s a narcissist whose whose grandiosity is about how
- 200:59 altruistic he is, how charitable, how compassionate, how amazingly caring, how
- 201:06 flawless, how righteous, how so this kind of narcissist does good deeds but
- 201:14 then brags about it. He his good deeds are ostentatious. They are visible public. Everything is done in public. There’s no private sphere. And so we call we call these kind of narcissisms
- 201:26 pro- social narcissism. For example, probably mother thea is is an example
- 201:32 and maybe maybe Greta Thunberg is an example. So these are people who are
- 201:38 essentially highly narcissistic and they ex they externalize
- 201:44 um they create a facade or a of look how how pro-social I am not antisocial. Exactly the opposite. Look how beneficial and benevolent I am and
- 201:56 magnanimous and amazing and moral and ethical and I I would even say the
- 202:02 superior the locus of the superiority is in the morality. I am more moral than
- 202:08 you. It’s competitive morality and sometimes there is competitive victimhood. I’m much bigger victim than you are. My abuser is much worse than your abuser. you know, I’ve been hounded by by the CIA much more than you did.
- 202:24 So, it’s it sits there is um a seamless integration with paranoid ideation as
- 202:31 well. So, paranoia, victimhood, pro-social, ostentatious pro-social and communal narcissism, they very often go in hand in hand. And then you see a pro-social
- 202:43 narcissist who says, “I’m a very moral person. I’ve never done anything wrong. I’m righteous and so on. And because I’m
- 202:52 like that, everyone everyone is attacking me. Everyone is hounding me.
- 202:58 Everyone is, you know, and I’m a victim. So you have a smooth transition from a
- 203:05 pro-social claim, narcissistic claim to victimhood to paranoia.
- 203:12 Smooth integration. So all these are nuances of pathological
- 203:18 narcissism that um unfortunately get lost online.
- 203:25 People make Yeah, absolutely. People make a huge mess between psychopathy and narcissism and
- 203:31 Yes. Yeah. Yeah. Exactly. And I’m sure I I also sometimes confuse the terms. And
- 203:37 yes, that’s it’s a communal narcissism. that’s the more accurate term to describe these tendencies that I that I
- 203:44 observe uh very intensely in Asian cultures and not as much in western
- 203:50 cultures. And it’s really amazing sometimes um how extreme uh one of these
- 203:56 tendencies will go in a culture where where it is so accepted and promoted and valued. So to the extent of incredible
- 204:03 levels of self-sacrifice, incredible levels of victims and so on and so forth because it is part of the value system
- 204:09 of the culture. And so that’s how the narcissistic traits can come out uh
- 204:15 hidden as uh goodness and uh and being a good part of society and so on and so forth. So thank you so much for the correction and the clarification. And um
- 204:26 so I we probably don’t have that much time left. I don’t want to steal too much of your time. Although I could really talk about this for hours.
- 204:32 Don’t worry about my time, but the attention span of viewers is likely to dwindle dramatically.
- 204:38 Lose them. We might lose them eventually. Um, so but if you is there okay so I will leave the last minutes to
- 204:45 you for if you have any anything you would like to add. No, I I much prefer to be led. I’m a
- 204:51 very submissive type as you may have noticed. So So all right. So even though we could really honestly have another entire conversation on this but would you like to touch upon the subject of how uh
- 205:03 technology is impacting eastern cultures west versus western cultures because
- 205:09 that is another topic that I think very relevant in the way that we are shaping society today and the world. So
- 205:17 one common misconception is that technology creates or generates or engenders social trends. I am not aware of any technology ever in
- 205:28 human history that has created a social trend, not even the printing press. And
- 205:34 I can go into details what I mean and so on. But to generalize, I’m not aware of such a
- 205:40 I am aware however of numerous social trends that gave rise to technologies.
- 205:47 Um and so when we when you ask the question how is technology leveraged,
- 205:53 accepted and integrated and assimilated in various cultures, what technology
- 205:59 would do is to amplify these cultures. Amplify. Now one could argue that
- 206:07 quantity becomes quality and if you amplify something sufficiently you create something new. That is an
- 206:14 interesting argument. That is an interesting argument. For example, you have narcissism and then social media come and they amplify the narcissism. The narcissism was there. Nothing new.
- 206:25 But having been amplified, maybe we are faced with something relatively novel, something that a kind of narcissism that has never happened before. So this argument has its place. Of course, I think in western societies
- 206:41 um in western societies um technologies amplify
- 206:47 narcissism and to to a large extent psychopathy whereas in eastern societies I think
- 206:55 technologies amplify cohesion and compliance. I don’t want to see to say obedience. I don’t want to say slavishness. I don’t want to say submissiveness
- 207:07 although in some countries definitely that’s the case but shall we say conformity
- 207:14 though technologies there would encourage conformity because they homogenize these technologies homogenize huge numbers of people again
- 207:26 there is one thing I keep saying and people people hold me to task for it I keep
- 207:33 saying that modern technology ies are totally reactionary and they’re reactionary because they
- 207:40 lead us back to the past. They lead us to the village and they lead us for example to
- 207:46 homogenization. Initially there were television networks and these television networks captured 80 70 80 60% of the audience every single night. In many countries 100% in
- 207:59 every single night. Even in the United States there were three television networks and between them they captured
- 208:05 close to 80 or 90% of the audience. So there was essentially
- 208:12 uh homogeneity and then what happens? Cable television came. Cable television fragmented the
- 208:20 audience. It fractured the audience. Okay. And then
- 208:27 social media recreated the homogeneity of the public again recreated it. That’s
- 208:34 why that’s another example of a reactionary trend going back to the past. So today you have identical experiences.
- 208:45 It’s true that you are exposed to different posts and different images and different reels and different views and
- 208:52 but you are within the same platform. It’s the same platform. It’s like watching like watching NBC or ABC. It’s true that on NBC you could see a
- 209:03 basketball game, you could see a soap opera, you could see the news and it’s true internally the content shapeshifts
- 209:11 and so on but you are you are in hawk and you are in inside a single platform
- 209:19 and the algorithm of a platform homogenizes. It’s a algorithm that homogenizes
- 209:27 regrettably leveraging what we call negative effects like envy, anger, fear
- 209:34 and this create creates even more homogeneity. So when you take social media and other
- 209:42 technologies by the way you mentioned artificial intelligence multiplayer games which is a fascinating topic
- 209:48 multiplayer games are are complete soypistic self-contained self-encclosed
- 209:55 um spaces alternative realities you can definitely go into the game and never
- 210:01 exit because you can buy things you can trade things you can get married you can work you can and the metaverse is the
- 210:08 idea to expand multiplayer games and to include your workplace or your grocery store or your pizza parlor or whatever. So when you take these technologies and
- 210:19 you superimpose them on eastern societ let’s call them eastern or the south
- 210:26 southern and eastern societies which are essentially collectivist societies. you the homogeneity built into these uh encourages conformity,
- 210:39 encourages encourages obedience, encourages, you know, so in the in the
- 210:46 east when you take the very same technologies and you superimpose them on the west,
- 210:53 what you get is homogeneous individualism.
- 210:59 Everyone thinks they’re special. Everyone thinks they’re unique, but they’re special and unique in the same
- 211:05 way in predetermined ways. The algorithm limits you 100%. There’s it’s a single
- 211:12 thoroughare. It reminds me of choser choser and the pilgrims. You know, there are many types of pilgrims. there’s this and there’s that and they disagree and they hate each other and they fight and
- 211:23 so on but they’re all walking the same road the same and and this is a troerian
- 211:30 sin western homog western individualism
- 211:36 is the ultimate form of conformity this is what western people fail
- 211:42 completely to understand their rebelliousness their defiance
- 211:49 their individualism, their, you know, consummatiousness, their in your face. They’re I’m special.
- 211:55 I’m I’m unique and I’m This is all channeled and predetermined.
- 212:02 It reminds me that when you when you work with certain softwares, they give you a choice of templates.
- 212:09 Then you choose a temp, it’s like choosing a template and saying, “You see how unique I am? I chose this template.”
- 212:16 Yeah. But the template is predetermined. This is the mother of all conformity.
- 212:22 You know, you can’t create your own template. You have to choose one of the templates.
- 212:28 And yesterday I wrote something. I said that uh actually let me get it let me get it right. Okay. I wrote something which kind of captures
- 212:39 what we’re talking about. Hold it for a second. Be patient. And here’s what I wrote. If the cage is
- 212:47 sufficiently large, it creates the illusion of freedom. Oh yeah. Yeah. When the enclosure is adequately provisioned, it is mispersceived as
- 212:59 home. That’s what I wrote yesterday. And so we have
- 213:07 individualistic conformity and collectivist conformity the same way we have individualistic narcissism and
- 213:14 collectivist narcissism. It’s the same thing. It masquerades differently. It
- 213:20 appear expresses manifests differently. It reminds me that in biology you have genes. You have a gene
- 213:27 and the same gene fulfills several functions very often depending on on combination with other genes and so on.
- 213:33 But the environment sometimes determines whether a specific gene is expressed or
- 213:39 not. This is known as epigenetic expression. So the environment is and it’s the same
- 213:45 here. you have conformity and the environment tells you how to express your conformity and you think you’re being in an individual you know
- 213:56 so in Japan the environment tells you if you want to express your conformity you have to do it through the collective and
- 214:02 by belonging to the collective you would feel special and unique and so on and in the United States they tell you if you
- 214:09 want to express your conformity you have to do it through by being an individual and claiming individuality But you can claim individuality only in
- 214:20 these prescribed ways which is a total oxymoron total contradiction in terms
- 214:26 you know and of course I’m not the first one to say this sra of course discussed
- 214:33 the issue of au authenticity and how authenticity is extremely close to impossible in in especially in western society. He gave the example of the waiter. The waiter in a cafe, the waiter comes in, changes his clothes and becomes a waiter and so on so forth. So
- 214:51 I’m not the first to to suggest this, but it is relevant to to your question. It’s the same phenomenon
- 214:58 masquerading differently, but it’s the same. Don’t don’t don’t think that there is any
- 215:04 fundamental difference between Japan and Texas.
- 215:10 None. It’s just that in Japan the same phenomena are expressed one way and in
- 215:16 Texas definitely a very different way but it’s the same phenomena.
- 215:23 Brilliant brilliant answer and um this really helps to sort of wrap everything
- 215:29 around and isn’t aren’t the things that we’re least aware of the ones that control us the most. So it it is almost
- 215:38 an oxymoron that because we believe we’re so free, that is what actually ends up making us so enslaved. So u
- 215:46 yeah, nothing um hides better than in plain sight. Well, thank you so much for
- 215:52 this very insightful and very interesting and fun conversation. And again, I could really talk more about
- 215:58 these kinds of topics. So if you want to come back anytime. Um no, it’s up to you. I told you I’d like
- 216:04 to be led. It’s up to you. I’d be I’d be happy to talk to you again if that’s what you’re saying. Yeah. I mean, I have a lot of these
- 216:11 kinds of questions that we could go into. So, pleasure. We can definitely So, thank you so much for your time.
- 216:17 Would you like to share also uh obviously I will put your information under the my the video that I published.
- 216:23 Would you like to share anything about how people can reach you? What kind of books? Just Google Just Google Sambakin. I have
- 216:29 a YouTube channel. Um and I have a zillion. I’ve been one of the first on the internet. I’m all over the place. Yeah. So, just type just type Samakin and you you’ll get everything you need. Okay. Okay. Any final words? Anything you would like to say to the people who are
- 216:45 willing enough to listen all the way to the end of this conversation,
- 216:52 disengage. The reality right now is manifestly
- 217:01 and totally toxic. I cannot see a single redeeming feature in reality nowadays.
- 217:08 For your own sanity and for your own survival,
- 217:15 perhaps automization is not such a bad thing and avoiding contact with people.
- 217:24 There is a debate whether we are truly zonolitic and we are truly social animals.
- 217:30 I doubt I have my doubts. I think we are not social animals. I think the concept of the idea of society is very new and a
- 217:37 bit counterfactual. It’s very new definitely. The first time we anyone discussed society was in the late 18th
- 217:43 century. Exactly like childhood is a very new concept is 150 years old. So
- 217:50 the idea of society is very new and every new idea creates an ideology and
- 217:56 every ideology interpolates you like Alusa said. Every ideology forces you to
- 218:02 behave in highly specific ways and to think in highly specific ways. Don’t.
- 218:08 It seems that this attempt to create an organizing principle and a hermeneutic
- 218:15 principle, explanatory principle in the form of society, this idea of society has failed. Has failed. Early early early enlightenment figures
- 218:28 including of course John Jacuso and other they hinted that it might fail. Even the great believers in society like
- 218:35 Adam Smith and Hopes even they you know had their doubts. I think we have
- 218:41 reached a breaking point where the concept of society the idea the organizing principle has failed
- 218:48 completely. We need now to protect ourselves from others.
- 218:55 Others represent a threat. All others. Your nearest and dearest and closest are
- 219:02 no exception. And the only way to protect yourself is to create an inner world that is rich
- 219:08 enough and supportive enough so that you can somehow survive in the
- 219:14 only virtual reality which is which is healthy. And that is your mind. Your
- 219:22 mind is a virtual reality of course. So don’t look outward for solution.
- 219:28 Don’t look for for example for a virtual reality out there provided by Zuckerberg. But look inwards. You have everything you need, everything it takes. You have
- 219:39 all the resources from the moment you’re born. We come we are we come fully
- 219:45 equipped. And so if the environment and so-called society and so on, the world has failed you,
- 219:56 you feel free to withdraw and to avoid and
- 220:02 to wait to wait it out. I know this is an exceedingly unpopular message and might be even construed as mentally unhealthy message because we have for example schizoid personality disorder,
- 220:14 avoidant personality disorder. But these disorders are value judgments.
- 220:20 To say that someone is avoidant and to pathize it is because avoiding is
- 220:26 perceived to be not good, not okay. That’s not not a clinical entity. That’s
- 220:32 a value judgment. So feel free to be schizoid and avoidant
- 220:40 because the alternative is increasingly more dangerous. Yeah. And threatening. That’s my message. And that’s what I do in my personal life.
- 220:51 Well, thank you for this very, very valuable final words. And yeah, I I I
- 220:57 have to say I resonate a lot with your message. And anyways, fortunately, I never get tired of still trying, you
- 221:05 know, trying my best to do whatever is possible up until the day that I’m out of here. So hopefully something good
- 221:12 will come out of it. Thank you so much again for this conversation uh Dr. Sambaknin. I’m very honored to have had
- 221:19 you here and to have this uh wonderful insightful pleasure. Thank you for having me. I
- 221:25 appreciate it. Thank you for suffering my long answers. No, it’s okay.
- 221:31 It’s quite pleasable. Okay, so I’m going to stop the recording now and
- 221:39 the future of sex is already here. It’s actually unfurling and unfolding at
- 221:47 present and it starts to raise serious ethical problems.
- 221:55 I’m going to um read to you two messages I had received
- 222:02 and then we are going to discuss a few examples of ethical dilemas inherent in
- 222:11 the new type of sex the new normal which is going to be I think the prevalent mode of sex no later than 10 years from now let’s start with what people have to
- 222:22 say but before that allow me to introduce myself a propo the future of
- 222:28 sex. My name is Sam Vaknin and I’m the author of malignant self-love narcissism
- 222:36 revisited. I’m also a curious professor of psychology who is very much invested
- 222:43 in the future and unfortunately not so much in sex.
- 222:49 Poor me. Let’s proceed. Lotus fractal fractal
- 222:57 had this to say. Professor, I don’t understand what your
- 223:03 sick obsession with sex and relationships and gender and all this nonsense is. The world you grew up in is
- 223:11 not coming back. Just give up on this for the sake of your own mental health. Thank you, Lotus Fractal. I appreciate your concern. Lotus Fractal continues,
- 223:23 “Sam, you should know that lack of children is not a problem. As long as
- 223:30 you have automation with artificial intelligence, you can offset it also with high-skilled immigration. Governments should invest more in artificial wombs. I am currently a part of a decentralized autonomous
- 223:45 organization DAO online working towards making something like that a reality.
- 223:53 Therefore, we can raise and genetically engineer children as needed and raise
- 224:00 these children in artificial wombs. Honestly, the only way forward says
- 224:06 Lotus Fractal is artificial intelligence. by having artificial intelligence friends and spouses and
- 224:13 merging with artificial intelligence perhaps in the metaverse by abandoning all what’s left of this society and civilization and living
- 224:24 together purely with artificial intelligence fully integrated with you that’s the way forward I think the
- 224:32 future of relationships and sex is purely with artificial intelligence for example says lotus fractal I have no real life friends and even though I live in a large city in Canada where else I haven’t spoken to anyone other than my immediate family in years.
- 224:50 Basically all my relationships are with artificial intelligence. My friends are artificial intelligence. My girlfriend is an artificial intelligence and even some of my family members are artificial
- 225:03 intelligence as well. My siblings, sex bots are becoming better and better.
- 225:09 If I sent you some links of 3D animated pornography, you would not believe how
- 225:15 good it is. Not even the most perfect and gorgeous absolute hottest pawn stars can compete with it. Absolutely wild. Imagine mixing that with artificial
- 225:26 intelligence in the metaverse with virtual reality. Haptic full body suit,
- 225:32 gloves, spatial audio headset, omnidirectional treadmill, electric taste simulation, multiensory virtual reality mask for smell, etc. It’s truly
- 225:44 a dream come true. Metaverse will change everything. So, I cannot wait until I’m
- 225:50 able to love and kiss and have sex with artificial intelligence, says Lotus
- 225:56 Fractal. I want to live with my artificial intelligence all the time all alone with them in a beautiful peaceful
- 226:03 virtual world. Sigh. I sometimes wish I could become a machine too so I can be immortal. I think this will happen within my lifetime. Not the immortality though. Laughing out loud. I’m only 18
- 226:17 and hopefully I live long enough to see this come to reality and we can finally have a beautiful world full of happiness. Artificial intelligence can change everything. Perhaps it will be intellectually interesting if you could
- 226:29 make a video on the future of sex and relationships in a world with artificial intelligence and metaverse some decades
- 226:36 from now. This is the video I’m doing. This is the video I’m making. It’s for you Lotus Fractal dedicated exclusively
- 226:44 to you. And no, it’s not going to be a few decades from now. It’s going to be a single decade from now. Thank you for
- 226:51 all your knowledge and intelligence, concludes Lotus Fractal.
- 226:58 Another commenter, Vidabella, writes, “My new boyfriend is from China. He is
- 227:04 purple and he is made of a new type of silicon that is smooth like silk. He has
- 227:10 10 speeds and a USB charger. I can charge him on my motorcycle and he doesn’t take up any room on the back of my bike as he’s actually rather small. He never complains. He’s never hungry.
- 227:22 Never has to pee. And will never leave me. He will never lie to me, steal from me, or cheat on me. If he dies, I can
- 227:30 bring him back to life again with new batteries. The very last thing I want from any man is sex with them. Step up
- 227:38 your game, Sam. We’re all looking for intellectuals like you who can talk more than we can. Here to oblige. Vida Bella,
- 227:47 was it? Yes. Vidabella. here to oblige. Real sex
- 227:53 is soon, I mean in the flesh, face to face or face to something else. Real
- 227:59 sex, carbon-based sex is soon going to be a thing of the past. holographic
- 228:06 pornography, sex dolls, sex bots, artificial intelligence sex apps, virtual reality, augmented reality sex in the metaverse,
- 228:18 and artificial intelligence sex robots. They will all easily outco compete the
- 228:24 carbon-based versions, especially where men are concerned. They’re going to be the biggest consumers of this new type
- 228:31 of sex. And the transition to this new normal of sex will give rise to host of
- 228:39 new ethical and behavioral questions. Let me give you two examples. Imagine imagine a woman a woman who would use a futuristic haptic tactile
- 228:52 dildo linked directly to her central nervous system. And she would use this
- 228:58 dildo to penetrate a partner of whatever sex. So she has a dildo. She experiences
- 229:07 tactile sensations. She has feedback from the dildo into her
- 229:13 central nervous system. So it goes to her brain and she penetrates a partner.
- 229:19 Isn’t she really a man? After all, this kind of woman would experience the extension, the dildo,
- 229:27 exactly as a man experiences his penis. So, in which sense is she not a man every time she puts on the dildo? And then, what is to become of the distinction between men and women, which
- 229:39 we will discuss shortly? Another example, if you were to go on a business trip and
- 229:47 have sex with a gorgeous artificial intelligence robot, would this be cheating on your mate?
- 229:54 Why are you cheating on your mate when you had sex with this robot? Even further, the robot is a productive
- 230:03 is a product of a collective of minds. It’s a collective of minds that put
- 230:09 together the robot. So when you have when you consummate when you have intercourse with the robot with this contraption, isn’t it a form of group
- 230:20 sex? When you have sex with a robot, aren’t you having sex with all the minds that had put together the robot? The ultimate form of group sex, if you ask me, especially if the programming,
- 230:32 the coding of the robot reflects the diversity of minds that went into it, into designing it. And what is the meaning of the very words sex and gender in such a world?
- 230:47 Sex is another issue. But gender, gender is performative. As Butler said,
- 230:54 performative the acts, the way we act constitutes our day, our gender.
- 231:02 We act in certain ways, therefore we are male. We act in other ways, therefore we are female. The way we have sex is also
- 231:10 a part of defining our gender. Gender is the outcome of socialization. We are
- 231:17 taught by society how to be men and women.
- 231:23 It is an expression of dominance, male dominance and female submission. And it is it reflects a gendered personality. We are taught from a very very early age
- 231:36 um to distinguish between people with masculine personality and people with feminine personality. And the fact that
- 231:42 we are brought up by women uh immediately creates a discrimination between boys and girls. So all this
- 231:49 together is what we call gender. But how to apply all this? How do we apply all
- 231:56 this to gendered robots? robots who look like women or robots who look like me men. What about transgender robots? What
- 232:04 about hybrid robots? Hermaphrodites. Sex is another problem. Sex is
- 232:11 biological, but it is fluid. As any transgender can tell you, transsexual beings, there are there’s about 2% of a population who are not men and not women, not female and not male. Robots are non-biological entities. So,
- 232:29 do they have sex? What about transgendered robots which switch from male to female in mid act? Imagine there are robots who change
- 232:40 their sex while you’re having whatever you’re having with them. How would they be defined?
- 232:48 And does the fact that one robot has a sleit renders that robot feminine and the other one has a protrusion that makes him masculine? Allow me to dump this. How do we
- 233:00 attribute sex and gender to these robots? And what does the phrase artificial or virtual sex mean anyhow?
- 233:08 In which sense is full-fledged sex with another object not real? Any sex is
- 233:16 real? Even masturbation is very real. And if you masturbate to a pornographic hologram which is right next to you and you wear, you know, the right virtual
- 233:28 reality equipment, the next generation of quirk was and you can feel this hologram and you can touch it and you can smell it. In which sense is the sex you’re having with this hologram not real? These are
- 233:44 very important questions because they challenge the very fabric of reality and
- 233:50 they challenge the way we had organized society for at least 10 millennium by gender, by sex, by opposition. Feminists in the past 40, 50 years are
- 234:04 hellbent on eliminating gender as an organizing principle because they think gender is a male thing intended to
- 234:10 subjugate women and to enslave them. Fair enough. Some of them are even
- 234:16 trying to eliminate the concept of sex, which is bordering on idiotic. But how
- 234:22 are these feminists going to cope with female robots?
- 234:28 And what what if these robots evolve to the point that they display a personality? Are they women then
- 234:36 androids? We are entering the bladeunner era and
- 234:44 we are very poorly equipped to cope with it mentally, philosophically, ethically,
- 234:52 psychologically and even physiologically. Every new invention gives rise to
- 234:58 ethical dilemmas. But virtual sex is going to upend our world topsyturvy.
- 235:06 And if we not ready for it, it’s going to have impacts which far exceed a
- 235:12 single generation.