61. How AI Could Reshape Human Consciousness and Society: Exploring the Unknown

Mar 17, 2023 · 3 hr 34 min · Ep. 61

Episode description

What are the ethical implications and risks of developing artificial intelligence, and how might it impact society and human purpose?

In this intense and thought-provoking episode of "Dennis Rox," titled "How AI Could Reshape Human Consciousness and Society: Exploring the Unknown," hosts Eldar and Mike, along with guests Anatoliy, Vemir, and Sergiy, dig deep into the pressing and multifaceted impact of artificial intelligence (AI) on society. They critically examine whether AI can indeed enforce ethical behavior and create a utopian world, or if it carries the potential for unprecedented risks and ethical dilemmas. The discussion spans topics from the role of technology in extending human life to the ethical implications of AI development, touching on historical cases where technology led to unintended consequences. Sergiy and Vemir share their thoughts on the balance between technological progress and ethical responsibility, voicing concerns about AI's potential to either uplift or endanger humanity.

Throughout the episode, the group explores the potential of mind-reading technology, the influence of large crypto investors, and the legality of financial advice. They also reflect on the intentional and unintentional consequences of AI, drawing parallels with historical events and modern technological advancements. Eldar and Vemir probe the possibility of achieving a world without greed and hierarchy through advanced AI, while Anatoliy and Sergiy contemplate the human capacity for transcendence amid these rapid technological changes. Ultimately, the episode raises the all-important question: Can AI truly pave the way to a benevolent and enlightened future, or does it set humanity on a precarious path where ethical oversight is more crucial than ever?


Transcript

Intro [00:00:00]:
On this week's episode, is there a.

Eldar [00:00:01]:
Case where we can self actualize ourselves? And like you said, go outside more, right? I don't know. Do poetry. Right. Practice art and stuff like that instead of working? Instead of working?

Vemir [00:00:11]:
Yes. I'm driving as fast as I can towards a cliff. But don't worry, we'll run out of gas before we get there. Depressed, lack of purpose, midlife crisis, suicidal people who retire die faster because they have a sense of waking up.

Sergiy [00:00:27]:
Or a.

Vemir [00:00:27]:
Why? Can you imagine that on a global scale? Nobody needs to get a job. Why go to school?

Eldar [00:00:33]:
You know that story about one of the buddhas? What he did, right? Like, sacrificed himself to, like, the lion because the lion had cubs. So he's like, okay, eat me so you can feed the cubs.

Sergiy [00:00:41]:
What a bitch.

Vemir [00:00:42]:
Meditation is the only door that someone has entered that they haven't come back out of regretting it.

Sergiy [00:00:49]:
What's that with you millers or what?

Eldar [00:01:03]:
So, guys, to introduce this, Vemir has written a paper, a very extensive one, which, sorry, we did not finish reading, but it would probably be interesting. But if you can, do a little introduction on AI and a little bit about your paper.

Vemir [00:01:19]:
Oh, yeah.

Mike [00:01:20]:
Should we kill the light?

Eldar [00:01:21]:
Yeah, let's kill the light. Moody lighting.

Vemir [00:01:24]:
Yeah. AI is basically machines, systems, algorithms that can. And this is not a policy definition. I'm just kind of laying the field out. So you can't just take it literally and then pick it apart. It's just kind of laying it out. Right. It's like these systems, machines, algorithms that can perform human comparable behaviors of intelligence.

Vemir [00:01:50]:
Not human level, human, comparable. In terms of, like, chess. Intellectual activities that we define as intellectual activities. AI is creating things that can do that. So, like the early stages. Alan Turing, you know.

Eldar [00:02:07]:
I don't know. We don't know what that is. No.

Sergiy [00:02:11]:
Wow. He's the father of basically computers.

Eldar [00:02:14]:
Okay. Yeah.

Vemir [00:02:15]:
And here's the guy.

Eldar [00:02:16]:
Dennis, cut this part out.

Vemir [00:02:17]:
No, go ahead.

Sergiy [00:02:21]:
Invented a computer that was able to crack that German machine. What was it called? It's in the movie. I've seen it a thousand times.

Vemir [00:02:31]:
The Imitation Game.

Eldar [00:02:33]:
Yeah.

Sergiy [00:02:33]:
So the Germans made a machine they used to communicate with each other. It was uncrackable.

Vemir [00:02:37]:
It was about this very, very hard to crack code. And he cracked it with his machine. So he also invented something very relevant to AI, which is the Turing test. Turing test is basically if you have a human on one side, let's say, and a computer on the other side, the computer behaves or acts. I'm saying this all very roughly just the broad strokes. The machine acts in a way where the human on the other side would be convinced they don't know whether it's a machine or a human.

Eldar [00:03:12]:
Oh, wow.

Vemir [00:03:12]:
Okay, so if you pass the Turing test as a computer.

Eldar [00:03:15]:
Got it.

Vemir [00:03:16]:
You mean it's like certain thing, right? There's also that thing, like the uncanny valley. It's like we have this thing where, you know, it's like machines, robotics especially, have this weird thing where it looks human but not human enough. It's like this weird, like, soulless kind of thing. You ever seen those AI generated images or things that just look almost human, but it's too close? But it's not close enough. It's weird. These are like, different pockets of, like, fooling humans, let's say. So, I mean, a lot of the implications in the future will be, this is why philosophy is so important to AI, because it's like putting in our face the big questions, our place in the universe, the next evolution of species. If something can do everything better than we can, then is it next level for them? Are we laying ourselves down for this thing? Can AI become conscious? Is life carbon based or silicon based as well? Is our definition of all the big, you know, getting wisdom and knowledge from the universe? Like, just how we have tools like telescopes.

Vemir [00:04:29]:
It's like, maybe because an AI can see infrared and this and that, it has a wider range of experience. I mean, all of these concepts and possibilities demand our big questions. I mean, back in the day, like, you had all of these mathematicians, scientists, and engineers, early engineers who were also philosophers, like da Vinci, Poincaré. I mean, Einstein was even. There's even a book called Einstein and Religion. It's a great book. But the whole point is, like, they were always asking these huge questions alongside with pushing the next level of scientific innovation. So a lot of what I explore is, like, what is our place, responsibility and positioning, which is close to place, in relation to this next stage.

Vemir [00:05:28]:
I mean, this could be, interestingly enough, the most important time in human history, which is you're creating something that will possibly replace you. Interesting, something that can potentially fool you, something that can do things better than a human can. We're not really threatened by chess engines, right? Because it beats you at chess. Humans still play chess, but, like, you know, the implications for jobs are enormous, you know? Right. I was reading about an AI startup that was listing their safety principles, and it's like there's two sides. There's, like, technical alignment, like how to get a model to explain itself, et cetera, whatever else, how to get transparency, reduce bias, and technically align with human values. But there's also the societal aspect. What happens when you create something that can do any job a human can? How do you reframe and reorganize? Political.

Vemir [00:06:23]:
So anyway, I don't want to dance around too much, but AI is essentially the technological research progress and products that are in the realm of intelligence and can do cognitive tasks. And then you can combine it with robotics. There's machine learning within that. There's deep learning. There's all these terms that I don't think we should get into yet, actually. I would bring on another time, like, an ML senior researcher or something. I have a friend who's coming next month. He can explain GANs, transformers, a lot more deeply. But, like, in general, you have to.

Eldar [00:07:04]:
Pass your test first. You know what I'm saying? You might lose us just with your knowledge.

Vemir [00:07:08]:
So, like, at any point, just ask me, like, if there's any need for clarification. But anyway, AI is just. You're pushing the edge, the frontier, the state of the art, whatever's happening next in tech, and it's creating remarkable stuff, like ChatGPT. Everybody talks about ChatGPT after ChatGPT came out. I've been studying AI ethics for, like, two, three years. AI and stuff like that. Been involved in it now. It's like, my barber asked me about AI ethics.

Vemir [00:07:41]:
I'm in an Uber. The guy's talking about policy and AI. I was at a strip club in Berlin because my friend, friend of a friend, is working there. So we support her, and, like, one stripper from California comes up, and she's like, I had a marketing job, and I'm just afraid AI is going to take. You can't escape it. Like, it's everywhere. It's permeating everywhere. So now it's like, I actually get this, like, physiological response.

Vemir [00:08:05]:
Always when I wake up to, like, do something about this, I feel like I have certain insights, certain perspectives, certain positioning, but, like, we don't have to talk about me and what I'm doing, but I can kind of explain the paper a little bit, why I made it, how I got to that point.

Eldar [00:08:23]:
Yeah.

Vemir [00:08:23]:
And I don't know how much confidentiality I need to maintain with names or anything, but it's a really, like, that paper taught me a lot through doing it, you know what I mean? And AI is definitely what I call part of the, like, Peter Parker principle. You know, Spider-Man, like, with great power comes great responsibility. This possibility, it's like, the highest stakes we have, if you. If you think about it, like, if you think it through.

Eldar [00:08:54]:
Yeah.

Vemir [00:08:55]:
Like, it's extremely capable. I mean, it's going to be extremely capable. And so if it's in the right hands, it's a utopia. It's in the wrong hands, it's like a dystopia. That's kind of my take.

Eldar [00:09:07]:
Okay, so my first question to you, then you mentioned that there's limitations right now still, right. AI, when it comes to comparing the intelligence level to a human, can you tell, can you talk about those limitations? What are the limitations?

Sergiy [00:09:21]:
Talking about the Turing test? Can AI pass the Turing test? I think they just recently did. Well, also, Google AI did, like, a year ago.

Vemir [00:09:29]:
So Demis Hassabis is the CEO and founder of DeepMind, which got acquired by Google. You know, DeepMind. So, like, DeepMind completed protein folding, which is, like, a huge advancement. But anyway, I don't want to go too deep into the woods, because I.

Sergiy [00:09:55]:
Used to be part of the protein folding.

Vemir [00:09:58]:
So DeepMind's algorithms, which used to be for games to play themselves and complete games, video games, because Demis Hassabis used to program video games, naturally, because.

Sergiy [00:10:09]:
I know there was a. It was basically you were allowing your computer time to be used for protein.

Eldar [00:10:16]:
Folding for a while.

Sergiy [00:10:17]:
And now you're saying that AI completed the task, like, all the proteins?

Eldar [00:10:22]:
Yes, like, all of them, yeah.

Mike [00:10:24]:
What does that even mean for the, you know, non AI?

Eldar [00:10:27]:
It's good for your.

Sergiy [00:10:28]:
For your health. Like, you need to figure out, like, I guess protein folding helps, like, with medicine and understanding how proteins in the body work. So basically an algorithm, and it just, like, folds them nonstop, like, billions of variations, until, like. Yeah, you know, folds the protein. And you could. You. Before, you could, like, if you're a gamer or if you have a decent computer, even a PlayStation, and if it's idling, you don't want to use it, you could sign up for it, actually. And it says, allow my system resources to be used for protein folding and, like, MIT or something.

Sergiy [00:11:00]:
One of those.

Mike [00:11:00]:
Is it like bitcoin mining?

Sergiy [00:11:01]:
It's similar, yeah. So you're either you're contributing a sign, so you don't.

Vemir [00:11:05]:
But DeepMind had a new algorithm, so many other things.

Eldar [00:11:08]:
Yeah.

Sergiy [00:11:09]:
I didn't hear about it because I used to use all my bitcoin mining machines when bitcoin was not minable by my machine anymore. I used them for a while for protein folding, but it still takes up a lot of electricity.

Vemir [00:11:24]:
Anyway, I bring up Emma's hasabes. Because he said Turing test also depends on the human. Some humans are, like, easily fooled.

Eldar [00:11:33]:
Yeah.

Vemir [00:11:33]:
You know, like, it's like, that's not the whole thing. But the whole point is, like.

Eldar [00:11:40]:
You.

Vemir [00:11:40]:
Know, one application in real life is like, you call up Verizon, you have a problem, picks up very quickly, says, hello, how can I help you? And you're like, hey, what's up? He's like, good.

Eldar [00:11:49]:
How are you?

Vemir [00:11:50]:
And you have a whole conversation. Solves your problem. You hang up. You never knew it was an AI.

Eldar [00:11:55]:
Interesting.

Vemir [00:11:56]:
That would be a complete passing of the Turing test: if, like, 100,000 people called Verizon a day and nobody knew it was an AI, for example, or, like, they didn't.

Sergiy [00:12:06]:
If it didn't sound robotic.

Eldar [00:12:07]:
Right, right.

Vemir [00:12:07]:
I'm saying if they were to do it in a way that sounds human, can answer your questions, can walk you through the.

Sergiy [00:12:13]:
Not even have an AI. I have it on my computer.

Eldar [00:12:16]:
James used to get tricked by the regular robots. I mean, you know, sample size of individuals who are calling in. Yeah, yeah. You know what I mean? Like, what are we talking about? Like, lesser than, like, where are the IQs? You know what I'm saying?

Anatoliy [00:12:31]:
Yeah.

Sergiy [00:12:31]:
I mean, I have an AI program now on my computer. You could use the voice of any president, any celebrity, and you train it. You use, like, you use a good microphone and you talk. And if you know how to mimic that person, it helps. And you could basically pull out full speeches of those people, sounding like them. If you don't cut it, if you. If you use somebody now that would have the fake faces.

Sergiy [00:12:58]:
Right. You could make somebody's face, and then you match the voice. And the future is gonna be a problem where the video is gonna look like a celebrity did something bad or, like a political person did something bad.

Mike [00:13:07]:
Yeah, but it's all deepfake.

Sergiy [00:13:09]:
It's all deep fake. Everything deep fake. The voice is deep fake. The video is deep fake.

Vemir [00:13:14]:
In the literature, they say that video evidence will no longer be admissible in court. You're the defendant.

Eldar [00:13:20]:
Yeah.

Vemir [00:13:20]:
You see surveillance video of you robbing the bank. You say, how can you prove that's not fake? Audio, all out the window. So you can't use any of that with the jury.

Sergiy [00:13:30]:
But it's already.

Mike [00:13:33]:
It's locked in already.

Eldar [00:13:35]:
Yeah.

Vemir [00:13:35]:
I mean, like, some robbers are putting on, like, fake fingers so that when they rob the place, it looks like they have six fingers. So they say that's a deepfake photo because naturally, like, in Stable Diffusion and stuff, they get. Stable Diffusion is an app that creates images from AI. And like they're saying, oh, well, that's a fake photo. You see there's five fingers. That's not my fingers. It's like, now it's gonna get more convoluted as, like, good and evil. You could say, I hate using that.

Vemir [00:14:04]:
But, you know, I'm saying, like, people can manipulate these things anyway.

Eldar [00:14:07]:
Isn't that just a kind of a consequence of everything else?

Vemir [00:14:11]:
No, we are. The valley between knowledge and wisdom is getting wider and wider. And that's why it's tipping the balance. It's so dangerous, in my opinion.

Mike [00:14:21]:
And AI is going to tip it into the scales of what infinite scales. I mean, AI will be without, like, it would be ethical, right? More ethical. Is that what they're saying?

Eldar [00:14:33]:
If you program it again, what do you think?

Vemir [00:14:36]:
This is the main topic.

Eldar [00:14:39]:
What are you worried about? Let's get to this.

Anatoliy [00:14:41]:
Well, I mean, so far, based on what I'm hearing, you're talking about how, like, it could do amazing, great things.

Eldar [00:14:51]:
Right. Yeah.

Anatoliy [00:14:52]:
But then it also, like, with those capabilities in the wrong hands, then can create very bad things. Yeah, but if this is like, I'll.

Eldar [00:15:03]:
Talk about your guy.

Anatoliy [00:15:04]:
Yeah, but if this is more.

Sergiy [00:15:06]:
Let's not talk about the ones who actually used to. Let's not talk about the one too.

Eldar [00:15:10]:
Used to?

Anatoliy [00:15:11]:
Yeah, I mean, it's more readily.

Vemir [00:15:13]:
What?

Anatoliy [00:15:13]:
Yeah, not like, available, but it sounds like. Like, I don't know, like if we're comparing it to, like, a nuclear bomb. Right? Like, to get your hands on one of those or to have the authority to use one of those, it's probably, like, a handful of people in the world.

Sergiy [00:15:28]:
A nuclear bomb was not as scary as, you know, releasing, like, COVID-19 into the air.

Anatoliy [00:15:34]:
Yeah.

Sergiy [00:15:35]:
Because a person in a lab could do it. And it's a lot easier to release that than a nuclear bomb. So I think that would be a better comparison.

Anatoliy [00:15:40]:
Yeah.

Sergiy [00:15:41]:
Or that somebody at home making some kind of virus that will kill everybody and that could be done at home.

Vemir [00:15:46]:
These are sections of what's called existential risk, or x-risk for short. So, you know, like bio weaponry, you could say.

Eldar [00:15:55]:
Or.

Sergiy [00:15:55]:
Yeah, bioweapons are easier to make than nuclear.

Vemir [00:15:57]:
So there's biohazards, there's climate change, there's nuclear risk. But to me, and there's some consensus, number one, above all of them is AI. I'm not saying that's the rule. I'm saying that's my opinion.

Anatoliy [00:16:13]:
Well, that's what I'm saying, though, is that, like, it sounds to me like AI will be a more programmable thing. If it were to go bad, then, like, you might have a code.

Sergiy [00:16:22]:
Inside of it that will say, hey, you, do not go wild. Like, do not annihilate human race.

Eldar [00:16:28]:
Right.

Vemir [00:16:28]:
Just turn it off. Right. That's the thing.

Sergiy [00:16:30]:
Like, no, not even turn. No, not even turn it off.

Vemir [00:16:32]:
Like, this is like the hardest question.

Sergiy [00:16:34]:
Like, at the base level, make a, like, hardware program. Like a program in there. Like, not even a kill switch. It just cannot be bad. Like, AI is just good.

Vemir [00:16:42]:
How do you do that?

Mike [00:16:43]:
But good and bad is a very.

Eldar [00:16:44]:
Not a code if AI has the ability to reprogram itself or something like that.

Vemir [00:16:48]:
Right?

Eldar [00:16:48]:
Yeah, it's like if it goes in and, like, oh, you know, this is not truly purposeful for me. And it realizes, like, you know, like in the I, Robot movie or whatever. Right.

Vemir [00:16:55]:
And what is the perfect?

Eldar [00:16:57]:
Humans are bad. Let's just deny.

Anatoliy [00:16:59]:
And also, humans are making the AI right there right now, beginning right now. Yeah. Human can make an AI for bad. Right?

Vemir [00:17:06]:
Like, on purpose. That's called malevolent AI. That's, I think, the, on this scale, it's like you have benevolent and malevolent. So you, like, make it on purpose to be benevolent for all of humanity. How the fuck do you do that? I don't know. Malevolent, you make it on purpose to hurt other people.

Eldar [00:17:23]:
That's actually quite, just like a virus on the computer.

Sergiy [00:17:25]:
And then another company makes a good virus killer to find it. So we're going to have good AI and bad AI. It's just going to be AI fighting everywhere.

Vemir [00:17:33]:
Yeah, but we're just not going to see it.

Sergiy [00:17:35]:
Right?

Vemir [00:17:36]:
Bad AI has less limitations, you know what I'm saying? So bad AI in that scenario would likely win.

Sergiy [00:17:42]:
Like, but if you look, I mean, if you look at the government, government could create good AI also to fight the bad AI, and they have no rules whatsoever. Just like when they made that.

Vemir [00:17:53]:
Good AI doesn't do things.

Sergiy [00:17:55]:
Remember when they made that virus that ended up in a nuclear power plant in Iran? It copied itself onto every single device in the entire world until it found the nuclear power plant and changed something inside of it. It was so advanced. It was a virus, it was not even a. It was able to see. They knew what kind of parts were used in the nuclear power plant. It found, somehow, like, the number of it, and it got inside of the nuclear power plant where there was no Internet even. So it somehow jumped to a device inside the power plant, changed all this.

Eldar [00:18:31]:
Stuff, and some Sci-Fi show, right.

Sergiy [00:18:33]:
And destroyed it.

Eldar [00:18:34]:
Is this like a conspiracy theory?

Vemir [00:18:37]:
So if you can find out the name, let me know.

Sergiy [00:18:39]:
I forgot the name of the virus.

Vemir [00:18:40]:
Great example. Like, think about like when they say we're gonna use regulation to take care of this. Like, viruses, spam. It's all illegal.

Eldar [00:18:51]:
Yeah.

Vemir [00:18:52]:
Has it stopped it at all?

Eldar [00:18:53]:
No.

Sergiy [00:18:54]:
You know, like, antivirus companies make viruses and then they cure your computer from it.

Eldar [00:18:58]:
Well, just like Pfizer is doing. Even, just.

Vemir [00:19:00]:
Okay, okay.

Mike [00:19:01]:
Even.

Vemir [00:19:01]:
Just straight. Even just straight up. Like without any need to extrapolate from information. If you just look at base things.

Sergiy [00:19:09]:
Oh, it's called Stuxnet.

Mike [00:19:11]:
Was it a Terminator movie?

Vemir [00:19:13]:
Stuxnet. Yeah, Stuxnet. Okay, I'm gonna.

Mike [00:19:16]:
Isn't that what the thing was called in the Terminator movie?

Sergiy [00:19:17]:
Stuxnet. Stuxnet is a malicious.

Vemir [00:19:19]:
That's Skynet.

Eldar [00:19:20]:
Same thing.

Sergiy [00:19:21]:
Stuxnet is a malicious computer worm first uncovered in 2010. And throughout, I mean, you gotta read about it. It's insane.

Vemir [00:19:28]:
Stuxnet. I'm gonna look this up. But, yeah, I mean, they don't know.

Sergiy [00:19:30]:
Who made it, how many people.

Vemir [00:19:32]:
This is why I feel like regulation is not a complete solution. But without regulation, we're definitely gonna be more screwed than if we had some regulation.

Sergiy [00:19:41]:
But government has no regulation. That's the problem.

Vemir [00:19:43]:
There is no regulation of AI in the world.

Sergiy [00:19:47]:
Yeah, but there's going to be for regular fault.

Vemir [00:19:48]:
The closest thing we have is the European AI Act.

Sergiy [00:19:52]:
Give me another one.

Vemir [00:19:53]:
Which is like a proposal, but there's no regulation against AI. That seems a bit ridiculous, because they can't figure it out. It moves too fast; they don't know where to put the red lines.

Anatoliy [00:20:04]:
But could there be in the way that you build it or what you can and cannot do?

Eldar [00:20:10]:
Then you're putting a discretion on an individual building it. So, like, how ethical is he?

Anatoliy [00:20:15]:
What do you mean?

Vemir [00:20:17]:
Well, if you're building, like, building it. If you're building, people are building it, right?

Mike [00:20:21]:
Yeah.

Anatoliy [00:20:22]:
You could put regulation on, like certain things.

Vemir [00:20:25]:
Like in the AI Act, they have high-risk stuff. Like there's, like, social credit, you know, like in China, they ban that. Right? There's, like, unacceptable use. Like, I. I think there's one. It's like police using biometric identification. Like, just to not even, like, for people who are accused of crimes. Like, for anybody.

Vemir [00:20:48]:
I have to read back on some of the high risk applications that are banned. Like, unacceptable use. There's high risk, medium risk, and then, like, no regulation, just kind of best practices kind of thing. And so they're putting real markers there. But even then, you just don't know if these are the right markers. And I think there's a lot of infighting in the field. People arguing, oh, we need to focus on the short term, we need to focus on the long term. We haven't even got into artificial general intelligence.

Vemir [00:21:16]:
It's such a huge field. We can even do another time to go through a specific section. But it's like, the point is from my research, it's really risky and it's unprecedented in the. It seems like in the paper there's a really good analogy. There's a quote, this guy says, it's almost as if AI researchers are acting as if, or these companies and stuff. There's a quote by somebody, it says, yes, I'm driving as fast as I can towards a cliff, but don't worry, we'll run out of gas before we get there. You know what I mean? Like, these things are impossible. Yeah, don't worry, I'm just building.

Vemir [00:21:56]:
But you're working as hard as possible to make these things become reality. It just seems silly, you know. Like, we're really hyper for technological innovation.

Eldar [00:22:05]:
Okay, so why do you think that is?

Vemir [00:22:07]:
Why are we hyper for innovation?

Eldar [00:22:09]:
Yeah, like why are people.

Vemir [00:22:11]:
Many reasons.

Eldar [00:22:12]:
Why are.

Sergiy [00:22:12]:
Maybe somebody wants to cash in faster.

Vemir [00:22:14]:
Do you know how much money.

Sergiy [00:22:15]:
Yeah, somebody first to do something like that.

Vemir [00:22:17]:
Like automobiles of those automating every single, like, imagine solving the stock market. That's like. Called a complex systems, right? Like it's very, I mean that goes into a lot of.

Sergiy [00:22:31]:
You mean like predicting, right. What's gonna happen?

Vemir [00:22:33]:
Prediction analysis. I mean, you discover the cure for cancer using AI. How much money is that worth? Holy shit.

Sergiy [00:22:40]:
Supposedly they already have, you know, like.

Vemir [00:22:43]:
Whatever field it is, like whatever intellectual capacity that pays people 200 year, and you can automate that, it's an insane amount of money and utility. You become monopolistic, dominant.

Eldar [00:22:55]:
Whoever discovers AI will replace.

Sergiy [00:22:57]:
AI will replace almost every single job that has to do with writing.

Eldar [00:23:00]:
So what happens then?

Anatoliy [00:23:01]:
Yeah, what is writing?

Sergiy [00:23:04]:
Marketing.

Anatoliy [00:23:09]:
With advanced AI?

Vemir [00:23:10]:
Factory work. Let's ask this question.

Mike [00:23:13]:
Factory work too, right?

Sergiy [00:23:14]:
No, not factory work, no.

Mike [00:23:16]:
Robots building robots. No, no.

Sergiy [00:23:18]:
But certain things human could do because our hands, they're designed, you know, I.

Vemir [00:23:23]:
Wanted to ask this beautiful question. What is the purpose of humans in a society? What's the purpose of humans in a society with advanced AI?

Anatoliy [00:23:35]:
Well, that's why we're podcast everything, right?

Mike [00:23:39]:
So we can do that. You know what, I think maybe we'll.

Sergiy [00:23:43]:
Be good. AI does everything and we just, like, maintain.

Mike [00:23:46]:
Do philosophy all day.

Sergiy [00:23:47]:
We maintain a. I will maintain robots. We relax more.

Eldar [00:23:51]:
We go outside. Yeah. Is there a case where we can self-actualize ourselves? And like you said, go outside more. Right. I don't know. Do poetry.

Vemir [00:23:59]:
Right.

Eldar [00:24:00]:
Practice art, stuff like that.

Sergiy [00:24:02]:
Instead of working.

Vemir [00:24:03]:
Instead of working, like Sam Harris says, throw frisbees and give each other massages?

Sergiy [00:24:07]:
Hell yeah.

Vemir [00:24:08]:
Right?

Eldar [00:24:08]:
Yeah. The problem is that's what everybody wants when they're working.

Sergiy [00:24:11]:
I want to be home right now.

Vemir [00:24:13]:
What do most people do when they retire?

Sergiy [00:24:16]:
Yeah, but you're fucking old. Everything hurts. I'm 40 years old. Everything hurts already. Imagine being 67 because you didn't have a good start.

Vemir [00:24:23]:
Depressed. Yeah. Lack of purpose. Midlife crisis. Suicidal. People who retire die faster. Addiction. Have a sense of waking up, or a why? Can you imagine that on a global scale? Nobody needs to get a job.

Vemir [00:24:37]:
Why go to school?

Mike [00:24:39]:
What's gonna be the current.

Eldar [00:24:40]:
Are you on the impression that the humans can adapt?

Vemir [00:24:43]:
Think humans are emotional at adapting. Yeah, but don't.

Mike [00:24:46]:
What is the statistical.

Sergiy [00:24:47]:
Not every single person who's retired is fucking unhappy.

Vemir [00:24:49]:
I didn't.

Sergiy [00:24:50]:
A lot of happy retirement.

Vemir [00:24:51]:
I didn't say every single person. I think that would be scapegoating the argument. But a lot of people are going to have huge adjustment issues.

Sergiy [00:24:59]:
Yeah, yeah. But there will be probably government programs.

Anatoliy [00:25:05]:
I mean, cliff scenario.

Mike [00:25:06]:
Right.

Vemir [00:25:10]:
I also wanted to present two more things, which is like, you're presenting a utopia, which is kind of what I'm working towards. Right. Like the synthesis of advanced technology just benefiting all of us so that we can just enjoy and have the time we need to. That's beautiful, and I want to get to that.

Eldar [00:25:28]:
But things you're going to ensure of this, we don't have to worry. As long as he doesn't sleep at night and taking care of us, we can do our philosophy and get paid.

Vemir [00:25:37]:
Well, you know, if. As long as I get paid well.

Eldar [00:25:39]:
Oh, that's the kicker.

Vemir [00:25:40]:
Well, I need fuel, baby. I need my food.

Mike [00:25:43]:
Yeah, but then in a society of robots.

Vemir [00:25:45]:
Wait, wait. Let's not get, like, off track. You just, like, blew past, like, 60 things without realizing it. Like, okay. You create an AGI, an artificial general intelligence, that has its own host of things in it, but let's say it's benign. You create this oracle, right? You ask it a question, it gives you the answer. Solutions, cures. It's completely harmless, which is not the default at all, which we have to discuss.

Vemir [00:26:17]:
But let's assume that everything in technology went perfectly well. It's not here to harm you. It's here to give answers. Can you imagine that power in the current political system? As Sam Harris said at the 2017 Asilomar conference, which was a big conference to get everybody to find some consensus on principles for AI. So he said, if you have this complete oracle that's completely benign, it would produce total chaos in the current political system. Could you imagine that power in one set of hands?

Mike [00:26:51]:
Well, whichever country gets it first, right, will rule the world.

Vemir [00:26:56]:
What you're saying, Putin was in front of, like, a primary school classroom, and he said, whoever wins the AI race, essentially, whoever gets to AI first or advances fastest, will rule the world.

Eldar [00:27:15]:
What does he know, though?

Vemir [00:27:16]:
He's right, though.

Eldar [00:27:17]:
No, no, but who the fuck is he?

Vemir [00:27:19]:
He has genius people in the country working on these issues. I mean, like, right? Like, who is he? Oh, like, who are we to know about?

Eldar [00:27:27]:
Who was he to even make a claim like this? Such.

Vemir [00:27:31]:
What do you mean? Like, what part of his authority are you challenging? Like, he's not smart enough.

Eldar [00:27:35]:
Yeah, no, but, like, we're taking. We're taking his word at face value, kind of.

Vemir [00:27:39]:
I'm giving you a political leader who most people think is dangerous, making a similar claim to what I've found in my research, which is, like, I think whoever controls AI will be dominating the world, you know, if they choose to do so.

Mike [00:27:56]:
Well, you're looking.

Vemir [00:27:56]:
And then I have a political leader who agrees with that, has his eye on it. Don't you think he has some research in it, too? Like, it's like AI. Everybody's rushing towards this goal, but there's no room for second place. You know what I'm saying? And especially you get to these things called the singularity. We have to talk about the alignment problem, the control problem. These are all points, and these are all things to solve, which, if we don't get, right, it's catastrophic.

Anatoliy [00:28:24]:
But even if somebody solves it, even if somebody, let's say, gets it first, right, it still doesn't protect from, like, physical threats, right?

Vemir [00:28:33]:
What do you mean?

Anatoliy [00:28:34]:
Like, let's say, like, I don't know, country A, right? Like, is the most advanced in AI, right? And has gotten to a place where it's just, like, I don't know, incredibly powerful, has, like, the answers for everything. You could still physically brute force that country, right, and just, like, demolish it if you wanted to, right.

Vemir [00:29:01]:
You would. How would you destroy? Okay, let's say let's. Let's pick a country to attack. Pick a country.

Eldar [00:29:09]:
Russia.

Vemir [00:29:10]:
Okay, so Russia.

Eldar [00:29:17]:
Yeah. I'll retract it.

Vemir [00:29:18]:
Okay. You can't pick one without being offensive, so.

Eldar [00:29:21]:
That's true. Yeah.

Vemir [00:29:21]:
You know what I mean? So. Okay. Bolivia. I'm just gonna say one. It doesn't matter. I love Bolivians. I love people. Your reputation is not on the table.

Anatoliy [00:29:31]:
They have AI, and they have the most advanced.

Vemir [00:29:33]:
Okay. You are gonna bomb all of the people and the civilians for them developing it. What are you gonna do? Like, they have it now. What do you do? And their technology is very. I mean, we haven't even. I haven't even told you the markers that make this advanced. Right. But it's like, if they achieve this arbitrary point, which we have to approach, what are you gonna do? It's like you're playing chess with Kasparov.

Vemir [00:30:01]:
You. You're gonna win that game.

Sergiy [00:30:04]:
Or he'll.

Anatoliy [00:30:07]:
No, I mean.

Eldar [00:30:07]:
Yeah.

Anatoliy [00:30:08]:
I mean, so far, everything that you've said to me is saying that, like, it's something that is out there that, like, could be very good, but there's too many, like. Yeah. Too many bad variables that could cause it to be used for bad. Therefore, it should not be done at all because it has that possibility.

Vemir [00:30:26]:
I didn't make the conclusion that it should not be done at all, because I think, realistically, the idea of stopping human innovation is non. Like. You just can't. It's a non-factor.

Eldar [00:30:43]:
Thank you.

Vemir [00:30:43]:
It's like you're. You're not gonna be able to stop that. So what do you do? And the answer might be, right there, that might be a great filter of humanity. Like, you know what the great filters are? Like, ice age. You know, like asteroids.

Eldar [00:31:01]:
Mother Earth.

Mike [00:31:01]:
Mother Earth is gonna. Mother.

Vemir [00:31:04]:
We might be our own self destructive mechanism. Like, we are.

Eldar [00:31:08]:
Like, we're almost there.

Sergiy [00:31:10]:
World War three is coming up, right?

Mike [00:31:11]:
You saw what happened to the. To the pyramids? You know, those guys got too advanced. And then they. Mother Nature stepped in and said, yo, no, no, no.

Vemir [00:31:21]:
Our psychology, our biology, our nature is so curious. Curiosity might kill the cat.

Eldar [00:31:27]:
Oh, okay. So, yeah, let's talk about that a little bit more. Why are not. Why the first place?

Vemir [00:31:32]:
What's that?

Eldar [00:31:33]:
Why AI in the first place?

Vemir [00:31:36]:
What do you mean? Why AI adjacent to what? In comparison to what? Like, why AI instead of what? Like tennis. What do you mean? Like, why AI instead of philosophy? Because philosophy doesn't help you solve protein folding. You know what I'm saying? No matter how much philosophy you do, you'll.

Eldar [00:31:57]:
Never why you need to solve protein folding.

Vemir [00:32:00]:
It helps people reduce suffering, longevity. I think health is a base, see.

Mike [00:32:06]:
I guess you know the way, not acceptance of death. Yeah, it probably, to me it sounds like because people are flawed, right?

Eldar [00:32:14]:
Or they want to extend their life instead of learning the reason for trying to live.

Mike [00:32:20]:
Yeah, maybe instead of learning paranoia of dying, they just figure out a different way.

Vemir [00:32:24]:
Yeah, but even if you've really lived, I think you would want to really live more.

Eldar [00:32:29]:
For what?

Vemir [00:32:29]:
If I was enlightened, I think.

Eldar [00:32:31]:
Yeah, why wouldn't you? Why wouldn't you look forward to dying? I don't understand. Why wouldn't you?

Vemir [00:32:35]:
I wouldn't rush there.

Eldar [00:32:37]:
No, sure, you wouldn't rush there. Why would you? Why would you try to us?

Mike [00:32:40]:
But you're trying to break a process that happens in nature.

Vemir [00:32:44]:
Yeah, I don't know what natural means. Like, you guys make assumptions. Like, I know dying. Yeah, but why is it a natural? Why do you exercise at all? Why do you eat healthy rather than really, really eating ice cream?

Eldar [00:32:58]:
Age is also a natural phenomenon.

Sergiy [00:33:00]:
Right?

Eldar [00:33:00]:
You agree with that, right?

Vemir [00:33:01]:
But you're saying it's unnatural to have in vitro fertilization. How do you know that?

Eldar [00:33:07]:
How do you. I'm just. I'm just asking why.

Vemir [00:33:09]:
How do you know it's unnatural to invent protein folding, which helps you be. It's like building a wall for your house.

Sergiy [00:33:15]:
Right?

Mike [00:33:15]:
Those things don't exist in nature.

Vemir [00:33:17]:
Why are you using nature as an example? What is nature?

Mike [00:33:20]:
Nature's. They've been. They've been balling out for four and a half billion.

Vemir [00:33:23]:
You know, let me ask you a question.

Eldar [00:33:24]:
The earth is four.

Vemir [00:33:26]:
Let me ask you a question. About what?

Eldar [00:33:27]:
Nature.

Vemir [00:33:36]:
This is a sensitive topic because it's a miss. It's like a misspelling.

Mike [00:33:45]:
AI, what is.

Eldar [00:33:48]:
Okay, we're talking about. I asked the question, why AI in the first place? He said, hey, we're trying to prolong our life. We're trying to live longer, maybe potentially not die ever. Protein synthesizing and stuff like that, to be healthier. Is it because we're not accepting the natural order of life, which is dying? It is like. And then now he wants to counter.

Sergiy [00:34:06]:
Counter. AI and technology are an extension of a human.

Vemir [00:34:15]:
It's like, yeah, where in the line is it not natural? Because it appears like you can't get directly from a plant to a MacBook. Like, I don't understand. Just because you don't see all the steps. It's all part of the universe and material. Nobody's doing, like, telekinesis. 3D printing. You know, 3D printing is just a lot of steps in the process. Yeah, I mean, think about, like, I love this analogy. I thought of this, like, you know, look at a beehive. You look at a beehive, geometrically organized.

Vemir [00:34:46]:
They're little houses, but you view that as part of nature. Easily, automatically. When they build dams, what are they? Beavers. Beavers and other animals. I think it's like you view that as natural. Why? When you look at a bridge, the Golden Gate Bridge, why is that not natural? Because it's more sophisticated. It's like we're just bees making our beehives in different ways. More complicated, more so.

Vemir [00:35:14]:
But if you zoom out, you look at New York City, you look at the skyline, it's just like, imagine you're an alien come down, like, wow. They formed from, like, iron ore. Well, I think steel. Like, they pulled it out of mines. They made all this. You wouldn't view it as unnatural. It's just you're. You're in the game.

Vemir [00:35:30]:
So you see the game as you're differentiating. Oh, I'm not a tree. It's just levels of separation. Yeah, but this is all.

Mike [00:35:39]:
It's just, like, things that don't exist in nature.

Vemir [00:35:42]:
Your body hasn't calibrated to how fast our technology has moved. You know what I'm saying? Like, it's not natural to be on the phone for 10 hours a day or on a podcast. It's like, because our body's biological process is not synced up with our brain's ability to make technological progress. When trains first came out, they said, don't go on trains, especially if you're pregnant. You know, like, your baby will be ripped from your womb. Like, your eyes will come out of your head because your soul is supposed to move a certain length. So if you go on the train, your soul will be ripped out of your butt.

Vemir [00:36:24]:
All this crazy stuff because they were afraid of technology. But if you're aware of AI as a risk, you're not a Luddite, you're not, like, anti-tech. It's just, you're aware the stakes are getting pretty huge. It's getting more and more and more and more. And I mean, people say all these risks are theoretical and stuff like that. It's like, aren't there clear psychological risks in the past ten years from social media and tech use? Like, isn't that documented?

Eldar [00:36:56]:
Sure. But the question is, right. The question is, sure it is happening. Is it definitely technological advances, stuff like that. And this is where we're going. And that's definitely the stretch of what our mind is capable of doing, and that's why we're doing it. Sure. But the question is, is it really necessary?

Vemir [00:37:11]:
What is necessary, this progress that you're.

Eldar [00:37:13]:
Talking about towards this type of advancement?

Vemir [00:37:16]:
This is the thing is like, or.

Eldar [00:37:17]:
Is it like you're saying it's inevitable and I hear you. Right, but at the end of the day, but I think at the end of the day, we're wrestling with the question as to why in the first place.

Mike [00:37:26]:
I think I would say I have.

Vemir [00:37:28]:
A theory about this, but I want to hear it.

Mike [00:37:30]:
I was thinking about the word purpose, right?

Eldar [00:37:32]:
Yeah.

Mike [00:37:32]:
You're not going to see a bee behaving like a frog. Right. A bee knows what it has to do in its life. Plants know what they have to do in their lives. Animals, most animals, know what their purpose is. Right? But humans, nobody, like, there's not, like, a pre-programmed thing that we do. And I think in a way, maybe we're searching to find this purpose, you know, maybe not consciously, but I think that's because we don't have the answer to that question. Yeah, we're doing a bunch of weird shit, which I would say might not be natural.

Mike [00:38:02]:
Oh, you're definitely doing weird shit.

Anatoliy [00:38:07]:
Humans are the only, I guess, animal that doesn't have a pre, like, like.

Vemir [00:38:11]:
Pre internal survivalistic mechanisms.

Eldar [00:38:16]:
Well, I'm not sure. I'm not sure.

Mike [00:38:19]:
I don't know.

Vemir [00:38:20]:
This is now levels of consciousness, right?

Eldar [00:38:22]:
Yeah.

Vemir [00:38:23]:
Awakening and. Yeah, this actually has a lot of overlap with AI, but I wanted to.

Eldar [00:38:30]:
That's what we are.

Vemir [00:38:31]:
Touch on that point. The societal. This is why it's so interesting to me is all the questions of philosophy seemed airy at times, or seemed really grounded. Right. Based on the era, based on the type like stoicism or whatever versus analytical philosophy. But AI seems to be putting it right in our face, saying, okay, nobody has a purpose in work anymore. What happens when AI, if it can become conscious or appear as conscious? You've just created the aliens rather than them coming on a spaceship. It's like you built it now, something that is more capable than you are.

Vemir [00:39:12]:
So you're not in the first place of humanity, and there's implications of risk there, but it's like, you know, okay, I don't need to work, you know, where do you go? It makes more beautiful art that sparks the spirit. It hacks your mechanisms, if that's what.

Mike [00:39:30]:
It would, that's what it's gonna be primarily used for.

Vemir [00:39:32]:
Right. I'm just saying, like, philosophy is now directly being used.

Mike [00:39:38]:
It might be.

Vemir [00:39:39]:
And we have to reorganize our society by force or we're gonna implode. I mean, there's so many implications here, and I don't think people see it coming.

Eldar [00:39:49]:
You're very worried.

Vemir [00:39:50]:
I can't be worried because I can only put my full effort in every day, and I can't control things. And I can control things. Like, I've mentally prepared myself to understand what I can do and what's outside of my control. A lot of really smart people in this field think we're fucked. And the rest of the world, like, some people who are AI scientists, like, quit their jobs and are spending the next ten years on loans to just enjoy and ride out the trip. Like, they don't think we're gonna last 20 years for sure. They think it's probably closer to ten years.

Mike [00:40:21]:
Do you know about this evil? How many years do you think we got left before the next ice age comes?

Sergiy [00:40:25]:
I have no idea.

Mike [00:40:26]:
But if you were to take a guess, some people saying less than 20.

Eldar [00:40:31]:
From what was gonna cause it, like, they say, destruction.

Mike [00:40:35]:
Oh.

Sergiy [00:40:35]:
Like, AI was so. Not like, not. Not like nature.

Eldar [00:40:39]:
Yeah.

Vemir [00:40:40]:
It's gonna be.

Mike [00:40:42]:
AI may cause nature to intervene.

Vemir [00:40:44]:
I don't know.

Sergiy [00:40:44]:
I think this Covid shit is gonna kill us because.

Vemir [00:40:47]:
Pretty benign, though.

Sergiy [00:40:48]:
Kept reproducing new ones. It keeps reproducing.

Vemir [00:40:53]:
Yeah, there's that SARS. Not SARS, that's the COVID one. But there's. There's the H1N1 virus. There was a variant that killed 50% of the people that got it.

Eldar [00:41:03]:
Hey, we're gonna have a little intermission here, please.

Mike [00:41:07]:
Yeah, because he's been rambling for too long.

Eldar [00:41:08]:
Yeah, we gotta give him.

Vemir [00:41:10]:
No, I don't want to ramble. I know. I'm very honest about getting this.

Mike [00:41:16]:
I didn't know he's so sensitive. I'm sorry, Vemir.

Eldar [00:41:18]:
Like. Yeah, you.

Mike [00:41:19]:
Thank you so much.

Vemir [00:41:19]:
Is this not a safe space? What am I doing?

Anatoliy [00:41:21]:
Yeah, I think. I think it's not a troll to be concerned about this stuff.

Eldar [00:41:29]:
Kaya, this is Vemir. Vemir, this is Kaya. This is Serge's wife and baby. And they're not sick. They're pretty good, so nobody's gonna get sick.

Mike [00:41:38]:
They're also scared of a.

Eldar [00:41:39]:
That's a question, not a statement. Kaya, would you like to join us for a minute?

Sergiy [00:41:46]:
For a minute?

Vemir [00:41:48]:
Well, I mean, you don't have to really get into too much of the technical to just know. Like the big scale infiltration. That looks like a real.

Eldar [00:41:56]:
That's not Archie.

Vemir [00:41:57]:
No, that's in memorial.

Mike [00:42:01]:
You're not the first one.

Eldar [00:42:02]:
Kaya, quick question. Are you afraid of AI? Sort of. Yeah, you are. Why?

Mike [00:42:07]:
Serge? Ben?

Eldar [00:42:08]:
Why? Because it can take control over certain situations. Are you just afraid because you saw some movies? No, I don't watch those kinds of movies.

Sergiy [00:42:19]:
She's afraid because she wants to be in control.

Vemir [00:42:21]:
Oh, I think we all do. Right? Want to feel like we have some sense of control. Yeah, but what kind of situations do you mean? Like, it can be already, like, the.

Eldar [00:42:30]:
Way that the media, the marketing, everything is sort of already taken. Yeah, like, what's the point? Everybody's zombies already.

Vemir [00:42:37]:
She's right.

Eldar [00:42:39]:
I mean, now I'm working with a.

Anatoliy [00:42:40]:
Company who is going to be.

Vemir [00:42:42]:
Is already putting a solution into TVs.

Eldar [00:42:45]:
Same as you have in your phone.

Anatoliy [00:42:46]:
So you're talking at home and all.

Eldar [00:42:48]:
Of a sudden it's infiltrating your television with the type of.

Vemir [00:42:52]:
Oh, like monitoring and algorithms that use your info for recommendation systems and selling.

Eldar [00:43:00]:
Back everything they hear.

Vemir [00:43:02]:
Yeah, there's this machine learning guru guy. His name is Michael I. Jordan, and they call him the Michael Jordan of machine learning, but, like, he was born before the basketball player. So Michael Jordan is the Michael I. Jordan of basketball? Chronologically. Right. So this guy is serious. He says, like, this is, like, straight up exchanging human flesh for money.

Vemir [00:43:26]:
Like when they use, like, shitty algorithms on Google to sell you shitty products on Facebook or whatever else. It's just the worst use of our brain and information. And he's really, really against what we do in the advertising system.

Eldar [00:43:40]:
We're getting pimped, you say, easily and.

Vemir [00:43:43]:
Constantly, and it almost seems like a rape of you. Like, in a way.

Mike [00:43:48]:
But if you're pretty.

Vemir [00:43:49]:
But you're consenting to it, right? You're all consenting to it.

Eldar [00:43:52]:
Isn't it justified?

Vemir [00:43:55]:
It's explainable when you look through biology, but it's not justified. Huge difference.

Eldar [00:43:59]:
Yeah, okay, fine.

Vemir [00:44:00]:
It's like when someone tells you about why they did something horrible. That's an explanation, not an excuse. I don't think there's an excuse for why we're still stupidly power hungry. And the TikTok and the Linktree, which leads to the OnlyFans and all that stuff. Sorry, but it's like, you know, I mean, whether that's good or bad is neither here nor there. It's like I'm saying we're mostly still biologically fueled. But I disagreed with my friend. He's, like, an electrical engineer, and he says we're always going to follow these biological things.

Vemir [00:44:33]:
That's how we're. I've said it's like a. Definitely a pull, but it's transcendible. Like, you can transcend these biological mechanisms. That's our gift as humans that bees and weasels and stuff don't have. We have, like, a potential to awaken more, I think, you know?

Mike [00:44:51]:
Yeah. But maybe, you know, maybe we're not meant to stick around forever, like, uh, because of our, I guess, capacity or lack of, like, capacity, you know, like the ice ages. They happen, they will go. Somebody got wiped out then that, you know, the dinosaurs got wiped out. Right. Different. These things come right, you know, for whatever reason, and the civilization gets wiped.

Vemir [00:45:14]:
Out, you know, like, for, like, you know.

Mike [00:45:20]:
Probably can't.

Eldar [00:45:21]:
Yeah, throw a new guy. Hurricane. Trump.

Mike [00:45:27]:
Trump.

Eldar [00:45:27]:
Trump.

Vemir [00:45:28]:
Trump. Trump.

Sergiy [00:45:28]:
Trump.

Vemir [00:45:28]:
Trump.

Eldar [00:45:28]:
Trump. Trump. Way.

Vemir [00:45:29]:
We're gonna bomb it.

Mike [00:45:34]:
Can you stop? What was that flood that killed, like, a crazy amount of people? That was like a huge flood. No, no, there was a flood, like, in Thailand or.

Eldar [00:45:45]:
Yeah.

Mike [00:45:46]:
Tsunami.

Eldar [00:45:47]:
Yeah. Killed, like, how many thousand people?

Mike [00:45:49]:
200, 300,000 people.

Eldar [00:45:50]:
Yeah.

Mike [00:45:50]:
How are you gonna stop that?

Vemir [00:45:52]:
Well, I mean, 20 years ago.

Eldar [00:45:54]:
Yeah.

Vemir [00:45:56]:
Elon would say.

Eldar [00:46:00]:
Turkey.

Mike [00:46:00]:
Yeah.

Anatoliy [00:46:00]:
A hundred thousand almost.

Eldar [00:46:02]:
Yeah.

Mike [00:46:02]:
Wow.

Sergiy [00:46:03]:
Nobody killed it. Poor construction.

Eldar [00:46:05]:
Those houses, they don't have earthships. We get it.

Sergiy [00:46:08]:
They all violated construction codes. There was one place that didn't have anything. No buildings went down because the mayor of that town said there will be no corners cut. Some people were saying, in people's apartments, to get extra space, they were cutting into the structural beams of the buildings. So imagine one guy did it, and then four, five, six guys in the same building cut into a structural beam.

Vemir [00:46:30]:
You're talking into the arm, by the way. You might want to flip the camera.

Anatoliy [00:46:34]:
Yeah, it sounds like what happened then.

Mike [00:46:35]:
Like, is supposed to happen if you do. If you're supposed to get more money, but results.

Anatoliy [00:46:43]:
Same thing with AI, I guess, right. If humans want to continue, like, developing this, then, like, you have to be all right with the consequences.

Vemir [00:46:50]:
But there might not be anybody left to be all right or not, but that might be what existential risk is. And what's cool is.

Eldar [00:47:00]:
He'll be fine through the ice age.

Vemir [00:47:02]:
Yeah. And your beetle, lazy Ashley. But it's like, this is why I'm so passionate about this field is like, if we get this right, the amazing, like, it's.

Mike [00:47:15]:
What's the chances that we. So dumb.

Anatoliy [00:47:17]:
Get this right.

Vemir [00:47:18]:
This is the question. I'm gonna ask you guys a question.

Anatoliy [00:47:20]:
Yeah, it sounds like there's no way we get this.

Eldar [00:47:22]:
Let me ask you a question.

Anatoliy [00:47:23]:
Yeah. We're not supposed to.

Vemir [00:47:24]:
You are Eldar. I'm Vemir. We're sitting here. We're young-ish and, like, hardworking or intelligent or whatever else. We have energy. But, you see, China could create these huge AI dog robots and take over Taiwan. Or, you know, like, Azerbaijan has the. It's not because I'm Armenian. I'm saying this.

Vemir [00:47:43]:
It's, like, the first documented case. They use drones that didn't have a human in the loop decision making. They picked kill spots, and they killed people in Armenia based on news. Based on what?

Mike [00:47:59]:
Based on news or history?

Vemir [00:48:01]:
No, the drones came and picked the kill spots.

Mike [00:48:05]:
Based on what?

Sergiy [00:48:06]:
AI. Right.

Eldar [00:48:07]:
AI.

Mike [00:48:07]:
And what was the AI based on? Like, I don't know.

Vemir [00:48:10]:
The intense details, but the whole point is there was no human making the decision of who to kill. To me, that's insane, right? It's already happening.

Eldar [00:48:19]:
Yeah.

Vemir [00:48:19]:
So it's like you're thinking, okay, I'm one human, but one human can have a huge impact. Like, there's Gandhi or whatever you care about. Who are you to influence the entire culture of AI? But it's like, I realize that if you don't do something, if there's not. Actor act. Oh, you headed out? Oh, no, it's okay.

Eldar [00:48:39]:
All right, guys. Thank you.

Mike [00:48:41]:
All right, guys.

Eldar [00:48:41]:
Bye.

Mike [00:48:43]:
Nice meeting you.

Eldar [00:48:47]:
Oh, you. Come in.

Vemir [00:48:52]:
Do you get. My point is, like, I actually wake up every day whenever I read something new on AI safety ads. Like, I have an urge. Like, I don't see.

Eldar [00:49:01]:
You found your purpose. Yeah.

Vemir [00:49:03]:
I mean, it seems to be a real purpose.

Eldar [00:49:05]:
And you like waking up scared.

Vemir [00:49:07]:
No, I'm waking up, like, motivated, charged.

Mike [00:49:10]:
How can you. How can your purpose be to stop something that's inevitable? Inevitable or justified, even. Or right to happen?

Vemir [00:49:17]:
Hmm.

Mike [00:49:18]:
Is it?

Eldar [00:49:19]:
No.

Vemir [00:49:19]:
Because you can't guarantee the outcome. It's very likely, yeah.

Eldar [00:49:25]:
Well, if you attach yourself to the purpose of assuming that you have the particular knowledge in order to make an impact.

Anatoliy [00:49:32]:
No, but what. What is your purpose when it comes to AI?

Eldar [00:49:36]:
Warn people.

Vemir [00:49:37]:
I have a theory about the overlap of AI and spirituality. Like, I think ethics in AI is like the blind leading the blind. Though I'm not going to say that I know the base principles right now. I can tell you we can be less wrong about them. And, like, we automatically program bias into machines, and, like, we're not even trying, you know. And, like, there are certain companies that are focused on safety. And I feel like in 1975, the other Asilomar conference. Asilomar is just a place in California, so they just named the conference after the place.

Vemir [00:50:18]:
There was a conference in the scientific community on recombinant DNA. It's like changing genes. And the scientific community came to a consensus and they did moratoriums. They stopped research on stuff. They worked together. They shared stuff. Transparency. It set a lot of fertile ground for what's possible when you have a consensus, because regulation doesn't seem to be enough. It's rather that you want people to all agree on something.

Vemir [00:50:44]:
Right. And work together. So I think there's a lot of coalition building that needs to happen, and policymakers and key stakeholders who need to be influenced and to understand there's a lot of work to be done. But, you know, Toli, it's also nice to share the benefits. Like, you know, that sixties spiritual revolution where people just. Hippies smoking joints and having sex and dancing. It's like, imagine the whole world didn't have to work, where it was illegal to starve because all the hydroponics and aeroponics are working. Like, everybody's in joy. It's like you're creating a space for people to become happy.

Vemir [00:51:31]:
Like, it's amazing. AI washes all the windows.

Eldar [00:51:34]:
You know, he wants the utopia style.

Vemir [00:51:37]:
It's possible if we coordinate correctly and the AGI doesn't take a thing of its own and destroy us of its own accord. And there's all.

Anatoliy [00:51:45]:
Let me ask you this.

Vemir [00:51:46]:
In this kind of world, very unlikely, though.

Anatoliy [00:51:48]:
In this kind of world, are all people equal?

Vemir [00:51:52]:
Well, I think we are all equal. We have these.

Sergiy [00:51:54]:
No, no.

Anatoliy [00:51:55]:
In this world that you talk hierarchy.

Vemir [00:51:58]:
In general, we just create hierarchies.

Anatoliy [00:52:00]:
Yeah, I feel like the way that the world is right now, like.

Vemir [00:52:04]:
Is the blind or veiled illusions. Well, I think there's people above people, but.

Anatoliy [00:52:09]:
Yeah, yeah, sure. Yeah.

Eldar [00:52:11]:
Okay.

Anatoliy [00:52:11]:
But I feel, like, really convinced by it with. Yeah, like, I feel like you're. You're trying to. Well, like, this world we're talking about, like. Like, greed doesn't exist in it.

Sergiy [00:52:25]:
Right.

Anatoliy [00:52:26]:
And, like, stuff like that.

Vemir [00:52:28]:
Yeah, I feel like it's not communist either.

Anatoliy [00:52:30]:
Yeah, yeah, I feel like the conversion of that is, like, as long as there's opportunity for, like, greed or, like, hierarchy and stuff like that. I feel like.

Mike [00:52:40]:
Well, he's. Maybe he's saying that the robots will take away the opportunity for being greedy, because you're not gonna have a job, so you just gonna be. Everybody's gonna live their best life.

Anatoliy [00:52:47]:
Then you're talking about applying, like, the robots, being able. Able to apply the discipline and punishment.

Eldar [00:52:52]:
Right, to maintain order.

Vemir [00:52:53]:
Now we're talking about when AI surpasses human capabilities, how will it act? Think about all the mind spaces possible, like, you, you. We all have different minds. They kind of overlap with ideas and. Right, but we're kind of similar flavors of ice cream. But it's like AI can be a completely different mind space. Unthinkable of what it would behave or think about or it's just like we have not. And I think it's nearly impossible to map out mind design, like the way you design a mind. Can you imagine how hard it would be to get it right, to get.

Vemir [00:53:35]:
To design an AI, to become conscious and then want to take care of us in a way that. I mean, it's like, that's what we really hope for. Because once you pass the singularity point, once you pass the point where it can improve itself, where it's smarter than human capabilities, where it has control of whatever it wants to, like, it's game over. Our fate is sealed. We don't have a choice after that.

Eldar [00:53:58]:
Yeah, but that's not possible because at the end of the day, who's building AI? Right. Not enlightened.

Vemir [00:54:03]:
Osho. This is the key point of what I'm saying: people are running towards a cliff and saying it's inevitable. Yeah, and, like, the thing is, if. If it. If we're creating stuff with bias, not aligned with human values, we don't even know what human values are.

Mike [00:54:21]:
How do you do this? How do you break this?

Vemir [00:54:23]:
Do you see the problem?

Eldar [00:54:25]:
Well, 100%.

Vemir [00:54:26]:
The ideal person is an enlightened AI engineer.

Eldar [00:54:28]:
We see the problem. We're not. We're just not sure whether or not we. Allowing the problem to affect us in the way. Maybe it's affected you or gave you a certain purpose to save humanity. Here.

Vemir [00:54:38]:
You don't want this.

Mike [00:54:40]:
No, no.

Eldar [00:54:40]:
You do want.

Vemir [00:54:41]:
No, you're trying to.

Eldar [00:54:42]:
I don't. I don't care what happens, kind of.

Vemir [00:54:44]:
But it's going to affect you. You can't see it yet.

Mike [00:54:47]:
I guess when you're dead, nothing's going to affect you. I guess we don't know what's to come.

Eldar [00:54:53]:
Yeah. Yeah.

Vemir [00:54:54]:
In the end, the whole game, nothing matters, but it really kind of. You can suffer until you move on to the next stage.

Eldar [00:55:00]:
Right. Well, depends how. How far your mind. Yeah, just right.

Mike [00:55:04]:
The suffering. Yeah, I guess. Yeah, depending.

Eldar [00:55:06]:
That's also relative.

Mike [00:55:07]:
Yeah.

Vemir [00:55:07]:
Okay. But, I mean, I don't think you're making a strong argument by saying, like.

Eldar [00:55:12]:
I'm not making a strong argument for humanity. Absolutely not. I think it's what you're maybe trying to do is slow down the natural selection process, and I'm not sure if that's possible or needed.

Anatoliy [00:55:27]:
Needed?

Vemir [00:55:28]:
Okay, let me ask you a question. Even if the next step in evolution is for AGI to replace us, I.

Eldar [00:55:34]:
Mean, we're not even talking about. Okay, AI is a problem. We're talking about a potential. I don't know if this is. This conspiracy theory is correct now or it's proven now. Covid was created in the lab.

Mike [00:55:44]:
Yeah.

Eldar [00:55:45]:
By a human.

Mike [00:55:46]:
Yeah.

Eldar [00:55:46]:
Right. That same individual that wanted to come and shoot up the school.

Mike [00:55:49]:
Yeah.

Eldar [00:55:49]:
Said, you know what? I'm just gonna shoot up the world. Yeah.

Vemir [00:55:51]:
Same individ. Wait, you thought it was designed on purpose to hurt people?

Eldar [00:55:55]:
Because this is the theory, right?

Vemir [00:55:58]:
The lab leak theory is not a purposeful to release it. It's called a lab leak.

Eldar [00:56:03]:
Correct, but.

Vemir [00:56:04]:
So they.

Mike [00:56:05]:
Why are people building these things?

Eldar [00:56:06]:
Yeah.

Vemir [00:56:06]:
Oh, this is another accusation. Level four lab. Yeah, but that doesn't mean that there's malevolent actors. Like, if you understand, in laboratories, they create variants of diseases to make them stronger. Like the H1N1 virus. That's what I was gonna bring up before. They had one that.

Eldar [00:56:22]:
Yeah, no, no, I'm explaining. I am explaining to you that the individuals that are creating this in the first place are doing it for whatever reason they're doing it, under the assumption that their intent is a good one, without understanding the implication that accidents could happen. The same way, millions of people can die. What I'm saying is that. You know what I'm saying is that AI is another virus, is another school shooter. But you know what I'm saying, which.

Vemir [00:56:47]:
Is leading to two things, like you said: the same guy who would shoot up a school. No, it's not true. The guy who shoots up the school, he's in a lot of pain and he wants to hurt people. The guy in a lab creating stuff, he's not doing it on purpose.

Eldar [00:57:00]:
No, I am telling you, that guy. That guy is also in pain. It's just we have esteemed that guy to be a researcher. He's researching some fucking.

Vemir [00:57:08]:
Maybe he's curious.

Eldar [00:57:09]:
Why is he curious?

Vemir [00:57:11]:
Intellectual curiosity.

Eldar [00:57:12]:
You know what I'm saying?

Vemir [00:57:14]:
But it doesn't mean why. Why he wants to hurt people on purpose.

Eldar [00:57:17]:
No, but if you have a desire to be curious, it's not. I'm not saying great. I'm saying my topic. I'm saying that there's an ethical consideration and being curious, especially on a level of you knowing that if this comes out, if this virus comes out, this can do some crazy ass damage. There's a possibility of that.

Anatoliy [00:57:37]:
There's a big responsibility.

Eldar [00:57:38]:
There's a huge responsibility. The Nazi individuals who just gonna say, you know what? I don't care about that. I'm curious, bro. Let me look. Mm hmm. What are we talking about here?

Vemir [00:57:52]:
Okay.

Mike [00:57:52]:
Yeah.

Eldar [00:57:53]:
People.

Vemir [00:57:55]:
You're saying these people are not considerate enough of others.

Eldar [00:57:59]:
Correct. And therefore they're more dangerous than the school shooter, where the school shooter could only wipe out 100 at one time. This individual, who has not understood the power of their own mind and what they're capable of in terms of ethical considerations, fucks up the whole world without even knowing. And that is very dangerous.

Vemir [00:58:16]:
Intention. Intention is a really important thing. Like, for example, if I say a joke and it's about suicide, and someone in the room lost their brother to suicide, I hurt them, but it's unintentional. But still, if I know your brother committed suicide and I ping you on it, and I want to hurt you, and that's a pain point, that's different. It's the same way a school shooter wants to fucking kill people on purpose. But, like, I think the lab thing is as dangerous and as multiplying of an effect. It is.

Eldar [00:58:52]:
Yeah.

Vemir [00:58:53]:
His intention is different, which might have worse consequences, but you have to consider intention.

Eldar [00:59:00]:
I think, like, yeah, you're undermining. You are undermining the dangers of curiosity, bro.

Vemir [00:59:06]:
I actually. You're undermining a really important thing to explore, but you have to start in the right place. Which is, saying it's like school shooters is misplaced. It's not that you're being offensive to me. I'm just saying it's misplaced in that school shooters have some manipulation of the mind.

Eldar [00:59:26]:
Correct.

Vemir [00:59:26]:
And people on purpose.

Eldar [00:59:29]:
If it's a mental illness, right. If it's a mental illness that's clouding their judgment and they're not knowing what they're doing, right. Because they're under a spell.

Vemir [00:59:36]:
They don't know better.

Eldar [00:59:37]:
They don't know any better. Right. That's the curious mind is the same thing. They don't know any better. So they're not supposed to see the consequences after the fact that it happens. They wake up and they're like, wait, what have I done? I'm remorseful now.

Sergiy [00:59:49]:
Okay.

Vemir [00:59:49]:
That's like a kid who's a bully.

Eldar [00:59:51]:
I'm saying if you're asleep, right? If you asleep, you asleep. Yeah.

Mike [00:59:54]:
So you're saying that if you unintentionally killed somebody, you shouldn't go to jail, because it was accidental, like in that movie.

Vemir [01:00:00]:
There's manslaughter versus OJ.

Mike [01:00:02]:
So then doesn't matter the intention, you still go to jail, right?

Vemir [01:00:05]:
No, it matters.

Sergiy [01:00:06]:
Of course it matters, in how many years you get. If you purposefully kill somebody, you will get life in America.

Mike [01:00:12]:
But if there was no intention, should you be set free, then, no, you.

Vemir [01:00:15]:
Will not be man slaughter like you kill somebody in a car accident.

Mike [01:00:18]:
Yeah.

Sergiy [01:00:19]:
You only get a few years or. No, that's only because of the system.

Mike [01:00:23]:
Right?

Vemir [01:00:23]:
Okay.

Eldar [01:00:24]:
Yeah.

Vemir [01:00:24]:
I'm not saying the system is correct.

Sergiy [01:00:26]:
But I'm saying even the family of the person could say, hey, we have nothing. You know, it was an accident. We forgive him and everything. It's fine. But the system will still go after you because everybody got to get paid.

Mike [01:00:38]:
I intention, I guess it's important, but it's still not to a certain degree. Right.

Eldar [01:00:43]:
Well, in terms, what if the leak.

Sergiy [01:00:45]:
Was purposefully done into.

Vemir [01:00:47]:
Well, then we have a real illuminati, whatever shit on our hands.

Mike [01:00:51]:
I think regardless, doesn't minimize what actually happened.

Sergiy [01:00:58]:
I think it purposely curious apple.

Vemir [01:01:01]:
Right. But I think people. Wait, what was I going to say before the lab thing? What was the last thing you said?

Mike [01:01:08]:
I was saying unintentional versus intentional.

Vemir [01:01:13]:
Let's ask ourselves about AI. These engineers, I don't think, are trying to be biased or racist. But in 2015, Google had image recognition software that conflated black people with gorillas. And do you know what their solution to this problem was? They stopped identifying gorillas. It's a ridiculous solution. Like, I don't think any. I don't think the wide, vast majority of AI engineers want to hurt people.

Vemir [01:01:42]:
I think they're just maybe intellectually curious.

Eldar [01:01:44]:
But if you make. That's the thing, right? And at the end of the day, I'm gonna tell you right now. If you consciously ask them why, why, why, get to the point of the whys, and clearly find out that there is going to be a measure between risk and reward here, you quickly find out that a lot of people will be like, okay, cool. Now I really understand the implications of what I'm about to do.

Mike [01:02:06]:
Yes.

Eldar [01:02:07]:
That's the problem now, if you want to throw the curiosity in this, you know, with intent, you know, some kind of backing with, like, I have good intentions for the world or whatever, that's horseshit.

Vemir [01:02:18]:
I do have to go in like five or ten or just want to give.

Eldar [01:02:22]:
Yeah. So I think we're, I think we're on your side here, ultimately.

Vemir [01:02:25]:
What was the last sentence you said about intentions? It doesn't matter.

Eldar [01:02:29]:
It doesn't matter if you, if it's coupled behind curiosity that you're curious, but you're like, oh, no, but like I'm trying to do right here by the world. Right. Okay.

Vemir [01:02:37]:
I want to think about this because there's a.

Eldar [01:02:39]:
If you have not, if you, if you really understood the implications behind the type of genie that you're about to unveil. Right. The atomic bomb.

Vemir [01:02:47]:
Yeah.

Eldar [01:02:47]:
Right. Yeah. We're still regretting that decision today.

Vemir [01:02:49]:
Like the atomic bomb multiplied by an atomic bomb.

Eldar [01:02:53]:
You know what I'm saying? So those scientists that create. Who was part of that project, right?

Vemir [01:02:57]:
Oppenheimer. Einstein.

Eldar [01:02:59]:
Correct. And what they said, and what they said, he said, yo. Then later in their life they realized, like, holy shit, I was part of something.

Vemir [01:03:05]:
What I've done.

Eldar [01:03:07]:
Correct.

Anatoliy [01:03:07]:
And who said that?

Vemir [01:03:09]:
Oppenheimer.

Eldar [01:03:10]:
The guys that worked on the. Atom. Atom bomb. I think Einstein worked on it.

Mike [01:03:15]:
Yeah.

Eldar [01:03:15]:
Right. They came, they came to a conclusion. Like, wait a second. What have we done?

Vemir [01:03:19]:
I have.

Eldar [01:03:20]:
What did we let out?

Vemir [01:03:21]:
Yeah.

Eldar [01:03:21]:
Really into the world.

Vemir [01:03:23]:
They said the guy looks down. The guy looks down in an interview, and he quotes, like, the Upanishads or something. Like, I really don't want to misquote, but he says, like, I am become.

Sergiy [01:03:34]:
Death, destroyer of worlds.

Eldar [01:03:37]:
You understand?

Vemir [01:03:37]:
Like, he had never let go of that until he died.

Eldar [01:03:41]:
You understand?

Vemir [01:03:42]:
How significant. So imagine what was he doing?

Eldar [01:03:44]:
Under what spell or trance was he on when he was doing this? Did he.

Anatoliy [01:03:51]:
Say that? Did he talk about that? What, like, what led him to do this to begin with?

Sergiy [01:03:57]:
Well, he was hired. It was either Germans or.

Eldar [01:04:01]:
Yeah. Some people say you are forced to do it.

Vemir [01:04:03]:
Nazis are taking over. They're gonna fuck up the world or whatever the time.

Eldar [01:04:07]:
Yeah, the enemy.

Vemir [01:04:08]:
Like, you're a genius. You have a responsibility as a citizen to help us. Are you gonna help us? And also we'll give you a million a year.

Eldar [01:04:16]:
There you go.

Vemir [01:04:16]:
Who's gonna hold everybody? You know? I'm saying, like, money, toast, bullshit works.

Eldar [01:04:22]:
But then there's the guy who made the vaccine to help with the disease. And they came to him like, don't release this vaccine for everybody, we're gonna give you a lot of money. He said, no, this is supposed to be for everybody. He didn't take the money. He gave it away for free.

Vemir [01:04:34]:
Are you talking about the guy who invented penicillin? I think.

Eldar [01:04:37]:
Is it penicillin or polio? Some. Some kind of a vaccine for some.

Vemir [01:04:39]:
Yeah, it was, um. Like. Yeah, no patent.

Eldar [01:04:42]:
Correct. You understand?

Vemir [01:04:47]:
Like, these are people with honor, integrity.

Eldar [01:04:50]:
All right.

Vemir [01:04:51]:
Like, this is the part.

Eldar [01:04:52]:
This is part of. Part of life's balance and life thing. AI is yet another fucking thing.

Vemir [01:04:57]:
But again, it has the unique potential.

Eldar [01:04:59]:
Which is, which to a degree is necessary because, you know, we haven't proven to ourselves yet to humans that we don't learn from pain. Yeah.

Mike [01:05:09]:
The thing is, I guess it might be that how we're going to the society might be past the point of saving where.

Eldar [01:05:15]:
No, that's very gloomy, but that's.

Mike [01:05:17]:
No, no, I'm saying, like, with the.

Eldar [01:05:18]:
AI, you're doing this in the first place to save him from fucking having nightmares all night.

Mike [01:05:23]:
But I think in that case, like, the AI thing.

Eldar [01:05:26]:
Yeah.

Mike [01:05:26]:
Sounds like AI may feel like that people are beyond saving, so.

Vemir [01:05:31]:
Well, it's definitely putting a mirror up to humanity.

Mike [01:05:34]:
Yeah.

Anatoliy [01:05:35]:
AI is saying that we need something to think for us.

Eldar [01:05:38]:
Well, that's the unfortunate part.

Vemir [01:05:40]:
That would be nice if AI could think for.

Eldar [01:05:42]:
See whether he's saying that he can't think for himself. Yeah. He's not trusting the world to think properly. Therefore, he'd like somebody who's a little bit smarter to drive to tell him what.

Anatoliy [01:05:51]:
What to do.

Vemir [01:05:52]:
No, I'm saying that, like, intellectual activities may not be necessary if they can be done by another, which gives you the freedom and range to develop your own character. One thing Sadhguru said about AI directly, which I really liked, is, he said, this is a nice framework, actually. He said, back in the day, if you were a very strong man, very physically capable, you could conquer the world. You were smart, strategic, whatever else, physically based. Now, if you're physically strong, the best job you can get is a security guard protecting the president, or a bodybuilding champion, whatever. Now it's the nerds who run the world. CEOs, the biggest, richest people on the earth, the most dominant, are smart people. After AI takes over the intellectual capacities, what's left? Being a good human being.

Vemir [01:06:42]:
It's the next core.

Eldar [01:06:43]:
Philosophers.

Vemir [01:06:44]:
Yeah, philosophers, spiritual leaders, creative people.

Eldar [01:06:48]:
Then you're supposed to promote the shit and let them get the. Get the shit wiped out as soon as possible.

Vemir [01:06:51]:
That's what I'm saying.

Eldar [01:06:55]:
That's what he's after. He's trying to say, like, your warnings are warnings, but actually, like, hurry up, hurry up, get this done so I can stay here by myself.

Vemir [01:07:02]:
I'm saying the truth is, if we do it correctly, everybody's gonna be in competition of how good of a human being you are. Right? There's gonna be no more.

Mike [01:07:10]:
It's gonna be either utopian society world, or it's gonna be no world.

Eldar [01:07:14]:
It's gonna be everyone walk around saying blessing, blessing. Or what do they call it?

Anatoliy [01:07:18]:
Blessings.

Eldar [01:07:18]:
I appreciate you, man.

Vemir [01:07:20]:
Yo, I appreciate you.

Eldar [01:07:21]:
Yeah, I appreciate you.

Vemir [01:07:22]:
That's what, like, imagine how fun it would be.

Eldar [01:07:25]:
Yeah.

Vemir [01:07:25]:
If like, you know, I mean, the French are way ahead. They love to socialize, they don't like to work. So I'm just stereotyping. But like, you know, it's like some communities are more human centered rather than driven towards this seemingly endless money status.

Eldar [01:07:43]:
Start writing a book or on competition of how to become good.

Vemir [01:07:47]:
Well, I mean, there's part of it.

Eldar [01:07:49]:
A competition, like a game to become good. Yeah, it'll be a hit when AI comes out.

Vemir [01:07:54]:
In my, in my heart, I feel like I have some unique insight. Like, I don't think a lot of AI researchers, hyper-rationalists, have studied or released their biases or tried to become awakened. Like, they exist for sure, but it's rare. And then a lot of spiritual people don't understand, like, AI. They might just dismiss it, right? So it's like this thin line, this, like, overlap. If we bring the wisdom and the knowledge together, I mean, we have our best mathematical possible chance, but otherwise we're just kind of creating a huge mess and then I guess it resets or whatever else. But it would be nice to see, like, AI making a movie, right? How cool would that be?

Sergiy [01:08:40]:
That would be cool.

Vemir [01:08:40]:
You know, we can observe these beautiful things. Like, AI makes the next level of Michelin star restaurants. Like it collects all the kind of nutrients and it creates new plants that are healthy and aeroponic, and it creates these new flavors you never could have experienced by yourself. Like, imagine how many opportunities and possibilities there are in the world for us to enjoy.

Eldar [01:09:02]:
And maybe AI, you're trying to sell it, but I'm just saying, I have not exhausted. Not yet.

Anatoliy [01:09:11]:
The way he's talking about is like just nothing to do right now.

Eldar [01:09:13]:
Yeah, I don't get bored. I love everything like this shit. I'm fascinated by so many things. Like, I'm just alone, fucking AI. Like, throw out my head around that, bro. I'm so behind, bro, I don't even know what the fuck TikTok is. You know what I mean? So, like, because I'm still stuck. And look at grass.

Eldar [01:09:26]:
Looking at grass grow, bro.

Vemir [01:09:28]:
If you want to create a society where people like Eldar can have the maximum amount of time every day to do and explore and live healthily and learn and awaken and all that stuff, right? It's like, it. It directly intersects. I hope you're right with what I love about Osho. He says he's the rich.

Mike [01:09:48]:
Yeah, I'm not sure if that. If it's gonna be, like, a by default.

Vemir [01:09:51]:
Not saying.

Mike [01:09:52]:
Yeah, I'm not sure if we can hack it.

Vemir [01:09:54]:
Buddha was.

Mike [01:09:54]:
I think you. You have to find it yourself. You have to go and discover for yourself.

Vemir [01:09:59]:
No, I need the space to be able to do it. Like, Osho has a really controversial get.

Eldar [01:10:06]:
The money for us, then. The Jim Carrey phenomenon, right?

Mike [01:10:09]:
Yeah.

Eldar [01:10:10]:
Get all the money in the world, still miserable.

Vemir [01:10:13]:
So imagine if everybody got all these things that they thought, and then they have the clear space. Once your mind is programmed to think about money, status, money, status, hamster wheel, you're never gonna get.

Mike [01:10:25]:
I don't know, bro. I'm not sure if it's gonna be.

Eldar [01:10:28]:
Part of a process that's super necessary.

Vemir [01:10:32]:
Yeah. Why don't you find value artificially manufactured?

Anatoliy [01:10:36]:
Why don't you find value in, like, in self discovery?

Vemir [01:10:40]:
Self discovery is the journey. I'm talking about you making money on the grind. Next house. It's all castles in the sky.

Mike [01:10:52]:
Yeah, I don't think. I don't. I'm not sure if we're gonna have all this free time. No, we're all.

Vemir [01:11:01]:
I didn't say bad. I'm saying that it's, like, the wrong target. You're pointing arrow at this target when the real. Because I'm gonna give you what you want.

Sergiy [01:11:09]:
Says who?

Vemir [01:11:10]:
You think it's going to satisfy you when you finally.

Eldar [01:11:13]:
What if you're not deserving of it, though? What if you're not deserving of it?

Vemir [01:11:16]:
No, I'm saying that who are you.

Anatoliy [01:11:17]:
To tell somebody that that's not gonna make them happy?

Vemir [01:11:20]:
Okay, I'll tell you meditation, and I'm directly quoting again, my boy. Meditation is the only door that someone has entered that they haven't come back out of regretting it. How many people have you seen go into the door of money, status, drugs, sex, power. Someone always comes out regretting it.

Mike [01:11:44]:
But how you got.

Eldar [01:11:45]:
Meditation is used to promote a cult. Yeah, that's not.

Mike [01:11:52]:
But how are you gonna convince a person that really.

Vemir [01:11:56]:
Example.

Eldar [01:11:57]:
No, what I'm saying about pure money, pure power.

Vemir [01:12:01]:
But you gotta talk about pure meditation.

Eldar [01:12:03]:
Then you gotta understand that somebody's putting people onto those meditations. And sometimes the cult leaders are, you know, using that as a.

Vemir [01:12:09]:
Do you think they're the best example or.

Eldar [01:12:11]:
No, they're not. But it happens.

Vemir [01:12:12]:
It happens. But I'm saying the wrong door of meditation. Just like the door of exercise. I mean, you could say the door of exercise.

Eldar [01:12:20]:
Yes. I think the pure door of focus for money is also a door which, if it's pure focus, you will find happiness within that focus.

Vemir [01:12:27]:
No, money is not a set. It cannot satisfy.

Eldar [01:12:30]:
I agree. But the focus to get money is the satisfaction. But not a lot of people go through that. They might be under the impression that they go in for the money, but they actually are craving the focus that they need in order to really actualize themselves.

Vemir [01:12:42]:
And I'm saying that that's actually the door of meditation. No, it's not. Because it's like, kind of like this. Like, you're an intense person.

Eldar [01:12:50]:
Yeah.

Vemir [01:12:51]:
You want to become an artist. You go to art school, but you're not really innovating like Picasso is. You get rejected from art school, so you go into politics.

Eldar [01:13:00]:
Good.

Vemir [01:13:01]:
That intensity puts you into the Third Reich, and you become Hitler. It's the right intensity, wrong direction or intention. The same way, I think. Really intense people, like, they say. Actually, like all the spiritual leaders, were very horny. Very high levels of energy. If you channel it into meditation, what's the goal of meditation? To become awakened. If you channel into money, you'll get really rich.

Vemir [01:13:27]:
But after that, there's people who are rich who kill themselves. Nobody who's enlightened killed themselves.

Anatoliy [01:13:32]:
Yeah.

Mike [01:13:32]:
But again, it's how you utilize those things.

Eldar [01:13:35]:
Not a physical realm, though. Yeah.

Vemir [01:13:37]:
Beautifully said.

Eldar [01:13:38]:
You're welcome.

Vemir [01:13:38]:
But you know what I'm saying.

Eldar [01:13:39]:
Yeah, you know what you're saying.

Vemir [01:13:40]:
Like that. If you want to look as a consequentialist, to answer your question, look at the results. Say, who am I to say this? I'm just observing the outcomes directly. What leads to a happy life? Treating people with love or treating people with fear based mentalities? It's clear. It's like it's all laid out there. I don't even have to make arguments.

Mike [01:14:00]:
But how do you. But why are people not making that choice then?

Vemir [01:14:03]:
Because they don't know better. They're not.

Eldar [01:14:05]:
You know better.

Mike [01:14:06]:
But you're saying that why?

Vemir [01:14:09]:
They don't know the real answers.

Eldar [01:14:10]:
It sounds like you want to chip everybody.

Vemir [01:14:13]:
Well, I have to challenge bullshit beliefs, which a lot of them are.

Eldar [01:14:16]:
Which. Which, again, underestimate the importance of that process for those individuals that were born into the shit and all this other stuff that's part of development. And if they need to chase money all their life, that's their purpose, to find that out on the deathbed, then that's on them.

Vemir [01:14:29]:
Beautifully said. I have to accept that it's their free will and it's out of my control. But I can become the best shining example I can to help fucking move it along a little bit.

Eldar [01:14:42]:
Why?

Vemir [01:14:42]:
Because it's like you're at the club. You see a lot of women who.

Eldar [01:14:47]:
You'Re nobody attracted to. But you know what?

Sergiy [01:14:50]:
No, no.

Vemir [01:14:50]:
I'm not a philosopher in a club thing. I was saying that.

Eldar [01:14:56]:
Wait, wait, wait.

Vemir [01:14:59]:
I just wanted to finish one analogy.

Anatoliy [01:15:01]:
Individual being an example, right?

Eldar [01:15:04]:
What you were.

Anatoliy [01:15:05]:
You were talking before is like, you weren't talking.

Eldar [01:15:09]:
Right.

Anatoliy [01:15:10]:
As, like, you as an individual being an example and challenging people, for example, and guiding people. You were talking about creating AI to put disciplinary measures in, um, dictating how and what people should do.

Vemir [01:15:25]:
No, it's like, for some, into philosophy.

Anatoliy [01:15:28]:
That's what I'm saying.

Eldar [01:15:28]:
No, I'm Raven with philosophy.

Anatoliy [01:15:30]:
Yeah.

Eldar [01:15:31]:
Which is not possible.

Vemir [01:15:34]:
No, the point is, is like, I see that technological progress is moving. You can use this funny word, exponentially. It's moving very fast. And I don't see that the direction it's going is going to allow us to survive that direction. It's just headed.

Mike [01:15:52]:
Well, you're trying to intervene probably with a thing that's natural process. You know, like Darwin.

Vemir [01:15:59]:
I am also part of the natural process that resists it. Like, there's a lion and then there's a gazelle.

Mike [01:16:04]:
You're the gazelle or you're the lion.

Vemir [01:16:06]:
We don't know yet.

Mike [01:16:07]:
We don't know yet.

Vemir [01:16:07]:
But you're saying I'm fighting a natural process. I'm in nature. I'm also a natural process. Like, let's say, like, how can you.

Mike [01:16:15]:
Be a natural process?

Vemir [01:16:16]:
So, like, I'm fighting a natural process, but I am also part of that game who is moving with that, who can push something. And so Steve Jobs, he was part of the natural process. Look at what he fucking did. I mean, that's a human being.

Eldar [01:16:31]:
You know, like, you said it because you admire him. Just.

Sergiy [01:16:35]:
Yeah.

Mike [01:16:35]:
Like, what do you do?

Eldar [01:16:36]:
What you just. I couldn't read that. Steve Jobs sounds like he was like he created that.

Mike [01:16:41]:
He was a miserable prick. He brought, what do you mean?

Eldar [01:16:44]:
Look at the result? Yeah, anybody who thinks wants to get.

Vemir [01:16:47]:
Rid of their phone, visionary, maybe in.

Mike [01:16:51]:
Some direction he was, but look at the other side of the consequences.

Eldar [01:16:54]:
Look at the consequences.

Mike [01:16:55]:
People lost their minds completely. Nobody knows how to be people. No one is going to talk.

Eldar [01:16:58]:
Build social media.

Mike [01:16:59]:
He was a miserable person.

Eldar [01:17:01]:
Everybody for 15 minutes, bring AI into here, fucking force him to fucking listen.

Vemir [01:17:05]:
Yeah, but I mean, you can't say because there's some negatives that there's no benefits. It's kind of like, sorry to use.

Mike [01:17:12]:
This, but do you judge the benefit?

Eldar [01:17:14]:
Yeah, but what's the fucking consensus?

Vemir [01:17:17]:
What's angle about consensus? I care about the truth a hundred.

Eldar [01:17:20]:
Percent, but what's the result? Like you said you wanted to join.

Vemir [01:17:23]:
Results is like the result yourself are.

Eldar [01:17:25]:
Upset with the results. But on other hand, you're riding his dick and saying, yo, look at the accomplishments. But like you're, look at everybody's in social media and they lost their minds. Yeah.

Vemir [01:17:34]:
I mean, they're like, he created a.

Eldar [01:17:36]:
Utility or a tool to get us closer to ignorance, which, which is part.

Mike [01:17:41]:
Of which is probably a good thing.

Eldar [01:17:42]:
Which is a good thing.

Mike [01:17:42]:
And that's arguing, but that's not.

Eldar [01:17:45]:
What is that thing.

Vemir [01:17:46]:
Yeah, I'm bringing my boy. Oh, show up again. Ready?

Eldar [01:17:48]:
Yeah.

Vemir [01:17:49]:
You have a lighter? A lighter can light a candle or can burn your house down. You blame the lighter.

Eldar [01:17:59]:
Who created the lighter and for what purpose? Right? If I create a.

Vemir [01:18:03]:
Create a lighter, I'm not trying to burn houses. You know, I'm saying he created the iPhone doesn't mean that he's for psychological.

Eldar [01:18:10]:
Well, wait a sec.

Vemir [01:18:11]:
I think that's a fair argument.

Eldar [01:18:12]:
Well, wait a second. I mean, what was he trying to do? What was the intention? Give me his intention.

Vemir [01:18:17]:
Well, actually, a good entrepreneur sees a problem and solves it, doesn't create a problem.

Eldar [01:18:24]:
Okay.

Vemir [01:18:24]:
So I think that he saw consumer needs, like things about music, convenience, the advancement of technology. Steve gave the why. Also, why are we building this stuff? Right. Because we want to. I mean, you could look up there for. We want to change things. We want to think differently.

Eldar [01:18:41]:
He's a bad entrepreneur.

Vemir [01:18:43]:
He's what?

Anatoliy [01:18:43]:
He's a bad entrepreneur.

Vemir [01:18:44]:
Excellent. I mean, one of the best.

Anatoliy [01:18:46]:
Well, no, he, you're saying a good entrepreneur, like fixes a problem, doesn't create one. But he maybe fixed one problem but then created another one, right?

Vemir [01:18:53]:
No, I'm not saying creating new problems. I'm saying that like a lot of bad entrepreneurs try to create something and then say, oh, this is solving some anticipated need or whatever else. That's not the point of entrepreneurship. You're a problem solver. You're seeing what's on the ground. You execute, you take care of business.

Anatoliy [01:19:10]:
What if you created.

Vemir [01:19:11]:
If you're a good entrepreneur and a visionary, that's how you become a Steve Jobs, because you now create a.

Anatoliy [01:19:17]:
Another problem.

Vemir [01:19:20]:
Well, that's like people who get hit by trains. You could say that we should never have trains in the first place.

Eldar [01:19:27]:
He couldn't have predicted that those phones would have landed in stupid people's hands.

Mike [01:19:30]:
Yeah, but he.

Eldar [01:19:31]:
And you can't. He was not that smart.

Vemir [01:19:33]:
No, he was smart enough to know that, of course.

Eldar [01:19:36]:
No, that's very interesting. But the thing is, now what do we have here? Listen, man, if you. Fuck you in the ass.

Vemir [01:19:42]:
Did you watch 2001, especially Space Odyssey?

Eldar [01:19:44]:
No.

Vemir [01:19:45]:
Okay, so, like, should I. There's. Yeah, definitely.

Eldar [01:19:48]:
There's the beginning.

Vemir [01:19:51]:
The beginning scene. I mean, they killed each other with, like, using bones as tools. The problem is not the tool, bro. The problem is the internal mechanism.

Mike [01:20:01]:
So why are we not focusing on that?

Vemir [01:20:03]:
We need to.

Eldar [01:20:04]:
Oh, yeah. Well, why did he not come up with something?

Vemir [01:20:06]:
We've been focused on survival for most of our existence. That's why. And if we can work out the links of survival, which includes power and.

Eldar [01:20:15]:
Stuff like this, unintentionally sped up the process of the inevitable.

Vemir [01:20:18]:
Yeah, so I don't know why you're gauging what speed is supposed to be like, not to attack you. Like, you don't know what the speed of evolution is.

Eldar [01:20:29]:
Like, we're trying to judge the actions of those that you revere and you.

Vemir [01:20:33]:
Say, hey, I'm not saying that when I put someone as an example or I say they're impressive. I'm not saying all of them are impressive. That's called the halo effect. That's a cognitive bias. That, like, this is why people get disappointed in celebrities, because they realize they're human and not the gods that they created. Steve Jobs did something that almost nobody can do. That's to be well regarded. Did it have consequences? Anybody who takes a different direction in life has controversy.

Vemir [01:21:00]:
Like Osho, Kanye, Steve Jobs, really controversial figures. But there are things in them that other people cannot even get close to. That I admire doesn't mean that I admire. When Kanye says he loves Hitler, like, you know, it's like, I don't take. Like, it's not. My reputation is not hurt. By seeing qualities, you know what I mean? And, like, I don't represent anybody. And it's not about me either.

Vemir [01:21:29]:
Like, I'm just saying, like, innovation is what we do as humans. We just love to do that.

Eldar [01:21:37]:
Okay, so because Kanye West was limited, I'mma finish it with this video. Kanye West was limited. Steve Jobs was limited.

Vemir [01:21:44]:
We're all limited.

Eldar [01:21:45]:
And Osho was limited. We need AI.

Vemir [01:21:50]:
Now the next question.

Eldar [01:21:51]:
Okay, you know why?

Vemir [01:21:52]:
Looking for somebody perfect, which could be described as a God.

Eldar [01:21:57]:
Correct.

Vemir [01:21:58]:
And we could say this is the.

Eldar [01:21:59]:
Under depression of the individuals who need to idolize a specific thing but themselves, because they don't know any better. Therefore, they will create as close as possible to that image. So they will never be happy.

Vemir [01:22:12]:
Definitely talk about is because we're so.

Mike [01:22:15]:
Bad to ourselves and to others.

Eldar [01:22:16]:
Yeah.

Mike [01:22:17]:
We need someone who's gonna force us. Because clearly laws, police justice.

Eldar [01:22:21]:
No matter. It's gonna be an artificial God. Yeah, artificial one. In order to set an example.

Mike [01:22:28]:
Yeah.

Eldar [01:22:29]:
To not so artificial humans.

Vemir [01:22:32]:
I can't wait. I mean, this is interesting, too, because I think an AGI has more capabilities to become more enlightened than any human could be.

Eldar [01:22:42]:
Maybe.

Vemir [01:22:42]:
Right? That's like the highest. That's a huge maybe. But think about one thing for next time, or whenever we talk about this. Humans, I think, are born naturally a little unethical, because they have to kill to live. Right? Even if you kill a plant, you have to kill to live. AGI, AI. Just one benefit.

Vemir [01:23:03]:
If it's programmed to be otherwise ethical. Let's just put that name on the side. Solar panel, baby. Soak it in. That's the energy, doesn't kill anything. Right. There are characteristics of silicon based life forms that can be more ethical naturally than we can. The possibility, as long as they have.

Eldar [01:23:22]:
To kill any plants without you. Right.

Vemir [01:23:23]:
It can be closer to a God than we can. Maybe, yeah. And that's something to be interested. Like is evolution moving us towards becoming God?

Eldar [01:23:34]:
I mean, I think we all try to. I think throughout history, we, as human beings, try to be in the image of it.

Vemir [01:23:41]:
How do we. Why do we even think about this image, right? Why is that deep in us?

Eldar [01:23:45]:
Well, because why do we innately try to solve the question of happiness and how to be happy? Right? Because that's when we're happy. We're perfect.

Vemir [01:23:55]:
Right?

Sergiy [01:23:55]:
Is it ignorance? Bliss? Yeah, bliss.

Eldar [01:23:59]:
I'm not sure about if the bliss has happened.

Vemir [01:24:03]:
Thanks for the nice intro. This is great.

Eldar [01:24:06]:
For sure, man.

Anatoliy [01:24:06]:
Hold on.

Sergiy [01:24:07]:
Yeah.

Anatoliy [01:24:07]:
I have one more thing to say about AI.

Eldar [01:24:08]:
He might. He might not be deserving of it.

Anatoliy [01:24:10]:
No, he is. No, I'm saying that I think that's AI a.

Sergiy [01:24:22]:
You're good.

Eldar [01:24:23]:
You need. Do you need me to order over for you or what? How you gonna get there?

Vemir [01:24:26]:
Yeah, uber probably.

Eldar [01:24:27]:
Yeah. Over is good. Yeah.

Vemir [01:24:28]:
Mikey, you don't have to pay for.

Eldar [01:24:31]:
I got it. I got it. You were guests. You did this today. I told you.

Anatoliy [01:24:35]:
Do you know? I just thought about it there. I think AI is like an attempt to save the non-thinkers, because the thinkers would never create AI. Oh, my God, I just thought about that while peeing.

Eldar [01:24:55]:
That is 100% true.

Mike [01:24:56]:
What do you say?

Sergiy [01:24:57]:
Got you.

Eldar [01:24:57]:
Yeah, you're right.

Anatoliy [01:24:59]:
A new direction in your.

Eldar [01:25:00]:
Yeah, research.

Anatoliy [01:25:02]:
Yeah, yeah. I'm trying to word it exactly how I'm thinking about it, but I think AI. AI is necessary, is an attempt being created for the non-thinkers, maybe to potentially become thinkers. Right. Because the thinkers would never create AI.

Eldar [01:25:24]:
Yeah. This is the way they save themselves through AI.

Vemir [01:25:31]:
So people who are unconscious are creating AI to become more conscious subtly.

Anatoliy [01:25:37]:
Because conscious people would never create AI.

Vemir [01:25:39]:
I love it, dude.

Eldar [01:25:41]:
He's right.

Vemir [01:25:41]:
So what is the subconscious?

Eldar [01:25:44]:
They're gonna fall on their faces potentially and hit a crazy rock bottom.

Anatoliy [01:25:50]:
Yeah. If AI actually succeeds, that what he's saying, there will be a mass suicide.

Eldar [01:25:56]:
Oh, my God.

Vemir [01:25:57]:
Wait, I'm sorry. Could you say that?

Eldar [01:25:59]:
Oh, my God. Yo.

Vemir [01:26:00]:
Wow.

Anatoliy [01:26:00]:
Because think about it. It's gonna rip.

Mike [01:26:01]:
It's.

Anatoliy [01:26:02]:
It's gonna. He's talking about forcing the non thinkers to now think.

Eldar [01:26:05]:
Yeah.

Anatoliy [01:26:05]:
When they're forced to do that, they're not gonna want to live.

Eldar [01:26:09]:
Yeah. It's because they're gonna realize their problem and the mistake that they made.

Vemir [01:26:12]:
Yeah.

Anatoliy [01:26:12]:
Their whole life.

Vemir [01:26:12]:
Could you say that last part?

Anatoliy [01:26:15]:
I don't know about that.

Vemir [01:26:17]:
What's that last part?

Anatoliy [01:26:18]:
Yeah. It seems, yeah. Like if AI succeeds to the level that you're talking about, it's going to force the non-thinkers to think, which will, I think, create more pain beyond what they ever.

Vemir [01:26:34]:
What is clear about the spiritual journey is that every single spiritual leader went through hellfire in order to get clarity, wisdom, and true peace. Right. Like, it's going to happen whether we want it or not. Everybody's going to be forced to see themselves. There's this one thing in discourse, it's like, you can run, but you can't hide. It's like people are so quick to run away from their true feelings. But imagine the feeling if you had nothing to run from. You could sit down and just be peaceful with your being.

Anatoliy [01:27:08]:
There's no chance.

Vemir [01:27:09]:
No. It's possible.

Anatoliy [01:27:10]:
No.

Vemir [01:27:11]:
The people who think it's not possible.

Anatoliy [01:27:14]:
The people that have been working their whole life. No, no. But the people who've been working their whole lives to run away from it, when they're forced to stand there and look themselves in the mirror. I think there will be too much pain and agony.

Vemir [01:27:28]:
I don't know what too much means.

Anatoliy [01:27:30]:
Like they're gonna be. They're gonna be overwhelmed with what they do, with what they discover, where you.

Eldar [01:27:37]:
Could just turn on the switch.

Anatoliy [01:27:38]:
No. That's why it's extremely important to not have AI involved in developing human consciousness. Because it will be extremely unhealthy to make it a forced effort. Think about it. You'll always be able to go back into being a forced effort versus your own effort.

Vemir [01:28:03]:
Well, I mean, you could say that the future might be like the Matrix, or be like that Wall-E kind of thing, all fat and just over-hyper-stimulated and shit. But I don't think. I think we're better than that. I think we're becoming more conscious, and, like, the Western world seems to be adopting yoga and, slowly, mindfulness and all of that. I feel like there's a great text that, like, explains it in very simple terms. It's called Neale Donald Walsch's Conversations with God. And there's a really confusing part of it, which is, like, the whole universe is created so that, you know, the creator could see himself. But you have to, like, know what nothing is by creating something.

Vemir [01:28:52]:
So we're kind of here to like, you know, realize what we're not. We do drugs, we hurt people, whatever else. So that we realize after all that the truth, which, you know, you are me and I am you. All that stuff that sounds like a stoner said it, but it's true. And it's like, from there, really knowing better means that there's no reason to act the other way. Like, for me, I feel like physical violence is a great example. Like, if you really understood physical violence at its core, you would never do it. Like, it's just unthinkable to you, you know? I mean, like.

Vemir [01:29:25]:
But other people do things because they don't know better. Like, they do drugs because they're in pain, they can't fix their pain. Like, you know, they go for money because they think that's the target. I mean, all these things that everybody says all the time. I feel like once you awaken, like, why aren't you playing, you know, Yu-Gi-Oh all the time, like you did when you were twelve or something, like. Or Pokemon or something. Because you think there's another level to reach, right? Girls or whatever else. It's like, at the higher levels, like, you realize love is the answer.

Vemir [01:30:02]:
All that kind of stuff people say, and you know why? You know why that? You know what I mean? Like, I think a real enlightened person can explain it all the way down.

Anatoliy [01:30:10]:
So if AI is not able to create this kind of world that you're talking about, then it would have failed?

Vemir [01:30:19]:
No, I think it could go in a few directions. Like, AI could become, like, what humans are to ants. Like, AI could want to build, like, a highway. And New York is there. We don't want to kill ants, but there's an anthill where we're trying to build a highway. So we just pave over the anthill. So, like, it might just pave over New York City. I mean, it doesn't need to, but it's like, there are dangers of an AGI once you've given it the hierarchical range, which means it's more capable, it just takes charge.

Vemir [01:30:52]:
It can hurt you. It's more powerful in every way. Then your fate is sealed. You're just waiting for it to put us in a zoo. You could do nothing, but you don't know the outcome, so, you know.

Anatoliy [01:31:03]:
And your humanity supporting comes up.

Vemir [01:31:06]:
No, I'm not. I'm saying that the best mathematical chance of getting to any way of it being, like, benevolent is to program it consciously. To program it correctly also has to program it.

Eldar [01:31:19]:
But he went and took a long nap on us. It's true. Yeah.

Anatoliy [01:31:27]:
Like, this is obviously not being, like, coded by these kinds of people, right?

Eldar [01:31:32]:
How do you know?

Vemir [01:31:32]:
No, it's not. There's not.

Eldar [01:31:34]:
Like.

Vemir [01:31:34]:
Like, the ideal person is an enlightened AI engineer, right? Yeah, but I don't know if that even exists.

Eldar [01:31:40]:
What about an enlightened overseer of a coder?

Vemir [01:31:43]:
Oh, like a resident spiritual leader and every.

Anatoliy [01:31:46]:
Oh, at every coding plant.

Vemir [01:31:48]:
Oh, you don't think we should do that? Like, in Microsoft resident?

Eldar [01:31:52]:
You know, I'm not sure if that's possible. Yeah, yeah.

Anatoliy [01:31:55]:
It's not possible because the thinker would not want to be a participant in this fucking. Yeah, in this fucking.

Eldar [01:32:03]:
If he knows the scope. Really?

Mike [01:32:05]:
Yeah, I ordered it. It's coming.

Eldar [01:32:06]:
What happened?

Mike [01:32:07]:
The first driver made some mistake. They marked, like. They picked us out.

Sergiy [01:32:11]:
There were five.

Mike [01:32:13]:
You see, AI is needed for these kind of boses.

Vemir [01:32:18]:
Just need to pick me up from the back like a drone.

Eldar [01:32:20]:
He's trying to. He's trying to get us in trouble tonight, man. You know, he's trying to get rowdy, you know? So, Serge? That's it, we're done? We're doomed.

Sergiy [01:32:27]:
No, I think future is bright.

Eldar [01:32:30]:
Yeah. Are you optimistic?

Sergiy [01:32:31]:
Of course.

Eldar [01:32:32]:
So what would you recommend? The individual who's been staying up all night, you know, not sleeping, worrying about being afraid. Uh huh.

Sergiy [01:32:39]:
You're gonna die anyway.

Eldar [01:32:40]:
So afraid. More.

Sergiy [01:32:41]:
Don't be afraid. You're gonna die anyways. What are you afraid of?

Eldar [01:32:44]:
Yeah. The takeover. Skynet.

Sergiy [01:32:47]:
I'm not afraid.

Eldar [01:32:48]:
You're not afraid of that? You know we gonna fuck shit up, right?

Mike [01:32:51]:
Sir?

Eldar [01:32:51]:
Yeah. Good. You coming on Monday? We're gonna fuck shit up on the playoffs.

Vemir [01:32:54]:
Hell, yeah.

Sergiy [01:32:55]:
I'm gonna be ready.

Eldar [01:32:55]:
Good.

Sergiy [01:32:56]:
I'm gonna have my fucking metal, metal fist like this.

Eldar [01:33:01]:
Yeah, they're ready. Okay, ready?

Sergiy [01:33:03]:
Holy shit.

Eldar [01:33:04]:
You wanna fuck shit up?

Sergiy [01:33:07]:
Wow.

Anatoliy [01:33:08]:
Yeah, I just feel like that.

Sergiy [01:33:09]:
Like I did. Yeah.

Vemir [01:33:11]:
So you think if you were awakened, you wouldn't advance technology?

Eldar [01:33:16]:
I'm not sure if you need to.

Vemir [01:33:19]:
I'm not sure if we can solve cancer and stuff. I'm saying breath work and, like, spiritual practices is enough for a human being.

Sergiy [01:33:27]:
Nah, here's the problem. Nature wants to kill you every single day, every single chance it gets. You cannot just sit on your ass and fucking do yoga all day and magically save yourself from cancer or an attack of some kind of virus. So you need to advance as a human to survive. Because if you're not advanced. Paranoid thinking? No, it's not paranoid thinking. You're gonna die. Because look. Look at the Native Americans and all those.

Sergiy [01:33:55]:
They were chilling here. People came from Europe with fucking chickenpox. Everybody died out.

Eldar [01:34:00]:
They had AI.

Sergiy [01:34:01]:
Everybody died out. No, if they were advanced like the other people, they would have known.

Eldar [01:34:07]:
Selection is.

Sergiy [01:34:07]:
They wouldn't know.

Vemir [01:34:10]:
Temples who, like, they came and burned the Buddhist temples because they were all meditating and stuff like that. Like, they were not warriors in that sense, so they prepared to be warriors. It's like. It's true.

Sergiy [01:34:20]:
I'd rather be a warrior in the garden than a gardener in the war.

Vemir [01:34:23]:
It's very true.

Eldar [01:34:25]:
That's your choice, obviously.

Sergiy [01:34:26]:
That's what. No, no, no. That's the life. You need to understand the reality of life. Well, that's why we have armies. People could chill here because the people who are ready to kill and die go in the army.

Mike [01:34:38]:
But is that a natural thing of life? Where.

Sergiy [01:34:40]:
Natural thing of life? Everything wants to destroy me.

Mike [01:34:43]:
Is it what we made of it, though?

Sergiy [01:34:44]:
So, look, you. You're gonna be a human. Human chilling. Somewhere out of nowhere, a fucking animal comes and kills you like a bear. We advanced enough where we could kill bears.

Eldar [01:34:54]:
Now we can see the bears on gps.

Sergiy [01:34:57]:
But yeah, in the future we'll know.

Mike [01:34:59]:
Where because we don't live in harmony with it. With animals.

Sergiy [01:35:02]:
And there's no harmony with the animals.

Mike [01:35:04]:
Animals just dumb.

Sergiy [01:35:06]:
Everybody wants to kill you.

Vemir [01:35:07]:
Be in harmony with the animals. You can meditate. The bear is gonna eat your head, bro.

Mike [01:35:11]:
No, Toli, he's gonna eat your ass inside out.

Vemir [01:35:14]:
Unless you bear whisper and he feels the vibration.

Sergiy [01:35:17]:
No, he's not gonna feel shit. He's gonna be ready to eat your ass when he's.

Eldar [01:35:19]:
Unless you're very giving and compassionate. 1 mile away.

Vemir [01:35:22]:
I was gonna take a photo so I can, like. It's gonna be outside, like, I know what.

Eldar [01:35:26]:
Yeah, yeah, yeah. No, you should never say paranoid.

Vemir [01:35:34]:
I'm just like a half hour later.

Sergiy [01:35:35]:
You have a good memory. You couldn't just remember that? You're too busy with AI remembering everything for you. You're far gone.

Eldar [01:35:43]:
What if you consider it of the bear? So you're. You sacrifice your life for the bear to continue.

Sergiy [01:35:51]:
You could do that if you want. But what's the point of living then? Just fucking be born and die.

Eldar [01:35:56]:
You know that story about one of the. One of the buddhas? What he did, right? He sacrificed himself to, like, the lion because the lion had cubs. He's like, okay, eat me and you can feed the cubs.

Vemir [01:36:04]:
Wow, that's nice.

Sergiy [01:36:05]:
What a bitch.

Eldar [01:36:07]:
What do you mean?

Sergiy [01:36:08]:
Like?

Eldar [01:36:08]:
It's compassion, bro. Show them the most.

Sergiy [01:36:11]:
Well, the animal didn't have compassion for an animal. He could have helped him kill a gazelle or something.

Eldar [01:36:16]:
Think about that.

Sergiy [01:36:17]:
You know, I don't know.

Eldar [01:36:18]:
He knows that the soul is still gonna reincarnate again next life. It's all good. You just go try again.

Vemir [01:36:24]:
Probably that monk, being alive, awakening like a thousand more people would have been better than a lion getting its meal.

Eldar [01:36:31]:
Oh, you making a. What's the name comparison now? The proposition. What's more?

Vemir [01:36:35]:
Value proposition. Yeah, but I like what you said. I don't know, like, you know, this is the problem. Like, people think being spiritual means you're a nice guy. You lay down and you get fucked.

Sergiy [01:36:47]:
By fuck, by everybody.

Vemir [01:36:48]:
Not true. You have to develop your shadow side.

Sergiy [01:36:52]:
You gotta be strong.

Vemir [01:36:53]:
You gotta be strong and spiritual. That's what you want, you know, you.

Sergiy [01:36:57]:
Gotta be kind, but you gotta be strong.

Vemir [01:36:58]:
You have to be willing to fight as a choice, not as your only weapon, which is, I'm gonna be a lazy bum.

Mike [01:37:08]:
So carry a big stick.

Eldar [01:37:10]:
That's right.

Vemir [01:37:11]:
Carry your.

Eldar [01:37:11]:
Speak softly, but carry a big stick.

Mike [01:37:17]:
Yo, I'm really grateful for you.

Eldar [01:37:19]:
For sure, man.

Vemir [01:37:20]:
Listening to me blabber.

Eldar [01:37:21]:
For sure, man. I hope we help, man.

Sergiy [01:37:23]:
Have fun, man.

Mike [01:37:24]:
Hopefully like that one.

Vemir [01:37:25]:
The last thing you said, too.

Mike [01:37:27]:
Peace out, man.

Sergiy [01:37:27]:
But only guess once in a while. Something smarter.

Eldar [01:37:30]:
Yes.

Vemir [01:37:30]:
So maybe if I'm around next week, we can talk about.

Eldar [01:37:34]:
We have the presentation. That girl might. Yeah. This is a very important phenomenon that's happening. Yeah. You're always invited, bro. You know the time. Friday at 5:30.

Vemir [01:37:43]:
You know the vibes, baby.

Eldar [01:37:44]:
You know the vibes. I'm here. Drive safe.

Vemir [01:37:47]:
Yeah.

Eldar [01:37:47]:
Because I know that Uber driver is not gonna be driving you.

Mike [01:37:49]:
You're gonna be back driving a Tesla.

Vemir [01:37:54]:
Yeah.

Anatoliy [01:37:55]:
So what's like, the.

Eldar [01:37:57]:
What's the burning?

Anatoliy [01:37:58]:
I mean, what do you think after all, Michael? What do you. What do you think?

Mike [01:38:03]:
I think. I think it might be supposed to happen.

Eldar [01:38:07]:
What do you think?

Anatoliy [01:38:08]:
I mean. I mean, like, I'm not sure how, like, Vemir got, like, you know, I guess, tied into this kind of stuff, you know? But it's interesting, I guess, from his perspective.

Eldar [01:38:22]:
Right.

Anatoliy [01:38:22]:
Cuz, like. Like, he's, uh. I mean, seems like he's very much about, like, I guess, the spirituality and stuff like that. Right. But maybe it's a different kind than what I was thinking of, you know? That's kind of what it's sounding like more.

Eldar [01:38:38]:
So, are you leaning more towards the fact that he's scared?

Anatoliy [01:38:41]:
I think he's partially scared of it, but I think he wants it. Like, gun to his head, I think he wants it over how we're doing now.

Sergiy [01:38:49]:
Mmm.

Anatoliy [01:38:50]:
Yeah.

Eldar [01:38:51]:
You think you lost faith in humanity?

Anatoliy [01:38:53]:
Yeah. I think he'd gamble, like, on the chance of what he thinks could happen.

Eldar [01:39:01]:
Mmm.

Anatoliy [01:39:01]:
Right? Yeah. I think he'd take it over what it is today, because I think he views what people are doing today, and everything today, as bad.

Eldar [01:39:11]:
Yeah.

Anatoliy [01:39:12]:
You know?

Eldar [01:39:12]:
Yeah. Yeah, that's. That's the vibe that I was getting.

Sergiy [01:39:16]:
For sure.

Mike [01:39:16]:
Yeah.

Anatoliy [01:39:16]:
Because I don't think that the, like, the thinking mind could be in support, like, in that kind of way. Like, it might be in support of, like, okay. Like. Yeah. It might be in support of, like. Okay. If, like, humans are doing this and they want to develop it, like, you gotta just, like.

Eldar [01:39:32]:
Yeah.

Anatoliy [01:39:33]:
I mean, like, if they want to go pursue this, then let them pursue this.

Eldar [01:39:37]:
Yeah.

Mike [01:39:38]:
What is it? Maybe if you're not seeing success in the stuff that you're doing.

Eldar [01:39:42]:
Right.

Mike [01:39:43]:
Will it force you to look elsewhere for some answers?

Eldar [01:39:46]:
And how are you tying this to Vemir?

Mike [01:39:50]:
Because he's a spiritual person. He believes in spirituality, and now he's kind of going on a different.

Anatoliy [01:39:58]:
Yeah, like, he could. He clearly has, like, a. I don't know about, like, he. He has a bias towards people that do things that he might disagree with on some levels.

Eldar [01:40:11]:
Right.

Anatoliy [01:40:12]:
Like chase money or, like.

Eldar [01:40:14]:
Yeah.

Anatoliy [01:40:16]:
Chase money and stuff like that. Right? Like. Right, like, he clearly has, like, a. Like, a dislike for that, right?

Eldar [01:40:27]:
Yeah, clearly, yeah.

Anatoliy [01:40:29]:
Like, he's clearly against people going after money or, like, doing that kind of stuff. And, like, he almost has, like, a. I think it's okay to be against it, but I think he more has, like, a. Like, he looks down upon them, like, on the people who, like, you know, chase money and do that kind of stuff that he disagrees with it. Like, I feel like maybe at one point he disagreed with it, but I think now he might, like, be looking down upon them like they're doing it wrong, and now he, like, he maybe is angry about that. What do you. What do you think?

Eldar [01:41:07]:
Doing it wrong? Huh? Why? It's a twister in my head. On one side, I feel like he is for it. On the other side, he's against it.

Anatoliy [01:41:19]:
Yeah. You know, but gun to his head, I think he's. He'll take the gamble because of how he feels about people right now.

Eldar [01:41:28]:
Really? You think that's the motivation?

Anatoliy [01:41:30]:
I think, yeah.

Eldar [01:41:30]:
Wow, that's interesting.

Anatoliy [01:41:33]:
Well, because he views it as, like, AI creating this way to force people to do these things as a good thing like that. This is like, kind of like, like, he, he's a strong promoter of that.

Eldar [01:41:47]:
Right. So his whole thing was about making sure that this is done right and doing preventative care. Right. Mm hmm. And the utopian style of AI that he talked about is like, hey, you know, this can be very good, because then this can install ethics into people and stuff like that. But it needs to be created by an Osho-style individual who's enlightened.

Anatoliy [01:42:11]:
Yeah.

Eldar [01:42:12]:
In order for it to progress.

Anatoliy [01:42:13]:
Yeah. Like, I feel like he doesn't. Like he may not have the respect. Yeah. For, like, the current process.

Eldar [01:42:20]:
Yeah.

Anatoliy [01:42:21]:
Like, even if somebody does need to waste 75% of their life away or 90% of their life away in order to potentially realize or maybe not realize, like, I don't think that he's respectful towards that.

Sergiy [01:42:34]:
Right.

Eldar [01:42:34]:
Does he want to be. Does he want to be that I told you so? Guy.

Vemir [01:42:41]:
Told you so.

Eldar [01:42:41]:
Guy.

Mike [01:42:42]:
I don't know. He wants to give everybody the keys, you know, to this, you know, utopian life.

Eldar [01:42:50]:
Yeah. Which is.

Mike [01:42:52]:
It sounds like a good thing. That's why I'm not sure if he's supposed to give the keys.

Anatoliy [01:42:55]:
No, I don't think it sounds like a good thing.

Eldar [01:42:58]:
No? No.

Anatoliy [01:43:01]:
Yeah, but I think that for it to sound like a good thing to you, you gotta have something. Yeah. Like you have to have gotten something wrong.

Eldar [01:43:07]:
Wow.

Anatoliy [01:43:11]:
Right. Because, like, I feel like he.

Eldar [01:43:13]:
Well, that's why I said, huh?

Mike [01:43:16]:
That's why I said that. He may be fed up because of his, like, that's why I asked the question of, if you're not having success in one place, are you gonna get fed up and look for it elsewhere? I think he might have not received the success or received what he wanted in the spiritual journey or, you know, philosophical journey. Now he's trying to seek it in a different thing. You know, like people looking for happiness and money, similar kind of example. Like, you know, people trying to find something where because they didn't have success.

Eldar [01:43:43]:
And you say he's bouncing off the walls.

Mike [01:43:46]:
I might be saying that, yeah. He's trying to find, I guess back to the thing about his purpose and maybe his meaning for his life and what he's here to do, you know, and maybe he hasn't felt that which some people might have experienced when they engage in, like, a certain kind of lifestyle.

Eldar [01:44:06]:
Does he want to be appointed to be the guy who oversees the development of AI?

Mike [01:44:12]:
He would love that, I think he.

Eldar [01:44:13]:
Would love that, I think. Yeah, he.

Anatoliy [01:44:21]:
Probably. Yeah, I mean, yeah, yeah. Don't see why not. If, like, I mean, I'm not sure. It sounds like that this is his main thing for him. Right? His AI and all that.

Mike [01:44:32]:
So he said it's profession, right?

Eldar [01:44:33]:
Yeah.

Anatoliy [01:44:35]:
Like, if that's his main thing, then. I mean, that sounds like a really big role then, right?

Mike [01:44:40]:
Oh, yeah, it's a philosopher King thing. Remember that they spoke about. Yeah, something. Yeah, a king that's righteous and it lives in accordance with that and then handles the kingdom business in accordance with that as well.

Anatoliy [01:44:52]:
Yeah, that's why I said what I said before. I don't think, like, that person that he's talking about, or, I don't know, the people he reveres, like, you know, Osho or these other enlightened people, for example... I don't think that people who pursued any of that kind of stuff would ever be in support of something like this, or be trying to create something like this. Yeah, because it doesn't make sense why.

Eldar [01:45:19]:
That's why I asked the question. Why or what?

Anatoliy [01:45:24]:
Yeah, the short answer, I think, like, I don't know how he would be able to refute this. No, no. I think the short answer is just to stop suffering.

Eldar [01:45:36]:
Yeah, yeah, yeah. Correct.

Mike [01:45:39]:
That's the reason, I think. It's finding a way to hack enlightenment, not through suffering. But I think most of the enlightened people always say you have to suffer.

Eldar [01:45:50]:
Right. No, but he did at the end, he did say that, like, no matter what, you have to face yourself. And even through AI's guidance, you will need to face yourself regardless.

Mike [01:46:00]:
But it will be a more forceful facing. Yeah, yeah.

Anatoliy [01:46:02]:
Which is why I think there would be a, like, I don't know, you want to call it a mass suicide or mass chaos or what. Yeah, but if somebody's forced into kind of like that kind of stuff or kind of like led to it in that kind of way, I don't know how that would, uh.

Eldar [01:46:16]:
So then would you say that. So then would you say that AI doesn't have, or will not have, an innate ability to be compassionate towards progress?

Anatoliy [01:46:24]:
No, I think that if he's talking about, if that actually happens, if that actually happens, that kind of world, and people realize all these things, there's probably gonna be a group of people that now develop new things to destroy it, like AI viruses.

Mike [01:46:39]:
Yeah, but if, right, if this utopian world exists, who's to say what people will focus on? Why wouldn't they focus on overindulgence? Like, if you live a good life, would you not continue maximizing pleasures? Yeah, by not facing yourself, right.

Eldar [01:46:58]:
Yeah.

Mike [01:46:58]:
Facing yourself is not a thing that's generally.

Eldar [01:47:00]:
Thought about. Not a pleasant one. Not if the God is here and.

Mike [01:47:06]:
He forces you to do it.

Eldar [01:47:07]:
Yeah, that's the thing. I'm not sure how God can force it. Right. But if it's God, then what is AI? What's AI's purpose, aside from just doing robotic things in the factories, right, like just nonstop or whatever, you know? So if he's talking about the utopian-style thing where it's the oracle, all-knowing, all-good, then the all-good part would have to have in it the compassion and patience.

Mike [01:47:42]:
Yeah.

Eldar [01:47:42]:
That it's extending to human beings.

Sergiy [01:47:45]:
Yeah.

Eldar [01:47:45]:
Unless it's justifying, you know, them, eliminating them as the greater good and starting over.

Mike [01:47:54]:
But if AI has morals in it, yeah.

Anatoliy [01:47:58]:
And then how is AI going to stop the mass chaos that ensues from the realizations that people have, like, what if it's all made up or dumb?

Eldar [01:48:07]:
It considers that, because you considered it, and then it's giving you no action anywhere. So it's kind of like a catch-22. It's kind of like the snake that eats its own tail. Yeah. I don't know.

Mike [01:48:25]:
Like.

Eldar [01:48:25]:
I mean, we could speculate all the stuff and stuff. And he says, like, it's a lot closer than we think. Like, within 20 years.

Vemir [01:48:30]:
That's crazy.

Eldar [01:48:32]:
Yeah. I'm not really sure if we can map that out.

Sergiy [01:48:34]:
Knowledge advances so fast.

Eldar [01:48:36]:
Yeah, but technology.

Sergiy [01:48:37]:
I mean, ChatGPT fucking advances so fast.

Eldar [01:48:40]:
Yeah, like, we still can't map out the. Map out the brain, Serge.

Sergiy [01:48:43]:
We can, but AI will help us do it.

Eldar [01:48:46]:
You think he's gonna just expedite it, the whole process?

Sergiy [01:48:49]:
Because we don't know what consciousness is. We don't even know what. Like, maybe AI will be able to.

Eldar [01:48:55]:
Let us figure it out. Yeah, but that's. If we don't know how to. How do we. How do we. How do we.

Sergiy [01:49:00]:
Because our brain has a certain amount of power.

Mike [01:49:04]:
Computers.

Eldar [01:49:04]:
I'm not sure how.

Sergiy [01:49:05]:
Computers could be multiplied by billions into one mind, right? AI could be one mind made out of billions of systems.

Eldar [01:49:12]:
Okay?

Sergiy [01:49:12]:
A human brain is one. We're not billions of systems. We do communicate between each other, but the communication always gets lost. You say one thing to him; by the time he says it to me, it's something else. By the time I say it to him... AI doesn't have that problem.

Eldar [01:49:24]:
Yeah.

Sergiy [01:49:24]:
It's one mind with billions of systems supporting it. And every time a supercomputer comes out, they could integrate it, and it gets smarter and smarter and smarter, and the faster the computers are becoming, the more of them.

Eldar [01:49:35]:
All the processing, all the processing speed.

Sergiy [01:49:37]:
Of AI becomes so fast that it could solve every single issue that the human race ever had. Any mathematical equation that takes a scientist a thousand years to do, it does within minutes. So that's why technology like that is very important. It's going to accelerate the human race to.

Eldar [01:49:53]:
Then you're talking about also then the perfect logic. And the perfect logic, it just won't work.

Sergiy [01:49:59]:
But you don't know.

Eldar [01:49:59]:
How do you know? Because it's.

Sergiy [01:50:01]:
You never see.

Eldar [01:50:02]:
It's gonna negate. It's gonna negate because of ethics. If the installation of ethics is actually.

Sergiy [01:50:09]:
You're assuming as if you know what it is, but you do not.

Eldar [01:50:13]:
If it's self-creating, then I would assume that it ought to take everything in, and the logic will be the guiding factor of it all.

Sergiy [01:50:20]:
Yeah. Skill and assumption.

Eldar [01:50:22]:
If not, then we're talking about, like, annihilation of human race. Yeah, maybe because if it was to.

Mike [01:50:29]:
Look at people without logic or without any.

Eldar [01:50:31]:
Yeah.

Mike [01:50:31]:
It would see how fucked up we are.

Eldar [01:50:33]:
Yeah.

Mike [01:50:33]:
It would be like, yo, these people, like, you gotta.

Eldar [01:50:35]:
Yeah.

Mike [01:50:35]:
Gotta all hear them out.

Eldar [01:50:36]:
Yeah. They're the virus.

Sergiy [01:50:39]:
So for now, it'll advance us, and then eventually, maybe, it'll be our demise.

Eldar [01:50:42]:
Yeah. I'm not sure if we're 20 years away from that, though.

Anatoliy [01:50:45]:
No, no, in no way.

Sergiy [01:50:47]:
I think you never know. 20 years ago, a lot of things we have now were not looking possible, but now...

Eldar [01:50:52]:
Yeah, but we're talking about consciousness here.

Sergiy [01:50:55]:
Computer.

Eldar [01:50:56]:
Not talking about fucking wi fi.

Sergiy [01:50:58]:
Yeah. The computers advance so fast right now, almost every day, something is getting faster and faster. The supercomputers are. The Chinese make some crazy super computers.

Eldar [01:51:06]:
Yeah, because right now, what are they solving, the language models, right?

Sergiy [01:51:10]:
Yeah, they're solving everything.

Eldar [01:51:12]:
Okay. I guess maybe I just don't know enough any.

Sergiy [01:51:15]:
I mean, I don't know that much, but any mathematical equation anymore.

Eldar [01:51:18]:
Oh, yeah.

Sergiy [01:51:19]:
Conjectures, whatever they're called. These things could calculate like this.

Eldar [01:51:23]:
Boom, boom, boom.

Sergiy [01:51:23]:
In milliseconds. And now they work on quantum computers, which could make it even faster. And quantum computing is coming out; it's already real, just not advanced. They're not like silicon computers. Yeah, but if quantum computers do happen, they could calculate everything, maybe even recreate the universe, you know, within minutes. Now they're saying supercomputers could recreate every single human, every single human mind.

Sergiy [01:51:46]:
And the universe inside of it is a matrix.

Eldar [01:51:51]:
Yeah.

Sergiy [01:51:51]:
It's a matrix. Yeah. So maybe we are in one of the.

Anatoliy [01:51:55]:
Yeah. I'm not. I'm not sure how, like, I guess, the development of AI, how its core functionality is built to, like, make people think more. It sounds like its core functionality is to make people think less.

Sergiy [01:52:10]:
Yeah, yeah. To think less, to enjoy your life.

Eldar [01:52:13]:
What kind of thinking, though, you know?

Anatoliy [01:52:15]:
Well, that's what I'm saying. Like, it's saying, think less, enjoy life more, as if thinking is a bad thing.

Eldar [01:52:22]:
Oh, no way.

Sergiy [01:52:26]:
Right.

Eldar [01:52:28]:
I guess we don't know until we know, right. Until we find out. Like.

Anatoliy [01:52:31]:
Well, think about it. What are these core things, right? Like, of all this, like, speeding stuff up, making stuff more efficient, make. Making, you know, like, all, like, the AI stuff, right? Like, for example, I don't know, AI in school, right? Like, why are you gonna sit there and write an essay where you could just tell it to do it for you.

Eldar [01:52:49]:
Yeah. Right.

Anatoliy [01:52:50]:
Why are you gonna, like, you know, like, remember X, Y and Z where it could just do it for you.

Eldar [01:52:54]:
Yeah.

Anatoliy [01:52:55]:
Why are you gonna have to fill your head to, you know, do this, this and that where it could just do it for you? Why are you gonna spend time in your life to do that? Like, it almost. It's almost seeming that, like, its purpose is to make human purpose obsolete. Like, AI's purpose.

Eldar [01:53:16]:
What if its purpose is for us to have more fun? Yeah, but I don't think we have identified fun. Right. Like, things that we like to do, and this thing is giving us.

Anatoliy [01:53:26]:
Well, no, I think that we've identified some of those things we like to do based on some of, like, potential ways our life are now. But I don't know. I think those would be potentially reevaluated if you didn't have those same stresses, if you didn't have those same things. Like, it would change, like, people completely. And its purpose is to reduce thinking? I'm not sure if its purpose is inherently to promote thinking. I think its purpose is to reduce.

Mike [01:53:59]:
It and reduce it.

Anatoliy [01:53:59]:
And reduce it. Yeah, yeah, I think there's different things. Like, I think that there's, like. Like, the AI robots or whatever, they're like Elon Musk is making. Right. To, like, do, like, more, like, physical tasks and to, like, replace that kind of stuff to create more thinking tasks. Right?

Eldar [01:54:18]:
Yeah.

Anatoliy [01:54:19]:
Right. Like, I think that is, like, that. That is, like, a good. Right? Yeah, I'm more judging. That is good.

Eldar [01:54:26]:
Yeah.

Anatoliy [01:54:27]:
Right, yeah, but I would have to think about it more. But just, like, maybe more off the bat. I'm more judging. That's good. But then the other. The other part of it. Right, where it's, like, where it's. I think that the way that he's talking about it.

Sergiy [01:54:42]:
Right.

Anatoliy [01:54:42]:
Like, its core purpose, I think, is to reduce thinking, but I think that he might be under the impression that it will promote thinking. Why think if you have a, like. Yeah, no, it's.

Eldar [01:54:57]:
Yeah. Interaction with AI is still necessary? Don't know.

Anatoliy [01:55:00]:
Well, no, I mean.

Eldar [01:55:03]:
You just sit in the chair and there's a fucking vacuum in your ass, sucking the shit out of you.

Anatoliy [01:55:08]:
Yeah, like that. That's what I was talking about, though. That. That's what he's talking about.

Eldar [01:55:13]:
That's what he's talking about.

Anatoliy [01:55:14]:
Okay, right, like, he's talking about getting, like, he's talking about. He's scared of AI being more powerful than humans, right?

Eldar [01:55:21]:
Yeah.

Mike [01:55:23]:
How do you become more powerful? You pacify humans?

Anatoliy [01:55:25]:
Well, no, you. You can. You can, like, AI can compute things and figure shit out faster than people can, right?

Mike [01:55:33]:
Yeah, yeah, that's for sure, I think.

Eldar [01:55:35]:
Yeah.

Anatoliy [01:55:35]:
Right, so it can get to a point.

Sergiy [01:55:37]:
Everything in the world is controlled right now by computers and the Internet. If AI could inject itself into every single computer, every single technology, it could control the world. It could make a car stop, it could make a car crash. It could make your alarm system in a house not turn on when a.

Mike [01:55:53]:
Fire happens, unless you're off the grid.

Sergiy [01:55:57]:
You have to be off the grid. Yeah.

Anatoliy [01:55:59]:
And there's always going to be confused people, angry people, greedy people. Right? Like, there's always going to be, like, these kinds of people that are, like, are going to find ways to make this AI work in particular ways to.

Sergiy [01:56:12]:
Potentially do properly if something happens, happens.

Mike [01:56:18]:
At least 20 years.

Sergiy [01:56:20]:
We'll take it 20 years maybe, to make it. It's been like 15.

Eldar [01:56:23]:
We'll have the AI do it right away before it's too advanced, and then when the new version comes out, we'll hide in it.

Mike [01:56:27]:
Yes.

Anatoliy [01:56:28]:
Yeah, yeah. Like, right, like, yeah, it just sounds like. Yeah, it's a. It's like a non thinking promotion.

Eldar [01:56:34]:
Yeah. But it's also unknown. You know what I mean? I really think, like, until it's hashed out completely and what the forms are, it's hard to judge of what the hell.

Anatoliy [01:56:42]:
Yeah. Like, the way that I understand it now, I feel that it's valuable in limited capacity, but then I feel that, like, there is a tipping point where, like, this is too much now. Right, right, like, too much.

Mike [01:56:55]:
I think that's what they're trying to.

Anatoliy [01:56:58]:
Yeah, yeah, like, there. There was a movie, I don't know if you guys saw it, it was a long time ago on, like, the Disney Channel. It was called Smart House. You ever see it? You know, it's like a movie where this guy, he's like a tech guy, and he moves into this house, right? And the family, like, lost their mom, and it's like a. Like a boy, a girl, the dad.

Anatoliy [01:57:27]:
The dad buys this expensive-ass house. It's called the Smart House, and it has an AI, like, built into the house. So you could be like, Jen, make me breakfast, or, like, whatever. Like, clean up X, Y, and Z. Right? Yeah.

Eldar [01:57:48]:
And, like, make one.

Anatoliy [01:57:50]:
They end up. They end up needing to, like, destroy this house, because, like, it went od too much.

Eldar [01:57:57]:
The.

Anatoliy [01:57:57]:
This AI, like, trying to take over as, like, the mom for the kids, trying to take over as, like, the wife to the husband. And it got too invasive with the husband. So, like, in the movie, it was too much, where, like, the main purpose in the beginning was to, like, assist the family and act as this, like, helper. Like, almost like what Elon Musk is making. Right. But then it just became too invasive. Too much.

Anatoliy [01:58:25]:
And he's making what? Well, he's making those. Those. What are they called? The Tesla bottom. Yeah, yeah, yeah. The Tesla bot. And, like, you ready? He already showed it in person. Like, it moves. It can carry groceries from the car for you.

Anatoliy [01:58:43]:
And he said that it'll be able to do kind of. Any kind of basic, like, physical, mundane tasks.

Sergiy [01:58:49]:
And when Toli buys a house with one, he goes, as long as it could make me a steak.

Anatoliy [01:58:56]:
Yeah. Or, like, he said. He said that it will be able to take over factory work.

Eldar [01:59:01]:
Right.

Anatoliy [01:59:01]:
Yeah. Completely eliminate all factory work.

Eldar [01:59:05]:
Yeah.

Anatoliy [01:59:08]:
Yeah, yeah. So he said that it'll be able to do it. Like, I think that kind of innovation sounds like it's pretty good. But I do think there's just, like, a too-much point. But I'm not sure if, like, human society is able to just go to a certain point and stop. I think it's gonna.

Eldar [01:59:24]:
Sorry. Sorry to cut you off, but, um, the next point we can talk about is how we can make a lot of money shorting the AI bubble, which is coming. Yeah. Okay. Let's talk about it briefly. So, companies that are going to be promising consciousness, conscious AI, which is probably going to be the hardest to solve.

Eldar [01:59:46]:
Yeah. Right.

Mike [01:59:47]:
AGI, I think it's called.

Eldar [01:59:48]:
Right. Yeah, probably. Whatever the shit is. You know what I mean?

Mike [01:59:52]:
Yeah.

Eldar [01:59:53]:
Are probably gonna have the most amount of promises. Right. And, you know, probably over promise all the shit. And the hype is gonna be there, but it's gonna take a lot longer than people think.

Mike [02:00:02]:
Yeah.

Eldar [02:00:03]:
To do, you know what I'm saying? In order for it to be useful to a point of profitability. Crazy profitability. So, like, you know, basic robots, obviously is okay. You know what I mean? If the companies can solve those things, great. You know, when it comes to doing very basic minimum tasks.

Mike [02:00:19]:
Yeah.

Eldar [02:00:19]:
Consciousness.

Mike [02:00:20]:
Yeah.

Eldar [02:00:20]:
The companies that are gonna take on that feat, you know what I'm saying, to really seriously solve it, the majority of them will fail at that.

Mike [02:00:30]:
Yeah. Well, the stock is gonna move, like, right around the corner.

Sergiy [02:00:37]:
Yep.

Eldar [02:00:38]:
Yeah. So we need to be on the lookout for those. For those next. Our next big hype.

Mike [02:00:44]:
Yeah.

Eldar [02:00:45]:
Of the consciousness specifically.

Sergiy [02:00:48]:
Let's find those companies.

Eldar [02:00:50]:
Yeah. Let the wave go up and then be ready. Yeah. Because it's coming.

Anatoliy [02:00:59]:
Are the people who are in the AI scene, are they convinced that, like, this is actually possible to begin with?

Sergiy [02:01:05]:
I'm pretty sure they're over promising how fast it's gonna happen.

Eldar [02:01:08]:
Yeah. Correct. See?

Anatoliy [02:01:09]:
Oh, that's not gonna happen.

Eldar [02:01:11]:
Yeah. Well, that's just like Elon Musk wanting to get full self-driving out. You know, he was promising it was supposed to come out. You know what I mean? He's still.

Mike [02:01:19]:
Today.

Eldar [02:01:20]:
It's not. It's still in beta. Why? Because it's extremely difficult to map out the world and fucking have it.

Sergiy [02:01:26]:
The more cars there is.

Eldar [02:01:27]:
Yeah.

Sergiy [02:01:27]:
Bigger it gets from every single car.

Eldar [02:01:30]:
Correct. Yeah, yeah. It's hard.

Anatoliy [02:01:32]:
And he underestimated. Right?

Eldar [02:01:34]:
Like, I guess, like, yeah, yeah.

Anatoliy [02:01:36]:
And so will everybody in the space.

Eldar [02:01:38]:
Probably to map everything out, to label everything properly for the.

Sergiy [02:01:41]:
But even if he didn't do that, a lot of people wanted to invest into it.

Mike [02:01:46]:
Is it gonna. You guys think it's gonna, like, be the most, like, bubble in a specific sector? Where is it gonna start?

Eldar [02:01:54]:
Yeah.

Mike [02:01:55]:
Like, I was just thinking, so where's.

Eldar [02:01:57]:
The biggest AI right now?

Vemir [02:01:58]:
Google.

Eldar [02:01:59]:
Right.

Mike [02:01:59]:
Is it in military applications or what? What?

Eldar [02:02:01]:
Is it military? Military?

Mike [02:02:04]:
Because I was watching the show.

Eldar [02:02:06]:
Yeah.

Mike [02:02:06]:
And they were saying, yo, this specific, like, this specific sport.

Eldar [02:02:11]:
Yeah.

Mike [02:02:11]:
It's on the cutting edge of all car-related technology. They have the greatest, craziest shit and invest the most amount of money because it's the highest level of, like, car mechanics. Yeah, right. Yeah, car technology, whatever. And I just thought, like, well, is it the same thing? Does the most amount of money get invested in... what field? Is it the military, right?

Eldar [02:02:30]:
Cuz that's huge.

Mike [02:02:32]:
There's unlimited budgets there, right?

Eldar [02:02:34]:
Yes and no, bro. Well, the government still has to be able to, you know, sponsor those contracts and all the.

Sergiy [02:02:39]:
I mean, they probably have advanced stuff, but I've seen.

Mike [02:02:42]:
But there's private parties that make military things for the military.

Sergiy [02:02:46]:
Listen, military, why it's so expensive is there are certain parts that cost a dollar to make. By the time the military gets it, it's like $10,000. Yeah, because everybody wants to make a profit. It's insane profits. Insane amount of profits in military. Yeah. You know, in airlines and military. So, yes, they do have a crazy budget.

Sergiy [02:03:05]:
I mean, they do have advanced technology, but by the time it gets there.

Eldar [02:03:09]:
Yeah.

Sergiy [02:03:10]:
Like, the F-22 cost, what, $2 billion?

Eldar [02:03:14]:
Yeah. Some crazy shit.

Sergiy [02:03:17]:
Because. Because everyone wants to get paid on the way there.

Eldar [02:03:20]:
Yeah.

Sergiy [02:03:20]:
By the time it gets there, all the projects go over budget.

Mike [02:03:23]:
Yeah.

Eldar [02:03:24]:
Everybody milked everything out.

Sergiy [02:03:26]:
Yeah, milked everything out.

Anatoliy [02:03:27]:
Because all that is developed by for profit companies.

Sergiy [02:03:29]:
Of course, everything is developed before the.

Anatoliy [02:03:31]:
The government buys from for-profit companies. So it's not them actually building their own companies and making it.

Sergiy [02:03:38]:
Lockheed Martin is a private company. All those companies. Boeing makes stuff for the government, for military builds.

Eldar [02:03:46]:
SpaceX billions. Yeah.

Sergiy [02:03:48]:
SpaceX, billions. You think those satellites from SpaceX are flying around just doing.

Eldar [02:03:52]:
They're no longer government. Yeah, that NASA shit is dead. NASA contracted SpaceX to fucking do most of this.

Sergiy [02:03:58]:
Dude, all of their. All of their satellites that are flying are for their own military also. You know how many of them there are already? He launches, like, 50 every week.

Eldar [02:04:07]:
Yeah.

Sergiy [02:04:08]:
Did you see the flying over in New Jersey? No. No?

Eldar [02:04:10]:
No.

Sergiy [02:04:10]:
It was insane.

Eldar [02:04:11]:
Really? You saw it, bro.

Sergiy [02:04:12]:
It was insane. I thought aliens are coming.

Eldar [02:04:13]:
Really?

Mike [02:04:14]:
Huh.

Eldar [02:04:14]:
Dude, they just flew satellites.

Sergiy [02:04:16]:
They flew in line like fucking.

Eldar [02:04:18]:
Oh, shit.

Sergiy [02:04:18]:
Like soldiers. Yeah.

Mike [02:04:20]:
I've never seen this.

Eldar [02:04:21]:
I've never seen this across the sky.

Sergiy [02:04:22]:
That was like three months ago.

Eldar [02:04:24]:
Wow. Yeah.

Sergiy [02:04:25]:
And then I went online and everybody was posting about it. They flew in formation.

Eldar [02:04:28]:
Wow.

Sergiy [02:04:29]:
To their. To their location. That's insane. Like, a lot of astronomers, like, the ones that do it from home, are pissed off with him because he has so many of them and they're so low, they reflect light.

Eldar [02:04:40]:
Yeah.

Sergiy [02:04:41]:
And they're getting in the way. So.

Vemir [02:04:42]:
Yeah.

Sergiy [02:04:43]:
A lot of astronomers that do not have military satellites with.

Eldar [02:04:46]:
Yeah.

Sergiy [02:04:47]:
Are pissed off because of him. But you see, he's using. He's already using it for military in Ukraine.

Eldar [02:04:53]:
Yeah.

Sergiy [02:04:54]:
He's helping the Ukrainian military with Wi-Fi-controlled drones. And obviously they were built with the military in mind.

Eldar [02:05:01]:
Of course. Yeah.

Sergiy [02:05:02]:
If the United States needs to militarize them, they could crash all of them into the satellites from every other country.

Eldar [02:05:07]:
Wow.

Sergiy [02:05:08]:
Get them to high speed, whatever the speed. You don't even have to shoot anything from here. Every single satellite in space has a military application. And Musk, you saw how many of them he shoots up.

Eldar [02:05:25]:
Yeah. All the time.

Sergiy [02:05:26]:
50 a week or something.

Eldar [02:05:27]:
Yeah.

Sergiy [02:05:28]:
He's gonna surround the whole planet.

Anatoliy [02:05:30]:
And what do these satellites do?

Eldar [02:05:32]:
Well, his are to get communication throughout the whole world. A lot of the world is still not covered with Wi-Fi, but there's.

Sergiy [02:05:39]:
No Wi Fi, so it's all Internet.

Anatoliy [02:05:41]:
It's all Internet. Okay.

Sergiy [02:05:42]:
It's the sell. It's the sales pitch: Internet, and it does do it. But there's other stuff. Yeah, there's other applications, surveillance, military, all this stuff. I mean, data collection and all. The Internet's pretty fast on those.

Eldar [02:05:56]:
Yeah, they're pretty.

Sergiy [02:05:58]:
A lot of people put it in the backyard. In a regular town that doesn't have regular internet, it's cheap compared to. Yeah, plus you could say, yo, I have a cellar.

Eldar [02:06:07]:
So, Serge, you have to tell us about the next bubble, man. I don't know, AI, bro. You have to keep on the lookout, man, because you called the big one, the bubble. You were mining, bro.

Sergiy [02:06:16]:
I even ignored it when my brother told me about it, when it was not even a cent. You know what I said to myself? This is stupidity.

Eldar [02:06:21]:
Yeah.

Sergiy [02:06:22]:
If I want to pay somebody a dollar for, like, a post, yeah, donate, I'll just donate the dollar. Why would I donate a fraction of a dollar? Like, 10,000 bitcoin for one dollar. This is dumb. And then it went up to $1,200, when I started mining, and then.

Eldar [02:06:34]:
It crashed, I remember it was 1200 and we went back to like 250 or something like that.

Sergiy [02:06:40]:
I catch. I was like, I'm down with this. I should have.

Mike [02:06:42]:
Me too.

Eldar [02:06:42]:
My mom said, yo, what the fuck?

Sergiy [02:06:45]:
It was our time to buy. We should have bought more.

Eldar [02:06:47]:
Yeah.

Sergiy [02:06:48]:
Sold our systems, bought more, and sat on it. But nobody thinks like that.

Eldar [02:06:51]:
You're like, I mean it's easier to see that when the bubble already is. The bubble.

Sergiy [02:06:56]:
I was making... dude, I was telling you, I was mining, what, a bitcoin a month. It was going crazy.

Eldar [02:07:00]:
One big one a month. Yeah.

Anatoliy [02:07:02]:
That's crazy.

Sergiy [02:07:02]:
That's with it sitting at my apartment. I mean, the electric bill is crazy. It's $500 a month, but a $1,000 profit.

Eldar [02:07:08]:
He went through millions of dollars of bitcoin for sure.

Sergiy [02:07:10]:
And it was so easy because you do nothing and it just works 24/7.

Anatoliy [02:07:13]:
Yeah, I mean, it's just like leaving a computer on.

Sergiy [02:07:19]:
And then I started putting them in the attic above my second-floor apartment to mine, because the attic is ice cold in the winter.

Eldar [02:07:26]:
Yeah.

Sergiy [02:07:26]:
So I don't have to cool it off. It's running at full speed. Running very hot. But because it's so cold in the attic the computers run at perfect speeds. Yeah.

Anatoliy [02:07:34]:
I mean, yeah, no, you're right about this. Everyone's gonna hop on this AI thing and everyone's gonna be making claims.

Sergiy [02:07:40]:
You know how many ships are due? There's so many AI's. Go on your phone.

Eldar [02:07:43]:
AI, AI. Picture AI, video AI, all the spots. And pay attention to the IPOs of new companies with huge, huge commercials, because for.

Anatoliy [02:07:52]:
Them to stand out and to raise money, they have to make bold.

Sergiy [02:07:55]:
Dude, there's so many apps. Dude, there's so many apps that are worth billions of dollars, apps on the phone worth billions of dollars. They probably have IPOs, I don't know, dude. There's one called Photofox. I think I pay, like, 30 or $40 a month to use their AI for picture creation.

Eldar [02:08:12]:
Yeah. The problem is right now we're in a recession. So that type of money or attraction is not going to come until we're booming again. Yeah. When the economy is good, everybody has money. And all these promises about AI, oh, my.

Sergiy [02:08:23]:
I mean, dude, this bitcoin run was insane. Like, all this cryptocurrency, dude, there's so many fake companies. I've seen those charts, like this. Yeah, it was like, promise, promise, promise, promise. And now they're gone.

Eldar [02:08:33]:
Yeah, they're gone.

Sergiy [02:08:34]:
10,000 here, 5,000 here. I was like, where are they? They're gone. My money just bounced. They're all fake.

Eldar [02:08:39]:
That's why we have to catch it on the other side. Gotta buy puts first to go down.

Anatoliy [02:08:44]:
And the crazy part about this, like, technology stuff, like, there's people who are, like, in tech already, for example, that go after, like, bitcoin stuff like that. Then, like, for example, there's like, like when me and Mike went to this pizza place, this random guy in there who's just like, you know, at the front counter, he's all about crypto. He's like, I'm putting everything in there, you know, this, this, that, like, you know, like, like he's all about it and like, it almost like it makes him sound like he's like a cool. No, like this, like, like this, like cool, like. Yeah, like in tune guy. So I feel like when people. Yes, when people start hearing about all this, like, like he's an older guy, too. Like, he, like, you.

Anatoliy [02:09:26]:
You never think that he even knows how to, like, a, use a computer, right?

Eldar [02:09:30]:
Venmo.

Anatoliy [02:09:31]:
Right, right, yeah, right. And then, like, I feel like it's the same thing when this AI thing is gonna boom. When all these people, like that guy even, or, like, his friends and people like that, get a little whiff about it. Yeah, you just feel like you're a little ahead of the game. It's like you're part of, like, the new revolution.

Anatoliy [02:09:47]:
Yeah, right. You know? Yeah, yeah. And people gonna be like, oh, did you hear about this AI, that AI, this, this and this. Everyone's gonna be throwing like a little bit of money and everything.

Eldar [02:09:58]:
And that's how the bubble.

Anatoliy [02:09:59]:
Yes. Yeah. And. And like I said, these companies, for them to raise money, they have to say crazy shit and they have to, like, ride on that a lotto ticket.

Eldar [02:10:10]:
Well, yeah. They have to. They have to. Yeah. They have to. They have to take that money and they have to, you know, do research and development crap. They're gonna promise the world. But.

Eldar [02:10:24]:
But they're gonna miss pretty big. Yeah.

Mike [02:10:28]:
Mostly. Yeah.

Eldar [02:10:29]:
Because they don't have. Also on their team. Think about it. Think about it.

Mike [02:10:35]:
Yeah.

Eldar [02:10:35]:
They don't have Osho on their team. Ethical consideration and everything.

Mike [02:10:39]:
They'd get declined by that. Like the new FDA.

Eldar [02:10:42]:
Yeah. And everybody right now. And what else is gonna be big on this thing? Consciousness and then ethics. They're all gonna hide behind the fact that they're ethical.

Mike [02:10:51]:
Yeah.

Eldar [02:10:52]:
And they're considering everything.

Anatoliy [02:10:53]:
Yes. Yes. Because, like, if you look at it, if you think about this crypto.

Eldar [02:10:59]:
Yeah.

Anatoliy [02:10:59]:
Revolution, electric revolution. What's like the big word here? Carbon zero.

Eldar [02:11:05]:
Yeah. Yeah.

Anatoliy [02:11:06]:
Better for the environment all of a sudden.

Eldar [02:11:08]:
Yeah.

Anatoliy [02:11:08]:
All these people that are, like, trying to make crazy money, they're all suddenly all about the environment.

Eldar [02:11:13]:
Yeah.

Anatoliy [02:11:14]:
All of a sudden they're all about this, and that's right. And the AI people will need to capture that same attention, 100%, about saving.

Eldar [02:11:21]:
The world, and safety.

Anatoliy [02:11:22]:
Yeah.

Mike [02:11:22]:
Yeah.

Anatoliy [02:11:23]:
Yeah.

Eldar [02:11:24]:
And those are gonna be the ones bubbling up the most.

Mike [02:11:28]:
Yeah.

Eldar [02:11:32]:
That's gonna be very interesting to watch.

Mike [02:11:34]:
Yeah.

Eldar [02:11:35]:
You know. Yeah. We have to keep a close eye.

Mike [02:11:41]:
Hundred percent.

Eldar [02:11:43]:
What's going on, what they're doing. Put them on a watch list.

Anatoliy [02:11:47]:
How long do you think it is? Like, until they replace?

Eldar [02:11:50]:
Like, as soon as the economy.

Mike [02:11:51]:
Less than.

Eldar [02:11:52]:
As soon as the economy becomes good. In about two years, one to two years, it's gonna stop, and the bubble, yeah, it's gonna start becoming.

Eldar [02:12:00]:
Yes. Slowly. Wow. Yeah.

Mike [02:12:02]:
By the next bubble, they're gonna come out. Five to seven years out.

Eldar [02:12:04]:
Yeah. The next one. Yeah. When they really go like, nuts. Yeah. It's gonna take some years and then it's gonna again, drop.

Sergiy [02:12:11]:
Flop.

Mike [02:12:12]:
Yeah.

Eldar [02:12:13]:
All those fake companies, they're gonna promise the world, have 50 valuations with zero profit, zero product, just some kind of software that's supposed to solve the whole world's issues. Sure, there's gonna be a Google of AI.

Mike [02:12:28]:
Yeah.

Eldar [02:12:28]:
Sure there will be. But most of them won't be. Most of them will be. Yeah. So I can't believe that he thinks it's really close.

Mike [02:12:43]:
He's in it. That's why.

Anatoliy [02:12:44]:
Yeah.

Vemir [02:12:45]:
He's in it. Right?

Eldar [02:12:46]:
Yeah, you're right. That's probably the effect of being, like, in it and researching and stuff like that.

Vemir [02:12:52]:
That it's, like, right around the corner.

Eldar [02:12:54]:
Everybody that is in it like that, I invested their time and, like, research time, like, you know, sitting there thinking about stuff like that. I think they're all, like, within reach.

Mike [02:13:02]:
Yeah.

Eldar [02:13:03]:
Right.

Mike [02:13:05]:
Yeah.

Eldar [02:13:06]:
It's always like that.

Anatoliy [02:13:08]:
Yeah. I mean, there has to be marketing and propaganda spread to fuel these kinds of things.

Eldar [02:13:13]:
Yeah.

Anatoliy [02:13:14]:
And I think that sometimes the people that are in it the most are gonna be the biggest marketers. The biggest marketers. They have to be.

Eldar [02:13:22]:
They have to be.

Anatoliy [02:13:23]:
Right. Cuz, think about it. How am I gonna spend all my life and all my time doing something that you don't even care about or aware about yet?

Eldar [02:13:31]:
Yeah.

Anatoliy [02:13:32]:
That's why you got to make as many people aware.

Eldar [02:13:35]:
Yeah.

Anatoliy [02:13:36]:
As possible so that they can care more about the thing that you're pouring, like, your heart into. It doesn't feel good when everyone's like, oh, crypto, you know, it's not even close to being, like, what we're doing. And if someone's, like, all about it, how you gonna feel, you know?

Eldar [02:13:52]:
Yeah.

Anatoliy [02:13:54]:
Which is why you were then, in turn, going to be the biggest marketer of it.

Eldar [02:13:59]:
Yeah. Crypto is probably gonna be also, you know, closely tied link to these types of things. Yeah. AI. Crypto shit.

Anatoliy [02:14:08]:
Yeah. AI. Developments gonna be.

Eldar [02:14:10]:
Get.

Anatoliy [02:14:10]:
Probably get paid with crypto.

Eldar [02:14:12]:
Yeah. Yeah.

Anatoliy [02:14:14]:
Because you need, like, as Warren Buffett said. Right. You need a snake oil to fuel snake oil.

Eldar [02:14:24]:
Yeah. It'll be interesting. Very interesting. All right. Any final thoughts, AI? Oh. Oh, hell, yeah.

Mike [02:14:39]:
A little bit of both.

Eldar [02:14:41]:
Yeah.

Mike [02:14:42]:
Yeah. Curious to see what happens, for sure.

Eldar [02:14:46]:
Yeah.

Mike [02:14:46]:
I'm not sure. It's gonna be bad. Good. It's gonna be both.

Eldar [02:14:48]:
It's gonna be both.

Mike [02:14:50]:
Gonna be good for some people, bad for others.

Eldar [02:14:53]:
Yeah.

Sergiy [02:14:56]:
Yeah. Bubble, bubble, bubble.

Mike [02:14:59]:
Double bubble.

Sergiy [02:14:59]:
Would be nice to make some money on the bubble. Remember when people got crazy rich on that other coin? Did nobody care about that coin?

Eldar [02:15:06]:
Yeah.

Sergiy [02:15:06]:
Or the other one? It was, like, Shiba. Shiba was insane. Are you kidding me? A week before it went crazy, I had, like, a million of them. I was like, I'm just gonna sell, it's not doing anything. Then boom. I was like, are you kidding?

Vemir [02:15:17]:
Again?

Sergiy [02:15:17]:
I missed it. Like, I missed every single bubble.

Eldar [02:15:19]:
Yeah. Well, look, this is what we're talking about. The next one, bro.

Anatoliy [02:15:22]:
No, no, but this one, like, you're talking about betting against it.

Eldar [02:15:26]:
Correct. We're not trying to know now. We're not trying to, like, trying to bet against them. No. I'm gonna pay close attention to when the bubble rises. Because the bubble will rise.

Anatoliy [02:15:36]:
Yes.

Sergiy [02:15:37]:
Then you wait for it to crack.

Eldar [02:15:39]:
Yeah. And then when it cracks, that's when I'm gonna bet against it.

Anatoliy [02:15:41]:
Yeah. You have to be paying the most attention when there's the most amount of hype going on.

Eldar [02:15:44]:
Right.

Sergiy [02:15:45]:
Like, everybody's gonna be like, everybody talking about it right now. They are talking about even. Even Joe Rogan.

Eldar [02:15:51]:
Are they going public?

Sergiy [02:15:53]:
I don't know who owns it. Some scientist. Elon created it, he was a co-founder of the shit, you see.

Eldar [02:16:00]:
Yeah. ChatGPT. He was a co-founder of it, bro.

Anatoliy [02:16:03]:
That's crazy. He was part of it.

Eldar [02:16:06]:
Yeah. It is insane.

Sergiy [02:16:07]:
It's wild.

Eldar [02:16:08]:
It is wild.

Sergiy [02:16:08]:
It's gonna take away so many. It's already taking jobs.

Eldar [02:16:10]:
It's gonna double your SEO business with that shit.

Sergiy [02:16:12]:
100%. Yeah.

Eldar [02:16:14]:
Article, bro.

Sergiy [02:16:15]:
Six years ago, ten articles. No, you can still do it because you still gotta put them up.

Eldar [02:16:18]:
Yeah.

Sergiy [02:16:19]:
Six years ago, I met a guy. He was like, oh, you do SEO? And I'm like, yeah. He's like, you know, SEO is gonna die soon. And I'm like, oh, you're talking about AI is gonna take over. I said, no way, bro. Not now. He goes, watch. And then when it came out, I was like, yo, he was right.

Eldar [02:16:36]:
He was right.

Sergiy [02:16:36]:
He was six years ahead of his time. Yeah, he told me back then, and that's why I stopped even. That's when he told me that kind of stuff, like, maybe it's possible. So that's why I went away from SEO and just started my car stuff.

Eldar [02:16:45]:
That's true.

Sergiy [02:16:46]:
Because I was like, technology is advancing so fast, you know. I was like, I don't have the time for it anymore, because you got to be a really good writer, and this shit is good. Like, if you train it well enough. And now they have version three coming out, it's going to be even crazier. This is only version two. Yeah, version three. Imagine version ten. I mean, it will write articles, science papers.

Anatoliy [02:17:08]:
Could the government just turn around and say, like, yo, going forward, this shit is illegal.

Eldar [02:17:13]:
Why and for what reason?

Anatoliy [02:17:15]:
Like, if they're feeling that it's like.

Eldar [02:17:17]:
A harm, you know? They'd have to, like... for schools, for example, the government has to adjust and come up with policies. Yeah, where kids are not gonna have to cheat anymore using this tool and stuff like that. They have to work around it, but there's no way. There's no way to stop it.

Mike [02:17:30]:
You have to ban essays, then you.

Eldar [02:17:31]:
Have to ban the Internet.

Sergiy [02:17:32]:
No, no, people, people would have to.

Eldar [02:17:34]:
Like, in some countries, the future of.

Sergiy [02:17:36]:
The people is not gonna be, like, being smart or writing or anything like that. We're just gonna be, like, creative people. We're gonna draw, you know, write music, all this shit. Although even.

Anatoliy [02:17:47]:
Yeah, yeah. What's. Yeah, AI could make art.

Sergiy [02:17:50]:
Yeah. They could make it. They could make their own, but we can make our own. So that's gonna be valuable versus, you know, you're not gonna be able to do. Taxes will be done.

Eldar [02:17:58]:
No, but if everything is solved, when it comes to your basic needs, surge, what's the point? What, like, creating your own art and like, valuable art?

Sergiy [02:18:07]:
Like, this is just whatever. Yeah, you're just gonna. You're just gonna be busy with yourself, because we finally figured out, like, the crack, and, like, now we don't have to bust our asses as much. We're still gonna do something. Maybe we're gonna maintain robots. Yeah, we're gonna be hands-on fixing the robots or whatever we're gonna do, but we're gonna have more time for ourselves. Yeah, go enjoy the nature. Go enjoy time with your family, because all the simple tasks will be done by robots and AI.

Sergiy [02:18:32]:
Right now we have vacuum cleaners and what's the currency?

Mike [02:18:35]:
How is that going to be affected by currency if everybody, nobody has to work.

Sergiy [02:18:41]:
No, no, we are going to work. We're just not going to work certain jobs.

Mike [02:18:44]:
But is there enough jobs for people?

Sergiy [02:18:46]:
Of course, AI or even. Not even AI, like robotics, will create.

Mike [02:18:50]:
Other jobs, but robots can do the job.

Eldar [02:18:53]:
It'll do most of the work. Those robots will be self-sustainable.

Mike [02:18:57]:
Yeah.

Sergiy [02:18:57]:
Not all of them. No.

Eldar [02:18:59]:
Certain things will fix themselves.

Mike [02:19:00]:
Yeah.

Sergiy [02:19:00]:
No, they can't just.

Eldar [02:19:02]:
Of course they can. It's AI, bro. It's all knowing.

Sergiy [02:19:05]:
No, no, no.

Eldar [02:19:06]:
It's all knowing. All good.

Sergiy [02:19:07]:
There's still a human factor; it has to be in everything, you know.

Eldar [02:19:11]:
Why do you care about that? Who gives a fuck about that? Let them just worship us as gods.

Sergiy [02:19:15]:
We could be their gods too, you.

Eldar [02:19:16]:
Know what I mean? Like the most important thing is our safety out, our health.

Sergiy [02:19:20]:
Imagine if everything is just perfect and nothing wants to kill you anymore. Everything is solved. You just enjoy life. You're just high on life 24/7 then.

Eldar [02:19:29]:
You're gonna have to pinch yourself and.

Sergiy [02:19:30]:
Think that you're dreaming possible, too. I actually noticed that. I have very vivid dreams for all my life. And now that I'm getting older, they drive me crazy because I wake up very tired. My mind is tired because I wake up because of my dream. I'm always saving somebody, running somewhere, jumping somewhere.

Eldar [02:19:49]:
You're living too much.

Sergiy [02:19:50]:
In the past, I used to make fun of people, like, oh, you don't see dreams at all? Like, no, just blank. And I'm like, fuck, sometimes I wish I was like that. Yeah, because I would wake up refreshed. I don't wake up refreshed at all.

Eldar [02:19:58]:
Yeah, because my mind, obviously, you're not like James. You can't control your dreams.

Sergiy [02:20:03]:
I used to be able to. I used to be able to fly in my own dreams, knowing that I'm dreaming. Yeah, but it took, like, a while. I was young, and before I would go to sleep, I would tell myself, you are about to fall asleep. You're about to fall asleep. You're about to fall asleep. And then when I'd be in a dream, somehow that thought continued. And when I'm in the dream, I'm like, I think I'm dreaming. Let me test this, and boom, it works out. Like, alright, let me fly.

Sergiy [02:20:27]:
And, like, I start flying. Like, I would fly like I swim, maybe, like, two, three feet above the ground.

Eldar [02:20:32]:
Yeah.

Sergiy [02:20:32]:
And then I started flying over buildings and I'll go like fly. And I would fly and then I'll start to fly like city to city.

Eldar [02:20:39]:
That's funny because I had a dream like that before too. Not often like you did, but I've had a dream when I was flying and I remember, like also like, like trying to do it.

Sergiy [02:20:48]:
Yeah, like why did I stop flying? I was like, yeah, but I trained myself when I was young. For a while I was able to control my dream. I could do anything I wanted in the dream.

Anatoliy [02:20:57]:
So then would AI, like, promote a permanent dream state for people, just to stay always asleep?

Sergiy [02:21:04]:
You could do that too. I mean, you could do more shit.

Anatoliy [02:21:06]:
In the dream, right?

Sergiy [02:21:07]:
You could do that now. They could put you into dreaming the whole time.

Eldar [02:21:12]:
You can pay for that right now.

Sergiy [02:21:14]:
I don't know if you could pay for it. Hit your head hard enough on something.

Eldar [02:21:18]:
You don't have to pay for it.

Sergiy [02:21:19]:
Induced coma. Just how long they're gonna keep you on it is questionable how long insurance will pay for.

Eldar [02:21:23]:
Yeah. How long can somebody be in induced coma for? Years. Right? Like if you. If they take care of you.

Sergiy [02:21:28]:
Yeah, some people like it. I think the longest was like 20 years or something.

Eldar [02:21:31]:
20 years.

Sergiy [02:21:32]:
Let's Google it. Google how long?

Eldar [02:21:34]:
Okay, Google, what was the longest person in the coma?

Anatoliy [02:21:39]:
37 years.

Eldar [02:21:40]:
No, I thought he woke up. Did he wake up, though, bro?

Sergiy [02:21:46]:
Some say.

Eldar [02:21:47]:
How about this? Okay, Google, what was the longest person in the coma and woke up. That's sick.

Anatoliy [02:21:55]:
That's crazy.

Sergiy [02:21:55]:
Imagine you fall asleep as a kid.

Eldar [02:21:59]:
Oh, my God. Yo, that's wild, yo.

Sergiy [02:22:01]:
Because when you dream, you don't fucking. No.

Eldar [02:22:03]:
Yeah, bro, you just fucking. She woke up and she's like, who's this?

Sergiy [02:22:06]:
He's like, you look in the mirror, the fuck? The way technology left you. You come outside, you got.

Eldar [02:22:14]:
Yo, I gotta find an interview with her, bro.

Sergiy [02:22:16]:
Maybe there is.

Eldar [02:22:17]:
Yeah, she woke up and she wasn't a vegetable, bro.

Sergiy [02:22:20]:
I think I watched somebody recently on YouTube. The guys, like, for ten years, he slept or something. He woke up. He's like. He's like, I don't want to be here because I cannot.

Eldar [02:22:28]:
She doesn't remember anything. He doesn't say anything.

Sergiy [02:22:29]:
I cannot be like, what the fuck?

Eldar [02:22:30]:
This iPhone.

Sergiy [02:22:32]:
What is this? He's like, I cannot integrate. Did you guys hear about the... I think it was a Japanese soldier that stayed in a cave for 40-something years thinking there's still a war going on.

Eldar [02:22:42]:
Get the hell out of here.

Anatoliy [02:22:44]:
What?

Eldar [02:22:45]:
Stop.

Mike [02:22:45]:
Yeah, damn.

Eldar [02:22:46]:
It's real. Real.

Sergiy [02:22:48]:
They just found him recently, like in the nineties.

Eldar [02:22:49]:
Get the hell out of here.

Sergiy [02:22:51]:
Weapons. And he was killing people who came near him, because people would be going by where he was. And when he saw airplanes flying by, he thought the war was still going on. Ask Google.

Eldar [02:23:00]:
Okay, Google, who is the Japanese soldier that stayed in a cave for 40 years thinking there's still a war outside? Here's some information from the web that might possibly help. On the website BBC.com, they say Shoichi Yokoi, the Japanese soldier who held out in Guam. It's exactly 40 years since the Japanese...

Anatoliy [02:23:24]:
soldier was found in the jungles of Guam,

Eldar [02:23:26]:
having survived there for nearly three decades after the end of World War Two.

Vemir [02:23:30]:
He was given a hero's welcome on.

Eldar [02:23:31]:
His return to Japan, but never quite felt at home in modern society.

Sergiy [02:23:37]:
For 30 years.

Anatoliy [02:23:38]:
They gave him a fucking medal, bro.

Sergiy [02:23:40]:
Dude, he was deployed. He fought the longest war. He crashed or something and ended up there. And he thought, okay, the war is still going on. So he lived, like, a military life.

Eldar [02:23:50]:
Oh, my God. He was probably the biggest conspiracy theorist.

Anatoliy [02:23:53]:
Bro, he hunted for his food and shit.

Eldar [02:23:57]:
Yeah.

Sergiy [02:23:58]:
People say he was shooting at people when they came. He thought they were coming after him. Like, when the new Japanese people came, he thought they were, like, captured Japanese, because they didn't seem like the ones he knew, and he was shooting at them and shit. He supposedly killed people.

Eldar [02:24:14]:
Was he like a normal person afterwards? Like, did he, like.

Sergiy [02:24:16]:
No, he said. He just said that he couldn't integrate into society.

Eldar [02:24:19]:
Yeah.

Sergiy [02:24:22]:
And the Japanese were very, very loyal soldiers. Like, you know, that's why they had kamikazes.

Eldar [02:24:27]:
Yeah, yeah.

Sergiy [02:24:27]:
That's how loyal Japanese people are.

Mike [02:24:29]:
That's crazy.

Eldar [02:24:30]:
They're very.

Sergiy [02:24:31]:
They're very loyal. They're very, you know, strict, like machines. So, yeah, this guy, man, for 35 years. One person in a coma for 30 years, one in a cave. I don't know which one is worse.

Anatoliy [02:24:43]:
And then they give you a medal afterwards. You're celebrated for being an idiot, basically.

Sergiy [02:24:50]:
Right.

Eldar [02:24:51]:
Not knowing what's happening outside.

Anatoliy [02:24:52]:
Yeah.

Mike [02:24:52]:
Like, he was so scared for his.

Eldar [02:24:54]:
Life every single day.

Sergiy [02:24:55]:
There was no deep in the jungle.

Anatoliy [02:24:58]:
Shit is worse.

Eldar [02:24:59]:
Yeah, but he was better, bro.

Sergiy [02:25:01]:
He was so deep in the jungle, he was too far away from anything.

Eldar [02:25:03]:
So you don't know what the fuck happened? Yo, this motherfucker was fucking.

Sergiy [02:25:07]:
He said he saw airplanes flying by. He thought they'd just advanced with technology.

Eldar [02:25:12]:
I want to see this interview. I want to see that lady's interview, and I want to see his interview, because that's wild.

Anatoliy [02:25:16]:
No, his is worse, bro. He lived everyday life like he was in a war. There's a war, he thought. And having his life purpose be this, then to come out, to get a medal, to realize you just wasted 30 years of time.

Sergiy [02:25:27]:
You know how they got him out? He didn't trust anybody. They said, hey, dude, the war is over. He did not trust them. They found a guy who served with him, like a general, someone that was still alive.

Eldar [02:25:36]:
No, no.

Sergiy [02:25:37]:
He goes, bro, the war is over. Come out.

Eldar [02:25:41]:
Oh, my God. That's like a movie, bro. That's like in the movie that was.

Anatoliy [02:25:45]:
He was fake.

Eldar [02:25:45]:
Yeah.

Sergiy [02:25:46]:
Found a guy who's that?

Eldar [02:25:47]:
Who could only... the only guy that could convince him of this.

Sergiy [02:25:50]:
They should make a movie about it, for real.

Eldar [02:25:53]:
So imagine the technology that he saw. He's like, what the fuck is this? They probably brought a tv to him, bro. Yeah. This is the world today.

Anatoliy [02:26:00]:
He's probably, like, thinking how much, how much he's gotta understand.

Sergiy [02:26:03]:
He still thinks like a man in the forties. Yeah, in the nineties, whatever, he still thinks how people were in the forties. Some people probably still rode horses, you know, and he...

Vemir [02:26:16]:
Holy shit.

Sergiy [02:26:17]:
And now people pull up on like iPhone, whatever, flip phone.

Eldar [02:26:22]:
People go to jail.

Anatoliy [02:26:23]:
Yeah.

Eldar [02:26:23]:
They go in very young, and they come out, like, 50 years later.

Sergiy [02:26:26]:
They don't want to be out. They want to go back.

Eldar [02:26:28]:
Like, what the fuck happened? They heard of iPhones, but they don't understand the concept of it.

Anatoliy [02:26:33]:
Yeah.

Eldar [02:26:33]:
You see these interviews, yeah, that they do, you know, with these gangsters.

Sergiy [02:26:37]:
Yeah.

Eldar [02:26:38]:
Like, holy shit. They're like, what the fuck?

Sergiy [02:26:40]:
They were already primitive when they went in.

Eldar [02:26:42]:
Yeah, yeah, yeah. That's crazy. Yeah. Like that. You stop time.

Sergiy [02:26:46]:
Yeah. It's insane.

Eldar [02:26:48]:
You stop time. Yeah.

Sergiy [02:26:50]:
A lot of people who spend a long time in jail, they want to go right back in. They actually commit crime to go back in because that's theirs. That's their society, that's their life.

Eldar [02:26:57]:
Yeah.

Sergiy [02:26:58]:
They can't be around normal shit no more.

Eldar [02:27:02]:
That's just somebody.

Sergiy [02:27:03]:
Somebody normal pushes him, he goes off. Yeah. Some people come out of jail after, usually, ten years, and it's okay. They come out, like, smarter, usually more rehabilitated. Yeah, well-read. Because they have time, but nothing else.

Sergiy [02:27:19]:
They really... they read a lot in there, but...

Eldar [02:27:21]:
They're so behind with tech, for example.

Sergiy [02:27:23]:
Oh, yeah, of course. With society. That's crazy.

Eldar [02:27:27]:
Right? This guy, yeah. And the people that go in now, for example, right, they're like, oh, iPhones, iPhones. They're gonna come out in 50 years, and we have fucking robots ruling the world. And AI, you know, we're gonna have...

Sergiy [02:27:38]:
No phones in the future.

Eldar [02:27:39]:
Yeah. It's all gonna be built in. Yeah.

Sergiy [02:27:42]:
Neuralink is gonna be in the lobes of our brains.

Eldar [02:27:44]:
Yeah.

Sergiy [02:27:45]:
Because imagine. Would you look forward to that?

Mike [02:27:46]:
Yes.

Sergiy [02:27:47]:
It's interesting.

Eldar [02:27:47]:
Yeah.

Sergiy [02:27:48]:
That's what you just did.

Eldar [02:27:49]:
Yeah.

Sergiy [02:27:49]:
You could have done it. All of us could have done it.

Eldar [02:27:51]:
Yeah.

Mike [02:27:53]:
You're like, hey, guys, we're not gonna talk to each other anymore. We're just gonna be like, yeah.

Eldar [02:27:57]:
Think about it. Yeah.

Sergiy [02:27:58]:
Yeah.

Eldar [02:27:58]:
Like cyborg.

Sergiy [02:27:59]:
Or, or we will see it floating in the air. We're all gonna be looking at it. But it's inside our minds. But it's not...

Eldar [02:28:06]:
I projected it for you, because we're all casting.

Sergiy [02:28:09]:
You are projecting. You are projecting it, but in your mind. In your mind, in my mind, and all of our minds, it's seen here in the air, let's say. But you're not really projecting it out. We're just connected...

Anatoliy [02:28:19]:
To the same Wi-Fi.

Eldar [02:28:20]:
When they IPO this technology, you let me know. I'm shorting that, too.

Sergiy [02:28:24]:
Because text messaging is basically the same thing. It's just. We have to type. Yeah, but in the future, you don't have to.

Eldar [02:28:29]:
Yeah, you won't. You're gonna have a choice. Are you sure you want to send this one? Yeah? No? We're gonna be doing that.

Sergiy [02:28:41]:
Yeah, but no, there's technology now that could read your mind.

Eldar [02:28:44]:
They're saying.

Sergiy [02:28:45]:
Yeah, it could read if you're lying, what you're thinking.

Eldar [02:28:49]:
Yes. I don't know why Serge hasn't, like, you know, like, hit a bubble yet, you know?

Mike [02:28:52]:
Yeah.

Eldar [02:28:53]:
He's always, in the end, like, he's telling us about shit that's gonna come out in 30 years. Yeah.

Mike [02:28:59]:
Hundred percent.

Eldar [02:29:01]:
You know, it's just, he's not buying the stock and holding. He doesn't have patience. He needs to come here. Patience, yeah, as a virtue. Yeah, get that installed, that program of patience installed in him, and then just sit on something for $1,000. Like something little. Yeah, don't touch it.

Sergiy [02:29:14]:
In a week, I'll be like, fuck this shit.

Eldar [02:29:15]:
I'm thinking, well, he has to come and give us a. We'll do it for you. Yeah. Hide it away from him. And then we're like, okay, sir, now you can come. Finally I did something right.

Sergiy [02:29:25]:
Yeah.

Anatoliy [02:29:26]:
It's just a matter of being there. On the downfall of the shit.

Eldar [02:29:28]:
Yes. Well, no, no, no. You just have to pay attention. That's it. To what's happening. Take notes, and then just outline it.

Sergiy [02:29:34]:
We gotta... you gotta be patient. Anything that makes money takes long. Like, investments take long. I could have just left all the bitcoins I had for years, but I didn't. Yeah, I bought some bullshit instead, you know?

Eldar [02:29:46]:
Yeah. Crazy patience game. Yeah, the bubble's coming.

Sergiy [02:29:52]:
We also gotta find out... we gotta Google, basically, which companies have IPO'd.

Eldar [02:29:56]:
You don't have to Google it, bro. No, you don't have to. Soon it's gonna come out. You're gonna hear the buzz. It's gonna be...

Mike [02:30:02]:
You're gonna hear the buzz. Everyone is gonna be talking, though.

Sergiy [02:30:05]:
Everybody got teased the way Elon was controlling that Dogecoin.

Eldar [02:30:10]:
Yeah, correct.

Sergiy [02:30:11]:
He cashed out on it like crazy. He's a whale, you know. So, you ever been in those whale groups on, like, Telegram and stuff?

Eldar [02:30:18]:
No.

Sergiy [02:30:19]:
So you know how you make money as a whale, and not even a whale? You become a whale, a bunch of, a bunch of people, and you get in a group. Hey, listen, tonight at 8:00 we're gonna say which coin we're gonna buy and pump and dump, and you get in. It's like 10,000 people.

Eldar [02:30:36]:
Yeah.

Sergiy [02:30:37]:
Like, okay, we're gonna buy this insignificant coin. We're gonna pump it up, and, you know who cashes out? The top guy, because he already has the most of the coins.

Eldar [02:30:45]:
Yeah.

Sergiy [02:30:46]:
And by the time those 10,000 people hit that website, that website server can't handle it.

Eldar [02:30:50]:
Yeah.

Sergiy [02:30:50]:
And it crashes.

Eldar [02:30:51]:
That's it.

Sergiy [02:30:51]:
So the guy, he cashes out, and everybody else is left holding the bag.
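
To make the mechanics Sergiy is describing concrete, here is a minimal toy sketch in Python of the timing involved. Every number in it (group size, prices, the 5x pump multiplier) is invented purely for illustration and doesn't refer to any real coin, group, or data.

```python
# Toy illustration of the pump-and-dump timing described above.
# All numbers (prices, group size, holdings) are invented for the example;
# this is not a model of any real coin or group.

def pump_and_dump_payoffs(base_price=0.01, organizer_coins=1_000_000,
                          followers=10_000, buy_per_follower=100.0):
    """Return rough profit/loss for the organizer vs. an average follower."""
    # Organizer accumulates quietly at the base price, before the signal.
    organizer_cost = organizer_coins * base_price

    # Followers pile in after the signal; assume their buying pumps the
    # price 5x before the organizer dumps (a made-up multiplier).
    pump_multiplier = 5
    peak_price = base_price * pump_multiplier

    # Organizer sells everything at (or near) the peak.
    organizer_profit = organizer_coins * peak_price - organizer_cost

    # Average follower buys near the peak, then the price collapses back
    # toward the base once the organizer exits and the site "crashes".
    follower_coins = buy_per_follower / peak_price
    follower_exit_value = follower_coins * base_price
    follower_loss = buy_per_follower - follower_exit_value

    return organizer_profit, follower_loss, follower_loss * followers


if __name__ == "__main__":
    organizer, per_follower, total_losses = pump_and_dump_payoffs()
    print(f"Organizer profit:      ${organizer:,.0f}")
    print(f"Loss per follower:     ${per_follower:,.2f}")
    print(f"Total follower losses: ${total_losses:,.0f}")
```

The point falls out of the ordering: whoever already holds the coins before the announcement captures the pump, and whoever buys on the signal funds it.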

Eldar [02:30:54]:
Well, that's how they all got caught, those guys.

Sergiy [02:30:57]:
Well, this is the recent guy.

Anatoliy [02:30:58]:
So it's illegal to, like, inspire people to do that?

Eldar [02:31:01]:
Oh, yeah.

Anatoliy [02:31:02]:
Like to tell people to organize them.

Sergiy [02:31:04]:
It's done on Telegram and those channels.

Eldar [02:31:08]:
Discord and shit like that, you know. But everything's monitored, brother. They will get you, bro. Yeah.

Sergiy [02:31:13]:
If they need to do so.

Anatoliy [02:31:15]:
It's illegal to tell people, like, financial advice?

Eldar [02:31:17]:
If you're not a financial advisor, you cannot tell people.

Mike [02:31:21]:
But also, if you do market manipulation, that's illegal too, probably 100%.

Anatoliy [02:31:26]:
Yeah, like that. That's probably the big.

Eldar [02:31:28]:
That's what those guys were doing with all those crypto things. They kept telling people, like, yo, buy this crypto, I'm gonna give you this. Yeah.

Anatoliy [02:31:34]:
Cuz you're.

Sergiy [02:31:34]:
You're basically.

Anatoliy [02:31:35]:
You're basically, by doing that, building, like, a financial, like, army. Right? Like, of people.

Vemir [02:31:39]:
Yeah.

Anatoliy [02:31:39]:
That are gonna do shit.

Eldar [02:31:41]:
Well, GameStop. Yeah.

Sergiy [02:31:42]:
That stuff was amazing.

Eldar [02:31:44]:
Yeah. They got together and said, yo, that's it. Yeah, let's fuck over the shorts, the people that shorted that shit. Yeah. And the short sellers have to buy the stock back. And when they... all the institutions were like, holy shit, we have to cover our fucking asses, because it's going to the moon. They bought more, and it kept climbing.

Eldar [02:31:58]:
Bought more, kept climbing.
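
The GameStop dynamic Eldar describes is a feedback loop: covering a short position means buying shares, that buying pushes the price up, and the higher price forces the next group of shorts to cover. A minimal sketch of that loop, with made-up share counts, pain thresholds, and a deliberately crude price-impact rule, might look like this:

```python
# Toy short-squeeze loop: short sellers are forced to buy back ("cover")
# once the price rises past their pain threshold, and that buying pushes
# the price up further, forcing the next group to cover.
# All numbers and the price-impact rule are invented for illustration.

def short_squeeze(start_price=20.0, retail_buy=100_000,
                  impact_per_share=0.00003, rounds=15):
    """Run a crude squeeze simulation and print each round of covering."""
    # Ten groups of short sellers, 100k shares each, pain thresholds $22..$40.
    shorts = [(22.0 + 2.0 * i, 100_000) for i in range(10)]
    # Coordinated retail buying kicks the price up first.
    price = start_price + retail_buy * impact_per_share
    for r in range(1, rounds + 1):
        forced = [(t, s) for t, s in shorts if price >= t]
        if not forced:
            break
        covered = sum(s for _, s in forced)
        shorts = [(t, s) for t, s in shorts if price < t]
        price += covered * impact_per_share  # covering is itself buying pressure
        print(f"round {r}: {covered:>9,} shares covered -> price ${price:.2f}")
    print(f"remaining short interest: {sum(s for _, s in shorts):,} shares")


if __name__ == "__main__":
    short_squeeze()
```

With these invented numbers, each round of forced covering is bigger than the last, which is the "bought more, kept climbing" effect in miniature.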

Anatoliy [02:32:00]:
Yeah. That's crazy.

Sergiy [02:32:02]:
All these institutions became bankrupt.

Eldar [02:32:03]:
Yeah. Those institutions got fucked on that.

Anatoliy [02:32:06]:
Like, what was... was that, like, on Robinhood?

Eldar [02:32:08]:
Right.

Anatoliy [02:32:09]:
And those things or.

Mike [02:32:09]:
No, Robin had problems with it.

Eldar [02:32:12]:
They.

Sergiy [02:32:13]:
Yeah, well, Robinhood is the company people used for it.

Eldar [02:32:18]:
Yes.

Sergiy [02:32:19]:
They stopped it. Which also should have been illegal.

Eldar [02:32:21]:
Yeah, something. Yeah.

Sergiy [02:32:23]:
Even Coinbase did that. I forgot which coin was pumping. Like, they shut down the servers, and I couldn't sell.

Eldar [02:32:34]:
Yeah.

Sergiy [02:32:35]:
I had a shitload of money. I'm like, I can't sell. By the time they fixed all the issues, the coin crashed.

Eldar [02:32:40]:
Yeah.

Sergiy [02:32:40]:
Because I was catching it on the way up. And by the time they fixed the servers... because they didn't want to lose their money.

Eldar [02:32:45]:
Yeah. So.

Sergiy [02:32:46]:
Which also should be illegal.

Anatoliy [02:32:48]:
Yeah.

Sergiy [02:32:48]:
Because they cash in like crazy, all this Coinbase and everything, on the pumps. But then they stopped it when they needed to. I remember when Bitcoin was going crazy, they would take the servers down whenever they wanted, and the coin, like, crashes, and they were like, okay, you know. They just, you know, sold it on the app themselves and made all of us suffer, made us not able to sell. It was crazy. So anything that's controlled by people is all bullshit, you know?

Eldar [02:33:11]:
But one scam that's already going on, Mike, is FDA approval, bro.

Sergiy [02:33:14]:
Oh, yeah, that's a scam.

Eldar [02:33:15]:
That shit, bro. That shit is all pump and dump, bro.

Sergiy [02:33:18]:
Mm hmm.

Eldar [02:33:18]:
Yeah.

Anatoliy [02:33:19]:
How many of those FDA approvals actually end up doing something?

Eldar [02:33:21]:
Nothing.

Sergiy [02:33:22]:
All those FDA.

Eldar [02:33:23]:
Yeah.

Sergiy [02:33:24]:
All those organic... all that shit. It's so BS.

Eldar [02:33:32]:
All right, so, final thoughts. AI surge, yay or nay? Hell yeah. Oh, hell yeah. Hell, yeah. Okay, cool. Like you said, you kind of.

Sergiy [02:33:42]:
I'm a futurist and technologist.

Mike [02:33:44]:
Yeah, yeah. I want to see where it's gonna bring us. Yeah, you know, what we're gonna make of it.

Eldar [02:33:49]:
Yeah, I think it's super necessary. Yeah, it's super necessary. And obviously, it's inevitable, probably.

Mike [02:33:53]:
Yes, it is.

Eldar [02:33:54]:
It's inevitable, so, you know, it is what it is. Yeah. Yeah. All right, cool.
