
Technology is Humanity

Mar 27, 2022 · 1 hr 9 min · Season 3 · Ep. 2

Episode description

The way we normally think of technology is as an extension of what humans do in the absence of technology. Technology extends our eyes, ears and muscles, and also our memories and even our thinking. But in this episode I argue that technology and humanity are actually one and the same; that technology is not an extension of humanity but rather an integral, and inevitable, part of human evolution. I'll be touching on a number of fundamental patterns and concepts, including external stressors, emergence, complexity thresholds, feedback loops and negentropy (negative entropy). I'll end with some deep questions about the fate of humanity, and ultimately, what it means to be human.

Support the show

Become a Member
nontrivialpodcast.com

Check out the Video Version
https://www.youtube.com/@nontrivialpodcast

Transcript

Everybody, welcome to Nontrivial. I'm your host, Sean McClure. In this episode, I'm going to look at the connection between humanity and technology. Now, normally the way we think of this is as technology being an extension of what humans do in the absence of technology. I'm actually going to argue that technology and humanity are essentially one and the same thing.

I'm going to be touching on a number of interesting patterns and concepts: things like external stressors, emergence, complexity thresholds, feedback loops. Some of these might sound familiar from previous episodes, along with something called simultaneous invention, Schrödinger's paradox, something called negentropy, self-organization and adaptation. So there are a lot of really interesting concepts to discuss in this episode, including this week's book, which I personally think reflects some of these patterns.

This is not the opinion of the author of the book, but my opinion of it. The book is What Technology Wants by Kevin Kelly. I think it's a really interesting example of viewing technology as essentially a complex system, and therefore taking on lifelike properties, and therefore doing things like, quote unquote, wanting.

So I'm gonna talk about how it makes sense to think of technology as having a want, a pull to it, and what that means for humanity. I think it's gonna be a really interesting episode. It has lots of important and interesting concepts and patterns associated with it. Let's get started. OK. So technology is humanity. What do I mean by that?

I just want to call out right now that I am going through a slide deck I've put together, which has a number of pictures and words that anchor what I'm talking about. So for Patreon subscribers, you can head on over to Patreon, and if you're willing to pay $5 a month, you can get this extra content. The audio is the exact same. Anyone listening to this podcast for free gets the exact same content from an audio perspective.

But if you would like to see the slide deck that I'm using, which again has the visuals that anchor what I'm talking about, it might help you understand some things. That's a bit of extra content I've decided to put out there on Patreon for subscribers who would like to pay a little extra to get a little bit more. But the audio is exactly the same.

So I'm throwing that out there: head over to Patreon if you'd like to see the deck that I'm using. I have an overview deck that just talks about the concept, and I've already mentioned in the introduction the patterns and concepts I'll be talking about, things like external stressors, emergence, complexity, a lot of the themes that I bring up throughout Nontrivial in general. On the left, I give an overview of this presentation.

So we've got technology as an extension of life. We've got life and complexity. We've got this thing called the technium, which is a word Kevin Kelly uses in his book What Technology Wants, and I think it's a good word. It basically frames all of the technology currently available today as a singular entity, right? And so it makes you take on this perspective of technology as this almost breathing, living thing. Does it make sense to do that? And what does that mean?

And we'll move into this idea of, can technology want? I'll touch on concepts that have to do with simultaneous invention, and I'll talk about why I'm bringing that up; it was also brought up in the book What Technology Wants. We'll talk about Luddites and technophobes, and how they actually do use a fair amount of technology and just draw the line somewhere. And then there's this concept of negentropy.

So this is essentially the opposite of entropy, entropy being this push towards disorder and chaos. Living systems tend towards the opposite of that: structured, ordered material and matter and information. That will get into this idea of evolution and adaptation, and ultimately the argument that I'm gonna be making in this episode: that technology is essentially a continuation of the evolution of humanity.

In other words, it's not really this different thing. It's just a continuation of what has always been. Anyways, I'll get into that and tell you what I mean, and also what Kevin Kelly meant in his book. And we'll look at the endgame, this idea of technology being beautiful, and then ultimately, technology is humanity. That might be a little off-putting to some, because people tend to think of technology as being quite different from humanity.

Again, it's this extension, and it's useful, but it's definitely not the same as humanity. So what do I mean by that? OK. So yeah, again, this week's book is What Technology Wants by Kevin Kelly. I'll have some other references at the end that people can look at and click on, some papers related to the scientific concepts I'll be discussing. But as usual, I will try to keep things fairly conceptual. OK. So let's begin with technology as an extension of life.

This is definitely the normal way of thinking of technology, right? Technology extends our eyes, our ears, our muscles. If we pick up a smartphone, we start talking to family, friends, business partners, whoever, we're obviously going to hear them, people that could be thousands of miles away, and be doing that virtually instantaneously.

Same thing if we jump on a Zoom call: we're gonna be seeing people in essentially real time, as if we were in the same room. Now, it's not the exact same experience; we know there are definitely some missed aspects compared to being physically in the same room as someone. But we are essentially extending our eyes and our ears beyond what they would be able to do if we did not have that technology. We extend our muscles as well.

So ever since the industrial revolution, we have created machines that help us build bridges and lift cinderblocks and lift heavy weights and draw cables over very large distances and pave roads and whatever, all of the construction industry and all that. We are extending our muscles beyond what we would be able to do as humans alone, even as collections of humans, right?

It allows us to build things we otherwise would not be able to build and we're really extending all of our senses.

You know, I mentioned eyes, ears and muscles, but the technology that we use, especially today, there are tactile technologies, there are all these different kinds of gadgets that can really extend all the senses that we have as humans. And beyond just the senses, we're also offloading our memory and thinking to machines, especially in the last couple of decades, right?

So computing technology, in the age of information that we are currently in, is something that allows us to offload much of our memory. I mean, it used to be, obviously, way back in the day, that people would gather around the campfires and they would tell stories.

And so we all know that idea, that storytelling was the way you passed on knowledge, the way you built up culture. Now there's a lot less storytelling and a lot more just offloading information explicitly to machines. All right. So that's how we organize our files, that's how we organize our photos, that's what we do if we want to know how something works or what something is all about.

You go to the Wikipedia page, you go to the article, you go to the scientific journal, you read the book, whatever it is. We offload that essentially to software, and ultimately to hardware, that stores that information for us. So there's much less of this demand to memorize. And it's interesting, if you look at these memory competitions.

If you want to know how good human memory actually is, go Google memory competitions, and you can see people who will take 10 decks of cards and memorize them all in one sitting, or they will take a big chunk of a book, whip through it, and then say it back verbatim, or someone will give them a bunch of phone numbers or a bunch of images, whatever it is. They literally have these competitions, and they hold them every year.

And people can do what look like astounding feats of memory. But really what they're tapping into is something quite natural. And I think that's something that's been lost over the years, because we offload our memory much more to machines. Why bother spending so much time trying to memorize things when we can just put it in our smartphones, right?

But not just memory, also thinking, the processing itself, especially as we get into technologies like machine learning, which is essentially a narrow form of quote unquote AI. By using AI technology, it's doing actual decision making, right?

Based on that information, it's taking all that data, it's discovering patterns, making these statistical correlations, and then being able to set decisions on top of those correlations. And that's, even if it's just an analogy, analogous to what the mind does, right? We take in a bunch of information from our surroundings, and we base decisions off of it. Well, a lot of that is now getting offloaded to machines.

And that's just because we're in the information age; there is way more information than we can handle using our brains, which haven't changed in 50,000 years. And so we offload things. So it extends humanity. Technology is an extension of life; it makes sense to think of technology in that fashion. But there's something problematic with the term extension. If you extend something, you're essentially supercharging innate abilities, right?

That's one way of thinking about it. Take what your eyes can do, but then go thousands of miles; take what your ears can do and do the same thing; take what your muscles can do and multiply it by quite a bit in order to lift a lot more. But technology has always been a part of humanity. It has never not been a part of humanity, right? I mean, technology precedes language, right?

From the very beginning, we picked up the sticks, picked up the rocks, and started to smash things open in order to gain access to the resources, gain access to the food, the water, the insects, or whatever it is we were doing. And it's not even just humans, right? I mean, chimpanzees have this ability; ravens and crows, I think, have this ability, right, to grab a tool and to use it.

But humanity, definitely more than any other species, has always, always used tools, and those tools are technology, right? It goes beyond what the human body can do naked in order to survive. In fact, you would be hard pressed to define humanity in the absence of technology. It wouldn't even really make sense to do so. Humans can't survive without technology. Our bodies are probably the least equipped in the entire animal kingdom to survive without technology, right?

What we've of course been blessed with is a mind that's able to create, and that's very much how we survive: we use our minds to gather resources and to mix and match them in new ways to create gadgets, implements that we use to change our surroundings and to survive in those surroundings. So technology has always been part of humanity. It's an inherent part of our definition.

So technology as an extension of life kind of doesn't make sense, because life itself, at least when we're talking about human life, is technology, right? It's one and the same. There's no way it exists if we don't have technology, if we don't have our tools. So let's think about technology now as a mechanism of evolution and adaptation.

I think that's a better way to think of technology: not as something where, hey, we've been evolving as humans, and now we're in this world of technology that very much extends that natural state. I think technology is the natural state; it has always been there. And so technology is perhaps better viewed as a mechanism of evolution and adaptation, something that's always been part of our evolution.

In other words, humans adapt by the tools they create more than by changes to their physical makeup, right? When we think about adaptation in biology, we're obviously thinking about all the different species, including ourselves. And we think about, well, how has this species gone through physical changes, and why has it gone through those physical changes? And then you can take a look at the stressors that are going on in the environment.

And you can say, OK, it kind of makes sense, because maybe this happened, and then the species no longer has access to this resource, or this type of thing got wiped out, and now this version of the species is more adapted to that environment. So they're the ones that pass on their genes, and they're the ones that evolve. We think about adaptation pretty much principally in terms of the physical makeup of the species, of the organism.

But humans are adapting almost more so. I mean, obviously we go through physical changes as well, but I would say it's more about the tools that we create than changes to our physical makeup. And the rate of human evolution, because of that, is actually far more rapid than that of other species. And that's because we don't just adapt to external stressors; we actually create our own external stressors, right?

Every time we gather resources and create tools, we start to change the landscape. And of course, this is much more true in, let's say, the last 100 years than in all of previous human history.

You know, we are entering this kind of new era where we are literally changing the face of the earth, right, with the technologies we create and the things we create with those technologies: the roads and the buildings. And obviously, since the industrial revolution, we have really refaced the earth. And so that creates a new environment which we now have to adapt to, and the way that we continue to adapt to it is by creating new technologies, right?

So if we have more pollution, more wiping away of resources on the negative side of things, and then maybe there are positive sides as well to the things we have created, maybe access to new resources that we didn't have before, new opportunities for people. Whether it's good or bad, we keep creating new surroundings, new external stressors that we have to adapt to. And in order to adapt, we don't wait for our physical makeup to change.

We create new technologies in order to do it, right? That could be different types of clothing to resist changes in temperature, filtration systems to process the water, whatever it is; you can go on and on. We create, and that changes the environment, and that changes the external stressors, and then we have to adapt, and the way that we adapt is to create more tools.

And so this kind of gets into the feedback loop that I'll be talking about in a bit. But again, I don't think it's technology as an extension of life. I think it's always been like this; it's just much more rapid now, much more prevalent now. Using tools is what it means to be human. It's an absolute part of our makeup.

So let's think about this in terms of the complexity threshold. I brought up this concept in other episodes, but specifically the complexity threshold and its relation to the emergence of lifelike properties. So, life and complexity: past a certain point, when enough components are interacting, we can say that the properties of any system start to become dictated by emergent phenomena, right?

So in a simple system, you've got, you know, billiard balls bumping into each other; you can think of maybe an atom or something, if you remember from chemistry, something you can visualize explicitly: here are the components, here are the pieces. Well, the properties of those simpler things will be dictated by those components, right?

And that's why reductionism, as they call it, has been so successful in science up until recently: because you could reverse engineer something, pick it apart, look at its components and say, well, that's why it produces what it produces. That's why we see what we see, hear what we hear, measure what we measure: because of those components.

But when a lot of things come together, that's no longer the case. What you're measuring, what you're observing, what you're experiencing are phenomena that don't trace back to those individual components. The individual components are still there; they have to come together. But what they do when they come together is not accessible to us, right? There's this causal opacity there. So we don't really know.

So past a certain threshold of the number of components coming together, i.e., complexity, a system takes on characteristics that are actually much more lifelike, because when we say emergent phenomena, that's more the realm of life, right? So we start talking about things like emotions and behavior, social behaviors, things that are just way too complex to trace back to their fundamental roots. Not that people don't try.

It makes sense that past a certain complexity threshold, we can start to assign human-like characteristics to systems. OK. And so this is gonna be leading into this idea of assigning lifelike properties to technology itself, because technology today is a bunch of components that are interacting in a complex fashion.

So when people start to assign lifelike characteristics to technology, like it thinks, it wants, it feels, it moves, it seems like it's got a purpose, people tend to brush that off and say, oh, you're just anthropomorphizing, right? You're just assigning human-like characteristics to inanimate objects, because that's what we like to do, and that's the way we like to think of the world. No, I don't think it is anthropomorphizing at all.

I think it makes sense to assign human-like characteristics (how close they are, we can debate, but human-like characteristics) to technologies, to systems of sufficient complexity. Because after all, isn't that what humans are? Systems of sufficient complexity, past a certain threshold, such that what we do is emergent.

And that's why these reductionist approaches to science don't seem to make sense when you try to understand things like human behavior and emotions and falling in love and quick decision making. Those things don't lend themselves to that old school reductionist approach to science, because you can't pick apart the components to understand how the output was produced. OK. So technology should be expected to take on lifelike properties.

OK. Not, let's say, from 100 years ago and before, because the technology we used didn't have enough intricate components all talking to each other; it didn't reach that complexity threshold. So nobody looks at a hammer and says, yeah, that's got lifelike properties. Nobody looks at a conveyor belt that way, or even a rocket, you know, a rocket engine.

If you peel back the shielding, it's very, very, very complicated, but it's not complex, because you can debug it. No matter what goes wrong in a rocket engine, you can always debug it, because it's not complex, it's complicated, right? But complex things cannot be debugged like that, because they are opaque. You actually don't know how the outputs are produced in a genuinely complex system.

And that threshold you pass from complicated to complex is when any system, including technology, would be expected to take on lifelike properties. OK. So that gets us into this idea of the technium, as Kevin Kelly calls it in his book What Technology Wants.

And that's why I'm using this book as an example of some of these patterns: because he frames technology as a collection of components that are largely interacting, and therefore we can think of technology as a singular thing. Not just as all the components themselves, but as basically this big glob that comes together, that heaves in and out, and seems to breathe, and seems to have a mind of its own. And again, that's not so much anthropomorphizing.

Rather, this is an extremely complex system, and it should be expected to take on the properties of complexity, which means it's going to look like life. Now again, we can debate how close it is to life, how close it will ever be able to get to life. But there's something wrong with that debate, isn't there? Because we're still separating this notion of technology and life.

And what I'm arguing in this episode is that we shouldn't really have that distinction, or at least that's something to think about. Maybe that distinction is not correct. Maybe there is no distinction; it's a distinction without a difference, right?

Humanity is technology; technology is humanity. The total collection of all technology on earth we can call the technium. It's this highly interconnected, complex, quote unquote alive thing, and therefore it takes on the properties of complex systems: things like emergence, nonlinearity, causal opacity. That's not how most people think about life, but that is what life is, right? When we say emergence, it's that things are happening in life that you can't explain.

When we say nonlinearity, it's that a bunch of people come together to produce an outcome that no individual could have produced on their own. And not just that, but if you were to just sum the contributions of all those individuals, it still wouldn't produce the thing that you saw happen. OK? For the input that goes in, you get this very different output. That's nonlinearity. Causal opacity is the inability to explain things.
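To make that "the whole isn't the sum of the parts" idea concrete, here's a toy sketch of my own (not from the episode): each component contributes linearly on its own, but once components interact, the combined output includes interaction terms that no summing of individual contributions accounts for. The function names and the pairwise-product interaction are purely illustrative assumptions.

```python
def alone(contribution):
    # A component in isolation produces output equal to its own input.
    return contribution

def together(contributions):
    # When components interact, each pair adds extra output on top of
    # the simple sum -- a crude stand-in for nonlinear interaction.
    base = sum(contributions)
    interaction = sum(a * b
                      for i, a in enumerate(contributions)
                      for b in contributions[i + 1:])
    return base + interaction

parts = [1, 2, 3]
print(sum(alone(c) for c in parts))  # 6  -- summed individual outputs
print(together(parts))               # 17 -- the interacting whole
```

The point of the sketch: knowing each part's solo output (1, 2, 3) tells you almost nothing about what the interacting system produces.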

So much of the mystery of life is really the causal opacity of complex systems, right? A lot of things are happening, because we witness them all the time, and yet we can't seem to explain them very well, right? Not the way you would with, say, pure physics or pure mathematics, where you can easily pick apart the system and attach your mathematics to it.

Explain this, explain that. For a lot of things in everyday reality, you can't do that to that level of precision, right? So the technium, as this blob of technology, has emergence, nonlinearity, and causal opacity. And you can actually see this in things like the electrical grid too. But if you want an example of what the technium is, I would actually say the best example is the internet.

I mean, if you think of the technium, you should probably envision in your mind essentially the internet, right? Because you have millions and millions of machines talking to each other, and there's no way you're gonna trace back the patterns that will emerge from that.

OK, so that's a good way of thinking of the technium, but you also see it in things like the electrical grid, right, which of course is in some sense connected to the internet as well. Again, the technium is the full collection of technology and its interconnections. The electrical grid will have patterns that emerge that nobody programmed into it or engineered into it. They just start popping out.

Because essentially, when you reach enough complexity, enough components talking to each other, you're gonna get patterns that are emergent. Nobody engineered them. They didn't come from anywhere. They have their own push and pull and want. OK? So it's not anthropomorphizing. Now again, we can debate whether or not today's technology has reached that level, but this idea that large systems can take on those properties is of course absolutely true.
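A classic toy illustration of patterns nobody engineered (my example, not one from the episode) is Rule 110, an elementary cellular automaton: every cell follows one tiny local rule about its two neighbors, yet the global pattern that unfolds is famously intricate and was never written down anywhere in the rule itself.

```python
RULE = 110  # the rule number's bits encode the output for each 3-cell neighborhood

def step(cells):
    """Apply the Rule 110 update to every cell, wrapping around at the edges."""
    n = len(cells)
    return [
        (RULE >> ((cells[(i - 1) % n] << 2)
                  | (cells[i] << 1)
                  | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

row = [0] * 40
row[20] = 1  # start from a single "on" cell
for _ in range(12):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Running it prints a growing, irregular triangle of structure out of one cell and one eight-bit rule: a small-scale version of "enough interacting components, and patterns pop out."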

So can technology, quote unquote, want? Well, I think a way of really arguing this, beyond what I've already done in talking about lifelike properties that can emerge in collections of technology, is to think about it as a feedback loop. I think the best way to picture this is as the internet. And again, for people on Patreon, you'll actually be able to see the slide deck; I've got some visuals here that show this concept.

We've got this blob that is the technium, which is all the technologies, a lot of them talking to each other, this complex system, and we continue to add to that technium, right?

We're posting photos, we're writing text messages, we're going online, we're sending emails, and on and on. Businesses are talking to each other, banks are doing transactions; there's an absolutely massive amount of information being sent back and forth across millions and millions of different machines every second, right?

And so we are constantly putting resources, if you will, into the technium. The thing that the technium quote unquote wants is information; it wants data, because that's its currency. And so we keep putting that in and putting that in. But it's not a neutral thing: as we put that in, those data are getting turned into new value, right? By mixing and matching. We put that in, and algorithms start making decisions off it.

Businesses, humans, are involved, making decisions off that information. It's getting mixed and matched, it's becoming new products, new pieces of value, different types of information, right? Whether it's a new song, a new presentation, a new movie, a new email that gets sent, on and on, it's mixed and matched and constantly driving new value. That increased value in turn draws us towards using more technology, right?

Because as all that data we give it gets mixed and matched and produced into new products and services, we consume those products and services, and that makes us put even more data back into the system. So we've got this feedback loop with the technium, the way humans interact with the technium. Again, this framing is somewhat problematic, because it's impossible to have this conversation without distinguishing between humanity and technology, and I'm doing it right now.

I'm saying you've got humanity, and then I'm saying they're interacting with the technology. But think of it as humanity really being subsumed into technology, right? We're just part of that system. That's what facilitates the communication, but it creates this kind of feedback loop, right?

It produces value, we use that value, we add more to the system, it produces more value with that data, then we use it, and on and on. And so that's a kind of pull that the technium has: the collection of technology is pulling us, drawing us into it, to use it more and more, and we're giving it what it wants so that it can do that. And so when we say, can technology want, it's not such a crazy idea, is it? Because what is a want?

It's a pull, and this feedback loop, as we're involved with technology, allows that pull to take place. Now, to take a slight sidestep: if you think about this, it might scare a lot of people. You say, oh man, I see what you're saying, there's this kind of feedback loop; it's almost like we're caught in it. And is that problematic? Now again, I think it's always been this way.

We've always been in this feedback loop of creating something, getting value, using the new value, adding more to the system, cycle after cycle. It's just that in the last 100 years, and even the last 10 or 20 years, it seems to be quite supercharged, right?
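The loop described above (data fed in, mixed into new value, value pulling in more usage, more usage generating more data) can be sketched as a toy compounding process. The function and its numbers are purely illustrative assumptions of mine, not a real model of the technium:

```python
def simulate(steps, data=1.0, conversion=0.5, draw=0.1):
    """Toy feedback loop: data -> new value -> more usage -> more data."""
    history = [data]
    for _ in range(steps):
        value = conversion * data  # data gets mixed and matched into new value
        data += draw * value       # that value draws in more usage, i.e. more data
        history.append(data)
    return history

h = simulate(10)
print(h)  # each round compounds on the last: multiplicative, not additive, growth
```

The only point of the sketch is the shape of the dynamic: because each round's output feeds the next round's input, the pull compounds, which is one way to read "supercharged."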

And I'll talk a little more about that later. Really, since computers were invented and information technology took off, it's entered this entire new category of technium and of feedback loops. So it can scare some people, even though it's always been here. It can make us question, as we're doing now, what this all means for humanity.

And I think that's why we get into the banning of certain technology. Something Kevin Kelly talks about in What Technology Wants is this idea that with banning technology, in some sense we understand why people do it, because we're scared, and we do need to be careful, and I think regulation around technology is important. But there's also this ultimate inevitability to technology, because it's always been here and because it's an inherent part of the makeup of being human. Again, you'd be pretty hard pressed to define humanity in the absence of technology.

So banning technology is really a futile effort. That doesn't mean you shouldn't have regulations, but the outright banning of it is not a real solution, because it's going to happen anyway. Now, what is the mechanism behind that?

Well, remember I had an episode a while back called There Were No Giants, Only Shoulders, and I talked about this fact that we like to assign owners or creators, individuals, to a scientific theory or a technology: we've got Einstein for relativity, we've got Isaac Newton, we've got Alexander Graham Bell. Whatever the technology or the theory is, we like to say, oh, that's the person who came up with it.

But it's a bit ridiculous, isn't it? Because humans are so interconnected, constantly sharing information, ideas set upon ideas, hit upon ideas, that eventually, statistically, someone pops out as the person that, quote unquote, made the discovery. But there's this mountain of contribution that went in before they actually made the discovery. So the name we attach is really just a label to anchor it throughout history, right?

So I talked about that before. Well, that's related to this idea of why banning doesn't really make sense. Any invention or theory is guaranteed to be discovered eventually. OK? You say, well, no, you needed Einstein for relativity. No, that's a ridiculous statement. I'm sorry, that's just a ridiculous statement. Now, maybe we needed Einstein to discover it that year, right, 1921 or whatever. Maybe we needed him, right?

But within plus or minus ten years or so, any invention is guaranteed to occur. And that's because the invention, or the theory, is not something that precipitated out of an individual's genius mind, as we all like to think. I'm sorry, that's just not the way it goes, right? Statistically, there is enough information, enough cultural backdrop, whatever has to be in place for something to pop out, and it's going to pop out. Right?
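That statistical claim can be sketched with a toy simulation. All the numbers here are invented for illustration: many independent "inventors", each with a tiny per-year chance of hitting on an idea. The point is that with enough people, the date of first discovery concentrates into a narrow window, regardless of who specifically gets there first.

```python
import random

def first_discovery_year(n_inventors=1000, p_per_year=0.0005, max_years=200):
    """Return the year in which the first of many independent inventors
    hits on the idea. Probabilities are made-up illustration values."""
    for year in range(1, max_years + 1):
        # does at least one inventor succeed this year?
        if any(random.random() < p_per_year for _ in range(n_inventors)):
            return year
    return max_years

random.seed(0)
trials = sorted(first_discovery_year() for _ in range(200))
print("median discovery year:", trials[len(trials) // 2])
print("spread (max - min):   ", trials[-1] - trials[0])
```

With these numbers the discovery almost always happens within the first handful of years, and re-running history 200 times produces a tight spread, which is the "plus or minus ten years" intuition in miniature.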

And really the only difference, if person A versus person B discovers something, is maybe the specific manifestation or look, depending on who discovered it. So, you know, this person discovers the light bulb in history and we attach that name to it and say, OK, well, that's why the light bulb looks the way it does.

You know, if that person wasn't there, or they got killed in a bus accident or whatever, someone else is going to discover it. The light bulb then might look a little bit different, because they're bringing maybe their personality to the invention. There are these little accents in history that kind of lock in a certain look and feel of something.

But if you don't think the light bulb would have been invented, or that relativity wouldn't have been discovered as a theory, that's just ridiculous on its face. There are just too many people, too many components in the system interacting and talking, too much cultural backdrop, not to allow the discovery to precipitate out. It definitely will.

And so banning doesn't really make sense to apply to a technology, an all-out ban, because you can't isolate the discovery. You can't say, well, stem cell research is happening over here, so we're going to ban it and that's going to stop stem cell research. Well, no, it's not. Another country with different borders and different rules is going to invent that. You know, atomic weapons, we have to ban them.

There are going to be atomic weapons, there are going to be nuclear weapons, right? Genetic this, genetic that, whatever it is. Regulation, 100 percent, we should have conversations about this, but banning is not gonna happen. That's because statistically the invention is going to precipitate out somewhere, and it's eventually going to precipitate out somewhere which has, you know, different regulations than your country.

OK. And this is called simultaneous invention in the book What Technology Wants by Kevin Kelly, right? And I like the way it explains that, but it's very similar to the pattern that I talked about previously in There Were No Giants, Only Shoulders.

So the reason why I'm bringing this up is because there's this inevitability to technology, and it's not useful to talk about technology in the context of, it's bad and we have to stop it. You're not going to stop it. Saying you're going to stop technology is like saying you're going to stop humanity itself. It is the exact same thing, because it's baked into the makeup, into the definition, of what it means to be human.

And if you try to stop something somewhere, it is 100 percent guaranteed to pop up somewhere else. 100 percent guaranteed. If it's physically possible, it's going to happen. So it's not a real solution when people take this hard, binary stance on something and say we have to ban this. It's not gonna happen. You might ban it now, you might delay it, right?

If you're the most powerful country with the most influence and you decide to ban something, you'll definitely delay something and maybe that's worth it, but you're not going to stop something. So we have to deal with the inevitability of technology. We have to accept that and then use that as our starting point when we start talking about things like how to best use it, how to regulate it.

But I bring this up because, going back to What Technology Wants, technology is going to take its course, technology is going to become, and there's nothing humans can do to stop that, because that would be the same as stopping humanity itself. So I wanna just keep that inevitability in mind as we go on to some of these other topics. Um, what about the Luddites?

Uh, the technophobes, the people that shy away from technology, they don't want to embrace the latest thing, and maybe they don't want much technology at all. Maybe they think technology is all bad. You know, Kevin Kelly talked about the Unabomber's manifesto, right? Which was very much around technology being almost an evil, right? That it's causing us, again, to lose our humanity.

Um You know, again, let's go back to the inevitability of technology. Let's go back to the fact that technology is in the ultimate makeup of being human. There's something wrong with this technophobic idea, even though it's also understandable, right? I mean, technology does have a cost to it. If we embrace all the technologies, it does feel like we're losing something, whether that's the ability to speak with humans. I mean, think about people all looking at their cell phones, right?

And not, you know, engaging and not interacting. There is a cost to it, but there's also a gain. It's never just a cost; there is something gained. You're connecting people across miles, you're connecting people in ways you wouldn't have. It's never just a cost, right? There's a cost and a gain. So Kevin Kelly in the book What Technology Wants was talking about the Amish community as kind of a quintessential example of people that tend to shy away from technology, right?

Or at least they don't want to use the most recent gadgets. But then he went and visited these Amish communities and saw, look, they're using technology all the time. They're not hating on technology. They're just more judicious in their approach to what they decide to use and what they decide not to use.

And he also looked at Wendell Berry, who is kind of this famous, I'm gonna say Luddite, I don't know if that's really the appropriate term, but he's essentially this farmer, and he writes books, and he talks about, you know, removing yourself from a lot of the most recent technologies and how that's a good thing to do, kind of more living off the land type of thing.

But whether it's the Amish community or Wendell Berry, technology is granting Luddites their freedom to hold these opinions, right? Because in order to have the ability to separate yourself out, other aspects of society need to be taken care of via the technology that exists. In order to separate yourself from the thing, that thing has to be able to exist without you, and it does that via technology.

So really what I'm saying is that the technology is always there to afford you the ability to have that opinion. It's kind of like somebody who argues against war all the time. Right now, in the USA, Canada, whatever, in a very free country, you can speak easily about how war is bad and violence is bad and, you know, we shouldn't go to war.

But at the end of the day, the reason you have the freedom to hold those views is because people went to war. That's just the truth. That's just the reality. Now, that doesn't mean we can't change it. Maybe there's a way to stop it going forward, but it's still the reality that the freedoms that you have to express your hatred of something are often afforded you by the thing that you're hating.

And that's just the reality, and that's just the interconnectivity of all the things. You gotta step back and look at the bigger picture. So technology very much affords the freedom that Wendell Berry and the Amish community have to exist the way that they do. There's a bigger conversation around this, to really get into that and how dependencies work.

But what I want to highlight here, which Kevin Kelly brings up in the book, is that if you take a look at someone like Wendell Berry, it's not that he's not using technology; he's really drawing the line at the technology from his childhood.

OK. So if you went back to his childhood, to around the year he was born, and you looked at the farm equipment that was being used at the time, that's kind of what he just stuck with, which is still far more advanced than the technology that preceded it. So this idea that you can remove yourself from technology at all is also problematic. I mean, you can't do it anyway, right?

You really just have to draw the line somewhere and wherever you draw that line today is still gonna be far more advanced than thousands and thousands, technically, millions of years of human evolution that came before that. So you are not separating yourself from technology. It comes back to this ultimate inevitability of tech. OK. Let's move on.

I just wanted to touch on the inevitability of it all, and I'll circle back to that when we start talking about some of the human evolution pieces later on. I want to talk about this idea of negentropy. It's kind of a funny word. So negentropy would be the opposite of entropy. And entropy is this push towards disorder, right?

Things tend towards disorder; they tend to decay, towards chaos. If you just leave something and you don't maintain it, if you don't care for it, it will degrade, right? It will decompose, it will separate, things start to fly apart. Things don't tend to just come together, right? They break, they decay, they decompose, they go from compact, ordered things to spread-out, disordered things. OK? That's the idea behind entropy.

It's actually built into the second law of thermodynamics as really a fundamental truth of reality: things will always tend towards increased entropy. Well, what's interesting is, if you think about life, there seems to be this cosmic fight against that, right? Because life is much more ordered and structured. Things don't just fly apart. Eventually they still will, because we all die.

But during the course of a human life, and humanity in general, society aggregates together; it creates societies, it creates civilization, it creates structure. And so we've got this kind of paradox where entropy is supposed to always be increasing, but life seems to fight against that and create order. And a term for that is the Schrödinger paradox.

Because the physicist Erwin Schrödinger, back in the day, was using some of his theories to talk about what the role of entropy is in life, and how life seems to push against it. Order is organization, structure and function, and it's the opposite of this natural tendency towards randomness and chaos. So the evolution of biological systems occurs in the direction of increased complexity.

And think about increased complexity in this context as actually increased order and increased structure, because more things come together and start, you know, talking to each other, interacting; there's a sharing of information and matter. And so survival is the adaptation of species to their environment so as to minimize entropy production.

So we've got this kind of paradox here, right? There's this fundamental thermodynamic rule in the universe that says entropy is supposed to increase, but life goes in the opposite direction, this cosmic fight against disorder. So why am I talking about that? Well, entropy only increases or remains constant in closed systems, something we call adiabatically isolated. If something is adiabatically isolated, it's not in communication with its surroundings.

It means there's no energy, matter or information that can enter or leave. OK? But life does not occur in adiabatic isolation, because living systems are actually open systems. And if you have an open system, you can resolve this paradox. It means that you can create structure and order within the system while still preserving this fundamental law that overall entropy will be increasing. So a living system is not in adiabatic isolation.

Life is not just contained in some little bubble. It has the sunshine coming in, it's got the winds and the rains coming in, it's got particles from outer space coming in. It's not an adiabatically isolated system. Life is very much in contact with its surroundings. And because of that, you can have entropy increasing overall while still reducing the entropy, reducing the disorder and the chaos, inside the system. OK. So why am I talking about this?

Well, what this really gets to is this idea that to create, you must destroy. And I think this is something to think about in terms of the technium. I'm not talking about technology yet, I'll get to technology; think about life first. The order produced by life is compensated for by the disorder that's created in the surroundings. So a living organism will preserve its internal order by taking from its surroundings in the form of nutrients and sunlight.
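That bookkeeping can be made concrete with a toy sketch. This uses Shannon entropy as a stand-in for thermodynamic entropy, and the probability distributions are made-up numbers chosen only to show the pattern: a subsystem's entropy can drop as long as it raises its surroundings' entropy by at least as much.

```python
import math

def entropy_bits(probs):
    """Shannon entropy in bits of a discrete probability distribution."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Toy "organism" over 4 states and "environment" over 8 states.
organism_before    = [0.25] * 4                  # 2.0 bits: fairly disordered
environment_before = [0.5, 0.5] + [0.0] * 6      # 1.0 bit:  fairly ordered

# The organism orders itself (entropy drops) by exporting
# disorder to the environment (whose entropy rises by more).
organism_after    = [0.97, 0.01, 0.01, 0.01]     # ~0.24 bits: highly ordered
environment_after = [0.125] * 8                  # 3.0 bits: spread out

org_drop = entropy_bits(organism_before) - entropy_bits(organism_after)
env_rise = entropy_bits(environment_after) - entropy_bits(environment_before)
print(f"organism entropy drop:    {org_drop:.2f} bits")
print(f"environment entropy rise: {env_rise:.2f} bits")
```

The environment's rise exceeds the organism's drop, so total entropy still goes up, exactly the resolution of the paradox described above.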

So growth means returning entropy to the entity's surroundings. So now think about the technium. I said the technium is taking on these lifelike properties: this collection of technology, essentially the internet, but all technologies, all machines, highly interconnected.

With matter, but mostly information, being transferred in and out, caught in this kind of feedback loop, with humanity as an integral part of that, in order to continue to create the value, to give to technology what it wants, it must ultimately create disorder. So in other words, the technium keeps structuring itself, becoming bigger, right?

The beast gets bigger, it heaves in and out, it becomes more ordered, more structured, more components talking to each other, more of an entity, more of a quote unquote living thing. But in order to produce that order and that structure, it must ultimately create disorder elsewhere. So that's something to think about. There's this inevitability to technology, but is it all good? What is the cost of it, and what is being destroyed?

You know, maybe it's planet Earth. That's kind of an easy way to think about it, right? We know that there is an ecological cost to the growth of the technium, without a doubt. We've polluted rivers, we've wiped out species, we've done quite a lot of damage to the earth.

I say we; I mean humans and technology as one and the same thing. Now, that has changed our surroundings, and we've had to adapt to those surroundings, and the way we adapt to those surroundings is by creating new tools and new technology. So the technium, as we call it, continues to grow, but there is a cost to that. And I'm not taking one side over the other. Some people say the cost is too great and we're destroying our planet.

And so this is where, you know, the regulation needs to come in. But now remember the banning, and remember the inevitability of technology, right? In order to make that conversation useful, you have to deal with the inevitability that technology is not going to stop. So what does that mean? It's not enough to just say, well, we're polluting. You know, what are you gonna do? Right? Shut down all the companies, shut down all the products? It's not gonna happen. So what is the solution?

Other people say, look, yes, it's changing things, but you know what? We've always figured it out. We will create new water filtration systems, and the technium will ultimately be the answer. It will ultimately be the solution to these problems. There is collateral damage along the way; species will get wiped out, the earth will have certain things that get damaged. But we will find a way to replenish, we'll find a way out of these problems.

And ultimately, the technium will be the answer. So, something to think about. There is a cost; let's not pretend that there isn't a cost. But let's also not pretend that because there's a cost, you can somehow fight against the inevitability of technology. So this gets us into the role of information in evolution, right?

So I touched, at the very beginning, on this idea that technology and its inevitability essentially mean that technology is just a continuation of the evolution of humanity itself. So in order to think about things like this, and to stop distinguishing technology as being something that's different than humanity, you've got to think in information-theoretic terms. I've used that term before: information theoretic.

So instead of thinking of the underlying physical substrate or physicality of something, just think about it in terms of information. I'll talk about this a little bit more in the next slide. But evolution drives organisms towards higher information content. Notice, for people looking at the slide deck, I've had this same kind of curve again and again a few times. Anyways, I've got it in terms of population growth of humanity.

So as time goes on, population gets to kind of an inflection point and then explodes up. I've got the same thing for complexity, the exact same chart, right, but just a different label: as time goes on, you get this inflection point where the complexity explodes. And I've got the same chart again: as time goes on, information just explodes.
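The shape of those three charts, flat for a long time and then exploding past an elbow, is just what compounding growth looks like. Here's a quick sketch with invented numbers (a 5 percent rate over 200 steps) showing how a small, constant growth rate produces that hockey-stick curve.

```python
# Compounding growth: the curve looks flat for ages, then explodes.
# The 5% rate and 200 steps are arbitrary illustration values.
rate, steps = 0.05, 200
value = 1.0
series = []
for _ in range(steps):
    series.append(value)
    value *= 1 + rate

# Almost all of the total growth happens in the second half.
first_half_gain = series[steps // 2] - series[0]
second_half_gain = series[-1] - series[steps // 2]
print(f"gain in first half:  {first_half_gain:,.0f}")
print(f"gain in second half: {second_half_gain:,.0f}")
```

The second half dwarfs the first by orders of magnitude, which is why the early portion of the chart looks flat even though the growth rate never changed.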

So we've got population growth, we've got complexity, and we've got information essentially being tied together as one and the same. And information is a good way to think about it. So when you think about the evolution of organisms, you know, what is it doing? Where is it going? Well, it's the physicality that's changing. It's moving into different surroundings, maybe the population is changing, it's got higher complexity. Really.

What it is, is a push towards higher information content. And the reason why the information content is higher is because you have more individuals who are communicating. And when you have more individuals who are communicating, it takes on higher complexity. And when you take on higher complexity, you get more emergence, and it's that emergence that allows you to survive.

So a complexity theorist, as you might call him, Gregory Chaitin, actually talks about life having properties of high mutual information, where, you know, different components within a system will share essentially the same information. Self-organizing systems are really an inherent part of the physical world, because the self-organization, the entropy reduction that occurs in life, is intimately tied to the increase of information content.
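As a rough illustration of "high mutual information" (this is not Chaitin's formalism, just standard Shannon mutual information estimated from samples), here's a sketch: two components whose states are perfectly shared score one full bit, while two unrelated components score zero.

```python
import math
from collections import Counter

def mutual_information(pairs):
    """Estimate I(X;Y) in bits from a list of (x, y) samples."""
    n = len(pairs)
    px = Counter(x for x, _ in pairs)
    py = Counter(y for _, y in pairs)
    pxy = Counter(pairs)
    return sum(
        (c / n) * math.log2((c / n) / ((px[x] / n) * (py[y] / n)))
        for (x, y), c in pxy.items()
    )

# Two "components" whose binary states always agree, vs. two with no relation.
shared = [(b, b) for b in (0, 1) * 50]                    # always agree
unrelated = [(i % 2, (i // 2) % 2) for i in range(100)]   # no relation

print(mutual_information(shared))     # high: states fully shared
print(mutual_information(unrelated))  # zero: nothing shared
```

In the "components sharing essentially the same information" picture above, a system with many high-mutual-information pairs is exactly the ordered, coordinated kind of structure being described.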

So I know it's getting a little jargony right now, so I'm gonna back off on that a little bit. But there's a reason why I'm bringing up information here. Think about the progress of life, the evolution of life: as populations increase, particularly with humans, as complexity thresholds get passed, in line with all of that is more information. Information is the currency, and that's what I've got on this next slide here. Information is the currency of really everything.

So if you look at this image that I've got, for those that are looking at the slide deck, I've got humanity on the left, just these icons of people with lines between them, and on the right it's the same thing but with microchips, right, as technology.

And so what I'm saying here is, when you view things in terms of information and processing, the supposed distinction between humanity and technology starts to disappear. Humans are just these groups of components that get together and communicate. Same thing with machines and the internet, or just the technium in general: it's components that come together, which includes humans, but also all the microchips and all the machines and all the physicality that's there.

They're really just components that come together and talk to each other. Humanity and technology, when viewed in terms of information and processing, are essentially the same thing. Humans appear as kind of slower intermediaries in the evolution of information and processing, right?

So if you think about the evolution of humanity, you say, well, you've got humans evolving and evolving, and all of a sudden you've got technology. And then, no, no, no. Humanity and technology have always been intertwined. It's always been about components communicating, it's always been about increased information content, and it's just one evolutionary line for the whole thing.

And if you look at it like that, you say, oh, well, maybe on the extreme end of things, maybe a thousand years from now, whatever, maybe technology as we think about it is the ultimate end of that evolution, and humans almost seem to be this kind of slower intermediary. In other words, the biological humans that come together are just an organic, slow version of components that talk to each other. But now with computers, we have these faster components that talk to each other.

But ultimately, it's the same evolutionary line. It's just components coming together that talk to each other, because that's what the technium wants. That's what technology wants: increased information. It's a drive towards fighting against the entropy increase by lowering its own entropy, by maximizing the mutual information between all the components of a system. OK, it's a very objective way of looking at it, kind of objectifying humans, right?

But if you just think about it in terms of information, it almost doesn't matter if it's humans or machines and the internet. Who cares? It's just this drive towards increased mutual information among components. That's the push, that's the want; maybe that's what the universe wants. OK? It's not anthropomorphizing, it's just saying pieces and information are increasing.

So I've got this next slide where I show humans with hammers on the left and then humans with microchips on the right. Humanity and technology evolve in essentially the same way, especially when viewed in terms of bodies and information transfer, like I just spoke about. So it makes sense that today's technology is a continuation of humanity's evolution. A continuation of humanity's evolution.

Today's technology is a faster version of the communication that humanity has always done, right?

Even with hammers, it's a type of communication, because you're building something: hey, look at what I built, and that's to help us survive, and yada yada. And then you get all the way to telling stories around the campfire, and you get all the way to actual telegraphs, and then telephones, and then microchips and computers and the internet and Zoom calls. It's all the same thing, ultimately. The speed is different, which is why it looks so different.

But it's really just humans communicating. The fundamental, basic need to communicate is there. And so the true story of evolution can be thought of, one, thermodynamically, as the push towards decreasing entropy locally, which means increasing order and structure, within a universe that tends towards increased entropy. And two, more importantly perhaps, as just the increase of information content.

It wants more information, it wants to use that information to make more things, and it doesn't matter if it's human beings or machines. OK. Now, that's gonna kind of scare some people. People are gonna find that very off-putting, because it makes it sound almost like transhumanism, or something where, you know, are we meshing technology and humanity together, or is humanity just going to disappear? You know, again, we can't help but make that distinction.

We're talking about humans and technology, and I'm trying to say no, it's not an "and," it's an "is," right? Humanity is technology; technology is humanity. The question: are humans merely informational intermediaries in a longer process of evolution towards increased information content? Not if humanity has always been technology. Not if technology has always been humanity. It doesn't make sense to call us intermediaries. Right.

Again, it's like a distinction without a difference. In fact, technology is becoming more natural, not more robotic. And that's what I want to get into now. Since the industrial revolution, I would say, humanity has kind of thought of technology as a very distinct thing, because if you look at machines, you see cogs, pistons, steam, grease, right? Railways. It doesn't look anything like humanity.

It's metal, it's mechanical, it's noisy, it's not organic, it's not beautiful. Well, some people think technology is beautiful, right? But it's very different than the biological, messy, organic, beautiful aspect of life. But technology is becoming more natural, not more robotic.

So when we think about what the end game is, I want to get into this idea of technology as beauty. And Kevin Kelly in What Technology Wants actually says that we're drawn to technology because it's beautiful. And I would say it's becoming beautiful. I don't think things in the industrial revolution are particularly beautiful. I mean, there are some interesting symmetries there that maybe we're drawn to. You could call a rocket beautiful.

You might think train tracks are beautiful, but generally the industrial revolution is kind of dirty, right? That's what we think about it. It's very mechanical. But as we pass the complexity threshold, we end up being drawn to technology by more than just its utility. Technology is becoming more organic, natural and beautiful.

OK. Think about the first computer. Well, it kind of filled an entire room. Or even, like, a really old Macintosh or something, or a Commodore 64, if you know what those are. You know, computers that are clunky, they're not small, I mean relative to today's. They look like technology. Think about the technology in the movies from the eighties. It looks like technology, right? It's got lots of knobs and dials and wires.

It looks complicated. It's like the rocket: you pull the shielding off of the rocket and look at the engine, and that looks like technology. And of course we still have rocket engines and train tracks today. But the new technology in the information economy looks very different. Think about an iPad. An iPad has almost no buttons on it. It doesn't even look like technology. It looks like a slab of glass with a metal backing on it.

And it's not until you interact with it that you realize, oh, there's this visual representation of what used to be buttons, what used to be cogs and pistons, right? It's all visually presented, and it's light taps that allow you to navigate. A lot of these things don't even come with instructions anymore, because it's not a machine as has been classically defined; it's much more fluid. And so technology is becoming more organic, fluid, natural, beautiful.

If you think about a lot of the simulations, like, go on YouTube and look up computer simulations or CGI, and you look at the ability to recreate, you know, flowing water or blooming flowers. It's very in line, not quite there, but it's very close to actual real fluid and real water and real blooming flowers.

And that's because so much of the messy stuff has been figured out in technology, and it's getting subsumed under the covers, and the level of abstraction is raised so high that now, when we interact with technology, it's not like the industrial revolution. It's fundamentally different, and it's more organic, it's more fluid. Think about the people using software: you don't need to be a computer nerd to use software.

Everybody uses software. Artists use software, creators use software, right? They operate these quote unquote machines at the highest levels of abstraction, and they do very fluid, human things with them. And so there's this idea of essentially coming full circle. OK, things becoming more beautiful. So to explain why I say that, I have to go back to what I was talking about before. You know, what is the end game?

You know, are we mere intermediaries that will eventually get removed? Basically the story behind the singularity, right? Where, let's say, you take a look at AI technology, and maybe it gets to the point where its intelligence surpasses that of humanity, and then all of a sudden technology decides it doesn't need us anymore. So imagine AI making the decision that, well, humans aren't even needed.

They're just getting in our way, or maybe humans are a threat to our survival; we have the power to get rid of humans. And so, you know, it's that kind of idea, that humans are an intermediary in the evolutionary line of increased information content.

But if you think about the fact that technology is getting more beautiful, more organic, more real, I think it's actually coming full circle. And you see this in the software industry. You know, way back in the day, before software, there was the slide rule, right? And then the digital calculator, and then the computer revolution came, and now we're in the information economy.

And when people first started programming computers, it was kind of, you know, the quote unquote quintessential nerd with the glasses who thinks logically, which makes sense, because they're doing this low-level instruction to the machine; they're instructing the machine on what to do, and it's programmatic. But if you look at the software industry today, it's really not like that. You've still got your nerds, right?

In some sense, we're all still nerds, but we're not that kind of slide-rule Poindexter, super logical, we-need-to-instruct-machines type. There are still people that do that. But what we really need today in the software industry, and what is taking off, is people that operate at extremely high levels of abstraction. You need to use software libraries that have very few inputs, that aren't really programmed; they're more configured.

In fact, I would argue the majority of software development today is configuration. It's not programming; it's not, you know, writing for loops and while statements and explicit logic. That's still there, it's definitely still there, but we operate at very high levels of abstraction. It's more like sculpting, almost.
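The "programming vs. configuring" contrast can be sketched with a small invented example: counting words. The first version spells out every loop and branch itself; the second composes a high-level, pre-built piece and mostly just wires up the data flow.

```python
from collections import Counter

# "Old school": explicit loops, branches, and bookkeeping.
def count_words_explicit(lines):
    counts = {}
    for line in lines:
        for word in line.split():
            if word in counts:
                counts[word] += 1
            else:
                counts[word] = 1
    return counts

# "Configuration": compose a high-level component; almost no logic of our own.
def count_words_configured(lines):
    return Counter(word for line in lines for word in line.split())

lines = ["the technium grows", "the technium wants information"]
assert count_words_explicit(lines) == count_words_configured(lines)
```

Both do the same work, but in the second the "machine instructions" have been subsumed under the covers of the library, which is the raised level of abstraction being described.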

And I think we're kind of at this transition stage in the software industry, actually, where you've got this old-school thinking that it's supposed to be very programmatic, and this is technology and it's all logical.

And then you've got this other side, more on the designer or creator side, which says, well, no, let's just build: use high levels of abstraction, mesh things together, use iteration, and the best thing will pop out.

That's really where technology should go, I think. And if you look at software today, people will talk about programming as if everyone is writing a bunch of code. Nobody's writing a lot of code. I'm sorry, that's just BS, not like they were back in the day. If you're writing a lot of code, you are doing some old-school stuff and you should probably just retire, because that's not the software industry today.

There's still coding, there's still programming, but it's more configuration if anything. And that's how it should be, because that's where technology is headed: higher levels of abstraction, more fluidity, more organic. OK?

So that's so when you talk about the end game where humanity is gonna get removed, it doesn't make sense because we're coming full circle with technology, technology is becoming more lifelike, more complex, more emergent, more causal opacity and all those interesting complex properties. It's becoming more lifelike. So it's coming more full circle.

OK, so think about the assembly-line job. People talk about automation and technology taking over jobs. Yeah, maybe it should though. Maybe it really should. And I know that's a bit of a controversial opinion. But do you really think anybody should be on an assembly line? Now, I'm not saying, OK, maybe someone doesn't have another opportunity and they have that job. I'm not talking about that.

Maybe they need to take those jobs. But I mean humanity in general, 100 years from now: should any human be on the assembly line? Of course not. Of course not. We shouldn't be lowering ourselves to cog-and-piston-style technology, right? Technology should be meeting humans where they are, not the other way around. And if technology is going to meet humanity where we are, it's going to come full circle and be fluid and organic.

It's not gonna be cogs and pistons, it's not going to be complicated, it's going to be complex, it's not going to be logical, right? It's going to be emergent, it's going to be soft, it's going to be fluid and that is where technology is headed. So humans are not some intermediary in some push towards emotionless logic, emotionless cogs and piston machines.

That's an old-school definition of machines that, quite frankly, is not very relevant. That's Industrial Revolution; that's not the information economy. OK, so we're coming full circle. And the reason I'm saying that is I think humans are, and always will be, an integral part of this evolution of technology, not some intermediary that gets wiped out. I think the singularity idea is flawed because it thinks of technology in the incorrect way.

It thinks of technology as though it were made up of cogs and pistons, as though it were the old-school programming where everything is logical and deterministic. That's just not what it is anymore, and that's not where it's headed. So, ultimately, where does this all lead? What is the ultimate question here? I think you have to say it: is technology God, quote unquote, right?

Whether you believe in a God or in some ultimate entity, the reason I say God is: what is the endgame? What does the universe ultimately become, right? It sounds off-putting to say something like technology as God, or whatever you might believe in, but only if we're thinking of technology as a very inhuman thing, in an inhuman way. Technology is no longer lifeless.

It's not this robotic version of what humans do. Humanity itself is organic and beautiful, and technology is becoming organic and beautiful. It's coming full circle; it is humanity. The androids of science fiction that are shown without emotion, they got it wrong. OK, take someone like Data from Star Trek, one of my favorite characters. Love him. Star Trek: The Next Generation, great character.

But the idea that an intelligent being would be emotionless, I guarantee you that's flawed. I guarantee that will never happen, because emotions are the anchor to the heuristics we use to solve complex problems. For a being to truly be intelligent, it would have to have emotions, because emotions are the pieces of the mind that operate in that high-dimensional space that allows complex problems to be solved.

If you're looking at the slide deck right now, I've got this picture on the right. It's actually from a meme, a GIF, I had put together and sent out on Twitter showing the difference between simple problem solving and complex problem solving. On the left is simple problem solving: you follow a kind of deterministic path, right?

It's very emotionless, very logical: you just walk downhill and find the lowest point on the curve. On the right is how complexity is actually solved. That's not an opinion. Every single computational approach to solving complex problems operates in this fashion. Every single one. OK? This is not one version among many; this is every single one. This is the only way complexity is solved: by bouncing around the possibility space, by sampling, right?

I used examples in previous episodes of ant and termite colonies solving complex problems. We talked about how people find the best path when they're landscaping: instead of trying to design it, they just let people walk, and wherever a path precipitates out, they hard-code it in. That's exactly it. OK? That's how complexity is solved, and the ability to sample the space of possibilities is afforded to us by emotions.
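Here's a minimal sketch of the two panels being described (my own illustration, not from the episode's slide deck; the bumpy function and every parameter are made-up assumptions): a deterministic downhill walk, which gets stuck in the nearest dip, versus a stochastic search in the style of simulated annealing, which bounces around the possibility space by sampling and sometimes accepting worse moves.

```python
import math
import random

def f(x):
    # An illustrative bumpy "possibility space": many local minima,
    # with the global minimum near x = -0.5 (f about -9.7).
    return x * x + 10 * math.sin(3 * x)

def hill_descend(x, step=0.01, iters=5000):
    # Left panel: deterministic, always move downhill.
    # Gets trapped in whatever local minimum is nearest.
    for _ in range(iters):
        for cand in (x - step, x + step):
            if f(cand) < f(x):
                x = cand
    return x

def anneal(x, iters=20000, temp=10.0):
    # Right panel: stochastic sampling. Worse moves are sometimes
    # accepted (more often while "temperature" is high), so the
    # search can bounce out of local minima and explore widely.
    best = x
    for i in range(iters):
        t = temp * (1 - i / iters) + 1e-9
        cand = x + random.gauss(0, 1)
        delta = f(cand) - f(x)
        if delta < 0 or random.random() < math.exp(-delta / t):
            x = cand
        if f(x) < f(best):
            best = x
    return best

random.seed(0)
print(f(hill_descend(5.0)))  # stuck in a poor local minimum (f around +22)
print(f(anneal(5.0)))        # finds a much deeper minimum
```

The point isn't this particular algorithm; it's the shape of the search. The deterministic walker commits to one path and stalls, while the sampler roams the space and lets the good solution precipitate out, which is the pattern the ant colonies and the landscaping paths share.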

That's what emotions are doing. When people say you can't chalk decision making up to gut feelings: no, you really should. In fact, you have to; there's no way around it. A gut feeling is not a wishy-washy thing. That's only if you're using old-school, reductionist, slide-rule Poindexter definitions of intelligence, which is stupid, right? That worked very well in the Industrial Revolution, because we made a lot of stupid machines. I don't mean stupid as in bad.

We needed the machines, but I mean stupid as in dumb: they don't have intelligence to them. True intelligence does not work like that. True intelligence is not logical, right? Logic plays a role, but it's not deterministic and logical. That's not an opinion. We know that intelligence uses emotions as anchors to high-dimensional problem solving. It's how you bounce around, how you search the possibility space and find optimal solutions.

Anybody who studies computational complexity or complexity science, or anything in science or mathematics, who's worth the letters after their name should know this. OK? And it's important that the uninitiated know this too. This is how complexity works. Hard stop. OK, so why am I saying that? Well, again, the androids of science fiction, this idea that the future holds a stripping away of emotion:

100% incorrect, a 100% flawed notion, absolutely wrong. It will never happen. It's a hill I'll die on. It does not make sense. You require emotion to solve truly complex problems, and solving truly complex problems is what intelligence is. So, this idea of the ultimate being, humanity and technology intertwined, being the same thing: what is the endgame?

Could technology, the technium, this blob of interconnected machines that has a want to it, a pull toward increased information content, could that be God, right? Could that be the ultimate, or whatever you want to call that ultimate thing? Well, if you find that off-putting, it's probably because you're thinking of technology wrong, in a kind of outdated way. Nobody would blame you for that. That's how we've been brought up, right?

The Industrial Revolution view of technology is how we think of technology, but it's not where technology is headed. There is something beautiful about it, something all too human about it. So maybe the ultimate is that a view of technology as some ultimate end could be beautiful, very human-like, maybe even Godlike. You can debate that among yourselves. Rather than fighting the inevitable, we have to ask: how can we accept what we are, right?

It goes back to that simultaneous-invention-and-banning conversation. Saying this technology is bad, we've got to ban it, is a moot point, a completely useless conversation, because technology will happen. It's like Jurassic Park, right? Life will find a way. Technology will find a way, because life finds a way, because technology is life and life is technology, human life.

Anyway, there really is no difference. So perhaps tomorrow's technology, as opposed to being some scary future, something that strips away what we consider humanity, the emotions, the beauty, perhaps tomorrow's technology will actually bring us home. Something to think about. All right, thank you, everyone. Hopefully that makes sense. I know there's quite a bit of jargon in this episode, so have another listen. Again, for Patreon subscribers:

head on over to Patreon. If you're willing to pay a little bit of money, you can actually see the slide deck that I use. It's got some visuals and some ideas that help anchor the concepts I'm talking about, if you're a visual person. But again, you get the exact same content from an audio perspective for free on the podcast, so that's totally up to you. I've got some suggested reading at the end here with some links.

Obviously, the book again was What Technology Wants by Kevin Kelly. There's a paper called "Thermodynamic formulation of living systems and their evolution" if you want to get into more of the chemical, technical stuff, and some wiki articles on Gibbs free energy, the principle of minimum energy, and the principle of maximum entropy. You've got a lot of little mathematical equations there if you really want to nerd out.

Although, again, that's kind of an old definition of nerd, as I talked about. And finally, Gregory Chaitin, the complexity theorist, has something from the seventies, out of MIT Press, called "Toward a mathematical definition of life." It touches on some of these ideas of mutual information and so on. Thank you so much. Hopefully that was a good episode.

I'll be back next week, if I can keep the pace, with another book and some more patterns to discuss. Talk to me on Twitter if you have any other ideas or books you'd like me to use as an anchor for some of these patterns. I hope to see you soon. OK, thanks everybody. Till next time.
