Hi, I'm Paul Ford, the co-founder of Aboard. And I'm Rich Ziade, the other co-founder of Aboard and CEO. Very good. Rich, I'm going to tell you what Aboard is really quickly. I know you know, but I think it's good for you to hear it.
Aboard is a data management platform for everyone. If you've ever used a spreadsheet to make a list, it could be a shopping list, but it could also be almost software that you use inside your organization, because when you go into companies, people have spreadsheets everywhere that kind of secretly run the place.
Aboard is a tool for turning that into actual software. Invite people, move cards around, take that data and make it into something that you want to use all day rather than something you have to use all day. You can pick it up right now and save web links for it, or you could be like the organizations we're talking to and turn it into software that runs really important parts of your company. That's what it's for.
Check it out at aboard.com. It's free, free to try. All right, and it's hello@aboard.com if you want to reach us. That's enough marketing for today. Let's actually record the Aboard podcast. This is going to be a Paul Ford special, which is: I have a big idea, and let's see if we can actually get somewhere with it or shift-delete.
Yeah, that does happen too. So look, I picked up and read an article in a publication called Aeon, that's A-E-O-N dot co. The article is called "The Empty Brain": your brain does not process information, retrieve knowledge, or store memories. In short, your brain is not a computer. So look, this is out of the world of cognitive science and neuroscience and so on. This is not a wacky idea. We've known for a while that brains don't really work like computers.
It's not like something you can switch on and off. They're always kind of firing. It's like this sort of wild blobby mess of signals just going every which way and they're super interconnected.
But people really want to believe that computers can become intelligent like human brains. They do. Let me ask you, why do you think that is? Why do you think that our industry, a really significant number of people in our industry, have been convinced for a long time that it's only a matter of time before the computer becomes a person?
I think, and I'm going to get a little lofty for a brief moment, when you look at invention for the last couple hundred years, a lot of it is about sort of extending our own faculties and our own capabilities, right? The invention of the camera is this enhancement of how we take in visual images. The invention of the telephone takes our ability to communicate through language and extends it anywhere instead of just in the room.
So I do feel like when we think about innovation and when we think about invention, we are thinking about extending how humans live and extending our own potential, right? Like flying. It's like, yeah, I'm not elegant like an eagle. I can't just flap my wings and fly.
But gosh darn it, if you buy tickets to Phoenix on a Delta flight, you'll fly. Right. So I think the ultimate end of that is to replicate, as closely as possible, just the magical ability of intelligence and of cognitive thinking, right?
Like that is the final frontier in a lot of ways for a lot of people, because it's still so mysterious. For my own reasons, I've gotten to know various neurologists and people in medicine who focus on the brain, and they love telling you how little they know. It's like, you don't understand, we don't know anything. How do they see the brain? How do you think a neurologist or a neuroscientist looks at the brain?
I think there are some, I mean, you started this podcast by saying that it's not like that, but there are some things that are like that. And that is: data inputs actually exist. Those are scientifically proven. There are optic nerves. There are aural nerves. There are nerves on your skin. There are thousands of taste buds on your tongue. Now, your eyes are not video cameras, right, but there are optic nerves that run like USB cables to this CPU, this command center, right? And once you get into the command center, it looks like pudding, which is where things get kind of tricky. I just went and got glasses, and they did that thing where they take a photo of the inside of your eye. Yeah, yeah. I mean, woof, boy. Yeah. And so I think, and I am no neuroscientist by any means, that about all they know is that it is
a massive web of neurons with electrical signals traveling through it. That's about it. They also know, for no other reason than the fact that they poked at it, that the front of the brain is emotion and the left side is motor skills. And the only reason they know that is because they poked at it with a stick. Yeah, it's literally like soldering. Like, what does that circuit do? Beyond that, it is very mysterious. It's probably one of the most mysterious corners of medicine. Obviously, for all the reasons, right? They can't fix certain things about the brain. They don't know a lot. They're terrified of it.
Neurosurgeons are weirdos because they're the only ones willing to go in. It's a whole world. So this article is written by someone named Robert Epstein, who used to be the editor in chief of Psychology Today. Wanted to mention that. Okay, he has a spot in here, relatively towards the top, where he cites another writer, an artificial intelligence expert, who lists the metaphors that people have used for intelligence over the last 2,000 years. My favorite, I'll just read it to you: the invention of hydraulic engineering in the third century BCE led to the popularity of a hydraulic model of human intelligence. So like dams, right? The idea that the flow of different fluids in the body, the humours, accounted for both our physical and mental functioning.
The hydraulic metaphor persisted for more than 1,600 years, handicapping medical practice all the while. Wow. Not a good theory overall. Well, but it worked for the big engineering challenges of that time, which were getting water into the city, right? And how do you make that leap to, like, intelligence? I think that's what humans do.
Because after that, everybody got into springs and gears, right? The railway shows up. What I'm going for here, and this is really kind of meta, but I think it's relevant, is that technology, when it lands, is physical. A chip is a physical thing: it's silicon in a case, doped with different chemicals, with wires going through it. A hammer is a physical thing.
But humans just project onto physical things. We sort of bring the world in as if it was us. We're like, oh, okay. If you look at our industry, it's so powered by metaphor. And I'll give you an example. Computers were a really big deal, but ultimately a relatively small industry. Compared to other industries, they weren't even that big a deal, nothing compared to oil and gas in the early 80s.
And then Xerox got together and said, let's make the future of computing. And they kind of did, and they came up with, bum bum bum, a metaphor: literally the windowing metaphor, the desktop metaphor. I'm looking at it right now as I'm talking to you. I can take a file that doesn't exist and drag it into a trash can that doesn't exist, or put it in a folder. That's right.
And it turns out you can actually unlock trillions and trillions of dollars of value and change hundreds of millions of lives with those little metaphors. Yes. The more the metaphor makes sense to people, the more people look at it and recognize it, because it looks like and relates to something that is already part of their day-to-day life.
The more likely you are to find success. And I think that's one thing that's kind of true. You get the windowing interface and so on and so forth. And I think we're in this very interesting moment. This is what I'm driving towards, even though it's taking me a minute to get there.
We're in this very interesting moment where, okay, people have been saying that computers are going to think and talk and be human literally since the 1950s, when they were a bunch of switches. They just get mixed up. Everybody gets mixed up. Everybody thinks that you can kind of go from machine to human, that we're just two steps away.
Just the processors have to get a little bit faster, et cetera, et cetera. So that's been going on forever. Okay, maybe we will have artificial general intelligence one day, maybe we won't. Right now, I don't think there is some obvious next step that's going to get us there, but other people do. Okay, so just put that aside for a sec. We're in this really interesting moment where everybody's focused on that kind of metaphor.
They're focused on the human as a computer, right? And this piece sort of breaks down the reasons why the human, the way our brains, you know, our amygdala, does not work like a large language model, and all that stuff. But the metaphors that we're picking right now, using all these new technologies, feel very, very unfounded. And let me just throw this out:
we've chosen chat because it is the interface that makes the most sense for interacting with one of these weird databases. But where do we go from there? We have a strategy at Aboard where we're going to use AI to create data, right? But what are the other things that we're going to do as a society?
How would you unlock this? Because we're hurting for metaphor. People want AI because it's the new exciting thing, but then it turns out they don't actually have that problem; they have other problems. Let's talk about metaphor for a split second here, because I think its utility, its ability to catch on, is so powerful. Imagine Xerox PARC had said, you know what?
Let's not replicate a digital desktop. Yeah. With folders and files. Let's do something else. We have graphics, so they create a solar system on your screen, and they call files data units, and they just don't bother with the desktop at all, right? And then what you end up with is this much steeper hill to climb for most people. Most will bail, right? And the truth is, you can see where metaphors have limited utility, meaning people just don't bother with them.
The command line, the terminal console, the scripts that you write to do certain things: those are highly specialized, and no one bothers. I mean, they call them packages, they do call them containers sometimes, containerization, they still use words. But those are not for the masses. What happened with chat is that the whole world is already talking to the rest of the world on their phones, and then they added a machine that seems to be just as chatty as Aunt Louise.
And it's like, whoa, Aunt Louise seems to know everything. And if she doesn't know everything, she acts like she does. I've said this many times on this podcast and on other podcasts: the progress of technology is the elimination of steps. If you can eliminate steps, then you've advanced. And AI was a huge, it was like a long-jump leap in terms of steps, because I didn't install anything.
I didn't have to learn some new thing to get an answer. I was using something that we've not only been conditioned to use, which is chat, for the last 30 years, but I'm using it in a place like WhatsApp. Facebook added, like, Jake Gyllenhaal's face and named him Sid in WhatsApp. I don't know if you know this. They took celebrity faces and they named them something else.
And next thing you know, I'm asking them questions. What Facebook understands, probably as well as anyone, is: how do I get even a person with a fifth-grade education, or a kid, to understand how to use this thing? And so the power of that is immeasurable, really, in terms of its ability to take hold.
You want to know another thing that's fascinating in the industry? I'm going to go in a slightly different direction, which is, we've been talking about, okay, yeah, I think this is right: the windowing interface, the desktop metaphor, the chat metaphor, they're familiar. We talk about metaphors as if they have a magic quality. Paul, just to punctuate this, do you know what word nobody ever uses in common language?
Portal. You know how many years they tried to use the word portal? Because all the nerds kept reading sci-fi books, and they're like, well, I'm going to give you a portal. And it was like, okay, thank you, portal man. And no one knew what it was, and it died. They did build one, by the way. There was that video portal between Dublin and New York City, and people kept exposing themselves, and the Dublin people kept holding up pictures of 9/11. I mean, end the podcast right there. That sums it up.
No, no, well, where you went is exactly where I was going, which is that there's a whole bunch of technologies, when you are running a technology organization, that only engineers care about. And I almost want to name them as anti-metaphorical, right? I'll give you a couple: true hypertext, deep functional programming. You know, people like Ted Nelson, who was one of the earliest developers of hypertext, spent 30, 40 years telling everyone that they were doing it wrong.
That they weren't doing it in the true way of real hypertext, right? When I was in my 20s, I had a book by him, and I was very enthralled. I just thought that was very exciting, that exacting vision of the future.
We just need to organize the world along library-science lines and use these data structures, and then educate everyone up to a master's-degree-level understanding of technology, and then we will have the utopia that digital life should bring.
And, I mean, Alan Kay, who was one of the creators of the desktop metaphor, was always frustrated, because what Apple took away was the metaphor, and they didn't bring object-oriented programming, which he thought would revolutionize the world. And I have to say, reading this article, I'm watching this cycle play out over and over again, and it doesn't have a name.
Right, you could call it a hype cycle or whatever, but it doesn't really have a name. There's this very big abstract idea that shows up. It'll be, like, large language models. Okay, so suddenly neural networks are growing up and you can do new things with them that you couldn't before, because the computers are faster, and we figured out that 3D cards from Nvidia are really good at making databases of all human knowledge that are kind of lossy.
Whoa, who knew? Right, and then Sam Altman is like, yeah, go ahead, launch it, and suddenly the world blows up. Okay, and meanwhile you've got all these nerds over here making LLM pronouncements, and then they get really obsessed with all these very specific ideas about artificial intelligence coming down the pike, and here we go.
And sort of an orthodoxy emerges around what the technology is for, and then people come along who are so excited by this new metaphor that they come and chat with it, right? So you have the technology purists on one side, you have the interface on the other, and then they're kind of at war for the next 30 years.
I think we're going to be hearing from people about AGI and what true AI is and what you should do with this technology for the next 30 years, and then it'll be chat, and then it'll be something else for the next 30 years, and this will be this big tension. We saw it with the web: there's one true web, and one year it had to be accessible, and the next year it had to be sort of data-driven or semantic or whatever.
And, you know, Google just kept adding more video-playing functionality to the browser. Okay, so this is what I think you're saying. I do appreciate the idealists. They're behind a lot of the invention. We have some of that ourselves, by the way. We're crazy. We have been picking away at the same scab for three years now at Aboard, because we have this perpetual frustration with how the world seems to work.
And we're pretty convinced that a lot needs to change, right? And eventually, I think, the tension is between idealism and, frankly, adoption and commercialism. For whatever reason, Google became a company, and then that company started looking at charts and graphs and kept saying, wow, people really like moving pictures.
Let's keep going with that, right? It starts as an idealistic sort of view of things. I don't know what the Google founders were like when they were in the garage, but very often there are very lofty statements, right? Like, a computer on every desk was Bill Gates's; that's not a business goal.
That is: this is going to be something that is in every house, right? That idealism and what it represents, that's the seed of it, right? And then what happens is commercial motivations. Frankly, Steve Ballmer shows up, and he's like, well, sales.
It's time for sales, sales, sales, right? And then the things you do are less about reaching some utopian state and more about, frankly, conquering the textile industry and then conquering the auto industry with software. It becomes much more pragmatic, much more goal-driven, much more money-driven, you can say it out loud here. But there's always a handful of people, and you see that with some of the
people at OpenAI. Altman is effective because he's got a foot in each: one in commercialism and one in idealism, and he tends to interchange them. But you always see him with his team, and there's always a couple of weirdos who can't believe we're burning the world down, who are just convinced that this is bigger than we understand and that we've got to hang back a bit. That is, to me, a very classic tension: that idealism versus that pragmatism that's frankly
driven by business and money, right? That's what it is. Like, Altman will tell you, this is going to change how people live their lives, we're going to find cures, and then he goes off to raise a trillion dollars so he can open a chip factory. That's eventually what it boils down to: what's real here? I don't know. I snicker at the idealism. Actually, I figured out what I was trying to go for: if you go into a big organization,
a big organization with a lot of impact, you are going to find a group of people who truly are aligned with the mission. They see the technology as its own thing, and they actually see it as truly representing what it means to be human. They think that they are building the future of humanity. They feel that this is a loftier purpose. So many of the best engineers I've worked with really do believe that.
A hundred percent. I've hired a couple. I could not get them to hit a date, and they were brilliant, but they felt that they were on a mission, that there had to be purpose to what they were doing, and they couldn't do anything else. They believe that the computer has a soul. Yes, they really do. So there's always a group of those people, and you actually need them, because they're motivated and they're brilliant, and they've let the metaphor of the technology take over their lives,
even though that's not the way the world, or your brain, actually works. Absolutely, absolutely. It's the mad scientist, right? And in the same organization, and I think this is where I ended up, where you ended up, you look out and you talk to people, to individuals, to smart people at businesses, and you realize that that entire world that everybody lives in, that system of belief about technology and change,
only one tenth of one percent of it can leak out into the general population. That's all they can handle. They can handle chatting with ChatGPT and asking it to write their essays, but when they look inside and they see the religion, they're like, what the hell is that? I just wanted it to do math for me. Yeah. I don't want to brush off the idealists, because I don't know if we'd be here without them. I was one
of those. You're not anymore? I'm not. I'll tell you why. Well, you hang on to some things, Paul. I've heard you say, look, I want this to be a business, we're talking about Aboard, but I do want it to be something that's useful and has utility in positive ways in the world. I've heard you say that. Maybe that's your negotiation, you're negotiating. No, I believe you have to do good work, or otherwise why bother, right? So there's that. But it's not that simple. I used to really believe, you know, that the semantic web will change humanity. What I believe now is a little different, which is that my understanding of humanity was deeply off. Humans can't handle much more than they're handling today, let alone the idea that you're going to transform them with your vision of technology. And I've come
close: I got to tell millions of people about new technology over the course of my career. But people go out and look for tools that they can use. They're actually not looking for a transformative experience.
So the only real option you have, if you're in a business, is this: you have one group of people who might be true believers, and then you have the entire world on the other side. And if you're a leader, you're the filter. You are the individual who is responsible for taking what the true believers do, disappointing them every day of their lives, and trying to empower a bunch of people who really can't handle much more. You can give them a couple of tools, a couple of ideas, and then see what happens next.
I agree, I agree. I mean, this is a classic tension. I've had many engineering leaders just look at me with disappointment, sometimes, oftentimes disappointment, occasional disgust, and not because they thought what I was doing was wrong,
but because they saw its purpose as very limited. You fraudulent sellout. Exactly, exactly. That's life. There it is. Yeah. I think we should keep going with this conversation. A good pivot on it, by the way: I think the one thing we can take away from the idealists is they never want to do harm. They always see it as a very positive force. And great, we love them. And then what happens, and I think this is one of the sad things that's happened with a lot of technology, is that it's used to do harm
because of commercial motivations, and that's just a fact of life in the world, even for the best-intentioned stuff. Yeah, it's almost impossible to win. But you can, you can... again, another conversation for another day. All right, thank you for indulging me when I go meta. I know we're for the enterprise and this is very philosophical, but people in the enterprise like big thoughts too.
Yeah, yeah. We'd love to hear your thoughts on this subject. Give your idealists a hug when you see them. That's right, they need a hug. And hit us up at hello@aboard.com. Thanks for listening. Oh, you know what you do with your idealists? You schedule an hour for them to talk about important new technologies and tools, and you just nod slowly. Don't look at your phone. Don't look at your phone, exactly. All right, we're just listening. hello@aboard.com. Thank you. Have a lovely week. Bye bye.