Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you? So I threatened... sorry, I mean I mentioned last week that I planned to do an episode about logic gates, which lie at the heart of computer processing. So today we're going to do a quick overview of logic gates
and what they do. And originally I was going to run reruns this week because I'm actually on vacation as you listen to this, but I had a little extra pizzazz in my step that I'll talk about in a second. But yeah, let's talk about logic gates and what they do. And first up, you should know that logic gates are based on Boolean algebra. This branch of math was invented by the guy that it was named after, George Algebra. Just kidding. His name was George Boole. Boole was born
in England in eighteen fifteen, as so many were. He died forty nine years later in Ireland. And I can understand why. I guess this is where I should mention I'm writing this episode while under the influence of Advil PM, because I grabbed the wrong tablets while trying to treat a headache. So that's going to be a factor for the rest of this episode, just so you know. Anyway, George Boole helped establish symbolic logic, which, by the way, I loved that subject in college. It was a math
course that I excelled at. In fact, I would only show up to class on Thursdays and Fridays. We met every day of the week, but there was no attendance policy. So Thursday I would show up to find out which chapters the professor had gone over, and then Friday I would show up because there was a quiz for that week's lessons. And symbolic logic made so much sense to me that I could just show up on Thursday to see which chapters I needed to read, show up
Friday and do the work. And I aced that class. Now this is not to say that I'm a genius. I am not. Just for some reason or another, symbolic logic clicked with me in a way that a lot of subjects never did. Anyway, symbolic logic would become a fundamental foundation for digital circuits years later. My guess is he didn't know that was going to happen, because computers weren't a thing yet. So what had happened was Boole
mostly learned mathematics all on his own. He received some tutoring from his father, who was a tradesman, and he did go to a couple of schools, but no, like, formal secondary education. Mainly he was self taught, and he actually began teaching around various schools in his region when he was just sixteen years old, not only because he was brilliant, but also because it was necessary. His father's business had slowed down and his family needed the income.
So the gig economy has been around for a while, I guess is what I'm saying. In the eighteen forties, Boole submitted papers to the Cambridge Mathematical Journal on subjects ranging from differential equations to calculus, you know, light reading. In eighteen forty seven, he published a work titled The Mathematical Analysis of Logic, Being an Essay Towards a Calculus of Deductive Reasoning. This landed him a university teaching gig, even though he had never earned a college degree of
his own. In eighteen fifty four, he published a further treatment of his ideas titled An Investigation of the Laws of Thought, on Which Are Founded the Mathematical Theories of Logic and Probabilities. Real page turner. In eighteen fifty five he married Mary Everest, the daughter of Mount Everest. Okay, wait, no, sorry, no, she was the niece of George Everest. It's just that Mount Everest is named after George Everest. That's actually true.
I got a little confused there. Sorry, I'm blaming the Advil PM. At this point, Boole used mathematical symbols to represent logical arguments and showed that by encoding an argument as a series of equations, one could check to see if the argument was sound or not, whether it was true or whether it was false. So, for example, maybe you were saying something like all cats are mammals, Old Greg is
a cat, therefore Old Greg is a mammal. Well, you could actually represent those statements as equations, and then you could solve to show that the conclusion is contained within the premises. So if the premises contain the conclusion and everything lines up, you would say it's logically sound. It is a true argument. However, if you said all cats are mammals, Old Greg likes to sip Bailey's out of
a shoe, therefore Old Greg is a mammal, well, that wouldn't fly, because you haven't established that Old Greg is a cat or any other kind of mammal for that matter. He's Old Greg. So what does this have to do with computers? Well, Old Greg will have nothing to do with computers, probably, because he lives at the bottom of
a lake and his computer would short circuit immediately. But Boolean logic would underpin the concept of logic gates, which in turn would allow a computer to process information in a meaningful way. And this brings us to the concept of actual logic gates. PCMag dot com defines logic gates as quote a collection of transistors and resistors that implement
Boolean logic operations in a digital circuit. Logic gates have one or two zero or one inputs, but only one zero or one output, as in the following examples, which they then list. To continue the quote, transistors make up gates, gates make up circuits, and circuits make up electronic systems. End quote. So a typical logic gate usually accepts two inputs and produces a single output, and that output is based both upon the nature of the inputs and the nature of
the gate itself. I say typical because of course there are exceptions, but for the purposes of simplicity, we're mainly focusing on the typical example of two inputs enter, one output leaves. So yeah, I am going Thunderdome with these rules. The output that a logic gate produces, like I said, depends both upon the value of the inputs and the
type of logic gate that we're talking about. So you can think of logic gates being kind of like a physical gate that leads into, say, a courtyard, and this particular gate has a bouncer standing outside of it. The bouncer enforces the rules. So you come up to the gate, and if you meet certain criteria, the bouncer lets you through, and if you do not meet the criteria, the bouncer
turns you away. This analogy isn't perfect, because really the logic gates allow a value to pass through no matter what; it's just a question of what that value is going to be. Will it be a zero or will it be a one? So the bouncers in circuits are electronic components, and the rules depend upon the type of gate. And we are talking physical structures here. In a circuit, we're actually talking about transistors and resistors. So the way the gates work is dependent upon voltage. So each input can have one
of two values. Either zero volts is applied to the input, which would then represent an input of zero, or positive five volts are applied to that input, and then that represents a one. The output produces a value of either zero volts, so a zero in logic, or five volts, again meaning a one in logic. Now, if we're just talking two inputs, you can have four possible combinations, right? So each input can have one of two states, either
a zero or a one. And if we start to group these two inputs together, that gives us four potential combos. You could have both input A and input B be zero, so that's one value. Or you could have both of them be one, that's a second value. Or you could have input A be zero and input B be one, that's a third value. Or you could have input A be one and input B be zero, that's your fourth value. So four possible combinations: zero zero, zero one, one zero, or one one. Now, those are the basics. When we come back, we're gonna talk about the different kinds of logic gates, and we'll build from the most basic to the more complicated. But first let's take this quick break to thank our sponsor. We're back. So now we're going to talk about logic gates, and the first one up is the AND logic gate. The AND logic gate will produce a one result as the output only if both inputs are also one. So if input A is one and input B is one, then the output is also one.
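If it helps to see this in code, here's a quick sketch, not from the episode, of those four input combinations and the AND gate's behavior in Python (the function name is my own):

```python
from itertools import product

# The four possible combinations of two one-bit inputs, A and B.
combos = list(product((0, 1), repeat=2))
print(combos)  # [(0, 0), (0, 1), (1, 0), (1, 1)]

# An AND gate: the output is 1 only when both inputs are 1.
def AND(a, b):
    return a & b

for a, b in combos:
    print(f"A={a} B={b} -> {AND(a, b)}")  # only A=1 B=1 produces a 1
```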
Any other combination, whether it's zero zero, zero one, or one zero, will have an output of zero. So an AND gate will produce a one only if both inputs are also one. Next up, we've got the OR gate. This one will produce a zero output only if both inputs are also zero, so any other combination will create a one output. It's kind of the opposite of the AND gate in many ways. Then we have the exclusive or, which has agreed not to date other people. I'm sorry, that's the
nighttime Advil talking. The exclusive or, XOR, produces a one only if one input is a one and the other one is a zero. So if you have one zero or zero one as the inputs, that produces a one output. The zero zero and one one inputs would produce a zero output. I know this starts getting confusing, but this is just for the purposes of explaining the different types of logic gates. Next up, we've got the logical inverter. So a logic inverter is someone who stands
Mister Spock on his head. No, actually, no, it's a NOT gate, and this only has a single input. So it's one of those exceptions I was talking about a few moments ago. So an inverter does exactly what you would expect. It produces an output that's the opposite of the input. So if you have a one coming into a NOT gate, a zero is coming out, or vice versa. Now we're going to get into a few gates that combine some of the simpler ones into something a
little more sophisticated. So first up, we've got the NAND gate, the N-A-N-D gate. This behaves as if it is an AND gate immediately followed by a NOT gate. So the NAND gate will create a zero output only if both inputs are one, so one one creates a zero output. Any other combination would produce a one output. Then we have the NOR gate, which is an OR gate followed by an inverter. It will only produce a one if both inputs are zero. So zero zero creates
a one output. Every other combination creates a zero output. Now let's go crazy Broadway style. We've got the exclusive NOR, or XNOR, gate, which will produce an output of one if both inputs are the same. So a zero zero input will create a one output, but then so will a one one input. Now, a one zero or a zero one input, that would create a zero. So yes, I know that sounded like a lot of zeros and ones, but that's what we've got to work with. And these logic gates are what make up digital circuits.
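To keep all those zeros and ones straight, here's a rough sketch, again my own and not from the episode, of the gates just described as tiny Python functions, with the compound gates built by literally composing the simpler ones, just like the descriptions above say:

```python
# Basic gates on 0/1 values (function names are my own shorthand).
def AND(a, b):  return a & b
def OR(a, b):   return a | b
def XOR(a, b):  return a ^ b
def NOT(a):     return 1 - a           # the inverter: single input

# Compound gates, composed exactly as described above.
def NAND(a, b): return NOT(AND(a, b))  # AND followed by NOT
def NOR(a, b):  return NOT(OR(a, b))   # OR followed by NOT
def XNOR(a, b): return NOT(XOR(a, b))  # 1 when both inputs match

# Print the truth table for the compound gates.
for a in (0, 1):
    for b in (0, 1):
        print(a, b, NAND(a, b), NOR(a, b), XNOR(a, b))
```

Running that loop prints the same truth tables you'd find if you look them up, which is a handy way to check yourself.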
You can combine them in all sorts of different configurations, with the only real limit being that we are talking about physical structures producing these outputs, right, we are talking about actual transistors and resistors. So eventually you do start to run out of physical space. So they do take up space, very little space because these are very very
very tiny components, but they do take up space. You also run into issues like how much heat you're producing when you're providing power to the circuit, so that can also be a limiting factor, but otherwise you can get into lots of complex orientations and configurations. Now, the cool thing about this is you could actually write out the string of logic gates that your circuits follow. Though you would need an awful lot of paper to do anything
like a modern circuit, you could do it. However, the point is that logic gates represent specific rules, and these rules determine the output produced based upon the input received. And that's the very basic foundation of digital computers. Though obviously it gets a lot more complicated from there, but that's something I'll just have to tackle on a day when I'm not on Advil PM, which is not a sponsor, I should add. It's just the reason this episode turned
out the way it did, so my apologies on that one. Anyway, that's the basics of logic gates. You can get a lot more detailed, as I said, and I know it gets really confusing hearing all those ones and zeros. I recommend looking up truth tables for different logic gates so that you can get a better understanding. It helps me, at least when I'm able to see visualizations of this in various charts, and that lets me get a better understanding of what output you're going to get based upon
the input going in. So yeah, this is like the core foundation of processing. So you combine that with the element of binary information, where your values can either be a zero or a one, then you start grouping bits together to make more meaningful representations of info. Couple that with the process of logic gates, and you start to see how computers actually physically handle this information in the
form of voltages. It's really incredible, like when you start to break it down, because we deal with such an abstraction of what computers are doing when we're running any program, right, We're just focusing on whatever the program does. We're not necessarily thinking what is going on at the circuit level to make this happen. Well, logic gates are really the
basis for that. So I hope this was interesting. I'm sure it was at least cringeworthy and maybe entertaining as well, due to my addled state thanks to the nighttime medication I accidentally took. And it's still doing a number on me, y'all. I've had two full cups of coffee to push through this. So blame my foolishness on grabbing the first headache medicine that was within reach for me to deal with my headache. That's the reason. All right, we've got a couple more new
episodes this week before we get into rerun territory. I've got a new episode for tomorrow and for Wednesday, so hope you enjoy those. They will also be on the shorter side, which was necessary for me to be able to get everything done before I headed off on vacation. And I hope you are all well. I hope I'm
well right now. I should be kind of leaning back in a hammock in the Blue Ridge Mountains somewhere, hopefully, at this point, unless the weather's terrible, which it probably will be, in which case I'll just be inside watching the rain. Either way sounds good to me. And I will talk to you again really soon. Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.