Welcome to Tech Stuff, a production from iHeartRadio. Hey there, and welcome to Tech Stuff. I'm your host, Jonathan Strickland. I'm an executive producer with iHeartRadio. And how the tech are you today? I thought we would talk a bit about resolutions. Not like I'm gonna lose weight this year, I promise, but more like, yeah, I got a five K ultra wide display, so you know, I'm pretty cool.
Resolution is one of the factors that determines the quality of image you can have on a display, whether that's a television or a computer display. We're largely going to talk about TVs in this episode. And remember I said one of the factors, because there's actually a ton of different factors that come into play when it's all about your image quality. I think that's important because often, I
guess less often than it used to be. But you know, it used to be a thing where people would use resolution as shorthand for quality, right? Like they would talk about resolution as if that automatically made one display or television better than another. And there's this implication that the larger the number, the better the experience. It is kind of like how Nigel Tufnel of Spinal Tap insisted that his amp was superior to all other amps on the market because his goes to eleven and
eleven is one louder than ten. As it turns out, there are lots of elements that determine whether or not the image you get on screen is fantastic, and we'll talk a bit about that at the end, but for now, let's just focus on resolution, no pun intended. Typically we describe resolution as, think of it this way, the number of pixels that are used to make up an image. So pixels in this case would be individual points of light. This is a lot easier
to talk about with digital displays. When you get to analog it's a very different story. We'll touch on that. So there have been a lot of different display technologies over the years, and they work in different ways, but we frequently will still talk about resolution in terms of pixels. It's just a useful kind of shorthand. Now it's undeniably true that if you only have a few pixels, any image you create is going to be crude and blocky. Really, you'll be able to pick out the shapes of the
individual pixels. So a lot of the discussion we're going to have today is going to be about square pixels. And thus, if you make an image with just a few pixels and they're all square pixels, it's going to look like a bunch of blocks. If you've ever played old home video games, like the stuff you would get on say an Atari twenty six hundred or the old Nintendo Entertainment System, you probably noticed that stuff on screen
typically is made up of squares. Some of the squares could be pretty small, but they're still squares, and they have defined edges. If you get a good enough look, you can see, oh, these are a bunch of blocks. Even rounded characters like the Goombas in the original Super Mario Brothers game on the nes, they're made up of these tiny little blocks, and pixels are essentially the same basic idea, although pixels are really the elements that are
allowing us to see those blocks. So if I gave you a bunch of actual physical wooden blocks, and the blocks were of all different colors, like, you know, you've got a ton of red and lots of different shades of red, and a ton of blue and a ton of green, and all those kinds of things. But each block measured one inch to a side, so these are cubes. And I give you a frame that's, I don't know, like twenty one inches wide
and twelve inches tall. That's roughly a four by three relationship. Not quite, but it's close enough. And I told you to create a picture of a house using these blocks. Like, you have to make a house using these blocks within that frame. So you're doing what is essentially a two dimensional image. Like, yeah, the blocks are cubes, but we're
just gonna be looking at it from the top down. Well, your picture is gonna look pretty primitive, because you're using these one inch by one inch blocks, and you know you're not gonna have any smooth curves or things like that. Like, anything that's at an angle, you're gonna get this blocky texture. But let's say that instead I give you the same size frame, but now I'm giving you blocks that are half an inch to a side, not a full inch. Now you can fit more blocks
inside that same frame. The house you create will look slightly less jagged; it'll look less primitive. Now let's say I keep doing that, and eventually you have blocks that are just one one hundredth of an inch to a side, and you're able to manipulate these somehow. Well, you could create an image that, from at least a certain distance, might not look like it's jagged at all. It might look like it's made up of smooth lines no matter what the angle is, maybe even curved lines. Well, that's
the basic idea behind resolution. It's a combination of pixel density, as in how many pixels are within that frame, and then the display size, how big is the frame. So, going back to our example where we were starting with a frame that's twenty one inches wide and twelve inches tall, let's say we increase the frame size by a factor of ten, so it's ten times larger. But instead of adding more blocks, I also just increase the size of the blocks as well, so the pixels in this case also get to be ten times as long. The resolution would be the same. Even though the image is bigger, the resolution is the same because it's the exact same number of pixels that make up that image. It doesn't matter how big I make that frame. If the pixels expand along with the frame expanding, we get the same blocky image. Now, granted, if you get further and further away from the picture, it starts to look better because we lose the fidelity with our vision. Human vision has limitations.
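That scaling argument is easy to check with a little arithmetic. Here's a minimal sketch in Python, using the episode's hypothetical twenty-one-by-twelve-inch frame and made-up block sizes, not any real display standard:

```python
# The block-and-frame thought experiment: count how many square
# "pixels" of a given size fit inside a fixed frame.

def blocks_in_frame(frame_w_in, frame_h_in, block_size_in):
    """How many square blocks of a given size fit inside the frame."""
    cols = int(frame_w_in // block_size_in)
    rows = int(frame_h_in // block_size_in)
    return cols * rows

frame_w, frame_h = 21, 12  # inches, roughly 4:3

for size in (1.0, 0.5, 0.25):
    print(f"{size}-inch blocks: {blocks_in_frame(frame_w, frame_h, size)}")

# Scaling the frame AND the blocks by 10x leaves the count unchanged:
# the image is bigger, but it is exactly as blocky as before.
assert blocks_in_frame(frame_w * 10, frame_h * 10, 10) == \
       blocks_in_frame(frame_w, frame_h, 1)
```

Halving the block size quadruples the count (252 becomes 1,008), which is the same reason each big jump in display resolution multiplies the total pixel count rather than just adding to it.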
But if I really want to make a higher resolution image, I have to change the size of the pixels. That's what's important, so I can put more pixels within that frame, at least to a certain point. The higher the pixel density, the smoother an image will appear on a display. I say to a certain point because, as I mentioned earlier, human vision has its own limits, and beyond those limits, you are not likely to notice a difference. It's not like there's a hard cutoff for all people.
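To put rough numbers on that limit: a common rule of thumb, not a hard cutoff, is that 20/20 vision resolves somewhere around 60 pixels per degree of visual angle. The sketch below is a back-of-the-envelope estimate under that assumption; the screen size and viewing distance are illustrative examples, not measurements from the episode.

```python
import math

def pixels_per_degree(horizontal_pixels, screen_width_in, distance_in):
    """Angular pixel density at the center of the screen: pixels per
    inch times the width of a one-degree slice at the viewing distance."""
    pixels_per_inch = horizontal_pixels / screen_width_in
    inches_per_degree = 2 * distance_in * math.tan(math.radians(0.5))
    return pixels_per_inch * inches_per_degree

# A 43-inch 16:9 TV is about 37.5 inches wide. From 8 feet (96 inches):
print(round(pixels_per_degree(3840, 37.5, 96)))  # 4K: ~172 px/degree
print(round(pixels_per_degree(1920, 37.5, 96)))  # 1080p: ~86 px/degree
```

Both figures clear the ~60 pixels-per-degree rule of thumb at that distance, which is one way to see why the jump from 1080p to 4K can be hard to spot on a mid-sized TV across the room.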
Obviously everybody's different. But if you were to look at a typical person, you would say that beyond a certain resolution, if you're considering a certain size display and a certain distance from it, you're not really going to notice a difference in increased resolution. There just comes a point where a curved line is going to look perfectly curved. Now, if you were to zoom way way in, you would see that actually that curved line is made up of these little individual points, and if you go in far enough you might see, oh, there are edges to this curve. It's not perfectly curved, but only if you're zoomed way in. For our normal vision, it looks smooth, and that's that. Now, you might technically be able to cram even more pixels into that frame, but it wouldn't
really make a difference in the experience. So this kind of becomes one of those "if a tree falls in the woods and no one is around to hear it, does it make a sound" situations. If a manufacturer boosts resolution in a display but you're not able to
perceive any difference, does it matter? I would argue that from a consumer standpoint it doesn't really matter, but resolution tends to get tied into sales pitches and marketing and PR, and from that side it really does matter, because it's a way for you to set your product apart from your competitors. Because, again, big numbers sound really impressive, and certain types of people just naturally want to think they're buying the best thing that's out there, even if
they can't actually tell the difference between that and, say, earlier or competitor models. I'm reminded of a time when I attended CES. This was years ago, and I went to a certain company's booth. I'm not going to name them, mostly because I can't really remember which booth it was, but it was one of the big ones that does television, so probably something like Philips or Samsung. The company had several televisions and displays, and they had all different ranges of resolution, from some that were still in the high definition range, which we'll talk about a little bit, up to what I think was eight K. I think they had an eight K TV on there, and this
was in the earliest days of eight K technology. It was kind of a prototype device, so this was before you could buy an eight K television as a consumer, and it was really just a demonstration of that technology. And the representatives in the booth had magnifying glasses and they would hand one to you and encourage you to step right on up to that screen and hold up
the magnifying glass and look for pixels. And even when you did that, they were still really really hard to see, and they were saying, see, look how small our pixels are, and therefore the resolution is incredible, And yes, it was impressive, but to my eyes, when I stepped back and I just looked at the display and I compared it to lower resolution displays that were, you know, just a few
feet away, and some of them were virtually indistinguishable. You know, ones that were at a lower resolution didn't look that much different from the eight K display. Yes, if I got right on up to them, if I got to the point where my nose was almost touching the glass of the display, and especially if I used the magnifying glass, yes, I would see pixels more clearly on the lower resolution screens than I did with the eight K. But hey, y'all,
I don't watch TV that way. I'm usually quite a few feet back from the screen, and I rarely ever whip out my magnifying glass while I'm watching, you know,
Schmigadoon! or whatever. That being said, as I mentioned earlier, if you are a manufacturer and you decide, hey, I'm going to create the largest consumer television that's been introduced so far, resolution is going to matter in that case, because the bigger the display you go with, the higher the resolution you're going to need in order to get that smooth experience with your image quality, unless people are just
looking at the display from really far away. Because, you know, if you look at a huge television but you're really far away, it's kind of the same thing as looking at a medium sized television, but you're closer. I know that I'm saying the obvious, but it's important for things like resolution. So if I'm shopping for a new television and I'm looking for something that's in the forty three inch range, right, like, that's the size TV I'm looking for, I probably don't need to go up to an eight
K television. I'm probably not going to be able to tell the difference. But let's say I'm crazy wealthy and I want to get the equivalent of Samsung's The Wall, which I think can go up to like one hundred forty six inches. I mean, it's crazy huge. That's more than one hundred inches larger than a forty three inch TV right on the diagonal, so that's truly huge. Well, if I'm getting something that big, I probably want the best resolution I can get, because it will be noticeable if you're sitting a little close. I mean, you probably don't want to sit too close to something like that because you wouldn't be able to see everything anyway. But you get what I'm saying. So I want a higher resolution display if it's really huge. That way, I'm not distracted by blocky images. By the way, the same thing is true for the displays on our phones.
Resolution does matter on smartphones, but again, once you get to a certain pixel density, the differences become harder to spot, at least with resolution. Smartphones vary in size, but they don't get to gargantuan proportions like some televisions do. And also, the distance from which we view our smartphones tends to fall in a fairly narrow range. So I looked it up.
I found the National Institutes of Health actually has stats on this and says that on average, people hold their phone between thirteen point three centimeters and thirty two point nine centimeters, or about five point two inches to just under thirteen inches away from them, if they are sitting down. If you're lying down, it's different. You get the phone closer to you, probably because you don't want to be holding the phone at arm's length above you in bed or something. So if you're lying down, it tends to be between nine point nine centimeters and twenty one point three centimeters. That's about, well, just under four inches to a little more than eight inches away from your face. So my point is, while companies might boast about increasing image resolution, at a certain point it just doesn't make a noticeable difference in the image quality. All right, we're gonna take a quick break. When we come back, we'll
have more to talk about with resolutions. All right, let's talk about some of the different resolutions that are out there. And also, we should note that there's another factor that's going to come into play in this discussion when we talk about televisions and also computer displays, and that factor is called aspect ratio. This is the relationship between how
tall a display is versus how wide it is. So before the nineteen nineties, with televisions, it was pretty standard for those to come in a four to three aspect ratio, meaning the display was slightly wider than it was tall, but it was, you know, fairly boxy. So it means that you would take, let's say, the width of this television. If you divided the width, however wide the TV was, by four, and whatever that number was, you multiplied it by three, that's how tall the television was. That's the four to three, so a little wider than it is tall. Beginning in the nineteen nineties, we started to see the introduction of what was then called wide screen television. We just call them TVs today because it's the standard format, but they had a different aspect ratio. Their aspect ratio was sixteen to nine. That means there's a bigger difference between height and width, right. The width changed more than the height did.
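That divide-by-four, multiply-by-three arithmetic works for any aspect ratio. Here's a quick sketch; the forty-inch width is just a made-up example:

```python
def height_for_width(width, aspect_w, aspect_h):
    """Height implied by a given width under an aspect ratio."""
    return width * aspect_h / aspect_w

# Old 4:3 set: divide the width by four, multiply by three.
print(height_for_width(40, 4, 3))    # 40 units wide -> 30.0 tall
# Widescreen 16:9: same width, noticeably shorter.
print(height_for_width(40, 16, 9))   # 40 units wide -> 22.5 tall
```

Same width, two different heights: that shrinking height relative to width is exactly what "widescreen" means.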
And you could think about that like this: if you multiplied both sides of four to three by three, it would be twelve to nine. But we're not talking twelve to nine. We're talking sixteen to nine. So the width changed more than the height did. That's when we get the wide screens. That's also going to affect how we talk about resolution. The introduction of computer displays complicated matters considerably, because televisions followed a fairly standardized path. They had to. It was a necessity because they needed to be able to show broadcast television and cable television, and these followed set standards. So the televisions that we bought needed to be able to display those standards and make it a good experience. So, while there was nothing inherently requiring televisions to all follow the exact same, you know, measurements and everything, or aspect ratios, it was important that they did. Otherwise you would end up with, like, weird bars along the sides or top and bottom of images so that they would fit your screen, and people typically don't like those so much. So again, we're going to focus primarily on televisions. But you know, computer displays didn't have those restrictions, so computer displays have come in all sorts of different shapes and aspect ratios and
thus also resolutions. So the reason we're going to focus on televisions is because if we tried to do it with computer displays, we'd be here for weeks, and some of them reference, like, a display format that was used for a couple of years and hasn't been used since, and it would make no sense to really spend time on it. So with televisions, we typically start discussions by talking about standard definition TV. You technically could go back further than that, but there's no real point in
doing it, so don't ask me to do it. Besides, I've already done episodes where I talk about the history of TV and kind of touched on this. Also, I'm out of coffee, so I'm a little cranky today, so you know, you're just getting grouchy Jonathan. I'm sorry. So standard definition really came in two formats, and they followed two different technologies. They were different enough to be incompatible,
and what you got depended upon where you lived. So for folks who lived in places like the United States, we had standard definition TV that was at four eighty I. The I in this case stands for interlaced. That gets into related but arguably tangential territory, so I'm just going to stick with the resolution. But if you were in England, then you had the PAL format, and that had five seventy six I resolution. So four eighty in the US, five seventy six in the UK. Technically PAL crammed more pixels onto the screen. It also had a different frame rate from US NTSC based TVs. These formats are largely obsolete today. Obviously, if you watch older content made for television, you're going to see that reflected in both image quality and aspect ratio. The numbers there indicate the number of lines that made up images on the screen, and we're talking lines from the top of the screen to the bottom or from the bottom
to the top if you prefer. So we're talking about the height of the display. Four eighty means there's four hundred eighty lines. Five seventy six means there's five hundred seventy six lines. And thus the five seventy six format is putting more image onto a display, cramming in more lines of image. So if we were talking about digital, taking a four to three aspect ratio television, an old style TV showing standard definition TV, and thinking of it as digital, we would say it was four eighty by six forty. That's what you get when you take the four eighty height and put it through that four to three ratio. Remember, the three in the four to three is the height, so it's four eighty tall and six forty wide. Five seventy six, if we did the same thing, would be five seventy six tall and seven hundred and sixty eight wide. Again, that's assuming you're using square pixels; non-square pixels would change things. But we're trying to keep this as simple as possible. Now, that's if we were talking digital information. With standard definition, we wouldn't be. Typically we'd be talking about analog television. That's a very different thing. We would still be talking about lines, but we wouldn't
be talking about pixels so much. It's a different way of going about creating an image. So the idea of pixels gets a little muddled in that context, not that it really matters for our discussion, because we're talking about largely obsolete technologies. But when you get up to enhanced definition television, the resolutions actually become four eighty by seven twenty and five seventy six by seven twenty. So you might say, well, why seven twenty? Why is the width standard but
the height different. Well, the answer to that is a little complicated and also largely moot, and so I'm not going to go into it because it doesn't really matter. It's not something that you would commonly encounter today. That again was standard definition that was in the before times, and then high definition would bring new resolutions into the picture, so to speak, and also a change to the sixteen to nine aspect ratio that would have a large impact
as well. So once we got to HDTV, the lowest resolution for HDTV was seven twenty P. The P in this case stands for progressive scan. That's the alternative to interlaced. Right, we talked about four eighty I and five seventy six I; seven twenty P is progressive scan, progressive versus interlaced. We're still talking about the height of the screen when we use that number seven twenty P. So a seven twenty P HDTV has seven hundred and twenty vertical pixels. If we were to measure the width of the screen in pixels, we would count one thousand, two hundred eighty pixels across. You've got to multiply both of those to get the full number of pixels that are on the display. Right, the height times the width, that'll give us nine hundred twenty one thousand, six hundred pixels. That's how many pixels make up a seven
twenty P HDTV image. Now, a step up from seven twenty P was ten eighty P. We also had ten eighty I, which was arguably a step up because it was a higher resolution, and I say arguably because of the differences between interlaced and progressive scan. But again, that gets into a different technology. So a ten eighty screen shows images that are one thousand and eighty pixels tall. So we're still talking about the vertical here, and then for width we're talking one thousand, nine hundred and twenty pixels wide. So you multiply those two together and you get more than two million pixels on display, more than twice as many as a seven twenty P display. So you're really cramming those pixels in there. Of course, we're only just getting started. Also, this is the point where the television industry did something silly.
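You can check those totals, along with the higher resolutions this episode gets to, with a quick multiplication. The 4K and 8K figures below are the standard consumer UHD numbers, included here for comparison:

```python
# Width x height for common consumer resolutions. "4K" here means
# 3840x2160 UHD, not the cinema DCI 4K format.
resolutions = {
    "SD (digital, 4:3)": (640, 480),
    "720p": (1280, 720),
    "1080p": (1920, 1080),
    "4K UHD": (3840, 2160),
    "8K UHD": (7680, 4320),
}

for name, (w, h) in resolutions.items():
    print(f"{name:18s} {w * h:>12,} pixels")

# 1080p has 2.25x the pixels of 720p, and each UHD step after that
# quadruples the total again.
assert 1920 * 1080 == 921600 * 2.25
assert 3840 * 2160 == 4 * 1920 * 1080
```

That quadrupling is why the totals climb so fast: an 8K panel carries over 33 million pixels, sixteen times what a 1080p screen has.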
So again, up to this point, when we're talking numbers, what we're describing is the number of vertical pixels in a column, right, how tall the image is in pixels. That's what we've been talking about with standard definition and high definition. But then we get into the ultra high definition territories and we suddenly stop looking at the vertical side of the equation and we look at the horizontal the width. So ten eighty would mean one thousand and
eighty vertical pixels. But when we hit UHD, we do the flippy floppy, and we talk about the pixels that are across rather than the ones that go up and down. So let's say that you were talking about a two K display. Technically two K really references resolutions that are
somewhere in the neighborhood of two thousand pixels wide. That means you could actually call a ten eighty P television a two K television, because yes, it's one thousand eighty pixels tall, but it is one thousand, nine hundred twenty pixels wide. One thousand, nine hundred and twenty is practically two thousand, so a ten eighty P TV is technically also a two
K TV. This works because there was never a formal definition of two K. There are other technologies that create two K images, like cameras, and they can have different resolutions, right? Like, it could be two thousand, five hundred and sixty pixels wide as opposed to one thousand, nine hundred and twenty. But that's cameras, not displays. Really, we typically skip over two K; we go straight to four K. And a four K UHD screen has three thousand, eight hundred and
forty pixels across. So that's what we're referencing when we say four K: we're actually talking about three thousand, eight hundred and forty, and it's two thousand one hundred sixty pixels vertically. So if we had used the old HDTV classification, instead of saying four K, we would call it twenty one sixty, right, because we went from ten eighty. Now we'd go to twenty one sixty. But for some reason, the industry decided instead to switch this up to width rather than height, so we talk four K. If we go up to eight K, then we're talking about seven thousand, six hundred eighty pixels wide and four thousand, three hundred twenty pixels tall. So again, if we were using the old classification, we'd call it a forty three twenty television. So as we get higher in resolution, we also see that
screens are cramming more and more pixels in there. Like, the number doesn't have to go up a whole lot in either direction for the total number of pixels to go to crazy amounts. I mean, we're talking about the millions and millions of pixels at this point, and like I said, depending upon stuff like the size of the display and how far away you are from it, that
may or may not make a difference. If you've got the cash to plop down for a luxury Samsung display, you know, that one hundred and forty six inch one, which I think will cost you like half a million dollars, or maybe a quarter million, something like that. I mean, it's way more than I could ever pay. Then, you know, you're probably not gonna sit so close that you're fogging up the glass, but you're gonna want
a pretty high resolution for that. But again, if you're looking at something that's in probably the forty to fifty inch range, I could see you going as high as four K, depending on how far away you're sitting. But even that, to me, feels like overkill. But you also have to keep in mind I'm talking about my experience with my vision, which is probably not as good as yours. So it's entirely possible that you could immediately tell the difference between HD and UHD from resolution alone. I'm not here to say that there is no difference. You might be able to spot it immediately, even on smaller screens, but for someone like me, it doesn't make much sense. Okay, we're going to take another break. When we come back, I'm gonna talk a little bit about computer displays, because, as I said, that makes things
oh, way more complicated. So, computers and their displays. The earliest personal computers mostly relied on televisions to serve as monitors, like you would actually hook up a television, one that you're presumably not using for anything else, to your computer. Now, later models would have their own monitors. Often in early computers, the monitor was incorporated into the body of the computer itself, so it wasn't like you would go and buy a display and then a computer and then
pair them together. They would be one and the same. Eventually we would get standalone displays that could be used with different types of computers as long as the connections were standardized between the two. Right, and there are many, many, many graphics formats that affected everything from resolution to aspect ratios, far too many for us to go into, because if we did, honestly, this episode would just become a numbers station.
You know, one of those radio signals where if you pick it up, there's this creepy music playing in the background, and then a voice comes over the air and just starts reading out a string of numbers, and then the music plays again, and then it repeats. Well, I don't have any creepy music to play in the background, so
I can't pull that off. So I'm just going to talk about the most common screen resolutions that we use today. Like, we could be talking about stuff like three twenty by two hundred from way back in the day, but that makes no sense, so I'm gonna use StatCounter. And I'm also talking about displays for desktops. Obviously, if we talk about things like laptop computers or mobile devices, that's another matter entirely and would require a different episode. So
let's say you've got yourself a desktop computer. What are the most common screen resolutions in use currently? Well, according to StatCounter, the most common resolution in use today is nineteen twenty by ten eighty. That's the equivalent of ten eighty HDTV, right? Nineteen twenty by ten eighty pixels. Following that, you've got thirteen sixty six by seven sixty eight. After that is fifteen thirty six by eight sixty four. So that's interesting, right? Like, the most popular is the equivalent of HDTV. The second most popular is, you know, a step down. The third most popular is a slight step in between the two. I thought that was interesting. Number four would be twelve eighty by seven twenty. That's the equivalent of seven twenty P television. But none
of that marks the top end of computer resolutions. Those are just the most commonly used, and this is just based on pulling data from folks going to different websites and stuff, because web pages need to render properly on a screen, and so you get that information by collecting all the data on the back end. It's yet another data point that gets collected about us as we use
our technologies. So if you're going to the slightly higher end of resolutions, ones that are not common but are more frequently used than the top of the line, mostly by people who are in creative industries like video editing or image production, or by the gamer types, then you might get up to around twenty five sixty by fourteen forty. These are also known as fourteen forty P resolution monitors. That's less common. It's higher quality, but it is not even close to the top of the line, because you do have five K and even eight K monitors available on the market. Not only that, but computer displays can actually be other
aspect ratios besides sixteen to nine. You know, that's the standard for televisions, but there's no rule that says computer monitors have to follow that same aspect ratio. So there are ultra wide monitors that have a twenty one by nine aspect ratio. In fact, that's allowing for the possibility of ten K monitors, where you have a display that's something like four thousand, three hundred and twenty pixels tall and then a crazy amount wide, because it's a twenty one to nine aspect ratio. With a monitor that wide, you can actually create the equivalent of a dual monitor setup, but with just one screen. Right, you just designate one part of the screen to serve for something and the rest of the screen to serve for something else, and you don't need two screens, because your monitor is so wide. You can do it in one go, and the resolution is so high that you can do it without the details suffering in the process.
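That split-one-big-screen idea is easy to picture with numbers. The 5120 by 2160 panel below is a hypothetical example of a roughly 21:9 ultrawide, not any specific product:

```python
def split_ultrawide(width, height, panes=2):
    """Divide a screen into equal side-by-side panes."""
    return [(width // panes, height)] * panes

# A hypothetical 5120x2160 ultrawide splits into two 2560x2160 panes,
# each of which carries more pixels than a standalone 2560x1440
# (1440p) monitor would.
print(split_ultrawide(5120, 2160))
```

Each half keeps the full height of the panel, so neither "virtual monitor" gives up any vertical detail.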
You can divide up your screen to serve different purposes. Now, as I said early on, resolution is really just one
part of what makes a good image. You do need higher resolution if you want something to look nice and smooth, particularly if you're using a very large display close to you, like you would if you're, say, video editing or whatever. But there are other elements that are at least as important and sometimes arguably more important than resolution in order to
get a great image on your screen. So stuff like brightness matters. Contrast, which is the difference between the brightest and the darkest colors that your display can show, matters too; the greater degree of difference you can have between the brightest brights and the darkest darks, the better. Color representation is really important too. You want those colors to be accurate.
You want what you see on your screen to accurately reflect whatever the end output is going to be, right? So if your monitor can't do that, and you're working on something that's going to be displayed in some other format, then it may be that what you get out doesn't look like what you put in, because you have this disparity between the two. Refresh rate also matters. The refresh rate is how frequently the screen actually refreshes the image on display. So with video, the higher the refresh rate,
typically the smoother the motion is on screen. That doesn't necessarily mean that higher refresh rates are automatically better. They can be for things like, you know, playing games, where you want to have a really smooth experience, yes. But if you're watching something on screen and it has a super high refresh rate on your media
and your monitor, it can be distracting. It can create that effect people talk about, what they call the soap opera effect. Everything looks a little too smooth, the motions are too smooth, and it doesn't feel right, because of the way that traditional television and film have kind of trained us: we expect moving images to have different qualities than what we get if
we have a really high refresh rate. So I know a lot of people who, when they get a new television that has the option for a high refresh rate, one of the first things they do is turn that down, because otherwise it makes everything look weird, unless you're talking about sports, which it makes look crazy awesome. So, like, for specific applications that high refresh rate is a good thing;
for others, it may be more distracting than good. And there are still other elements like HDR, which is high dynamic range. This kind of gets into luminosity, or brightness, specifically in relation to color representation, and that gets really complicated. It's not helped by the fact that the industry often will push out a technology before that technology is standardized. What that ultimately means is that you can have media that supports one approach to HDR and hardware that supports a different approach to HDR. The two are not necessarily compatible, which means that while technically you have the hardware to display HDR images and the media that carries them, because of that lack of compatibility, you can't actually experience it. That doesn't mean the media won't play; you just won't be able to take advantage of that particular feature, and then you're paying money for something that you can't actually use in that particular instance. We've seen similar things, by the way, with surround sound, where you could have a surround sound system that supports certain protocols but not others, which means you can't be certain, just by putting media into the system, that you're going to be able to enjoy its full capabilities. It has to be compatible. That's the issue with standardization: you want to have standardization so that you don't have to think about
these things. You know, you just go out and get what you want, and you play it on what you want, and you get the best experience that the two can offer you without making any concessions. Unfortunately, that's not necessarily where we are, so we often have to do a little extra research to make sure that we're getting what we want. So that is kind of a roundup on resolutions. It falls very much along the lines of megapixels.
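Since resolution ultimately boils down to a pixel count, here's a minimal Python sketch that tallies the totals for the common display formats mentioned in this episode. The names and dimensions are the standard ones; everything else is just illustrative.

```python
# Total pixel counts for common display resolutions, expressed both as
# raw pixels and as "megapixels" to set up the digital camera comparison.

resolutions = {
    "720p (HD)":       (1280, 720),
    "1080p (Full HD)": (1920, 1080),
    "4K (UHD)":        (3840, 2160),
    "8K (UHD-2)":      (7680, 4320),
}

for name, (width, height) in resolutions.items():
    total = width * height
    print(f"{name:16} {total:>10,} pixels ({total / 1e6:.1f} MP)")
```

Notice that each step on that ladder quadruples the pixel count, doubling both width and height, which is why the jumps sound so dramatic on a spec sheet.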
When digital cameras first became a thing, megapixels were like the one and only way that manufacturers were using to differentiate their product from others. Essentially they were saying more megapixels equals more better, and if a camera had a higher number for megapixels, then it meant better pictures, which is not necessarily the case. It does mean that you have higher resolution images, but at a certain point that stops mattering, unless you need to print the image in some enormous format, in which case you need that high resolution in order to not have a big drop in quality once you scale the image up to whatever size you're actually printing at. But for most people, not a big deal. Same thing, I think, with display resolutions. For most people, I don't
think you need to go overkill on resolution. Also, I didn't really touch on this, but when we're talking about things like eight K televisions, there's a lack of media for eight K, right, there's a lack of actual support for eight K. So you might have an eight K display, but everything you're watching on it is four K or
lower resolution. Maybe the display has the ability to upscale, but that's kind of like an artificial boost of resolution, right? It's a computer interpreting what should be added to the image in order to create a higher resolution. It's not the true capture of whatever that thing was. So again, I think that going into it knowing these things and knowing the limitations is good. Maybe in your case you've got eagle eyes and you're looking for an enormous television,
and eight K is absolutely what you want. You want to future proof it, because you can already tell the difference between four K and everything else, and it stands out to you. Then it matters to you. If you're like me, it probably doesn't. I keep getting tempted to get an ultra high definition television, but, listen, I use a TV until it stops working. So my televisions are still in the HDTV range, but I can't really tell the difference that much. Maybe slightly, but not enough
for it to really matter. The only reason I would go to ultra high definition is if I were purchasing media that was in the ultra high definition format and had features on it that I wanted. So in other words, if I'm going out and buying a UHD copy of a movie, and the reason I'm doing it is because there's behind the scenes commentary and stuff like that that I can't get in other formats, then yes, I would want to have the hardware to be able to play that
particular media. But you know, most people have gotten away from physical media. I've just started to get back into it because I got tired of stuff on streaming disappearing and not being able to watch it anymore. So having a physical copy is helpful in that case, but you have to have, you know, the space and equipment to play it. All right, I'm rambling. That's it for what was going to be a Tech Stuff Tidbits on resolution. I know that I rambled like usual, so
it's not really a Tidbits episode. I hope you are all well, and I'll talk to you again really soon. Tech Stuff is an iHeartRadio production. For more podcasts from iHeartRadio, visit the iHeartRadio app, Apple Podcasts, or wherever you listen to your favorite shows.