Shopping for new home theater gear can be quite confusing. Whether you’re in the market for a new TV, sound system, speakers, or something else entirely, there are a lot of terms being thrown around. Maybe you’re not quite sure what “OLED” is, but your friend told you that you just had to have it over an LCD. Or, maybe you want the most immersive audio experience possible but aren’t quite sure how to get it. In this series, we’ll go over everything you need to look for when shopping for new home theater gear, including HDR types, surround sound setups, and the differences in TV panels and resolutions.
What are the different TV panel types?
The main portion of any TV is the panel. It’s what your content is shown on and is the most visible part of any TV…and whatever panel you buy is what you’re stuck with, so choosing the right one is crucial. There are many terms thrown around with TVs, from OLED and LED to QLED and microLED, but which one is right for you? Each has its own strengths and weaknesses. OLED arguably provides the best picture, but also carries the highest cost. Standard LED gives a middling picture, but is the most affordable. So what about QLED and microLED?
OLED, which stands for “organic light-emitting diode,” is the top dog of display technology, and it functions quite differently from a standard LED. These panels are made from organic compounds that only illuminate when electricity is applied to them. That may sound similar to how a standard LED works, but the difference matters: because each OLED pixel emits its own light, displays can be made paper thin, and in some cases a display can even flex or roll without anything breaking.
The big difference between OLED and LED TVs comes from how the individual pixels function. In an OLED TV, when part of the image is black or bright, the individual pixel representing it either shuts off completely or brightens up, giving your panel much greater contrast. This is called an “emissive display,” meaning each individual pixel produces its own light. Because of this, in a dark scene you can have a pure black background and a subject in the middle that’s still visible, which is very difficult to achieve on an LED panel.
OLED is generally going to be the most expensive TV type on the market today, due to its high production cost and high quality image. It’s one of the newer technologies on the market when it comes to TVs, first being released in 2013.
For the best home theater experience when it comes to movies and TV shows, OLED is the obvious choice. With the way OLED is designed, though, it’s not always the best for gaming, should you want high frame rate capability. That said, with newer technologies like NVIDIA’s BFGD, high frame rate OLED is making an entrance into the marketplace.
LED/LCD is the most typical technology that you’ll find in a TV. LED stands for “light-emitting diode” and LCD stands for “liquid crystal display”. As one of the most tried and true technologies in modern televisions, this is what most panels consist of.
While LCD TVs used to use cold cathode fluorescent lamps to illuminate the display, modern technology won out here and manufacturers moved over to LEDs for backlighting. This is not only more energy efficient, but also much smaller, allowing the displays to get thinner and thinner.
The big downside to most LED/LCD panels is that, in order to see the picture on the LCD, an entire area has to be illuminated by the LED backlight. This makes it hard to achieve deep blacks in one area but not another, as the backlight has to light up a whole region just to show one small portion of the LCD. More on this down below.
QLED is a fairly new technology, standing for quantum light-emitting diode. While QLED is very similar to LED/LCD, it uses “quantum dots” in its LCD panel. These are tiny nanoparticles that deliver vast improvements in color and brightness compared to normal LED panels.
While not quite as good as OLED, QLED is a fantastic technology that provides a much better viewing experience in many cases over normal LED/LCD TVs. If you want an upgraded viewing experience but can’t drop the cash on something like OLED, QLED is a great option.
Edge-lit vs full-array
Generally, if a TV box doesn’t say “full array dimming” somewhere on it, you’re getting an edge-lit panel. What does this mean? Well, edge-lit is much lower cost compared to full-array for manufacturers, which is why there’s a vast price difference.
Edge-lit TVs have a row of LEDs around the outside bezel that illuminates the display. This is how most TVs are produced, but it doesn’t give you the best quality image. The reason is that in order for the TV to show an image at its center, the outside has to be illuminated as well. This causes areas near the border of the TV to look brighter than the middle, which can break the viewing experience for the end user since it isn’t quite as immersive.
Full-array dimming changes things up. This technology places a grid of smaller LED zones behind the LCD panel, allowing for much finer brightness control. If you only need the lower left of the screen lit to see something, the rest of the TV’s backlighting will shut down to make the image look more realistic. If you’re looking at the night sky with a spaceship in the center, full-array dimming will dim the zones around the sky while keeping the spaceship at full brightness, which is closer to how our eyes would process the scene.
This is designed to provide a similar experience to OLED without the high cost. However, the latest technology, microLED, takes this a step further.
microLED is a fairly new technology. What microLED offers is an emissive display without the high price of organic light-emitting diodes. What does this mean? Well, in essence, it means that microLEDs can produce blacks just as deep and colors just as bright as OLED, but without the high cost of organic materials. Plus, microLED panels should avoid burn-in, and they even have the potential to be brighter than OLED. While there aren’t many microLED TVs being released right now, the technology is there, and it’ll absolutely be a benefit to consumers once it hits the market.
Once it arrives, this technology should revolutionize the industry as a self-emissive alternative that provides near-perfect blacks, wide viewing angles, and vibrant colors.
Resolution plays a big part in your home theater experience. While you might see the individual pixels on a larger 720p display, an 8K screen will look as if you were peering out of a window.
P vs I
While you won’t find many 720i/1080i displays these days, it’s still something you might run across. While P and I include the same number of pixels in a TV, the real difference is how your display processes the signal from a device. I, or interlaced, means that each frame is sent in alternating fields. In a 1080i TV, fields consist of 540 rows of pixels that run from top to bottom. The odd fields are displayed first and the even fields are displayed second. Together, both fields create a full frame.
P, or progressive, means each frame is sent progressively. Both odd and even fields here are sequentially displayed, making a much smoother looking image than interlaced technology offers. For the sake of this article, we’ll only reference technology in progressive, as it’s the standard these days.
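To make the field idea concrete, here’s a toy Python sketch of the simplest way a TV can “weave” two interlaced fields back into one progressive frame. The rows are just labeled strings for illustration; a real deinterlacer works on pixel data and handles motion between fields.

```python
# Toy illustration only: weave two interlaced fields into one frame.
# In 1080i, the odd field carries rows 1, 3, 5, ... and the even field
# carries rows 2, 4, 6, ...; together they rebuild a 1080-row frame.

def weave_fields(odd_field, even_field):
    """Interleave odd and even field rows into a full progressive frame."""
    frame = []
    for odd_row, even_row in zip(odd_field, even_field):
        frame.append(odd_row)   # row from the odd field (rows 1, 3, 5, ...)
        frame.append(even_row)  # row from the even field (rows 2, 4, 6, ...)
    return frame

# A toy 6-row "frame" split into two 3-row fields:
odd = ["row1", "row3", "row5"]
even = ["row2", "row4", "row6"]
print(weave_fields(odd, even))
# ['row1', 'row2', 'row3', 'row4', 'row5', 'row6']
```

A progressive signal simply sends all six rows of each frame at once, which is why it looks smoother with no extra processing.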
This is the lowest “high definition” resolution you’ll find, as anything below it (480, 360, etc.) is considered “standard definition.” 720p consists of 1280 pixels across and 720 pixels high. This pixel count doesn’t change no matter the size of the screen, so a 32-inch TV and a 65-inch one will have the same number of pixels. Because of this, on a lower-resolution TV you’ll be able to make out individual pixels as the panel gets larger.
Generally speaking, 720p should only be looked at for extremely low budget cases or when the screen is fairly small, like 32-inches or less.
1080p is considered “full HD,” consisting of 1920 pixels wide and 1080 pixels tall. While full HD used to be the industry standard, it’s now considered previous-generation technology. Going this route looks fine for most things, as even most gaming consoles will only play HD content. However, when it comes to bigger screens (as with 720p), the pixels can be seen individually, which can ruin the movie-watching experience.
These panels are great for places like offices or game rooms, where the TV will only see general use (news, weather, or games). However, for home theaters, 1080p just isn’t that great of a resolution these days with more and more devices supporting 4K natively.
4K is becoming the gold standard in display resolution. Offering four times the number of pixels of a 1080p TV, 4K panels pack a resolution of 3840 pixels wide by 2160 pixels tall. This gives you the ability to see four times the content that a 1080p TV would show, though normally that’s not the use case.
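The “four times the pixels” figure is easy to verify yourself, since doubling both the width and the height quadruples the total pixel count:

```python
# Doubling width and height quadruples the pixel count.
full_hd = 1920 * 1080   # 2,073,600 pixels in a 1080p panel
uhd_4k = 3840 * 2160    # 8,294,400 pixels in a 4K panel
print(uhd_4k / full_hd)  # → 4.0
```

The same math is why 8K (7680 × 4320) works out to four times 4K, or sixteen times 1080p.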
This resolution also enables larger displays to be made without you seeing the individual pixels, which is something 1080p suffers from on screens over 42 inches. Movies look sharper, have more detail, and are normally more lifelike on 4K panels than on 1080p. If you’re looking for the best balance between affordability and quality for your home theater, 4K should be a must.
More and more content is becoming available in 4K instead of 1080p, which used to be the standard. YouTube, Netflix, Apple TV+, and even Disney+ all support 4K content natively, making your investment worth all the more now.
8K is the highest resolution currently available on the consumer market. Though these panels can cost several thousand dollars, they pack 16 times the number of pixels that a 1080p TV offers. With a resolution of 7680 pixels wide by 4320 pixels tall, 8K brings an entirely new meaning to “ultra high definition.”
I’ve seen an 8K TV in person, and let me tell you, it’s absolutely stunning. While the difference between 1080p and 4K isn’t that huge on some content, 8K is an entirely different beast. It looks like you’re peering through a window and viewing the outdoors in real life, while it’s actually a video. 8K takes massive amounts of data to display, however, and there is little to no content available at its native resolution. For now, this is a mere bragging right and future proofing solution, as you’ll likely not see native 8K content for several years.
Refresh rate is a term thrown around by many manufacturers as to who can achieve the highest number possible. The refresh rate of your TV is how many times per second the TV’s display refreshes, or displays a new frame. This should coincide with the content that you watch. For instance, most movies are shot in 24 FPS (or 24 frames per second), so your TV should be able to refresh at least that fast. Gaming on console is normally 30 or 60 FPS (though Xbox can support up to 120 FPS natively on the One X now). Gaming on PC can be done at 120, 144, 165, or even 240 frames per second, as long as your hardware can handle it.
The more frames you have in each second of content, generally speaking, the smoother it looks. So, a TV that can natively support 120 frames per second (or 120Hz) will display 120 FPS content much more smoothly than a 30 or 60 FPS viewing experience would. Conversely, if you’re attempting to watch 120 FPS content on a display that doesn’t natively support it, the extra frames are lost and no smoother picture is shown.
TVs these days advertise insane refresh rates that aren’t achievable natively. Some manufacturers will claim a 960Hz refresh rate, which just isn’t possible. What actually happens is the TV uses its processing power to insert extra frames between the real frames being broadcast, which, in turn, creates a pseudo-smoothing effect that makes your content look smoother. However, the actual refresh rate of that TV is likely 120Hz, maybe 240Hz.
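To illustrate what “inserting extra frames” means, here’s a deliberately simplified Python sketch where a synthesized in-between frame is just the average of the two real frames around it. Real TVs use far more sophisticated motion estimation, so treat this purely as a conceptual toy.

```python
# Toy motion-smoothing sketch: synthesize an in-between frame by
# averaging two real frames, pixel by pixel. Real TV processors use
# motion estimation and compensation, not simple blending.

def interpolate(frame_a, frame_b):
    """Return a synthetic frame halfway between two real frames."""
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

# Two real frames, each with three "pixels" of brightness values:
frame_1 = [0, 0, 0]
frame_2 = [10, 20, 30]
print(interpolate(frame_1, frame_2))  # [5.0, 10.0, 15.0]
```

Doubling a 60 FPS signal this way yields 120 displayed frames per second, but only 60 of them are real, which is why the marketing number and the native refresh rate can differ so much.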
CEC, or Consumer Electronics Control, is a feature of HDMI that is baked into many modern devices. For example, my Apple TV 4K and TCL 5-series 4K TV both have CEC. When I hit a button on my Apple TV remote, it turns on both the Apple TV and the TCL. And, likewise, when I turn off the Apple TV, my TCL also shuts down.
HomeKit/AirPlay 2 support
HomeKit and AirPlay 2 are coming to more and more TVs. VIZIO is even bringing HomeKit to its older panels, while some manufacturers are only releasing this capability on new TVs.
This feature enables you to easily add a TV to your smart home and send content from your iPhone, iPad, or Mac to your home theater without having to purchase an Apple TV.
Google Cast support
Google Cast is very similar to AirPlay 2 support, as it allows you to send content from specific supported apps on a smartphone, tablet, or computer to your TV. Unlike AirPlay, this isn’t locked into the Apple ecosystem and works on Android, Amazon, and Windows devices.
This is similar to HomeKit support as it allows you to add a TV to your smart home for voice control, and can make for a very immersive experience when configured.
FTC: We use income earning auto affiliate links. More.