Technology: Basic TV Tips, Myth Busting, and Better Gaming

Why your TV looks weird, is going to blow up, and sucks for gaming.

Why does my TV make everything look like a soap opera? I just bought a 240hz TV, so why do video games play like crap on it? Why did my $1500 Sharp TV stop working? Why doesn't my $200 Black Friday TV look as good as a Sony? It's a Vizio and 120hz! That's supposed to be the best, right?! The salesmen pretend they know things, but you know more, don't you? You read five things once, a friend told you Vizios are great, and you mulled over your Consumer Reports issue telling you what to buy. You made an informed decision... right?

The retail show floor is the manufacturer's battleground, and it has sadly ruined the expectations of an "informed" purchase. Out of the box, a TV's default settings are typically what you'll see at a store, which is garbage, believe it or not. These settings, usually known as "Vivid" or "Dynamic," are meant for a show floor only and serve to aggressively boost the image so that a particular brand's set pops more than the competitor's. These settings, in the majority of cases, are to blame for the decreased lifespan of modern TVs and the misguided expectations of the consumer. The ultra-bright, absurdly radioactive colors of show floor sets are self-destructive. Sooner or later, these settings have a terrific chance of frying your set, as they literally cause your TV to run far too hot.

Manufacturers know this and they don't care, because you're just going to have to buy another set in three or four years and they get more of your money. It's not dishonest business, but they're certainly fine with the fact that you don't know this. A TV must first be calibrated and broken in before you get the best picture and the longest life from it, but how do you calibrate a TV? Unless you're an ISF-certified technician with a lot of magical science equipment, you can't, but the good news is, you can half-ass it for now. Search online for calibrated settings for your model number. That should give you a great jump start. Keep in mind, however, that any new TV is going to require a 150-200 hour "warm up" before it gives you a more accurate image.

We know that your TV does not perform correctly out of the box. We know that these settings run literally too hot and will decrease the potential lifespan of your set. But these settings are so pretty, though! At some point, consumers started thinking that brighter means better. I think this is a problem both started and exacerbated by the in-store battle manufacturers wage with each other. Brighter just means brighter, not clearer. You don't need a high brightness setting to appreciate the detail in a painting, do you? In fact, like other settings on your TV, making your TV brighter also means squashing contrast, losing fine details, and running your set too hot. And these aren't the only features of modern TVs that crush your picture and siphon the life from your set.

The left image is poorly calibrated. Notice the dark details are lost in the middle greenery, while the clouds have turned blue due to saturation bleed. The image to the right is more natural, warmer, and inviting.

Another huge myth that never seems to go away is that 120hz is better for gaming and 60hz is trash. All TVs these days are 60hz or 120hz. That's it. That 240hz, $3,500 Samsung you bought is actually 120hz (or maybe less), and it's awful for gaming. But faster refresh means faster games, dude! Yes. You're absolutely right... sort of. If we were talking about computer monitors and PC gaming specifically, then yes, you'd be correct. When we're talking about televisions, however, those advertised refresh rates are unreliable. First, I'll need to explain some things, then I'll get into why this is horrible for gaming. From here on out, I will be speaking strictly of televisions.

When you see something like 240hz, ClearAction Rate 720, or Motionflow 480 on LEDs these days, these numbers are highly misleading and are absolutely not a true refresh rate. Are you familiar with the "Soap Opera Effect?" Like you're watching the actors perform in your very own living room? That effect that everyone thinks is inherent to modern TVs and is something you just have to live with? The great news is that you don't have to! When a brand advertises 120hz or 240hz, what they often mean is that their TV has a feature that creates that effect, and it can be turned off! What? No way! Yes, my friends, you no longer have to live like this.

So, how do I turn that off? Simple. Go into your video settings and find an option, usually in Advanced Settings, called TruMotion, AutoMotion Plus, Motionflow, or whatever your particular brand calls it, and turn that God-forsaken ugliness off! If this option is grayed out or can't be selected, go into your default modes and stop using Vivid or Dynamic—for the love of technology!—and switch to User, Custom, or Cinema. One or all of these will unlock your Advanced Settings, and you'll find the option within. Or maybe you're one of those weirdos who thinks it looks good, in which case, you're a weirdo and you can leave it on, weirdo.


Plasmas, on the other hand, advertise a whopping 600hz refresh rate, but this does not work the same way 120hz or 240hz does. It is not a multiple of those numbers, but I'll come back to this later. True, plasma typically has superior motion clarity versus the aforementioned "refresh rates" on LEDs, but what's causing LEDs to look like a soap opera is artificial frame insertion. What in the hell is that? These motion features "predict" the next frame of the motion streaming to your set and insert an artificially produced frame that the TV figures makes the most sense between frame 1 and frame 2. Remember in school when you'd take a stack of Post-It notes and draw a stick figure throwing a baseball? You draw the figure, say, 10 different times, and when you flip through it, you see him throw the baseball. What the motion features on these sets do is insert, between every one of your drawings, another drawing the TV thinks would best fit there to make the motion appear smoother, for a total of 20 individual stick figures. That's why many people say modern TVs look too cartoony, or too real, depending on your perspective, and it's where the dreaded Soap Opera Effect moniker comes from. This is called motion interpolation, or motion smoothing, and it's the devil.
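If you like, here's the flipbook idea as a toy sketch. Real TVs use proprietary motion-estimation engines far fancier than this; blending neighboring frames is just the simplest way to show what "manufacturing a frame that never existed" means:

```python
# Toy illustration of motion interpolation (the "Soap Opera Effect").
# Real motion engines are proprietary and much smarter; a plain blend
# of neighboring frames is enough to show the core idea.

def interpolate(frame_a, frame_b, weight=0.5):
    """Build a synthetic in-between frame as a weighted blend of two real ones."""
    return [a * (1 - weight) + b * weight for a, b in zip(frame_a, frame_b)]

def smooth(frames):
    """Insert a fake frame between every pair of real frames."""
    out = []
    for a, b in zip(frames, frames[1:]):
        out.append(a)
        out.append(interpolate(a, b))  # the frame the TV "predicts"
    out.append(frames[-1])
    return out

# 10 stick-figure "drawings" nearly double: 9 of the frames shown
# are guesses the TV invented, not motion the source ever contained.
flipbook = [[float(i)] for i in range(10)]
print(len(smooth(flipbook)))  # 19
```

The inserted frames are pure guesswork, which is exactly why film and games start looking unnaturally smooth.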

So, why is this bad for gaming, then? Keep in mind, buying a 120hz TV for gaming is not inherently a bad idea. In fact, it could be a terrific one, but you need to make sure you turn the motion features off. It's bad for gaming because it causes what is called input lag. When researching what TV to buy for gaming, you want to look for the best input lag, not the best refresh rate. With these motion settings on, when you press a button on your controller, the TV first must know what happens in the next frame that results from your button press, then it inserts the false frame before showing you, the player, the outcome of that press. Every time you move an analog stick or press a button, the TV waits a fraction of a second to understand what happens because of that action, builds an artificial frame based on that information, and only then presents the result to you. If this is happening multiple times per second during a heated first-person shooter, you can imagine what kind of tax that imparts on your experience. Turn these options off!
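Some back-of-envelope math shows why buffering frames hurts. The numbers below are hypothetical, not measurements of any specific set; the point is that holding even a couple of frames for interpolation adds tens of milliseconds before your button press reaches the screen:

```python
# Rough input-lag arithmetic (hypothetical numbers, not measured values).
# To interpolate, the TV must hold future frames before displaying the
# current one, so every buffered frame adds a full frame-time of delay.

SOURCE_FPS = 60
FRAME_TIME_MS = 1000 / SOURCE_FPS   # ~16.7 ms per source frame

buffered_frames = 2   # assumed frames held by the motion engine
processing_ms = 30    # assumed overhead for building the fake frames

lag_ms = buffered_frames * FRAME_TIME_MS + processing_ms
print(f"~{lag_ms:.0f} ms of added lag")  # ~63 ms under these assumptions
```

In a 60fps shooter, that kind of delay means your shot lands several frames after you pulled the trigger, which is why Game Mode bypasses these features.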
With improper contrast or oversaturated color, the pocket on John Goodman's red shirt will be invisible.

600hz on most plasmas, however, if you can even find a good one these days, does not do any of this. Some still do their own version of motion interpolation, but it's not the core reason plasmas are advertised as 600hz. Plasmas take 10 distinct qualities of a single frame of motion (for the sake of dumbing this down, we'll just say blacks, blues, greens, reds, and bright areas among them) and refresh each of those things 60 times per second. 60 frames per second x 10 picture quality thingies = 600hz. It's not a real refresh rate either, but the result is an exceptionally sharp image no matter how fast something is moving. There is very little motion blur on plasmas because of this, which is why plasmas are highly recommended for gaming, provided you can find one that's decent.
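The marketing math is literally that simple. Spelled out:

```python
# The plasma "600hz" figure is subfield arithmetic, not a refresh rate:
# each real frame is built from ~10 sub-images flashed in sequence.

SOURCE_FPS = 60   # frames of actual motion per second
SUBFIELDS = 10    # sub-images ("picture quality thingies") per frame

advertised_rate = SOURCE_FPS * SUBFIELDS
print(advertised_rate)  # 600 -> the "600hz" on the box
```

No new motion is invented along the way, which is why plasma gets the clarity without the Soap Opera Effect or the interpolation lag.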

The demise of plasma was in large part due to the brightness wars of show floor LEDs, manufacturers needing more money—it is a business, after all—and the uneducated consumer. Plasmas just can't look their best in the super bright setting of a large retail store. Put even the worst LED next to a plasma, and the brightness is what catches the eye, so plasmas were eventually only bought by the niche enthusiast who knew better. That, and the technology had a hard time shaking the persistent stigma of "burn in," which on later sets is rarely an issue.

Alas, the era of plasma is over, for good and bad. I'll proudly stand by plasma, as I own one of the coveted 2013 Panasonics myself, but I'm also happy to see advancements in other technologies strive to surpass it. Now that you can't easily find a great plasma, what do you do? From my pretty darn thorough, hands-on experience with TVs over the last few years, Samsung and Sony are the safest bets these days for most types of viewers, and especially gamers. That's not to say that other top manufacturers such as Sharp and LG aren't as good. In some cases they are just as good or marginally better. Each company excels at something, and you can't really go wrong with any of them, granted you've done your homework.

Samsungs are notorious for their attractive design, fairly intuitive smart features, and super vibrant picture, but this comes at a cost. The image, if not carefully tuned, is radiant but not accurate, and can have abysmal input lag if you are not using Game Mode. Watch out for this when shopping for a gaming TV. LGs are simple to use and have attractive interfaces, natural image quality, great 3D performance, and sometimes good input lag. They do, however, incorporate more IPS panels into their lineup than their competitors and suffer inferior contrast and black levels because of it. Some of the better Sharps boast a pseudo-4K image with a funky resolution that sits between 1080p and 4K. While the image looks great, their interfaces are clunky and their gaming lag is weaker. Sony has made massive strides in their smart feature set, and now both Sony and Samsung are indisputable kings of the market, sharing almost equally stunning picture quality overall (especially in 4K and in 1080p upscaling). Sony obviously should be, seeing as they helped develop the standard for 4K TVs and also own PlayStation. They also sport a generally more accurate, natural image out of the box and have made big improvements in upscaling and motion clarity in hopes of finally dethroning plasma, and they're getting darn close.

All brands, though, regardless of the name, still throw out some bad sets to stay competitive. Be mindful of this. Just because it says Sharp doesn't mean it's automatically great. For example, the Sharp LB261U is a 120hz TV with a very attractive price. You look at the Sony W600B next to it and see that it's only 60hz and $100 more. I know what you're thinking, because everyone else thinks the same thing. It's a Sharp. It's a good brand. It's 120hz and it's a killer deal. How can I go wrong? The only "benefit" of the Sharp, in this case, is that the Soap Opera Effect is available to you. The Sony next to it, however, beats it in virtually every other quality: better color, contrast, motion clarity, response time, and input lag. There is much more to a TV than its resolution and refresh rate. Believe it or not, the $1500 Sony W850B, a remarkable 1080p, 120hz set, has a reason—many reasons, actually—for costing what it does beyond just being a 1080p, 120hz TV, and they're not reasons designed to pad the manufacturer's pockets.

You need to be looking for more than just numbers on a tag. If that were all there was, there would be no reason to pay more for a higher quality set. The reason one TV is more expensive than another is not because it's 3D or has Smart features. This is another common myth: that you are paying more for those features. Contrast ratio, lighting, motion clarity, response time, backlight blinking, color accuracy, input lag, and black levels are just some of the actual things you're paying for in a TV. Because these qualities are difficult to quantify for the average consumer, snazzy Smart features and 3D, among others, are an easy way to illustrate that one set is better than the next. Sometimes this works to the manufacturer's own disservice: consumers will skip those sets because they think such features are all they offer versus another, cheaper brand.

One of the very best websites I've come across is ("Ratings" minus the 'a'). They get down and dirty with their sets, illustrating the performance of a TV with easy-to-digest images and objective science. I highly encourage you to use their website as a tool for all future purchases. Don't just look at the overall scores of the sets, though. Consider the environment the set will be going into, what you will be watching or doing on the set, and look for the qualities that matter most to you. A set that scores an 8.5 might not have the input lag you're looking for, while another scoring an 8 might be significantly better for gaming or motion clarity.

Once you've researched and picked out the set you definitely want, your work's not done yet. We talked about calibration before. If you have the extra cash, hiring a certified technician to do this for you will yield the very best results. However, there are some steps you can take on your own to make your set look and perform a little better. Results and particulars will vary from set to set, so don't take this as a surefire way to make your TV impervious to stray bullets and random TV broadcasting that ranges from horrible to decent.
In Watchmen, paintings, knick-knacks, and other background detail could be missing from your image with overly harsh black levels.

First, like I mentioned before, do not use the Vivid or Dynamic default settings. They are the very worst thing you can do to your TV besides taking it for a swim or repairing the screen with a magnet and a power drill. Switch from those settings to User, Custom, Cinema, or Standard, depending on what your set offers. To circumvent a lot of menu crawling, see if your TV has a Game Mode, as this will give you the best base setting to start with and can automatically deactivate a lot of the redundant or harsh image features modern TVs have.

Sets usually offer a Backlight option. This varies greatly from set to set, but try a middle setting; halfway up, or a setting of 5, is fine.

Brightness should be in the upper 40s or just below halfway.

Contrast is usually fine in the upper 80s to low 90s, or roughly 90% up.

Color is typically solid in the low 50s, or halfway up.

Hue almost never needs to be messed with. Leave it at default or centered.

Color Temperature is usually correct at Warm 1 or Warm 2. While Cool settings are really tempting, ironically they run the TV the hottest and can make everything look fluorescent, unnatural, and, well, cold. Be mindful of skin tones here. They are a surefire way to tell how far off your temperature is.

Before fussing with Sharpness, be sure to turn Overscan off, or find the setting that adjusts image trim. This is not to be confused with Format, and it's usually something you can adjust in 1-digit increments, shaving off or revealing more of the image's edge. Be certain you are seeing the entirety of the image; otherwise, you may be a few pixels zoomed in, magnifying noise, aliasing, or other issues, which will conflict with proper sharpness settings. From there, tweak your Sharpness so that your image looks clean, not crispy. A setting of 0 or 50 is proper.
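If it helps to see all of the above in one place, here's the baseline as a quick checklist. The names and scales are generic stand-ins (your set's menus will differ), and these are rough starting points from the guidelines above, not a real calibration:

```python
# Rough starting-point settings from the guidelines above.
# Names and 0-100 scales are generic assumptions; your set's menus,
# ranges, and labels will differ. A baseline, not a calibration.
baseline = {
    "picture_mode": "User/Custom/Cinema",  # never Vivid or Dynamic
    "backlight": 50,        # roughly halfway
    "brightness": 48,       # upper 40s, just below halfway
    "contrast": 90,         # upper 80s to low 90s
    "color": 52,            # low 50s, about halfway
    "hue": "default",       # leave centered
    "color_temp": "Warm 1", # or Warm 2
    "overscan": "off",      # see the whole image first
    "sharpness": 0,         # 0 (or 50 on a centered scale)
}
for setting, value in baseline.items():
    print(f"{setting}: {value}")
```

Jot your own version down before you start poking around; it makes it much easier to back out of a change that made things worse.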

It is a common misunderstanding that sharpness just makes your picture sharper. To a point, that's correct, but often I see consumers go overboard and crank it until the image looks harsh or crusty on edges. Plus, an overblown Sharpness level will exacerbate bad aliasing (or jaggies) in video games. If your Sharpness is set to look clean, it will keep jaggies to a minimum on its own. Bear in mind, however, that with the current quality of console gaming, you're never going to fully get rid of jaggies, so it's in your interest to manage them as best as you can.

From here, there are a lot of wishy-washy options you can tweak in the main settings of some TVs. These features might be called edge enhancement, resolution, gradation, ambient sensors, adaptive lighting, etc. Turn these all the way down or off. If the previous settings, such as brightness, contrast, and sharpness, are all properly managed, these features aren't necessary; they're just fluff one manufacturer can claim over its competitors. If your sharpness is set correctly, edge enhancement is redundant, and so is the resolution slider some sets offer (unless you're watching 1080p content on a 4K TV, where it can be useful). Ambient sensors and adaptive lighting, on the other hand, will cause the brightness of your set to fluctuate depending on what's on screen. This can be jarring and is entirely useless if your set is properly calibrated.

Then you'll see noise filters, advanced/dynamic contrast, black level extenders/correctors, live color, etc. The names vary from brand to brand, but they should all be turned off, set to normal, or set to their lowest level. Again, if all of your main settings are properly managed, these serve only to further destroy your image. Noise is bad, though, right? Why would I turn that off? If you are playing video games or watching a Blu-ray, there is no reason to apply a noise filter, because the image is as pure as it's going to get in that format. With all those noise filters active, you're destroying fine details in an already gorgeous image. Again, a properly set Sharpness will take care of a lot of "noise" on its own. But, but I watched a movie and it was so grainy! Film grain, or the digital noise inherent in cameras, is OK! It's just fine to see this! Many of these features can and do destroy the film quality of a classic or the digital grit of a modern movie. It's supposed to look like that, and using features that filter "noise" on an image that doesn't need it can cause your movies to look waxy or soft.

But when I turn on the black corrector my black levels look so deep, bro! Yes, they do, don't they? But you're practically erasing shadow detail. You want to see every possible detail you paid for, right? Turn it off. The same goes for dynamic contrast (or whatever it's called on your particular set). While yes, the image smacks you in the face with its shininess, this setting crushes color detail. Instead of a smooth transition from rosy pink cheeks to beautiful porcelain flesh tones on Katy Perry, you will get a sharp drop-off from those rosy cheeks to a piercing white, ghostly pale mannequin woman.

Gamma is another tough call because it's different for every set. Too high a setting and you're back to where you started with harsh brightness. Try it at default first.

Remember, though, if your set has a Game Mode, much of this will be turned off or down for you. Now that you've done all of that, you might find advanced, color-specific settings where you can adjust red, green, and blue color bands or color range options, along with some other stuff you don't feel like getting a headache over. This is where a certified technician should take over for you. There isn't a single TV out there that replicates every color 100% correctly. That's why these color bands exist on TVs: so that a skilled or educated person can tune them properly. It would be in your best interest to just ignore these for now. Although, if you feel like surfing the web for a while and fiddling with your set until the wee hours of the morning, you can find calibrated settings online for your model number. However, those calibrated settings were fine-tuned for the environment the set was in at the time and may or may not look correct on your TV. Try several different calibrations before settling on the one you think looks best.

If you're a gamer, in your console settings you'll most certainly find a color range option. In almost all cases you'll want to leave it on Standard, Limited, or 16-235. You'd think Full Range means better, but most sources, except for PCs, don't support such a color range. But my color and black levels are so rad with it on, dude! Yes, but again, like all of the other options we went over, it comes at a cost by crushing color and shadow detail. Make certain that your TV supports a Full RGB color range. Your TV will not go to a blank screen or give you an error message if it does not support it. Just because you still see an image does not mean your TV supports it! You'll know it's supported if you find standard (16-235) and non-standard (0-255) color switches in your TV's advanced menu. If those are not there, your TV does not support a full RGB range and your gaming will suffer because of it.
In the jaw-dropping Killzone: Shadow Fall, the shadow detail here will be crushed if using Full RGB on a TV that does not support it.
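For the curious, the numbers behind those two switches work like this. The standard conversion maps limited-range video (black at 16, white at 235) onto the full 0-255 scale; when the console and TV disagree about which range is in use, everything below 16 or above 235 simply gets clipped, and that's your crushed shadow detail:

```python
# Standard limited-range (16-235) to full-range (0-255) level expansion.

def limited_to_full(v):
    """Expand a limited-range video level to the full 0-255 scale."""
    return max(0, min(255, round((v - 16) * 255 / 219)))

print(limited_to_full(16))   # 0   -> reference black
print(limited_to_full(235))  # 255 -> reference white
print(limited_to_full(20))   # 5   -> near-black shadow detail survives

# A mismatched pipeline skips this expansion and clips instead, so every
# source level from 0 to 16 collapses into the same solid black.
```

When both ends agree on the range, the picture is identical either way; the "rad blacks" people see from a mismatch are just lost detail.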

If you want the easy version: turn on Game Mode if you have it. If not, set sliders such as color and brightness to about halfway, sharpness to 0 or default, and color temperature to warm, and any and all extra options that can be turned off, turn them the hell off. If you're a gamer, don't switch on Full RGB just because of "purtty colors." Make absolutely certain your TV supports it first.

Remember, any new TV you buy will require that 150-200 hour break-in before it starts looking its best. A TV needs to "iron itself out" to display an image more evenly. So, play as many video games and watch as many movies as possible after turning it on. Use the settings above as a guideline, not a rule book. Get your TV looking halfway decent for now, but until that break-in period has passed, results will vary. The Xbox One, for example, has a decent calibrator built in, but use these guidelines first as a basis, then tune from there. Hopefully, all of this will help squash some misguided expectations and get you the best from your investment.

-J.G. Barnes