Friday, January 30, 2015

Why not use a TV as a monitor?

So, large-screen, LED-backlit LCD televisions are amazing. They look great, and they've become quite affordable over the past few years.

Large computer monitors, on the other hand, are still really expensive... Though the definition of "large" has changed over the past ten years from 17" to 19" to 21" to 24" to 27", and for the last couple of years has topped out in the range of 30"-32". That's about the most people can comfortably fit on their desktop and still be able to see the screen edge to edge without turning their heads.

Most people who want more screen area just get two 24" monitors, because they don't need to look at a single big high resolution thing very much. Those who do bite the bullet, buy the 27" to 32" monitors, and pay the extra money.

So, since 32" TVs are so cheap, and so great, why not just buy one of those, or even a 37" or 40" etc... for a big monitor?

Yeah... why not use a flat screen TV as a monitor?

At first glance, it makes sense, and for some people in some applications, it absolutely does. But, there are a number of factors you need to take into account.

First, the sharpness, contrast ratio, luminance, and response time are generally considerably better on decent computer monitors than on most televisions.

Those qualities are expensive, and get more so as screen size goes up. They're expensive on TVs as well, but to show a good HDTV image, you don't need as good a screen (technically, a display panel).

Also, the image processing, color gamut, panel design and the like, are generally different between monitors and televisions.

Computer monitors are designed to display text and graphics very precisely, but not necessarily naturally. PC monitor panels and image processing are designed to make text look very clear, readable, and high contrast without being oversharp, especially black text on white backgrounds. Monitors (at least higher quality ones) are also generally designed to display a much broader range of colors more accurately, meaning with less bias or distortion, at consistent brightness and contrast, across the whole panel. Better quality monitors can also be color, contrast, and brightness corrected and calibrated, and can adjust the display in ways that correct common computer display issues. (Of course, how well they actually manage to do all that is another question.)

Televisions generally have panels and image processing designed to make live video and film images, especially fast moving images, more natural looking. They are also designed to make bright objects against a dark background look better, without popping or ghosting, and with a "natural" appearing color, softness, and motion blur. They also have different screen adjustments to address issues typical to HD video signals rather than computer display signals (overscan, position correction, aspect ratio etc...).

This by the way is one reason why professional video production monitors are much more expensive than normal televisions. They are designed with all of the capabilities of the best quality PC monitors, AND the best televisions, plus additional inputs, image controls, and correction and calibration capabilities (as well as some other things like refresh and color/refresh/channel sync lock).

Isn't 1080p good enough?

The issue isn't resolution... 1080p is good enough... at the right screen size and viewing distance.

The issue is our vision, contrast, pixel size, and viewing distance.

PC monitors are designed to be looked at from close up. TVs are designed to be watched from farther away, taking in the whole screen at once.

A 1920x1080 screen at 50" and a 1920x1080 screen at 24" have VASTLY different PPI (pixels per inch). The big screen works out to about 44ppi, and the smaller screen to about 92ppi.

My phone has a 5.1" 1920x1080 screen at 432ppi.

That's a 0.6mm, a 0.28mm, and a 0.06mm pixel size respectively.

Whether a particular pixel count is acceptable or not, depends on how far away your eyes are from the screen.

The average human can see an individual 0.6mm pixel in a contrasting field (one white pixel on black) from approximately 78", a 0.28mm pixel from 37", and a 0.06mm pixel from about 8".
You can get these numbers for yourself for any screen size here: https://www.sven.de/dpi/ and here: http://isthisretina.com/
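If you'd rather compute these numbers yourself, here's a quick back-of-the-envelope sketch in Python. It assumes the common rule of thumb that average human visual acuity is about 1 arcminute, so the "visibility distance" is the distance at which one pixel subtends 1 arcminute; the function and variable names are my own, not from either calculator linked above.

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch, from resolution and diagonal screen size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

def pixel_pitch_mm(ppi_value):
    """Physical size of one pixel in millimeters (25.4 mm per inch)."""
    return 25.4 / ppi_value

def visibility_distance_in(pitch_mm, acuity_arcmin=1.0):
    """Rough distance (inches) at which one pixel subtends the given
    visual angle -- 1 arcminute is a common figure for average acuity."""
    angle_rad = math.radians(acuity_arcmin / 60.0)
    return (pitch_mm / math.tan(angle_rad)) / 25.4

for name, diag in [('50" TV', 50), ('24" monitor', 24), ('5.1" phone', 5.1)]:
    p = ppi(1920, 1080, diag)
    pitch = pixel_pitch_mm(p)
    d = visibility_distance_in(pitch)
    print(f'{name}: {p:.0f} ppi, {pitch:.2f} mm pixel, visible to ~{d:.0f}"')
```

Run it and you get roughly 44ppi / 0.58mm / 78" for the 50" TV, 92ppi / 0.28mm / 37" for the 24" monitor, and 432ppi / 0.06mm / 8" for the phone, matching the figures above.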

So what's the difference and what's the damage?


In the real world, you don't need super high quality, ultra contrast, ultra high pixel density etc... for general computing. A 24" 1080p screen looks just fine for most things, including video, most gaming, and small text.

Cheaper 1080p screens in the 32" size range are now available for $250 or so. They look fine for displaying HDTV. Unfortunately, they look like crap when used as computer monitors from typical chair-to-screen distances. The text can be difficult to read, window edges look weird, games look weird, colors don't look right, and black and white look like grey and brighter grey (usually with some blue, green, or yellow mixed in).

That said, they're not entirely unusable; for video, large text, status displays etc... they're fine. They're also fine at 6 feet away. A slightly more expensive TV in the same size range may also have more controls and options, so you can calibrate and compensate to make it look better.

30" to 32" computer monitors start at twice that price (for 1080p screens), and go up into the $5,000 range (for 4k, 4096x2160). They also look great at typical chair-to-screen distances for anything but small text (at 1080p).

For small text, even 27" is iffy at 1080p. For a 27" to 32" screen you really want something like 2048x1152, 2048x1536, or 2560x1440 (QHD), or go all the way to 4k.

Professional video production monitors in the 30-32" range, on the other hand, START at around $5,000 for 1080p, and go into the $30,000 range for 4k. They look amazing at any distance (once you get used to a professional video screen, other screens look like crap in comparison).

If there weren't good reason for it, no-one would spend that extra money.