Some folks don't really see the point of HD-TV, HD-DVD, Blu-Ray etc... Though that title isn't a direct quote, it's a fair composite of the sentiments of several commenters every time I talk about high-def media.
There seem to be two classes of consumers here: the first looked at HDTVs and weren't much impressed; the second just dismiss the whole thing because "DVD looks great anyway, why bother".
Now, addressing the first consumer, there are typically two issues here:
- They probably bought, or watched, a not very good HDTV
- Said HDTV (even if it was a good one) was probably not hooked to an HD source, or if it was, it was hooked up incorrectly so that HD wasn't being delivered, or there were quality problems.
A lot of folks go into Costco, grab a TV that looks OK on the floor at a price they can deal with, take it home, and just plug it in, in place of their old TV.
For 60 years, when you upgraded TVs that's generally all you needed to do; unfortunately, with new digital and HD technologies, the "system" no longer works that way. In order to get HD, you need to use HD hookups, HD cables, and HD ports, from an HD source.
Worse, when you get a great brand new high resolution HDTV and send a standard def source to it over old cables, frankly it's going to look like crap (more on the "why" behind that later).
If that is your sole experience with HDTV, then yes, you're going to be very unhappy. Your mom and dad probably did that, and that's why...
"Mom and dad" think their new TV is crap...Ok, so let's presume you've got the "mom and dad took it home and just plugged it in" problem, how do you fix this?
Well, first, if you buy a better quality TV it will have video scaler hardware that makes SD look much less like crap; but generally speaking SD sources don’t look as good on an HDTV as they do on an SDTV.
Technically speaking, it's not that the SD sources actually look worse, it's just that the TV looks so much better, that it reveals just how bad the SD signal was in the first place. It's kinda like someone took your beer goggles off.
By 2009 that will be a moot point, since everything will be in HD, or pseudo-HD (not really HD, but upconverted to look better), because of the mandated DTV cutover. For now though, you generally have to specifically order an HD source. Call your cable or satellite company and get them to activate an HD service for you (and maybe send you an HD box).
Of course, even if you have an HD cable box, it's entirely likely that the TV is hooked up in a way that is functional, but not necessarily the best way to watch. For example, in the "mom and dad" situation, they probably just plugged in the coax from their cable box, like they did on their old tube TV.
Remember how bad your VCR looked going over coax?
Untangling the rat's nest...There are quite a few different ways to hook a TV up to a video (and audio) source, and they are kind of complex and confusing. Let's go over them here, in increasing order of quality:
- 75 Ohm RG-59/RG-6 Coax cable: This is the cable that delivers the signal to your cable box, and for the last 25 years it has generally been the way the cable box fed your TV.
Unfortunately, the standard analog signaling over coax is HORRIBLE in quality. It delivers all the video, and all the audio information for the signal over a single conductor and a single ground. It's noisy, and it has SEVERELY limited sound and video bandwidth available to it.
Technically speaking, the newer RG6 standard of coax is capable of supporting VERY high bandwidth with digital encoding and/or analog multiplexing (which is how your cable company transmits 400 channels over the stuff); but conventional analog video interconnection using coax is based on the older low bandwidth standards from the 60's.
- Composite A/V: This is the other familiar cabling standard to most people; and almost all AV equipment supports it.
With composite video, the video signal is transmitted over a single conductor/shield pair of wires (usually color coded yellow) with RCA plugs. Separately, two channel stereo audio is transmitted over standard red and white color coded RCA audio cables.
Video signals by nature contain two separate types of information: color (chrominance), and grayscale light level (luminance).
Technically speaking, there are actually three different chrominance signals, a luminance signal, and two synchronization signals; and this standard is called composite, because it takes all the color, brightness, and sync information, and squeezes them down (compositing) into a single conductor and shield pair.
Quality is generally slightly better than analog coax, because you are not trying to squeeze all the video and audio signals together into a small fraction of the bandwidth of a single wire; and don't need to modulate or multiplex for that.
- S-Video: S-Video is somewhat less common than composite, but has been around for about 20 years; and is also available in most AV equipment. Basically, S-video takes the composite signal, and splits the color and brightness information into a pair of conductors (one live, one ground) each.
Although the source signal for S-video is the exact same as that for composite video, because there are two separate pairs of conductors for the signal, you get more bandwidth, less modulation and filtering required etc... and therefore the quality is quite a bit better.
Oh and again; this standard uses separate audio cabling, so you don't have to worry about a quality compromise there... but of course it means more cables to connect.
- Analog RGB Component: Component video is an interesting beast. Firstly, there are actually three separate common standards for component video; two analog, one digital; and they support different levels of quality. Oh and adding even more complexity, there are several cabling standards for the variants as well.
Analog component video takes the S-video concept of splitting chrominance and luminance one better, and splits chroma into multiple channels.
The RGB variant uses separate Red, Green, and Blue chroma channels. Each chroma channel includes a component of the grayscale image, and all the color information for that color.
RGB component video requires fourth and fifth "sync" channels to carry synchronization information, so that the three colors can be properly combined, and framed etc...
Some cabling standards for RGB mix the sync channels onto the green chroma channel; while others require a separate cable for sync (which mixes both horizontal and vertical sync), or even two discrete cables for H and V sync.
The familiar computer "VGA" cable is an RGB component standard. That of course means that the VGA cable can transport a pretty high quality TV signal.
Technically speaking, analog RGB component can carry resolutions up to 1920x1200 interlaced, which is higher resolution than 1080i; but there is actually less bandwidth available through the analog encoding than digital 1080i signals require.
A lot of HDTVs have VGA ports, and they support signals from computers, and possibly from AV components, but usually only up to 1024x768, or 1280x1024, at 16 bit or 24 bit color; and of course that's analog video.
Some early generation HDTVs, especially those that were advertised as "HD monitors" or "HD ready" have separate BNC RGB connectors for each channel (either three or four, with or without the sync mixed onto the green channel).
- Analog YPbPr Component: The other major analog component signaling standard is YPbPr; which is carried over three pairs of wires, with RCA connectors at each end; and usually color coded in green, blue, and red.
The YPbPr standard carries the luminance and sync information on the Y channel, the data for blue on the Pb channel, and the data for red on the Pr channel; the data for green is derived at the other end by subtracting the (weighted) red and blue data from the luminance data, because once you take red and blue out, green is what's left over (there's a small sketch of that math just after this entry).
Other than the cabling difference, YPbPr component video is identical in capability to RGB component.
Most newer video devices, including A/V receivers, include this standard at a minimum, in preference to RGB, because it is easier to implement and because the digital component standard uses the same cabling.
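To make that "green is derived" bit concrete, here's a minimal sketch of the math. This assumes the standard-definition BT.601 luminance weights for illustration; HD standards use slightly different coefficients, and real hardware also deals with signal ranges, offsets, and clamping that I'm ignoring here.

```python
# Minimal sketch of YPbPr encoding/decoding, assuming BT.601 (SD) weights.
# Real hardware handles ranges, offsets, and clamping that are ignored here.
KR, KG, KB = 0.299, 0.587, 0.114   # luminance weights for R, G, B

def rgb_to_ypbpr(r, g, b):
    y = KR * r + KG * g + KB * b            # luminance (the Y channel)
    pb = 0.5 * (b - y) / (1.0 - KB)         # blue color-difference (Pb)
    pr = 0.5 * (r - y) / (1.0 - KR)         # red color-difference (Pr)
    return y, pb, pr

def ypbpr_to_rgb(y, pb, pr):
    b = y + pb * (1.0 - KB) / 0.5           # recover blue from Y and Pb
    r = y + pr * (1.0 - KR) / 0.5           # recover red from Y and Pr
    g = (y - KR * r - KB * b) / KG          # green is whatever luminance is left
    return r, g, b

# Round-trip a color to show green really is reconstructed, not transmitted:
print(ypbpr_to_rgb(*rgb_to_ypbpr(0.25, 0.60, 0.10)))   # ~ (0.25, 0.60, 0.10)
```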
- Digital YCbCr Component: Digital component video takes the same cabling standard as YPbPr, and adds digital color space and gamma encoding.
Most often, if a device supports both analog and digital video standards, it will support them both over this set of jacks and cables, switching modes as required.
Analog component is still interlaced; but digital is progressive, and supports resolutions (in some implementations) as high as 1920x1080; which is 1080p (though at that resolution, the standard is quite sensitive to cable length, and interference).
Most HD devices support digital component video at 720p or 1080i (which is interlaced; alternating halves of each frame are sent to reduce the total amount of bandwidth required).
Unfortunately, component doesn't support any DRM or copy protection features; so the studios won't let you output the highest quality signals over it, even though the standard is technically capable of outputting 1080p.
- DVI: DVI is an odd duck as well; because it also supports both analog and digital signals over the same connector. DVI has gradually been replacing VGA in the computing world, and to facilitate this, the DVI port can map pins to a VGA port with an adapter, to transmit the analog RGB signal that would otherwise have gone over the VGA cable (and therefore is technically identical to VGA).
Of course the DVI standard is also a digital standard, and has 29+1 pins (24 standard pins, 5 analog chroma/sync pins, and a grounded shield) vs the 15+1 (15 pins and a shield) of the VGA standard (which is technically called Dsub-15), so a HELL of a lot more data can be sent over DVI.
DVI digital can be had in two varieties, single link or dual link, the primary difference being how many pins they are actively using. Single link supports resolutions of up to 1920x1200 progressive, and 32 bit color; which is higher than 1080p. Dual link supports up to 2560x1600 progressive at 32 bit color, which is also sometimes called "quad-hd".
DVI can also support HDCP copy protection, so the studios graciously allow us to send their best quality content over DVI cables. DVI is the "lowest" standard which allows this functionality.
Like all the other discrete cabling standards, DVI is a video only standard; audio is carried separately.
- HDMI: HDMI is a 19 pin, high density compact connector cabling standard, that is electrically an extension of the DVI standard. The primary difference is that it runs at a much higher clock rate, and thus a higher bandwidth; and because of that can support a good deal more data being shipped over the wire, including high definition audio.
DVI dual link digital cabling supports up to about 7.4 gigabits per second of video data. The latest HDMI 1.3b standard supports 10.2 gigabits per second. This allows a maximum resolution of 2560x1600, and up to 48 bit color (which is actually far more color than the human eye can see). There's a quick back-of-envelope calculation at the end of this list showing roughly what data rates those resolutions actually need.
All new HD devices include HDMI, because it is the preferred format for AV manufacturers, as well as the studios. You want an HDMI 1.3 port, so you can support high definition audio as well as 1080p HD video, with the hated copy protection the studios enforce on us.
Basically, for at least the next 5 years or so, HDMI is, and will be the dominant media interconnect standard; so you're going to need components that support it, and the cables to go along. That's a good thing though, because it takes that entire spaghetti mess I've gone over above, and it sticks everything into one cable.
One cable for video and audio; one cable type, every kind of audio and video. It's simple, and it generally works (some early HDMI devices had compatibility issues).
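As a sanity check on those DVI and HDMI bandwidth figures, here's a rough back-of-envelope calculation of what uncompressed video needs. Note this is just raw pixel data; it ignores blanking intervals and the link encoding overhead the real cables carry, so actual on-the-wire requirements (and the link rates quoted above) run noticeably higher.

```python
# Back-of-envelope: raw pixel data rate for uncompressed video.
# Ignores blanking intervals and link encoding overhead, so real-world
# requirements run higher than these figures.
def raw_gbps(width, height, fps, bits_per_pixel):
    return width * height * fps * bits_per_pixel / 1e9

print(raw_gbps(1920, 1080, 60, 24))   # 1080p/60 at 24 bit color  ~= 2.99 Gbps
print(raw_gbps(1920, 1080, 60, 36))   # same picture in 36 bit "deep color" ~= 4.48 Gbps
print(raw_gbps(2560, 1600, 60, 24))   # the dual-link DVI / HDMI 1.3 class ~= 5.90 Gbps
```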
OK, but it STILL looks like crap...Alright, so let's assume we've got the TV hooked up to an HD source, using HDMI; or at worst digital component (let's ignore audio for now if you're not using HDMI).
Let's not forget the TV settings themselves. There is, quite frankly, a bewildering array of settings on modern TVs, some of which have quite dramatic effects on the viewing experience. Worse, many of them are obscure, and definitely non-intuitive.
Let's continue with the "mom and dad at Costco" example, and talk about TV quality.
Wal-Mart and Costco both carry the Vizio line of TVs as their bottom end HDTV products. In fact, as of this holiday season, Vizio is the best selling line of HDTVs in the U.S.
Now, not to say the Vizios aren’t a great deal, they are, but they have a very broad product line; and "mom and dad" probably bought the cheapest one in whatever size range they wanted.
Let's say they bought a Vizio VW-42L; the lowest price model currently sold in the 42" size range in most stores (42" is the most popular size for LCDs right now).
First, it's a 720p set, which is fine for under 50” and more than six feet away; but the 720p sets don't get the best features. For example, this model has no video processor, only the most basic scaler, and a mediocre response time of 8ms (that's full on/full off, not grey to grey; which means nothing to a non-video guy, but it's about half the speed you want).
If "mom and dad" have an A/V receiver (stereo receiver with video inputs and outputs) with HDMI, and a video processor( and they don't watch a lot of sports) this isn’t a problem; but given this is their first HDTV, I bet they don’t have an AV receiver with HDMI, and I can't imagine my Dad going without golf and football (actually my dad specifically LOVES his HDTV. He's a gadget lover, though he can never figure the things out).
So, first step, borrow an HD-DVD or Blu-Ray player from a friend… and borrow a friend who actually knows what they are doing with setting up a TV; and get a setup disc (they're about $25).
Now, make sure it's using the right cables, and looks good when properly set up with an HD source. Then make sure it looks at least acceptable with an SD source.
If not, then "mom and dad" went too cheap. Thankfully, they probably bought it at Costco, and they have the world's best return policy. Spend a couple hundred more and get something that advertises one of the following keywords:
- Faroudja
- DCDi
- HQV
- Silicon Optix
- Reon
- Realta
- Bravia (Sony has several revisions of Bravia, you want the XBR2)
Those are all the names of proper upscaling video processors, and/or their manufacturers. If the TV has even the most basic name brand video processing, you can be reasonably certain it will have acceptable image quality.
If they're stuck with the TV they've got, they probably need to upgrade their stereo anyway; and you can get an AV receiver from Onkyo that has Faroudja DCDi for $500, or HQV Reon processing for about a grand. At that point, if the TV can just act as a monitor for the video processor, you can get away with a cheaper TV.
Of course they'll be better off if they just buy a Sharp, Panasonic, Hitachi, Sony, Mitsubishi, or Pioneer in the first place; preferably a 1080p set. It’ll be a few hundred more, but the quality difference will definitely be noticeable.
Okay so "mom and dad" are sorted, what about the guy who says "It's just a scam anyway. they'll just be changing things constantly, theres a format war, and I'm just going to wait until it settles down". Or there's the guy who says, "Ahhh, I've just got a small apartment, I'll never see a difference anyway".
Yes, you will see a difference...The difference is startling on even a 24” TV (the smallest set you can get in 1080p). In fact, HD video at 1080p on a small screen has an almost 3D quality to it, because the individual pixels are so small and sharp, with such high contrast and vivid colors (and again, I'll go into more on why later in the post).
Now, if you have a good upconverting DVD player (one that takes a 480i signal and uses image processing to interpolate up to 720p or 1080i/p), you can get something that looks great out of a standard DVD. Though it's not all that close to real HD quality, and it doesn't pop in near-3D like true HD does, it's certainly more than enough to satisfy most people.
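For the curious, the crudest possible version of what an upconverting player does looks something like the sketch below: weave the two interlaced fields back into a frame, then scale it up. This is just line weaving plus nearest-neighbor resizing for illustration; real players use motion-adaptive deinterlacing and far smarter interpolation, which is exactly what the video processor chips discussed above are for.

```python
import numpy as np

# Crude sketch of "upconversion": weave two 480i fields into a 480p frame,
# then blow it up to 1080p with nearest-neighbor scaling. Real players use
# motion-adaptive deinterlacing and much better interpolation than this.
def weave_fields(odd_field, even_field):
    """Interleave two (240, 720) fields into one (480, 720) frame."""
    frame = np.empty((480, 720), dtype=odd_field.dtype)
    frame[0::2] = odd_field
    frame[1::2] = even_field
    return frame

def upscale_nearest(frame, out_h=1080, out_w=1920):
    """Nearest-neighbor resize: just pick the closest source pixel."""
    src_h, src_w = frame.shape
    rows = np.arange(out_h) * src_h // out_h
    cols = np.arange(out_w) * src_w // out_w
    return frame[rows][:, cols]

# Fake luma-only fields, just to show the shapes involved:
odd = np.random.randint(0, 256, (240, 720), dtype=np.uint8)
even = np.random.randint(0, 256, (240, 720), dtype=np.uint8)
print(upscale_nearest(weave_fields(odd, even)).shape)   # (1080, 1920)
```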
Up until recently that's what I was recommending people do, until the format war was resolved. Well, it's over, Blu-Ray won; and every HD-DVD and Blu-Ray player is also an upconverting DVD player, so buying one not only lets you watch next-gen content, it makes your older content look better too.
Oh and one thing you definitely will notice a difference with, is how bad non-upconverted standard definition movies look on HD-TVs.
You have to remember, they aren’t even making non-HD TVs bigger than 24” anymore; and pretty much everyone in America will be replacing their TV in the next 5 years… most of us in the next year; as the hype over the digital transition goes mainstream.
Every broadcaster has already moved over to DTV in preparation for the cutoff; most of them are broadcasting at least half their shows in HD, and will be broadcasting even more in HD after Feb 2009.
Right now, a little less than 30% of American households have at least one HD TV, and it’s expected another 30% will go HD by the end of 2009.
In February 2009 all broadcast TV goes digital, most of it in HD. Most of the cable companies are following suit by making HD a standard package feature. I figure almost everyone who has a DVD player today will have a Blu-Ray player by the end of 2010 or so; and as of the most recent Sony finagling, 80% of all new movies, and more than 90% of all back catalogue movies, will be coming out on Blu-Ray.
So this isn't VHS vs. Betamax anymore, it's like the difference between having a TV and not having one.
OK now I've talked about how much better HD is than SD... why is that? I mean it's all recorded the same way right?
Well, no, not really.
It's a matter of resolution (never mind the sound quality)...Standard DVDs are mastered in 530p, meaning 530 horizontal lines of resolution, progressively encoded. This gives you a 720x530 picture, with every line drawn in every frame.
Unfortunately, when output to a standard definition television, that is downconverted to 480i, a 640x480 image where half the lines of a frame are drawn at one time, 30 times a second each (to produce a 60Hz refresh rate and 30 frames per second, the television broadcast standard).
Standard definition broadcast television is even worse, with only 330 lines of horizontal resolution; and standard VHS worse still at 230 lines.
This is why DVD looks so much better than either VHS or TV; and why DVD has become most people's standard for video quality.
So let's compare DVD's quality on an SDTV, to that of an HD-DVD (or BluRay, they're identical in image quality) on a 1080p HDTV.
The difference lies in the amount of information being displayed.
SD-DVD content on an SDTV is displayed at 480i. That's 640x480 interlaced, for 307,200 pixels; or rather, half of that is drawn in each field, for a total of 153,600 pixels being drawn at any given moment.
HD content is mastered in 1080p, which is 1080 lines of vertical resolution at 16:9 aspect ratio. This gives you a 1920x1080 picture drawn progressively. It can be displayed at several rates including 24fps, 30fps, 50fps, 60fps, 72fps, and 120fps; but it is mastered at either 30, or 24. For purposes of this comparison let's just match it to SD and make it 60.
The total number of pixels being displayed at any given moment for a 1080p signal, is 2,073,600; and it is displaying the full content at all times, rather than alternating every other row.
That’s 13.5 times the information being displayed on the screen.
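The arithmetic behind that, if you want to check it yourself:

```python
# Pixels on screen at any instant: interlaced formats only draw half the
# lines per field, progressive formats draw the full frame every time.
def visible_pixels(width, height, interlaced):
    return width * height // (2 if interlaced else 1)

sd = visible_pixels(640, 480, interlaced=True)     # 153,600 pixels per field
hd = visible_pixels(1920, 1080, interlaced=False)  # 2,073,600 pixels per frame
print(sd, hd, round(hd / sd, 1))                   # 153600 2073600 13.5
```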
Actually, it’s more than that; because HD color depth is higher.
A standard TV can only display about 16 bits of color simultaneously, though the total spectrum of color available to SDTV is approximately 24 bits worth (broadcast TV maxes out at between 18 and 20 bits, because it steals color data to boost brightness and contrast).
HD TVs can display at a minimum the full RGB colorspace; and can do so at 24, 32, 36, 40, or 48 bits of color simultaneously (depending on the type and model).
The color model involves three colors, red, green, and blue; and each color is represented by an equal number of digital bits. So 24 bit color represents 8 bits each of red, green, and blue information (256 discrete shades of each color); and 48 bit color represents 16 bits of information for each color (65,536 discrete shades of each color).
Color depths above 24 bit are referred to as "deep color". There are, today, movies and players with 36 bit color; and in theory they could go to 48 bit (the maximum color depth of both film and professional digital video, and the equipment used to process it).
Now, it's important to note that 36 bits of color isn't just over 2 times the color of 16 bits; it's about a million times as much color. 16 bit color is about 65,000 colors, 20 bit color is about a million colors, 24 bit color is about 16.7 million colors, 32 bit color is about 4.3 billion colors, 36 bit is about 68.7 billion colors, and 48 bit is about 281 trillion colors.
That 48 bit figure works out to 65,536 discrete shades of each of red, green, and blue.
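If you want to sanity-check those numbers yourself, here's a trivial sketch. It assumes the bits are split evenly across the three primaries, which is true for 24, 36, and 48 bit color; 16 and 32 bit modes divide things up differently (5-6-5, or 8-8-8 plus alpha).

```python
# Quick back-of-envelope check on the color depth numbers above.
# Assumes bits are split evenly across the three primaries.
for bits in (24, 36, 48):
    per_channel = bits // 3
    shades = 2 ** per_channel   # discrete shades of each primary
    total = 2 ** bits           # total displayable colors
    print(f"{bits}-bit: {shades:,} shades per primary, {total:,} colors total")

# 24-bit: 256 shades per primary, 16,777,216 colors total
# 36-bit: 4,096 shades per primary, 68,719,476,736 colors total
# 48-bit: 65,536 shades per primary, 281,474,976,710,656 colors total
```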
If that sounds like a ridiculous number, it is and it isn’t. Yes, it's a huge number of colors, but because the human eye is analog and doesn't break things up into discrete bits, it can distinguish about 32 bits of color (and some genetic anomalies can distinguish about 40 bits, because they have an extra set of cones - that's about 1.4 billion, and 366 billion shades of each primary color respectively).
Mastering at higher depths of color than we can see makes color anomalies like banding, and clear-field moire or block patterns, nearly impossible (large areas of very bright solid colors don't look distorted).
We’ve had the technology necessary to broadcast 480i at 30 frames per second, in 20 bit color since the mid 50s; it’s only in the last five years or so that we could even contemplate broadcasting 1080p/30 in 32 bit color.
In fact, it’s so much bandwidth, that even now most broadcasters are only using 1080i (that trick that halves the amount of data you’re sending at once) in 24 bit.
What about the sound?...Of course up to now I've said "let's ignore the sound"; but it's one of the biggest components of the difference between HD and SD.
Again, let's ignore broadcast TV, because it's pretty uniformly awful in sound (though it's amazing what you can fool the mind into believing is there, with a good audio processor. Thank you, Dolby); and focus on DVD.
SD movies on DVD are mixed in Red Book PCM 2-channel stereo (CD quality), Dolby Pro Logic, Dolby Digital 5.1, or DTS 5.1.
Those formats encode the soundtrack at approximately the same quality as standard CDs, with a typical bandwidth of about 320Kbps for Dolby Digital 5.1 (though they CAN be mixed and mastered at up to 448Kbps), or about 640Kbps for DTS (though most are mastered at FAR lower rates even than that, and the theoretical maximum is 1.5Mbps).
Broadcast HD audio, for programming from the premium HD movie channels for example, generally also uses the DD or DTS codecs, but it uses them at 640Kbps and 768Kbps; far higher than the 448Kbps limit of a DVD.
Getting into BluRay and HD-DVD though is where you see the biggest differences.
HD audio is mastered in multiple formats on each disc (as many as five) to support the various standards available; and is mastered at a MUCH higher data rate.
Dolby TrueHD and DTS-HD Master Audio are sampled at 24 bits per sample at 96kHz, per channel, for up to 8 discrete channels. The maximum bandwidth is up around 24.5 megabits per second, though most soundtracks use lower quality settings (this is one area where Blu-Ray outperforms HD-DVD, which is encoded at about a 20% lower bit rate).
Going from 320 kilobits per second up to 18 megabits per second, or from 640Kbps to 24.5Mbps, is a HUGE difference; and that's not even taking into account the fact that the newer encoding technologies use compression that is about twice as effective.
You're talking about almost 100 times as much raw sound data in a True HD or Master Audio soundtrack as in a full quality standard 5.1 DVD.
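To put rough numbers on that claim, using the bit rates quoted above and treating the "about twice as effective" compression as a flat 2x multiplier (which is obviously a simplification; real-world gains vary from mix to mix):

```python
# Rough ratios between DVD-era and HD disc soundtrack data rates,
# using the figures quoted above.
comparisons = [
    ("Dolby Digital 5.1 -> TrueHD", 320, 18_000),      # kbps -> kbps
    ("DTS -> DTS-HD Master Audio", 640, 24_500),
]
codec_efficiency = 2.0   # newer codecs' "about twice as effective" compression

for label, old_kbps, new_kbps in comparisons:
    on_disc = new_kbps / old_kbps
    effective = on_disc * codec_efficiency
    print(f"{label}: {on_disc:.0f}x the bits, roughly {effective:.0f}x the sound data")
```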
So... is all that really necessary...Well, that's a tough question. It really depends on the individual consumer, doesn't it?
Believe me, unless you're deaf and blind, you are going to see and hear a huge difference between a standard DVD with a Dolby Digital 5.1 sound track, and a 1080p high def video with a Master Audio sound track. The former looks and sounds pretty good, the latter... it's almost like being in the same room.
Of course, it all comes at a price...You're going to need about a $1500 TV, a $500 stereo receiver, a $400 HD video player, and about $1000 worth of speakers (minimum) to really make this work. The movies are about $10 more expensive each as well (oh, and an extra $100 for cables and the like); and all those are minimums. You can easily spend twice that, or even 20 times that (if you're building a dedicated theater room for example).
$3500 isn't chicken feed.
Heck, I've been assembling a home theater (or at least an HD home entertainment system. Some people balk at the use of the term "home theater" unless it's a dedicated theater room) for two years now piece by piece, and it's cost us about $5500 to get to that point (we have a bigger TV, better speakers, and a better receiver). Yes, we spread it out over two years, but still, it's a fair chunk of change.
Now personally, I think HD is great. I don't want to watch movies and TV that aren't in HD anymore, and given that $3500 (or $5500 in our case) should last for anywhere from 5 to 10 years, I think it's worth it. Heck, we pay more for cable than we did for the whole entertainment system.
Others might not see it as worth the money though; it's really up to the individual consumer.
So you can say that the difference doesn't matter to you, or that you don't think it's worth the money; but don't try and say there's no difference.
Of course the fact that it looks and sounds better doesn't resolve the problem that very little of it is worth watching. I'm not sure if "American Idol" being in High Definition makes it better, or worse.