But the deal was so good, I couldn't pass it up....
Well... actually... BOTH deals; but I only ended up with one, and that was as a result of the first one falling through.
What am I babbling on about?
Why, buying my SEVENTH computer for the holiday season; this one for lil'ol'me.
Kinda funny how this one worked out.
I had wisely avoided the day after Christmas mess; but I got several emails that day alerting me to "special savings" etc...
One of them had a system exactly like what I was planning to build, except without the second hard drive or the BluRay/HD-DVD combo drive. The kicker was, it was only $999... but they wanted $70 shipping.
Well, that set me off on a research run as to specials available that day, and build vs. buy values based on those specials.... then I remembered I had seen one of the exact same computers that I was looking at as a "special value" deal in Costco with a 24" monitor, for $300 more than I was seeing it on the net for... which was a worthwhile deal if I decided I wanted that monitor; but I don't buy monitors sight unseen.
So the day AFTER the day after Christmas, I headed down to Costco to check it out.
Thus the adventure TRULY began.
When I got to Costco, they indeed had the system I remembered; but apparently that price was only good just before, and the day after Christmas. It was running $200 higher that day.
... still a good deal actually, but not better than I could build after taxes vs. shipping were taken into account.
At any rate, since I was out; I decided to head down to Frys, just to do a little comparison shopping; and because I knew their after Christmas sale went through the 1st... well technically their after Christmas sale went through the 28th, and their New Years sale went through the 2nd but anyway.
I got down to Frys, and they had the same system as Costco, but without the monitor; for about $400 less. Again, a good deal, but not better than I could build; and it's a $580 monitor, so I'd be better off buying from Costco.
Now I don't know if you've ever been to Frys, but they tend to be thin staffed at the best of times; and this was not the best of times. Also, unfortunately, my smart phone's internet connection doesn't work inside their store; and I had various technical questions I needed answered about several models.
Let's just say it took me a while.
At any rate; I didn't care for the price on the Intel C2-Q6600 machine with the BluRay/HD-DVD combo drive. It was $1350; the exact same price as I would get online actually, but $25 shipping vs $90 tax... Not a great deal.
... BUUUUT
They had a brand new quad core AMD Phenom system, identical to the Intel box 'cept without the BluRay combo drive... and they had it for $950... and they sell the BluRay drive separately for $299.
and I really like the memory architecture on the Phenom; it's actually significantly better than that of the Core 2 (there are errata on both processors by the way; neither is likely to occur in any real world instance).
Better yet though, that is actually cheaper than I could build the same system for.
Now remember, this is after an hour driving to and shopping in Costco then driving down to Frys, and another 90 minutes in Frys trying to get served etc... etc...
I looked at the deal, I thought about it for a couple minutes, I thought about the online prices and the build prices... and I said what the hell, let's buy it.
So, 20 minutes later the sales guy finds the box in the back (they've only got one and it's stacked in some weird location), and I head up to the line with it.
Again, if you've never been to Frys, the lines are legendary. They have 30-50 cashiers lined up at the front of the store, and one REALLY long line, that snakes back and forth across the full width of the store, in multiple rows of S curves.
On this night, there were about 200 people in line ahead of me. From the time I made my "let's buy it" decision, to the time I got to the register, was about an hour. A total of about 2 and a half hours in the store...
And I get to the register, and check all my things in, which takes a good ten minutes because of a malfunctioning scanner...
And I present my card for payment and get in response "I'm sorry sir, we no longer accept American Express".
WHAT!
Now let me just let you know something. We have a few cards, but they are all low limit (on purpose), and used mostly for the reward points, and for the convenience of buying and paying bills online. We pay our cards off like cash every month. The only higher limit card we have is the Amex, which we use as our primary large purchase card, because it's the only card they take at Costco (in fact, it's the Costco Amex rewards card. A great deal by the way).
Now technically, I could have done it by splitting the purchase across two other cards (which they will do); or I could have paid in cash on the debit card, but this is just after Christmas, and I'd just (that day) paid every credit card off in full to account for Christmas purchasing. Not only did I not want to put more on the other cards, but I wasn't sure which payments had cleared yet and which hadn't... and I didn't want to dip into the cash reserves that far (the total including tax, and the bluray combo drive was $1300 something).
So after two and a half hours in Frys, and another hour in Costco, I just abandoned the hardware there and made ready to go home.
Man I was irritated. I HATE wasting my time like that; and I especially hate having to change things around after coming to a decision.
What happened next... well, that's another post; but let's just say that my weirdness magnet proved once again infallible, and it cost me $150, and a fair bit of irritation. When I say this is for another post, I mean it. It's too good a story not to share, but I'm going to share it some other day soon.
At any rate, by the end of all this, I'd been out for four hours or so, and it had not only been a complete waste of time, it had been a net loss. I was tired, and pissed off, and annoyed (yes I can be all three at once, and it's not a pretty sight)....
I needed to either break something, or build something... or maybe both.
So there was a Best Buy not too far out of my way home, and just for the hell of it I stopped in; not expecting I'd find anything, but I wanted to see what their after Christmas deals were...
And damned if they didn't have a Core 2 Quad system, identical to the one I'd got an email about, that started this whole mess... and for $910... $90 cheaper than the email box. In fact, even accounting for tax vs. the email box's shipping, it was still about $90 cheaper.
So I asked the sales guy about it, and he said "Oh, it's the last one we've got, a floor model; you can get it for $800".
To which I said "do you take Amex?"
Yes, yes they do.
So, the specs:
Proc: Core 2 Quad Q6600 at 2.4GHz
Mem: 3GB PC6400 (2x 1GB, 2x 512MB)
Video: Nvidia GeForce 8500GT
Board: Intel G33 OEM workstation board (good quality, a bit light on features)
Drives: 2x 500GB 7200RPM SATA 3Gb/s with 16MB cache
Other: Lightscribe DVD combo drive, 15-way media reader, optical out and HD surround, 2 FireWire ports onboard, 8 USB 2.0 ports, eSATA port, 7 SATA ports total, GigE.
The only thing I'd change if I were building it myself is to go with a gamer's motherboard (about $70 for an equivalent board), and to use an 8600GT instead of an 8500GT ($150 vs $75). My cost including tax: $865. Difference in component cost: $145. Cost to build it and ship it as I would have built it: $1110.
Hmmm.... looks like I break even, even replacing the motherboard and video card, and I'm not going to replace the mobo. Honestly, I don't need to overclock a quad 2.4GHz machine, and that's the only reason to replace the mobo.
Oh and of course this way I get a 1 year warranty, tech support, and an OS; rather than having to actually pay MS an extra $150 for an OEM copy, which I didn't include in the build price (I have OEM XP, I don't have OEM Vista, and if I want DX10 games, it's Vista only).
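For the pedantically inclined, here's that back-of-the-envelope math laid out as a quick Python sketch; the dollar figures are just the ones quoted above, nothing more precise than that:

# Rough buy-vs-build tally, using the figures from this post.
# "Upgrade" = the gamer board plus the 8600GT; the $145 component difference noted above.
bought_price = 865      # floor model, including tax
upgrade_delta = 145     # better motherboard + video card, if I bothered
build_price = 1110      # building and shipping it myself, as I would have specced it
oem_vista = 150         # OS cost that was NOT included in the build price

buy_and_upgrade = bought_price + upgrade_delta   # $1010
build_total = build_price + oem_vista            # $1260

print(f"Buy + upgrade: ${buy_and_upgrade}")
print(f"Build + OS:    ${build_total}")
print(f"Building costs ${build_total - buy_and_upgrade} more")   # $250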
I MAY replace the video card at some point; but I can reuse the 8500GT in my kids' computer (which has crappy video), so it's no loss.
In the meantime, even this vid card overclocks to 600MHz stable with no heat issues (the 8 series cards are all overclocking monsters), maxes out the Vista performance scores, and screams in every game I've tried it on... though I haven't tried to run Crysis yet; and I'd expect it wouldn't do too well since it's only a 256MB board (actually the only reason why I'd bother changing it is to get 512 or 640).
I stuck in a wireless card; and I'm going to grab that $300 combo drive (funny enough, it's not cheaper anywhere online than it is here locally), and a Hauppauge PCIe HDTV/VidCap card with remote.
I actually could've bought the comp with the 24" monitor, and got another $140 off, but I was still trying to decide what to do about a new display (and I have a really top quality 19" already) so I declined that deal (it WAS a smokin deal though).
I'm still trying to decide if I want a new 24" monitor; or if I want to go whole hog and get the 30" and just use the whole thing as my secondary HDTV/Media center.
Or hell, I could even go dual head on the thing with my 19" as my "windows" display, and use an HDTV as my other display. They've got some really nice smaller model LCDs at quite low prices these days; and 1920x1080 from a computer over HDMI looks just as good on an $800 30", 32", or 37" TV as it does on a $1400 monitor (of course the $1400 30" monitor can do 2560x1600, but I don't NEED more than 1920x1080).
Oh and before you real hardcore geeks ask, yes it's dual booting Ubuntu, and yes, kernel compiles absolutely SCREAM.
For my graphics geek friends, don't expect TOO much because it's just an 8500gt; but the cpu scaling results are pretty damn good:
CINEBENCH R10
****************************************************
Tester : Chris Byrne
Processor : Intel(R) Core(TM)2 Quad CPU Q6600 @ 2.40GHz
MHz : 2400
Number of CPUs : 4
Operating System : WINDOWS 32 BIT 6.0.6000
Graphics Card : GeForce 8500 GT/PCIe/SSE2
Resolution : <1280x1024>
Color Depth : <32>
Rendering (Single CPU): 2435 CB-CPU
Rendering (Multiple CPU): 8586 CB-CPU
Multiprocessor Speedup: 3.53
Shading (OpenGL Standard) : 3285 CB-GFX
Oh and that's on the standard clock; the fan always being on high when I had the video oc'ed was irritating me.
The 8600GT typically sees 1000 better or so on the Shading test. I haven't seen an 8800GT yet, but I'd guess it's another 500 over and above that. Also, Vista typically sees lower results than XP (or OS-X), and for "standard" tests, benchmarkers usually use 16 bit, because some systems use 24 bit and some 32 bit as their maximum, but they all support 16 bit.
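If you're wondering where that 3.53x multiprocessor speedup figure comes from, it's just the multi-CPU score divided by the single-CPU score; here's a trivial sketch using the scores reported above:

# Multiprocessor speedup and parallel efficiency from the Cinebench R10 run above.
single_cpu = 2435    # CB-CPU, one core
multi_cpu = 8586     # CB-CPU, all four cores
cores = 4

speedup = multi_cpu / single_cpu     # ~3.53x, matching the report
efficiency = speedup / cores         # ~88% -- pretty good scaling for a quad

print(f"Speedup:    {speedup:.2f}x")
print(f"Efficiency: {efficiency:.0%}")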
I'll run a few more different benchmarks (any requests?) and post them up tomorrow.
Monday, December 31, 2007
One of the interesting things....
about living in Arizona, is that at 4:30pm on New Year's Eve, the temperature can be 78 degrees... as it was today.
The previous evening's low having been 37 degrees...
Gotta love that desert eh?
Saturday, December 29, 2007
Undefeated
On Saturday, December 29th, 2007, at 11:22pm Eastern time, the 2007 New England Patriots are officially the greatest single season team in football history.
Beating the Giants for a 16-0 undefeated season, the Pats have achieved only the fourth undefeated season in NFL history, the first in 35 years, and the only undefeated season in the 16 game era (the previous undefeated teams were the 1934 and 1942 Bears, at 13 and 11 games respectively, though both were defeated in the championship; and the 1972 Miami Dolphins at 14-0 in the regular season, and 3-0 in the post season).
Additionally, the Patriots bested the all-time single season scoring record; and individually, Tom Brady beat the single season touchdown pass record (50), and Randy Moss beat the single season touchdown reception record (23).
Right now I'm watching the local post game commentators, and some minor ex-Cardinal (Jay Taylor) is saying "eh, the Patriots aren't that great, they're good, and they're lucky".
I'm sorry, I don't see how anyone can look at the Patriots' performance this year and not say that this year's team is the greatest single season team ever.
Further, they have been the best team in football since 2001 (scoring, win/loss, division, conference and Super Bowl championships), and among the best since 1996; and have the fourth best Super Bowl record all time (behind the Cowboys, 49ers, and Steelers).
Clearly, the Cowboys are the best team overall in the Super Bowl era NFL (5-3 in Super Bowls), and the 49ers are second (5-0). I really think it's reasonable to say though, that the Patriots are in a dead heat for third with the Steelers if they win the Super Bowl this year.
If the Pats win this year they'll be 4-2, and the Steelers are 5-1; but the Steelers have never had an undefeated season, nor have they had a winning streak like the 2003-2004 Patriots; and they've only been to two, and won a single Super Bowl since 1979 (they did win 4 in 6 years though; a record the Patriots could beat this year by doing it in 5).
The Packers of course have more league championships than anyone else (12), but all but three of those were in the pre-Super Bowl era; and they have only won a single Super Bowl since 1968 (against the Patriots, as it happens).
I can't wait to see the boys go 19-0 and beat (most likely) the Colts, and the Cowboys for a perfect post season as well. That should finally shut those idiots (like Taylor) up about how the Patriots really aren't "great".
Wednesday, December 26, 2007
The Haul
So, as has become traditional in the blog world, people are talking about what they got for Christmas.
So what the hell, let's join the party.
Firstly, I didn’t really get much gun stuff this year:
Winchester gun cabinet
Lyman 2500 magnum tumbler
3x 10mm magazines
I had a decent haul on non-gun stuff (individual):
New guitar (big present from wife) - Ovation Celebrity ultra-slim acoustic-electric
New watch (other big present from wife) - Invicta automatic dive watch with diamond markers
New Turkish cotton bathrobe (very big and comfy)
About 20 books
About 40 DVDs and HD DVDs (a couple of full series sets, plus individual movies)
Family presents:
61” HDTV
New desktop computer for the family
A new puppy (Jayne was an early Christmas present)
It was a techie year for most of my gift giving as well.
What I got for Mel:
New laptop
Yamaha electric piano
The complete Calvin and Hobbes (hardbound collector's edition)
DVDs
5 magazines for the Hi-Power
Everyone else gave her books and DVDs.
I gave a new (or refurbished) laptop to two of my friends, and my mother. I gave a new or refurbished desktop to the kids (as a family present), and my father-in-law.
In terms of gunnie presents that I GAVE…
I built JohnOC a new work-counter-height (elbow height on a standing man) reloading and projects bench, and gave him all of my old turret press and manual reloading stuff (everything, including tumbler, press, powder measure, dies, calipers, etc...) plus a label maker, and a magnifying lamp.
A couple of us got together and bought Kommander a new Burris Speed dot XTR.
That was about it. I gave a total of 6 computers this year, which is a new record for me. Even including the big family presents, we spent less than last year (thank god. We went a little nuts last year); and everybody is happy and healthy.
What more can you ask for.
Monday, December 24, 2007
Bursting Your Bubble - The Investment Fallacy
"Buy a house, it's the best investment you'll ever make"
Half of that is sound advice; buying a house is, in general, a good idea. The second half of that statement though, is completely and utterly false.
Unless you are paying cash, a house isn't an investment; and even then, you can almost always get a better return out of your cash than you would have out of the house.
"Oh you're so right, but if you get a 15 year mortgage, you're paying a little bit more, but THEN you're really building equity so you come out ahead"
Well, actually, you're paying a LOT more; and you come out further behind.
"Well I waited 'til I had the cash to buy, and didn't get into any debt"
Okay, now your house IS an investment; but it's one with a very high risk, a poor return; and most importantly a VERY high opportunity cost. You could have been making a hell of a lot more money with that cash.
All of what I'm saying goes against the conventional wisdom, and even financial advice from some people who should (or who do) know better. In fact, on its face, it seems contradictory to its own internal logic.... So why am I the super genius saying this, and not some financial guru?
Because everyone likes to think of their house as an investment. People get angry and lash out at you if you threaten that idea in their head. Most of the famous-type people who should be saying this are trying to sell books to those same people, who would burn them at the stake for challenging their idea about their house.
I won't say numbers don't lie; because of course you CAN lie with numbers; but seriously, run them. When you run the numbers, you will see that not only is a house not an investment, it is almost always a major liability...
But wait a sec, I said that buying a house was good advice... if a house is a major liability, how can it be good advice to buy one? The answer is, once again, in the numbers.
Let me show you what I'm talking about:
My home is worth about $300,000. I am currently in a lease to own deal; which means that at the end of my lease I have the option to buy at an agreed price, with a portion of my lease payments discounted off that price.
I do not have, and cannot easily save, the $60,000 necessary for a 20% down payment... and in fact I don't know anyone who is paying 20% down anymore. That means 5% down mortgages, and PMI for me... and just about everybody else as well.
Most people traditionally opted for a 30 year fixed mortgage. Although other options have proliferated greatly in the last few years, especially interest only, jumbo, and 3/1 or 5/1 adjustable; for most people the 30 year fixed is still by far the best deal... or rather the least bad deal; and I'll explain why as we go.
Now accepting that buying a house is a good idea, and starting with the default 30 year fixed, let's look at why the advice to go to a 15 year mortgage is a bad idea financially.
A 30 year fixed mortgage on my house at 6% interest, no points, and 5% down would result in a payment of $1800 a month, not including property taxes and PMI.
Going from a 30 to a 15 year mortgage on my house would add an additional $1000 a month to the payment; though it would cut $144,000 in interest charges off ($648,000 vs $504,000 in total payments).
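(Quick aside for anyone who wants to run these numbers for their own situation: the bare principal-and-interest payment comes from the standard amortization formula. Here's a minimal Python sketch; note that it deliberately leaves out PMI, taxes, and fees, so its output comes in under the all-in round numbers I'm using above.)

def monthly_payment(principal, annual_rate, years):
    """Standard fixed-rate amortization payment: principal and interest only."""
    r = annual_rate / 12        # monthly interest rate
    n = years * 12              # number of payments
    return principal * r / (1 - (1 + r) ** -n)

loan = 300_000 * 0.95           # $300k house, 5% down

for years in (30, 15):
    pmt = monthly_payment(loan, 0.06, years)
    print(f"{years}-year @ 6%: ${pmt:,.0f}/month, ${pmt * years * 12:,.0f} paid over the life of the loan")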
$144,000 saved in interest; that's a great deal right?
Well, saving on interest is good, but the real question there is, would putting the cash towards early payoff, or into a mutual fund instead result in a better return after 15 years?
Presuming your house appreciates 50% in value over 15 years (the average since 1980 rather than the recent bubble rates), including inflation, then that $300,000 house will be worth $450,000; and you will have paid $504,000 for it; a net loss of $54,000.
If you use the extra $1000 per month for early payoff of a 30 year fixed, you end up paying $561,000 total; a net loss of $111,000.
Now, presuming you put all $1000 per month into a fund; and aggregating for annual performance, plus assuming reinvestment of earnings, at an average year over year of 8% (that's average fund performance over the last 30 years for index funds. Some do much better), and capital gains of 15%; you come up with $384,000.
You will have paid $324,000 in mortgage payments, which at the 15 year point will have reduced your principal balance to about $200,000, for a total liability of $524,000. If you then sell the house for $450,000 your net loss would be $74,000.
However, you now have $384,000 in savings and investment income; which offset against that $74,000 net loss, is a $310,000 gain. Even removing your initial capital investment of $180,000 over 15 years, you’re talking about a $130,000 gain.
Or better, don’t sell. Just pay off the loan and take the $1800 a month that was going into your mortgage, and put it into your 8% fund instead. Now you’re starting with $184,000, and your annuals go up to about $34,000. At the end of that second 15 years, you’ve got about 1.6 million in cash, AND a $600,000 house.
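(The fund side of that comparison is just the future value of a stream of monthly contributions. Another rough sketch, assuming monthly compounding at 8% a year and a flat 15% capital gains hit on the growth at the end; depending on exactly how you model the compounding and the taxes, you'll land in the same ballpark as, though not exactly on, my figures above.)

def future_value(monthly, annual_rate, years):
    """Future value of a fixed monthly contribution, compounded monthly."""
    r = annual_rate / 12
    n = years * 12
    return monthly * ((1 + r) ** n - 1) / r

contributed = 1_000 * 12 * 15                # $180,000 put in over 15 years
gross = future_value(1_000, 0.08, 15)        # roughly $346,000 before taxes
after_tax = contributed + (gross - contributed) * (1 - 0.15)   # 15% cap gains on the growth only

print(f"Contributed: ${contributed:,.0f}")
print(f"Gross value: ${gross:,.0f}")
print(f"After tax:   ${after_tax:,.0f}")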
I keep trying to explain it to people; houses are not an investment unless you can pay cash.
Paying a house off early may make you more secure, and it may very well be a good idea; the less interest you pay the better; but mortgages are not a good deal for you. Interest almost always accrues faster than value; and most often you would be better off taking that money and investing it.
The only good thing about a mortgage, is that unlike rent, you have an asset to recover value from. I’d rather be paying $1800 a month, and recovering some asset value, than paying $1800 a month in rent, with no asset value. That’s why mortgages make sense at all; because you’d have to be paying rent anyway.
Over a typical 15 year mortgage, you are going to pay around 65% of the original mortgage value as interest, and a house will increase in value about 50%; you come out at a substantial net loss.
Over a 30 year mortgage you pay about 105% as interest. A house will typically double in value over 30 years, so you come out pretty close to even.
Now, if you pay your house off in 15 years, then hold on to it and sell it after 30, you make about $100,000; but 30 years worth of inflation makes $100,000 a pretty small number.
The only way mortgages make sense, is if you look at them as an alternative to rent. 30 years of rent at $1800 (let's presume rent is the same as the mortgage payment) is $648,000; and at the end you've got nothing but a net loss; whereas with a mortgage, you've got a $600,000 asset for a net loss of just $48,000.
Even for a 15 year, it's a better deal: renting over those 15 years would be a $504,000 net loss, vs. a $54,000 net loss buying.
So don’t look at your house as an investment; what it is, is a stoploss.
Of course this should make it clear why IO (interest only) loans make no sense whatsoever. They are in fact worse than rent; because not only are you not building asset value (in effect you are paying the bank rent), you are incurring a huge liability and risk at the same time. There is no upside to you, and no downside to the bank when you are in an IO loan.
If you do take an IO, it's because you are going into more house than you can afford; and betting that the house's value will increase fast enough to offset your liability, and build equity for you.
Bad bet.
In order to get value, you need to understand just how much you are losing; and how much less you lose with different mortgage and payment options. This lets you maximize the value of your stoploss, and direct your assets and income into areas where they can best earn for you.
Sunday, December 23, 2007
Interesting twist on a quiz
I received 86 credits on The Sci Fi Sounds Quiz ("How much of a Sci-Fi geek are you?").
Thursday, December 20, 2007
Choosing your HD service - HD Cable, or DirecTV HD?
So as I've written over the past few weeks, we've upgraded to an HDTV and TiVO-HD, along with HD service with our local cable company (Cox).
A lot of folks have chosen to go with satellite for their HD service, and that can be a very good choice. DirecTV has far more channels in HD than most cable providers. As of today they have 85 HD channels available, whereas in my market, Cox only offers 24.
Both are of course expanding their offerings. DirecTV has said it will have 100 channels available in HD by the middle of 2008 (it was originally supposed to be by the end of 2007, but they are still 15 short). Cox is planning on 80 channels by the end of 2008, rolling them out 12 per quarter, the first group coming March 8th.
More important to many, DirecTV offers out of region NFL football, and MLB baseball on an exclusive basis (they have NASCAR and NBA with enhanced programming, but not exclusive). If you live in Phoenix, and want to watch a Patriots home game that's in local blackout, DirecTV is the only way you can get it.
In terms of general HD programming, that situation is going to equalize over the next two years. By 2009, HD will be the default choice for all cable and satellite providers, and the basic HD tier will be included for free (Cox has just announced this in fact. They will stop charging for basic HD service as of 2008).
Also, by 2009, all broadcast TV will be fully transitioned to HD, because of the analog cutoff. This means all your local channels will be in HD; and by extension all national network channels. Unfortunately, a lot of the cable channels will still only be 480p ED (extended definition), because the providers are trying to save bandwidth by using lower resolution and higher compression.
The broadcasters have all gone to HD, because switching off analog (which they legally had to do anyway) cost them roughly the same whether it was SD or HD, and HD got them more viewers. The cable providers on the other hand (although HD is also bringing them more viewers); are not as enthused, because their networks have bandwidth limitations that OTA (over the air) digital broadcasts don’t have (or rather the limits exist, but they are not an issue for a single channel broadcast). This means infrastructure upgrades for a lot of cable systems; which have cost them billions of dollars over the past few years (and will continue to cost them billions for the next two or three years as well).
These limitations also exist for satellite providers, but the sat companies don't have millions of miles of substandard copper cabling to replace; they just need to replace the head ends and receivers, and rent more bandwidth on the satellites (or launch more satellites - they’ve been doing both).
This has allowed satellite providers to move faster in putting HD content out there; but the costs are still substantial. There are only so many satellites in the sky, and only so much bandwidth available on them. Right now, HD service with DirecTV requires a clear view of a 5 satellite constellation; and that's the limit without launching more satellites, and upgrading every subscriber's dish.
In order to minimize these infrastructure costs, the cable and satellite companies have both limited the number of HD signals they are offering; and they are compressing those signals as much as possible.
Of course compression allows them to squeeze more programming into a given amount of bandwidth; but it also reduces quality. Some channels are contracted to be broadcast with less compression, and some signal formats or content survive compression better. This compression is one reason why some channels look quite poor in HD (Food Network for example), while others look flat out spectacular (Discovery HD).
By about 2010, the cable companies will have completed their physical plant upgrades in most regions; and the satellite providers will have reached the maximum capacity of their existing infrastructure without a MAJOR upgrade (as in hundreds of billions of dollars major). At that point, other than contract exclusivity (which I predict won't survive very long in the ubiquitous HD era as more customers complain to the sports leagues rather than switch providers), the programming should equalize between cable and satellite.
I should note cable and DirecTV are not the only alternatives. There are still c-band ("big" dish) and Dish Network on the satellite side for example; and a few lucky folks in some markets have fiber optic based TV services.
Dish offers a similar level of service to DirecTV, with 70 HD channels (and growing), and a SIGNIFICANTLY better HD-DVR (also TiVO technology based, but without the extra features); but they don't have DirecTV's marketing muscle, or their exclusive contracts.
Right now Dish stands at about 14 million subscribers, to DirecTV's 17 million (those are worldwide numbers, though both have the vast majority of their subscribers in the US); and neither company is in particularly great financial shape, though both have strong revenues.
Simply put, the Satellite business is expensive, and capital intensive. Every satellite launched can cost upwards of a billion dollars, they have a limited lifespan (15-25 years depending); and sometimes the launches fail (there were two major failures last year for example). Local cable companies have their cable plants to maintain, which are also very expensive; but the cost is spread out to the local providers rather than concentrated on two companies.
This large capital cost is the primary reason why satellite companies haven't been able to simply add a huge number of services and run away with the market. That said, as you can see by the numbers, mini-dish providers combined now represent a little less than 1/3 of the US "enhanced" television market (the 106 million households with cable, satellite, or other non-local broadcasts); a huge leap from where they were 10 years ago.
C-Band services, which once dominated satellite; have fallen out of favor in regions where mini-dish services are available. Though they are still going strong outside of those regions, programming availability may be inconsistent. Many new HD signal streams are in a format that no c-band home receiver can decode for example, so even if you wanted to you couldn't subscribe to them.
The major Telcos are leveraging some of their cable plant expenses, and beginning to offer television services over their local loop; where the loop has been upgraded to either FTTP (fiber to the premises, also called FTTH) or FTTN (fiber to the node, also called fiber to the neighborhood). These services use high bandwidth fiber optics to transmit IP (internet networking) based telephone, internet, and television programming.
The best known of these services is Verizon's FiOS (an FTTH service), but other providers offer varying levels of service in different areas. The second largest provider of these services is Qwest. Both telcos also resell DirecTV in areas where their FTTP/N services are unavailable.
Currently these services are both technically immature, and not commonly available. Most providers are in the midst of a transition from first generation VDSL services, which used analog multiplexed television signals over relatively low bandwidth links (under 40 megabits, which is barely sufficient for two simultaneous HD streams), to second generation services which offer several hundred megabits of fully digital bandwidth (enough for 10 simultaneous streams or more). Because IPTV only streams one channel per receiver from the head end at any given time, you can in theory offer better service on every channel, without overcompression or compromising on resolution.
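To put some rough numbers on that, here's a quick sketch; the per-stream bitrates are my own ballpark assumptions (broadcast-ish MPEG-2 HD runs somewhere in the low-to-mid teens of megabits, decent H.264 HD maybe 8 to 10), not anything the providers publish, but they show why 40 megabits is marginal and a few hundred isn't:

# Streams-per-link arithmetic for IPTV. Bitrates are assumptions, not provider
# specs, and real systems also reserve headroom for voice and data service.
links = {"1st-gen VDSL": 40, "2nd-gen FTTN/FTTP": 300}             # Mbps, approximate
hd_streams = {"MPEG-2 HD (~15 Mbps)": 15, "H.264 HD (~9 Mbps)": 9}

for link, capacity in links.items():
    for codec, rate in hd_streams.items():
        print(f"{link}: about {capacity // rate} simultaneous streams of {codec}")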
Unfortunately, fiber to the neighborhood, and fiber to the home are still rare; and in the first generation form are significantly worse than cable in quality of service. I can't wait for them to be common and viable alternatives, but that isn't likely before 2010 or perhaps later.
Of course, that's 2010. Given that, as of today, satellite offers ME personally better programming choices, why did I still choose to go with cable?
Well I didn't exactly, I chose to go with a digital TV service from Qwest (my telco), but it was so poor (they sold me second generation service, but delivered first generation) I canceled it the same day it was installed, and went BACK to cable.... but I still went back to cable instead of going to satellite.
Honestly, Cox HD service is so-so, unless you skip their DVR and buy an HD TiVO. I wholeheartedly recommend doing so. The Cox DVR just plain sucks.
With DirecTV you get more HD channels, and for now a TiVO based DVR (though they are moving away from TiVO sometime next year), but you have to deal with DirecTV's shortcomings.
So why?
Three things:
1. Loss or degradation of signal
With satellite, you get loss of signal in certain atmospheric conditions. Lots of people say this never happens to them; but lots of others say it happens all the time.
If you can get a good install, and there’s not a lot of electrical noise in your neighborhood, and no obstructions; you’ll only have signal problems during heavy rain and wind.
...Of course we get heavy rain and wind every day for six weeks, twice a year (our monsoons); and high winds can be an issue year round. We also get dust storms here, and they can block signal as well.
Basically, your signal depends on your neighborhood, and the quality of your install. If you can get good, strong, clear, unobstructed signal then signal loss should only be an issue during the peaks of the monsoon (or heavy snow fall, or thunderstorms). If your signal is marginal, then you could get dropouts all the time, with just high winds and blowing dust (which we get all the time down here).
I know a lot of people around this area with DirecTV, and just as I note above; they get mixed results, depending on the weather, their neighborhood, and the quality of their install.
2. Latency
It can take up to 20 seconds for your satellite receiver to lock in and tune a station when you flip channels (it's also filling a read ahead buffer, to smooth out signal jitter). I find this INCREDIBLY irritating. Forget about flipping through channels to see what's on; you pretty much have to use the program guide no matter what.
3. HD-DVR
DirecTV charges a HELL of a lot for their HD DVR, it doesn't record very much video, and it just isn't as good as a real TiVO.
For now, DirecTV uses software licensed from TiVO for their DVR, but beginning in 2008 they are moving to their own software. Worse, the current software doesn't have NEAR the features that TiVO does.
Oh and even though you're paying $300 for your DVR, it isn't yours; it's a lease. If you cancel your DirecTV service, they own the box not you; and you (obviously) can't take it to another provider.
For me, all those irritations more than counterbalance the better HD programming that DirecTV offers over Cox. You on the other hand may weigh your priorities differently.
Tuesday, December 18, 2007
Build or buy for 2007
So, I'm giving computers as gifts to a half dozen people this year; and every time I've mentioned it, I've had someone ask for computer advice.
So, continuing on my gadgetary thrust for the end of the year, lets talk about computers.
Now, the first question is always "build or buy"; by which I mean do you buy a pre-built system from a major vendor, or do you build your own.
Generally speaking, this is a two part question:
1. Does it make sense technically
2. Does it make sense in terms of value
The first part of the first question is also split into a couple of parts; or perhaps I should say there is an immediate definitive discriminator, and that is this: Are you buying a laptop or a Mac (or both)?
If you want a Mac, you only have one choice, and that's from Apple; either direct (which I recommend) or from a reseller. Either way you get great support from Apple (which is kinda the point); at least if you have a problem that Apple acknowledges (the last couple years they've had a nasty tendency of releasing laptops with major issues and pretending there was no problem until too many people complained).
If you are buying a Mac, the only advice I can give you is this: If you want performance and don't care about your warranty, buy it stripped and add your own parts. If you care about the warranty, then you need to pay Apple's OUTFRIKKENRAGEOUS prices for RAM and hard drives. Oh and if you DO care, buy the AppleCare warranty, because the basic warranty isn't great.
If you are buying a laptop, buy it from a major system vendor. There are a few specialty vendors that will sell you laptop parts, or custom build you a laptop, but they aren't worth the hassle and the incompatibilities unless you are an expert, or you need something very specialized.
The best advice I can give you here is to physically try out the keyboards, pointing devices, and screens of any laptop (or at least a comparable model) before buying it. These are highly subjective matters of personal comfort and preference; and you don't want to end up with a laptop keyboard you can't change, and can't stand to type on.
The second part of the question comes down to this: do you need basic tech support, or tech support for the OS? If you do, or if the eventual end user does (buying a computer for your mom for example), then go ahead and buy from a major vendor.
Don't bother buying a pre-built system from a small independent vendor if you need the tech support; what you're looking for is 24/7 response, depot parts etc...
Also, if you need the tech support, and you plan on keeping the system for more than a year, this is one of the few times when you should consider a factory extended warranty. Factory warranties range from 90 days to one year, which isn't an awfully long time, and may not cover two-day service etc... Extended warranties from Dell, HP etc... are generally reasonably priced, and can provide up to a full three years of warranty protection, usually with one week or less turnaround time; two-day service may also be available.
That said, only buy the factory extended warranty, directly from the vendor. Don't even think about buying a reseller warranty from Best Buy or somesuch.
Which brings up another good point: If you are going to buy a major vendor system, buy it direct from the vendor when it's on clearance, or through an authorized reseller like CDW, Insight, Tiger Direct, or NewEgg.
Unless your local Best Buy is offering an identical system (check the part numbers and spec sheets) at a HUGE discount, you generally don't want to buy a computer from them. For one thing, the prices from the direct vendors and internet clearance houses are usually better; but also the tech support for systems you buy from Best Buy often goes through Best Buy first rather than directly to the system vendor. If there are two systems which appear identical, and several resellers have one part number while Best Buy has a different part number, you can bet it's a BB-only part number (even if it's the same model number by the way), and that the tech support is going to go through BB.
Kinda defeats the purpose of buying from a major system vendor in the first place eh?
Now I'm not saying you can't get a great deal from a brick and mortar store (in fact, I just bought a laptop at BB a month ago); just that you have to be absolutely 100% sure of what you are getting. When you go with Best Buy vs. the net, yes, you get to pick it up that day, and return it there if it is broken; but shipping is usually lower than taxes; you generally can't buy the factory extended warranty (at least not without some hassle), and you may be getting a BB only part number with substandard support.
Ok so, technical question out of the way; what about value?
Obviously, if you need the support that a major vendor offers, the value is there. But if you don't really need that support; if you are capable of supporting yourself, and building your own system; there is still a question about whether doing so provides value to you.
This one can be a BIT harder to pin down, but what it comes down to is, how are you going to use the pc? What are you going to use it for, and how hard are you going to use it?
If you are just using a system for email, web surfing, general office tasks etc... then your needs can be met with the lowest of low end systems; and quite frankly, you can't build a low end system on your own for what Dell and the like are selling them for these days.
Today, on the Dell web site, you can buy a brand new (not refurbished) pc, with a gig of RAM, 250 gigs of hard drive space, a 17" lcd, keyboard, mouse, Windows Vista home basic, MS works, and a DVD burner with player and burner software; for $500, or for $300 without the monitor, and with a smaller drive.
That's an every day price, not a clearance or sale. If you get it on one of their nearly weekly clearance deals, you might snag the system with Monitor for $400.
Going to HP's web site yields similar results. They have five computers selling at under $400; and the lowest price system they are selling today is $299, plus $100 for a monitor.
If you catch it at the right time, you can often get the same, or even slightly higher end systems as factory refurbished for as little as $250 (I bought one a couple months ago in fact).
Even without including the software costs (about $200) you simply cannot buy the parts necessary to build these systems for $500; and certainly not for $400.
Keeping everything comparable to the bottom end major systems, you come in at a fair bit higher price. If you account for $60 for a motherboard, $60 for a processor, $25 for RAM, $80 for a case with power supply, $30 for a DVD burner, $90 for a hard drive, and $20 for keyboard and mouse; you've got $365 before we even get into a monitor (about $100 minimum) and the operating system ($90 for an OEM version). Oh and of course none of that includes shipping (which Dell will often throw in for free).
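To make the comparison concrete, here's a quick sanity check of those numbers in Python. The figures are the same rough street-price estimates above, not quotes from any particular vendor:

# Rough DIY parts total for a bottom-end box, using the estimates above.
# All figures are ballpark street prices, not quotes from any specific vendor.
parts = {
    "motherboard": 60,
    "processor": 60,
    "ram": 25,
    "case_with_psu": 80,
    "dvd_burner": 30,
    "hard_drive": 90,
    "keyboard_mouse": 20,
}
monitor = 100       # cheapest decent 17" LCD
oem_windows = 90    # OEM operating system license

parts_only = sum(parts.values())                  # 365
complete = parts_only + monitor + oem_windows     # 555

print(f"DIY parts only: ${parts_only}")
print(f"DIY with monitor and OS: ${complete}")    # vs. $500 pre-built WITH a monitor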
Basically, the major system vendors have taken the low end of the market, and dropped the prices to cost or below, as a loss leader. They get you in with the cheap systems and the refurbished systems; and they make their money on the high end boxes, accessories, software, and extended warranties. You might have noticed you can't just buy a computer at one of these sites anymore without clicking through 5 pages of add-on stuff they want to sell you; well, that's why.
Of course that brings up the other half of the home computing equation... the mid-range and high end machines for power users and gamers.
This is where the fun starts.
You really can't beat the majors when it comes to the low end; but if you're building a gaming rig, even only a midrange system; you can either get substandard and proprietary components, or you can get absolutely reamed on price by the likes of Dell and HP.
For example, if I wanted a mid-configured core 2 quad Q6600 HP Blackbird, air cooled (they make water cooled versions for a couple hundred bucks more) with a 640mb 8800gts, 4gb of corsair dominator ram (about twice as expensive as normal ram), an X-fi extreme gamer, two pairs of 500gb hard drives, two dvd drives, and a 1000 watt power supply; I'd pay close to $5000 ($4928 to be exact).
That is a midrange system; but from HP, you're paying a high end system's price (Dell sells something similar in the XPS line, for a similar price).
If I built it myself, it would only run about $2500, with all the EXACT same major components, a case of equivalent quality (obviously, HP uses their own case that you can’t buy), and vista home premium (no point in buying ultimate).
If I wanted to, I could add an HD-DVD or BluRay drive for $200 on top of that, or $400 for a BluRay or HD-DVD writer; something very few system vendors offer (HP does offer a BluRay writer for $400).
Now remember, that’s a midrange gaming system. If I want to go truly hard core with dual quad xeons, 16 gigs of ram (64 bit os obviously), an SLI setup, 4x 1tb RAID array etc… I'd be looking at $15-$18k from a major vendor; but under $10k to do it myself.
Even aiming a bit lower, into the intro gaming segment, your options aren't just more expensive, they tend to be few and far between. In order to get standard, reasonable quality components, you generally have to move up to the specialized gaming lines; and instead of starting at $300 you start at $1500.
Going with Dell this time, and the XPS line, we intro at that $1500 base price. For that, you get that same Core 2 Quad 6600 (it's the best deal in computing right now), 2 gigs of mid grade RAM ($100 more for the corsair dominator I mentioned earlier), a single 250gb hard drive ($110 more to go to two drives), a single low end DVD burner ($40 for two of them, $250 more for a single BluRay drive, and $350 more for a BluRay writer), and an 8800GT with 512mb of RAM (also the best bargain in gaming video cards right now).
Honestly, that's a pretty decent system. In fact, other than upgrading the RAM and motherboard to slightly higher end models, and adding a second (and larger) hard drive, I'm thinking about building a very similar system myself. If you don't know any better, $1500 sounds like a reasonable price. In fact, if you want a gaming system, and you need tech support and a warranty, it IS a reasonable price.
...The only problem with that is, I can build the EXACT same system at NewEgg for just under $1000 (not including software)... or the slightly upgraded version I'm thinking about for another $120 on top of that.
The higher you aim in systems, the more expensive it gets; or looking at it from the other side, the more you can save. If I wanted to upgrade my processor, go to 4 gigs of premium gaming ram, a 2x750gb RAID array, and an X-fi sound card, I'd be paying Dell over $2500, maybe $3000. For the same system at NewEgg, I'd be paying around $1700.
I've built three sample configurations at NewEgg; corresponding to what would be $1800, $5500 and $18,500 boxes respectively if purchased from major vendors.
Value box - $1200
God-ish box - $2800
Super-god box - $7700
The super god box listed there is roughly the same raw spec as a fully loaded MacPro tower by the by; excepting better, with more options, better hardware, more storage, more RAM... well, just more and better of everything; and less than half the price.
...Of course it doesn't give you Steve Jobs' reflected cool factor; and won't help you score hipster chicks.
If only Steve would sell us his wonderful operating system (and I mean that without any sarcasm. OS X is great).
Talking in more general terms, roughly speaking this is what you need to build a decent midrange or gaming system:
Mid range dual core processor: $150 ($190 gets you into a 4mb cache 2.66ghz dual core, $260 gets you a quad core)
mid range motherboard: $70 ($150 gets you a bunch of neat extras)
2 gigs half decent RAM: $60 (spend $200 to go to 4 gigs of medium grade ram)
Mid range graphics card with 320-512mb of ram: $150 ($300 gives you about 50% more power)
500 gig hard drive: $125 (or $250 for two)
Good quality DVD writer, and second dvd drive: $75
Mid range sound card: $100 (or buy a $150 motherboard and get pretty good onboard sound)
Good quality case: $100
Good quality PSU: $100
Total inc tax and shipping: about $1000 for the bottom, up to about $1500.
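If you want to re-run those totals yourself as prices change, a minimal sketch like this works; the 8% tax rate and $50 shipping are placeholder assumptions, not quotes:

# Total up a parts list and add rough tax and shipping.
# Tax rate and shipping cost here are illustrative placeholders.
def build_total(parts, tax_rate=0.08, shipping=50):
    subtotal = sum(parts.values())
    return round(subtotal * (1 + tax_rate) + shipping)

midrange = {
    "cpu": 150, "motherboard": 70, "ram": 60, "graphics": 150,
    "hard_drive": 125, "optical": 75, "sound": 100, "case": 100, "psu": 100,
}
print(build_total(midrange))   # about $1050 -- "about $1000 for the bottom"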
My current plan for a new gaming box I want to build early next year looks a little more like this:
Mid range quad core processor: $260
top quality non-sli mobo: $175
4 gigs of very good ram: $350
Medium high end non-sli graphics: $350
4x500gb hard drives in a raid array: $500
HD-DVD burner: $300
BluRay burner: $400
Mid range sound card: $100
VERY good quality aluminum case: $200
VERY good quality modular power supply: $250
Total inc tax and shipping: about $3k
Of course I could easily drop $600 of that $700 for the HD and BluRay drives, $100 on the video card, $100 on the ram, $250 on the hard drives, and $150 on the case and PSU; and still have almost all the speed, computing power, and graphical performance for around $1800 instead.
Ok so why a quad, and why bother with BluRay and HD?
First, the quad question. Why bother with a quad core, it just seems like overkill.
Well, for games it usually is; but for hardcore multi tasking, video editing, photo editing and the like, that quad really helps.
It really depends on the workload, but take a look at the performance numbers between a non-extreme Core 2 Duo and a Core 2 Quad. If the app being tested uses multicore well at all, the quads beat out everything other than the Core 2 Extremes (which are more than twice as expensive); and for 64 bit multithreaded apps… there's no comparison. At only $100 more than an equivalently clocked Core 2 Duo, they are definitely worth it, if you can use them.
Even better though, the C2 Q6600, that I have called the best bargain in computing above; is actually a 1333mhz part that’s been downclocked to 1066mhz. On most motherboards, you can run it at 1333mhz, with the original OEM heat sink and no additional cooling; and now you’ve got a 3.0GHz quad core with 8mb of cache.
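To put numbers on that: the core clock is just the multiplier times the bus clock, and the quoted 1066mhz and 1333mhz FSB ratings are the 266mhz and 333mhz bus clocks quad-pumped. A quick sketch, using the Q6600's stock 9x multiplier:

# Core clock = multiplier x bus clock; the "1066" and "1333" FSB ratings
# are the 266MHz and 333MHz bus clocks quad-pumped (x4).
multiplier = 9                 # Q6600 stock multiplier
stock_bus = 1066 / 4           # ~266 MHz
overclocked_bus = 1333 / 4     # ~333 MHz

print(f"Stock: {multiplier * stock_bus / 1000:.1f} GHz")               # 2.4 GHz
print(f"At 1333 FSB: {multiplier * overclocked_bus / 1000:.1f} GHz")   # 3.0 GHz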
Now why would I add next gen discs to my preferred box; and if they are that great, why not add them to the value system?
Well first, movies. I happen to like watching movies on my computer. I also really like using my computer as a media server. Then of course there's the high volume data storage (50gb per disc in this generation).
Remember now, the higher end selections were intended to be just that, higher end. It’s not something I would recommend to everyone. With my highest end box though, I want to be able to read, and write, any common format. I also need something for offsite backup of critical files (my onsite backup is to my local NAS server), and HD and BluRay are a HELL of a lot cheaper than LTO tape.
Of course that still leaves a couple of important decisions:
1. What about a monitor
2. How much graphic power to pay for (and specifically, to SLI/X-fire or not)
3. How much memory (which is related to the SLI question)
I want to address the monitor question first here; because it's of use to everyone, not just the hard core gamer types.
So, what about monitor?
Firstly, I never count the monitor as part of the computer purchase. Oh yes, you need to budget for it; but it’s like counting your TV as part of your stereo. You don't want to economize on your monitor to get more computer; neither do you want to economize on your computer to get more monitor.
The monitor is the most important element of your human computer interface. Every interaction you have with your computer runs through that monitor; and how happy you are with your computer is to a large extent determined by how happy you are with your monitor.
Also, monitors are one area of computing where you get a decent return on your investment in a higher end piece of gear; because you are likely to keep your monitor for far longer than you keep the rest of the computer. Up until I moved to all LCD a few years back, I had monitors from 5 years, and even 10 years ago; and because I spent the money on top quality, they still looked and worked great.
The prices and deals on LCD displays change from day to day, so it's very hard to make specific recommendations.
Making it even harder, often brand is really little or no indicator as to quality, because there are only 6 fabs that make 19” and bigger LCD panels for computer displays anyway; it’s all the little extras like ports and warranties that make the difference.
I know lots of folks who think they’d never buy an LCD from Goldstar... except they don’t know that Goldstar and Samsung are two of the three leading manufacturers of LCD panels (NEC are the third); and that almost every manufacturer other than Sony (who make their own), including Apple, HP, Dell, Planar and ViewSonic all use Samsung, Goldstar, or NEC panels.
All that said, the extras really do make a difference. When you buy quality, what you're paying for TECHNICALLY is response time, different connectivity, better quality of signal processing, and better backlights (brighter, and with more accurate color spectrum).
More important though, you are also paying for a better warranty and fewer dead or stuck pixels. The better manufacturers all warrant against more than one or two stuck pixels, and most will replace the display if there is even one in a very noticeable spot. The lesser manufacturers offer little to no warranty.
Really, that IS worth the extra $50 or even $100 you pay for going to a higher end manufacturer; especially when buying a 24" or 30" panel.
Ok, now let's talk about the specific features to look for:
1. FOFO (full on/full off) response time of 4ms or less, and GTG (gray to gray) response time of 6ms or less (6ms and 8ms respectively are acceptable, 2 and 4 are ideal).
Unfortunately, most manufacturers are unclear as to what response time they are quoting. If you see 4ms (min), that's the Full On/Full Off response time. If you see 6ms (max), that's the GTG time. If they don't specify FOFO, GTG, min, or max, then it's either the average of the two (where they'll usually note avg. or typ.), or they're quoting a rise-only time, which is actually twice as fast as the real number.
Important to note: if you see a time of 4ms or under quoted without a specific notation as to what they are quoting, and it's on a less than top of the line display, they are almost certainly quoting rise time; so double it to get the real number (there's a rough sketch of these rules of thumb just after this list).
2. Refresh rate of 120hz is preferred. Refresh rate may not be quoted; if the display has a GTG response time of 8ms or less, it’s probably 120hz. It's not absolutely necessary, but it helps.
3. Static (typical) contrast ratio above 600:1, 1000:1 or above preferred. Again, you may not be sure what number you're being quoted here.
4. dynamic contrast ratio of at least 3000:1, and 10000:1 preferred. Unfortunately, dynamic CR is a marketing number, but with many manufacturers, it’s all they give you.
Here's a tip: if you see a number above 1,000:1 on anything other than a top end display, it's dynamic; and if you see a number higher than 3000:1 on a display less than $2500, it's a dynamic ratio.
5. A brightness of at least 300 cd/m^2 (300 nits) at D6500K.
6. A color corrected backlight at D6500K. LED backlight if possible. This gives you better color accuracy, and longer display life.
7. DVI with HDCP, and possibly HDMI. Multiple inputs preferred. DisplayPort if possible. Basically, the more inputs the better.
8. Pixel pitch of under .28mm; .25mm or under preferred. If pixel pitch isn’t quoted you can figure it by dividing the long dimension of the screen in mm by the long dimension of the native resolution in pixels.
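And here's that rough decoder for the response time rules of thumb from item 1. This is just my heuristic written out, assuming all you have is the quoted number and whatever qualifier (min, max, avg, etc.) the spec sheet gives:

# Decode a quoted LCD response time, per the rules of thumb in item 1 above.
# "label" is whatever qualifier the spec sheet uses, or None if there isn't one.
def interpret_response_time(ms, label=None, top_of_line=False):
    if label in ("min", "fofo"):
        return f"{ms}ms full on/full off"
    if label in ("max", "gtg"):
        return f"{ms}ms gray to gray"
    if label in ("avg", "typ"):
        return f"{ms}ms average of FOFO and GTG"
    # No qualifier, suspiciously low number, not a premium panel:
    # almost certainly a rise-only time, so double it.
    if ms <= 4 and not top_of_line:
        return f"probably rise time only; the real number is closer to {ms * 2}ms"
    return f"{ms}ms, type unspecified"

print(interpret_response_time(4))   # probably rise time only; closer to 8ms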
Here's a trick for you. If the length and width of the display area are not quoted, you can figure them with the aspect ratio, the diagonal measurement, and the Pythagorean theorem.
A 19” diagonal screen at 4:3 is as follows:
19^2=361
(4^2)+(3^2)=25
361/25=14.44
sqrt(14.44)=3.8
4x3.8=15.2” width
3x3.8=11.4” height
A 24” 16:9 screen is as follows:
24^2=576
(16^2)+(9^2)= 337
576/337=1.709
sqrt(1.709)=1.307
16x1.307=20.91” width
9x1.307= 11.77 height
Knowing this height and width spec is important in deciding between 4:3 and 16:9, because the diagonal measurements are not comparable. A 24” 16:9 actually gives you roughly the same screen HEIGHT as a 19” 4:3; it just adds an extra 5.7” or so of width.
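If you'd rather not do the arithmetic by hand, the same trick is easy to script. This sketch just wraps the Pythagorean math above, plus the pixel pitch division from item 8 (the 1920-pixel-wide native resolution in the example is an assumption for illustration, not a spec):

import math

# Viewable width and height from diagonal and aspect ratio (the trick above).
def screen_dimensions(diagonal_in, aspect_w, aspect_h):
    unit = math.sqrt(diagonal_in**2 / (aspect_w**2 + aspect_h**2))
    return aspect_w * unit, aspect_h * unit   # width, height in inches

# Pixel pitch from item 8: long dimension in mm over horizontal pixels.
def pixel_pitch_mm(width_in, horizontal_pixels):
    return width_in * 25.4 / horizontal_pixels

w, h = screen_dimensions(24, 16, 9)
print(f'24" 16:9 -> {w:.1f}" x {h:.1f}"')                             # ~20.9" x ~11.8"
print(f"pitch at 1920 pixels wide: {pixel_pitch_mm(w, 1920):.3f}mm")  # ~0.277mm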
The smallest I'd personally go with right now would be a 19” 4:3 or a 24” 16:9. The smaller models are almost all considered “economy” models. Either is the minimum size to display a standard 8.5"x11" piece of paper at 100% scale in portrait orientation.
Personally, I happen to think the sweet spot today is in the 24” 16:9 displays. They give you a massive screen area, with great resolution and pixel pitch; but they’re down under $500 for even the best quality models, and around $350 for discount models. If you don’t feel like spending that much, then get a 19” 4:3. Don't bother going for a 20" or 22" 16:9 or a 17" 4:3 unless you are really looking to save money.
The next step up that’s worthwhile is the 30” now offered by Apple, HP, Dell, and Samsung (all using the same Samsung panel actually), and they’re incredibly gorgeous, huge, and spectacular… but they sell from $1300 to $1900. There are 26" and 28" models, but I really don't think they offer much advantage over 24" displays, or much price savings over 30".
As to brands, I like NEC, ViewSonic, Samsung, HP and Dell (usually Samsung), Apple (also usually Samsung), LG, and Planar.
Now, talking about specs. Some manufacturers are more honest than others. Most of the mass market manufacturers (even the good ones) vastly overstate their actual performance. Companies that cater to graphics professionals (Apple, Viewsonic, Planar etc...) on the other hand tend to report their specifications relatively accurately.
All of my recommendations above are based on the semi-reliable marketing numbers of the better mass market brands.
If the manufacturer is very specific, quoting GTG, typical contrast ratio instead of dynamic, exact pixel pitch etc… they are most likely at least reasonably accurate; so if they quote a spec that's good, but less than another brand, it's likely that the more accurate number is going to be just as good, or even better than the “better”, but less accurate number.
Alright now, graphics and memory...
If in fact you ARE a hardcore gamer or power user, your memory, and your video card are your most important configuration decisions; and they are tightly interrelated.
If you are a power user, but NOT a gamer, you should simply pack as much memory in as you can afford; and your operating system supports. If you run 32 bit windows (about 96% of all computer users), that's 4GB; which conveniently is the amount that most consumer motherboards support.
More memory is just about always better; because you can have more files open and in relatively fast memory without going back to relatively slow disk. For applications like video editing software, where you are performing multiple operations simultaneously on very large files, memory is as important as electricity.
If you are a hardcore gamer though, the question becomes more difficult; and modern graphics cards, especially with SLI; are the reason.
SLI is a way of using multiple video cards to improve performance; and the next few hundred words or so are why I think that even a hardcore gamer shouldn't bother with it.
There’s two reasons why SLI either doesn’t make much difference in performance; or can actually hurt it; in most real world configurations. To talk about this intelligently though, we’re going to have to go into a little compu-history lesson (and believe me, I’m being as brief as possible here. The FULL story would be a hell of a lot longer).
The original implementation of SLI, from 3DFx in 1998, was actually quite useful.
The 3d cards of the time just barely had the guts to render full scenes at 640x480 (yes, that low) and 30 frames per second. Most in fact could only run at 512x384... which is worse than my PDA today.
When 3DFx introduced the Voodoo 2 in ‘98, it was a revolution. The Voodoo2 could render 800x600x16bit at 30 frames per second (native TV refresh rate).
The biggest news with Voodoo2 however, was SLI; which stood for "scan line interleaving".
Unlike LCDs, which have discrete pixels that are turned on and off individually; analog CRT based displays (including tube televisions) have an electron gun which draws on the display's surface in scan lines. These scan lines are split into two sets, half of which are displayed on one pass; then the other half are interlaced in between on the next pass.
By using two video cards, and having each card render every other scan line within a frame, you could go up to 640x480, and 60 frames per second, or 1024x768 and 30 frames per second.
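Just to make that split concrete, here's a toy sketch of the idea; it's purely illustrative and bears no resemblance to how an actual driver divides the work:

# Toy illustration of scan line interleaving: with two cards,
# one renders the even-numbered scan lines and the other the odd ones.
def assign_scan_lines(total_lines=480, cards=2):
    assignments = {card: [] for card in range(cards)}
    for line in range(total_lines):
        assignments[line % cards].append(line)
    return assignments

split = assign_scan_lines()
print(len(split[0]), len(split[1]))   # 240 lines each for a 640x480 frame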
Of course these days, even with the gnarliest dx9 3d games, the graphics card is no longer the limiting factor. Since 2004, with the mainstreaming of the PCIe bus for graphics; even relatively cheap graphics cards can render most games at 1280x1024x32bit, and 120fps, with 2x anti-aliasing, and trilinear filtering (graphics geek speak for "looks pretty damn good").
Given that's the best frame rate and resolution that most people's monitors can display, anything over and above that doesn't do much to improve the play experience unless you've got a very large, very high end display (or multiple displays; but most games don't work well, or at all, with multi-monitor setups).
In 2004, Nvidia (who acquired 3dfx from bankruptcy) announced a new version of SLI for their top end cards; and using their own motherboard chipset. ATI also introduced their own version, which they call "CrossFire" and which works very similarly (I'm just going to refer to SLI from this point forward for convenience, but most of what I say applies to both).
The NEW version of SLI is the “scalable link interface”. Since we've switched to primarily progressive mode displays for our computers (LCDs), it uses one of two different methods for sharing the rendering load. It can split a frame and have each card render half; or it can have each card render every other frame. It's not a terribly efficient way of doing things, because each card is often processing the same data as the other over and over (static backgrounds and the like), but it will give you an increase in frame rates at maximum resolution.
Now, no monitor today can display at more than 120 frames per second, because that’s the maximum refresh rate of the monitors; so going above 120fps is pointless.
On test scenes where the CPU doesn’t have to deal with physics issues, and the scene is programmed to use all the advanced features of the cards, a top end SLI configuration can render scenes at 2560x1600x32bit, on a 30” display, with 8x anti-aliasing, smart filtering, smart texturing, smart shading, and full dynamic lighting; at over 120 frames per second.
In fact some cards in SLI mode can render such a scene at 200+ frames per second… but the best monitor can only display 120fps; and this isn’t a case where having too much data is actually still helpful because you can accurately downsample (as it is when you can compute at a higher resolution than you can display). There is nothing to downsample to, the card simply drops the extra frames on the floor when it pushes out to the display.
Oh and of course I should note, most displays still can’t do 120hz; most LCDs are just 60hz, and most CRTs can only do 75-90hz. So again, unless you have a top end display…
The third way that SLI can increase performance can actually be useful for something other than raw framerate. In SLI Anti-aliasing mode, the cards render alternate frames, and export them back to each other; then one card is used to handle half of the anti-aliasing on a single frame, and the other card to handle the other half.
This works, because Anti-aliasing is the most intensive work the GPU does; involving oversampling each polygon as many as 64 times in as many as 8 locations per vertex. By enabling this mode, you can have one card anti-alias half the locations per vertex, and the other, the other half. This lets you maintain frame rates that you got from 8x anti-aliasing on a single card, all the way up to 32x anti-aliasing (Nvidia just announced a 3 way SLI implementation that enables 64x anti-aliasing); and produces slightly better image quality even at the same level as with single card AA.
Of course the human eye can’t see the difference between 8x and 32x at most resolutions on most sized screens; and in reality none of those scenes could be rendered that highly anyway, because in a real game, the CPU would be too busy with AI and physics information to keep the pipeline full at those resolutions.
At lower resolutions, as I said before, even low end cards can render acceptably, and midrange cards render just as well as the high end ones. There’s really no benefit to SLI at lower resolutions at all.
As you can see, SLI is really just bragging points right now; because the GPU is no longer a limiting factor in any modern game, at any reasonable resolution, level of detail, or level of filtering.
For the last few years, programmers have been focused on improving the physics engines, AIs, and element simulation in their games. This has all put massive strain on the CPU, while graphics have been relatively static (and has also resulted in the development of a physics accelerator card).
Now that DX10 games are coming out, we’re getting a little more demand on the video cards. Games like Crysis for example really do need a high end card to keep up; but still, the limiting factor on games is mostly the CPU; and likely will be until games are programmed to offload physics tasks onto multiple cores or dedicated sub processors (which is starting to happen, slowly).
There are two other things that together make SLI kind of pointless; the 4GB memory barrier, and MMIO.
On 32 bit operating systems, the maximum physical memory addressing space is 4GB. That means that no matter how much memory is in a box, a 32 bit OS will only see 4GB of it, unless you do some weird tricks with memory allocation (which many dedicated servers can do for example).
Now, for most applications and with most hardware, that isn’t an issue; because the maximum memory space that can be assigned to a single application is 3GB anyway (with 1GB allocated to the Operating System Kernel). Unfortunately for us, memory requirements have really been going up on games the last few years, and so now, some games can easily use all 3GB available to them (and all your swap space too).
Ok, again, still not a problem because you’ve got 4gb right?
Well… no.
When the 32 bit memory space was first opened up to personal computers, with the introduction of the Intel 80386 processor in 1986 (Macs got a fully 32 bit processor with the Mac II in 1987, but only got a full 32 bit address space with System 7 in 1991); the idea of 4GB in a home computer was thought preposterous. The few people who thought we might eventually see such memory usage all assumed we would have transitioned to 64 bit systems by then.
In fact, every other platform HAS transitioned to 64 bit; most of them within 10 years of the 386’s introduction (though most still operate in both modes depending on circumstances).
Every platform except Windows.
Even more irritating, is that for about 3 years now, every new processor and chipset has also been 64bit; but the operating system isn’t using that capability.
At any rate, no-one thought we would ever use 4GB of RAM, for as long as 4GB of RAM was a limitation; so they decided to use that “extra” addressing space.
What they came up with, was called “Memory Mapped Input and Output” or MMIO.
Actually MMIO had been used before; it was used in the 640k days as well, and there were several different workarounds and memory managers that moved the MMIO space into the HMA (high memory areas). This gave you most of your memory below 640k back, and put the device maps into the area above 640k, so that the low memory area was usable by programs.
Well, an identical situation exists today. Every physical device in a computer needs to have a dedicated address space so that software can talk to it. The way MMIO works, is that rather than create a separate addressing space for every device (which would have been a nightmare); programmers simply wrote the BIOS and OS to write to an address in main memory that was mapped to the IO of that device; thus Memory Mapped IO.
When the 32 bit address space became available, it was natural to use MMIO again. So for all these new devices, they allocated addresses starting with the very last byte of the 4GB space, and counted down from there.
Initially this wasn't a big deal. The MMIO space used by devices on a system was typically less than 64k. Then AGP and USB came along, and devices started requesting a lot more address space; sometimes as much as 256MB, because the video card needed to map every bit of its onboard memory into the main memory addressing space.
Again, not a problem; because no-one could afford more than 2GB of RAM, and no non-server chipsets (server chipsets handle the problem by extending their mapping space through multiple 32 bit address tables, and remapping MMIO to a virtual space counting down from 32GB) supported 4GB anyway.
In 2004, four things happened pretty much all at once to make this 4GB issue a potential problem: Graphics cards started regularly coming with 512MB of memory or more (there are now cards with two GPUs on one card, and 1.5GB of memory), Nvidia introduced their new SLI system to work with these higher end cards, Intel and Nvidia both introduced new chipsets and reference motherboards that supported 4GB (or even 8GB) of RAM, and RAM prices crashed through the floor.
At this point, you could put 4GB of RAM in a system for a relatively reasonable amount of money; but when you put that 768MB graphics card in, all of a sudden your operating system could only see and address 3.2GB. Put two of them into a box, and you’re down to 2.5GB.
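The back-of-the-envelope arithmetic looks like this; the exact amount reserved varies by chipset, BIOS, and what other devices are installed, so the figures are illustrative:

# 32 bit address space is 4GB; MMIO reservations are carved out of the top,
# so usable RAM = min(installed, 4GB) minus whatever the devices claim.
# Reservation sizes here are illustrative; real numbers vary by chipset and BIOS.
ADDRESS_SPACE_MB = 4096

def usable_ram_mb(installed_mb, video_ram_mb, other_mmio_mb=0):
    reserved = video_ram_mb + other_mmio_mb
    return min(installed_mb, ADDRESS_SPACE_MB) - reserved

print(usable_ram_mb(4096, 768))        # 3328MB -- the "3.2GB" case
print(usable_ram_mb(4096, 2 * 768))    # 2560MB -- the "2.5GB" SLI case
print(usable_ram_mb(4096, 3 * 1024))   # 1024MB -- 3 gigs of video RAM installed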
So you’ve paid for 4GB, and installed 4GB; but you’re only actually able to use 2.5GB?
That's a problem… but at least there weren't many games that needed more than 2GB of RAM…
...Until Supreme Commander came out, and it wanted every bit of memory you could give it. In fact, it wanted MORE than you could give it, and tried to steal memory from the OS, thus causing many many crashes.
In Supreme Commander, you are the strategic commander of an armored mech army, which can vary in size from a few units up to several thousand. Each individual unit is fully rendered and textured, and has its own AI. All terrain is also fully rendered. There can be several hundred fully rendered units on screen at any given time.
There is no system today that can handle Supreme Commander at… really, any resolution. A 3ghz quad core with 3.2gb of RAM (down from 4gb) and a 768mb 8800GTX is the best you can do, and even then it slows down to under 30fps quite frequently.
Given the amount of 3D work required, one would think an SLI configuration would be perfect for Supreme Commander. The problem is the memory hole created by the cards' memory.
If you run Supreme Commander on an SLI system, all you're doing is taking away memory that the CPU needs to handle all the units on the screen.
Today, you have the capability to put 3 gigs worth of video RAM into a system; but if you did that in a system with 4GB of RAM, you'd end up with just 1GB of usable system memory. Because of the 4gb memory barrier on 32 bit operating systems, you can't add more RAM to compensate. All you can do is decide whether you need more video RAM or more system RAM, and choose your video memory accordingly.
Honestly, my personal recommendation is to skip the SLI, and cards with more than 512mb of video memory; and keep that system memory available.
Only very rarely can a single card effectively use more than 512 megs of RAM. In theory, there are some situations where running two 512 meg cards in an SLI configuration (which cuts you down to under 3gb of usable RAM) will give you enough of a graphical performance increase to justify losing that gig. Most of the time however, you aren't going to see that kind of performance boost.
My personal recommendation is to buy one very good card with 512 megs of video ram (or if you must, up to 768); and pay for the full 4 gigs of system RAM.
Well, I hope this was helpful. I realize there's a lot of obscure detail here, but if you're trying to make an informed decision, this is the absolute minimum of what you need to know. We haven't even gone into things like chipset differences, sound cards, or processor architecture issues; and those would be another 10,000 words, easy. Don't even get me started on Vista vs. XP, or 32 bit vs. 64 bit.
Any questions?
So, continuing on my gadgetary thrust for the end of the year, lets talk about computers.
Now, the first question is always "build or buy"; by which I mean do you buy a pre-built system from a major vendor, or do you build your own.
Generally speaking, this is a two part question:
1. Does it make sense technically
2. Does it make sense in terms of value
The first part of the first question is also split into a couple of parts; or perhaps I should say there is an immediate definitive discriminator, and that is this: Are you buying a laptop or a Mac (or both)?
If you want a Mac, you only have one choice, and that's from Apple; ether direct (which I recommend) or from a reseller. Either way you get great support from Apple (which is kinda the point); at least if you have a problem that Apple acknowledges (the last couple years they've had a nasty tendency of releasing laptops with major issues and pretending there was no problem until too many people complained).
If you are buying a Mac, the only advice I can give you is this: If you want performance and don't care about your warranty, buy it stripped and add your own parts. If you care about the warranty then you need to pay Apples OUTFRIKKENRAGEOUS prices for RAM and hard drives. Oh and if you DO care, buy the Applecare warranty, because the basic warranty isn't great.
If you are buying a laptop, buy it from a major system vendor. There are a few specialty vendors that will sell you laptop pars, or custom build you a laptop, but they aren't worth the hassle and the incompatibilities, unless you are an expert, or you need something very specialized.
The best advice I can give you here, is to physically try out the keyboards, pointing devices, and screens of any laptop (or at least a comparable model) before buying it. They are greatly subjective matters of personal comfort and preference; and you don't want to end up with a laptop keyboard you can't change, and can't stand to type on.
The second part of the question comes down to this: do you need basic tech support, or tech support for the OS? If you do, or if the eventual end user does (buying a computer for your mom for example), then go ahead and buy from a major vendor.
Don't bother buying a pre-built system from a small independent vendor if you need the tech support; what you're looking for is 24/7 response, depot parts etc...
Also, if you need the tech support, and you plan on keeping the system for more than a year, this is one of the few times when you should consider a factory extended warranty. Factory warranties range from 90 days to one year; which isn't an awfully long time; and may not cover 2 days service etc... Extended warranties from Dell, HP etc.. are generally reasonably priced, and can provide up to a full three years of warranty protection, usually with one week or less turnaround time. They may have two day service available.
That said, only buy the factory extended warranty, directly from the vendor. Don't even think about buying a reseller warranty from Best Buy or somesuch.
Which brings up another good point: If you are going to buy a major vendor system, buy it direct from the vendor when it's on clearance; or through an authorized reseller like CDW, Insight, tiger direct, or newegg.
Unless your local Best Buy is offering an identical system (check the part numbers and spec sheets) on a HUGE discount, you generally don't want to buy a computer from them. For one thing, the prices on the direct vendor and internet clearance houses are usually better; but also the tech support for systems you buy from Best Buy often goes through Best Buy first rather than direct with the system vendor. If there are two systems which appear identical, and several resellers have on part number and Best Buy has a different part number, you can bet it's a BB only part number (even if it's the same model number by the way), and that the tech support is going to go through BB.
Kinda defeats the purpose of buying from a major system vendor in the first place eh?
Now I'm not saying you can't get a great deal from a brick and mortar store (in fact, I just bought a laptop at BB a month ago); just that you have to be absolutely 100% sure of what you are getting. When you go with Best Buy vs. the net, yes, you get to pick it up that day, and return it there if it is broken; but shipping is usually lower than taxes; you generally can't buy the factory extended warranty (at least not without some hassle), and you may be getting a BB only part number with substandard support.
Ok so, technical question out of the way; what about value?
Obviously, if you need the support that a major vendor offers, the value is there. But if you don't really need that support; if you are capable of supporting yourself, and building your own system; there is still a question about whether doing so provides value to you.
This one can be a BIT harder to pin down, but what it comes down to is, how are you going to use the pc? What are you going to use it for, and how hard are you going to use it?
If you are just using a system for email, web surfing, general office tasks etc... then your needs can be met with the lowest of low end systems; and quite frankly, you can't build a low end system on your own for what Dell and the like are selling them for these days.
Today, on the Dell web site, you can buy a brand new (not refurbished) pc, with a gig of RAM, 250 gigs of hard drive space, a 17" lcd, keyboard, mouse, Windows Vista home basic, MS works, and a DVD burner with player and burner software; for $500, or for $300 without the monitor, and with a smaller drive.
That's an every day price, not a clearance or sale. If you get it on one of their nearly weekly clearance deals, you might snag the system with Monitor for $400.
Going to HPs web site yields similar results. They have five computers selling at under $400; and the lowest price system they are selling today is $299, plus $100 for a monitor.
If you catch it at the right time, you can often get the same, or even slightly higher end systems as factory refurbished for as little as $250 (I bought one a couple months ago in fact).
Even without including the software costs (about $200) you simply cannot buy the parts necessary to build these systems for $500; and certainly not for $400.
Keeping everything comparable to the bottom end major systems, you come up at a fair bit higher price. If you account for $60 for a motherboard, $60 for a processor, $25 for RAM, $80 for a case with power supply, $30 for a DVD burner, $90 for a hard drive, $20 for keyboard and mouse; you've got $365 before we even get into a monitor (about $100 minimum) and the operating system ($90 for an OEM version). Oh and of course none of that includes shipping (which Dell will often throw in for free).
Basically, the major system vendors have taken the low end of the market, and dropped the prices to cost or below, as a loss leader. They get you in with the cheap systems, and the refurbished systems; and they make their money on the high end boxes, accessories, software, and extended warranties. You might have noticed you cant just buy a computer at one of these sites anymore without clicking through 5 pages of add-on stuff they want to sell you; well that's why.
Of course that brings up the other half of the home computing equation... the mid-range and high end machines for power using and gaming.
This is where the fun starts.
You really can't beat the majors when it comes to the low end; but if you're building a gaming rig, even only a midrange system; you can either get substandard and proprietary components, or you can get absolutely reamed on price by the likes of Dell and HP.
For example, if I wanted a mid-configured core 2 quad Q6600 HP Blackbird, air cooled (they make water cooled versions for a couple hundred bucks more) with a 640mb 8800gts, 4gb of corsair dominator ram (about twice as expensive as normal ram), an X-fi extreme gamer, two pairs of 500gb hard drives, two dvd drives, and a 1000 watt power supply; I’d play close to $5000 ($4928 to be exact).
That is a midrange system; but from HP, you're paying a high end systems price (Dell sells something similar in the XPS line; for a similar price)
If I built it myself, it would only run about $2500, with all the EXACT same major components, a case of equivalent quality (obviously, HP uses their own case that you can’t buy), and vista home premium (no point in buying ultimate).
If I wanted to, I could add an HD-DVD or BlueRay drive for $200 on top of that, or $400 a BluRay or HD writer; something very few system vendors offer (HP does offer a BluRay writer for $400)
Now remember, that’s a midrange gaming system. If I want to go truly hard core with dual quad xeons, 16 gigs of ram (64 bit os obviously), an SLI setup, 4x 1tb RAID array etc… I'd be looking at $15-$18k from a major vendor; but under $10k to do it myself.
Even aiming a bit lower, into the intro gaming segment, your options aren't just more expensive, they tend to be few and far between. In order to get standard, reasonable quality components, you generally have to move up to the specialized gaming lines; and instead of starting at $300 you start at $1500.
Going with Dell this time, and the XPS line, we intro at that $1500 base price. For that, you get that same Core 2 Quad 6600 (it's the best deal in computing right now), 2 gigs of mid grade RAM ($100 more for the corsair dominator I mentioned earlier), a single 250gb hard drive ($110 more to go to two drives), a single low end DVD burner ($40 for two of them, $250 more for a single BluRay drive, and $350 more for a BluRay writer), and an 8800GT with 512mb of RAM (also the best bargain in gaming video cards right now).
Honestly, that's a pretty decent system. In fact, other than upgrading the RAM and motherboard to slightly higher end models, and adding a second (and larger) hard drive, I'm thinking about building a very similar system myself. If you don't know any better, $1500 sounds like a reasonable price. In fact, if you want a gaming system, and you need tech support and a warranty, it IS a reasonable price.
...The only problem with that is, I can build the EXACT same system at NewEgg for just under $1000 (not including software)... or the slightly upgraded version I'm thinking about for another $120 on top of that.
The higher you aim in systems, the more expensive it gets; or looking at it from the other side, the more you can save. If I wanted to upgrade my processor, go to 4 gigs of premium gaming ram, a 2x750 raid array, and an X-fi sound card, I'd be paying Dell over $2500, maybe $3000. For the same system at NewEgg, I'd be paying around $1700.
I've built three sample configurations at NewEgg; corresponding to what would be $1800, $5500 and $18,500 boxes respectively if purchased from major vendors.
Value box - $1200
God-ish box - $2800
Super-god box - $7700
The super god box listed there is roughly the same raw spec as a fully loaded MacPro tower by the by; excepting better, with more options, better hardware, more storage, more RAM... well, just more and better of everything; and less than half the price.
...Of course it doesn't give you Steve Jobs reflected cool factor; and wont help you score hipster chicks.
If only Steve would sell us his wonderful operating system (and I mean that without any sarcasm. OS X is great).
Talking in more general terms, roughly speaking this is what you need to build a decent midrange or gaming system:
Mid range dual core processor: $150 ($190 gets you into a 4m cache 2.66ghz dual core , $260 gets you a quad core)
mid range motherboard: $70 ($150 gets you a bunch of neat extras)
2 gigs half decent RAM: $60 (spend $200 to go to 4 gigs of medium grade ram)
Mid range graphics card with 340-512 ram: $150 ($300 gives you about 50% more power)
500 gig hard drive: $125 (or $250 for two)
Good quality DVD writer, and second dvd drive: $75
Mid range sound card: $100 (or buy a $150 motherboard and get pretty good onboard sound)
Good quality case: $100
Good quality PSU: $100
Total inc tax and shipping: about $1000 for the bottom, up to about $1500.
My current plan for a new gaming box I want to build early next year looks a little more like this:
Mid range quad core processor: $260
top quality non-sli mobo: $175
4 gigs of very good ram: $350
Medium high end non-sli graphics: $350
4x500gb hard drives in a raid array: $500
HD-DVD burner: $300
BluRay burner: $400
Mid range sound card: $100
VERY good quality aluminum case: $200
VERY good quality modular power supply: $250
Total inc tax and shipping: about $3k
Of course I could easily drop $600 of that $700 for the HD and BluRay drives, $100 on the video card, $100 on the ram, $250 on the hard drives, and $150 on the case and PSU; and still have almost all the speed, computing power, and graphical performance for around $1800 instead.
Ok so why a quad, and why bother with BluRay and HD?
First, the quad question. Why bother with a quad core, it just seems like overkill.
Well, for games it usually is; but for hardcore multi tasking, video editing, photo editing and the like, that quad really helps.
It really depends on the workload, but take a look at performance number between a non-extreme core 2 duo, and a core 2 quad. If the app being test4ed uses multicore well at all, the quads beat out everything other than the Core 2 Extremes (which are more than twice as expensive); and for 64 bit multithreaded apps… there's no comparison. At only $100 more than an equivalently clocked Core 2 duo, they are definitely worth it, if you can use it.
Even better though, the C2 Q6600, that I have called the best bargain in computing above; is actually a 1333mhz part that’s been downclocked to 1066mhz. On most motherboards, you can run it at 1333mhz, with the original OEM heat sink and no additional cooling; and now you’ve got a 3.0GHz quad core with 8mb of cache.
Now why would I add next gen discs to my preferred box; and if they are that great, why not add them to the value system?
Well first, movies. I happen to like watching movies on my computer. I also really like using my computer as a media server. Then of course there's the high volume data storage (50gb per disc in this generation).
Remember now, the higher end selections were intended to be just that, higher end. It’s not something I would recommend to everyone. With my highest end box though, I want to be able to read, and write, any common format. I also need something for offsite backup of critical files (my onsite backup is to my local NAS server), and HD and BluRay are a HELL of a lot cheaper than LTO tape.
Of course that still leaves a couple of important decisions:
1. What about a monitor
2. How much graphic power to pay for (and specifically, to SLI/X-fire or not)
3. How much memory (which is related to the SLI question)
I want to address the monitor question first here; because it's of use to everyone, not just the hard core gamer types.
So, what about monitor?
Firstly, I never count the monitor as part of the computer purchase. Oh yes, you need to budget for it; but it’s like counting your TV as part of your stereo. You don't want to economize on your monitor to get more computer; neither do you want to economize on your computer to get more monitor.
The monitor is the most important element of your human computer interface. Every interaction you have with your computer runs through that monitor; and how happy you are with your computer is to a large extent determined by how happy you are with your monitor.
Also, monitors are one area of computing where you get a decent return on your investment in a higher end piece of gear; because you are likely to keep your monitor for far longer than you keep the rest of the computer. Up until I moved to all LCD a few years back, I had monitors from 5 years, and even 10 years ago; and because I spent the money on top quality, they still looked and worked great.
The prices and deals on LCD displays change from day to day, so it's very hard to make specific recommendations.
Making it even harder, often brand is really little or no indicator as to quality, because there are only 6 fabs that make 19” and bigger LCD panels for computer displays anyway; it’s all the little extras like ports and warranties that make the difference.
I know lots of folks who think they’d never buy an LCD from Goldstar... except they don’t know that Goldstar and Samsung are two of the three leading manufacturers of LCD panels (NEC are the third); and that almost every manufacturer other than Sony (who make their own), including Apple, HP, Dell, Planar and ViewSonic all use Samsung, Goldstar, or NEC panels.
All that said, the extras really do make a difference. When you buy quality, what you're paying for TECHNICALLY is response time, different connectivity, better quality of signal processing, and better backlights (brighter, and with more accurate color spectrum).
More important though, you are also paying for a better warranty and fewer dead or stuck pixels. The better manufacturers all warrant against more than one or two stuck pixels, and most will replace the display if there is even one in a very noticeable spot. The lesser manufacturers offer little to no warranty.
Really, that IS worth the extra $50 or even $100 you pay for going to a higher end manufacturer; especially when buying a 24" or 30" panel.
Ok, now let's talk about the specific features to look for:
1. FOFO (full on/full off) response time of 4ms or less, and GTG (gray to gray) response time of 6ms or less (6ms and 8ms respectively are acceptable, 2 and 4 are ideal).
Unfortunately, most manufacturers are unclear as to what response time they are quoting. If you see 4ms (min) thats the Full On/Full Off response time. If you see 6ms(max) thats the GTG time. If they don’t specify FOFO, GTG, min, or max, then it’s either the average of the two (where they’ll usually note avg. or typ.), or they’re quoting a rise only time, which is actually twice as fast as the real number.
Important to note, if you see a time of 4ms or under quoted without a specific notation as to what they are quoting, and it’s on a less than top of the line display, they are almost certainly quoting rise time; so double it to get the real number.
2. Refresh rate of 120hz is preferred. Refresh rate may not be quoted; if the display has a GTG response time of 8ms or less, it’s probably 120hz. It's not absolutely necessary, but it helps.
3. Static (typical) contrast ratio above 600:1, 1000:1 or above preferred. Again, you may not be sure what number you're being quoted here.
4. dynamic contrast ratio of at least 3000:1, and 10000:1 preferred. Unfortunately, dynamic CR is a marketing number, but with many manufacturers, it’s all they give you.
Heres a tip: if you see a number above 1,000:1 on anything other than a top end display, it's dynamic; and if you see a number higher than 3000:1 on a display less than $2500, it’s a dynamic ratio.
5. A brightness of at least 300cdm^2/300nit at D6500K.
6. A color corrected backlight at D6500K. LED backlight if possible. This gives you better color accuracy, and longer display life.
7. DVI with HDCP, and possibly HDMI. Multiple inputs preferred. DisplayPort if possible. Basically, the more inputs the better.
8. Pixel pitch of under .28mm; .25mm or under preferred. If pixel pitch isn’t quoted you can figure it by dividing the long dimension of the screen in mm by the long dimension of the native resolution in pixels.
Here's a trick for you. If the length and width of the display area are not quoted, you can figure them with the aspect ratio, the diagonal measurement, and the Pythagorean theorem.
A 19” diagonal screen at 4:3 works out as follows:
19^2 = 361
(4^2)+(3^2) = 25
361/25 = 14.44
sqrt(14.44) = 3.8
4 x 3.8 = 15.2” width
3 x 3.8 = 11.4” height
A 24” 16:9 screen works out as follows:
24^2 = 576
(16^2)+(9^2) = 337
576/337 = 1.709
sqrt(1.709) = 1.307
16 x 1.307 = 20.91” width
9 x 1.307 = 11.77” height
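For anyone who'd rather not push the numbers around by hand, here's a quick back-of-the-envelope sketch of the same calculations in Python (the function names are my own, not from any library):

import math

def screen_dimensions(diagonal_in, aspect_w, aspect_h):
    """Return (width, height) in inches for a given diagonal and aspect ratio."""
    # diagonal^2 = (aspect_w*k)^2 + (aspect_h*k)^2, so solve for the scale factor k
    k = math.sqrt(diagonal_in ** 2 / (aspect_w ** 2 + aspect_h ** 2))
    return aspect_w * k, aspect_h * k

def pixel_pitch_mm(width_in, horizontal_pixels):
    """Approximate pixel pitch: screen width in mm divided by horizontal resolution."""
    return (width_in * 25.4) / horizontal_pixels

w, h = screen_dimensions(24, 16, 9)
print(f"24in 16:9 -> {w:.2f}in x {h:.2f}in")                    # ~20.92in x ~11.77in
print(f"pitch at 1920 wide: {pixel_pitch_mm(w, 1920):.3f}mm")   # ~0.277mm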
Knowing this height and width spec is important in deciding between 4:3 and 16:9; because the diagonal measurements are not comparable. A 24” 16:9 actually gives you about the same screen HEIGHT as a 19” 4:3; it just adds an extra 5.5” or so of width.
The smallest I’d personally go with right now would be a 19” 4:3 or a 24” 16:9. The smaller models are almost all considered “economy” models. Either is about the minimum size needed to display a standard 8.5"x11" piece of paper at 100% scale in portrait orientation.
Personally, I happen to think the sweet spot today is in the 24” 16:9 displays. They give you a massive screen area, with great resolution and pixel pitch; but they’re down under $500 for even the best quality models, and around $350 for discount models. If you don’t feel like spending that much, then get a 19” 4:3. Don't bother going for a 20" or 22" 16:9 or a 17" 4:3 unless you are really looking to save money.
The next step up that’s worthwhile is the 30” now offered by Apple, HP, Dell, and Samsung (all using the same Samsung panel actually), and they’re incredibly gorgeous, huge, and spectacular… but they sell from $1300 to $1900. There are 26" and 28" models, but I really don't think they offer much advantage over 24" displays, or much price savings over 30".
As to brands, I like NEC, ViewSonic, Samsung, HP and Dell (usually Samsung), Apple (also usually Samsung), LG, and Planar.
Now, a word about specs. Some manufacturers are more honest than others. Most of the mass market manufacturers (even the good ones) vastly overstate their actual performance. Companies that cater to graphics professionals (Apple, ViewSonic, Planar etc...) on the other hand tend to report their specifications relatively accurately.
All of my recommendations above are based on the semi-reliable marketing numbers of the better mass market brands.
If the manufacturer is very specific, quoting GTG, typical contrast ratio instead of dynamic, exact pixel pitch, etc…, they are most likely at least reasonably accurate; so if they quote a spec that’s good, but less than another brand’s, it’s likely that the more accurate number is going to be just as good as, or even better than, the “better” but less accurate number.
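If you want those rules of thumb in one place, here's a rough sketch that encodes them (Python; the helper names and the "top end" flag are my own judgment calls, not anything official):

def effective_response_time_ms(quoted_ms, notation=None, top_end=False):
    """If a sub-4ms time is quoted with no FOFO/GTG/min/max notation on a
    non-top-end display, assume it's a rise-only time and double it."""
    if notation is None and quoted_ms <= 4 and not top_end:
        return quoted_ms * 2
    return quoted_ms

def contrast_ratio_is_dynamic(quoted_ratio, price_usd, top_end=False):
    """Flag quoted contrast ratios that are almost certainly dynamic, not static."""
    if quoted_ratio > 3000 and price_usd < 2500:
        return True
    if quoted_ratio > 1000 and not top_end:
        return True
    return False

print(effective_response_time_ms(4))          # 8 -- treat that as the real number
print(contrast_ratio_is_dynamic(10000, 400))  # True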
Alright now, graphics and memory...
If in fact you ARE a hardcore gamer or power user, your memory, and your video card are your most important configuration decisions; and they are tightly interrelated.
If you are a power user, but NOT a gamer, you should simply pack as much memory in as you can afford; and your operating system supports. If you run 32 bit windows (about 96% of all computer users), that's 4GB; which conveniently is the amount that most consumer motherboards support.
More memory is just about always better; because you can have more files open and in relatively fast memory without going back to relatively slow disk. For applications like video editing software, where you are performing multiple operations simultaneously on very large files, memory is as important as electricity.
If you are a hardcore gamer though, the question becomes more difficult; and modern graphics cards, especially with SLI, are the reason.
SLI is a way of using multiple video cards to improve performance; and the next few hundred words or so are why I think that even a hardcore gamer shouldn't bother with it.
There are two reasons why SLI either doesn’t make much difference in performance, or can actually hurt it, in most real world configurations. To talk about this intelligently though, we’re going to have to go into a little compu-history lesson (and believe me, I’m being as brief as possible here. The FULL story would be a hell of a lot longer).
The original implementation of SLI, from 3DFx in 1998, was actually quite useful.
The 3d cards of the time just barely had the guts to render full scenes at 640x480 (yes, that low) and 30 frames per second. Most in fact could only run at 512x384... which is worse than my PDA today.
When 3DFx introduced the Voodoo 2 in ‘98, it was a revolution. The Voodoo2 could render 800x600x16bit at 30 frames per second (native TV refresh rate).
The biggest news with Voodoo2 however, was SLI; which stood for "scan line interleaving".
Unlike LCDs, which have discrete pixels that are turned on and off individually; analog CRT based displays (including tube televisions) have an electron gun which draws on the display’s surface in scan lines. These scan lines are split into two sets, half of which are displayed on one pass; then the other half are interlaced in between on the next pass.
By using two video cards, and having each card render every other scan line within a frame, you could go up to 640x480, and 60 frames per second, or 1024x768 and 30 frames per second.
Of course these days, even with the gnarliest dx9 3d games, the graphics card is no longer the limiting factor. Since 2004, with the mainstreaming of the PCIe bus for graphics; even relatively cheap graphics cards can render most games at 1280x1024x32bit, and 120fps, with 2x anti-aliasing, and trilinear filtering (graphics geek speak for "looks pretty damn good").
Given that’s the best frame rate and resolution that most people’s monitors can display, anything over and above that doesn’t do much to improve the play experience unless you’ve got a very large, very high end display (or multiple displays; but most games don’t work well, or at all, with multi-monitors).
In 2004, Nvidia (who acquired 3dfx’s assets out of bankruptcy) announced a new version of SLI for their top end cards, using their own motherboard chipset. ATI also introduced their own version, which they call "CrossFire" and which works very similarly (I'm just going to refer to SLI from this point forward for convenience, but most of what I say applies to both).
The NEW version of SLI is the “scalable link interface”. Since we’ve switched to primarily progressive mode displays for our computers (LCDs), it uses one of two different methods for sharing the rendering load: it can split a frame and have each card render half, or it can have each card render every other frame. It’s not a terribly efficient way of doing things, because each card is often processing the same data as the other over and over (static backgrounds and the like), but it will give you an increase in frame rates at maximum resolution.
Now, no monitor today can display at more than 120 frames per second, because that’s the maximum refresh rate of the monitors; so going above 120fps is pointless.
On test scenes where the CPU doesn’t have to deal with physics issues, and the scene is programmed to use all the advanced features of the cards, a top end SLI configuration can render scenes at 2560x1600x32bit, on a 30” display, with 8x anti-aliasing, smart filtering, smart texturing, smart shading, and full dynamic lighting; at over 120 frames per second.
In fact some cards in SLI mode can render such a scene at 200+ frames per second… but the best monitor can only display 120fps; and this isn’t a case where having too much data is actually still helpful because you can accurately downsample (as it is when you can compute at a higher resolution than you can display). There is nothing to downsample to, the card simply drops the extra frames on the floor when it pushes out to the display.
Oh and of course I should note, most displays still can’t do 120hz; most LCDs are just 60hz, and most CRTs can only do 75-90hz. So again, unless you have a top end display…
The third way that SLI can increase performance can actually be useful for something other than raw framerate. In SLI Anti-aliasing mode, the cards render alternate frames, and export them back to each other; then one card is used to handle half of the anti-aliasing on a single frame, and the other card to handle the other half.
This works, because Anti-aliasing is the most intensive work the GPU does; involving oversampling each polygon as many as 64 times in as many as 8 locations per vertex. By enabling this mode, you can have one card anti-alias half the locations per vertex, and the other, the other half. This lets you maintain frame rates that you got from 8x anti-aliasing on a single card, all the way up to 32x anti-aliasing (Nvidia just announced a 3 way SLI implementation that enables 64x anti-aliasing); and produces slightly better image quality even at the same level as with single card AA.
Of course the human eye can’t see the difference between 8x and 32x at most resolutions on most sized screens; and in reality none of those scenes could be rendered that highly anyway, because in a real game, the CPU would be too busy with AI and physics information to keep the pipeline full at those resolutions.
At lower resolutions, as I said before, even low end cards can render acceptably, and midrange cards render just as well as the high end ones. There’s really no benefit to SLI at lower resolutions at all.
As you can see, SLI is really just bragging points right now; because the GPU is no longer a limiting factor in any modern game, at any reasonable resolution, level of detail, or level of filtering.
For the last few years, programmers have been focused on improving the physics engines, AIs, and element simulation in their games. This has all put massive strain on the CPU, while graphics have been relatively static (and has also resulted in the development of a physics accelerator card).
Now that DX10 games are coming out, we’re getting a little more demand on the video cards. Games like Crysis for example really do need a high end card to keep up; but still, the limiting factor on games is mostly the CPU; and likely will be until games are programmed to offload physics tasks onto multiple cores or dedicated sub processors (which is starting to happen, slowly).
There are two other things that together make SLI kind of pointless; the 4GB memory barrier, and MMIO.
On 32 bit operating systems, the maximum physical memory addressing space is 4GB. That means that no matter how much memory is in a box, a 32 bit OS will only see 4GB of it, unless you do some weird tricks with memory allocation (which many dedicated servers can do for example).
Now, for most applications and with most hardware, that isn’t an issue; because on 32 bit Windows the most memory that can be assigned to a single application is 3GB anyway (and that only with the /3GB boot switch; by default the split is 2GB for the application and 2GB for the kernel). Unfortunately for us, memory requirements have really been going up in games the last few years, and so now, some games can easily use all 3GB available to them (and all your swap space too).
Ok, again, still not a problem because you’ve got 4gb right?
Well… no.
When the 32 bit memory space was first opened up to personal computers, with the introduction of the Intel 80386 processor in 1986 (Macs had 32 bit processors beginning with the Mac Plus in 1986, but only got a full 32 bit address space with System 7 in 1991); the idea of 4GB in a home computer was thought preposterous. The few people who thought we might eventually see such memory usage all assumed we would have transitioned to 64 bit systems by then.
In fact, every other platform HAS transitioned to 64 bit; most of them within 10 years of the 386’s introduction (though most still operate in both modes depending on circumstances).
Every platform except Windows.
Even more irritating, is that for about 3 years now, every new processor and chipset has also been 64bit; but the operating system isn’t using that capability.
At any rate, no-one thought we would ever use 4GB of RAM, for as long as 4GB of RAM was a limitation; so they decided to use that “extra” addressing space.
What they came up with, was called “Memory Mapped Input and Output” or MMIO.
Actually MMIO had been used before; it was used in the 640k days as well, and there were several different workarounds and memory managers that moved the MMIO space into the UMA (the upper memory area, between 640k and 1MB). This gave you most of your memory below 640k back, and put the device maps into the area above 640k, so that the low memory area was usable by programs.
Well, an identical situation exists today. Every physical device in a computer needs to have a dedicated address space so that software can talk to it. The way MMIO works, is that rather than create a separate addressing space for every device (which would have been a nightmare); programmers simply wrote the BIOS and OS to write to an address in main memory that was mapped to the IO of that device; thus Memory Mapped IO.
When the 32 bit address space became available, it was natural to use MMIO again. So for all these new devices, they allocated addresses starting with the very last byte of the 4GB space, and counted down from there.
Initially this wasn’t a big deal. The MMIO space used by devices on a system was typically less than 64k. Then AGP and USB came along, and devices started requesting a lot more address space. Sometimes as much as 256MB; because the video card needed to map every bit of its onboard memory into the main memory addressing space.
Again, not a problem; because no-one could afford more than 2GB of RAM, and no non-server chipsets (server chipsets handle the problem by extending their mapping space through multiple 32 bit address tables, and remapping MMIO to a virtual space counting down from 32GB) supported 4GB anyway.
In 2004, four things happened pretty much all at once to make this 4GB issue a potential problem: Graphics cards started regularly coming with 512MB of memory or more (there are now cards with two GPUs on one card, and 1.5GB of memory), Nvidia introduced their new SLI system to work with these higher end cards, Intel and Nvidia both introduced new chipsets and reference motherboards that supported 4GB (or even 8GB) of RAM, and RAM prices crashed through the floor.
At this point, you could put 4GB of RAM in a system for a relatively reasonable amount of money; but when you put that 768MB graphics card in, all of a sudden your operating system could only see and address 3.2GB. Put two of them into a box, and you’re down to 2.5GB.
So you’ve paid for 4GB, and installed 4GB; but you’re only actually able to use 2.5GB?
That’s a problem… but at least there weren’t very many games that needed more than 2GB of RAM…
...Until Supreme Commander came out, and it wanted every bit of memory you could give it. In fact, it wanted MORE than you could give it, and tried to steal memory from the OS, thus causing many many crashes.
In Supreme Commander, you are the strategic commander of an armored mech army, which can vary in size from a few units, up to several thousand. Each individual unit is fully rendered and textured, and has its own AI. All terrain is also fully rendered. There can be several hundred fully rendered units on screen at any given time.
There is no system today that can handle Supreme Commander at… really any resolution. A 3GHz quad core with 3.2GB of RAM (down from 4GB), and a 768MB 8800GTX is the best you can do, and even then it slows down to under 30fps quite frequently.
Given the amount of 3D work required, one would think an SLI configuration would be perfect for Supreme Commander. The problem is the memory hole created by the cards’ memory.
If you run Supreme Commander on an SLI system, all you’re doing is taking away memory that the CPU needs to handle all the units on the screen.
Today, you have the capability to put 3 gigs worth of video RAM into a system; but if you did that in a system with 4GB of RAM, you’d end up with just 1GB of usable system memory. Because of the 4GB memory barrier on 32 bit operating systems, you can’t add more RAM to compensate. All you can do is decide whether you need more video RAM, or more system RAM; and choose your video memory accordingly.
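To put rough numbers on that tradeoff, here's a minimal sketch of the arithmetic (Python; the extra MMIO reserved for other devices varies by chipset, so treat that parameter as a ballpark guess):

MB_PER_GB = 1024  # work in megabytes for convenience

def usable_system_ram_mb(installed_mb, video_ram_mb, other_mmio_mb=0):
    """What a 32 bit OS can actually use once MMIO is carved out of the 4GB space."""
    mmio_hole_mb = video_ram_mb + other_mmio_mb
    return min(installed_mb, 4 * MB_PER_GB - mmio_hole_mb)

print(usable_system_ram_mb(4 * MB_PER_GB, 768))            # 3328 MB -- about the 3.2GB above
print(usable_system_ram_mb(4 * MB_PER_GB, 2 * 768))        # 2560 MB -- the 2.5GB two-card case
print(usable_system_ram_mb(4 * MB_PER_GB, 3 * MB_PER_GB))  # 1024 MB -- the 1GB worst case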
Honestly, my personal recommendation is to skip the SLI, and cards with more than 512mb of video memory; and keep that system memory available.
Only very rarely can a single card effectively use more than 512 megs of video RAM. In theory, there are some situations where running two 512 meg cards in an SLI configuration (which cuts you down to under 3GB of usable system RAM) will give you enough of a graphical performance increase to justify losing that gig. Most of the time however, you aren't going to see that kind of performance boost.
My personal recommendation is to buy one very good card with 512 megs of video ram (or if you must, up to 768); and pay for the full 4 gigs of system RAM.
Well, I hope this was helpful. I realize there's a lot of obscure detail here, but if you're trying to make an informed decision, this is the absolute minimum of what you need to know. We haven't even gone into things like chipset differences, sound cards, or processor architecture issues; and those would be another 10,000 words easy. Don't even get me started on Vista vs. XP, or 32 bit vs. 64 bit.
Any questions?
Monday, December 17, 2007
Say Hello to "The Behemoth"
Rising above the city, blocking out the noonday sun, it warps the mighty redwoods and it towers over everyone --
That right there would be our new 61" HDTV.
It's BIG.
It's pretty too... but damn it's huge.
Two weeks ago I went through the whole process of picking out a new HDTV, and explained how we'd settled on a three chip rear projection from JVC.
Last week I mentioned that the retailer I was going to purchase the JVC from screwed things up; and I wasn't able to get what I wanted at the price that I wanted.
Well, that problem didn't change the fact that I wanted a three chip rear projector; or how great the JVC was, so I started looking for other purchasing options.
Unfortunately, it was a discontinued model; and it's been replaced by a model that's three times as expensive, so I had to find a retailer with new old stock. I ended up talking with about twenty different places; none of them had more than one or two in stock, and they were all at least $1900 plus $200+ shipping for it.
Well, at that price, I might as well have just gone ahead and bought a plasma or LCD.
Our other option for three chip LCoS was to go to Sony... now, I'm not a big Sony fan; in fact I really don't care to deal with them. They make everything as proprietary a pain in the ass as possible, just because they think they can.
Well, I hadn't really seriously researched the Sony; but it was still a three chip rear projector, and it was still a hell of a lot cheaper than a plasma or LCD of equivalent quality, so I decided to research the Sony options. I found that the reviews were generally comparable to the JVC, and the pricing was from $1600 to $1800 plus shipping depending on the retailer.
Sony had another advantage, in that the TV was available locally at a reasonable price (from Costco, Best Buy, and Sam's Club). Not quite an online price, but taking into account shipping etc... the difference was pretty small.
So, we headed out to take a look; and honestly weren't all that impressed with the Sony. Oh sure, it was good (though I didn't think as good as the JVC), but I didn't like the connectivity options, and the picture was just OK.
Costco had a couple Panasonic plasmas at blowout pricing, including a 50" 1080p for $2300; and I was severely tempted. It was a great TV with a great picture; but it was also a little smaller than I wanted, and a bit more than I wanted to spend.
Right next to it though, they had a TV that I hadn't seriously considered.
Panasonic is promoting a new three chip LCD rear projection lineup, using a high power lamp technology similar to LED (it isn't actually an LED; but it doesn't use a conventional filament, or an HID arc like DLPs) that they are calling LiFi. It's a color corrected, and very bright, light source; but instead of the 3,000 to 5,000 hour lamp life of a DLP or LCoS set, the LiFi lamps have a 50,000 hour plus lamp life.
Basically, you never have to change the lamps.
Other than the light source, the TVs are conventional three chip LCD systems. To my mind better than DLP, but not quite as good as LCoS. The biggest practical difference is that LCoS has a bit better black level, and a bit better off axis consistency.
Anyway, the 61" Panasonic PT-61LCZ70 was sitting there right next to the plasmas, and it looked pretty damned good. In fact, it looked a fair bit better than the Sony; especially from 7-9 feet back (my seating range).
The kicker though, was the pricing. List price on these is $2000, plus $300 for a stand. Costco is clearing these out for $1250 including the stand.
Well, we liked the picture, and we certainly liked the price... plus it's Costco, who has the world's best return policy; so what the hell, we took it home last week. Oh, and since I didn't need the stand, I sold it on craigslist the next day.
Now, the reviews on the set are mixed. They all rated the picture as excellent, with great quality and great connectivity options; but that color accuracy and black level were poor without calibration. Also pre-production samples apparently had a problem with lamp color cast stability (the color would change over time as the lamp wore in).
Well, I'll agree, out of the box with default settings, the color accuracy was poor. Honestly, if I had just plugged the TV in and watched stuff, without figuring out how to set the TV up properly, it would have looked horrible and I would have been very dissatisfied.
Thankfully, I knew what to expect; and with a couple hours worth of effort, experimentation, and a calibration DVD; I was able to get black levels, and color accuracy as good as any other RPTV I've tried. There is still a little bit too much saturation in the greens and blues, but when I have the system professionally calibrated I'm sure they'll be able to fix that.
With the right display settings, I was able to get excellent brightness, sharpness, and color; with very deep, satisfying blacks without loss of shadow detail. I will say it took a hell of a lot of fiddling around to find the right combination of settings however; probably more time than the reviewers who mentioned poor black levels were willing to take. The TV has both an automatic and a manual lamp and light level adjustment, with five different tweaks to the overall light output; and that alone took the most effort to figure out and set properly. Not coincidentally, it is also the most important setting for getting dark blacks without loss of detail.
Also important to note, each input, and each picture mode, had to be set up individually (though at least each input and mode has its own setting memory). This was a real pain in the butt without a question.
Now that said, the fact that there WERE all those controls, user accessible, and relatively easy to understand, is a good thing; as is the fact that all the settings were individual to an input and mode. A lot of televisions hide adjustments away in service menus only accessible to technicians etc.; and some only allow one setting for the whole set, or the whole mode. HDTV input sources can vary greatly from SD-DVD and HD-DVD sources; and all three really do require their own optimization.
One irritation, that is unfortunately not specific to this TV, is that viewing different sources from the same input (like say SDTV and HDTV from your set top box) also requires two different sets of settings to look as good as possible. You can compromise and get a very good picture; but for the BEST picture, you need to readjust whenever you switch from a standard def TV show to a high def one. This is obviously inconvenient, and somewhat impractical. You could do it by setting different modes on the same input with different optimization settings; but only the custom mode (one per input) allows you full control over all your picture options, and thus can produce the absolute best quality picture.
As I said, it's an irritation, but it's one common to most every HDTV, and is inherent to the nature of the input sources.
Just as an example of picture quality; here is an un-enhanced (it's only been resized) photograph of my television screen. The slight noise and distortion is from my camera and lens, not the screen.
Football looks spectacular, movies look spectacular, HDTV looks great, regular DVDs upconverted look nearly as good as HD... and SD is SD. It will upconvert native signals and it does a decent job of it, but garbage in garbage out; and a good quality TV makes all the flaws in SD signals readily apparent.
Watching Cars (upconverted) or Shrek 3 (HD-DVD) giving a fully digital path for a fully digital image, the results are startling. The clarity is amazing, but because it is an RPTV and not an LCD, you never get that "artificial" or "hyper real" look. Everything looks natural and slightly filmlike.
Overall we're very happy with the set; and for the money I think we got excellent value.
Oh and did I mention it's damn big? I've slept on beds smaller than this thing.
Sunday, December 16, 2007
Just watched Star Wars Episode IV in HD...
The special edition was on HBO HD.
It looked gorgeous (though it's a bit odd seeing the mold lines and brush strokes on the props), but let me just say:
HAN SHOT FIRST!
... I hate George Lucas; truly I do.
Thursday, December 13, 2007
Beef - It's what's for dinner
For the next six months or so...
That my friends, is a pic of about 200lbs of free range, grass fed, prime, dry aged beef (it's been blast frozen).
We have a share in a local cattle co-op, delivered as a 1/2 steer trimmed, dressed, aged, and prepped. We pay $1.50 per pound live weight, which typically ends up at about $3.75 to $4.25 a pound delivered weight (dependent on butchering, aging etc..); picked up from the packer.
Right there you see the 24lbs of rib eyes, and 24lbs of T-Bones (the 27lbs of sirloins are right behind the t-bones). No porterhouses cut on this run unfortunately. We do have a rib eye roast and a couple eye of round roasts though... oh and about 15lbs of ribs.
...and that would be the 55lbs of hamburger. Oh and there's 40lbs of soup bones (shanks, tail etc...) not shown in these pictures (or counted in the prepped weight).
All in all, it's about $2000 worth of beef at wholesale butcher prices, or about twice that (prime dry aged steak runs $25-$30 a pound, roasts about $15 a pound) at supermarket retail (if your supermarket even HAS prime beef).
More, better beef, for less money... Not bad that.
Sick as a damn dog
I've had a low grade sinus cold for about a week; and then suddenly in the last two days it's gone into a full blown flaring infection.
Joy.
It's been raining and chilly here for about two weeks; and the humidity is so high I'm actually getting condensation on my tile floors, and on some of my walls. Combine that with the dust and mold issues we've got here in Phoenix, and you can see it's a situation ripe for environmental aggravation; turning a minor sinus cold into a major league pain in the ass (and head).
Anyway, that's why so little content the last few days; and likely the next couple as well. I have another TV post to write, a camera post to finish up, and a PC post to finish up... all paeans to gadgetary consumerism; but I doubt all three will be out the door this week.
Tuesday, December 11, 2007
Sonsabitches
So, yaknow that really spectacular deal I was getting on that really spectacular TV?
Well, it turns out when the seller said they had one remaining in stock, and took my money, and confirmed a 7 day delivery time... They were lying.
See, they didn't have any remaining in stock; and since the model is discontinued, they won't be getting any more.
I'm not so much irritated at not getting the deal there; what I'm really pissed about is that they KNEW they couldn't fulfill my order on Friday; but they neither called, nor emailed me. I had to call when I got nervous about not receiving my delivery scheduling email.
It seems that rather than cancel my order, they processed it, and as the warehouse didn't have any remaining stock, the expediter put it on backorder status; even though the item is not backorderable.
If I hadn't called in, my order would have just sat there on backorder for up to 90 days.
Anyway, other options are currently being investigated.
UPDATE: Ok, several people have asked me to publish the name of the retailer who pulled this.
The culprit as it were, was Beach Camera; who is also Beach Video, Beach Photo, and BuyDig.com (if you call them directly, they answer as BuyDig.com).
I have to say in their defense, I have dealt with them before, and this was the first trouble I've had. They are generally very highly rated. Also, once I did call, they refunded my money within 24 hours (it was processed the same day, and credited back yesterday in fact).
Monday, December 10, 2007
My plan for global home theater domination is nearly complete
Bwahahahahaha...
Oh I'm sorry, did I say that out loud?
So a few weeks ago, we bought an HD-DVD player. A couple weeks after that we bought an HD TiVO, and had our little fun with our HD cable service.
Of course the problem is, we don't have an HDTV yet. We were planning on buying one this Christmas, but had to hold off because my contract extension hadn't been confirmed yet.
Well, it was confirmed last week; and as of last Thursday this little... or rather this quite large puppy, is rolling its way towards our happy home:
That my friends, is the JVC 56" HD-ILA (LCoS) rear projection television, model number HD-56fh97.
But wait batman, I thought you were going to buy a Sharp Aquos LCD? Was ist los? Rear Projection? Isn't that like, totally '80s?
Nay friends; with LCoS technology, rear projection is in fact STILL the best choice for real home theater, and HD sports; at least outside of a dedicated theater room (where of course you want a real front projector).
Now, I've been on the LCD path for a while; and I was pretty much decided on the 52" Sharp Aquos 52D92U; which is generally the highest rated flat panel television you can get for under $5000.
It's gorgeous. It's also rather expensive, with street prices in the $3,000 to $3500 range from reputable sellers; which meant that I was going to have to wait at least a few more months if I wanted to pick it up.
One other problem though... I love the picture on this Sharp... BUT... Although this very high end $3500 LCD, the top of Sharp's range in this size, has a 120hz refresh rate, and 4ms response time... There's STILL perceptible motion blur in fast action and text on screen, oversaturation of primary colors, and visible grayness in black areas.
See, my primary interest in watching an HDTV, is for spectacular movies; with a secondary task of giving me spectacular football.
It just so happens that movies with a lot of dark contrast (like most sci-fi, fantasy, and action movies - my favorite genres), and fast motion against detailed, colorful backgrounds (like say, football players running down field, and game information text) are the hardest images to display well; and the areas where LCDs have the most difficulty.
That's not to say the Aquos doesn't do a good job, but like literally every LCD yet made; these limitations are a part of the technology. Even the best rated LCDs, the new $8000 Pioneer sets, have some level of motion blurring, and very slightly greyish blacks.
So anyway, knowing that I would have several months to wait; I've been doing research, and reading reviews and recommendations from Sound and Vision, Home Theater magazine, Ultimate AV, Consumer Reports, and others.
Right now, your options for large screen HDTVs are front projection, Plasma, LCD, DLP, and LCoS (front projectors generally use one of the last three technologies).
Front projectors are used primarily in dedicated theater room setups, with very large silvered screens. They produce an image in the same way (or a similar way) that modern digital cinema projectors do; and there is no better way to experience a movie in your home, especially if you want a screen above 70". Unfortunately, they are quite expensive, somewhat noisy, and like all projection systems (including rear projection TVs by the way), they use powerful lamps, which will eventually burn out (typically every 3-5 years or so) and must be replaced, at a cost of $250-$500 per lamp (depending on the exact set). Finally, they require total darkness to function properly, so you really need to use them in a dedicated theater room.
Yes... that's right out for us I'm afraid.
Plasmas produce the best picture of the other options available; by individually illuminating tiny plasma cells in the screen, you can produce much truer blacks than with an LCD; while still maintaining sharp and bright colors, and a very bright picture overall. Unfortunately they are quite expensive in the larger screen sizes, they are susceptible to burn in (yes, still, though not as bad as they once were), and they have very reflective screens that are not great for use in areas with poor light control....
... like say, my living room; which has one mostly glass wall, and a semi-open back wall with the rest of the house lights shining through it. We generally watch movies in the dark, but the kids use that TV in the daytime, and we need to have something that you can easily see in uneven lighting conditions.
Oh and big, thin, flat, easily broken glass screens mixed with two kids, two cats, and two dogs... It's a concern.
Ok, so front projection and plasma are out, and LCD has issues and costs as we talked about already (and will some more in just a bit)... how about rear projection?
A lot of folks think of rear projection TVs as the huge, ugly, unreliable, not very sharp, or contrasty, or colorful, or bright big screen TVs they got used to in the 80s.
Not even close.
Today's DLP and LCoS sets are COMPLETELY different technology, and can produce images every bit as good as the direct view technologies of plasma and LCD. The best part is though, they do it at a fraction of the cost.
DLP is actually a very interesting technology. It uses one monochrome reflective microdisplay at half the horizontal resolution of the screen (so a 1080p display is actually at 960x1080 instead of 1920x1080) and then uses millions of tiny, individually aimable micro-mirrors, and a high speed rotating color wheel to produce the image at full resolution and in color (a process which they call wobulation... no, seriously, they do).
DLP, like every other display technology, has a couple problems. Though it generally produces good blacks, it has a susceptibility to stray light from the colorwheel and millions of tiny mirrors; so they aren't as good as plasmas. Also that same stray light issue can cause some color smearing, or color rainbows to appear around the edges of objects for some viewers (especially those with astigmatism or who've had Lasik).
The wife has a severe astigmatism, and I'm planning on Lasik...
Right then... LCD it is; I'll just have to bite the bullet on price and motion blur...
Or maybe not...
In all my research the same theme kept popping up: Although serious videophiles didn't much care for DLP, they actually preferred the new 3 chip rear projection systems (Liquid Crystal On Silicon LCoS - sold as HD-ILA by JVC, and SXRD by Sony) to all but the very top end of plasma and LCD TVs.
So I started doing some more targeted research.
LCoS rear projection TVs reflect the projection lamp off of three (appx. 2" wide) microdisplays, to project all three primary color images simultaneously through a collimator lens. Each microdisplay independently runs at full 1920x1080 resolution at 120hz or 150hz (or any number that is evenly divisible by 30, actually) progressive scan; and there are no wobulating mirrors or color wheels to reduce resolution, soften the image, produce rainbow artifacts, or produce a screen door effect.
This produces a clearer, sharper, brighter image than other projection technologies, with fewer image artifacts, and better color accuracy. Also, because there is no color wheel and no micro-mirrors, reliability and lifespan are increased, and noise and warm-up time are decreased as compared to DLP.
Also, because of the nature of the technology, the projection screens for LCoS use a much smaller grain structure, and present a very natural appearance as compared to DLP (or LCD for that matter, which can appear to be TOO sharp, and actually worsen the appearance of film based source programming).
Specifications on top end LCoS TVs are comparable to the highest end plasma and LCD tvs, with typical contrast ratios in excess of 3000:1; and contrast ratios when using auto iris controls to optimize black levels, as high as 10,000:1. LCoS sets can have brightness levels that are naturally variable (as film is, and direct view LCDs are not) from .01 foot lamberts, to over 100fl (over 10,000 to one); with neutral D6500K images from 27fl to 48fl (video reference standard is 30fl) depending on adjustments.
Most LCDs are naturally much brighter than 30fl at D6500K neutral gray (as in 45fl or more). This looks better in a brightly lit room, especially under fluorescent light; but can cause eyestrain and oversaturation of colors in darkness. Also, most LCDs can't produce an image as dark as .01fl, nor as bright as 100fl; because they are transmissive. In a direct view LCD, the light from the backlight is transmitted through the pixels; and light output is dependent on just how opaque, or how transparent, the pixels can get.
This is why direct view LCDs still can't quite show true black as well as other technologies can (though this is changing with dynamic LED based backlighting). They shine a VERY bright backlight through the screen to ensure bright whites and colors, but then depend on that same screen they shine through for colors, to block the light out completely for blacks.
Obviously, this is somewhat imperfect, with transmissive technologies typically having black levels 3 to 5 times higher than reflective technologies; which themselves are 5 to 10 times (or at least 5-10 times within the measuring tolerance) higher than direct emission technologies (such as CRT). Of course these are all still very low levels of light; but in scenes which have bright whites, and dark blacks simultaneously, this difference can be obvious.
LCoS uses a reflected light technology; where the projector lamp is bounced off the microdisplay chip to produce the image; and therefore the darkness of blacks isn't dependent on the light blocking ability of a semi translucent pixel; but rather the lack of light reflection off a black pixel. It still doesn't produce as true a black as a reference grade CRT (which have contrast ratios too high for the sensitivity of most measuring equipment) or the best plasmas (which have typical contrast ratios of as high as 15,000:1); but it's pretty close.
You do lose some of the saturation and brightness of the very brightest colors as compared to the best plasmas and LCDs; but this reflective display results in a more natural rendering; because the light is delivered to your eyes in a way closer to that of natural vision (where light is reflected off objects, not transmitted through them, or emitted from them). LCDs and plasmas can often have a "hyper-reality" look to them, because of that transmissive image production; where it feels like the picture is actually being projected into your eyes directly.
Though... some people actually prefer that hyper reality look. It can produce an almost 3D effect, and when watching high end animation (try Disneys "Cars" for an example), the impact is truly spectacular.
At this point, I decided I wanted to take a look. I visited a couple of Best Buys, and Frys; and saw the LCDs, plasma, DLPs, and LCoS sets I was generally interested in; both in bright floor environments, and in darkened viewing rooms.
There really is no substitute for doing this by the way. You can never really tell what a set is going to look like until you view it in variable lighting and viewing angle conditions.
Although I wasn't able to adjust the picture settings on any of the TVs I looked at (typically floor demo TVs are adjusted to ridiculously overbright, oversaturated, and oversharpened modes; because they make people say "ooooh pretty" on the showroom floor) I definitely got a good idea of the picture characteristics of the technologies, and specific models I was interested in.
What I found actually surprised me.
I went into this thinking I would greatly prefer the LCDs and plasmas over the rear projection TVs; and, for the DLPs, I was correct (DLPs all look slightly fuzzy and grainy to me, and tend to have poor color saturation to my mind); but the HD-ILA and SXRD sets I looked at were damn good.
Initially, I preferred the bright static images produced by the LCDs and plasmas on the brightly lit sales floor. Once I looked at football, and darker movies, in a darkened viewing room however; my preferences changed, and I thought the LCoS sets were producing a significantly better image than the LCDs, and all but the most expensive (as in well over $5000) plasmas.
And of course, although they were significantly more expensive than DLPs of the same size; the LCoS sets were all several hundred, to over $1000 cheaper than the equivalent LCD or Plasma options in the same range of price and quality.
So it was back again to research; this time looking at specific models.
Now size....
The biggest mistake people make with HDTVs is to buy too small; because they’re thinking of what is a comfortable viewing distance and angle for their old standard definition 4:3 interlaced TV.
There are two ways to calculate optimum viewing distance: the average angle rule of thumb, and the visual acuity calculation based method.
Using the average angle rule, with a 16:9 HDTV, optimal viewing distance is about 1.9 times the diagonal screen size; with the generally acceptable range being from 1.5 to 2.5 times the diagonal screen measurement.
Initially, the wife was ADAMANT that we buy something between 42” and 46”. She was absolutely convinced that anything larger would be “way too big”; because a 42” 16:9 TV looked “about the same size” as our 32” 4:3 TV.
I then showed her what HDTVs looked like in viewing environments, especially as compared to our current 32” 4:3 SDTV; and she started to listen to what I was saying, rather than her idea of what “huge” was.
Our living room is 28ft by 14ft, with a 9ft distance between our television's position and our primary seating positions. 9ft is 108”, so to keep that 1.5-2.5 screen size ratio, acceptable would be anything from a 44” to a 72”; and the ideal would be a 57”.
Based on this rough calculation, I decided to look for televisions between 46” and 61”, with the ideal range being 52"-58".
For a little more precision, you can use the visual acuity rule. To do so, take the actual height of the screen, and multiply it by 3.2 to get the optimal viewing distance for 1080p, or 4.8 for the optimal viewing distance for 720p (optimal for 480i is about 6.4).
Why height? Mostly because it's what we perceive to be the most important dimension of an image. We judge our scale of images based on how tall they are, not how wide. If you take a picture of a man six feet tall, and keep him centered in the frame, you can extend the sides of that frame out as far as you want, and the man will still “look normal”. Make the frame 30 feet wide, and you will still perceive the man's size as “normal”. Take that same picture, and extend the top and bottom, leaving the man the same, and you will perceive the man as looking smaller and smaller. It's just the way we humans are visually wired.
When we were all on tube TVs, the most popular “living room” sizes were 27” and 32”. A lot of us still have 32” TVs sitting in our main viewing area. The actual screen height of a 32” 4:3 TV is equal to about 20”, for an optimal viewing distance in SD of 128”, or 10.75 feet; which is typical of many living room viewing distances.
With a 16:9 TV though, as I said, our perception of size is different. A 16:9 TV displaying an SD signal would need to be 42” diagonal to “look as big” as a 32” 4:3 TV, and to have the same optimum viewing distance at the same resolution.
Of course, as you increase the resolution, the optimal viewing distance goes down. Since most of us don’t much want to change our viewing position (living rooms not notable for having mobile walls and such) that means going up in screen size.
We were very happy with our viewing position at 9ft, on our current TV; and we don't want to move our furniture; so we needed to optimize for that distance.
The actual screen height of a 56” 16:9 TV is 27.45”; so the optimal viewing distance for 1080p/i content is 88” (7’4"), and the optimal viewing distance for 720p is 132” (11'). If we average the two, we come up with 110” or…
Well, whadya know… just a bit over 9 feet.
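Here's the same math as a small sketch, in case you want to plug in your own screen size (Python; the function names are mine):

def angle_rule_distances(diagonal_in):
    """Acceptable-range and ideal viewing distances (inches) for a 16:9 HDTV,
    using the 1.5x-2.5x diagonal rule of thumb (1.9x ideal)."""
    return 1.5 * diagonal_in, 1.9 * diagonal_in, 2.5 * diagonal_in

def acuity_rule_distance(diagonal_in, factor):
    """Visual acuity distance (inches): 16:9 screen height times a per-resolution
    factor (3.2 for 1080p, 4.8 for 720p, about 6.4 for 480i)."""
    height_in = diagonal_in * 9 / (16 ** 2 + 9 ** 2) ** 0.5
    return height_in * factor

closest, ideal, farthest = angle_rule_distances(56)
print(closest, ideal, farthest)          # 84.0, 106.4, 140.0 inches
print(acuity_rule_distance(56, 3.2))     # ~87.9in -- about 7'4" for 1080p
print(acuity_rule_distance(56, 4.8))     # ~131.8in -- about 11' for 720p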
So, now I had a specific size range I was interested in, 46"-61" with an ideal of 52" to 58"; and I had a set of minimum requirements:
1. True 1080p with 1080p input over HDMI, and native 480i/p and 720i/p input support
2. At least two HDMI ports
3. Good quality of upscaling, de-interlacing, and 3:2 pulldown
4. Auto iris, and manual iris and gamma controls (giving a dynamic contrast ratio of 10,000:1)
5. D6500K color corrected lighting with a native contrast ratio of at least 3000:1
6. PC input preferred
7. A narrow bezel design, with unobtrusive speakers, black preferred
8. Useful color correction and calibration controls
9. A large variety of inputs
10. A cable card slot
This list of requirements left me with about a dozen models to sort through; and that's when I found exactly what I wanted.
The reviews on the JVC TheaterPro series were all uniformly excellent. The three models in the series (56", 61", and 70") were all essentially identical but for screen size; and in every review I found, they were either the top rated, or second from top rated, television in their size class (though they all noted that the sets needed to be calibrated out of the box for best color performance).
The TheaterPros were actually originally designed as reference monitors; and were sold under JVC's pro studio line starting in 2004. In 2006, they added the frilly consumer features, and announced them as consumer models, the FH96 and FN96 (the FH, which I've purchased, has a black bezel, and an RS232 serial port for professional home automation and home theater integration; the FN does not). The updated 97 models were announced in late 2006, and introduced in early 2007; and most of the major AV mags rated them as their best pick in class, or at least in their top five in class for the year.
What really sealed the deal for me though?
I started looking for prices. The 56" FH97s were announced at $3299 list; and went down to $2699 street. Then, this past September, JVC introduced a new 58" ultraslim (only 11" thick, and wallmountable) model that was otherwise technically identical to the 56". At that point, they decided to discontinue the 56, and reduced the MSRP to $2699 for closeout.
Well, when they discontinued it, street prices dropped to $1800 or so; but I managed to find one retailer (a top rated retailer on bizrate, paypal seller ratings, and resellerratings) blowing it out for $1299 with free shipping.
At that point, I was waiting for my contract renewal to come through to pull the trigger; and I was just hoping they didn't sell out before it happened. Last Thursday was finally the day, and it turned out none too soon, because it was the very last unopened one they had in stock (never take a floor model LCD, plasma, or projection TV. They lose significant life out on the floor).
In theory, the set will be here this Friday; though I haven't been able to confirm a dropoff time yet. I can't wait to see the Patriots crush the Jets in 56" 1080p glory.
Just two final steps... a new receiver (my current system doesn't switch HDMI), and new speakers to go with it (of course those final steps are going to be more expensive than everything else combined).
Oh I'm sorry, did I say that out loud?
So a few weeks ago, we bought an HD-DVD player. A couple weeks after that we bought an HD TiVO, and had our little fun with our HD cable service.
Of course the problem is, we don't have an HDTV yet. We were planning on buying one this Christmas, but had to hold off because my contract extension hadn't been confirmed yet.
Well, it was confirmed last week; and as of last Thursday this little... or rather this quite large puppy, is rolling it's way towards our happy home:
That my friends, is the JVC 56" HD-ILA (LCoS) rear projection television, model number HD-56fh97.
But wait batman, I thought you were going to buy a Sharp Aquos LCD? Was ist los? Rear Projection? Isn't that like, totally '80s?
Nay friends; with LCoS technology, rear projection is in fact STILL the best choice for real home theater, and HD sports; at least outside of a dedicated theater room (where of course you want a real front projector).
Now, I've been on the LCD path for a while; and I was pretty much decided on the 52" Sharp Aquos 52D92U; which is generally the highest rated flat panel television you can get for under $5000.
It's gorgeous. It's also rather expensive, with street prices in the $3,000 to $3500 range from reputable sellers; which meant that I was going to have to wait at least a few more months if I wanted to pick it up.
One other problem though... I love the picture on this Sharp... BUT... Although this very high end $3500 LCD, the top of Sharps range in this size, has a 120hz refresh rate, and 4ms response time... There's STILL perceptible motion blur in fast action and text on screen, oversaturation of primary colors, and visible grayness in black areas.
See, my primary interest in watching an HDTV, is for spectacular movies; with a secondary task of giving me spectacular football.
It just so happens that movies with a lot of dark contrast (like most sci-fi, fantasy, and action movies - my favorite genres), and fast motion against detailed, colorful backgrounds (like say, football players running down field, and game information text) are the hardest images to display well; and the areas where LCD's have the most difficulty.
That's not to say the Aquos doesn't do a good job, but like literally every LCD yet made; these limitations are a part of the technology. Even the best rated LCDS, the new $8000 Pioneer sets, have some level of motion blurring, and very slightly greyish blacks.
So anyway, knowing that I would have several months to wait; I've been doing research, and reading reviews and recommendations from Sound and vision, home theater mag, ultimate AV, Consumer Reports and others.
Right now, your options for large screen HDTVs are front projection, Plasma, LCD, DLP, and LCoS (front projectors generally use one of the last three technologies).
Front projectors are used primarily in dedicated theater room setups, with very large silvered screens. They produce an image in the same way (or a similar way) that modern digital cinema projectors do; and there is no better way to experience a movie in your home, especially if you want a screen above 70". Unfortunately, they are quite expensive, somewhat noisy, and like all projection systems (including rear projection TVS by the way), in that they use powerful lamps, which will eventually burn out (typically speaking ever 3-5 years or so), and must be replaced, at the cost of $250-$500 per lamp (depending on the exact set). Finally, they require total darkness to function properly, so you really need to use them in a dedicated theater room.
Yes... that's right out for us I'm afraid.
Plasmas produce the best picture of the other options available; by individually illuminating tiny plasma cells in the screen, you can produce much truer blacks than with an LCD; while still maintaining sharp and bright colors, and a very bright picture overall. Unfortunately they are quite expensive in the larger screen sizes, they are susceptible to burn in (yes, still, though not as bad as they once were), and they have very reflective screens that are not great for use in areas with poor light control....
... like say, my living room; which has one mostly glass wall, and a semi-open back wall with the rest of the house lights shining through it. We generally watch movies in the dark, but the kids use that TV in the daytime, and we need to have something that you can easily see in uneven lighting conditions.
Oh and big, thin, flat, easily broken glass screens mixed with two kids, two cats, and two dogs... It's a concern.
Ok, so front projection and plasma are out, and LCD has issues and costs as we talked about already (and will some more in jsut a bit)... how about rear projection?
A lot of folks think of rear projection tvs as the huge, ugly, unreliable, not very sharp, or contrasty, or colorful, or bright big screen tvs they got used to in the 80s.
Not even close.
Today's DLP and LCoS sets are COMPLETELY different technology, and can produce images every bit as good as the direct view technologies of plasma and LCD. The best part is, though, that they do it at a fraction of the cost.
DLP is actually a very interesting technology. It uses one monochrome reflective microdisplay at half the horizontal resolution of the screen (so a 1080p display is actually 960x1080 instead of 1920x1080), and then uses millions of tiny, individually aimable micro-mirrors, and a high speed rotating color wheel, to produce the image at full resolution and in color (a process which they call wobulation... no, seriously, they do).
DLP, like every other display technology, has a couple of problems. Though it generally produces good blacks, it is susceptible to stray light from the color wheel and the millions of tiny mirrors; so its blacks aren't as good as a plasma's. That same stray light issue can also cause some color smearing, or color rainbows around the edges of objects, for some viewers (especially those with astigmatism, or who've had Lasik).
The wife has a severe astigmatism, and I'm planning on Lasik...
Right then... LCD it is; I'll just have to bite the bullet on price and motion blur...
Or maybe not...
In all my research the same theme kept popping up: although serious videophiles didn't much care for DLP, they actually preferred the new three-chip rear projection systems (Liquid Crystal on Silicon, or LCoS - sold as HD-ILA by JVC, and SXRD by Sony) to all but the very top end of plasma and LCD TVs.
So I started doing some more targeted research.
LCoS rear projection TVs reflect the projection lamp off of three (approximately 2" wide) microdisplays, to project all three primary color images simultaneously through a collimator lens. Each microdisplay independently runs at full 1920x1080 progressive scan resolution, at 120Hz or 150Hz (or any refresh rate evenly divisible by 30, actually); and there are no wobulating mirrors or color wheels to reduce resolution, soften the image, produce rainbow artifacts, or produce a screen door effect.
This produces a clearer, sharper, brighter image than other projection technologies, with fewer image artifacts, and better color accuracy. Also, because there is no color wheel and no micro-mirrors, reliability and lifespan are increased, and noise and warm-up time are decreased as compared to DLP.
Also, because of the nature of the technology, the projection screens for LCoS use a much smaller grain structure, and present a very natural appearance as compared to DLP (or LCD for that matter, which can appear to be TOO sharp, and actually worsen the appearance of film based source programming).
Specifications on top end LCoS TVs are comparable to the highest end plasma and LCD TVs, with typical contrast ratios in excess of 3000:1, and contrast ratios as high as 10,000:1 when using auto iris controls to optimize black levels. LCoS sets can have brightness levels that are naturally variable (as film is, and direct view LCDs are not) from .01 foot-lamberts to over 100fl (over 10,000 to one); with neutral D6500K images from 27fl to 48fl (the video reference standard is 30fl), depending on adjustments.
Most LCDs are naturally much brighter than 30fl at D6500K neutral gray (as in 45fl or more). This looks better in a brightly lit room, especially under fluorescent light; but it can cause eyestrain and oversaturation of colors in darkness. Also, most LCDs can't produce an image as dark as .01fl, nor as bright as 100fl, because they are transmissive. In a direct view LCD, the light from the backlight is transmitted through the pixels; and light output is dependent on just how opaque, or how transparent, the pixels can get.
This is why direct view LCDs still can't quite show true black as well as other technologies can (though this is changing with dynamic LED based backlighting). They shine a VERY bright backlight through the screen to ensure bright whites and colors, but then depend on that same screen they shine through for colors, to block the light out completely for blacks.
Obviously, this is somewhat imperfect, with transmissive technologies typically having black levels 3 to 5 times higher than reflective technologies; which themselves are 5 to 10 times higher (at least within measuring tolerance) than direct emission technologies such as CRT. Of course these are all still very low levels of light; but in scenes which have bright whites and dark blacks simultaneously, the difference can be obvious.
LCoS uses a reflected light technology, where the projector lamp is bounced off the microdisplay chip to produce the image; and therefore the darkness of blacks isn't dependent on the light-blocking ability of a semi-translucent pixel, but rather on the lack of light reflecting off a black pixel. It still doesn't produce as true a black as a reference grade CRT (which have contrast ratios too high for the sensitivity of most measuring equipment) or the best plasmas (which have typical contrast ratios as high as 15,000:1); but it's pretty close.
You do lose some of the saturation and brightness of the very brightest colors as compared to the best plasmas and LCDs; but this reflective display results in a more natural rendering; because the light is delivered to your eyes in a way closer to that of natural vision (where light is reflected off objects, not transmitted through them, or emitted from them). LCDs and plasmas can often have a "hyper-reality" look to them, because of that transmissive image production; where it feels like the picture is actually being projected into your eyes directly.
Though... some people actually prefer that hyper-reality look. It can produce an almost 3D effect, and when watching high end animation (try Disney's "Cars" for an example), the impact is truly spectacular.
At this point, I decided I wanted to take a look. I visited a couple of Best Buys, and Frys; and saw the LCDs, plasma, DLPs, and LCoS sets I was generally interested in; both in bright floor environments, and in darkened viewing rooms.
There really is no substitute for doing this by the way. You can never really tell what a set is going to look like until you view it in variable lighting and viewing angle conditions.
Although I wasn't able to adjust the picture settings on any of the TVs I looked at (typically floor demo TVs are adjusted to ridiculously overbright, oversaturated, and oversharpened modes; because they make people say "ooooh pretty" on the showroom floor) I definitely got a good idea of the picture characteristics of the technologies, and specific models I was interested in.
What I found actually surprised me.
I went into this thinking I would greatly prefer the LCDs and plasmas over the rear-projection TVs; and, for the DLPs, I was correct (DLPs all look slightly fuzzy and grainy to me, and tend to have poor color saturation to my mind); but the HD-ILA and SXRD sets I looked at were damn good.
Initially, I preferred the bright static images produced by the LCDs and plasmas on the brightly lit sales floor. Once I looked at football, and darker movies, in a darkened viewing room, however, my preferences changed; I thought the LCoS sets were producing a significantly better image than the LCDs, and all but the most expensive (as in well over $5000) plasmas.
And of course, although they were significantly more expensive than DLPs of the same size, the LCoS sets were all several hundred to over $1000 cheaper than equivalent LCD or plasma options of similar quality.
So it was back again to research; this time looking at specific models.
Now size....
The biggest mistake people make with HDTVs is to buy too small; because they’re thinking of what is a comfortable viewing distance and angle for their old standard definition 4:3 interlaced TV.
There are two ways to calculate optimum viewing distance. There’s the average angle rule of thumb, and there’s the method based on visual acuity calculations.
Using the average angle rule, with a 16:9 HDTV, optimal viewing distance is about 1.9 times the diagonal screen size; with the generally acceptable range being from 1.5 to 2.5 times the diagonal screen measurement.
Initially, the wife was ADAMANT that we buy something between 42” and 46”. She was absolutely convinced that anything larger would be “way too big”; because a 42” 16:9 TV looked “about the same size” as our 32” 4:3 TV.
I then showed her what HDTVs looked like in viewing environments, especially as compared to our current 32” 4:3 SDTV; and she started to listen to what I was saying, rather than her idea of what “huge” was.
Our living room is 28ft by 14ft, with a 9ft distance between our television's position and our primary seating positions. 9ft is 108", so to keep that 1.5-2.5 screen size ratio, anything from 44" to 72" would be acceptable, and the ideal would be 57".
Based on this rough calculation, I decided to look for televisions between 46” and 61”, with the ideal range being 52"-58".
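If you want to run that rule of thumb against your own room, here's a quick throwaway Python sketch (nothing official; just the 1.5 / 1.9 / 2.5 multipliers from above, with our 9 foot seating distance plugged in):

# Rough "average angle" rule of thumb for a 16:9 HDTV.
# Diagonal screen size should fall between 1/2.5 and 1/1.5 of the
# viewing distance, with roughly 1/1.9 being the sweet spot.

def angle_rule_sizes(viewing_distance_in):
    """Return (smallest, ideal, largest) acceptable diagonal, in inches."""
    return (viewing_distance_in / 2.5,
            viewing_distance_in / 1.9,
            viewing_distance_in / 1.5)

smallest, ideal, largest = angle_rule_sizes(9 * 12)  # our 9 ft / 108" distance
print(f'{smallest:.0f}" to {largest:.0f}", ideal about {ideal:.0f}"')
# prints: 43" to 72", ideal about 57"  (I rounded the low end up to 44" above)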
For a little more precision, you can use the visual acuity rule. To do so, take the actual height of the screen, and multiply it by 3.2 to get the optimal viewing distance for 1080p, or 4.8 for the optimal viewing distance for 720p (optimal for 480i is about 6.4).
Why height? Mostly because it’s what we perceive to be the most important dimension of an image. We judge the scale of an image based on how tall it is, not how wide. If you take a picture of a man six feet tall, and keep him centered in the frame, you can extend the sides of that frame out as far as you want, and the man will still “look normal”. Make the frame 30 feet wide, and you will still perceive the man’s size as “normal”. Take that same picture and extend the top and bottom, leaving the man the same, and you will perceive the man as looking smaller and smaller. It’s just the way we humans are visually wired.
When we were all on tube TVs, the most popular “living room” sizes were 27” and 32”. A lot of us still have 32” TVs sitting in our main viewing area. The actual screen height of a 32” 4:3 TV is about 20”, for an optimal viewing distance in SD of 128”, or about 10.7 feet; which is typical of many living room viewing distances.
With a 16:9 TV though, as I said, our perception of size is different. A 16:9 TV displaying an SD signal would need to be 42” diagonal to “look as big” as a 32” 4:3 TV, and to have the same optimum viewing distance at the same resolution.
Of course, as you increase the resolution, the optimal viewing distance goes down. Since most of us don’t much want to change our viewing position (living rooms not being notable for their mobile walls and such), that means going up in screen size.
We were very happy with our viewing position at 9ft, on our current TV; and we don't want to move our furniture; so we needed to optimize for that distance.
The actual screen height of a 56” 16:9 TV is 27.45”; so the optimal viewing distance for 1080p/i content is 88” (7’4"), and the optimal viewing distance for 720p is 132” (11'). If we average the two, we come up with 110” or…
Well, whadya know… just a bit over 9 feet.
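And since I went to the trouble of working that out by hand, here's a quick Python sketch of the acuity math too (my own scratch calculation, assuming a 16:9 panel, plain old Pythagoras for the screen height, and the 3.2 / 4.8 / 6.4 multipliers from above):

# Visual acuity based viewing distances for a 16:9 set.

def screen_height(diagonal_in, aspect=(16, 9)):
    """Screen height from its diagonal measurement, via Pythagoras."""
    w, h = aspect
    return diagonal_in * h / (w ** 2 + h ** 2) ** 0.5

def viewing_distances(diagonal_in):
    """Optimal viewing distance (inches) for each source resolution."""
    h = screen_height(diagonal_in)
    return {"1080p": h * 3.2, "720p": h * 4.8, "480i": h * 6.4}

d = viewing_distances(56)   # a 56" 16:9 set
print(round(screen_height(56), 2))            # 27.45  (screen height, inches)
print({k: round(v) for k, v in d.items()})    # {'1080p': 88, '720p': 132, '480i': 176}
print(round((d["1080p"] + d["720p"]) / 2))    # 110  (the 1080p/720p average)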
So, now I had a specific size range I was interested in, 46"-61" with an ideal of 52" to 58"; and I had a set of minimum requirements:
1. True 1080p with 1080p input over HDMI, and native 480i/p and 720p input support
2. At least two HDMI ports
3. Good quality of upscaling, de-interlacing, and 3:2 pulldown
4. Auto iris, and manual iris and gamma controls (giving a dynamic contrast ratio of 10,000:1)
5. D6500K color corrected lighting with a native contrast ratio of at least 3000:1
6. PC input preferred
7. A narrow bezel design, with unobtrusive speakers, black preferred
8. Useful color correction and calibration controls
9. A large variety of inputs
10. A CableCARD slot
This list of requirements left me with about a dozen models to sort through; and that's when I found exactly what I wanted.
The reviews on the JVC TheaterPro series were all uniformly excellent. The three models in the series (56", 61", and 70") were essentially identical but for screen size; and in every review I found, they were either the top-rated, or second-from-top-rated, televisions in their size class (though the reviewers all noted that the sets needed to be calibrated out of the box for best color performance).
The TheaterPros were actually originally designed as reference monitors, and were sold under JVC's pro studio line starting in 2004. In 2006, they added the frilly consumer features, and announced them as consumer models, the FH96 and FN96 (the FH, which I've purchased, has a black bezel and an RS232 serial port for professional home automation and home theater integration; the FN does not). The updated 97 models were announced in late 2006 and introduced in early 2007; and most of the major AV mags rated them as their best pick in class, or at least in their top five in class for the year.
What really sealed the deal for me though?
I started looking for prices. The 56" FH97s were announced at $3299 list, and went down to $2699 street. Then, this past September, JVC introduced a new 58" ultraslim (only 11" thick, and wall-mountable) model that was otherwise technically identical to the 56". At that point, they decided to discontinue the 56", and reduced the MSRP to $2699 for closeout.
Well, when they discontinued it, street prices dropped to $1800 or so; but I managed to find one retailer (a top rated retailer on bizrate, paypal seller ratings, and resellerratings) blowing it out for $1299 with free shipping.
At that point, I was waiting for my contract renewal to come through to pull the trigger; and I was just hoping they didn't sell out before it happened. Last Thursday was finally the day, and it turned out none too soon, because it was the very last unopened one they had in stock (never take a floor model LCD, plasma, or projection TV. They lose significant life out on the floor).
In theory, the set will be here this Friday; though I haven't been able to confirm a dropoff time yet. I can't wait to see the Patriots crush the Jets in 56" 1080p glory.
Just two final steps... a new receiver (my current system doesn't switch HDMI), and new speakers to go with it (of course those final steps are going to be more expensive than everything else combined).