No current graphics cards can support HD discs' DRM
Posted: 2006-02-13 02:17am
Due to the DRM on next-gen HD discs, no custom-built computer will be able to play them at full resolution. Nvidia and ATI have been swearing up and down since the DirectX 8 era that their video cards would support HDCP. As it turns out, this wasn't true. The GPUs can support it, but the boards themselves have to handle the HDCP DRM, and none of them can.
Article:
[Useless thread title edited - DW]

Firing Squad wrote:
Introduction
You want to know a secret? None of the current ATI or NVIDIA graphics cards will support the full capabilities of Windows Vista.
But let’s start from the beginning. This story starts with my upcoming LCD Monitor Round-Up. As you know, a good monitor should last several years and outlive every other component in your PC, other than perhaps a keyboard or a mouse. So, when it came time to do another review of LCD monitors, my attention turned towards “Windows Vista-ready” monitors: those with HDCP. After all, it makes no sense to recommend a monitor that will go obsolete in just a few months.
At the time I started my article, there were only 10 PC monitors with DVI/HDCP support (we’re reviewing 5 of them). I was disappointed, but what was surprising is that many of these monitor manufacturers weren’t advertising their HDCP support. For monitors, HDCP support is the most important feature for having a “future proof” solution.
What is HDCP?
HDCP stands for High-bandwidth Digital Content Protection and is an Intel-initiated program that was developed with Silicon Image. This content protection system is mandatory for high-definition playback of HD-DVD or Blu-Ray discs. If you want to watch movies at 1920x1080, your system will need to support HDCP. If you don't have HDCP support, you'll only get a quarter of the resolution. A 75% loss in pixel count is a pretty big deal – wouldn't you be angry if your car was advertised as doing 16 mpg, and you only got 4 mpg? Or if you bought a 2 GHz CPU and found out that it only ran at 500 MHz?
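To make the penalty concrete, here's a quick back-of-the-envelope calculation. The exact constrained output mode can vary by player, but "a quarter of the resolution" corresponds to halving each dimension of a 1920x1080 frame:

```python
# Pixel count at full HD versus the constrained (non-HDCP) output.
# A quarter of the pixels means each dimension is halved.
full_w, full_h = 1920, 1080
down_w, down_h = full_w // 2, full_h // 2   # 960 x 540

full_pixels = full_w * full_h               # 2,073,600 pixels
down_pixels = down_w * down_h               # 518,400 pixels

loss = 1 - down_pixels / full_pixels
print(f"{down_w}x{down_h} keeps only {down_pixels:,} of {full_pixels:,} pixels "
      f"({loss:.0%} lost)")
```

That works out to 960x540 output – exactly the 75% loss described above.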
As part of the Windows-Vista Ready Monitor article, I was going to publish a list of all of the graphics cards that currently support HDCP. I mean, I remember GPUs dating as far back as the Radeon 8500 that had boasted of HDCP support.
Turns out, we were all deceived.
GPU Support for HDCP
Although ATI has had “HDCP support” in their GPUs since the Radeon 8500, and NVIDIA has had “HDCP support” in their GPUs since the GeForce FX5700, it turns out that things are more complicated -- just because the GPU itself supports HDCP doesn’t mean that the graphics card can output a DVI/HDCP compliant stream. There needs to be additional support at the board level, which includes licensing the HDCP decoding keys from the Digital Content Protection, LLC (a spin-off corporation within the walls of Intel).
After some investigation, Brandon and I determined that there is no shipping retail add-in board with HDCP decoding keys. Simply put, none of the AGP or PCI-E graphics cards that you can buy today support HDCP.
I did not believe this at first. Surely, I was misinterpreting the content of the emails I was receiving. After all, everyone is hyping up H.264 support and HD-DVD/Blu-Ray playback. When I go to http://www.ati.com/products/RadeonX1900/specs.html I see HDCP support listed. Am I supposed to know that the board doesn't support it because I can go to http://www.ati.com/products/radeonx1900 ... specs.html and see that HDCP is omitted? If that's the case, am I supposed to know that the board has "48 shader processors" when that's only listed on the GPU specifications page?
What we’ve confirmed
We’ve been able to confirm that none of the Built-by-ATI Radeons support HDCP. If you’ve just spent $1000 on a pair of Radeon X1900 XT graphics cards expecting to be able to playback HD-DVD or Blu-Ray movies at 1920x1080 resolution in the future, you’ve just wasted your money.
NVIDIA, being a GPU manufacturer, was unable to discuss the plans of board manufacturers. We contacted all six of NVIDIA's Tier-1 board partners. None of the GeForce 6 or 7 video cards available on the market, including the most recently released GeForce 7800GS, have HDCP support. So if you just spent $1500 on a pair of 7800GTX 512MB cards expecting to be able to play 1920x1080 HD-DVD or Blu-Ray movies in the future, you've just wasted your money.
How can these companies be so oblivious? Playing Devil's Advocate, I thought to myself that maybe, just maybe, by the time Windows Vista comes out, most people are going to upgrade their GPU anyway. If HDCP support was very expensive, then paying for the HDCP license now would be like paying for something you don't use. So I dug around for HDCP licensing costs. Turns out that the answer is available at the HDMI website. HDCP licensing requires a $15,000 annual fee and a per-device fee of $0.005, i.e. a fraction of a cent. That's not too expensive. There goes that argument.
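The licensing math is easy to sanity-check. The production volumes below are hypothetical figures for illustration, not numbers from any manufacturer, but they show how quickly the flat fee amortizes to almost nothing per card:

```python
# HDCP licensing cost per card, using the fees cited above.
ANNUAL_FEE = 15_000.00    # $15,000/year flat fee
PER_DEVICE_FEE = 0.005    # half a cent per device

def cost_per_card(cards_per_year: int) -> float:
    """Total HDCP licensing cost amortized over a year's production."""
    return ANNUAL_FEE / cards_per_year + PER_DEVICE_FEE

# Hypothetical yearly volumes for a board partner:
for volume in (100_000, 1_000_000, 10_000_000):
    print(f"{volume:>10,} cards/year -> ${cost_per_card(volume):.4f} per card")
```

Even at a modest 100,000 cards a year, the all-in cost is about 15.5 cents per card; at a million cards it drops to two cents.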
Upgrade path for HDCP?
Video cards are the only components in a PC that have gone up in price over time. Yet manufacturers are trying to sell video cards that don't support HDCP? The technology has been around for years. Microsoft made it public in March 2005 that HDCP would be required for Windows Vista – certainly the video card manufacturers were given this info before the public was. Moreover, what about companies that are already paying the $15,000 annual fee because they produce HDCP-compliant products for televisions?
Despite my discovery that HDCP licensing is fairly cheap, I'm still trying to find an answer. There must be a silver lining somewhere. Maybe, just maybe, existing cards can be retrofitted for HDCP support. Maybe it's simply a matter of a BIOS flash where each board gets its own serial number. If that were true, the worst-case scenario would be that customers would pay a few bucks for the HDCP license.
Turns out that this was also wishful thinking.
An ATI representative said: “People will not be able to turn on HDCP through a software patch since the HDCP keys need to be present during the manufacturing. We are rolling out HDCP through OEMs at this time but we have not finalized our retail plans yet.”
As I pressed for more information about potential retail plans (i.e. trade-in programs, whether existing boards already have traces for the HDCP hardware where it can be plugged in), I got only a vague response:
“We cannot get into more detail at this time, as any further discussion would get into our trade secrets. However, we do promise to give you a full update on our retail plans once they are finalized.”
I’m not going to speculate on whether ATI’s reticence is because they’re trying to downplay a big fiasco, or if they’re trying to keep their super generous solution secret to throw off the competition. There’s actually no way to know.
Well, what about NVIDIA? They were actually very direct: “The boards themselves must be designed with an extra chip when the board is manufactured. The extra chip stores a crypto key, and you cannot retrofit an existing board after the board is produced.”
Wow. You can pick your favorite expletive.
The blame game
Blame Canada?
As ATI is a GPU and board manufacturer, I’m disappointed that Built-by-ATI video cards lack HDCP support. Think about it. The GPU engineers are smart enough to know that their GPUs need to support HDCP, but their board engineers aren’t? Is it even possible to build a GPU without thinking about the board that has to go along with it? ATI is extremely reticent to give us any more details about “Retail Plans.” Maybe ATI owners will get lucky, and ATI will have some sort of free upgrade program. Maybe ATI owners will get shafted, and buyers of X1900XT’s are going to find themselves with a video card that cannot play HD-DVD or Blu-Ray at 1920x1080. Who knows?
Blame Santa Clara?
What about NVIDIA? Personally, I think they have the least blood on their hands, for two reasons. One, they aren't a board manufacturer. That excuse alone wouldn't be good enough for me, though.
What really gets them off the hook is that NVIDIA has been offering their board manufacturing partners designs with HDCP support since May 2005. Likewise, NVIDIA has actually shipped HDCP-enabled GeForce 6200 and 6600’s in Sony Media Center PCs. Those boards just aren’t manufactured at retail. In retrospect, they did their part. It was the board manufacturers who failed us. I don’t need to name names, because they ALL failed us.
Blame the other Santa Clara company?
HDCP is the brain-child of Intel, and now belongs to a spin-off company, Digital Content Protection, LLC. They’re the ones who profit off all of the licensing fees. If HDCP licensing were cheaper, might we have seen more PC products with HDCP support? Possibly. It still seems to me that HDCP has relatively benign pricing when it comes to licensing. It's half a cent per item. If you compare that to licensing fees for HDMI, you'll see that while both have the same $15,000 annual fee, HDMI licensing is 4 cents/per unit (if you use the maximum discount as an example). Should we blame Intel for creating HDCP in the first place? I don’t think so. HDCP was a technology made in response to Hollywood’s requests. Blue laser technology can only go so far without content.
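The fee comparison can be made concrete with a quick calculation. The per-unit rates are the ones quoted above; the 1,000,000-unit yearly volume is a hypothetical figure chosen for illustration:

```python
# Annual licensing cost for HDCP vs. HDMI at a hypothetical production volume.
ANNUAL_FEE = 15_000           # both schemes charge the same $15,000/year flat fee
PER_UNIT = {
    "HDCP": 0.005,            # half a cent per unit
    "HDMI": 0.04,             # 4 cents per unit, at the maximum volume discount
}

units = 1_000_000             # hypothetical yearly volume, for illustration
for scheme, fee in PER_UNIT.items():
    total = ANNUAL_FEE + fee * units
    print(f"{scheme}: ${total:,.0f}/year for {units:,} units")
```

At that volume HDCP's per-unit fees add $5,000 a year versus $40,000 for HDMI, which is why HDCP's pricing reads as relatively benign.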
Blame Hollywood?
HDCP is an artificial requirement – there's no technical reason why HD-DVD or Blu-Ray needs content protection. Although the movie industry is among the wealthiest of all industries, Hollywood has made things tougher in its paranoia about piracy. Can we blame Hollywood for demanding HDCP? Maybe a little bit, but they're not responsible for this current fiasco. Movie studios have done their fair part to make high-definition home video a possibility. From the get-go, Hollywood made it clear that content protection was going to be necessary for high-definition video, and they gave the electronics industry ample warning. HD-DVD and Blu-Ray are coming in 2006. Television manufacturers have been putting HDCP into HDTVs as far back as 2002. While Hollywood is certainly responsible for pressuring Microsoft into requiring HDCP for Windows Vista, they set their ground rules early on.
Is it our fault?
Think about it. If consumers and reviewers didn't use the terms GPU and graphics card interchangeably, this wouldn't be a problem. When it was disclosed that Microsoft required HDCP for high-definition HD-DVD or Blu-Ray playback in Windows Vista, everyone turned their attention to monitors, assuming that GPUs would support it. We all know what happens when you assume. Likewise, why didn't reviewers investigate whether features in a GPU actually made it to the board level? Most importantly, we as consumers never clamored for HDCP support.
So in a way, even consumers are at fault, right? No way. Only the truly twisted would claim that the victims brought it upon themselves. Do any of us “ask” for Direct3D or OpenGL support? It’s a given. Consumers never demanded HDCP support because it was already thought to be there.
Alan's thoughts
This is a tough situation. The PC world simply isn't ready for high-definition video playback via HD-DVD or Blu-Ray. The failures occurred at so many different levels. I've probably burned a few bridges in this article, and I probably won't be reviewing any video cards in the near future. Nonetheless, this was a train that had already left the station. Keeping quiet about the problem wouldn't have stopped the customer outrage when Windows Vista was released. The solution to this problem isn't technical. It's political. I hope that board manufacturers will own up to the challenge and explain their actions to their customers.
Brandon’s thoughts
Without a doubt, this is huge, startling news. As much as ATI and NVIDIA have been promoting H.264 decoding with their latest GPUs, it’s pretty shocking to see that apparently none of the shipping retail cards on the market have been built to take advantage of it. To add insult to injury, it appears that a line of Sony GeForce 6200s and 6600s offer HDCP support, yet the latest high-end GeForce 7800 GTX cards don’t. How’s that for irony?
While some of you may not plan on upgrading to Vista at the end of this year, this is eventually going to affect you if you ever planned on watching hi-def movies on your PC in the future. Microsoft will eventually end support for Windows XP; already, their Games Division is planning Vista-exclusive titles such as Halo 2. It will only be a matter of time before other software developers follow suit, forcing anyone who’s remotely interested in gaming to upgrade to Windows Vista.
Anyone with a GeForce 6/7 or Radeon X1K card who was planning on buying a BD-ROM or HD-DVD drive later this year for their PC may want to hold off on that purchase. Quite frankly, this article should affect the purchasing decisions of potentially anyone in the market for a new PC or graphics card right now who's even remotely interested in watching hi-def movies on their PC sometime in the future.