What Moron Designed the HDMI interface and why?


d3fu5i0n

Senior member
Feb 15, 2011
305
0
0
Don't say 'OP fail'. We've all made embarrassing mistakes [if we have], and it doesn't feel great being picked on afterwards. The nicest way to do it is to share each other's knowledge and advise the OP and others of the correct information in a friendly manner. ^_^
 

Soundmanred

Lifer
Oct 26, 2006
10,784
6
81
Don't say 'OP fail'. We've all made embarrassing mistakes [if we have], and it doesn't feel great being picked on afterwards. The nicest way to do it is to share each other's knowledge and advise the OP and others of the correct information in a friendly manner. ^_^

The OP didn't even seem to consider that it might not have been the newfangled technology that was messed up, and he made a lot of completely wrong statements.
Don't present statements as fact when you don't know what you're doing, and don't call the designers "morons" when it's the tech you didn't understand or know how to use.
Don't make false statements and pass them off as facts when you have no idea what you're talking about, and ridicule won't be an issue.
We have enough Tweakboys around here already.
 
Last edited:

Patranus

Diamond Member
Apr 15, 2007
9,280
0
0
You just trolling or what?
I, as well as several others, explained above: there are no standard resolutions for HDMI cables. There are standard industry resolutions defined for sources and sinks for compatibility reasons, but those exist for VGA and DVI too (640x480, 800x600, etc.). It is your TV that doesn't want to receive 1366x768; that has nothing to do with the HDMI protocol. I have a 22" 1680x1050 monitor. Why can't I send 1080p to it over DVI? I mean, DVI supports anything, right?

From HDMI 1.3 spec, section 6, first sentence:
"HDMI allows any video format timing to be transmitted and displayed.

http://www.dybkowski.comule.com/download/hdmi/hdmi_spec_1.3_gm1.pdf

Sure, it *can* do certain things if you move beyond the standard, but that is the exception to the rule.

Look at page 85 of your linked document.
It lists the resolutions defined in the HDMI spec.
1366x768 is nowhere on the list.

Hell, 1366x768 is even missing from the secondary supported resolution list on page 87.

Ya. Sure looks like I am trolling.
 
Last edited:

iCyborg

Golden Member
Aug 8, 2008
1,327
52
91
Once again: these are industry standards for compatibility reasons between sinks and sources. Does this sentence from section 6.2 of the spec mean anything to you: "In order to provide maximum compatibility between video Sources and Sinks, specific minimum requirements have been specified for Sources and Sinks."?
And I said that other protocols have these too; you are wrong that DVI/VGA don't have them (quote me the part of the spec which says that devices must support all the resolutions if you continue to claim otherwise). The only mandated resolution for DVI (and VGA) is 640x480. So how is DVI/VGA better than HDMI if neither lists 1366x768 as a required supported resolution?

And you didn't answer my question: why can't I send 1080p to my 22" screen over DVI if DVI can send anything? Or could it be that what resolution can be sent over DVI depends on particular sources and screens, and not so much on the DVI protocol itself?
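If you want to see where a number like 1366x768 actually comes from, ask the monitor itself: the sink advertises the modes it accepts in its EDID and the source picks from that list; the cable doesn't care. A minimal sketch (Python, assuming a Linux box that exposes the EDID under /sys/class/drm; the connector name "card0-HDMI-A-1" is illustrative and will differ per machine):

Code:
# Dump the modes a display advertises in its EDID.
# Sketch only: the sysfs path below is an example, not a universal location.

EDID_PATH = "/sys/class/drm/card0-HDMI-A-1/edid"   # illustrative path

ASPECT = {0: (16, 10), 1: (4, 3), 2: (5, 4), 3: (16, 9)}

def standard_timings(edid):
    """Decode the eight 'standard timing' slots (EDID bytes 38..53)."""
    for i in range(38, 54, 2):
        b0, b1 = edid[i], edid[i + 1]
        if (b0, b1) == (0x01, 0x01):          # slot unused
            continue
        h = (b0 + 31) * 8                     # horizontal active pixels
        num, den = ASPECT[b1 >> 6]            # aspect-ratio field
        v = h * den // num
        refresh = (b1 & 0x3F) + 60            # refresh rate in Hz
        yield h, v, refresh

def preferred_mode(edid):
    """Decode the first detailed timing descriptor (the preferred mode)."""
    d = edid[54:72]
    h = d[2] | ((d[4] & 0xF0) << 4)
    v = d[5] | ((d[7] & 0xF0) << 4)
    return h, v

if __name__ == "__main__":
    with open(EDID_PATH, "rb") as f:
        edid = f.read(128)                    # base EDID block
    print("preferred:", preferred_mode(edid))
    for mode in standard_timings(edid):
        print("standard :", mode)

A 1366x768 panel typically advertises that mode in its detailed timing descriptor; it can't even be expressed in the standard-timing encoding, which works in multiples of 8 pixels. Either way, the list comes from the sink, not from the HDMI cable or protocol.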
 

Cogman

Lifer
Sep 19, 2000
10,278
126
106
I disagree. Other than the mandated HDCP crap, HDMI is good. But "far superior to VGA"? I don't think so. About the only thing I really see it has is audio added to the cable and a digital signal, neither of which is a big advantage. You can buy VGA cables with audio cables molded into them. Digital signal... eh, I don't think it makes that much of a difference, if any at all. I remember when DVI came out and comparisons were being made against VGA. I don't remember any quality-related issues, just that it was supposed to take out a level of signal processing.
From an engineering standpoint, it is FAR superior to VGA. VGA was designed with CRTs in mind. HDMI and DVI, on the other hand, started from "Hey, CRTs are dead/dying, why not output something LCDs can handle natively?" That translates into much cheaper and smaller display controllers for monitors that support it, and the ability to VERY easily place a pixel wherever you like on the screen.

The next thing we need to change is the way screens refresh. We waste a lot of energy and processing power because current display tech constantly spits out information at a fixed refresh rate. That is a ton of wasted bandwidth, especially for computer screens which, for a VERY large portion of their life, display the exact same thing over and over and over again.
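Rough numbers for how much gets re-sent (a back-of-the-envelope sketch for 1080p at 60 Hz and 24 bits per pixel, ignoring blanking and encoding overhead):

Code:
# How much pixel data a static 1080p60 desktop re-transmits every second.
width, height, bpp, refresh = 1920, 1080, 24, 60   # 24-bit colour at 60 Hz

bits_per_frame = width * height * bpp              # ~49.8 Mbit per frame
bits_per_second = bits_per_frame * refresh         # ~2.99 Gbit/s of pixel data

print(f"per frame : {bits_per_frame / 1e6:.1f} Mbit")
print(f"per second: {bits_per_second / 1e9:.2f} Gbit/s")
# If the desktop hasn't changed, essentially all ~3 Gbit/s of that is a
# redundant retransmission of the previous frame.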
 

iCyborg

Golden Member
Aug 8, 2008
1,327
52
91
The main problem with VGA on digital flat-panel displays is that the GPU must have a DAC to convert to analog and the panel must convert the signal back to digital; a waste of effort on both sides. Most DVI connectors still have analog pins, so you can drive VGA monitors with adapters. Same for DP: you can drive anything with DP.

As for screen refresh, this is actually being done. I think some screens already support it by having a frame buffer on their side, and mobile GPUs will soon be able to use it to power down some HW blocks when they (dynamically) detect that the screen hasn't been changing for some number of frames. It's probably not very useful for desktops.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
I was with RCA/GE around the time HDMI was being worked out by engineers. First let me say that what engineers think is best usually doesn't make it past marketing. Marketing is all about appearances and profit. HDMI as you see it now is not what it was originally conceived to be. The original idea was a remake of the DVI connector with a few pins added for audio support. That was it; it was well proven to work, the connectors were solid, and more importantly it would work well with the home theater crowd, who are the first to buy new gear. All the chips existed, the hardware was in place.

In steps marketing.
USB is the hot thing: look, you can just plug and play, we have to have that in our new connector. Engineers tell them that HDMI cables need decent shielding and that hanging that much weight on something so short will strain the connectors. Marketing tells them: so what, it will outlast the warranty.

In steps the MPAA, screaming about audio and video on different wires and how, if we would multiplex the streams and keep the audio and video together, we could protect it with just one system and stop those pesky movie soundtrack thefts, all 5 of them. Mixing audio and video on the same stream increased the cost of HDMI tenfold, but the MPAA was pulling the strings. It also made it much more difficult to send audio through a device without the video.

It isn't about what is best; engineers come up with better interfaces all the time that get shot down by people who have no clue and shouldn't be making the decisions.
That is the reason I left that field of work. If you want me to do the work and put my name on it, don't make a dozen changes to suit yourself and then expect me to rubber-stamp it.
 

Cogman

Lifer
Sep 19, 2000
10,278
126
106
I was with RCA/GE around the time HDMI was being worked out by engineers. First let me say that what engineers think is best usually doesn't make it past marketing. Marketing is all about appearances and profit. HDMI as you see it now is not what it was originally conceived to be. The original idea was a remake of the DVI connector with a few pins added for audio support. That was it; it was well proven to work, the connectors were solid, and more importantly it would work well with the home theater crowd, who are the first to buy new gear. All the chips existed, the hardware was in place.

In steps marketing.
USB is the hot thing: look, you can just plug and play, we have to have that in our new connector. Engineers tell them that HDMI cables need decent shielding and that hanging that much weight on something so short will strain the connectors. Marketing tells them: so what, it will outlast the warranty.

In steps the MPAA, screaming about audio and video on different wires and how, if we would multiplex the streams and keep the audio and video together, we could protect it with just one system and stop those pesky movie soundtrack thefts, all 5 of them. Mixing audio and video on the same stream increased the cost of HDMI tenfold, but the MPAA was pulling the strings. It also made it much more difficult to send audio through a device without the video.

It isn't about what is best; engineers come up with better interfaces all the time that get shot down by people who have no clue and shouldn't be making the decisions.
That is the reason I left that field of work. If you want me to do the work and put my name on it, don't make a dozen changes to suit yourself and then expect me to rubber-stamp it.

Interesting stuff. A company I'm working for now has a product they are selling that, no joke, got a worthless LCD screen slapped on it because marketing figured people really wanted to be able to come and look at the worthless LCD screen. Luckily, that is about the only thing (that I know of) which marketing has demanded.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
As for screen refresh, this is actually being done. I think some screens already support it by having a frame buffer on their side, and mobile GPUs will soon be able to use it to power down some HW blocks when they (dynamically) detect that the screen hasn't been changing for some number of frames. It's probably not very useful for desktops.


It has been in display controller chipsets for about 5 years now, but it actually slows down displaying images. It takes 8-16 clocks to write to a frame buffer, compare the contents, output the contents to a new buffer, and then write that to the row/column driver to display the image. Streaming the data directly to the row/column driver takes at most 4 clock cycles: write to buffer, write to driver. Some chips use frame buffering to apply effects to the whole image, like color or contrast, but most now apply the changes to each pixel by ADDing/ORing the bits as they are written to the driver.

I try not to think of video on an LCD as being drawn in frames, but as being drawn in bits. The new-frame bits are just a signal telling the controller to go back to the top corner for the next sequence of bits. A fun thing to do is to tell the controller to ignore the new-frame bits. You get some really interesting video when you do that, depending on when you turned the bits off.
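To illustrate the difference (a toy sketch in Python, not actual controller firmware; the gain and offset values are made up), here's the same brightness/contrast tweak applied per pixel as the data streams to the driver versus after staging the whole frame in a buffer:

Code:
# Toy illustration: per-pixel adjustment applied while streaming to the
# panel driver vs. after buffering the whole frame first.

def stream_adjust(pixels, gain=1.1, offset=8):
    """Yield adjusted 8-bit pixel values one at a time, as they arrive."""
    for p in pixels:
        v = int(p * gain) + offset        # contrast (gain) then brightness (offset)
        yield max(0, min(255, v))         # clamp to the 8-bit range

def buffered_adjust(pixels, gain=1.1, offset=8):
    """Same math, but only after the whole frame has been captured."""
    frame = list(pixels)                  # the extra store/compare/copy step
    return [max(0, min(255, int(p * gain) + offset)) for p in frame]

if __name__ == "__main__":
    scanline = [0, 64, 128, 192, 255]     # a pretend 5-pixel scanline
    print(list(stream_adjust(scanline)))
    print(buffered_adjust(scanline))

Same result either way; the streaming version just never has to stop and hold a full frame.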
 

iCyborg

Golden Member
Aug 8, 2008
1,327
52
91
HDMI as you see it now is not what it was originally conceived to be. The original idea was a remake of the DVI connector with a few pins added for audio support. That was it; it was well proven to work, the connectors were solid, and more importantly it would work well with the home theater crowd, who are the first to buy new gear. All the chips existed, the hardware was in place.
Actually, there are no additional pins: HDMI and DVI are completely electrically equivalent, the same pins just rearranged into a smaller physical package. Audio is transmitted between two horizontal lines, during something called the front and back porch, where in DVI blank pixels are being sent.
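Rough numbers on how much room the blanking intervals leave (a sketch using the standard 1080p60 timing of 2200x1125 total pixels at 60 Hz; the real data-island packing has more overhead than this):

Code:
# Pixel periods per second that are "blank" at 1080p60 and therefore
# available for HDMI data islands (audio samples, InfoFrames, etc.).
total_h, total_v = 2200, 1125        # full 1080p60 timing incl. blanking
active_h, active_v = 1920, 1080      # visible pixels
refresh = 60

blank_periods = (total_h * total_v - active_h * active_v) * refresh
audio_bits = 8 * 192_000 * 24        # worst case: 8-channel 192 kHz 24-bit PCM

print(f"blanking pixel periods/s: {blank_periods / 1e6:.1f} M")    # ~24.1 M
print(f"audio payload needed    : {audio_bits / 1e6:.1f} Mbit/s")  # ~36.9 Mbit/s
# Each blanking period carries several bits across the three TMDS channels,
# so even the worst-case PCM audio fits comfortably alongside the video.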

For the self-refresh, you seem to be talking about the screen doing all the work. I think the idea now is to push it to the GPU side so that it doesn't send any pixels in the first place. Since it constructs the image, it can detect more easily whether the frames are changing, and it can just send a special bit on an auxiliary channel to tell the screen to repeat what it already has. It won't do that for 1-2 repeated frames, because powering down/up isn't instantaneous, but for some number of frames it will make sense.
 

Howard

Lifer
Oct 14, 1999
47,989
10
81
Don't say 'OP fail'. We've all made embarrassing mistakes [if we have], and it doesn't feel great being picked on afterwards. The nicest way to do it is to share each other's knowledge and advise the OP and others of the correct information in a friendly manner. ^_^
It takes a hell of a lot of tolerance not to rail on someone who comes out swinging.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Actually, there are no additional pins: HDMI and DVI are completely electrically equivalent, the same pins just rearranged into a smaller physical package. Audio is transmitted between two horizontal lines, during something called the front and back porch, where in DVI blank pixels are being sent.

There are no additional pins because the engineers didn't get approval. The idea was to put additional pins in what was the DVI connector to form a new connector. With the streams separate, users could then make use of the audio data for things like home theater without having to worry about routing the video data. Above all, we wanted to keep the audio and video streams separate.


For the self-refresh, you seem to be talking about the screen doing all the work. I think the idea now is to push it to the GPU side so that it doesn't send any pixels in the first place. Since it constructs the image, it can detect more easily whether the frames are changing, and it can just send a special bit on an auxiliary channel to tell the screen to repeat what it already has. It won't do that for 1-2 repeated frames, because powering down/up isn't instantaneous, but for some number of frames it will make sense.

You are talking about a solution for a problem that doesn't exist. There isn't a bandwidth issue with the video transmission; we can take it to a billion pixels if necessary. Adding hardware to detect whether two frames are different isn't needed unless you want to alter the images in some way. The idea behind DisplayPort is to let the video card take over the tasks that regular LCD displays already do internally. Right now it is
A+B [cable] B-A = display
DisplayPort is A+B [cable] A = display

GPUs have, by design, some things that are identical to those in displays, so cut the cost and save the hardware. Things like contrast, brightness, and color temperature all require hardware in the display right now and can also be done on the GPU. DisplayPort allows that to be done solely on the GPU, which lowers the cost of the display itself.
 

iCyborg

Golden Member
Aug 8, 2008
1,327
52
91
I'm not talking about a bandwidth issue, I'm talking about lower power usage: turning off the parts of the GPU that process and send all these pixels, which consume battery power, when you could just instruct the display to repeat what's in its frame buffer and achieve the same thing. I mentioned earlier that it's more useful for notebooks for this reason.
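The decision logic is simple in principle. A hypothetical sketch of it (the panel object, its method names, and the threshold are all made up for illustration, not any vendor's actual driver code):

Code:
# Hypothetical panel-self-refresh decision loop (illustrative only): after N
# identical frames, tell the panel to repeat its own frame buffer and power
# down the scan-out path; resume as soon as the content changes.
import hashlib

IDLE_FRAMES_BEFORE_PSR = 30   # made-up threshold; entering/exiting PSR isn't free

def frame_digest(frame_bytes):
    return hashlib.sha1(frame_bytes).hexdigest()

def run(frames, panel):
    last, unchanged, in_psr = None, 0, False
    for frame in frames:
        digest = frame_digest(frame)
        if digest == last:
            unchanged += 1
        else:
            unchanged, last = 0, digest
            if in_psr:
                panel.exit_self_refresh()   # content changed: resume scan-out
                in_psr = False
        if not in_psr and unchanged >= IDLE_FRAMES_BEFORE_PSR:
            panel.enter_self_refresh()      # the "repeat what you have" signal
            in_psr = True
        if not in_psr:
            panel.send_frame(frame)         # normal path: push pixels every refresh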
 

Perryg114

Senior member
Jan 22, 2001
767
4
81
Guys, I am not faulting the cable design, just the fact that HDMI from a computer is a royal pain in the ass. I just know that when I hook my TV up over VGA it looks great; when I hook it up over HDMI it looks crappy. Now maybe for watching movies it does not matter. Don't get me started on forcing everyone to use 16:9 monitors for computers when it makes no sense. I am sure that HDMI data rates are better, etc., but why can't they make something that is more universally compatible with computer hardware? If HDMI is mapped to 1080p, then why the overfill (overscan)? Why can't I have a 1:1 pixel map like VGA?

Perry
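The "crappy" part usually isn't HDMI itself; it's the TV overscanning the input and rescaling it. A rough illustration (assuming a typical ~5% overscan; the exact amount varies by set):

Code:
# Why an overscanned desktop looks soft: the TV crops ~5% of the 1920x1080
# input and stretches the remainder back over its full panel, so no pixel
# ends up mapped 1:1. (5% is a typical figure; sets vary.)
src_w, src_h = 1920, 1080
overscan = 0.05                          # fraction of each axis cropped away

shown_w = round(src_w * (1 - overscan))  # ~1824 source pixels survive
shown_h = round(src_h * (1 - overscan))  # ~1026
scale = src_w / shown_w                  # ~1.053x resample back to the panel

print(f"visible source region: {shown_w}x{shown_h}")
print(f"rescale factor       : {scale:.3f}")
# Turning off overscan ("Just Scan", "PC mode", "1:1", etc., depending on the
# TV) or labelling the input as a PC restores the 1:1 pixel map.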
 

Cogman

Lifer
Sep 19, 2000
10,278
126
106
Guys, I am not faulting the cable design, just the fact that HDMI from a computer is a royal pain in the ass. I just know that when I hook my TV up over VGA it looks great; when I hook it up over HDMI it looks crappy. Now maybe for watching movies it does not matter. Don't get me started on forcing everyone to use 16:9 monitors for computers when it makes no sense. I am sure that HDMI data rates are better, etc., but why can't they make something that is more universally compatible with computer hardware? If HDMI is mapped to 1080p, then why the overfill (overscan)? Why can't I have a 1:1 pixel map like VGA?

Perry

I don't have this issue. In fact, HDMI from the computer is pretty painless for me: plug it in, turn the resolution up to its max, and voilà, crystal-clear picture.
 

pw38

Senior member
Apr 21, 2010
294
0
0
Sure, it *can* do certain things if you move beyond the standard, but that is the exception to the rule.

Look at page 85 of your linked document.
It lists the resolutions defined in the HDMI spec.
1366x768 is nowhere on the list.

Hell, 1366x768 is even missing from the secondary supported resolution list on page 87.

Ya. Sure looks like I am trolling.

Well, I am typing this on a 1366x768 plasma connected via HDMI from my video card. Guess I should feel lucky it works, huh? :biggrin:
 

QueBert

Lifer
Jan 6, 2002
22,460
775
126
HDMI does not look better than VGA. I'll put my 22" Trinitron hooked up via VGA against any monitor running HDMI, and mine will look better. CRT + VGA is the best picture you can get on a computer, IMHO. And you can do 120 kHz over VGA on a CRT; I've seen it done before.
 

poofyhairguy

Lifer
Nov 20, 2005
14,612
318
126
Actually, I have found that the new releases are almost as good on DVD as they are on BD. Yeah, if you are 3 ft from the screen you can see some difference, but sitting on the couch it is really hard to tell.

I 100% disagree with this. I pretty much can't watch SD or DVD sources anymore after being spoiled by HD. In fact, my wife knows not to ask me to watch a show with her till it comes out on Blu-ray...
 

Soundmanred

Lifer
Oct 26, 2006
10,784
6
81
If you can't tell the difference between even the best DVD and Blu-ray or other true hi-def sources, you're either too far away from the TV, your TV is only SD (I've actually seen people with large SD sets buy a Blu-ray player and expect magic!), your TV/monitor is small, or you have bad vision.
 

Howard

Lifer
Oct 14, 1999
47,989
10
81
HDMI does not look better than VGA. I'll put my 22" Trinitron hooked up via VGA against any monitor running HDMI, and mine will look better. CRT + VGA is the best picture you can get on a computer, IMHO. And you can do 120 kHz over VGA on a CRT; I've seen it done before.
*facepalm*
 

iCyborg

Golden Member
Aug 8, 2008
1,327
52
91
HDMI does not look better than VGA. I'll put my 22" Trinitron hooked up via VGA against any monitor running HDMI, and mine will look better. CRT + VGA is the best picture you can get on a computer, IMHO. And you can do 120 kHz over VGA on a CRT; I've seen it done before.
Are you talking about HDMI vs VGA or CRT vs LCD/DFP?
 