I just built a gaming PC for somebody using a Haswell i5 and an HD 7770. I used a monitor when I set it up, and during the three days I had it I never had any issues. Since giving it to the person, they've reported problems getting output on their screen. I went over today and found they are using a cheap 1080p TV. The iGPU works fine on the TV (although the picture quality is poor), but the dGPU doesn't work reliably on the TV, and even when it does, the picture is poor, despite the TV being 1080p. The monitor I've been using is 23" and the TV connected to the PC is 26". I know this will affect PPI, but the quality difference between the two accounts for much more than the PPI difference.
I brought the offending HD 7770 back with me to run my own tests on it, and connected to my Sandy Bridge PC it works perfectly on my monitor. With the HD 7770 still installed, I then connected my PC to my Panasonic 1080p HD TV: in the BIOS the display was split into four segments, and in Windows it would not fill the screen properly, letterboxing the image top and bottom no matter what aspect-ratio or scaling settings I changed.
So my question is: if a TV and a monitor are the same screen size and resolution, why don't TVs display the output correctly, and why is the image quality so bad? Do TVs just not work as monitors, and is there a reason for this?