GeForce monitor quality


BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
Laz-

"I also mentioned that the 2d quality of the product looks fine, but if you compare the 2 side by side, I believe that the majority would be able to tell the difference."

Unlike most others, I have seen them side by side. The only noticeable difference is on Trinitron monitors (out of those I have seen) when properly calibrated (something I also ask people about when they complain; no answers yet on what calibration methods they use).

At 1600x1200 you can see a difference, but it is slight on most monitors. At such a high resolution the monitor itself is usually the bigger problem (though Trinitron and Diamondtron monitors don't tend to have this issue).

I'm not defending nVidia here, just stating the facts. Look at all the complaints: they are all on Trinitron-tubed monitors. Do people mention that it is an issue with one particular type of monitor, or do they restate the same thing over and over?

I could harp non-stop about how incredibly slow the V5 is, over ten times slower than the GeForce2, and leave it at that without ever mentioning that it was on a pro OpenGL benchmark. If I repeat this enough, and get enough others to do the same, it suddenly becomes accepted fact. If someone points out that I am limiting the comparison to one narrow case, they would be right in calling out the folly of that line of logic. That is what I am trying to do here. It isn't about defending any company at all.

Rado45-

"I haven't seen my Geforce2 with my Sony trinitron, but I have seen the G500 monitors in a computer show but they did look very good (with a matrox card), by my knowledge Sony's monitors are of excellent quality."

Yes, they are of exceptional quality. Arguably the best monitors available (particularly the FD series), but they do have issues with at least the entire line of GF boards and also with Radeons. I was being deliberately foolish in my assertion in order to illustrate this line of thought.

The GF has problems with one type of monitor. The Sony-tubed monitors have problems with two different manufacturers' video cards. Calling either one poor because of issues with the other is, IMHO, completely foolish. If you are particularly fond of Trinitron-tubed monitors, stay away from the GeForce boards. It truly is that simple. If you don't own and are not planning on buying a Sony-tubed monitor, then this is nearly a non-issue.

I would, however, add that you should calibrate your settings if you do own an nV board. The default settings haven't been optimal on any monitor I have tried one with.
 

MrJoe

Member
Nov 5, 2000
132
0
0
BEN,

Maybe you need some glasses. First off, no one is bashing nVidia here. We are just referring to the sub-par 2D quality that nVidia chipsets put out. I used to have a Voodoo3, and when I changed to a GeForce2 I was shocked when I booted into Windows. To confirm this I put my Voodoo back in and noticed a HUGE difference. It doesn't matter what monitor you are using. The nVidia chipset wasn't designed with 2D quality in mind.
 

cockeyed

Senior member
Dec 8, 2000
777
0
0
I've been following the 2D quality argument for some time in various forums and newsgroups. It seems to me that most people agree that Matrox, 3dfx, and now the ATI Radeon have excellent 2D quality and speed. But when it comes to nVidia, some people say the 2D quality is great and others say it is poor; I've also read some claims that Windows speed is slower on nVidia.

In another thread, Senior Member RobsTV made what I thought was a good observation: nVidia chips are used by many different video card companies, whereas Matrox, 3dfx, and ATI make their own cards (at least until recently, in 3dfx's case). Maybe it is more a case of the specific card manufacturer than the video chip. It will be interesting to see whether the companies who will now build cards using the 3dfx chips are able to produce the same 2D quality as the current Voodoo cards. If not, it just might be more a case of the card manufacturer than the chip.

Also, I would think that driver quality has a large part to play in picture quality. Matrox, 3dfx and ATI are better able to control their drivers while nVidia has less control.
 

RobsTV

Platinum Member
Feb 11, 2000
2,520
0
0
cockeyed,
3dfx DID supply many manufacturers with a reference design. The Voodoo2 comes to mind. While Diamond and Creative V2s were nearly identical to the reference design, they were not identical in performance and image quality. Even though those cards did not do 2D themselves, 2D quality still differed between the two, as they used different pass-through cables. This same type of debate happened back then. Many "incorrectly" labeled 3dfx as bad, but in "reality" it was just slight differences introduced by the board makers that caused problems. The same thing happened with the nVidia Riva 128: STB cards barely worked in non-Intel systems, while Canopus cards were the best thing since sliced bread in any system, with Diamond falling somewhere in between. Something as simple as having a flashable BIOS (Canopus, Diamond) vs. a non-flashable BIOS (STB) can make a world of difference.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"BEN,

Maybe you need some glasses."


A month-old thread brought back up, OK. I need glasses??? No one who knows me has ever tried to make that assertion. My eyes are significantly better than average; the reasons I don't run Trinitron tubes are the two majorly annoying lines holding the grille in place, and the fact that I can see the grille itself just fine when looking at the monitor. Extremely headache-inducing.

"First off, no one is bashing NVidia here. We are just refering to the sub par 2D quality that NVidia chipsets put out."

Check your facts a bit better. The problem is at the board level, and not all boards have it (not even with Trinis). There is absolutely nothing wrong with the GF series of chips in terms of 2D quality; the problem is a sub-par component on the board itself. Unless you purchased an nVidia reference board, the problem is with your board manufacturer.

"I used to have a Voodoo 3 and when I changed to a Gforce 2, I was shocked when I booted into windows. To confirm this I put my Voodoo back in a noticed a HUGE difference."

First off, what monitor? Second, what calibration software were you using, or what sorts of adjustments had you made to the RGB gamma/brightness/contrast settings when comparing? Please don't try to say you are some sort of 2D expert if you leave everything at default.

"It doesn't matter what monitor you are using. The Nvidia chipset wasn't designed for 2D quality in mind."

That is wrong on many different levels. Have you seen the DVI out of certain GeForce boards hooked up to the SGI widescreen LCD? It makes Trinis look like sh!t by comparison (it should, for the price). What can a CHIP manufacturer do to improve 2D quality? Name what nVidia could do that they haven't already. If you want to state that, say, Creative Labs has problems with its 2D on Trinitron monitors, I would agree. You are, however, saying that all GeForce boards have problems with all monitors, which is a flat-out, no-way-around-it lie.
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
In order to meet the FCC radiated-emissions requirements, some video card manufacturers add RC filter circuits to the output of the video chip (or the RAMDAC, on older cards) to reduce radio-frequency emissions. The FCC regulates radio emissions from computer equipment in order to keep your equipment from interfering with your neighbor's TV or radio reception. This is also the reason they ask you to turn off electronic equipment in airplanes during takeoffs and landings, so your equipment does not interfere with the navigational equipment of the plane. Missing a runway because someone is using a cell phone can make for an unpleasant day for many people.

A perfect waveform from the output of the chip would be a square wave with very short rise and fall times. Faster rise and fall times make for better-looking text on the screen. As you add capacitance to the video lines, the rise and fall times of the signal increase, which makes text look fuzzy (for lack of a better term).
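To put rough numbers on the paragraph above: the fuzziness can be estimated from first-order RC filter behavior. Below is a minimal Python sketch; the 75-ohm/22 pF component values are hypothetical, chosen only for illustration, and 202.5 MHz is the standard VESA pixel clock for 1600x1200 at 75 Hz.

import math

def rc_cutoff_hz(r_ohms, c_farads):
    # -3 dB cutoff frequency of a first-order RC low-pass filter
    return 1.0 / (2.0 * math.pi * r_ohms * c_farads)

def rc_rise_time_s(r_ohms, c_farads):
    # 10%-90% rise time of a step through an RC low-pass: ~2.2 * R * C
    return 2.2 * r_ohms * c_farads

r, c = 75.0, 22e-12       # hypothetical 75-ohm video line, 22 pF shunt capacitor
pixel_clock = 202.5e6     # VESA pixel clock for 1600x1200 @ 75 Hz

print("filter cutoff:  %.1f MHz" % (rc_cutoff_hz(r, c) / 1e6))   # ~96.5 MHz
print("edge rise time: %.2f ns" % (rc_rise_time_s(r, c) * 1e9))  # ~3.6 ns
print("pixel period:   %.2f ns" % (1e9 / pixel_clock))           # ~4.9 ns

# When the rise time approaches the pixel period, single-pixel detail
# (i.e. small text) smears into neighboring pixels.

With these example values the edges take roughly as long as a whole pixel at 1600x1200, which is exactly the fuzziness described above; at lower resolutions the pixel period is much longer and the same filter is nearly harmless.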

For more information on this subject see the following link.

http://www.geocities.com/porotuner/

Good Luck to all


 

hans007

Lifer
Feb 1, 2000
20,212
17
81
OK, MOST GeForce boards have problems with most monitors. I've had an Elsa Erazor X, an Asus V6800, an Asus V7700, and a Creative Labs GF2, and all were horrible on my KDS 19" NON-TRINITRON. The problem isn't the chipset, it's the reference design that everyone follows: the RFI circuitry has to filter the signal to get FCC Class B approval, so it screws up the image. The chip is not at fault, it's the reference design. I think Elsa is said not to use as much RFI filtering, hence the better-looking output (or they don't follow the reference design; after all, it's just a reference, but most companies are too cheap to improve on it). I'd be bold enough to say that if you could get a Canopus GeForce from Japan, it would be awesome, as Canopus makes their own designs. Also, the GeForces look good over DVI out, so it's not the chip, just the filters. Now, if you have to remove the filters to make it look good, the design still sucks, because, well, IT'S SUPPOSED TO WORK out of the box.
 

quadcells

Senior member
Jul 18, 2000
479
0
0
I have had GeForce SDR, DDR, and GTS cards with my Iiyama 450, and I think they look great. But 2D looks so much better in Win2K than it does in Win98SE.
Why is that?
My computer dual-boots Win98SE/Win2K and I do a lot of video editing on it. I used both Win98SE and Win2K; now I just use Win2K because the video looks sharper and has better color.
My 2 cents.

BTW
hans007, where did you read about the RFI circuitry for the GeForce?





 

hans007

Lifer
Feb 1, 2000
20,212
17
81
quadcells, there have been some posts on it here, and there was a page talking about it too, including how to remove the filter so you get good 2D. The RFI filter pretty much just filters off the top and bottom parts of the video signal, so you don't get enough video bandwidth and it looks like crap. BenSkywalker, that site on how to remove the filters even says the card isn't providing enough video bandwidth, and if you look at your monitor's specs (any monitor), it will say how much video bandwidth you need at certain resolutions and refresh rates; every monitor will be affected if the effective video bandwidth is low. I've tried this on a non-Trinitron against a friend's G400, and it produces the same effect. As far as I know, only the VisionTek and Elsa boards (since they are the same) are any good, and I've heard that's because they skimp on the RFI filter.
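hans007's point about monitor bandwidth specs can be made concrete. A rough rule of thumb (not an exact spec) is that the required video bandwidth is the active pixels per second plus roughly 35% blanking overhead; the Python sketch below uses that assumed overhead figure.

def required_bandwidth_mhz(h_pixels, v_pixels, refresh_hz, overhead=1.35):
    # Active pixels per second, scaled up for horizontal/vertical blanking.
    # The 35% overhead is a typical assumption for CRT timings, not exact.
    return h_pixels * v_pixels * refresh_hz * overhead / 1e6

for h, v, hz in [(1024, 768, 85), (1280, 1024, 85), (1600, 1200, 85)]:
    print("%dx%d @ %d Hz needs roughly %.0f MHz" %
          (h, v, hz, required_bandwidth_mhz(h, v, hz)))

# Prints roughly 90, 150, and 220 MHz. If the board's output filter rolls
# off well below these figures, the image goes soft at high resolutions no
# matter which monitor is attached.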
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
"I've had an elsa erazor x , a asus v6800, an asus v7700 and a creative labs gf2, all were horrible on my KDS 19" NON TRINITRON."

Which calibration software were you using?

"Also the geforces look good on DVI out, so its not the chip just the filters . Now if you have to remove the filters to make it look good, the design still sucks because well ITS SUPPOSED TO WORK out of box"

Which is it? Does it look good using DVI out, or does it not work out of the box? I know, two different things. My point is that there is absolutely nothing wrong with the GeForce CHIP, which is the part that nVidia sells. Check out a Herc Ultra board hooked up to the SGI widescreen LCD compared to a Matrox G400 on an FD Trini, and it is clear that there is absolutely nothing wrong with the chip at all. Now, if board makers have problems, that is something different altogether, but they can modify pretty much anything they want to (not quite, but nearly anything).

Edit-

Just missed the last post-

"Ben skywalker, that site with how to remove the filters, even says that its not providing enough video bandwith, and if you look at your monitors specs (any monitor) it will say how much video bandwith you need at certain resolutions and refresh rates, it will affect every monitor if the effective video bandwith is low. I've tried this on a non trinitron and a g400 of a freinds, and it makes the same effect, as far as i know only the visiontek and elsa (since they are the same) are any good, and i've heard since they skimp on the RFI filter."

"Skimping" on the RFI filter improves quality, I wouldn't consider that skimping. Besides that, check out the DVI specs on monitors and compare them to what is available for nV based boards. In the highest end 2D realm, several nVidia based boards are considered class leading, besting even Matrox(DVI take a bow). For MOST people on this forum the 2D is also quite good, just by looking at the posts you can easily see that.
 

cockeyed

Senior member
Dec 8, 2000
777
0
0
I visited the web page that describes how to modify the RFI filter to improve 2D on cards based on nVidia chips. It seems logical that the RFI filter could cause a problem with the output video signal. However, I would have to assume that Matrox, 3dfx, and ATI use similar types of RFI filters on their cards. I then wonder: why is the 2D not affected in the same way on those cards? Also, since the 3D signal follows the same path, why is 3D not affected, or is it? Just food for thought!
 

Gosharkss

Senior member
Nov 10, 2000
956
0
0
Odds are those other cards are not using the same component values. The RC time constants of the filter determine at what frequency the filtering has the most effect. Maybe the other cards do not need the filtering due to better printed circuit board design; there are a number of factors that can be in play here. Depending on the filter's time constant, it may have no noticeable effect on the signal at all. Windows-based programs typically run at the higher frequencies, where the filtering matters most. I'm not a game player, but most of the games I have seen run at the lower resolutions. Also, games, like television, typically always have movement on the screen to distract your eyes from the subtle ghosting or smearing these RF filters introduce; in games this smearing may even make the image look better. Text-based applications tend to be static for long periods, allowing the eye to see the smearing.
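To illustrate the time-constant point: a first-order low-pass that is nearly transparent at a VGA pixel clock attenuates heavily at high-resolution desktop clocks. A short sketch, reusing the hypothetical ~96 MHz cutoff from the earlier RC example (the pixel clocks are standard VGA/VESA figures):

import math

def lowpass_attenuation_db(f_hz, cutoff_hz):
    # Magnitude response of a first-order low-pass filter, in dB
    return -10.0 * math.log10(1.0 + (f_hz / cutoff_hz) ** 2)

cutoff = 96e6  # hypothetical cutoff from the earlier RC example
for label, clock in [("640x480 @ 60 Hz", 25.175e6),
                     ("1280x1024 @ 85 Hz", 157.5e6),
                     ("1600x1200 @ 75 Hz", 202.5e6)]:
    print("%s: %+.1f dB at the pixel clock" %
          (label, lowpass_attenuation_db(clock, cutoff)))

# Prints about -0.3 dB, -5.7 dB, and -7.4 dB: low resolutions pass almost
# untouched, while high-resolution desktop modes lose several dB, which
# shows up as the smearing described above.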

Good Luck
 

wmachine

Junior Member
Dec 11, 2000
17
0
0
Interesting thread. I'm trying to select a video card for a new system (T-Bird 1 GHz/KT7-RAID running CAD and general apps). I was leaning towards the GeForce2 GTS (32/64, Pro, Ultra, not sure which; advice is welcome), partially due to the good reviews and the mod to convert it to a Quadro2. This thread and some others I've seen have me concerned about the sharpness. I will be using it with a Nokia 19" 446Xpro, probably no higher than 1280x1024, and will be running Win2000 Pro.

Where were the links on modifying the filters, and has anyone used these successfully? (I hate to butcher a new board, but I'm willing to try mods if there's a good improvement.)

Any other boards worth checking out?
 

wmachine

Junior Member
Dec 11, 2000
17
0
0
Excellent info, many thanks!

Looks like the GeForce will get the nod. Any suggestions on which GeForce type and vendor has the best base boards for mods? (GTS/Pro/Ultra; Hercules, Elsa, Creative, etc.)
 

MigraineMan

Member
Mar 15, 2000
45
0
0
Well, although this is a GeForce-focused thread, I do have to say that after going down the GeForce road and not liking their image quality with my Glide-based games, my pathetic little V5500 is running Unreal Tournament at over 50 fps at 1600x1200. My recent upgrade to the T-Bird 1 GHz (o/c'd to 1150 at the moment) helped, but for me this graphics speed is completely acceptable, and the 2D quality is night and day compared to the lower-quality nVidia implementation. I am using a 19" Trinitron (Mitsubishi 900u) and running Win2000.

UT rocks in Glide. Once the Unreal-engine games die out, so will my video card, but for the moment I see little value in striving for faster, irrelevant framerates at the expense of image quality. The V5500 has it in spades and is getting cheaper to buy every day. And after what I have heard about the lack of incremental value the NV20 will add, I'm not entirely sure what my next video purchase will be. The V5500 was the best compromise of price/performance/quality. nVidia seems to emphasize speed over everything else. I hope their 3dfx buyout will help bring balance to nVidia's aggressive pursuit of framerates by integrating 3dfx's emphasis on image quality.
 

major_major

Member
Feb 20, 2000
66
0
0
My GTS was pretty blurry, so I decided to follow the instructions on the porotuner page and pull off some capacitors. I didn't quite understand the instructions about soldering the inductors, so I left them alone... but after removing the capacitors I see very little difference, if any. My 2D quality is still blurry.
 

edwardhchan

Senior member
Dec 14, 2000
212
0
0
Thanks, all, for that info! I just posted a question asking whether I should get a Radeon or a GeForce2 MX. I think you all answered my questions. Off to Fry's I go... They've got a Radeon 32MB DDR for $154 after rebate and tax.
 

audreymi

Member
Nov 5, 2000
66
0
0
Some useful things to keep in mind when you compare 2D quality can be found in an informative video card review on the nVidia-based Riva3D site.

Another interesting article:

link: http://www.zdnet.com/etestinglabs/stories/bi/0,8829,2374375,00.html

What I found was that fast framerates can also be achieved in the driver by skipping the task of rendering frames. The article says to be suspicious of drivers that post high benchmark numbers but show wide framerate variation when running actual games.

A good example is a card that was recently reviewed at Tom's Hardware with an almost 2:1 advantage over an nVidia card. My suspicion is that it buffered data for processing and reported back to the CPU that it was done, even though a lot of the data was not rendered.

link: http://www.zdnet.com/products/stories/reviews/0,4161,368060,00.html

The other interesting thing is the tradeoff of quality to get more performance:

link: http://www.zdnet.com/products/stories/reviews/0,4161,368060,00.html

My advice: take a 20% hit in framerate on anything over 80 fps (the result is still at least 64 fps) and look for the best 2D quality you can find.
 