Where ATI has failed ...


Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ChaiBabbaChai
Originally posted by: Keysplayr

No guys. The long white "traditional" fluorescent lights. I have the incandescent-replacement "squiggly" bulbs in every light bulb socket in my house, including right over my head where I have 3DVision set up. There is no flickering of any kind whatsoever with this type of lighting.

Just because you can't see it doesn't mean it isn't there. Think of it like analog audio (incandescent) vs. digital audio (CFL). You can't hear the gaps between samples, but you're not hearing every single piece of the sound wave with digital audio. Like others have said, they turn on/off like 120 times per second.

Hehe, like ghosts. Yeah, there is one of those bulbs almost directly overhead and to the right, about two feet away. No flickering. So analog, digital, or whatever, if there is any sort of flickering, I can't detect it. I tried hanging a two-foot fluorescent worklight overhead and that was when I noticed some hairy flickering. Impossible to use the glasses.
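To put rough numbers on why tube lighting and the shutter glasses fight each other (a back-of-the-envelope sketch only; the exact shutter timing and ballast behavior here are assumptions, not specs): a magnetic-ballast tube flickers at twice the mains frequency, while the glasses switch on the order of 120 times a second, and two nearly equal rates beat against each other.

# Crude flicker model, illustrative numbers only.
SHUTTER_RATE_HZ = 120                    # total left/right switches per second on a 120hz panel

for mains_hz in (60, 50):                # US mains vs. most of the rest of the world
    lamp_flicker_hz = 2 * mains_hz       # magnetic-ballast tubes flicker at twice mains frequency
    beat_hz = abs(SHUTTER_RATE_HZ - lamp_flicker_hz)
    print(f"{mains_hz} Hz mains -> lamp flicker {lamp_flicker_hz} Hz, beat vs. shutters ~{beat_hz} Hz")

In this simplified picture, a near-zero beat just means any small drift between lamp and glasses shows up as a slow, very visible pulsing, which would be the "hairy flickering" from the tube worklight, while CFLs with electronic ballasts switch at tens of kilohertz and stay out of the way.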

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Modelworks
I don't think ATI has failed at graphics cards. I just think they lack the marketing power that nvidia has.


The thing about 120hz displays is that you need a 120 fps capable card or the display is going to interpolate to display at 120hz. Remember LCD is not like CRT; it only updates the display when something changes. If you display a picture on an LCD with no pixels changing, the refresh rate is 0hz. 60hz or 120hz just means the display can update at a rate of up to 60 or 120 times a second, not that it does that for everything you display. Think of it as the speed that the cpu inside the monitor can process data.

Film is shot at 24fps and displayed in theaters at 72hz, so people wanting a theater-like experience should shoot for that.

The big issue I see with LCD is that it needs a cleaner light source. Most LCDs use fluorescent-based backlights that do flicker. If you set a display to pure white and then use something like a digital camera, it will pick up the flicker.

The IR interference is mainly caused by the color of the light, not the flickering. Ever notice how things like the sensor on a TV are behind a dark, almost black plastic? The reason is to block all the stray light except the range that the sensor can see. I'm working with a project I built that is an IR receiver for the PC. If I have the window shades open, letting sunlight in, the thing sends data constantly because it sees the IR in the sunlight.

The bad part about choosing 120hz for a 3d display is that in the USA 60Hz is the power frequency. So if you are sending an IR signal to each side of a set of glasses at 60hz per side, and the light is in the same spectrum, they are going to interfere. Remote controls use 35hz - 50hz. A quick demo of the issue is to grab two remote controls. Press a button on one and hold it down, now try to use the other remote in the same room. It is also a fun trick to play on people. Get a remote that transmits as long as you hold down the button. Sit in the room holding down the button while you watch people wonder WTF is wrong with the tv remote.

120 would be the max. I have tried that Resident Evil 5 video benchmark. I am using a GTX295, but the benchmark did not support SLI, so I was technically using a single GTX275. Using 3DVision, it maintained an average of 47-51 fps. That means the card didn't have to render a full 120 fps (60 per side) to utilize 3DVision. So my card was probably rendering 94-102 fps on average. I think 60 per side is the max due to the refresh rate of the monitor. 240hz monitors will of course raise this cap.
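To spell that arithmetic out (a quick sketch; the 47-51 fps figures are just the benchmark averages mentioned above):

# With active shutter glasses the per-eye rate is half of what the card renders,
# since left and right frames alternate.
per_eye_avg_fps = (47, 51)                  # average range reported by the RE5 benchmark run above
rendered_total = [2 * fps for fps in per_eye_avg_fps]
print(rendered_total)                       # [94, 102] frames actually rendered per second

panel_refresh_hz = 120
print(panel_refresh_hz // 2)                # 60 -> the per-eye ceiling on a 120hz panel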
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: ChaiBabbaChai
I really like my nVidia control panel, but nVidia based mobo's don't cut it right now, so that pretty much dictates that I stay with AMD and ATI.

I don't know about that. This 790i Deluxe kicks some buttocks. Supports all 775 CPUs. What is it that my board isn't cutting? Short of no i7 support?

 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: Azn
Originally posted by: Kakkoii
Originally posted by: Obsoleet
Where Nvidia has failed is on their drivers. The new NV is the old ATI.

Nvidia needs to work on that instead of on wonky technologies that will never take off. A few kids and Nvidia's puppets might be circle jerking around tHrEe dEe vIsIoN but most of us are wanting more of what ATI's been doing.

And what exactly has ATI been doing?

Brought performance to the mass. Made Nvidia drop prices.

If it wasn't for ATI, Nvidia would still be milking you for $600 video cards.

Azn, I appreciate what ATI did in lowering its prices so drastically, forcing competitive pricing all around. That's great for the consumer. But I'm not sure why you think ATI did this for "us" (that's the impression I'm getting). AFAICT, it was to gain desperately needed market share, which doesn't appear to be working. As for bringing performance to the masses? I think both companies can take credit for that. The 8800GT was perhaps one of the best-selling cards in history. Then the 9 series; the 9600GT was a very powerful card for the money. Anyway, flip the coin and you would see that, conversely, if it wasn't for Nvidia, you can bet ATI would be milking you for $600 video cards. Or did you think these big corporations are not out to make money?



 

AzN

Banned
Nov 26, 2001
4,112
2
0
Originally posted by: Kakkoii
Originally posted by: dguy6789
Originally posted by: Kakkoii
Originally posted by: Azn
Originally posted by: Kakkoii
Originally posted by: Obsoleet
Where Nvidia has failed is on their drivers. The new NV is the old ATI.

Nvidia needs to work on that instead of on wonky technologies that will never take off. A few kids and Nvidia's puppets might be circle jerking around tHrEe dEe vIsIoN but most of us are wanting more of what ATI's been doing.

And what exactly has ATI been doing?

Brought performance to the mass. Made Nvidia drop prices.

If it wasn't for ATI, Nvidia would still be milking you for $600 video cards.

Competitive pricing, that's it. And it's not really because of ATI, but because ATI couldn't compete with Nvidia at the same price.

One could look at it just as easily the other way. Nvidia couldn't compete with ATi so they had to lower the price. Don't be so mayo.

That way of looking is already taken into account. As it is a result of ATI not being able to compete at the same price in the first place. After lowering the prices so much, it turns the tables, and thus Nvidia is on the short end and has to compromise. It's not what either company chose to do, but what the market made them do.

Not quite. I remember Nvidia was trying to sell the GTX280 for $550+ and the GTX260 for $400+ when they were first released. ATI on the other hand came out with the 4870 at $299 around the same period. The market never made ATI drop prices. It was their intention to make smaller chips and only compete in the mid-range market, because that's where the bulk of the sales were. It's just that GT200's performance was so underwhelming that it needed to be priced against the RV770 segments.
 

BenSkywalker

Diamond Member
Oct 9, 1999
9,140
67
91
most of us are wanting more of what ATI's been doing.

If that were true, nV wouldn't hold 67% of the add-in board market. I would much rather see a close to 50/50 split myself, so hopefully ATi soon figures out things that more people like, not just their die-hard fan base.

If it wasn't for ATI, Nvidia would still be milking you for $600 video cards.

Well, if we forget about....

The class action lawsuit against both Nvidia and ATI was filed in late August of this year, by plaintiffs Jordan Walker and Michael Bensingor. According to the plaintiffs, Nvidia and ATI allegedly violated federal antitrust laws by conspiring "to fix, raise, maintain and stabilize prices of graphics processing chips and cards." As well, the plaintiffs contended that both Nvidia and ATI were unlawfully colluding "to coordinate new product introductions." The plaintiffs were seeking "triple damages, attorney fees and the costs under the federal antitrust laws." The certified class is limited to "all individuals and entities who purchased graphics processing card products online from the defendants' websites in the United States during the period from December 4, 2002 to November 7, 2007."

Turns out ATi and nV's pricing was so bad it was illegal. They were working together to screw us all, but don't let something like that make you think less of the one that is glorious and virtuous; the other is clearly evil because..... well..... it clashes with orange.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: mwmorph
Originally posted by: Idontcare
Originally posted by: nitromullet
Originally posted by: mwmorph
I see an immediate problem with that. As the light bulbs in my house die, I am slowly switching to the low-wattage compact fluorescent bulbs (the 13w or so squiggly ones you use to replace the lightbulb-shaped incandescent ones), and with the way most households are heading, if the tech doesn't work with fluorescents, then it's basically useless.

Just about everyone I know is moving to the low-wattage CFLs and there is a real push in the industry toward adopting them. In the EU & Canada, your normal 40-150w incandescent will be phased out by 2012, and in the US, by 2014.

If it's true that the 3D displays don't work in fluorescent lighting, then ATi was smart not to spend money on that 3D gaming idea. It's a dead-end, worthless technology before it even matures, and it would be stupid to invest in the current idea.

I didn't even think of that; pretty much every bulb in my house is fluorescent. We bought a 24-pack of these bulbs at Costco when we first moved in and replaced most of the bulbs.

Fluorescent lamps using a magnetic mains frequency ballast do not give out a steady light; instead, they flicker at twice the supply frequency.

However, I have noticed that LED-based lights are the next wave of incandescent replacement, coming in at about 1/3 the power consumption per lumen of CFLs, so I'm not sure how long of a market life CFLs really have given the plethora of disadvantages they carry (warmup time that increases over lifespan, lumens that decline over lifespan, mercury hazards if broken, etc.) which LED-based bulbs do not.

I steadily replaced all my incandescents with CFLs over the past 2 years, and now, as my 10yr-lifetime CFLs start dying off barely a year into their lifespan, I am replacing them with LED equivalents. Sam's Club even stocks them; they are becoming more and more prevalent every day.

All fluorescent lights do that, with the exception of a few special applications of high-frequency ballasts (like in tanning beds). Flickering at 120hz (US) or 100hz (most of the world) is normal, just like 60/120hz and 50/100hz are the normal frequencies of tvs and monitors around the world.

LED lighting is a viable replacement, but the expense is too high right now for the money saved. Also, LEDs do have issues with high-lumen lighting. Most LEDs available are low-light, medium-cost bulbs, while the medium-light bulbs have high costs and heat issues (converting high-voltage AC to low-voltage DC). You can't just stack more LEDs to increase brightness; LED efficiency goes down as heat increases, so we've hit a brightness wall right now.

Also, LEDs tend to be directional, which is great for taillights and stoplights but bad for general room lighting.

Then there's the problem of production yields; it requires semiconductor production to make an LED bulb, and the phosphors for true high-quality white light are still too expensive for widespread adoption in home lighting.

Overall, LED is a great idea but still has technical hurdles that CFLs don't have to worry about. It's too young a tech to be declared as the replacement yet.

To get more life out of CFLs, turn them on and off less frequently. Each power cycle degrades the bulb more and more. In home situations, the fastest killer of CFLs is frequent power cycling, since the process of switching a light on degrades the cathode surface.

I'm just saying they are available right now at your local Sam's Club and Home Depot, and they claim to save you even more money than a CFL does, in the same way CFLs are purported to save you money by replacing incandescent bulbs.

All the arguments you make for them being costly and impractical could be 100% accurate, but it would appear (to Sam's Club shoppers) that they are not so problematic as to prevent them from being a financially viable reality existing down aisle 16 on the left, mid shelf, take it home today if you want to.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: Modelworks

The thing about 120hz displays is that you need a 120 fps capable card or the display is going to interpolate to display at 120hz.
But this is true of any display receiving frames at less than its refresh rate; they all simply display the last frame until they get a new one.

The point of 120 Hz is that it can display 120 FPS with vsync, unlike a 60 Hz device which can only display 60 FPS.

Without vsync, a 120 Hz device is twice as likely to have a refresh cycle available than a 60 Hz display (also each refresh cycle is twice as fast), so that means less tearing.

Also because there are more sampling points with 120 Hz, it's beneficial even when the framerate is lower than the refresh rate (e.g. 24 FPS fits evenly into 120 Hz, but it does not into 60 Hz).
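Putting plain numbers on those two points (simple arithmetic, nothing specific to any particular panel):

# 60 Hz vs. 120 Hz: vsync cap, refresh period, and how 24 fps content maps onto each.
for hz in (60, 120):
    period_ms = 1000.0 / hz
    repeats = hz / 24.0
    cadence = "even" if hz % 24 == 0 else "uneven (3:2-style pulldown needed)"
    print(f"{hz:>3} Hz: vsync cap {hz} fps, one refresh every {period_ms:.2f} ms, "
          f"{repeats:.2f} refreshes per 24 fps frame -> {cadence}")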

Remember LCD is not like CRT, it only updates the display when something changes. If you display a picture on a LCD display with no pixels changing the refresh rate is 0hz .
Right, but if your game is running at 120 FPS, your display will be receiving 120 FPS. That's exactly why nVidia's glasses won't work on 60 Hz devices. I'm not sure how a scenario like a desktop picture that never changes is relevant.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Be careful buying some brands of LED lighting if you care about flicker. There is a technique to make LEDs brighter than normal using PWM. They send pulses to the LED for a split second at higher than the LED's normal current capacity to make it brighter. It reduces the lifespan of the LED and induces flicker.
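A rough sketch of what that PWM trick means in numbers (the current, duty cycle, and frequency values here are made up for illustration, not any product's spec):

# PWM overdrive: pulse the LED above its continuous rating so average brightness
# rises, at the cost of light that now flickers at the PWM switching rate.
rated_continuous_ma = 20      # hypothetical steady-state current rating
pulse_peak_ma = 60            # hypothetical overdriven pulse current
duty_cycle = 0.4              # fraction of each PWM period the LED is on
pwm_freq_hz = 240             # illustrative switching frequency

average_drive_ma = pulse_peak_ma * duty_cycle
print(f"average drive {average_drive_ma:.0f} mA vs. {rated_continuous_ma} mA steady, "
      f"flicker at {pwm_freq_hz} Hz")

The higher peak stress is what eats into the LED's lifespan, and a low switching frequency is what makes the flicker noticeable.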
 

Fox5

Diamond Member
Jan 31, 2005
5,957
7
81
Originally posted by: DaveSimmons
Out of curiosity, what 3D glasses games are you playing with your nV card now? Perhaps you can convince us of the awesomeness where the hardware PhysX fans have failed.

It works in almost all games.
Main problem is that the additional hardware for 3d vision is expensive. I don't need a new monitor for starters, and then their glasses are expensive.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
Originally posted by: BFG10K
Originally posted by: Modelworks

The thing about 120hz displays is that you need a 120 fps capable card or the display is going to interpolate to display at 120hz.
But this is true of any display receiving frames at less than its refresh rate; they all simply display the last frame until they get a new one.

Really the term refresh rate should not be used with LCD. Refresh rate meant what it said with CRT. The phosphors on a CRT monitor are hit with an electron beam and then immediately start to fade the second that beam turns off. The beam starts at the top and moves down left to right one row at a time until the bottom then goes back to the top and repeats. If the beam is not completing that cycle fast enough then the phosphors go dark before they are hit with electrons again, so the phosphor is dark then light, that is the source of flickering monitors. Higher refresh rates look better because the phosphor is not being allowed to darken because it is being hit with electrons more often. So if you display a picture on a CRT or display a movie, it is always displaying at the refresh rate regardless of how many fps you send it.

LCD is different in that it never updates the display and the pixels will not change unless it gets something different to display. So if you feed it 10 fps or 50 fps, whatever fps you send it, that is the ~refresh rate (really hate that term with LCD). If you set up a monitor to use some of the interpolating features, then what happens is you are telling the monitor to update the screen 120 times a second regardless of the input fps. So the LCD takes the hz setting of 120hz and divides that by the data coming in: 120hz / 60fps = 2, so the monitor needs to make up a frame for every frame sent. The DSP inside the LCD compares frame 1 and frame 2 and creates a frame that is the difference of the two (I left out a lot about that process). That frame never existed in the source and can make things look worse or better depending on the DSP in the LCD.
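A toy version of that made-up-frame idea (naive per-pixel averaging; a real set's DSP does motion-compensated interpolation, which is the part glossed over above):

# Feed a 120hz panel 60 fps and one synthetic frame has to be invented between
# every pair of source frames: 120 / 60 - 1 = 1.
panel_hz, source_fps = 120, 60
synthetic_per_source = panel_hz // source_fps - 1
print(synthetic_per_source)                 # 1

def midpoint_frame(frame_a, frame_b):
    # crude stand-in for the DSP: average each pixel of the two real frames
    return [(a + b) / 2 for a, b in zip(frame_a, frame_b)]

f1 = [0, 10, 20]                            # toy "frames" as flat lists of pixel values
f2 = [10, 30, 20]
print(midpoint_frame(f1, f2))               # [5.0, 20.0, 20.0], a frame that never existed in the source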


The point of 120 Hz is that it can display 120 FPS with vsync, unlike a 60 Hz device which can only display 60 FPS.

Without vsync, a 120 Hz device is twice as likely to have a refresh cycle available than a 60 Hz display (also each refresh cycle is twice as fast), so that means less tearing.

Also because there are more sampling points with 120 Hz, it's beneficial even when the framerate is lower than the refresh rate (e.g. 24 FPS fits evenly into 120 Hz, but it does not into 60 Hz).

Vsync is another term that needs to be replaced, it really doesn't apply to LCD either. Vsync means that the video data is being sent when the electron beam is at the top of the screen so the data doesn't change when the beam is half way down the screen. Since data is sent to LCD as one complete frame at a time there really is nothing that can change until the next frame. LCD monitors use the Vsync signal to tell them when a new frame is possible but Vsync was designed to tell the PC the CRT is at the top scan line. Kind of in reverse.

The important thing about 120hz is it has the possibility of updating more often because like you said it is looking for frame change 120 times a second.


Originally posted by: Modelworks
Remember LCD is not like CRT, it only updates the display when something changes. If you display a picture on a LCD display with no pixels changing the refresh rate is 0hz .
Right, but if your game is running at 120 FPS, your display will be receiving 120 FPS. That's exactly why nVidia's glasses won't work on 60 Hz devices. I'm not sure how a scenario like a desktop picture that never changes is relevant.

If you send it 120FPS, it is updating 60 times per second for each eye. They could work with 60hz displays, but that would mean 30 updates per second per eye and could cause headaches and eye fatigue.

I brought up the desktop picture that never changes to illustrate the point that the refresh rate on an LCD is not the same as on a CRT. As I said above, a CRT will always update the display at the refresh rate regardless of the input. So a picture that never changes will always be redrawn 60 times a second on a 60hz CRT, while on an LCD it is never redrawn unless something changes.
 

ronnn

Diamond Member
May 22, 2003
3,918
0
71
If the scuttlebutt around B3D can be believed, AMD graphics made money for a change. Market share has grown, and likely nvidia (which lost money) will be very late this coming round. Seems like ATI's big mistake was being bought out by AMD.

The op is right: if you want 3d glasses, nvidia is the only ticket. :beer:
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: ronnn
If the scuttlebutt around b3d can be believed. AMD graphics made money for a change. Market share has grown and likely nvidia (which lost money) will be very late this coming round. Seems like ati's big mistake was being bought out by amd.

The op is right if you want 3d glasses, nvidia is the only ticket. :beer:

Scuttlebutt would be wrong then, unfortunately:

Originally posted by: Idontcare
Originally posted by: thilan29
http://www.businesswire.com/po...0721006259&newsLang=en

GPU revenue is up and as usual the CPU side is down.

Since it's going to be asked over and over again till horses be dead, the Graphics revenue was $251m and profits were -$12m (a loss for the quarter).

Interestingly they do report/list their portion of the foundry revenue (and losses).

Financial tables here

http://forums.anandtech.com/me...=2320703&enterthread=y
 

reallyscrued

Platinum Member
Jul 28, 2004
2,618
5
81
Originally posted by: Modelworks
Originally posted by: BFG10K
Originally posted by: Modelworks

The thing about 120hz displays is that you need a 120 fps capable card or the display is going to interpolate to display at 120hz.
But this is true of any display receiving frames at less than its refresh rate; they all simply display the last frame until they get a new one.

Really the term refresh rate should not be used with LCD. Refresh rate meant what it said with CRT. The phosphors on a CRT monitor are hit with an electron beam and then immediately start to fade the second that beam turns off. The beam starts at the top and moves down left to right one row at a time until the bottom then goes back to the top and repeats. If the beam is not completing that cycle fast enough then the phosphors go dark before they are hit with electrons again, so the phosphor is dark then light, that is the source of flickering monitors. Higher refresh rates look better because the phosphor is not being allowed to darken because it is being hit with electrons more often. So if you display a picture on a CRT or display a movie, it is always displaying at the refresh rate regardless of how many fps you send it.

LCD is different in that it never updates the display and the pixels will not change unless it gets something different to display. So if you feed it 10 fps or 50fps whatever fps you send it , that is the ~refresh rate (really hate that term with LCD). If you setup a monitor to use some of the interpolating features then what happens is you are telling the monitor to update the screen 120 times a second regardless of the input fps. So the LCD takes the hz setting of 120hz and divides that by the data coming in . 120hz / 60fps = 2, so the monitor needs to make up a frame for every frame sent. The DSP inside the LCD compares frame 1 and frame 2 and creates a frame that is the difference of the two ( I left out a lot about that process) . That frame never existed in the source and can make things look worse or better depending on the DSP in the LCD.


The point of 120 Hz is that it can display 120 FPS with vsync, unlike a 60 Hz device which can only display 60 FPS.

Without vsync, a 120 Hz device is twice as likely to have a refresh cycle available than a 60 Hz display (also each refresh cycle is twice as fast), so that means less tearing.

Also because there are more sampling points with 120 Hz, it's beneficial even when the framerate is lower than the refresh rate (e.g. 24 FPS fits evenly into 120 Hz, but it does not into 60 Hz).

Vsync is another term that needs to be replaced, it really doesn't apply to LCD either. Vsync means that the video data is being sent when the electron beam is at the top of the screen so the data doesn't change when the beam is half way down the screen. Since data is sent to LCD as one complete frame at a time there really is nothing that can change until the next frame. LCD monitors use the Vsync signal to tell them when a new frame is possible but Vsync was designed to tell the PC the CRT is at the top scan line. Kind of in reverse.

The important thing about 120hz is it has the possibility of updating more often because like you said it is looking for frame change 120 times a second.


Originally posted by: Modelworks
Remember LCD is not like CRT, it only updates the display when something changes. If you display a picture on a LCD display with no pixels changing the refresh rate is 0hz .
Right, but if your game is running at 120 FPS, your display will be receiving 120 FPS. That's exactly why nVidia's glasses won't work on 60 Hz devices. I'm not sure how a scenario like a desktop picture that never changes is relevant.

If you send it 120FPS, it is updating 60 times per second for each eye. They could work with 60hz displays but that would mean 30 updates per second per eye and could cause headaches and eye fatigue .

I brought up the desktop picture that never changes to illustrate the point that refresh rate on LCD is not the same as CRT, as I said above CRT will always update the display at the refresh rate regardless of the input. So a picture never that never changes will always be redrawn 60 times a second on a 60hz CRT , while on a LCD it is never redrawn unless something changes.

I think you are confusing a lot of processes. You are suggesting that with a 120 hz monitor, watching something that's natively 60 fps would look bad since it is "interpolating" two frames and finding the difference? Why would there even be a difference? It needs the same frame twice to make up the difference. It can't be the difference between one frame forward and one frame backward; if it knew what the forward frame was, it would have displayed it already.

A 120 hz monitor is ideal for watching blu-ray movies and playing games, period. You can now Vsync to 120 hz vs just 60, and 24 fps divides evenly into it, and that's all that matters. No pulldowns, no fancy business.

Hell, if you wanted to, you could run the monitor at 60 hz if you felt like it just to make you happy. It's just an extra option man. We've needed it for years.


Not to thread-jack but does anyone have any idea why my natively 1280x1024 LCD display looks horribly blurry when its "refresh rate" is set to 60 hz instead of 75?


 

MegaWorks

Diamond Member
Jan 26, 2004
3,819
1
0
Originally posted by: Keysplayr
Originally posted by: ChaiBabbaChai
I really like my nVidia control panel, but nVidia based mobo's don't cut it right now, so that pretty much dictates that I stay with AMD and ATI.

I don't know about that. This 790i Deluxe kicks some buttocks. Supports all 775 CPUs. What is it that my board isn't cutting? Short of no i7 support?

It's because he doesn't know what he's talking about. I also have the 790i board, and from experience it's one of the best boards I've ever used. Gotta love those Foxconn boards.
 

ChaiBabbaChai

Golden Member
Dec 16, 2005
1,090
0
0
Originally posted by: bamacre
Originally posted by: ChaiBabbaChai
I really like my nVidia control panel, but nVidia based mobo's don't cut it right now, so that pretty much dictates that I stay with AMD and ATI.

That makes no sense. You can use an Nvidia card on an AMD-based board.

YEAH BROOOOOOOO. no shiz. But can you run SLi on an AMD board?
 

ChaiBabbaChai

Golden Member
Dec 16, 2005
1,090
0
0
Originally posted by: Keysplayr
Originally posted by: ChaiBabbaChai
I really like my nVidia control panel, but nVidia based mobo's don't cut it right now, so that pretty much dictates that I stay with AMD and ATI.

I don't know about that. This 790i Deluxe kicks some buttocks. Supports all 775 CPUs. What is it that my board isn't cutting? Short of no i7 support?

I just like the boards made with the AMD chipset right now. I'm sure nVidia is fine (I had a DFI nF3-250Gb for years), but mostly it's a customer service/build quality thing. I don't need the fastest computer ever.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: reallyscrued
Originally posted by: flexy
ATI's 3D support was ALWAYS lacking.

That's not true.

Originally posted by: flexy

There's also no question that this is no criteria if someone is not interested in this.

So like...95% of the population of gamers? How many of these people can afford these cards + these TVs?

Yeah, ATI reeeeeeeeeally dropped the ball this time. :roll:

Oops!! No! Not the "3d" as in 3d gaming... I am talking about the 3d with shutter glasses.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: flexy
Originally posted by: reallyscrued
Originally posted by: flexy
ATI's 3D support was ALWAYS lacking.

That's not true.

Originally posted by: flexy

There's also no question that this is no criteria if someone is not interested in this.

So like...95% of the population of gamers? How many of these people can afford these cards + these TVs?

Yeah, ATI reeeeeeeeeally dropped the ball this time. :roll:

Oups!! No! Not the "3d" like in 3d gaming..i am talking about the 3d with shutter-glasses.

As for expensive.... yes, the current 120hz monitors are still expensive, and the Nvision shutter glasses cost like $120. But it's not like it's more expensive than any other enthusiast hardware, like watercooling or whatever else you buy for your computer. Come on, there are people who do not overclock and buy a $1000 CPU... from that point of view it's not *that* expensive. (I am not even talking about 3D projectors and things like that, which you could ALSO buy and which would set you back a few grand.)


 

akugami

Diamond Member
Feb 14, 2005
5,995
2,328
136
Originally posted by: flexy
Originally posted by: flexy
Originally posted by: reallyscrued
Originally posted by: flexy
ATI's 3D support was ALWAYS lacking.

That's not true.

Originally posted by: flexy

There's also no question that this is no criteria if someone is not interested in this.

So like...95% of the population of gamers? How many of these people can afford these cards + these TVs?

Yeah, ATI reeeeeeeeeally dropped the ball this time. :roll:

Oups!! No! Not the "3d" like in 3d gaming..i am talking about the 3d with shutter-glasses.

As for expensive....yes the current 120hz monitors are still expensive, and the Nvision shutter glasses cost like $120. But not like that it's more expensive than any other enthusiast's hardware, like watercooling or whatever you buy for your computer. Come one, there are people who do not overclock and buy a $1000 CPU...from that point of view its not *that* expensive. (I am not even talking about 3D projectors, things like that which you could ALSO buy and which would set you down a few grand

By your reasoning, a BMW 3 series ($45k-80k) is not that expensive from the point of view of a person who buys a Bentley ($200k+). Seriously, go tell that to someone who is driving a Kia and see how hard you get laughed off the street.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: akugami
Originally posted by: flexy
Originally posted by: flexy
Originally posted by: reallyscrued
Originally posted by: flexy
ATI's 3D support was ALWAYS lacking.

That's not true.

Originally posted by: flexy

There's also no question that this is no criteria if someone is not interested in this.

So like...95% of the population of gamers? How many of these people can afford these cards + these TVs?

Yeah, ATI reeeeeeeeeally dropped the ball this time. :roll:

Oups!! No! Not the "3d" like in 3d gaming..i am talking about the 3d with shutter-glasses.

As for expensive....yes the current 120hz monitors are still expensive, and the Nvision shutter glasses cost like $120. But not like that it's more expensive than any other enthusiast's hardware, like watercooling or whatever you buy for your computer. Come one, there are people who do not overclock and buy a $1000 CPU...from that point of view its not *that* expensive. (I am not even talking about 3D projectors, things like that which you could ALSO buy and which would set you down a few grand

By your reasoning a BMW 3 ($45k-80k) series is not that expensive because from the point of view of a person who buys a Bentley ($200k+). Seriously, go tell that to someone who is driving a Kia and see how hard you get laughed off the street.

Well, the 120Hz monitor and 3DVision shutter glasses started out at $600 for both.
Now you can get a 120Hz 3D Ready monitor and shutter glasses for $440: $320 for the monitor and $120 for the glasses. If you're in the market for a new monitor, it makes transitioning that much easier. And it won't cost you a Bentley either.

Also, in the NCP, there is a compatibility list under the "Stereoscopic 3D" section. It lists probably around 1600 games and their 3DVision compatibility, ranging from "Not Recommended" and "Fair" to "Good" and "Excellent".
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,995
126
Originally posted by: Modelworks

Really the term refresh rate should not be used with LCD.
I disagree with that. I know how the displays work so I've snipped the rest of your quote, and that's why I disagree with you.

No matter what the display technology or how it works, the refresh rate is used to describe the number of full frames the display can accept and display. The fact is, a 120 Hz LCD can accept and display 120 FPS while a 60 Hz LCD can only do 60 FPS. This applies to CRTs, Plasma, LCDs, projectors, or whatever.

To put it simply, it's the metric of how many discrete frames the display can accept and display. I also might add that while LCDs only change after a frame is updated, the internal matrix still runs at the stated refresh rate.

Vsync is another term that needs to be replaced, it really doesn't apply to LCD either.
Since data is sent to LCD as one complete frame at a time there really is nothing that can change until the next frame. LCD monitors use the Vsync signal to tell them when a new frame is possible but Vsync was designed to tell the PC the CRT is at the top scan line.
I know how vsync works too and this statement is false, so I've snipped the rest of your quote.

Vsync applies to any display, regardless of the technology. Like refresh rate above. Any display can tear if the source frame changes before the old one has been displayed, and again this is irrespective of the display technology.

Even though an LCD doesn't have a scanline and waits for the whole frame to be loaded first, the transmission of said frame to an LCD is not instantaneous. That means if the GPU changes the frame buffer's contents before the old frame has been sent to the LCD, the resulting displayed image will tear.

If vsync didn't apply to LCDs like you claim then we wouldn't see tearing on them when vsync is disabled, but that's patently false because we can. In fact, LCDs tear more than CRTs for the very reason that they only run at 60 Hz while CRTs will almost never be run that low when gaming. For this very reason 120 Hz LCDs will tear less without vsync and are superior to 60 Hz variants.

If you send it 120FPS, it is updating 60 times per second for each eye.
Right, by virtue of the display being able to accept and display 120 discrete frames per second, unlike a 60 Hz device that can only do 60 such frames. That illustrates my point of how refresh rate applies to any display device, including an LCD.

I brought up the desktop picture that never changes to illustrate the point that refresh rate on LCD is not the same as CRT, as I said above CRT will always update the display at the refresh rate regardless of the input. So a picture never that never changes will always be redrawn 60 times a second on a 60hz CRT , while on a LCD it is never redrawn unless something changes.
While this is technically true in your particular example, it's not relevant at all to what is being discussed, and doesn't change the other points I made in the slightest.
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Maybe it's just the terminology that is the issue. In CRT's, each pixel had to be refreshed so many times per second by an electron beam. In LCD's, at least TFT's, the "redraw" rate would be more appropriate I guess. Measuring how fast each thin film transistor can twist and untwist.

On the static desktop, CRT's still needed to continuously refresh the pixels. On LCD's the TFT remains static until a change is required. In gaming though, the max theoretical "redraw rate" on a 120hz LCD would be of course 120. Or 60 per eye using 3DV and shutter glasses.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
All the technicals are rather irrelevant, and yes, we have had such discussions quite often. Whether an LCD "refreshes" or not, or whether Vsync is really technically "vsync" for an LCD, doesn't matter.

What matters is that the normal, current LCDs are "locked" at 60hz: even if in a game vsync is turned "OFF" and I set the refresh to 100hz, the actual screen content is only refreshed 60 times/second.

NOW... I always said that personally, for me, it doesn't really matter. I know that hardcore gamers insist on 100+ refresh rates because they think that "60hz" would be unplayable. I have never seen it like that; I have always seen MORE overall benefits with my new 60hz LCD over my old CRT, even if the old CRT was able to do higher refreshes.

But there is no doubt about it that those new 120hz LCDs are extremely nice, and now the final "hurdle" is taken: LCDs are now also attractive for those hardcore CS gamers who think that everything below 100hz is "not playable".

Are the 120hz monitors more expensive? Yes they are. Do the 3d glasses cost something... yes they do. But I am still saying it's in a "tolerable" range, and myself, I would actually spend $100 more on an LCD if I get good image quality in return. The problem is just that I fear the current Viewsonic and Samsung 120hz monitors (aside from the 120hz) MIGHT have a little less image quality than my Benq. I am working on my computer, NOT only gaming. And the other thing is that for me a big FOV (field of view) would be important, especially when it comes to "real 3d" gaming. I really, really wish there would be a 24"-30" model soon with 120hz to make this worthwhile, for my desk.
 

flexy

Diamond Member
Sep 28, 2001
8,464
155
106
Originally posted by: akugami
Originally posted by: flexy
Originally posted by: flexy
Originally posted by: reallyscrued
Originally posted by: flexy
ATI's 3D support was ALWAYS lacking.

That's not true.

Originally posted by: flexy

There's also no question that this is no criteria if someone is not interested in this.

So like...95% of the population of gamers? How many of these people can afford these cards + these TVs?

Yeah, ATI reeeeeeeeeally dropped the ball this time. :roll:

Oups!! No! Not the "3d" like in 3d gaming..i am talking about the 3d with shutter-glasses.

As for expensive....yes the current 120hz monitors are still expensive, and the Nvision shutter glasses cost like $120. But not like that it's more expensive than any other enthusiast's hardware, like watercooling or whatever you buy for your computer. Come one, there are people who do not overclock and buy a $1000 CPU...from that point of view its not *that* expensive. (I am not even talking about 3D projectors, things like that which you could ALSO buy and which would set you down a few grand

By your reasoning a BMW 3 ($45k-80k) series is not that expensive because from the point of view of a person who buys a Bentley ($200k+). Seriously, go tell that to someone who is driving a Kia and see how hard you get laughed off the street.

I don't understand that logic. I do NOT have two GTX295s in SLI, I don't have watercooling, and I still do NOT have the latest mobo and Intel CPU. I still have an "old" Q6600 and still have an 8800GTS. If I were to spend $200-$250 more on a 120hz monitor w/ 3d glasses, it would make more sense to me compared to what other people spend on SLI setups or other hardware. Skimping on the monitor would be unwise, and for a gamer the 120hz is actually a very *real* benefit.
All I am saying is that there are way stupider ways to waste money.
 