nVidia 3D Vision Surround Review


NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
Has anyone seen any multi-monitor setups using 40" or larger HDTVs?
I think it would take some aggressive adapter usage to get around the DP requirement:

1) HDMI to HDMI
2) DVI to HDMI
3) DP active adapter to DVI, then DVI to HDMI

The biggest issue would be the larger bezel sizes on HDTVs, but newer ones are getting reasonably slim. Another question I'm unclear on is whether the newer HDTVs with advertised 120Hz specs can be used as monitors for 3D gaming; I'm pretty sure the HDTVs advertising 120Hz are not using the same criteria as computer monitors that claim 120Hz. In other words, I wouldn't be able to use a 120Hz-rated HDTV to get 3D gaming from my HTPC.

Yeah, Ledfoot's sim setup was the final straw in convincing me to purchase an Eyefinity setup.
http://www.youtube.com/watch?v=7X2Bvb8-cqY

Many HDTVs have at least one DVI input, BTW, and converting DVI to HDMI is 'free': no quality loss, because it's the same signal on a different connector.

Bezels seem like less and less of an issue to me the larger the monitor is, because you have a better screen-real-estate-to-bezel ratio. HDTVs advertised as 120Hz are generally 60Hz-input panels that generate intermediary frames through interpolation to provide a fake 120Hz output, so they won't work for 120Hz 3D gaming. AFAIK every HDTV that is true 120Hz is marketed as 3D Ready.
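A rough pixel-clock check backs that up (a quick sketch using my own ballpark timing numbers, not from any spec sheet):

```python
# Ballpark pixel clock needed to accept a true 1080p signal at 120 Hz.
# Totals are 1920x1080 active plus approximate reduced blanking;
# exact figures vary by display.
total_w, total_h, refresh = 2080, 1120, 120

pixel_clock_mhz = total_w * total_h * refresh / 1e6
print(f"~{pixel_clock_mhz:.0f} MHz pixel clock required")
# -> ~280 MHz. Single-link DVI and early HDMI top out at a 165 MHz
# pixel clock, so an HDTV with only those inputs can't be taking a
# true 120 Hz 1080p signal, whatever the marketing says.
```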
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
For the life of me I don't understand why ATI went for DisplayPort as a mandatory plug. I really don't, and I don't know if I ever will.

It was a really bad decision.

Nevertheless, now I know that I won't be using Nvidia's Surround, just like I won't be using ATI's Eyefinity. I wouldn't spring for a $100 adapter, so why the hell would I spring for a second $300 card?

Well, if you remember AT's great interview with the lead ATI designer, he said they were struggling over die size vs. features and had to make tough decisions about what to cut and what to keep in. Adding a third on-die RAMDAC takes up extra space on the die, whereas DisplayPort is basically free. So the choice was: increase the die size when they were already struggling to fit in all the features they wanted, or require DisplayPort for the minority interested in Eyefinity.

They'd also made a commitment to push DisplayPort; while this may have been more of a shove, it's at least a shove in the right direction, IMO.

So it's pretty understandable why they require DisplayPort for a single card. Not quite as understandable is why they don't support using 3x DVI between 2 cards, since nVidia seems to have managed it quite well.
 

gorobei

Diamond Member
Jan 7, 2007
3,777
1,226
136
So it's pretty understandable why they require DisplayPort for a single card. Not quite as understandable is why they don't support using 3x DVI between 2 cards, since nVidia seems to have managed it quite well.

The reason was covered in the interview. At 3 monitors a software solution like NV Surround is OK, but once you go to higher resolutions (3x 2560x1600) or a 6-monitor setup, the amount of buffer transfer over PCIe or the SLI/CrossFire bridge gets to be too much. It may get even worse if you try 3-way or 4-way SLI/CrossFire.
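To put rough numbers on that (assuming 32-bit color and 2-way AFR, so every other frame has to cross the link):

```python
# Size of the frame that must cross PCIe / the bridge each time the
# card without the attached displays finishes a frame (32-bit color).
def frame_mb(w, h, bytes_per_px=4):
    return w * h * bytes_per_px / 1024**2

setups = {
    "3x 1920x1080 ": (3 * 1920, 1080),
    "3x 2560x1600 ": (3 * 2560, 1600),
    "3x2 1920x1200": (3 * 1920, 2 * 1200),
}

for name, (w, h) in setups.items():
    mb = frame_mb(w, h)
    # at 60 fps in 2-way AFR, ~30 of these transfers happen per second
    print(f"{name}: {mb:5.1f} MB/frame -> ~{mb * 30 / 1024:.2f} GB/s")
```

The SLI bridge is commonly quoted at around 1 GB/s, so the bigger configurations would have to spill onto PCIe, which is also carrying everything else.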

Also, the Tom's Hardware review points out that a number of games don't work correctly with 3D (shadows and lighting won't render at the correct offset) and that a number of games require dropping to a lower resolution in order to play well.
 

Dribble

Platinum Member
Aug 9, 2005
2,076
611
136
The reason was covered in the interview. At 3 monitors a software solution like NV Surround is OK, but once you go to higher resolutions (3x 2560x1600) or a 6-monitor setup, the amount of buffer transfer over PCIe or the SLI/CrossFire bridge gets to be too much. It may get even worse if you try 3-way or 4-way SLI/CrossFire.

Also, the Tom's Hardware review points out that a number of games don't work correctly with 3D (shadows and lighting won't render at the correct offset) and that a number of games require dropping to a lower resolution in order to play well.

Surely it's the other way around? If each nVidia card is driving a screen then there is less transfer, as each can deal with its own portion of the picture; with the ATI solution the whole lot needs to be sent to one card to output. This is backed up by early reviews showing SLI scaling to be much better than CrossFire scaling for multi-monitor.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Surely it's the other way around? If each nVidia card is driving a screen then there is less transfer, as each can deal with its own portion of the picture; with the ATI solution the whole lot needs to be sent to one card to output.

NV doesn't render only the screens hooked up to each card. It does the full output on one card, and the full output on the other card, in AFR mode. Otherwise, for one thing, you'd have a 2/3 vs. 1/3 split between the cards and they would be rendering different loads.

In a situation with 2 monitors on A and 1 monitor on B:
2/3 of a frame needs to go from B to A every time B renders.
1/3 of a frame needs to go from A to B every time A renders.

So every pair of frames results in the transfer of one full frame of data (where the frame is 5760x1080, so each third is 1920x1080 in a 3x setup at that res).

With ATI you have 3 monitors on card A and 0 on card B.
That means you need to transfer a full frame from B to A every time B renders,
and you transfer 0 frames from A to B every time A renders (since it's doing the outputting).
Total transfer: 1 full frame.

Or at least that's how I would understand it to work. Basically the same amount of data is sent across the link in both cases, but with the NV method the load is split into 1/3 and 2/3 and goes in both directions (which may improve performance if it's a bi-directional link).
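Putting numbers on the above (a quick sketch of the same math, assuming 32-bit color):

```python
# Per-frame-pair link traffic for the two approaches described above,
# assuming one 5760x1080 jumbo frame at 32-bit color and 2-way AFR.
FULL_MB = 5760 * 1080 * 4 / 1024**2  # one full frame, ~23.7 MB

# NV: 2 monitors on card A, 1 on card B
nv_b_to_a = FULL_MB * 2 / 3  # B's rendered frame: 2/3 goes to A's outputs
nv_a_to_b = FULL_MB * 1 / 3  # A's rendered frame: 1/3 goes to B's output

# ATI: all 3 monitors on card A
ati_b_to_a = FULL_MB         # B's whole frame goes to A
ati_a_to_b = 0.0             # A drives its own frame directly

print(f"NV : {nv_b_to_a:.1f} MB B->A + {nv_a_to_b:.1f} MB A->B per frame pair")
print(f"ATI: {ati_b_to_a:.1f} MB B->A + {ati_a_to_b:.1f} MB A->B per frame pair")
# Same ~23.7 MB total either way; NV spreads it across both directions
# in smaller pieces, while ATI sends it in one burst every other frame.
```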
 

gorobei

Diamond Member
Jan 7, 2007
3,777
1,226
136
Surely it's the other way around? If each nvidia card is driving a screen then there is less transfer as each can deal with it's own portion of the picture, with the ati solution the whole lot needs to be sent to one card to output. This is backed up by early reviews showing Sli scaling to be much better then xfire scaling for multi monitor.

Nope, read the reviews. It is pure AFR: one card renders the entire extra-wide frame and then splits the buffer to send the image data to the other card's outputs. The next frame is rendered by the previously idle card.

The SLI performance advantage is going to be limited. More cards, larger resolutions, and more monitors mean that eventually the PCIe paths and the SLI bridges will hit their limit. While the general performance/memory edge gives the 480 an overall lead and a scaling lead, at anything over 3 monitors (i.e. 5x portrait or 3x2 landscape, meaning SLI of 3-output cards) the amount of data crossing over won't likely be overcome by that horsepower lead.
 

NoQuarter

Golden Member
Jan 1, 2001
1,006
0
76
NV doesn't render only the screens hooked up to each card. It does the full output on one card, and the full output on the other card, in AFR mode. Otherwise, for one thing, you'd have a 2/3 vs. 1/3 split between the cards and they would be rendering different loads.

In a situation with 2 monitors on A and 1 monitor on B:
2/3 of a frame needs to go from B to A every time B renders.
1/3 of a frame needs to go from A to B every time A renders.

So every pair of frames results in the transfer of one full frame of data (where the frame is 5760x1080, so each third is 1920x1080 in a 3x setup at that res).

With ATI you have 3 monitors on card A and 0 on card B.
That means you need to transfer a full frame from B to A every time B renders,
and you transfer 0 frames from A to B every time A renders (since it's doing the outputting).
Total transfer: 1 full frame.

Or at least that's how I would understand it to work. Basically the same amount of data is sent across the link in both cases, but with the NV method the load is split into 1/3 and 2/3 and goes in both directions (which may improve performance if it's a bi-directional link).

That was what I surmised too. With NV, every frame one of the cards sends a fraction of a screen; with ATI, every other frame one card sends a full screen.

NV's method is a little trickier on the driver side but has more consistent bandwidth demands; ATI's method is easier to implement but causes bandwidth spikes, which could account for some of the stuttering some Crossfire Eyefinity users have complained about.

Provided NV's drivers handle it well, it's easy to see SLI Surround as a superior solution to Crossfire Eyefinity, though single-card Eyefinity should have an advantage against cheaper SLI Surround configs (2x GTX 260 vs. a single 5870).
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
That was what I surmised too. With NV, every frame one of the cards sends a fraction of a screen; with ATI, every other frame one card sends a full screen.

NV's method is a little trickier on the driver side but has more consistent bandwidth demands; ATI's method is easier to implement but causes bandwidth spikes, which could account for some of the stuttering some Crossfire Eyefinity users have complained about.

Provided NV's drivers handle it well, it's easy to see SLI Surround as a superior solution to Crossfire Eyefinity, though single-card Eyefinity should have an advantage against cheaper SLI Surround configs (2x GTX 260 vs. a single 5870).

I wouldn't say superior generally, because it's limited to three monitors, but in 3-monitor configurations it's superior.
When it comes to flexibility, Eyefinity is the better solution, since it can scale up to 6 monitors, and more configurations and layout possibilities are available because of that.

DualHead2Go + NV Surround might be an interesting experiment to get 6-monitor output, though.
 

ViRGE

Elite Member, Moderator Emeritus
Oct 9, 1999
31,516
167
106
Just in case anyone is still confused about AFR, there was a nice NV-supplied diagram in some of the articles that spells it out.

[NV-supplied AFR diagram]

Each card is rendering a complete jumbo frame; there's no frame splitting going on. This is also true for 3D Vision Surround, where a single card renders both the left eye and the right eye while the second card works on the next true frame.

I haven't seen an article definitively describe AMD's method. I'm assuming they have to be doing split-frame rendering in CF mode, because they scale to higher resolutions where the framebuffer would eat the VRAM alive in 6/24-monitor configurations. Does anyone know for sure?
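For a sense of scale, a back-of-the-envelope on the framebuffer alone (my assumptions: 1920x1200 panels, 32-bit color, double buffering; AA samples and render targets multiply these further):

```python
# Approximate front + back buffer footprint for one big unified surface.
def fb_mb(cols, rows, w=1920, h=1200, bytes_per_px=4, buffers=2):
    return cols * w * rows * h * bytes_per_px * buffers / 1024**2

print(f"3x1 Eyefinity:   {fb_mb(3, 1):4.0f} MB")  # ~53 MB, trivial
print(f"3x2 Eyefinity 6: {fb_mb(3, 2):4.0f} MB")  # ~105 MB, still fine
print(f"6x4 (24 panels): {fb_mb(6, 4):4.0f} MB")  # ~422 MB before AA
```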
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
I wouldn't say superior generally, because it's limited to three monitors, but in 3-monitor configurations it's superior.
When it comes to flexibility, Eyefinity is the better solution, since it can scale up to 6 monitors, and more configurations and layout possibilities are available because of that.

DualHead2Go + NV Surround might be an interesting experiment to get 6-monitor output, though.

While I'm still getting used to the black bezel separation on a 3-monitor setup, I can't see a 6-monitor solution being used by anybody for gaming. Vertically splitting the scene across 3 monitors isn't ideal, but at least you have one full center screen where the main action takes place. Using 6 monitors would cut a horizontal swath right through your center screen, splitting it into a top and bottom section. I can't see anybody enjoying that at all. Actually, 9 screens would be better than 6, as you would regain a full center screen.

Here is a shot of my setup.

[photo of the author's triple-monitor setup]

This takes getting used to, but now imagine a solid 1" or greater black bar cutting right through the center horizontally. Does not make a nice sandwich.
What I think 6 monitors would be absolutely terrific for would be 2D apps in a work environment, where you can move each work window to a specific screen as needed. For example, that type of setup would be ideal for stockbrokers monitoring numerous commodities.
 
Last edited:

dug777

Lifer
Oct 13, 2004
24,778
4
0
While I'm still getting used to the black bezel separation on a 3-monitor setup, I can't see a 6-monitor solution being used by anybody for gaming. Vertically splitting the scene across 3 monitors isn't ideal, but at least you have one full center screen where the main action takes place. Using 6 monitors would cut a horizontal swath right through your center screen, splitting it into a top and bottom section. I can't see anybody enjoying that at all. Actually, 9 screens would be better than 6, as you would regain a full center screen.

Here is a shot of my setup.

<snip>

This takes getting used to, but now imagine a solid 1" or greater black bar cutting right through the center horizontally. Does not make a nice sandwich.
The only thing 6 monitors would be absolutely terrific for would be 2D apps in a work environment, where you can move each work window to a specific screen as needed.


Fair enough, although I note that in Anand's review he said:

The single 3x2 group is the problematic configuration. For games you play in the third person, it's great. For first person shooters however, playing on an Eyefinity 6 setup puts you at a disadvantage due to the crosshair problem.


I haven't used any form of Eyefinity or nVidia Surround, so I am not a learned authority, but clearly at least one person who has tried it thinks it is great for some forms of gaming.
 
Last edited:

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Predictable as in "I hate how Dug is in like the future and sh!t"?
Or predictable as in what I posted would be something that anyone possessed of logical thought would post?

3 monitors is currently the max number of screens Nvidia can utilize for gaming, but I think you knew that and asked anyway. But OK.
No matter what kind of game you're playing (RPG, FPS, sim), you're not going to want the main focus screen to be cut in half horizontally. Your spellcasting character's torso and head would be on screen 2 and legs on screen 5 below it. It's extremely distracting. You either go 3 screens, or 9 (which only ATI can support). 6 is the oddball, as 12 would be an oddball.
Predictable or not, what I'm saying makes sense.
 

GaiaHunter

Diamond Member
Jul 13, 2008
3,650
218
106
No matter what kind of game you're playing (RPG, FPS, sim), you're not going to want the main focus screen to be cut in half horizontally. Your spellcasting character's torso and head would be on screen 2 and legs on screen 5 below it. It's extremely distracting. You either go 3 screens, or 9 (which only ATI can support).

I can see some use for RTS, with a proper implementation.

Another possible use would be to have a game on 3 screens and something else on the top 3 screens (not sure how many games that would be practical in).

But since even 3-monitor support is still in its infancy in gaming, I generally agree that 6 screens simply aren't that useful for gaming.

6 is the oddball, as 12 would be an oddball.
Predictable or not, what I'm saying makes sense.

It does.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Predictable as in "I hate how Dug is in like the future and sh!t"?
Or predictable as in what I posted would be something that anyone possessed of logical thought would post?

3 monitors is currently the max number of screens Nvidia can utilize for gaming, but I think you knew that and asked anyway. But OK.
No matter what kind of game you're playing (RPG, FPS, sim), you're not going to want the main focus screen to be cut in half horizontally. Your spellcasting character's torso and head would be on screen 2 and legs on screen 5 below it. It's extremely distracting. You either go 3 screens, or 9 (which only ATI can support). 6 is the oddball, as 12 would be an oddball.
Predictable or not, what I'm saying makes sense.

It didn't make sense to Anand when he tried it, as his comments in his review clearly state. He said it was 'great' for games you play in the third person. I have a large amount of respect for Anand, and I think it's fair to say that most people would agree he demonstrates plenty of 'logical thought'.

As far as I know, you haven't actually tried 6-monitor Eyefinity, have you? If not, then stay with me. If you have, fair enough, and your feedback is entirely valid.

Looking around at reviews and posts, it seems that plenty of people have gotten used to three screens and bezels and effectively 'no longer notice them/find them distracting' while gaming. Even you yourself appear to be making the transition (if your pic is three 120Hz 3D nVidia Surround capable displays, that is enough to make anyone happy in their pants).

Now, it doesn't seem impossible or even implausible to me that you might also find 6 worked excellently once you had a chance to get used to it and get immersed in playing on a vast combined area of screens. That may be what Anand experienced for some games when he tried it, hence his comment that I provided above.

That is all.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Ok Dug. You're right, I'm wrong. Not going back and forth with you on it.



You made a pretty clear statement that you couldn't see a 6-monitor solution being used by anybody for gaming.
Anand's experience suggests that people can and have used a 6-monitor solution for gaming and enjoyed it.

That's pretty much all I should have said in the first place, I suppose. Ah well, lesson learned.

Of course, it's a point of the most incredible irrelevance to pretty much everyone who is likely to post here, as I can't imagine many takers for six screens and the associated gear to get it all working!
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
Here is a shot from Anand's article using WoW -- this is why 6 may be too distracting to some when it comes to PC gaming -- right across the center, where aiming and sight lines are essential.

http://www.anandtech.com/show/2833

A seamless option would do wonders for a 6-monitor configuration, though.
 

dug777

Lifer
Oct 13, 2004
24,778
4
0
Here is a shot from Anand's article using WoW -- this is why 6 may be too distracting to some when it comes to PC gaming -- right across the center, where aiming and sight lines are essential.

http://www.anandtech.com/show/2833

A seamless option would do wonders for a 6-monitor configuration, though.

Here are some more examples illustrating the issues 6 screens bring:

http://www.anandtech.com/show/3621/amds-radeon-hd-5870-eyefinity-6-edition-reviewed/4

However, the Dirt2 video looks like it would work pretty well.
 

SirPauly

Diamond Member
Apr 28, 2009
5,187
1
0
The choice is always welcome, and it gets the ball rolling. However, even in Dirt 2, the sight lines -- well, the bezels -- are right in the way. There are probably some gamers who find the experience offered by 6 is worth the limitations of the bezels, even in the center. If they find enjoyment, I think that's great. I'm certainly glad there is a choice for 6, but I believe for many, and for me, three is the sweet spot, so to speak. This was my view before I even knew nVidia was going to offer a Surround feature.

Competition is a wonderful thing, and ATI offering Eyefinity may have pushed nVidia to offer a similar feature for their customers. Multi-monitor gaming certainly offers more immersion than one monitor can, and it's another feature that raises the gaming experience potential on both ATI and nVidia platforms -- each with pros and cons.

I was hoping for this feature when nVidia offered their Big Bang 2 driver set a short time ago with multi-monitor and SLI -- now it's here, it's very welcome, and it should only improve as hardware and software mature.
 

Matrices

Golden Member
Aug 9, 2003
1,377
0
0
I have no idea why some of you are trying to deny the plainly obvious: Nvidia has the superior high-end solution for triple-monitor gaming, period.

It's ironic, but it's true. It's been confirmed by performance tests at Tom's and PCPer.com. Crossfire scaling with Eyefinity sucks, simple as that. And then there are the stuttering issues many people seem to have using CF + Eyefinity.

The whole debate about 6-monitor gaming is hilariously irrelevant. Who the hell is going to pay for 6 monitors only to live with a horizontally bisected screen? And even if you can somehow ignore that, are you going to play games at a massive resolution with a single card or a poorly scaling Crossfire solution that will run things like shit? Don't be ridiculous; almost no one is going to do that. Triple-monitor is extravagant enough that it's a niche segment; six-monitor is not even worth talking about when we're discussing consumer choices.

I'm definitely not sold on "3D" Surround, but regular Surround outclasses and outmatches Eyefinity for those gamers who don't want to heavily sacrifice IQ to get a more immersive experience.
 

Lonyo

Lifer
Aug 10, 2002
21,938
6
81
Yeah 6 monitors is a bit weird...9 would be the way to go!!

Or 5.

Also, triple-monitor gaming doesn't have to require a super-high-end setup.
I was doing triple-monitor gaming on my 7800GT years ago; total resolution was ~4000x1050.
It performed fine for racing sims.

It will be interesting to see if NV believes having single card multiple display setups is sensible when it releases its next generation of product.
 
Last edited:

waffleironhead

Diamond Member
Aug 10, 2005
6,940
456
136
I have no idea why some of you are trying to deny the plainly obvious: Nvidia has the superior high-end solution for triple-monitor gaming, period.

It's ironic, but it's true. It's been confirmed by performance tests at Tom's and PCPer.com. Crossfire scaling with Eyefinity sucks, simple as that. And then there are the stuttering issues many people seem to have using CF + Eyefinity.

The whole debate about 6-monitor gaming is hilariously irrelevant. Who the hell is going to pay for 6 monitors only to live with a horizontally bisected screen? And even if you can somehow ignore that, are you going to play games at a massive resolution with a single card or a poorly scaling Crossfire solution that will run things like shit? Don't be ridiculous; almost no one is going to do that. Triple-monitor is extravagant enough that it's a niche segment; six-monitor is not even worth talking about when we're discussing consumer choices.

I'm definitely not sold on "3D" Surround, but regular Surround outclasses and outmatches Eyefinity for those gamers who don't want to heavily sacrifice IQ to get a more immersive experience.


Methinks it's only "irrelevant" because you can't do it with your hardware.

The Surround bezel naysayers were pretty rampant regarding their dislike for 3-monitor setups -- that is, until they tried it. I'm betting a whole slew of people will change their tunes about 6-monitor setups once both vendors offer it.
 