FarCry 1.3 available for DL

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Awesome. I just got this game the other day and I love it (man I'm behind the times). Was waiting to get a next gen video card before I played it. Finally got an X800 Pro and I'm loving it .
 

klah

Diamond Member
Aug 13, 2002
7,070
1
0
* Normal Map Compression. Requirements: NVidia: Geforce FX Family or better,
ATI: x800 card or better. This feature is disabled by default. To enable it,
type r_TexNormalMapCompressed 1 in the console after loading a level. Enabling
this feature during the game may take some time - the PC may appear to freeze.
This variable will not be saved when restarting the game. Enabling normal
map compression will prolong the first run through a level, since the initial
compression happens in real time as you move through the level. Subsequent
reloads of the same level will yield better performance, so we recommend
running any benchmark twice and taking the second of the two runs, since this
most closely represents the usual user experience.

* SM 3.0 and SM 2.0x are now enabled by default when graphics settings are set
to "Very High". To see performance increases you must have DirectX 9.0c
installed.

* Anisotropic filtering disabled for some textures (light-maps, several lookup
textures, fall-off maps) for increased performance.

Time for more benchmarks.
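
For anyone wondering what that normal map compression option is actually doing: it's almost certainly the usual two-channel trick (what 3Dc on the X800 and the DXT5-alpha swizzle on GeForce FX/6800 boil down to). Here's a rough sketch of the general idea only, not Crytek's actual code: store just X and Y per texel and have the pixel shader rebuild Z, trading a little precision for roughly half the texture memory.

# Sketch of two-channel normal map compression, assuming the usual
# "drop Z, rebuild it in the shader" approach -- not Crytek's implementation.
import numpy as np

def compress_normals(normals):
    # Keep only X and Y and quantize them to 8 bits each; Z is discarded,
    # which is where the memory saving comes from.
    xy = np.clip(normals[..., :2], -1.0, 1.0)
    return np.round((xy * 0.5 + 0.5) * 255).astype(np.uint8)

def decompress_normals(packed):
    # What the pixel shader does per texel: rebuild Z from X and Y,
    # assuming the original normal was unit length (z = sqrt(1 - x^2 - y^2)).
    xy = packed.astype(np.float32) / 255.0 * 2.0 - 1.0
    z = np.sqrt(np.clip(1.0 - np.sum(xy * xy, axis=-1, keepdims=True), 0.0, 1.0))
    return np.concatenate([xy, z], axis=-1)

# A unit normal survives the round trip with only a small error:
n = np.array([[0.3, -0.4, np.sqrt(1 - 0.3**2 - 0.4**2)]], dtype=np.float32)
print(decompress_normals(compress_normals(n)))  # roughly [0.30, -0.40, 0.86]

That rebuild-in-the-shader step would also explain the hardware requirement, and the real-time compression phase is why the first run through a level stalls.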
 

Gamingphreek

Lifer
Mar 31, 2003
11,679
0
81
Anyone with a GeForce 6 card... any gains noticed?

Does anyone know when a native SM3.0 game is slated to come out?

Nice find Skywalker.

-Kevin
 

PrayForDeath

Diamond Member
Apr 12, 2004
3,489
0
76
Originally posted by: Gamingphreek
Anyone with a GeForce 6 card... any gains noticed?

Does anyone know when a native SM3.0 game is slated to come out?

Nice find Skywalker.

-Kevin

The only title I know of is S.T.A.L.K.E.R. (hate to write that!), which is coming out early 2k5 IIRC.

Does normal map compression offer IQ improvements? Or is it just to improve performance?
 

Brian48

Diamond Member
Oct 15, 1999
3,410
0
0
Wow! First, the Red Sox finally beat the Yankees and now, Crytek finally releases the v1.3 patch. Hell has indeed frozed over :Q
 
otispunkmeyer

Jun 14, 2003
10,442
0
0
Seems you have to enable normal map compression yourself... does it improve performance, or does it improve image quality?
 

element

Diamond Member
Oct 9, 1999
4,635
0
0
Originally posted by: Brian48
Wow! First, the Red Sox finally beat the Yankees and now, Crytek finally releases the v1.3 patch. Hell has indeed frozed over :Q

Frozen, you illiterate Sox bastard.

j/k
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!

F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one hardware or another but to hold out features like this altogether!? A pox on them!
 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
Originally posted by: jiffylube1024
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!

F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one hardware or another but to hold out features like this altogether!? A pox on them!

It probably was impractical to run it on ATI's hardware. It can run, but it would require more passes to do the same thing compared to NV's 6xxx architecture. I also think they should have enabled it, but maybe they had a good reason not to, is what I'm saying. Or maybe they're working on it for the "next" patch, which we all know will come out within '04.
 

CU

Platinum Member
Aug 14, 2000
2,414
51
91
They may not have it working on ATI hardware yet; one card has to be coded for first. Also, remember it is a beta feature.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: otispunkmeyer
Seems you have to enable normal map compression yourself... does it improve performance, or does it improve image quality?

It just compresses the normal maps in the game to improve performance; they would have had to make new normal maps to use it for higher quality.
 

jiffylube1024

Diamond Member
Feb 17, 2002
7,430
0
71
Originally posted by: Marsumane
Originally posted by: jiffylube1024
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!

F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one hardware or another but to hold out features like this altogether!? A pox on them!

It probably was impractical to run it on ATI's hardware. It can run, but it would require more passes to do the same thing compared to NV's 6xxx architecture. I also think they should have enabled it, but maybe they had a good reason not to, is what I'm saying. Or maybe they're working on it for the "next" patch, which we all know will come out within '04.

I lost the link but there was an interview (perhaps at Sharky Extreme) about it and their excuse is pretty weak, and a thinly veiled ploy.

They talk about how for quality they decided to code HDR for FP32 instead of FP16 (they neglect to even mention FP24, which half of the market uses).

Then the interviewer asks something to the effect of "oh, so if it supports FP32 then it should work on the FX 5900s (obviously with lower performance)," and the Crytek guy says something like "well, no, because the 6800 has a type of blending technique that we use (so it's 6800 only)".

It's bollocks - just a convenient excuse to cut out owners of older cards and ATI users for not having FP32 (funny, HDR seems to be working fine in HL2 on both ATI and Nvidia cards).

It shouldn't even be an issue of "ATI vs Nvidia," it's just common sense - support what your userbase runs. Perhaps FP24 would run faster than FP32 on the X800 series vs the 6800 series. So what? It's an apples to oranges comparison anyways, as the 6800 series is running it at a higher quality (whether this would make a significant difference or not in actual visual quality is currently unknown). It's just annoying. ATI has the support for HDR right there. It runs fine in RTHDRIBL and Pixel Shader demos...
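
(Side note on the FP16/FP24/FP32 argument: assuming the commonly cited bit layouts, s10e5 for FP16, s16e7 for ATI's FP24 and s23e8 for FP32, the range and precision differences look roughly like this. Ballpark numbers only.)

# Rough comparison of the three shader precision formats under discussion.
# Assumed layouts: FP16 = 1 sign / 5 exponent / 10 mantissa bits,
# FP24 (ATI R3xx/R4xx) = 1 / 7 / 16, FP32 = 1 / 8 / 23.
def fp_stats(exp_bits, man_bits):
    bias = 2 ** (exp_bits - 1) - 1
    max_val = (2 - 2 ** -man_bits) * 2.0 ** (2 ** exp_bits - 2 - bias)
    rel_step = 2.0 ** -man_bits  # relative precision near 1.0
    return max_val, rel_step

for name, e, m in [("FP16", 5, 10), ("FP24", 7, 16), ("FP32", 8, 23)]:
    max_val, rel_step = fp_stats(e, m)
    print(f"{name}: max value ~{max_val:.3g}, relative step near 1.0 ~{rel_step:.3g}")
# FP16: max value ~6.55e+04, relative step near 1.0 ~0.000977
# FP24: max value ~1.84e+19, relative step near 1.0 ~1.53e-05
# FP32: max value ~3.4e+38,  relative step near 1.0 ~1.19e-07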
 

Keysplayr

Elite Member
Jan 16, 2003
21,209
50
91
Originally posted by: jiffylube1024
Originally posted by: Marsumane
Originally posted by: jiffylube1024
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!

F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one hardware or another but to hold out features like this altogether!? A pox on them!

It probably was impractical to run it on ATI's hardware. It can run, but it would require more passes to do the same thing compared to NV's 6xxx architecture. I also think they should have enabled it, but maybe they had a good reason not to, is what I'm saying. Or maybe they're working on it for the "next" patch, which we all know will come out within '04.

I lost the link but there was an interview (perhaps at Sharky Extreme) about it and their excuse is pretty weak, and a thinly veiled ploy.

They talk about how for quality they decided to code HDR for FP32 instead of FP16 (they neglect to even mention FP24, which half of the market uses).

Then the interviewer asks something to the effect of "oh, so if it supports FP32 then it should work on the FX 5900s (obviously with lower performance)," and the Crytek guy says something like "well, no, because the 6800 has a type of blending technique that we use (so it's 6800 only)".

It's bollocks - just a convenient excuse to cut out owners of older cards and ATI users for not having FP32 (funny, HDR seems to be working fine in HL2 on both ATI and Nvidia cards).

It shouldn't even be an issue of "ATI vs Nvidia," it's just common sense - support what your userbase runs. Perhaps FP24 would run faster than FP32 on the X800 series vs the 6800 series. So what? It's an apples to oranges comparison anyways, as the 6800 series is running it at a higher quality (whether this would make a significant difference or not in actual visual quality is currently unknown). It's just annoying. ATI has the support for HDR right there. It runs fine in RTHDRIBL and Pixel Shader demos...

If my apple could accomplish the same thing your orange could, only quicker, wouldn't you be a little miffed? ATI's next gen will support FP32 IIRC. So, this is a good thing for next gen ATI owners.

 

Marsumane

Golden Member
Mar 9, 2004
1,171
0
0
Originally posted by: jiffylube1024
Originally posted by: Marsumane
Originally posted by: jiffylube1024
BAH! HDR doesn't work on X800 cards, only 6800 cards, even though we've seen these kinds of effects on ATI hardware before. What utter nonsense!

F**king hell. Do these developers really need to pick sides when doing stuff like this? I can put up with it if they make it perform better on one hardware or another but to hold out features like this altogether!? A pox on them!

It probably was impractical to run it on ATI's hardware. It can run, but it would require more passes to do the same thing compared to NV's 6xxx architecture. I also think they should have enabled it, but maybe they had a good reason not to, is what I'm saying. Or maybe they're working on it for the "next" patch, which we all know will come out within '04.

I lost the link but there was an interview (perhaps at Sharky Extreme) about it and their excuse is pretty weak, and a thinly veiled ploy.

They talk about how for quality they decided to code HDR for FP32 instead of FP16 (they neglect to even mention FP24, which half of the market uses).

Then the interviewer asks something to the effect of "oh, so if it supports FP32 then it should work on the FX 5900s (obviously with lower performance)," and the Crytek guy says something like "well, no, because the 6800 has a type of blending technique that we use (so it's 6800 only)".

It's bollocks - just a convenient excuse to cut out owners of older cards and ATI users for not having FP32 (funny, HDR seems to be working fine in HL2 on both ATI and Nvidia cards).

It shouldn't even be an issue of "ATI vs Nvidia," it's just common sense - support what your userbase runs. Perhaps FP24 would run faster than FP32 on the X800 series vs the 6800 series. So what? It's an apples to oranges comparison anyways, as the 6800 series is running it at a higher quality (whether this would make a significant difference or not in actual visual quality is currently unknown). It's just annoying. ATI has the support for HDR right there. It runs fine in RTHDRIBL and Pixel Shader demos...

That is true. It probably isn't that big, due to it taking more passes to do the same thing. I just think they wanted to get a patch out because they are way overdue, and it may have been a bit easier to code for the NV hardware (as Carmack said Doom 3 was, due to longer shader instruction length), or maybe they are just "supported" more by NV or have some sort of bias toward the NV hardware. Even if it did tax the X800 series more, I'm sure some people would be glad to run it at 10x7 instead of 16x12 just to see HDR effects.

Also, why does it take an X800 card to run "normal map compression"? I assume this just compresses the normal maps, as Doom 3 did for any image quality setting below (High?).
 

CraigRT

Lifer
Jun 16, 2000
31,440
5
0
Originally posted by: jiffylube1024
Awesome. I just got this game the other day and I love it (man I'm behind the times). Was waiting to get a next gen video card before I played it. Finally got an X800 Pro and I'm loving it .

Far Cry runs fantastic on my 9800 Pro, one of the reasons I love the game.
 

kylebisme

Diamond Member
Mar 25, 2000
9,396
0
0
Originally posted by: jiffylube1024

I lost the link but there was an interview (perhaps at Sharky Extreme) about it and their excuse is pretty weak, and a thinly veiled ploy...


That doesn't sound right; if it was just a matter of FP32 then the FX cards could do it too. From what I understand, you basically need a 16-bit floating point frame buffer to pull off HDR with alpha channels, so with Far Cry and its alpha-channel-heavy vegetation, HDR is really only a viable option on NV4x hardware. However, the GeForce 6 cards can't do a 16-bit floating point frame buffer and AA at the same time, and at least on my 6800GT the HDR in Far Cry is too damn hard on framerate to really play with it on; so this really isn't much of a loss for ATI.
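
To illustrate the frame buffer point with a toy example (a generic sketch, nothing to do with Crytek's actual pipeline): values brighter than 1.0 simply clamp away in a normal 8-bit style target, while an FP16 target keeps them around so a tone mapping pass can compress them back into displayable range afterwards.

# Toy example of why HDR wants a floating point frame buffer -- a sketch of
# the general idea only, not Far Cry's renderer.
import numpy as np

scene = np.array([0.2, 0.9, 4.0, 60.0], dtype=np.float32)  # linear radiance values

ldr_target = np.clip(scene, 0.0, 1.0)   # 8-bit style target: bright spots blow out
hdr_target = scene.astype(np.float16)   # FP16 target keeps the full range

def reinhard(x):
    # Simple Reinhard-style tone map: squeezes [0, inf) into [0, 1).
    return x / (1.0 + x)

print("clamped LDR:", ldr_target)                           # [0.2 0.9 1.  1. ]
print("tone mapped:", reinhard(hdr_target.astype(np.float32)))
# the 4.0 and 60.0 samples stay distinguishable instead of both clipping to white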
 