Medal of Honor: Airborne performance preview

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Screw Legion Hardware!
The game doesn't even support DX10.
That's the last time I ever link to them! They go on my do-not-link list along with HardOCP, Legion and the others.

 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
No AA which means the results aren't terribly useful; nVidia can run AA in this game.

I cannot wait to pick up the game next week. :thumbsup:
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: BFG10K
No AA which means the results aren't terribly useful; nVidia can run AA in this game.

I cannot wait to pick up the game next week. :thumbsup:

Blame Epic for not officially supporting AA in Unreal Engine 3!

A lot of games using the Unreal Engine 3 don't have AA support.

I bet you would have to rename the exe file.

For Nvidia: R6Vegas_Game.exe
For ATI: Oblivion.exe

I really think Epic and Valve are whingers!

They complain about everything!

Meanwhile, id Software and Crytek just make their engines without complaint.

id Software's id Tech 5 is a true cross-platform engine that works excellently on PS3, 360, PC and Mac. At QuakeCon, id showed a demo of building a small level, compiling it, and running it on PS3, 360, PC and Mac, all at over 60FPS, which should really impress cross-platform developers.

Epic, on the other hand, complains "oh, we don't have the manpower" about add-on content for Gears of War on 360, or a UT2007 demo for PS3 or 360. They make up the most BS reasons, like claiming the extra content in Gears of War on PC is impossible to port to 360 because a different version of Unreal Engine 3 was used there, or that they can't release a UT2007 demo for the PS3 but can for the PC because they don't have enough manpower. Everything they say about Unreal Engine 3 contradicts what they say when they want to sell an Unreal Engine 3 license. They have also screwed over a lot of PS3 developers, who had to abandon the engine because Epic wouldn't tweak it to work well with the Cell's SPEs. It also took some $$$ convincing from Sony before Epic actually started seriously tweaking it for the PS3 after E3 2007.

Valve, for their part, blame development time and everything else they can on something else. It takes them the same amount of time to do two episodes for HL2 and tweak the Source engine as it took Crytek to design and develop CryEngine 2 and Crysis, and that's with two other dev teams working on Portal and TF2.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Blame Epic for not officially supporting AA in Unreal Engine 3!

A lot of games using the Unreal Engine 3 don't have AA support.
If the application doesn't support it, the driver should have a means to force it unless it's absolutely impossible.

nVidia have driver-level AA in R6 Vegas, Bioshock and MoH Airborne, and that's a big IQ advantage.

I usually praise ATi's driver support, but they've been slack in this area, especially since in theory it should be easier to get their post-filter-style AA working with deferred rendering than regular MSAA/CSAA.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
I bet you would have to rename the exe file.

For Nvidia : R6Vegas_Game.exe
Not with nVidia you don't; all you need is a profile with the R6 Vegas AA flag set.

For ATI : Oblivion.exe
If that even applies AA, you'd lose application-specific optimizations, so it's not a good solution.
 

tuteja1986

Diamond Member
Jun 1, 2005
3,676
0
0
Originally posted by: BFG10K
Blame Epic for not officially supporting AA in Unreal Engine 3!

A lot of games using the Unreal Engine 3 don't have AA support.
If the application doesn't support it, the driver should have a means to force it unless it's absolutely impossible.

nVidia have driver-level AA in R6 Vegas, Bioshock and MoH Airborne, and that's a big IQ advantage.

I usually praise ATi's driver support, but they've been slack in this area, especially since in theory it should be easier to get their post-filter-style AA working with deferred rendering than regular MSAA/CSAA.

I think ATI would like it if developers actually let ATI tweak and fix the game engine so it would run well on ATI cards, like they let Nvidia do, and not ship with major bugs like World in Conflict and Bioshock did. A dude owning an ATI card had not only performance issues but also a game-stopping bug where the floor would completely disappear.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: tuteja1986
Originally posted by: BFG10K
Blame Epic for not officially supporting AA in Unreal Engine 3!

A lot of games using the Unreal Engine 3 don't have AA support.
If the application doesn't support it, the driver should have a means to force it unless it's absolutely impossible.

nVidia have driver-level AA in R6 Vegas, Bioshock and MoH Airborne, and that's a big IQ advantage.

I usually praise ATi's driver support, but they've been slack in this area, especially since in theory it should be easier to get their post-filter-style AA working with deferred rendering than regular MSAA/CSAA.

I think ATI would like it if developers actually let ATI tweak and fix the game engine so it would run well on ATI cards, like they let Nvidia do, and not ship with major bugs like World in Conflict and Bioshock did. A dude owning an ATI card had not only performance issues but also a game-stopping bug where the floor would completely disappear.

Nvidia pays out money to put their logo on the game in a non-skippable ad during loading. That's why they usually have better initial performance.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Where do you enable DX10 for this game? I have the game and have been slowly playing through it, but never saw an option for DX10 anywhere whatsoever.

I'm running Vista x64 and an HD2900XT, before you ask.
 

Modelworks

Lifer
Feb 22, 2007
16,240
7
76
They must be editing the files.
There's an option in Unreal 3 titles for DX10.

I found the game fun, but way too short.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
After reading forums, it seems there is no DX10 option at all for Medal of Honor. It's DX9 only, and editing the files does nothing at all (hence why screenshots of both modes are identical). Legionhardware is being slammed hard over at Rage3D.
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
Nvidia pays out money to put their logo on the game in a non-skipable ad during loading
TWIMTBP is a total joke of a program, but I don't believe any money changes hands with it.

After reading forums it seems that there is no DX10 option at all for Medal of Honor.
So how do you explain the large performance difference with nVidia between the two modes then?

Perhaps it's ATi not rendering DX10 properly.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: BFG10K
Nvidia pays out money to put their logo on the game in a non-skipable ad during loading
TWIMTBP is a total joke of a program, but I don't believe any money changes hands with it.

After reading forums it seems that there is no DX10 option at all for Medal of Honor.
So how do you explain the large performance difference with nVidia between the two modes then?

Perhaps it's ATi not rendering DX10 properly.

1) There is no option in any menu or config utility that offers it.
2) On BOTH platforms the game is 100% graphically identical in every single way when putting DX10 in the files (supposedly enabling it?).

I don't see Legionhardware telling us how they magically enabled DX10. It's a total joke, TBH.

They tested DX9 on XP and their supposed DX10 on Vista, likely under the assumption that in Vista the game runs DX10 by default. They also used Vista x64, which could explain the performance difference between DX9 and DX10: Vista x64 drivers, plus the slight hit from Vista itself. Logical, right?
 

BFG10K

Lifer
Aug 14, 2000
22,709
2,980
126
They tested DX9 on XP and their supposed DX10 on Vista, likely under the assumption that in Vista the game runs DX10 by default. They also used Vista x64, which could explain the performance difference between DX9 and DX10: Vista x64 drivers, plus the slight hit from Vista itself. Logical, right?
Yes, I thought they used Vista for everything; I didn't realize they ran XP for DX9.

That would certainly explain the difference, and it actually puts ATi in a good light because their Vista drivers are clearly more optimal than nVidia's.
 

Drayvn

Golden Member
Jun 23, 2004
1,008
0
0
Runs bloody great on my old X1900XT and dual-core E6600.

1680x1050 with everything on full, and it ran at a constant 30fps and above. That's a heck of an improvement in PC optimization compared to R6 Vegas, which ran like crap even with half the settings.

I found the game great fun, if short-lived, but there is quite a bit of replay value in it.
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Does the DX10 install include DX9 inside it for backwards compatibility?

Since there are performance differences, it seems it has to be using DX10 somewhere. The graphics should look different, but DX10 is also touted for performance increases. Is it possible that MOH:A decides internally whether to use DX10?
 

acole1

Golden Member
Sep 28, 2005
1,543
0
0
How CPU limited is the game?

I run into some chopping on my PC. Any ideas why?

X2 4400+ @ stock speeds
4x512 HyperX PC3200
BFG 8800GTS 640MB
Vista Ultimate 32bit
1280x1024
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: speckedhoncho
Does the DX10 install include DX9 inside it for backwards compatibility?

Since there are performance differences, it seems it has to be using DX10 somewhere. The graphics should look different, but DX10 is also touted for performance increases. Is it possible that MOH:A decides internally whether to use DX10?

DX10 install? I didn't get an option.

Read what I said before. They tested DX10 under Vista x64 under the assumption that Vista runs this game in DX10; DX9 was on XP. I now think the FPS differences come from x64 drivers and Vista itself being slightly slower than XP.
 

Matt2

Diamond Member
Jul 28, 2001
4,762
0
0
Ok, so after reading this, I guess it's my system that's causing extreme slowdowns with my 2900XT, Vista and MoH:A.

I am getting nowhere near 60 fps at 1920x1200, High quality, in Vista.

I've been blaming ATI's drivers for the poor performance in MoH:A under Vista, but Legion Hardware didn't seem to have a problem. The only explanation now is the one I refused to believe in the beginning: my CPU is too slow. I still find it very hard to believe that an Opty @ FX-60 speeds would limit me so badly at 1920x1200 that frames drop into the single digits during firefights, even when lowering the resolution or detail levels.

The only time I ever got a solid 60fps playing this game maxed at native res was during the training drops at the very beginning, when I was getting 60-80 fps. As soon as the first firefights started, I was in the 5-40 fps range.

That being said, I would love to know which level Legion Hardware benched the game on.
 

cmdrdredd

Lifer
Dec 12, 2001
27,052
357
126
Originally posted by: Matt2
Ok, so after reading this, I guess it's my system that's causing extreme slowdowns with my 2900XT, Vista and MoH:A.

I am getting nowhere near 60 fps at 1920x1200, High quality, in Vista.

I've been blaming ATI's drivers for the poor performance in MoH:A under Vista, but Legion Hardware didn't seem to have a problem. The only explanation now is the one I refused to believe in the beginning: my CPU is too slow. I still find it very hard to believe that an Opty @ FX-60 speeds would limit me so badly at 1920x1200 that frames drop into the single digits during firefights, even when lowering the resolution or detail levels.

The only time I ever got a solid 60fps playing this game maxed at native res was during the training drops at the very beginning, when I was getting 60-80 fps. As soon as the first firefights started, I was in the 5-40 fps range.

That being said, I would love to know which level Legion Hardware benched the game on.

Still waiting to see how they enabled DX10, too.
 

speckedhoncho

Member
Aug 3, 2007
156
0
0
Originally posted by: cmdrdredd
Originally posted by: speckedhoncho
Does the DX10 install include DX9 inside it for backwards compatibility?

Since there are performance differences, it seems it has to be using DX10 somewhere. The graphics should look different, but DX10 is also touted for performance increases. Is it possible that MOH:A decides internally whether to use DX10?

DX10 install? I didn't get an option.

Read what I said before. They tested DX10 under Vista x64 under the assumption that Vista runs this game in DX10; DX9 was on XP. I now think the FPS differences come from x64 drivers and Vista itself being slightly slower than XP.

Sorry, I didn't mean DX10 had to be installed before use; I thought it came pre-installed. I forgot they used XP for DX9.

They should've shown dxdiag's settings to clarify what was going on in Vista.
 

nullpointerus

Golden Member
Apr 17, 2003
1,326
0
0
Originally posted by: Matt2
Ok, so after reading this, I guess it's my system that's causing extreme slowdowns with my 2900XT, Vista and MoH:A.

I am getting nowhere near 60 fps at 1920x1200, High quality, in Vista.

I've been blaming ATI's drivers for the poor performance in MoH:A under Vista, but Legion Hardware didn't seem to have a problem. The only explanation now is the one I refused to believe in the beginning: my CPU is too slow. I still find it very hard to believe that an Opty @ FX-60 speeds would limit me so badly at 1920x1200 that frames drop into the single digits during firefights, even when lowering the resolution or detail levels.

The only time I ever got a solid 60fps playing this game maxed at native res was during the training drops at the very beginning, when I was getting 60-80 fps. As soon as the first firefights started, I was in the 5-40 fps range.

That being said, I would love to know which level Legion Hardware benched the game on.

I doubt your CPU would limit you like this. I am downloading the demo, and I have a 2900XT, C2D, and Vista, so I will check out your results. Is there any particular place in the demo level where your slowdowns began? Or do you only have the full copy of the game?

EDIT:

Check out this part of the review:
http://www.legionhardware.com/document.php?id=683&p=3

Then by under-clocking the Core 2 Duo E6850 from 3.00GHz to just 1.60GHz, an 87% frequency advantage for the E6850 yielded just a 15.6% performance advantage at 1920x1200.
Remember that the 1.60GHz C2D still managed over 50fps at 1920x1200 w/ 2900XT.

- What are the AA levels (if any) being used in these tests?
- Do you have AA forced on in CCC by any chance?

It looks like the reviewer did not use AA:

I will try and come back to it soon. There are so many games coming out right now and so little time. I am testing ETQW (Quake Wars) right now and I have done it with no AA, 2xAA and 4xAA. The Radeon HD 2900XT does well with no AA, as you would expect, but it is still a little slower than the GTX. However, turn 2xAA or 4xAA on and the Radeon HD 2900XT falls behind the GTS.

It would be much the same in MOHA if you were to force AA on, so those with the Radeon HD 2900XT will have to avoid doing so. I have to admit that on a 24" LCD at 1920x1200 there is little need really, at least from what I have seen.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: Matt2
Ok, so after reading this, I guess it is my system that is making my have extreme slowdowns with my 2900XT, Vista and MoH:A.

I am getting no where near 60 fps at 1920x1200, High quality and in Vista.

I've been blaming ATI's drivers for poor performance in MoH:A using vista, but Legion hardware didnt seem to have a problem. The only explanation now is the one I refused to believe in the beginning. My CPU is too slow. I still find it very hard to believe that a Opty @ FX60 would limit me so bad at 1920x1200 that frames drop into the single digits during firefights, even when lowering the resolution or detail levels.

The only time I ever got a solid 60fps playing this game maxed at native res was the training drops in the very beginning when I was getting 60-80 fps. As soon as the first firefights started, I was in the 5-40 fps range.

That being said, I would love to know which level legion hardware benched the game.

Get cmdrdredd to underclock his CPU to 2.2GHz (or whatever the equivalent of a 2.6GHz X2 is for an E6400) and play it for a while. I must warn you, cmdrdredd: I underclocked my CPU to 2.0GHz the other day to run a comparison for somebody and................IT................WAS..............SOOOOOOOOOOOOOOOOOOOOOOOO......SLOWWWWWWWWWWW..... It was actually painful, I must say, but it was for a good cause.
 

bryanW1995

Lifer
May 22, 2007
11,144
32
91
Originally posted by: nullpointerus
Originally posted by: Matt2
Ok, so after reading this, I guess it's my system that's causing extreme slowdowns with my 2900XT, Vista and MoH:A.

I am getting nowhere near 60 fps at 1920x1200, High quality, in Vista.

I've been blaming ATI's drivers for the poor performance in MoH:A under Vista, but Legion Hardware didn't seem to have a problem. The only explanation now is the one I refused to believe in the beginning: my CPU is too slow. I still find it very hard to believe that an Opty @ FX-60 speeds would limit me so badly at 1920x1200 that frames drop into the single digits during firefights, even when lowering the resolution or detail levels.

The only time I ever got a solid 60fps playing this game maxed at native res was during the training drops at the very beginning, when I was getting 60-80 fps. As soon as the first firefights started, I was in the 5-40 fps range.

That being said, I would love to know which level Legion Hardware benched the game on.

I doubt your CPU would limit you like this. I am downloading the demo, and I have a 2900XT, C2D, and Vista, so I will check out your results. Is there any particular place in the demo level where your slowdowns began? Or do you only have the full copy of the game?

EDIT:

Check out this part of the review:
http://www.legionhardware.com/document.php?id=683&p=3

Then by under-clocking the Core 2 Duo E6850 from 3.00GHz to just 1.60GHz, an 87% frequency advantage for the E6850 yielded just a 15.6% performance advantage at 1920x1200.
Remember that the 1.60GHz C2D still managed over 50fps at 1920x1200 w/ 2900XT.

- What are the AA levels (if any) being used in these tests?
- Do you have AA forced on in CCC by any chance?

It looks like the reviewer did not use AA:

I will try and come back to it soon. There are so many games coming out right now and so little time. I am testing ETQW (Quake Wars) right now and I have done it with no AA, 2xAA and 4xAA. The Radeon HD 2900XT does well with no AA, as you would expect, but it is still a little slower than the GTX. However, turn 2xAA or 4xAA on and the Radeon HD 2900XT falls behind the GTS.

It would be much the same in MOHA if you were to force AA on, so those with the Radeon HD 2900XT will have to avoid doing so. I have to admit that on a 24" LCD at 1920x1200 there is little need really, at least from what I have seen.
I don't know that I would take all of their info at face value. Did they actually play the entire game at 1.6GHz? No, they probably just ran it for a few minutes at 1.6GHz. Matt2 is experiencing horrible slowdowns at specific points in the game.

 