Originally posted by: rstrohkirch
I'll toss out some file sizes for textures based on the Source engine's compression standard:
512x512 standard TGA = 768KB
512x512 TGA compressed = 170KB
512x512 TGA with alpha, compressed = 341KB
1024x1024 standard TGA = 3MB
1024x1024 TGA compressed = 682KB
1024x1024 TGA with alpha, compressed = 1.3MB
2048x2048 standard TGA = 12MB
2048x2048 TGA compressed = 2.7MB
2048x2048 TGA with alpha, compressed = 5.3MB
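For what it's worth, those figures line up exactly with standard DXT compression plus a full mipmap chain. A quick sketch of the arithmetic (my reconstruction; the post doesn't name the formats, but 24-bit TGA at 3 bytes/pixel, DXT1 at 4 bits/pixel, DXT5-with-alpha at 8 bits/pixel, and ~1/3 extra for mipmaps reproduce every number above):

```python
def texture_bytes(side, bits_per_pixel, mipmaps=True):
    """Size of a square texture; a full mip chain adds ~1/3 of the base level."""
    base = side * side * bits_per_pixel // 8
    return base * 4 // 3 if mipmaps else base

for side in (512, 1024, 2048):
    raw  = texture_bytes(side, 24, mipmaps=False)  # uncompressed 24-bit TGA
    dxt1 = texture_bytes(side, 4)                  # compressed, no alpha
    dxt5 = texture_bytes(side, 8)                  # compressed, with alpha
    print(f"{side}x{side}: raw {raw // 1024}KB, "
          f"DXT1 {dxt1 // 1024}KB, DXT5 {dxt5 // 1024}KB")
# 512x512:   raw 768KB,   DXT1 170KB,  DXT5 341KB
# 1024x1024: raw 3072KB,  DXT1 682KB,  DXT5 1365KB
# 2048x2048: raw 12288KB, DXT1 2730KB, DXT5 5461KB
```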
Generally speaking, you do not want to compress your normal maps, but it's generally accepted that specular and diffuse maps are OK to compress.
The Source engine, like all new engines, loads all of the resources for the level into system memory and then moves them to and from video memory based on what the client is currently rendering in the scene. So, realistically, 256MB of video RAM is *generally* enough for the most part, but system RAM is a different story.
So let's say we go next-gen with textures on our level, and we have a scene with about 20 unique textures in it, which I would say is about average. We're going next-gen, so all our textures are at least 1024 (we don't need no stinking 512) and a few of them are 2048. Let's say we have fifteen 1024 textures, each with normal and specular maps, and five 2048 textures, each with normal and specular maps, and none of them have alphas. This comes out to about 150MB of texture, and doesn't even include any textures from the models in the scene. An entire level by itself will probably have anywhere from 50-80 unique textures, maybe more. Add in all the textures for the models and effects, the fact that some textures have additional alphas, and the real possibility that the game uses more than just a base/spec/displacement texture set, and you'll come to realize why you see those "low res" textures in games.
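A rough check of that 150MB figure (my arithmetic, assuming the diffuse and specular maps are DXT1-compressed with mipmaps while the normal maps stay uncompressed, per the advice above):

```python
KB = 1024

def dxt1_kb(side):   # 4 bits/pixel, plus ~1/3 for the mip chain
    return side * side // 2 * 4 // 3 // KB

def raw_kb(side):    # uncompressed 24-bit texture
    return side * side * 3 // KB

def material_kb(side):
    # diffuse + specular compressed, normal map left uncompressed
    return 2 * dxt1_kb(side) + raw_kb(side)

total_kb = 15 * material_kb(1024) + 5 * material_kb(2048)
print(f"~{total_kb / KB:.0f} MB of texture data")  # → ~152 MB
```

Which matches the "about 150 megs" claim in the post.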
Retail games for the last 3 years could easily have been using high-resolution textures, but not all their customers at the time had 2GB of RAM, among other things.
I still don't think you'll see all high-resolution textures on these next-gen consoles; they just don't have the hardware for it.
Originally posted by: 5150Joker
Originally posted by: hans030390
You're thinking along the lines of computer terms... not consoles. Consoles generally don't need amazing video cards to have great-looking games. The systems also aren't dedicated to running an OS in the background like Windows.
Are you still living in 1997? In fact, both the 360 and PS3 have operating systems that eat up resources. The PS3 OS uses 32MB of the 256MB of VRAM and 64MB of its 256MB of system memory, vs. the 360 OS, which eats up 32MB of total system memory and will likely keep going up with updates. These consoles are more like PCs than consoles of old. The old arguments about them being closed systems with higher efficiency don't apply anymore either, since most console games are cross-platform and not custom-tailored for any single console. Just look at the 360: it took a year before a custom-tailored game like GoW could show off its potential, while the rest have looked fairly mediocre.
Originally posted by: Genx87
G80>PS3 GPU
Yes, we could probably run circles around the PS3 with the highest-end computers right now.
And no, I don't get excited about playing games at 720p with a controller.
Originally posted by: Ichigo
Originally posted by: Genx87
G80>PS3 GPU
Yes, we could probably run circles around the PS3 with the highest-end computers right now.
And no, I don't get excited about playing games at 720p with a controller.
Name one PC game that is superior to Gears of War graphically in both technical and artistic terms. Ha. Using the same logic, I don't get excited about playing the same old FPS game over and over. (I just play one, CS:S. XD)
Originally posted by: Genx87
Originally posted by: Ichigo
Originally posted by: Genx87
G80>PS3 GPU
Yes, we could probably run circles around the PS3 with the highest-end computers right now.
And no, I don't get excited about playing games at 720p with a controller.
Name one PC game that is superior to Gears of War graphically in both technical and artistic terms. Ha. Using the same logic, I don't get excited about playing the same old FPS game over and over. (I just play one, CS:S. XD)
I haven't played Gears of War yet, but the videos and screenshots I have seen don't look much more advanced than what we are expecting or already have on the PC.
Originally posted by: Ichigo
Originally posted by: Genx87
G80>PS3 GPU
Yes, we could probably run circles around the PS3 with the highest-end computers right now.
And no, I don't get excited about playing games at 720p with a controller.
Name one PC game that is superior to Gears of War graphically in both technical and artistic terms. Ha. Using the same logic, I don't get excited about playing the same old FPS game over and over. (I just play one, CS:S. XD)
Originally posted by: thilan29
Oh BTW, that Killzone trailer looks pretty amazing and I'd be very impressed if that was ingame footage.
Originally posted by: schneiderguy
Originally posted by: thilan29
Oh BTW, that Killzone trailer looks pretty amazing and I'd be very impressed if that was ingame footage.
according to "PSM" magazine, it was in-game footage being rendered at ~5 fps on a dev box with dual 6800 Ultras, and they sped it up to 60 fps for the video. :Q
Originally posted by: beggerking
Originally posted by: 5150Joker
Originally posted by: hans030390
You're thinking along the lines of computer terms... not consoles. Consoles generally don't need amazing video cards to have great-looking games. The systems also aren't dedicated to running an OS in the background like Windows.
Are you still living in 1997? In fact, both the 360 and PS3 have operating systems that eat up resources. The PS3 OS uses 32MB of the 256MB of VRAM and 64MB of its 256MB of system memory, vs. the 360 OS, which eats up 32MB of total system memory and will likely keep going up with updates. These consoles are more like PCs than consoles of old. The old arguments about them being closed systems with higher efficiency don't apply anymore either, since most console games are cross-platform and not custom-tailored for any single console. Just look at the 360: it took a year before a custom-tailored game like GoW could show off its potential, while the rest have looked fairly mediocre.
Umm... I think you misunderstood the term "operating system". It's nothing like Windows and shouldn't need to use up resources the way Windows does, because it doesn't need them. It's a dedicated machine, not a general-purpose machine like a personal computer.
As was witnessed in their demonstrations at GDC, Sony is planning to have an operating system running constantly in the background, just like the one on the Xbox 360 (aka the Dashboard).
Like the Xbox 360's, these features come at a cost, and our sources have told us that they use approximately:
- 64MB of the 256MB of available XDR memory off the Cell CPU
- 32MB of the 256MB of available GDDR3 memory off the RSX chip
- 1 SPE of 7 constantly reserved
- 1 SPE of 7 able to be "taken" by the OS at a moment's notice (games have to give it up if requested)
In the case of the PS3, this equates to 12.5% of the available cores on the CPU always reserved, an additional 12.5% sometimes taken by the OS, 12.5% of the available RSX memory, and 25% of the XDR Cell memory. Balancing these out, one could argue that Sony has removed up to 25% of the available CPU power and 18.75% of the RAM for these features, as well as others that are not mentioned here or will be added in future updates to the PS3 operating system.
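The quoted fractions are easy to verify; a quick check (my arithmetic, based only on the figures above — note the per-SPE percentage only comes out to 12.5% if you count against the Cell's 8 physical SPEs, rather than the 7 usable in the PS3):

```python
# Memory reserved by the PS3 OS, per the figures quoted above.
xdr_total, xdr_os   = 256, 64   # MB, XDR main memory off the Cell
vram_total, vram_os = 256, 32   # MB, GDDR3 memory off the RSX

print(f"XDR reserved:  {xdr_os / xdr_total:.2%}")    # 25.00%
print(f"VRAM reserved: {vram_os / vram_total:.2%}")  # 12.50%
print(f"Total RAM:     {(xdr_os + vram_os) / (xdr_total + vram_total):.2%}")  # 18.75%

# The article's "12.5%" per SPE implies 1 of the Cell's 8 physical SPEs;
# 1 of the 7 SPEs actually enabled on the PS3 would be ~14.3%.
print(f"1/8 = {1/8:.1%}, 1/7 = {1/7:.1%}")  # 12.5%, 14.3%
```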
Well, the PS3 basically has a G71 in it, so it's no surprise that games for it look like they can be played on the cards we have today.
Originally posted by: enz660hp
I was looking at videos and screens of this PS3 game, and to tell you the truth... I'm not at all impressed with its graphics. Some of the textures are low resolution, and I think this game could easily be played on the cards we have today. I was expecting much more from something that claims to be twice as powerful as the Xbox 360. Opinions, anyone?
Originally posted by: m21s
Launch titles are always a little rough around the edges.
You won't be saying this come next year, once games are using the full potential of the PS3.
Gears of War for the Xbox 360 came out how long after the system released? There you go.
Patience
Originally posted by: thilan29
Originally posted by: schneiderguy
Originally posted by: thilan29
Oh BTW, that Killzone trailer looks pretty amazing and I'd be very impressed if that was ingame footage.
according to "PSM" magazine, it was in-game footage being rendered at ~5 fps on a dev box with dual 6800 Ultras, and they sped it up to 60 fps for the video. :Q
How can you "speed it up"?
Originally posted by: schneiderguy
Originally posted by: thilan29
Originally posted by: schneiderguy
Originally posted by: thilan29
Oh BTW, that Killzone trailer looks pretty amazing and I'd be very impressed if that was ingame footage.
according to "PSM" magazine, it was in-game footage being rendered at ~5 fps on a dev box with dual 6800 Ultras, and they sped it up to 60 fps for the video. :Q
How can you "speed it up"?
Take a screenshot of every frame, then play the screenshots back at 60 frames per second instead of the 5 fps it was being rendered at in real time.
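In other words, it's an offline render: capture every frame, then retime the playback. A toy sketch of the bookkeeping (the trailer length here is my own hypothetical; only the 5 fps and 60 fps figures come from the thread):

```python
# Engine renders at 5 fps in real time, but each captured frame stands
# for 1/60 s of game time, so replaying the captures at 60 fps looks smooth.
render_fps, playback_fps = 5, 60

seconds_of_video = 30                      # hypothetical trailer length
frames_needed = seconds_of_video * playback_fps
capture_time = frames_needed / render_fps  # wall-clock time to render them

print(f"{frames_needed} frames to capture; "
      f"{capture_time:.0f} s (~{capture_time / 60:.0f} min) of rendering")
# → 1800 frames to capture; 360 s (~6 min) of rendering
```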
Originally posted by: schneiderguy
Take a screenshot of every frame, then play the screenshots back at 60 frames per second instead of the 5 fps it was being rendered at in real time.
Originally posted by: Crusader
It appears since G80 wiped ATI off the map
I don't know if your selective reading caught it, but Joker hasn't said that the G80 didn't wipe ATi "off the map".
Originally posted by: Crusader
It appears that since the G80 wiped ATI off the map, Joker has resorted to being a console fanboy (and, just so happens, a fan of the console with the AMD chip in it, even).
I guess if you can't beat 'em... run and hide.
Originally posted by: josh6079
What CPU do you have again?
Originally posted by: thilan29
Originally posted by: schneiderguy
Take a screenshot of every frame, then play the screenshots back at 60 frames per second instead of the 5 fps it was being rendered at in real time.
Oh... I see. Why were they rendering on 6800 Ultra SLI, though? I'm sure the G70 was out by the time they were developing Killzone.
Originally posted by: Crusader
It appears that since the G80 wiped ATI off the map, Joker has resorted to being a console fanboy (and, just so happens, a fan of the console with the AMD chip in it, even).
I guess if you can't beat 'em... run and hide.
Originally posted by: 5150Joker
Originally posted by: Crusader
It appears that since the G80 wiped ATI off the map, Joker has resorted to being a console fanboy (and, just so happens, a fan of the console with the AMD chip in it, even).
I guess if you can't beat 'em... run and hide.
LMAO idiotically entertaining as usual. :thumbsup: