It's not like CDPR doesn't have a history of over-promising on visuals and having to dial it way back; The Witcher 3 had the same thing in its early trailers. They just way over-promised on consoles: show off stuff that will barely run on bleeding-edge PC hardware, then ship the 'potato' version that barely runs at that. They really should have just released on PC this year and put it out on console in six months, or next Christmas. Heck, by then they could even push PS5 and XBSX versions that don't suck. PC users are much more understanding of bleeding-edge titles and beta software.
Yup, totally agree. Even a 1080 Ti struggles to hit 30 fps at their recommended 1440p Ultra settings.
I had to make several changes to settings before getting playable frame rates; one of them involved editing the memory config file in the game folder that sets how much GPU and CPU memory the game can use. They will fix it eventually, but for now I'm seeing more stable fps than before.
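For context, the file circulating in community fixes at the time was reportedly `engine/config/memory_pool_budgets.csv` in the install folder, with `PoolCPU` and `PoolGPU` budgets for the PC platform row. The layout and numbers below are a rough sketch reconstructed from community posts, not the verbatim file, so check your own install (and back up the original) before editing; CDPR's later patch notes reportedly stated the shipped PC build didn't actually use these values, so the tweak may well be placebo:

```
PC platform row       PoolCPU    PoolGPU
shipped (approx.)     1536MB     3GB
community edit e.g.   8GB        8GB     (on a 16 GB RAM / 8 GB VRAM machine)
```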
You'd have to define "playable." GameGPU's benchmarks say a 1080 Ti should be getting slightly above 30 fps at 1440p Ultra, at least in the scene tested, and TPU seems to agree. What's your CPU?
Apparently it barely runs on the base PS4: something like 27 fps with drops, on top of looking bad.
Don't they even test before releasing on the PS4?
Apparently it passed Sony's certification, but probably only for monetary reasons, seeing how Sony has now pulled the game from their store.
i7-8700K, and I'm at 3440x1440, so at Ultra I was barely squeezing out 30-33 with drops into the mid-twenties.
I can't say I'm understanding at all when they listed an i7-4790 + GTX 1060 as the recommended hardware for 1080p High, and that kind of setup actually struggles to run at 30 fps in the benchmarks I've seen. Struggling to hit 30 fps is what I'd expect from the minimum requirements, not a recommended setup. The game doesn't look nearly good enough to justify constant drops into the 30s and 40s on a 4C/8T Haswell when RDR2 can run at 60 fps on that kind of CPU with some settings tweaks. This isn't impressive graphically in the way Crysis was in 2007, so no, I'm not understanding at all. CDPR ruined their good name with this unfinished crap.
So, those were the first released recommendations, right? Almost a year ago? I recall seeing that and, well, everyone was astonished by the low hardware recs. But that has since been updated, right?
Recommendations from Steam page:
- RECOMMENDED:
- Requires a 64-bit processor and operating system
- OS: Windows 10
- Processor: Intel Core i7-4790 or AMD Ryzen 3 3200G
- Memory: 12 GB RAM
- Graphics: GTX 1060 6GB / GTX 1660 Super or Radeon RX 590
- DirectX: Version 12
- Storage: 70 GB available space
- Additional Notes: SSD recommended
Those requirements were just to get you to the game lobby. LOL
Ah, OK, great. I think the first release of that chart, the only one known about a year ago, had just the first two systems, and didn't mention what resolution those tiers were meant for.
It makes a HUGE difference, lol.
The two specs were "minimum" and "recommended," the assumption being: whoa, we're supposed to get something near 4K on a 1060? (It might have been more like a 2080/5700 XT as the top tier, though.) I think a lot of PC folks certainly assumed early "recommended/high" settings would mean 4K today, right? Which is why the performance is just way more shocking, even with the recently updated chart (which is still basically wrong, unless the game just has issues right now).
lol, yup! I think that is a copy of the first official hardware info released. Is that still how Steam lists it? hahahaha.
I don't know where you're getting this 4K talk from. I don't know anyone who would think 4K is the recommended spec. And their recommended spec isn't even any good for 1080p. They lied through their teeth, and they're still lying, since i7-4790 + GTX 1060 6GB is still the official recommended spec. It wouldn't have been hard to change at any point; instead they officially doubled down on it on November 20, 2020, knowing it was BS.
That part seems passable. From the benchmarks you should get ~30 fps with a 4790 and 1060 6 GB. Not much more than 30 though.
Gamegpu.ru's benchmark shows it dropping into the 20s on a 1060 6GB in this area, though averaging 31 fps at high. That seems pretty poor for recommended specs.
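Side note: averages hide exactly those drops. A quick sketch of how an average fps number and a "1% low" can diverge; the frame-time numbers below are made up for illustration, not taken from any benchmark:

```python
def fps_stats(frame_times_ms):
    """Return (average fps, 1% low fps) from a list of frame times in milliseconds."""
    n = len(frame_times_ms)
    avg_fps = 1000.0 * n / sum(frame_times_ms)
    # "1% low" here: average fps over the slowest 1% of frames
    worst = sorted(frame_times_ms, reverse=True)[:max(1, n // 100)]
    low_fps = 1000.0 * len(worst) / sum(worst)
    return avg_fps, low_fps

# Hypothetical capture: 300 frames at a steady 30 ms plus a few 50 ms stutters
times = [30.0] * 300 + [50.0] * 4
avg, low = fps_stats(times)
print(f"average: {avg:.1f} fps, 1% low: {low:.1f} fps")
```

With these made-up numbers the average lands around 33 fps while the 1% low sits at 20 fps, which is exactly the "averages ~31, drops into the 20s" pattern.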
CD Projekt Red has shown themselves to be about as trustworthy as EA.
Patch 1.05 should be out for PC soon. Lots of fixes for missions, plus some performance fixes for AMD/Ryzen.
It's out for consoles now.
It will be out for PC soon. I think they want to do whatever they can to fix the consoles first.

Wow, seriously!? Look at the quest fixes! And they released the game in such an unfinished state; those fixes are probably only a fraction of the issues.
Edit: wait, that's only for consoles lol
The Steam listing you posted is actually all that was released about a year ago, long before the updated recent listings you also posted. I'm talking about the original first-posted specs from a year ago, and what the gaming community thought they meant at the time, with the knowledge of the world as it existed then.
What do you think they assumed it meant, based on that information? All it said was "minimum" (which has only ever meant a potato that passes the bare requirements to get the software to load) and "Reference," as in: this is how we want you to play this game.
Now, it's one thing for those cards to be mentioned when they were actually the serious, spendy, mid-tier standard cards of their time, two years before that announcement. But for a two-year-old mid-tier grandpa to be recommended as the REFERENCE card for what should be "the visual masterpiece" of all games to date, releasing roughly four months to a year later, you had to wonder: how efficient is this game? lol
And I don't mean reference in terms of studio-perfection standards, but in terms of their design strategy: making this game perform visually to their consistent standards on what should be standard hardware for "high end."
So, another way to think about it: another comparable AAA title, a year from release at the time of its first hardware announcement (a direct comparison), would most likely have recommended a 2060, or really a 2080 or 5700 XT (already year-plus-old cards, with another year still to go before release).
By any reasonable comparable standard (think about how home values are largely set by their recently sold neighbors), those recommended specs were, at the time, already SUPER low for where expectations would reasonably put those two data points. Does that make sense? (So, if they're saying a 1060 is the reference, then surely my also-three-year-old 1080 Ti can handle 4K, NO PROBLEM, with whatever this game puts out.) See what I mean?