Nintendo NX GPU?


Mondozei

Golden Member
Jul 7, 2013
NTMBK you make a compelling case for why it's technically possible to make an ARM SoC for consoles, but that is a separate issue from feasibility.

You're still skirting the issue of performance. You can't build an ARM SoC which is even close to where the competition is and all indications we've gotten so far from Nintendo suggest that they're not willing to be the poor cousin from the countryside anymore compared to their competition.
 

Techhog

Platinum Member
Sep 11, 2013
Designed for servers.

This would be the el-cheapo NX variant if Nintendo doesn't care about grunt.

https://www.youtube.com/watch?v=v6Ry9ct7ymY

Not sure why some of you consider the latest ARM cores to be cheap, when they are reserved for, and often only found in, top-of-the-line mobile devices that cost twice as much as a typical console.

The PS4/Xbone were released not long ago and, as some of you may be aware, their design work began around 2009.

If the NX is releasing in 2016, assuming Nintendo is faster at hardware design (for whatever reason), they would have designed it in 2012/2013. Also, dev kits being sent to studios would imply production early this year.

For the time-frame we're looking at in its inception stage, AMD's x86/GCN SoC is proven tech for the console industry. If Nintendo wanted el-cheapo weak hardware, AMD can scale down; if they want something more powerful than the Xbone/PS4, simply launching later would assure that, given the uarch improvements on x86/GCN since those SoCs went into production.

The DS is still very popular; I don't see the NX replacing it for Nintendo's hand-held approach, because its design is well loved by hand-held gamers. If they go with another "Wii U gimmick controller iPad" thing for the NX, it's just too clunky to replace the DS or a DS successor.

Not to defend anyone here, but by your logic PS4 and XBox One should not have Jaguar and GCN in them...
 

Techhog

Platinum Member
Sep 11, 2013
NTMBK you make a compelling case for why it's technically possible to make an ARM SoC for consoles, but that is a separate issue from feasibility.

You're still skirting the issue of performance. You can't build an ARM SoC which is even close to where the competition is and all indications we've gotten so far from Nintendo suggest that they're not willing to be the poor cousin from the countryside anymore compared to their competition.

Unless Nintendo can scale down an APU for use in the handheld version of NX, this is a moot point. Nintendo decides on the design and form factor of their hardware before they finalize the specifications, and if they can't make it work they will lower the hardware specs before they even consider changing the form factor. This happened with the Wii U, and is why several games were cancelled before launch and why others didn't live up to the performance developers were bragging about. Wii U's power was cut significantly in early 2012 due to overheating in the small chassis.
 
Feb 19, 2009
Not to defend anyone here, but by your logic PS4 and XBox One should not have Jaguar and GCN in them...

You do realize architectures take many years to design? GCN was designed around that time; that's on the record. I thought it was common knowledge.
 

MisterLilBig

Senior member
Apr 15, 2014
Not sure why people find ARM so unfeasible. Are the Vita and 3DS and New 3DS so unpopular in providing compelling fun and interactive experiences?

If I were to go by the most amazing gameplay experiences in my life, would x86 even register in my top 5 consoles/devices/systems? (PC obviously.)
The answer is no. I still think the Dreamcast was amazing, the N64 beyond fun, and the PlayStation 1 and 2 legendary single-player experiences. GBA, DS, omg... years of fun.

Reality is, CPU does not equal fun and amazing games. It is just a means towards something. An 8-core ARM CPU will do the same as an 8-core x86. ARM would benefit Nintendo because they already have thousands of engineers with experience with it. And with the move towards mobile apps, it's even more of a benefit.
 

NTMBK

Lifer
Nov 14, 2011
NTMBK you make a compelling case for why it's technically possible to make an ARM SoC for consoles, but that is a separate issue from feasibility.

You're still skirting the issue of performance. You can't build an ARM SoC which is even close to where the competition is and all indications we've gotten so far from Nintendo suggest that they're not willing to be the poor cousin from the countryside anymore compared to their competition.

The competition is running on 1.6GHz Jaguar cores. We're not talking about some sort of Skylake-killing monster here! The Cortex A57 cores in the Tegra X1 already outperform them- take a look at Phoronix's Linux benchmarks: http://www.phoronix.com/scan.php?page=article&item=nvidia-tegra-x1&num=3 In several tests, the X1 scores ahead of the Athlon 5350, a system with 2GHz Jaguar cores, and is consistently ahead of the 1.6GHz Athlon 5150. And Cortex A72 cores are even faster than the A57.

This isn't the ARM of a few years ago. Their modern high-end cores have some pretty serious performance.
 

Techhog

Platinum Member
Sep 11, 2013
You do realize architectures take many years to design? GCN was designed around that time, it was on the record. Thought it would be common knowledge.

You're absolutely correct. So, that brings me to the point of that post, which flew over your head: the same would apply to ARM. With that in mind, what's stopping them from using a newer ARM core other than themselves?

Also, I'm telling you guys, NX is based on the iOS model where there are two separate devices using the same architecture and OS. Given all of the facts available to us, it's the only thing that makes sense.

(As a side note, the 3DS and the DS are not the same system. The 3DS was the replacement for the DS.)
 

poofyhairguy

Lifer
Nov 20, 2005
The competition is running on 1.6GHz Jaguar cores. We're not talking about some sort of Skylake-killing monster here!

Exactly. ARM has enough power to match a PS4 CPU.

Ports aren't as big of a deal as people are making them out to be. Many companies have ported games to iOS or Android, or port games to the older consoles, which present the same challenges. What matters is GPU power.
 

Exophase

Diamond Member
Apr 19, 2012
The competition is running on 1.6GHz Jaguar cores. We're not talking about some sort of Skylake-killing monster here! The Cortex A57 cores in the Tegra X1 already outperform them- take a look at Phoronix's Linux benchmarks: http://www.phoronix.com/scan.php?page=article&item=nvidia-tegra-x1&num=3 In several tests, the X1 scores ahead of the Athlon 5350, a system with 2GHz Jaguar cores, and is consistently ahead of the 1.6GHz Athlon 5150. And Cortex A72 cores are even faster than the A57.

This isn't the ARM of a few years ago. Their modern high-end cores have some pretty serious performance.

Exactly; I don't know why people think XB1 and PS4 have a performance advantage with their Jaguars just because they're x86. Note that the Tegra X1 is ultimately still something of a jacked up tablet SoC and that a dedicated console SoC could probably push the clock speed of A72 CPUs higher than 1.9GHz.

If we're talking x86, a Jaguar derived core (Puma?) is probably still the most realistic option for Nintendo; something like Skylake isn't even on the table. Excavator is probably a worse fit and Zen is probably not realistic. You really want relatively small cores with good perf/W so you can get good aggregate performance with a lot of them. This was what Sony and MS chose for this generation because games at this point are good enough at leveraging that number of threads.

It's been said a lot already, but it really behooves Nintendo to use the same basic architecture for their handheld and console, with the latter scaled up in core counts and clock speeds. For the handheld side, ARM is a better choice because something like Cortex-A72 simply has better power management and scaling at the very low end and is much more proven in this segment than something like Puma. It also allows better options for 3DS compatibility.

Trying to do a console SoC with both ARM and x86 CPUs, as some have recommended, would be a pretty serious design headache and area compromise, not to mention what it would add to licensing costs. I doubt the advantages of using x86 would make this worthwhile.

I do however think that Nintendo should try to work Wii U compatibility in there somewhere. I don't think they can really do a good new console that has it built in without adding too much of a price premium, but it could possibly be accomplished via some kind of upgrade kit. The question is, how much of the NX hardware can they leverage in such a kit? Sharing components like RAM would be ideal, but I don't know if they can really do that over an expansion interface.

Software emulation could be explored; normally I wouldn't find this very realistic, but I wouldn't have found Xbox 360 emulation on XB1 realistic either...
 

Headfoot

Diamond Member
Feb 28, 2008
The WiiU Wii mode is really very elegant. They built in identical resources from the Wii into the WiiU main SoC. There was one CPU core that ran at the Wii's speed and 2 others (iirc) that ran at a higher WiiU speed, but in WiiU mode it would use all available resources. In Wii mode it would only light up the parts of the hardware that were identical to the Wii. In essence, it was both a WiiU and a Wii on a single chip with the Wii resources available to the Wii U. Say what you want about Nintendo, their chip designers are resourceful as hell given their small die size and power budgets. They got native (or near native) Wii mode without having it be idle cost-adding silicon like the original PS3 w/ PS2 chip.
 

RussianSensation

Elite Member
Sep 5, 2003
Don't remember Mr. Cerny being worried about that.

Memory bandwidth was one of the most important decisions they had to make. They considered the option of DDR3 + eSRAM/eDRAM but then decided to go with GDDR5. This particular point has been missed by everyone in this thread entertaining the idea of ARM SoC or ARM + GCN. How do you design an ARM memory controller that can support 100-175GB/sec memory bandwidth that's required to push a graphics card with PS4's level of performance? Have you ever seen such a design?

I mentioned this point earlier and it was brushed aside as irrelevant. If you say XB1 gets away with only 68GB/sec, fine, but an NX console with 50GB/sec of memory bandwidth and GPU performance below XB1's will fail. Nintendo cannot release a console that's slower than XB1 in 2016 or it will be catastrophic imo.

You guys keep focusing on how ARM is better than X86 CPUs but:

-> You haven't provided proof that a very complex and custom ARM core + GCN + brand-new memory controller integration would cost less than mimicking the already widely successful PS4 APU.

-> If there is no new memory controller that allows a decent GCN GPU (not some weak 512-640 GCN core product), then you haven't addressed how such a console even makes any sense since it once again is weaker than PS4/XB1 which is where the Wii U sits. So what's the point of releasing such a product when Wii U failed?

-> The NX coming 3 years after XB1/PS4's launch inevitably means that its life cycle will somewhat overlap with XB2/PS5. In order to ensure that NX gets some 3rd party games towards the end of its life, it has to be able to run XB1/PS4 ports with aplomb. Having a GPU less powerful than XB1 will not accomplish this task. Now we are back to the above point -- how to integrate a GCN + ARM APU and not starve the GCN of its necessary memory bandwidth?

I mean this point is critical. Even a quad-channel 128-bit LPDDR4-3200 ARM A72 design maxes out at 51.2GB/sec of memory bandwidth. Once again, that's a major bottleneck for even trying to reach XB1's GPU performance.
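These peak-bandwidth figures all come from the same simple arithmetic: bus width in bytes times effective transfer rate. A quick sketch checking the configurations quoted in this thread (the DDR3/GDDR5 clocks are the commonly cited console figures, not anything confirmed for NX):

```python
def peak_bandwidth_gbs(bus_width_bits, transfer_rate_mtps):
    """Peak memory bandwidth in GB/s: bytes per transfer x mega-transfers per second."""
    return (bus_width_bits / 8) * transfer_rate_mtps / 1000

# 128-bit (quad-channel) LPDDR4-3200, the A72 config discussed above
print(peak_bandwidth_gbs(128, 3200))   # 51.2 GB/s
# XB1's 256-bit DDR3-2133 main memory
print(peak_bandwidth_gbs(256, 2133))   # ~68 GB/s
# PS4's 256-bit GDDR5 at 5500 MT/s
print(peak_bandwidth_gbs(256, 5500))   # 176.0 GB/s
```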

As for why the PS4 didn't use ARM- as Mark Cerny himself has said, it is because at that point ARM did not have 64-bit. That no longer applies, as several 64-bit ARM designs are available.

You guys keep repeating this point and ignoring the technical reason -- how can you design an ARM APU with GCN and feed it 176GB/sec memory bandwidth? What memory controller would the ARM CPU need to have? The one from the future in the year 2020? Even if ARM had 64-bit capability, PS4 would have never used it anyway.

An A72 core has higher performance than an AMD Jaguar core, while also having 100% compatibility with the much lower power A53 core which could be used in a handheld. That is why I think they will go ARM- because AMD don't have an x86 CPU that scales down well to handheld gaming device power consumption (especially at 28nm), while they can use an off-the-shelf ARM design which gives them precisely what they need.

But you are now ignoring the GPU component. You are solving one problem and creating a secondary risk. Replacing the x86 CPU with ARM automatically means you have to figure out a way to provide ample memory bandwidth to the GCN part. How? The A72 design calls for 128-bit quad-channel LPDDR4-3200 - a max bandwidth of 51.2GB/sec. That's a bottleneck.

There's all sorts of other possibilities- ARM in handheld and x86 in console, as you said, or they could go with NVidia and get a Denver CPU for the console.


No, this isn't a possibility. NV didn't win the NX design.

Nintendo could even ignore ARM entirely and get a MIPS design with PowerVR graphics.

Once again, no. I am not aware of any PowerVR graphics that can match an HD7790, the minimum required to reach XB1. If we are talking about the NX home console, it's been widely rumored that AMD won that design which means PowerVR for the NX home console shouldn't even be a discussion point.

It wouldn't be the first time they changed their mind at the last minute- the 3DS was originally meant to use an Nvidia Tegra 2, but they switched to a different vendor partway through development.

Not happening. Nintendo is sticking with AMD for the NX to ensure the most seamless compatibility of virtual library for Wii/Wii U, powered by ATI/AMD graphics, and because no one else can match AMD's price/performance in an APU design. Therefore, PowerVR, Imagination, Nvidia, none of those are options. The exact reasons MS/Sony chose AMD's APU design for their consoles are still in place today. The only questions now are if NX will use ARM cores + GCN or X86 GCN APU or some combination of these.

Some of you guys must have been reading too many iPhone 6S reviews. The best graphics in smartphones/tablets are 2-3 generations behind the GPU in XB1. My earlier contributions to this thread provided proof of that but a mod deleted it. HD7790 ~ GTX750, and that's miles better than anything in smartphones/tablets, miles.

Again, most of you are looking at ARM vs. X86 in isolation, fully ignoring the memory bandwidth constraint.

"Outside of the CPU and GPU, NVIDIA has also dramatically improved the rest of Tegra X1 in comparison with Tegra K1. We see a move from 64-bit wide LPDDR3 to 64-bit wide LPDDR4 on the memory interface, which improves peak memory bandwidth from 14.9 GB/s to 25.6 GB/s and improves power efficiency by around 40%"
http://www.anandtech.com/show/8811/nvidia-tegra-x1-preview

Even going wider from 64-bit to 128-bit and using LPDDR4-3200 doesn't get us to XB1's memory bandwidth. This is a design bottleneck/constraint that A72 would have to overcome in 2016. How? What would AMD's engineers need to do to go from 50GB/sec to 100-150GB/sec? If they cannot, the graphics card in the NX is an underpowered turd and we are back to square one: NX weaker than XB1 and PS4. It would be the Tech FAIL of 2016.

They could split the ARM cores from the GPU cores and each would have its own memory but that increases costs, latencies and complexity and goes against the APU design strategy.

A much better option is to buy an XBox 360, which is far cheaper, roughly as fast, has a massive games library, fantastic online multiplayer service, and all the video streaming apps you could need.

Ya, and a similar argument could be applied to the NX home console by late 2016 when MS could drop the price to $249-279 and Sony could drop PS4 to $299-329. Nintendo would bring out a console less powerful than XB1 for $199 but XB1 will have a massive library of games, an online community of friends you could play multi-plats with and will have backwards compatibility with many Xbox 360 games. Why would many gamers be enticed to purchase the Nintendo NX in this case? You need some pitch/selling point and thus far there is none. If the NX's biggest selling point will be its low price of $199, the competitors could just lower the price and the NX is dead. If the NX is less powerful than XB1, forget 3rd party support. Same issue as the Wii U.

Therefore, Nintendo cannot use a very low price as the main differentiating strategy for NX's home console success. It will not work. If someone just wants casual games, they'll play them on their smartphone/tablet. Lack of 3rd party support further undermines the console. What causes lack of 3rd party support? Many reasons but not having hardware that's at least on par with the XB1 guarantees lack of 3rd party support.

Exactly. ARM has enough power to match a PS4 CPU. What matters is GPU power.

And how would they combine a powerful GPU with an ARM CPU? You need memory bandwidth, something the A72 design doesn't provide natively.

The ARM® CoreLink™ DMC-500 Dynamic Memory Controller boosts mobile system performance by providing fastest access and highest bandwidth to memory. The CoreLink DMC-500 offers lowest latency supporting LPDDR4/3 memories at DDR-3200+ transfer speeds.



HD7850 has 256-bit GDDR5 @ 4800MHz effective => 154GB/sec, lower than PS4's.
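For context, GDDR5 is quad-pumped, so the effective transfer rate is four times the base memory clock; that's where figures like "4800MHz" come from. A rough illustration (the base clocks are the commonly published ones for these cards, not from this thread):

```python
def gddr5_bandwidth_gbs(bus_width_bits, base_clock_mhz):
    # GDDR5 transfers 4 bits per pin per base clock cycle (quad data rate)
    effective_mtps = base_clock_mhz * 4
    return (bus_width_bits / 8) * effective_mtps / 1000

print(gddr5_bandwidth_gbs(256, 1200))  # HD7850: 153.6 GB/s (the ~154 above)
print(gddr5_bandwidth_gbs(256, 1375))  # PS4: 176.0 GB/s
```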

Look at this --

E8950 = 95W TDP with 2048 GCN cores and 192GB/sec memory bandwidth.
E8870 = 75W TDP with 768 GCN and 96GB/sec memory bandwidth.

It makes sense for them to take either of these or some balanced chip in between. Now if they figure out a way to integrate ARM CPU cores and an 1100+ core GCN chip with > 100GB/sec of memory bandwidth, that would be impressive.
 

Madpacket

Platinum Member
Nov 15, 2005
Embedded RAM is very expensive and kind of useless unless you use enough of it. It's an unnecessary band-aid to a problem that's already been solved (PS4 - 8GB GDDR5).

I agree with RS, the main console won't be ARM based due to the complexities of memory bandwidth and performance.

Why is it hard to imagine a system with updated X86 cores using a lower process node?

Mark Cerny and team did a great job with the PS4, so why would Nintendo and/or AMD sink a bunch of R&D into designing an ARM processor with enough bandwidth tied to GCN cores? It seems like a waste of resources when simply using updated x86 cores would fit the bill. The weakest area of the current consoles is the Jaguar CPUs. It would be trivial [today] to add a CPU that doubles or triples the performance of the existing consoles, making the NX the easiest and most desirable platform to develop for.

Sure the guts of the handheld portion could use some sort of ARM/GCN hybrid if they keep the handheld screen to 720P or less to avoid memory bandwidth issues but Nintendo will have to do much better on the main console.

Occam's Razor points to some sort of X86 HBM/APU for the main console. ARM would be silly here for the reasons already stated.

The handheld portion of the NX is the real puzzle to solve here.
 

RussianSensation

Elite Member
Sep 5, 2003
Why couldn't they go with embedded ram again?

They could, but XB1's approach is inferior to PS4's because embedded RAM requires additional game development resources to take advantage of the extra bandwidth it offers. With game development cycles as time- and budget-constrained as they are, it's more strategic to make development as seamless and cost-effective as possible. From that standpoint, one large unified pool of memory bandwidth is better than two separate dedicated pools: without specific optimizations, effective bandwidth falls back to the main memory's native figure (in XB1's case, a max of 68GB/sec). Embedded RAM also has two other major downsides: (1) it's still too small (32-128MB); (2) cost - it takes up a disproportionate area of the die that could instead be used to shrink the die and save on fabrication costs, or be allocated to more or stronger CPU cores, larger CPU/GPU caches, or a bigger GPU.

A perfect example is the XB1 where the XB1's die is actually larger than PS4's die but the GPU is ~50% less powerful.

Something else to keep in mind is that almost all rumors for the NX are stating AMD X86 APU. I haven't read a single rumor of an ARM SoC or ARM APU for the NX home console. This doesn't mean that ARM is out of the question but when most of the rumors keep repeating the same information, a lot of times it tends to have some grain of truth to it. Hopefully in the next 6 months we have more info as Nintendo has been very secretive about what it means to have a completely new gaming experience/eco-system.
 

dogen1

Senior member
Oct 14, 2014
Embedded RAM is very expensive and kind of useless unless you use enough of it.

This seems like a weird thing to say. If something is too small to be of any use, of course it is useless. How much is a useful amount? And how expensive is it? How much more did the XB1's memory setup cost over the PS4's?
 

Snafuh

Member
Mar 16, 2015

The bandwidth has very little to do with the core design itself. Kabini (Jaguar cores, like the PS4 and Xbox One) has a single DDR3 memory controller; the Xbox One has four, and the PS4 has a GDDR5 interface, all with the same core design.
AMD already ships dual-channel DDR4 chips (both x86 and ARM). It should be possible to add two more memory controllers, resulting in a quad-channel DDR4 interface comparable to the Xbox One's but with the advantage of higher clock rates. This could result in about 100GB/s. In the long run it might be cheaper than the PS4's memory.
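The ~100GB/s estimate above checks out under the usual assumption of standard 64-bit DDR4 channels; a quick sanity check, not a spec:

```python
def ddr4_bandwidth_gbs(channels, transfer_rate_mtps, channel_width_bits=64):
    # Each standard DDR4 channel is 64 bits (8 bytes) wide
    return channels * (channel_width_bits / 8) * transfer_rate_mtps / 1000

print(ddr4_bandwidth_gbs(2, 3200))  # dual-channel DDR4-3200: 51.2 GB/s
print(ddr4_bandwidth_gbs(4, 3200))  # quad-channel DDR4-3200: 102.4 GB/s
```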

Whether Nintendo wants an x86 or an ARM chip, they have to design it themselves using existing building blocks. There is no x86 APU with enough GPU power and bandwidth out of the box either. AMD's Seattle could serve as a "template" for an ARM chip just as Kabini did for the PS4/Xbox One.
 

Techhog

Platinum Member
Sep 11, 2013
The WiiU Wii mode is really very elegant. They built in identical resources from the Wii into the WiiU main SoC. There was one CPU core that ran at the Wii's speed and 2 others (iirc) that ran at a higher WiiU speed, but in WiiU mode it would use all available resources. In Wii mode it would only light up the parts of the hardware that were identical to the Wii. In essence, it was both a WiiU and a Wii on a single chip with the Wii resources available to the Wii U. Say what you want about Nintendo, their chip designers are resourceful as hell given their small die size and power budgets. They got native (or near native) Wii mode without having it be idle cost-adding silicon like the original PS3 w/ PS2 chip.

FYI, Wii U does not use an SoC.
 

dogen1

Senior member
Oct 14, 2014

I can see why you say it's not a good choice considering the space required, but I'm not sure why you say 32MB is too small.
I'd also say the optimization required to use it benefits other hardware as well, and there are some scenarios where eSRAM has the advantage over GDDR5 (not that those justify its use, just that they exist).
 

RussianSensation

Elite Member
Sep 5, 2003
The bandwidth has very little to do with the Core design itself. Kabini itself (Jaguar cores like the Ps4 and Xbox one) has a single memory DDR3 controller. The Xbox one has 4 and the PS4 has a GDDR5 interface. All with the same core design.

That's a good point. I am coming from a position where AMD knows its x86 processors and would be able to design a custom memory controller and integrate it with GCN; they have already done this with XB1 and PS4. OTOH, AMD has very limited experience with ARM cores and with integrating ARM + GCN. This entails an all-new design, custom integration, more risks, more costs. Is it possible? Ya, but does it make the most sense? I guess if they want the NX eco-system to be underpinned by Android and have ARM in both the home console and the handheld, maybe.

No matter if Nintendo wants a x86 or a ARM chip, they have to design it themselves using existing building blocks. There is not x86 APU with enough GPU power and bandwidth out of the box either. AMDs Seattle can be used as "template" for a ARM chip just like Kabini for the PS4/Xbox One.

Well, in this case it would be AMD or whatever partner they have chosen. Nintendo hires third parties to design chips for them; they aren't in the business of hardware chip design or fabrication.

The author/editor of ExtremeTech makes a strong case for ARM:

"ARM CPUs may not have the performance of a Haswell, but that's not relevant in this context. The Xbox One and PS4 run on AMD's budget CPU core. Jaguar is a fine chip, but it's well matched by any Cortex-A57 or equivalent CPU. That's why AMD built an entirely new set of ARM server cores as opposed to simply prioritizing Jaguar in servers, and if you look at the few data points available for their ARM Cortex-A57 cores as compared to their Opteron based Jaguars, they predicted performance would be substantially higher, core-for-core, on the Cortex-A57 Seattle as compared to Jaguar.

Jaguar is a dual-issue CPU with modest branch prediction capabilities and a short pipeline. It's a very good chip with relatively few drawbacks, but it's *not* a heavy hitting CPU core.

The reason not to use ARM in this context would be because you don't want to do the custom interconnect work required, not because the CPU cores themselves can't handle the job.

ARM also is not a minority architecture (there are far more ARM devices in the world than x86 chips and smartphones and tablets dwarf the size of the x86 market). Finally, the 3DS already uses an ARM processor, albeit an old model.

It is, of course, only a rumor. All of this is rumor. But if Nintendo *wants* to make an explicit console around Android, there's no reason not to use ARM."


I suppose if Nintendo wants to make mobile Android games, then the custom Android NX eco-system would allow home console users to purchase those games and play them directly on their home console from day 1.

I can see why you say it's not a good choice considering the space required, but I'm not sure why you say 32MB is too small.
I'd also say the optimization required to use it benefits other hardware as well, and there are some scenarios where ESRAM has the advantage over using GDDR5(Not that those justify it's use. Just that they exist).

Right, but it's hard to imagine Nintendo affording both eSRAM and GDDR5. They'll have to choose. It's impossible to say what they prioritized for the NX: performance, cost, or a balance. I can see validity in the argument that it wouldn't make a lot of sense for them to make a GPU much more powerful than PS4's, since most developers wouldn't take advantage of that extra capability given NX's small install base at the beginning of its cycle. Also, Nintendo already struggled with making games for the Wii U, and more powerful hardware would only raise complexity and make game development a lot more expensive for Nintendo itself.

The part that has me puzzled is how they estimate 20 million NX sales in the first 12 months of release. How is that possible unless it's a dirt cheap home console or the NX is a handheld like the New 3DS?

There are also rumors that the NX won't have optical disc drive/media. That would be a very risky move, no?

FYI, Wii U does not use an SoC.

This is a good diagram for the Wii U for anyone who wants to nerd out.

 

monstercameron

Diamond Member
Feb 12, 2013
@RS I don't understand why you think the memory subsystem would be a problem. What is stopping AMD from taking their 256-bit GDDR5 memory controller from their IP portfolio and connecting ARM cores to it?
 

Madpacket

Platinum Member
Nov 15, 2005
This seems like a weird thing to say. If something is too small to have any use of course it is useless. How much is a useful amount? And how expensive is it? How much more did the xb1's memory setup cost over the PS4?

RS elaborated on why embedded RAM isn't great right under my post.
 

Madpacket

Platinum Member
Nov 15, 2005
@RS I don't understand why you think the memory subsystem would be a problem. What is stopping AMD from throwing their 256 bit gddr5 memory controller from their IP portfolio and connecting arm cores to it?

Cost. They might be able to reuse some of the existing plumbing but why bother? This problem has already been solved on X86.

As an aside, if they're planning on a completely new APU with HBM instead of GDDR5, new plumbing work would have to be done anyway. So I guess there's room for ARM to replace x86 here, but then we're back to ease of porting, which limits third parties' interest in developing for the platform.

As for the folks who say it's trivial to recompile code from one architecture to the next... sure, middleware exists, and the big studios use it to release shovelware ports to iPhone and Android. I would imagine porting a AAA title like for like would require more than simply recompiling the code.
 

dogen1

Senior member
Oct 14, 2014
RS elaborated on why embedded RAM isn't great right under my post.

Right, saw it. I just think stuff like embedded memory is cool.


I mean, I'd like to see a high end console using an ARM cpu, that would just be really cool I think.
 

RussianSensation

Elite Member
Sep 5, 2003
@RS I don't understand why you think the memory subsystem would be a problem. What is stopping AMD from throwing their 256 bit gddr5 memory controller from their IP portfolio and connecting arm cores to it?

Not necessarily stopping, but creating additional barriers.

1. It would need a custom integrated design. More cost and more risk vs. already-proven x86 APU designs, which are the least risky and least costly from an R&D perspective.

2. This point had me thinking too. "Die size dictates memory interface width, so the 256-bit interface remains but Microsoft chose to go for DDR3 memory instead." Would ARM CPU + GCN vs. x86 CPU + GCN create a die-size constraint for a 256-bit GDDR5 pin layout? Probably not, but it had me questioning whether it would really be as easy as snapping the components together like LEGO.

3. Probably my biggest reason was AMD's ARM execution. AMD is having a lot of issues with volume production and the overall success of its 64-bit ARMv8-A-based Opteron microprocessor, code-named "Seattle." If they can't even get this right, what are the chances they can ship 20 million units of NX with a custom ARM + GCN + 256-bit GDDR5 design? Sounds wayyyyyyy more complex and costly.

Of course I am just outlining my thinking but I could be wrong, way wrong.

My biggest fear is that when I remove all constraints and ask myself what console Nintendo could build that would get them back in the game, I don't have a good answer, since I think their timing to re-enter the market is all wrong. I think they should have coasted on the Wii U for 1-2 more years instead of abandoning it this quickly, just making more games for it while lowering the price. Right now they are following the Sega business model: if a hardware/console gen fails (Saturn), just ditch it and move on (Dreamcast). The problem with this is that you undermine consumer confidence in your next console. Your "next gen" console could be too advanced compared to the PS4, meaning most developers wouldn't take advantage of the extra hardware, yet at the same time not fast enough to compete with XB2/PS5. The timing issue makes me question almost any NX console in 2016. What do you guys think about this entire strategy?

In this context, even if they put a Fury inside the console and sell it for $249, would it matter? This point makes me nervous about the success of the next gen NX. I think while they have to at least match XB1, going with a very powerful GPU seems like a point of diminishing returns since they are entering in the mid-cycle of the current gen.
 