That's not at all how it works.
Consoles don't have two separate CPUs; the APU's CPU side is split into two quad-core Jaguar modules. The console OS runs on the first two cores of the first module, and only on those two cores, leaving six cores for the game. Communication between the two modules is slower than communication within a module, so devs put the demanding threads on the four cores of the second module and the secondary threads on the two "leftover" cores (that's why you still see next to no scaling beyond 4 cores).
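To make the claimed core split concrete, here is a small sketch. The module layout and core numbers are my illustration of the post above, not vendor documentation:

```python
# Hypothetical layout of an eight-core, two-module console CPU
# (core numbers are illustrative, not from any official document).
MODULE_0 = {0, 1, 2, 3}
MODULE_1 = {4, 5, 6, 7}

OS_CORES = {0, 1}                               # reserved by the console OS
GAME_CORES = (MODULE_0 | MODULE_1) - OS_CORES   # six cores left for the game

# Per the claim: demanding threads go on the intact second module,
# secondary threads on the two "leftover" cores of the first module.
HEAVY_CORES = MODULE_1
LIGHT_CORES = GAME_CORES - HEAVY_CORES          # {2, 3}

print(sorted(GAME_CORES))   # [2, 3, 4, 5, 6, 7]
```

On Linux a dev could realize such a split with `os.sched_setaffinity(0, HEAVY_CORES)`; Windows exposes the same idea per-thread via `SetThreadAffinityMask`.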
Anyway, the only reason we have bad console ports is that the consoles are not a multitasking environment: the games run all alone on their cores with nothing bothering them, so devs can't be bothered to adhere to multitasking rules, and the ports mess up Windows's multitasking/task manager/whatever.
Well, I am very interested in where you got that information about the communication between the modules.
I am interested in the details of how it functions, but they are hard to find. I am not saying you are wrong; I am interested in what you know.
But I think you should see the bigger picture. Let us agree that the communication between the modules is too slow for the programmer to see the CPU as one unified unit. I understand that Jaguar cores are optimized for easy synthesis, so they can be sold to different customers and manufactured at different foundries. I read that the Xbox uses its SRAM as the communication medium between the modules, but I have this from hearsay. The PS4 seems to have multiple buses between the CPU modules and the memory. I do not know how the modules communicate, or the maximum bandwidth and latency.
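Since the bandwidth and latency numbers are hard to find, one could at least probe the effect empirically on a PC. A rough sketch (core numbers are illustrative; pipe overhead dominates the absolute number, so only the difference between core pairs is meaningful, and a shared-memory spin ping-pong would isolate the interconnect better):

```python
import os
import time
from multiprocessing import Pipe, Process


def _ponger(conn, core, iters):
    """Echo every message back, pinned (if possible) to one core."""
    try:
        os.sched_setaffinity(0, {core})   # Linux-only; ignore elsewhere
    except (AttributeError, OSError):
        pass
    for _ in range(iters):
        conn.send(conn.recv())
    conn.close()


def round_trip_us(core_a, core_b, iters=1000):
    """Average message round-trip time in microseconds between two cores."""
    parent, child = Pipe()
    worker = Process(target=_ponger, args=(child, core_b, iters))
    worker.start()
    try:
        os.sched_setaffinity(0, {core_a})
    except (AttributeError, OSError):
        pass
    start = time.perf_counter()
    for _ in range(iters):
        parent.send(b"x")
        parent.recv()
    elapsed = time.perf_counter() - start
    worker.join()
    return elapsed / iters * 1e6


if __name__ == "__main__":
    cores = sorted(os.sched_getaffinity(0)) if hasattr(os, "sched_getaffinity") else [0]
    near = round_trip_us(cores[0], cores[0])
    far = round_trip_us(cores[0], cores[-1])
    print(f"same core: {near:.1f} us, cores {cores[0]}<->{cores[-1]}: {far:.1f} us")
```

On a console-like topology you would compare a pair inside one module (say, cores 2 and 3) against a pair straddling the modules (say, cores 3 and 4); a consistently higher cross-pair time would hint at the slower inter-module path being discussed.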
Maybe, with the new PS4 and Xbox Scorpio coming, they will upgrade the interconnect fabric as well, if it really is that bad. Both Sony and Microsoft seem to have a history of listening to what developers would like, so if this really is a performance-limiting issue, I assume they would tackle it with the upcoming console revisions. For Microsoft this would also play into their idea of Microsoft everywhere; Windows 10 + DX12 and the Xbox Live integration in Windows 10 seem to confirm this. A more uniform programming model between the PC and the console would help (and then presenting a console-like PC to replace the PC for people who only need it for average duties). And deprecating the use of the embedded SRAM for graphics-related operations would help console-to-PC ports.
That bad-console-port consensus everybody shares is also because the game developers have to keep some abstraction layers that are detrimental to performance on the PC.
Close-to-the-metal, proper programming with lots of options and defines for the game developer will help, as Doom shows with its Vulkan port. It will also help that 8-thread (note that I am not saying 8-core, for the time being) PCs are becoming more common. But it will be a while before true 8-core CPUs are in the majority of PCs; give it another year or two. And yes, not all cores are used for the game on consoles, but that should be helpful for PC ports.
Although I am kind of drifting off the subject:
But when that happens, with the strength of Pascal and Polaris (and future improved derivations of both architectures), starting from about 2018 we will see that unification. When HBM2 becomes more of a commodity in the PC market, it will also make its entrance in the console market.
And then HSA will be a natural part of the future PC/console (whether it is equipped with Nvidia or AMD hardware). I suspect that for a while we will continue to see a separate GPU die and a separate CPU die, because when 4K becomes mainstream and virtual reality takes off, resolutions will probably increase again (up to the point where the human retina cannot detect any more detail, of course), creating a need for more computational power than an APU, even with HBM2, can provide within a given power envelope. Nvidia is safe for the upcoming years. But it would help if, by the time AMD has mastered perfect HSA APUs, Nvidia also got an x86-64 license and Intel mastered GPU design.
http://www.eurogamer.net/articles/digitalfoundry-2016-doom-tech-interview