Isn't the invention of the Atom and Brazos CPUs basically the big industry players thumbing their noses at Moore's law?
CPUs always got faster/better/cheaper, until Atom hit the scene; then the line stagnated in design, power consumption, and performance for years. (Soon to be remedied by Intel's new 22nm out-of-order design.)
Anyways, I don't have a clear picture of where I'm going with this thread, other than that I was just browsing desktop PCs at Newegg, and the cheapest new one is an Acer "Veriton" nettop, with Linux, and get this, 2GB of RAM. Only 2GB. What are they thinking?
It's the companies crippling their pre-made PCs, still being sold today, that affect how application developers write their code. Who wants to take the plunge and take advantage of 8GB of RAM on the desktop, even when it would make the application run faster, if the application also has to run on those anemically-equipped PCs with only 2GB of RAM?
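To put it concretely, here's the kind of thing a developer *could* do instead of hard-coding for the weakest machine: size a cache (or any working set) from the RAM actually installed. A rough Python sketch, POSIX-only, with function names of my own invention:

```python
import os

GIB = 1024 ** 3  # one gibibyte in bytes

def total_ram_bytes():
    # POSIX-only: installed physical memory = page size * physical pages.
    return os.sysconf("SC_PAGE_SIZE") * os.sysconf("SC_PHYS_PAGES")

def cache_budget_bytes(total_ram):
    # Spend a quarter of whatever is actually installed on the cache,
    # rather than hard-coding a limit sized for a 2GB nettop.
    return total_ram // 4

# An app written to the lowest common denominator would instead do:
#   budget = (2 * GIB) // 4
# and leave most of an 8GB desktop's memory sitting idle.
```

On an 8GB desktop `cache_budget_bytes(total_ram_bytes())` would come out to 2GB instead of the 512MB a fixed 2GB target allows; that's the trickle-down the crippled low-end kills off.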
I'm sure most of you know the story of the Intel Netbook(TM): how MS and Intel conspired to cripple the specs of these "Netbooks" so as not to undercut the demand and pricing for regular laptops, massively crippling innovation in that sector in the process.
Who knows, if MS/Intel hadn't crippled Netbooks, perhaps the flood of Android/ARM tablets wouldn't have happened, because we all would have been using NON-crippled updated "Netbooks" instead.
I own an Acer Aspire One 722 "Netbook" (but not Netbook(TM)), with a C-60, 4GB of RAM, a 1366x768 screen, and a 120GB SSD I threw in. It's actually quite decent for web browsing, unlike those Atom 1.6GHz single-core POS Netbook(TM)s with their 1GB of RAM and 1024x600 screens. Who in the world ever thought such a low-res screen was a good idea? Especially since Windows has, for quite a while now, effectively required a 768-line screen.
So my argument is this: "good enough" computing has set the industry, or at least Intel and Microsoft, back quite a bit. People aren't demanding performance in their PCs anymore, so there's less high-end R&D, which always used to trickle down to mainstream computing.
We can see that with Haswell: it doesn't clock any higher than Ivy Bridge so far; instead, it is supposedly more power-efficient at the base/stock clocks it was designed for.
We can also see it in the lack of 8-core consumer offerings from Intel. Not even a 6-core yet. Quad-core was so 2006.