cytg111
Lifer
- Mar 17, 2008
6 internal hard drives
- wow, look into some NAS loving? Would be my advice.
There was a time -- long, long ago -- when you could never get enough disk storage. If you were a DIY enthusiast-builder, you would accumulate used HDDs from previous machines, stashing them in the parts locker, in external enclosures, in a server, and so on.
It eventually got to the point that I'd order two drives when I needed one, so I've got a box filled with five or six SATA drives in that parts locker. The spares come in handy: I recently had a 600GB VelociRaptor go south on me (it had also recently been retired as a system/boot drive), and I've asked WD whether I can get an RMA; it was three years old. If they say "no," I'll send it to a friend who wants the heatsink. It died while I was prepping it for use as a clone of my SSD, so I went to the parts locker, found a WD Blue 320 still in its static wrap, and I was good to go.
On the other hand, with five -- no, it's six now -- computers in the house, I don't want all those HDDs spinning. My server is limited to four NAS drives, and the three workstations in that total have only SSDs; files are stored on the server. After having four drives in my Q6600 workstation, which I retired in 2011, I'm deterred by the weight, the power consumption -- all of it.
But who am I to tell the OP how many drives to put in the box?
Athlon 64 was great. But it was downhill from there.
Some people - notably AMD diehards - still associate Intel with anti-competitive practices due to the deals it cut with OEMs back in the K8/Netburst days.
Not quite. The X2s were good compared to Intel's Pentium D. It was downhill after that.
And some of the 'deals' amounted to: if you continue to sell AMDs, you won't get any more Intels, so sell only Intel. There's also the ongoing problem of Intel's compiler crippling AMD performance by not enabling vector instructions, and the manipulation of the BAPCo and PCMark benchmarks through bribery.
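For context on the compiler complaint: the issue was runtime CPU dispatch, where generated binaries reportedly keyed vector code paths off the CPUID vendor string rather than the advertised feature flags. A toy sketch of the difference (hypothetical function names, not actual compiler internals):

```python
def vendor_gated_dispatch(vendor: str, features: set) -> str:
    """Pick a code path the way a vendor-gated dispatcher would:
    vector paths only for "GenuineIntel", regardless of what the
    CPU actually supports."""
    if vendor == "GenuineIntel" and "sse2" in features:
        return "sse2"
    return "generic"

def feature_based_dispatch(vendor: str, features: set) -> str:
    """Pick a code path from the feature flags alone (the
    vendor-neutral way)."""
    return "sse2" if "sse2" in features else "generic"

# An AMD K8 reports SSE2 support, but the vendor-gated dispatcher
# still routes it to the slow generic path:
print(vendor_gated_dispatch("AuthenticAMD", {"sse2"}))  # -> generic
print(feature_based_dispatch("AuthenticAMD", {"sse2"})) # -> sse2
print(vendor_gated_dispatch("GenuineIntel", {"sse2"}))  # -> sse2
```

Same silicon capabilities, different code path, purely on the vendor string - that's the crux of the complaint.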
And there's more than that (the entire RDRAM fiasco, especially on the i840/i820 chipset, pushing the "cloner" companies off Intel platforms by patenting the physical socket, etc.). Intel has done plenty over the years to leave a bad taste in the mouths of some.
Some people still care about all that, and some don't.
And some of the 'deals' amounted to: if you continue to sell AMDs, you won't get any more Intels, so sell only Intel. There's also the ongoing problem of Intel's compiler crippling AMD performance by not enabling vector instructions, and the manipulation of the BAPCo and PCMark benchmarks through bribery.
It's funny, isn't it? AMD and NVIDIA are doing essentially that in the graphics segment, yet it's OK as long as both do it. And AMD misled the entire community twice on CPU performance and upgrades.
RDRAM's failure was more linked to this than to Intel:
http://en.wikipedia.org/wiki/DRAM_price_fixing
Intel tried to get us away from parallel-bus-based memory, and we still suffer today, while basically everything else has moved to a serial connection.
Patenting the physical socket?
You can find just as much dirt on AMD for that matter if you want to play that game.
Only if you're a pro-RAMBUS partisan. Intel customers who bought into the i820 + SDRAM will remember the MCH being crippled when not using RDRAM.
Intel was trying to wipe out its competition by adopting a proprietary memory interface, which, in theory, should have forced the memory manufacturers' hands into producing mostly RDRAM, at least for desktops. AMD and Cyrix should have been plowed under by that move. DRAM price fixing saved them (though Cyrix didn't last long enough to really enjoy the benefit).
I guess. The high latency of RDRAM was its real curse. It never would have worked well for AMD systems (certainly not K8), and I doubt that Conroe et al would have benefitted much from it versus JEDEC-standard memory in dual-channel configurations. Frankly I don't think we missed much.
If you say so. Remember when non-Intel companies used to be able to sell socket-compatible x86 processors? Remember how that stopped for Slot 1? Intel couldn't block AMD or Cyrix (or IDT) from creating devices that were compliant with their own bus protocol, but they COULD patent the physical socket interface, which is how they kicked all the "cloner" companies off their motherboards, forever. It is also a darn miracle that AMD survived that one.
AMD has never had enough clout anywhere to inflict that kind of damage with a patented socket, unless Cyrix wanted in on Slot A or something. Yeah, that might've happened.
Regardless, it's ancient history. Some people care about that sort of thing, while others don't. For some, it's just water under the bridge. Still others see nothing wrong with what Intel does, and instead blame their competitors for being incompetent. It's all a matter of perspective in the end.
I am not convinced at all that Intel is in a huge conspiracy to skew benchmarks, but I think your comparison isn't right. Neither NVIDIA nor AMD is working with a benchmark company like the makers of 3DMark to break benchmarks, are they?
I believe it's a different thing altogether to work with a developer on an actual product, as NV/AMD do.
They do cheat in their drivers. They also pay game companies to gain advantages. It's exactly the same thing, and history shows it for both companies, including with 3DMark. Even today companies are caught cheating in benchmarks; Samsung is a good example.
They do cheat in their drivers. They also pay game companies to gain advantages. It's exactly the same thing, and history shows it for both companies, including with 3DMark. Even today companies are caught cheating in benchmarks; Samsung is a good example.
It's been a very good while since I've heard anything about driver cheats in benchmarks. Do you have a link to any recent articles showing gaming or cheating of the tests?
I agree there is rampant cheating in the mobile space, no doubt about that. With all the IQ Nazis out there, you would think any gaming of benchmarks by reducing quality would get caught quickly and shouted from the mountaintops - but I haven't seen anything on it in years.
Thing is, I have 6 internal hard drives (and an add-on SATA card with 4 more ports), plus my Blu-ray burner, so that makes 7 SATA devices.
I'll look around for where Gigabyte hid the other 2 SATA ports.
It was pretty late last night when my tech friend Lowell got my rebuild hooked up, so he might have missed something.
(I have a PC Power & Cooling Turbo-Cool 1200 (80 mm fan version) powering my PC, so there are a lot of cables to sort through.)
Many cast Intel as the more "evil" one - big bad guy, devil, etc.
It seems you are so focused on something being wrong that you skip essential parts. For SDRAM to work with the 820, you needed an extra MTH chip. It was a hotfix for a problem the DRAM manufacturers created with their price-fixing scheme. Sony got on the RDRAM wagon as well with the PS3, simply because it was better than anything DDR could offer.
So you defend DRAM price fixing because "evil Intel" wanted us to move to a serial interface to break the memory bottleneck and its limitations? Maybe you need a wake-up call. DDR is a problem; it's a bottleneck and it's inefficient. "Evil Intel" also backed SATA, USB, PCIe, etc. - maybe we should just remove those and go back to the old days of the parallel bus. Another hint for you should come with DDR4: you may find yourself missing a couple of DIMM slots.
You also missed the high bandwidth of RDRAM.
It's a darn miracle AMD got away with blatantly stealing Intel IP and products for so many years.
It wasn't the socket that was patented. You have to stop that nonsense.
Obviously you have different standards and morals for different companies. So for you it's not about company ethics; it's about particular companies.
Companies are not people, and they don't act with emotions like people do, either. They will do anything for profit.
Not really. I seem to recall the i865, and especially the i875, could match or exceed i850E performance with PC1066 RDRAM. The i845 doesn't count; that was single-channel only...
There was actually a period when the E7205 Xeon chipset was extremely popular with enthusiasts because Intel didn't want to release a dual-channel DDR desktop chipset... :whistle:
No, there is no real cheating.
Intel really hurt AMD with the release of their C2D line.
AMD really hurt themselves on bulldozer.
For SDRAM to work with the 820, they could have made it an SDRAM-native chipset and left RDRAM to the more expensive i840 aimed at power users. Intel didn't want to support JEDEC memory standards, period. Adopting RDRAM was all about killing Super7.
Whether or not I defend it is not the point. It saved AMD's butt. They would have died horribly had cheap (or at least price-competitive) RDRAM become 80-90% of all available non-ECC, non-registered memory on the market. AMD could not have realistically adopted an RDRAM interface, especially since RAMBUS, Inc. had no real incentive to license memory controller designs to them.
If you want to write a piece on why JEDEC-standard DDR/DDR2/DDR3 is inferior to RDRAM, why the Pentium III really needed it (all the reviews I remember said it didn't), or anything else in that vein, be my guest, but that flies in the face of everything I remember. Intel specifically designed an architecture that needed RDRAM (Netburst). The Pentium III, K7, K8, Core 2, and K10/K10.5 certainly didn't need that kind of extra bandwidth at the cost of latency: K8/K10/K10.5 are all about low latency, while Core 2 and the Pentium III had FSB limitations on available memory bandwidth. RDRAM was never that good for desktop PCs... and even the mighty Pentium 4 wound up with DDR once dual-channel DDR333/DDR400 became reality.
Okay, fine. It was the SEC package design and the DIB architecture that were patented - which was enough to effectively push everyone off the socket. Cyrix claimed to have obtained certain Slot 1 patents which, apparently, never bore fruit in the form of a Cyrix Slot 1 chip.
These statements are important. Intel really hit it out of the park with Conroe, and AMD deserved everything that they've gotten since then. Intel didn't have to intimidate anyone into using Conroe . . . it was too awesome compared to the competition. The same can be said of most (if not all) following desktop products.