Building a large Storage Array

jonnyjl

Member
Feb 11, 2008
26
0
0
So, my goal is... around June... to build one huge frakkin' storage array to keep things like backups, media, and VMs. Although I'd be sad/pissed if the array were ever lost, it wouldn't be the end of the world, but I'd at least want to try to build something that will last. At a later point I'll ask for opinions on what stripe size to run, whether or not to partition, etc., but I think hammering out the hardware first would be a good idea. I'm new to this, so please take it easy if I write something stupid. You can call me totally nuts, as long as it's tactful/funny.

Long term plans would include migrating this to a separate box. Though that's money/space limited right now. Current computer specs are listed last.

Requirements:
A 6-disk (max) RAID-6 array (I'd like to keep ~30% redundancy) with at least 6TB of usable space
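For reference, RAID-6 spends two disks' worth of capacity on parity regardless of array size, so the 6TB-usable/~30%-redundancy target works out exactly with six 2TB drives. A quick sanity check (the helper name is mine, just for illustration):

```python
def raid6_usable(num_disks: int, disk_tb: float) -> float:
    """RAID-6 stores two parity blocks per stripe, so two disks'
    worth of capacity goes to redundancy regardless of array size."""
    if num_disks < 4:
        raise ValueError("RAID-6 needs at least 4 disks")
    return (num_disks - 2) * disk_tb

# 6x 2TB drives: 8TB usable, with a third of raw capacity spent on parity
usable = raid6_usable(6, 2.0)
redundancy = 1 - usable / (6 * 2.0)
print(usable, round(redundancy, 3))  # 8.0 0.333
```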

Hard Drive(s):
Unless Seagate or Hitachi step up to the plate, it looks like I'm going with 2TB Green Power Western Digitals. Running 5400RPM isn't that unappealing, since heat's obviously going to become an issue.

RAID Controller:
From what I can gather from reading various forums/websites, 3ware's probably the best bet. Specifically, the 3ware 9650SE-8LPML. If I could go x8, I'd seriously consider an LSI card, especially with the issues the 9690 seems to have. I'd most likely get the battery backup unit if I went with 3ware or LSI. I'm open to suggestions; like I said, I'm new to building an array of this magnitude.

---------------

Computer Specs:
OS: Vista Ultimate x64

CPU: Q9550 (E0) @ 425x8.5 (1.2625 vCore, VID is 1.25) with PM enabled

RAM: 4x4GB of OCZ Ram (running 1:1)

MB: Gigabyte GA-EP45-DS4P (PCI-Express x4 available). I don't think I can run an x8 RAID controller in the second "Video" slot right? I've read people running into issues, so *shrug*.

Video Card: ATI 4670

PS: PC Power & Cooling 610W power supply. I'm thinking this might be a problem. What do you guys think? I may eventually upgrade anyway, as I may want a more powerful video card.

HDs:
WD 1TB Caviar Black, this is the system disk that will be brought over. Other disks will be taken out. If the array's successful, I may look into buying another 1TB and building a RAID 0 array with the MB controller.
Seagate 7200.11 1.5TB
Hitachi 750GB.
A few external 2.5 and 3.5 HDs that house media and VMs that aren't always needed.

Case: CoolerMaster Stacker 832. Cooling situation is in flux, especially after my damn Scythe Slipstream leaked its fluid over my HD. Took those out; currently running a mixture of Delta and Panaflo fans. Going to try some Yate Loons, but the CPU is around 38-40C idle (1-2C above the MB temp) and around 56-58C under load. Hard drives stay around 36-40C.

Misc:
APC Smart-UPS 1500 with a 24" LG monitor and Logitech Z-560 speakers connected to it. So far it never goes past 40% load. It has a 5-light display, each light representing a 20% block; usually only one is lit, but with some load tacked on, the second light comes on.

PS: In case anyone is wondering, I'm stable through 72+ hours of Prime95 Blend, 20+ runs of IBT, ~24 hours of OCCT, and 24+ hours of Memtest.
 

skyking

Lifer
Nov 21, 2001
22,386
5,360
146
Backups and media = files. I would suggest building a dedicated file server on a server OS and keeping it separate from the desktop machine. It doesn't need much horsepower to do the job; a 2GHz machine with a gig of RAM is plenty.
 

sxr7171

Diamond Member
Jun 21, 2002
5,079
40
91
Okay this is a completely different approach but hopefully cheaper and definitely quieter and more power efficient.

1. Full service: http://www.tranquilpc-shop.co....talog/SQA-5H-3000.html

Self-service: http://www.tranquilpc-shop.co..../BAREBONE_SERVERS.html

2. http://www.newegg.com/Product/...x?Item=N82E16816111029

For your eSATA port. That gives a total of 18TB, or 9TB mirrored in Windows Home Server; if you prefer, run another OS. All this while sipping 20W plus drive power. It will be on 24/7, and really its purpose is to give you access to your data for the least amount of power consumption. With the option to add 4 more 4- or 8-bay USB units for over 64TB of potential storage.

Just an option.
 

pjkenned

Senior member
Jan 14, 2008
630
0
71
www.servethehome.com
Very similar to my server setup. I'm actually running:

Core 2 Duo E6420 @ 2.13GHz (had it sitting around; not sure if I'll stick the Q6600 in there, for TDP reasons)
Gigabyte EP45-UD3P
4GB OCZ Platinum DDR2 (had laying around)
EVGA GeForce 7200GS (has analog and dvi outputs. passive cooling, and was in the spare parts bin)
Raid 1 2x 7200.11 1TB drives for OS
Raid 6 8x 1.5TB Seagate 7200.11's (going to either 10 or 11x drives as some may migrate from the main system)
Adaptec 31605 (16 ports and cheap)
Antec 1kw power supply (overkill by a ton, but I again had it sitting around)
Case: I went with the Norco 4U ($75 free shipping at newegg and GREAT for this application w/ the drive cages)
HD Cages: iStarUSA 5 in 3 trayless. (most important thing)
For the UPS I use the IBM eServer 1500 which is almost the same as the APC unit with a few extra watts of juice.

OS wise I got WHS... it's great. 32-bit is a bummer.

Just so you know, I'm having no problems running the 31605 in the second GPU spot. In my main PC with the X58-Extreme I'm running a 3805 on the x4 slot, dual GTX285's in PCIe x16_1 and 2, and PCIe x16 3 has an Adaptec 5805.

If you are going the BYO storage route, I'd suggest Openfiler as the free version or Windows Home server as the pay option. I will say, after trying FreeNAS and Openfiler, then moving to WHS, I really like the WHS box. Plus, it's easy to make with old hardware. I'll be updating my site with more information in the coming weeks (going to be traveling this week and am anticipating my grandfather passing at 96 this week). Feel free to PM me with any questions or just ask here.
 

jonnyjl

Member
Feb 11, 2008
26
0
0
So x8 might be a possibility.. interesting.

I'd really prefer not to do another box until I move. What might be up for consideration is building a slightly smaller internal RAID array now (maybe RAID 5 with 4x 1.5-2TB), keeping that internal, and going bigger when a separate server is built. Maybe go faster on the internal array once that's built.

pjkenned,
Liking that 4U case.

What OS do you run on your main PC, are you generally impressed with the Adaptec card?

 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: jonnyjl
RAID Controller:
From what I can gather from reading various forums/websites and such is 3ware's probably the best bet. Specifically, 3ware 9650SE-8LPML. If I could go x8, I'd seriously consider an LSI card, especially with the issues the 9690 seems to have. I'd most likely get the backup battery unit if I went with 3ware or LSI. I'm open to suggestions, like I said I'm new to building an array of this magnitude.

For this application is there a specific reason you aren't considering Areca? 4GB cache, IOP348.

http://www.areca.com.tw/produc...cietosas1680series.htm
 

pjkenned

Senior member
Jan 14, 2008
630
0
71
www.servethehome.com
Main PC's OS is Vista Ultimate x64, gaming notebook is Vista Home Premium x64, work notebook is XP Pro, Dell Mini 9 is OSX. I also have a box I use just for Linux and another XP MCE (32-bit) box that I use whenever I need something 32-bit or XP-based, e.g. some flash utilities.

The XP box uses the Dell Perc 5/i's. If you go with a small storage array and just want something cheap, they are great for $120. You can see some benchmarks on my site; the IOP333 worked fine with the 1.5TB Seagates, yet the XOR calculations proved too great with 8 drives. Then again, IF you have a storage server, the Perc 5/i is perfectly capable of saturating 2x bonded GigE ports. This is the same reason I'm using the 31605 on the storage server: it has a slightly faster 800MHz IOP333, and running RAID 6 I know the controller is a "bottleneck" for local applications, but WHS isn't that efficient, and the I/O goes over bonded GigE links, so it only needs to pull 250MB/s to reach the theoretical GigEx2 bandwidth. Of course, with the overheads involved, 200MB/s is probably the best that server will ever put out. Also, the 31605 has 16 onboard ports, which is super since the 4U case can handle over 8 drives, and it was cheaper getting the 31605 versus an expander.

Finally, I would certainly look at the Areca 12xx series, 1680 series, and the Adaptec cards. Personally, it made sense to go with Adaptecs at this point because I have enough boxes where making sure everything is accessible via one online Raid management tool is important.

Again, I own two Perc 5/i's, and they are GREAT cards for raid 0/1/5 (6 isn't supported). A major plus is also that they are re-branded LSI MegaRaid cards so you can flash to the LSI firmware and use the MegaRaid utility, which I slightly prefer to the Adaptec one.

Also, if you wanted to save a few bucks with the Adaptecs, go to ebay and look for external port models. For example, the 3805 is normally $450-550 which isn't too great. The 3085 I bought off of ebay was $125 (no BBU making the Perc 5/i still much cheaper). I have been at least considering picking up a spare 5085 for $250 on ebay since that is HALF the price of an ebay 5805. I managed to get my new/sealed/with receipt 5805 for $410 purchasing locally off ebay/craigslist from someone who was unloading extras from client work.

Getting a nice raid card is cheap. Even that 3085 works perfectly in the x4 slot on the X58 Extreme, with the 5805 installed in the graphics slot. For $125 shipped it was a great way to add 8 SAS ports and keep my main pc/ WHS managed on Adaptec software.

I think the Highpoint 4320 card that is on sale at Newegg could be a great value on a dual core IOP348 card, albeit with 256MB of cache. Just remember to add the BBU price to anything you buy unless you are willing to take the risk of losing a raid 5/6 array due to a power failure or the like.
 

jonnyjl

Member
Feb 11, 2008
26
0
0
At first glance, it doesn't seem like the Areca 1680 brings anything to the table for the price point. I don't mind paying upwards of $900, but it needs to be pretty compelling. I'm going to read up on the software management.

Are there any good review sites for this stuff?

I think at this point the Adaptec 5805 is the front-runner.

 

pjkenned

Senior member
Jan 14, 2008
630
0
71
www.servethehome.com
So, to be fair, the Areca 1680 is supposed to be best with SSDs.

As for review sites, especially for the desktop user, you pretty much need to Google and compare. It's safe to say, though, that these cards generally come in families. For example, performance generally goes:
IOP333 @ 500MHz -> IOP333 @ 800MHz -> IOP348 @ 800MHz -> IOP348 @ 1.2GHz dual core

Some of the non-Intel IOP parts fit in there, but I've never seen them beat the IOP348 @ 1.2GHz. Of the above, I own both mentioned IOP333 iterations and the IOP348 @ 1.2GHz, and it is clear that the IOP348 is a great chipset. If you look at my site there's a benchmark of a Perc 5/i (read: LSI) w/ 512MB of RAM and another of the IOP348-based 5805. You can clearly see where the IOP333 is capped in RAID 5, as the 7200.11's can put out sequential reads of well over 110MB/s each, and there are eight attached.

And lest you count out the 5085, the external SAS connectors are pretty cool if you want to have an external drive enclosure. Also, it looks a bit ghetto, but you can bring them back internal. Price wise, the 5085 is a steal at $250.

Finally, remember that once you move to an external server, getting more than 250MB/s read/write out of a card is almost moot, since that will be your max dual bonded GigE speed (well, unless you want to spend tons and buy quad-port adapters for server and host).
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
My 1680ix can be managed without even having an OS running! All that's needed is any PC on the same network (it can be managed from anywhere with a VPN) and a compatible browser. Both Firefox and IE work without any problems.

Arrays can be managed, monitored (alerts sent out via email, etc.), and even rescued from this console. Absolutely no software is required for management of these adapters.
 

jonnyjl

Member
Feb 11, 2008
26
0
0
Good points pjkenned, thanks.

Hopefully, if SSD prices become more reasonable, then when I move to an external server I can keep any I/O-intensive items in my case (like VMs/games) and keep purely storage items on the server (backups/media). The boot disk doesn't concern me since I'm on a desktop that runs 24/7, and everything that resides on my system disk seems snappy enough.

By then maybe the Areca will be more reasonable; unless I'm just looking in the wrong place, the entry-level 1680s seem to run at $700.

Out-of-band management (I think that's the right term) seems appealing.
 

jonnyjl

Member
Feb 11, 2008
26
0
0
I know I didn't want to ask any Software questions, but I guess I might as well.

Are there any good utilities to check the array, like write/read/verify the entire array? I haven't had a chance to look at the controller manuals, so maybe it's just through their utilities. I would of course want something that runs within Windows, since doing that on a 4+TB disk is going to be time-consuming, to say the least.
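Most controllers only expose verify/scrub through their own management utilities, but a crude whole-volume read check is just one sequential pass over the device. A minimal, hypothetical sketch in Python (opening a raw volume needs admin rights and platform-specific path syntax, e.g. `\\.\E:` on Windows; this is no substitute for the controller's own parity verify):

```python
import hashlib

def surface_read(path, chunk_mb=4):
    """Sequentially read an entire file/volume so the drives must
    return every sector; returns (bytes_read, sha256_digest) so two
    consecutive passes can be compared for consistency."""
    h = hashlib.sha256()
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_mb * 1024 * 1024):
            h.update(chunk)
            total += len(chunk)
    return total, h.hexdigest()
```

Two passes with matching digests at least prove every sector reads back identically twice; checking parity against data still has to come from the controller.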
 

taltamir

Lifer
Mar 21, 2004
13,576
6
76
Get OpenSolaris from genunix.org;
they also have instructions for setting up CIFS (for working with Windows SMB shares).
Use ZFS raidz2 (RAID-6 without the usual problems of RAID-6, though it's only compatible with the ZFS filesystem).

ZFS is an awesome filesystem that actually checksums your data end-to-end. Every 2 months or so I find 1 error has occurred on my 5-drive raidz2 (RAID-6) array (which is natural in drives); on any OTHER filesystem it would have caused data loss, but ZFS detects it and automatically corrects it.

It is also completely controller-independent. I can, and HAVE, formatted the OS drive and installed a different OS that supported ZFS... all I had to do was type:
# zpool import -f tank

And it instantly imported the array. ZFS is fast and reliable. The only better filesystem is the Google File System, which is one of Google's top trade secrets and advantages over its competitors.
 

jonnyjl

Member
Feb 11, 2008
26
0
0
Ruby,

Do you have the Areca card? I'm going back and forth on this lol.

God, I wish Seagate would just release their 2TB SAS drive and let me build it now....
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Yes I have the 1680ix-24. Seagate is releasing Constellation series enterprise SAS drives this summer.

 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
I don't have the 16xx series (I have the 1280ML here), but my experience with Areca in general (products and technical support) is what motivated me to throw their name into the hat. They are professional grade.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Originally posted by: Idontcare
I don't have the 16xx series, have the 1280ML here, but my experience with Areca in general (products and technical support) is what motivated me to throw their name into the hat. They are professional grade.

The 1280 is actually better IF you plan to ONLY run SATA drives; it will be a little faster. For strictly SAS setups the 1680 is very good and can address 4GB of cache, although that DIMM is still very expensive: about 1/4 the cost of the controller (24-port version)!

Never had to use Areca support. There are plenty of good folks over at the 2cpu.com forums who know them well. There's a lot of hate too, due to "dropped drive syndrome," but I have not experienced this issue across several SATA and SAS products with varying configurations. The cables shipped with the 1280ML were kind of crappy, so if you start to have drive issues, replace them ASAP.
 

Idontcare

Elite Member
Oct 10, 1999
21,110
59
91
Originally posted by: Rubycon
The cables shipped with the 1280ML were kind of crappy so if you start to have drive issues replace them out ASAP.

That is good to know. My support issues so far have been limited to them helping me debug the 2GB cache DIMM I bought from a 3rd party, and getting my card's firmware updated so it would work with my i-RAMs.

Firstly, the response time from technical support is silly fast (<2hrs in all cases), and the lengths they went to in order to resolve my issue were ridiculous: they bought the exact same DIMM I bought just to test it out for me in Taiwan, isolated the specific JEDEC spec violation on the DIMM (it afflicted all the DIMMs from this particular 3rd-party OEM), and recommended another DIMM for $3 more from the same supplier, which they pre-tested to confirm it worked... can't beat that kind of tech support.
 

Rubycon

Madame President
Aug 10, 2005
17,768
485
126
Sounds good. LSI service is good too as long as you have a valid E number.

I was lucky with a Crucial DIMM. Cost $55 then and now it's under $30. :Q

I have a 1680 (non IX) on an Oracle box running R6 that gets hit HARD and it alarmed on me once - DRAM 1bit error. The box did not go down. Has not happened again but I have another DIMM in case. I don't like hearing that beeper unless the boxes are restarting or dinner's ready. :laugh:
 

pjkenned

Senior member
Jan 14, 2008
630
0
71
www.servethehome.com
The alarm is one of two things the Perc 5/i's don't have that caused me to move away from them. Raid 6 is the other.

Running external storage is great, as is having a fast main PC drive for a home user. The 8x Savvio 15k farm has been serving me well.

I will say that if I could have gotten a new 1680 8-port for $500 + $120 BBU, or a 16-port for $700 + BBU, I would be running one or more of those. Alas, a new 5805 for $410 (found after work at 8:30PM) was too tempting.

On the other hand, I am interested to see AT's article on the Highpoint cards, as there is a chance they could be decent on the price/performance scale.

For home use, especially in an external server, I would strongly suggest making a clear plan first. Start with the application, then work backward to the hardware so you build once and not in 5 iterations of a storage solution that will last 24 months. That tends to waste money if you are a home user.

An example would be if you want an external file/media server: you are going to be doing a lot of sequential reads/writes, especially with 1-10 clients hitting it at a time. This is very different from a business database server that will see hundreds of users with lots of simultaneous random reads/writes.

This is important because you have to remember that you are going to be limited by network speeds at some point. Building an in-PC array you have PCIe 2.0 x8 bandwidth (I don't remember off the top of my head, but it is probably in the area of 4GBytes/s). GigE is going to be 1000Mbits/sec max, or double that for two bonded controllers in a perfect world. The difference is that you will not saturate the PCIe bus with spindle drives, but you will saturate your server's network connection. Fundamentally, that meant I was looking for different things in my desktop versus my server.
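Back-of-the-envelope, the gap looks like this (a quick sketch with assumed round numbers; the constant names are mine, not from any spec):

```python
# Rough bus-vs-network ceiling comparison (assumed figures, ignoring
# protocol overhead beyond 8b/10b encoding on PCIe 2.0).
PCIE2_LANE_MBPS = 500          # 5 GT/s * 8b/10b = 4 Gbit/s = 500 MB/s per lane
pcie_x8 = 8 * PCIE2_LANE_MBPS  # 4000 MB/s

GIGE_MBPS = 125                # 1000 Mbit/s = 125 MB/s
bonded_x2 = 2 * GIGE_MBPS      # 250 MB/s for two bonded ports

# Eight 7200.11s at ~110 MB/s sequential would saturate the network
# long before the bus:
drives = 8 * 110               # 880 MB/s aggregate sequential read
print(pcie_x8, bonded_x2, drives)  # 4000 250 880
```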

Also, don't buy a card today planning to use some new high capacity drive tomorrow. There is a decent chance the new drive will require a firmware update which may be ready at the drive's launch or months later.

Finally, consider how long you plan your solution to last. For me, I expect the Savvio farm to last 12-18 months, until I anticipate jumping on the SSD bandwagon. On the other hand, I expect the server to last 24-36 months. I'm expecting my server to hit 40% capacity by the end of May, excluding PC backups, and that figure should climb steadily at a nominal 5%/month, save for months I'm 100% on the road. By 24 months out I expect new drive generations to be out, possibly using the new SATA interface or with cheap SSD storage, and new controller generations to be available supporting them. At that point I may even build a new home server.
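That growth estimate is easy to project forward (a hypothetical helper; linear growth assumed, as described above):

```python
def months_until_full(start_pct, growth_pct_per_month):
    """Months until a linearly growing volume hits 100% capacity."""
    months = 0
    used = start_pct
    while used < 100:
        used += growth_pct_per_month
        months += 1
    return months

# Starting at 40% and adding a nominal 5%/month fills the array in a year.
print(months_until_full(40, 5))  # 12
```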

The key is determining what your needs are and how you can satisfy them in the most cost-efficient manner, then going down that path.

I would suggest that it may be better to wait until
 

jonnyjl

Member
Feb 11, 2008
26
0
0
Good points all around everyone.

pjkenned, I definitely plan on waiting until the drives are released before buying the card.

The most I would do, if I settle on Areca, is buy an extra memory module. It looks like you can get 2GB for around $35, which seems like a good compromise between 512MB and 4GB.

I just wish Seagate would release their drives already, or at least announce pricing (I don't think that's known), so I know whether I want to wait or not.
 

imported_wired247

Golden Member
Jan 18, 2008
1,184
0
0
Originally posted by: Idontcare
Originally posted by: jonnyjl
RAID Controller:
From what I can gather from reading various forums/websites and such is 3ware's probably the best bet. Specifically, 3ware 9650SE-8LPML. If I could go x8, I'd seriously consider an LSI card, especially with the issues the 9690 seems to have. I'd most likely get the backup battery unit if I went with 3ware or LSI. I'm open to suggestions, like I said I'm new to building an array of this magnitude.

For this application is there a specific reason you aren't considering Areca? 4GB cache, IOP348.

http://www.areca.com.tw/produc...cietosas1680series.htm


based on the OP I vote areca as well
 

jonnyjl

Member
Feb 11, 2008
26
0
0
I know it's been a while, but I think I may have narrowed down what I want. Any opinions or comments?

I've changed my mind: I do want my data backed up, so I'm thinking of using the RAID array purely as backup/temporary storage. I'd start off with 5 disks and grow when needed, adding internal disks as I need them. I'll go SSDs for my I/O-intensive tasks and 2TB LP drives for media. After the array is built I can probably go a couple months without upgrading, since I can re-allocate a 1.5TB for media duty and bring in my "old" 1TB drive.

Raid Array

Initial:

5x 2TB Seagate or WD low-power drives. Since my use has changed, 7200RPM SAS/SATA drives don't seem appropriate.
Areca ARC-1680ix-12 SAS RAID controller w/ BBU and upgraded RAM (still researching the RAM).
8x 3.5" external SAS enclosure. I'm unsure what to use, but I'm looking at this thing from PC-Pitstop or the EnhanceBox E8-MS. I'm leaning toward the EnhanceBox since it's the only one I can find reviews on, but it says it only supports 1.5TB drives; do you think that's a limit of the backplane, or just a CYA statement since they haven't tested 2TB drives?
Which means I'll need a SAS adapter to route the internal connectors (probably both) to an external connector (right? because I'm not getting an expander).
 