So I recently took a part-time gig helping a netadmin while he recovers from back surgery (the kind where they fuse discs together… oww!). Anyway, right now they’re running off a T-1 (looking at solutions to get rid of that too) and using tapes to back up approximately 2 TB. They’re looking for a backup service with reasonable prices that’s very secure and reliable. I don’t think with our bandwidth it’s going to be any quicker to back up the servers to an offsite location than it is to tape. However, the bandwidth problem will possibly be resolved through Qwest or Comcast or something like that.
Some of the prices being offered for these services are just too high and not worth the money to this company. I actually wondered if Carbonite would offer an enterprise solution, but I didn’t see anything on their website, so that doesn’t appear to be an option. I did send them an email just to see what they say, though.
2TB is not exactly a lot of data. Why not spring for a small NAS and use that as an on-site backup? You could probably put one together for less than $1500.
If it needs to be offsite, then you can just get a safe-deposit box to stash your tapes in.
Well, Carbonite does offer Carbonite Pro for businesses. It’d be pricey to back up 2TB though: about $1000 a month through Carbonite Pro. And services like Carbonite, Mozy (which is around the same price), etc. are at the low end of what you can expect to pay. It’s definitely cheaper over time to back up to LTO tapes and offsite those. We use a service called DataSafe to offsite tapes, and their rates are very reasonable. Or, if you have multiple offices, you could just bring them to a different office.
Of course, while it doesn’t feel “enterprise-y”, buying a couple external 2TB disks and backing up to those on a rotating basis would be a very cheap way to go. And you could offsite those as well.
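If it helps to picture it, here’s a minimal sketch of that rotation in Python, assuming the data lives in D:\Projects and the two externals mount as E: and F: (all of those names are made up):

```python
import datetime
import os
import shutil

# Hypothetical paths -- point these at the real data share and the
# drive letters the externals actually mount as.
SOURCE = r"D:\Projects"
DRIVES = [r"E:\backups", r"F:\backups"]

def backup():
    # Alternate drives by ISO week number so each disk can sit offsite
    # every other week holding a complete, dated copy.
    week = datetime.date.today().isocalendar()[1]
    root = DRIVES[week % 2]
    target = os.path.join(root, datetime.date.today().strftime("%Y-%m-%d"))
    shutil.copytree(SOURCE, target)  # full copy; a real setup would go incremental
    print("Backed up to", target)

if __name__ == "__main__":
    backup()
```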
Yes, I think the best solution at this point is exactly what you both suggested: just buy some more hard drives. DataSafe told me they only service Northern California. Either way, it really looks like any of these services are going to cost more than the company wants to invest. Thanks for the replies!
Amazon cloud is only $0.15/GB last I checked, which might be a little more cost-effective than the $0.50/GB Mozy charges.
A few things to note that may help with the sale. If an IT guy is spending, say, 4 hours a week at $20/hour managing backups, plus the cost of any replacement tapes, they could easily be spending $300-400/month on the tape setup anyway, whereas an automatic solution will cost a similar amount but, once in place, frees up IT resources for other tasks.
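For what it’s worth, the back-of-the-envelope math behind those two posts (the rates are the ones quoted in this thread; the labor figure is the estimate above):

```python
data_gb = 2000  # ~2 TB, per the original post

# Per-GB monthly rates quoted in this thread
s3_rate, mozy_rate = 0.15, 0.50
print("Amazon: $%.0f/mo" % (data_gb * s3_rate))    # ~$300/mo
print("Mozy:   $%.0f/mo" % (data_gb * mozy_rate))  # ~$1000/mo

# Labor cost of hand-managed tapes: 4 hr/week at $20/hr
print("Tape labor: $%.0f/mo" % (4 * 20 * 52 / 12))  # ~$347/mo, before media
```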
It could also very well be that they’re carrying a lot of archival storage that could just be backed up to an external drive or two and set aside, without worrying about daily or weekly backups for it. 2TB of live work in progress is A LOT unless you’re editing HD video or something. Maybe they could get away with 200-300GB offsite and a DVD binder of the old stuff in a fire box.
I think the killer is going to be the cost of the bandwidth needed to back up that amount in a reasonable time. 100 Mbit links don’t come cheap; 2x DS3 runs around $10,000/month. An onsite backup system is going to be far more cost-effective, and tapes or hard drives can be physically taken off-site.
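To put rough numbers on that (assuming you only get ~80% of line rate in practice):

```python
def transfer_days(data_tb, link_mbps, efficiency=0.8):
    """Days to push data_tb terabytes through a link_mbps line,
    assuming ~80% usable throughput."""
    bits = data_tb * 1e12 * 8
    return bits / (link_mbps * 1e6 * efficiency) / 86400

for name, mbps in [("T-1", 1.544), ("DS3", 44.736), ("100 Mbit", 100.0)]:
    print("%-8s %6.1f days" % (name, transfer_days(2, mbps)))
# T-1 ~150 days, one DS3 ~5 days, 100 Mbit ~2.3 days for a full 2 TB pass
```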
I’ve got just one question: what thought has been given to the restore process?
As mentioned above, USB3 is preferable to USB2 for speed, but you can also go with eSATA. If the machine doesn’t have a USB3 or eSATA port but does have an internal SATA port available, you can add eSATA for less than two bucks: http://bit.ly/gwVvKu
eSATA is exactly the same speed as an internal SATA disk. You can add a USB3 card for around 30 bucks but the eSATA option is even cheaper.
The software is smart: it only creates new backups of files that changed, so even a pretty hoppin’ business is probably only generating a tiny fraction of that 2TB a month, maybe a couple hundred new documents.
I am backing up 146GB to the Amazon cloud myself; most of it is disk images, so if my shop burned down or was robbed I can recreate all my software/Windows disks. The actual backup takes less than 5 minutes because 99% of it does not change.
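The “only upload what changed” idea is simple enough to sketch. Here’s a hypothetical version against S3 using boto3 (not necessarily what any of these services do under the hood; the bucket name and local path are made up, and the ETag-equals-MD5 shortcut only holds for single-part uploads):

```python
import hashlib
import os

import boto3  # pip install boto3; AWS credentials via the usual config
from botocore.exceptions import ClientError

BUCKET = "example-shop-backups"  # hypothetical bucket
ROOT = r"D:\Images"              # hypothetical local folder of disk images

s3 = boto3.client("s3")

def md5(path):
    h = hashlib.md5()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

for dirpath, _, files in os.walk(ROOT):
    for name in files:
        path = os.path.join(dirpath, name)
        key = os.path.relpath(path, ROOT).replace(os.sep, "/")
        try:
            # ETag matches the MD5 for single-part uploads, so an
            # unchanged file can be skipped without re-sending it.
            if s3.head_object(Bucket=BUCKET, Key=key)["ETag"].strip('"') == md5(path):
                continue
        except ClientError:
            pass  # not in the bucket yet
        s3.upload_file(path, BUCKET, key)
        print("uploaded", key)
```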
It’s an architectural firm with about 35 users, and the backups are actually running around 1.6 TB. Lots of CAD. And yes, there are a lot of things we are trying to get moved off the servers, but getting the people in the firm motivated to do so is not an easy task. And seeing as this is the least anal admin I’ve ever worked with, there is not a very hard push to get things moved to local drives. Even in a near-perfect world, though, there will probably always be at least a terabyte of storage needed. As was said, this is not really a large amount of data, but it still seems very expensive to have it handled by a storage service.
I spoke with the lead planner today, and he seems to be leaning towards just switching from tape to hard drives and rotating them.
Quartz: Excellent question. As of now we just restore off of tape using Symantec Backup Exec. I’m not very fond of this software because of the way they showed me you have to hunt for what you’re looking to restore. I guess it’s possible there is a better way in this software, like a search feature or something; I’ve only used it twice. As far as restoring in the future, that is definitely something we have to consider. USB3 hard drives would be great. It is running on Server 2003 on a Dell rack (not sure about the hardware specs). Not exactly “state of the art”. I’ve seen some USB3 cards that are supposed to work with 2003.
I’m not sure that I agree with the idea of moving stuff off the server onto local drives. What’s going to happen when the hard drive on one of those desktop systems crashes and you lose the data on it? Unless the data on the local drives is stuff you don’t care about losing.
But how often would a business repeat that whole-server backup? Quite frequently, I’d suggest. And after a disaster, a business would want to be back up and on top of things, if not fully running, in less than 5-6 days.
I agree. Doing that spreads your losses: the chance of a single large failure is greatly reduced, but a small-scale failure becomes almost inevitable, and that can be a huge problem on its own. However you do it, you have to go with managed server backups to shared folders. Buy an external 2 TB hard drive tomorrow for < $200 just to get the process started.
I also agree that backups are useless without a good restore plan. I work for a company everyone has heard of, and we know for a fact that some of our restores don’t work, because we test them every year and some of them fail. You need exact hardware replication to make some of it work, and some of it is so old we don’t know how to get it anymore, at least not quickly. Good disaster recovery means that you can survive a fire, an earthquake, if not a nuclear war. Companies like Iron Mountain can off-site backups so that any of those could be recovered from. Anything less and you aren’t doing it right. Those architectural plans may be just files to you, but they are worth a whole lot in total cost to produce.
Why would you want things pushed off the server and onto local drives? That doesn’t make any sense.
The numerous positives of having data on the server and the numerous negatives of having data locally are both so well established that I don’t think I’ve seen anyone try to store data locally since the early ’90s.
What can be done is having a share on selected workstations to store backups of data from the server. I do something similar in many small businesses without servers: one machine usually hosts the primary applications and/or document shares, and I have a folder on another machine that the host machine backs up files to.
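A concrete (and entirely hypothetical) version of that setup, with the host mirroring its document share to a second workstation each night; the machine and share names are made up:

```python
import subprocess

# Hypothetical paths: the host machine's document share, mirrored to a
# folder shared out by a second workstation.
SOURCE = r"C:\Shares\Documents"
DEST = r"\\WORKSTATION2\Backups\Documents"

# /MIR mirrors the tree (copies changes, prunes deletions); run this
# nightly from Task Scheduler on the host machine.
subprocess.call(["robocopy", SOURCE, DEST, "/MIR", "/R:2", "/W:5"])
```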
Once the server is back online, you selectively restore what is critical now (work in progress); you’ll have that in an hour or two, and restores of archival data to the server can trickle in. Remember, the files are there and safe in the offsite storage; they will come back.
The applications/OS tend to be pretty static; only the data changes on a day-to-day basis. You could have a month-old image of the main array on an external and it’s no biggie. The killer is gonna be the work in progress. How many software updates were there in that time? Probably none.
Having access to 10-year-old drawings might be nice, but if 1TB of that 1.6TB is basically archives that haven’t been touched in years, it’s time to drop those projects to DVDs in a binder, or a couple of copies on external drives, and set them aside; then drag out the binder if someone needs them.
If something huge happens and you end up doing heavy work on a server, your images are gonna be worthless anyway, barring server virtualization. Saving the work in progress is what’s gonna keep the business alive.
Also, I’m pretty sure that the online backup services like Mozy or Carbonite will send you the backup on DVD (or other media) if you can’t wait for the restoration to happen via the online service.
My proposed solution, if this was my customer, would be as follows (a rough script sketch follows the list):
Install a 2TB drive in a different machine in the office, preferably with gigabit network hardware all the way between it and the server, and share it to the server.
Have the server write its backup image to that drive. At gigabit network speeds this is not gonna be a huge problem, and you can probably get away with weekly incremental backups to capture OS and software updates along the way.
If you want, you could probably do two machines and alternate, using a different one each week, so even if the server and one of the “backup” drives are hosed, you still have a week-old image on the other box.
Set up offsite backup on the “work in progress” items, probably only about 10% of the total data. It’ll run on its own 2-3 times a day, and it’s only gonna take 10-15 minutes tops.
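The LAN side of that plan could look something like the sketch below. The machine names, shares, and paths are all hypothetical, and robocopy isn’t built into Server 2003 (it ships in the Resource Kit):

```python
import datetime
import subprocess

# Hypothetical shares on the two alternating "backup" machines
TARGETS = [r"\\PC-ALPHA\ServerBackup", r"\\PC-BETA\ServerBackup"]
SOURCE = r"D:\Data"                 # whatever the server is imaging/exporting
WIP = r"D:\Data\Projects\Current"   # work-in-progress subset for the offsite agent

# Alternate targets by week so one box always holds a week-old copy,
# even if the server and the other box are both hosed.
week = datetime.date.today().isocalendar()[1]
target = TARGETS[week % 2]

# Mirror over the gigabit LAN; /Z restarts interrupted files.
subprocess.call(["robocopy", SOURCE, target, "/MIR", "/Z", "/R:2", "/W:5"])

print("This week's LAN target:", target)
print("Point the offsite/online agent at:", WIP)
```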