The U of M’s backup pricing structure: Do these people live in Oz?

[Photo: "You must protect yourself from those evil marketing rays"; photo credit: Unhindered by Talent, Creative Commons License]

Backups are good.

Everybody says so.

Really, Really Good.

So you’d think the University of Minnesota would be working to provide a reasonable on-line, off-site backup system for its folks. Unfortunately, their pricing structure seems to be from another planet where storage is, like, well, really … uh … expensive!

To quote from the relevant web page:

OIT-UDMS Backup pricing (as of 2/2008)

Storage used       Cost per month
  < 128 GB         $25
  128 GB – 256 GB  $50
  256 GB – 384 GB  $75

To calculate the cost of backup service, simply round up the amount of data you need backed-up to the nearest increment of 128GB and use the formula of $25/128GB/month multiplied by the retention period (30 day backup is 4X the primary data). Backup data can be held for up to 90 days. Incremental backups are run daily and that data is retained for 2 weeks. Full backups are run weekly and those backups are held for one month.

We have the additional options of 2 weeks, 30 days and 60 days if your requirements for retention are shorter.
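To make the quoted formula concrete, here's a quick back-of-the-envelope sketch (hypothetical; the tier size and price are taken from the table above, and `monthly_backup_cost` is just an illustrative name):

```python
import math

TIER_SIZE_GB = 128  # billing increment from the pricing table
TIER_COST = 25      # dollars per 128 GB tier per month

def monthly_backup_cost(data_gb):
    """Round the amount of backed-up data up to the nearest 128 GB tier."""
    tiers = max(1, math.ceil(data_gb / TIER_SIZE_GB))
    return tiers * TIER_COST

print(monthly_backup_cost(100))  # 25 -- lands in the first tier
print(monthly_backup_cost(200))  # 50 -- rounds up to the second tier
print(monthly_backup_cost(384))  # 75 -- exactly fills the third tier
```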

Yup, that’s right. $25 a month for 128 GB of storage. I can buy 500 GB hard drives for under $100, so I could buy 3 drives (1.5 TB of storage) a year for the cost of their backup system. Mozy will back up an unlimited amount of data for $5/month for home users; I’m not sure what their enterprise pricing is like, but I kind of doubt that they’re going to jump to the U’s pricing.

Bet the U doesn’t get a lot of takers at these prices. Bet their staff aren’t backing up nearly as much as they’d like, either. Hmmmm … a relationship worth exploring?

2 thoughts on “The U of M’s backup pricing structure: Do these people live in Oz?”

  1. Huh. I haven’t seen organizations offering online backup like this before. So far I’ve only seen 3rd parties like Mozy offering it.

    Looking at the linked page, it sounds like they’re offering not online/offsite, but the far more traditional offline/offsite such as provided by bog-standard tape. Or its newer cousin, the backup-to-disk “virtual tape array”. I also note that they claim to not be taking any new backup customers, which tells me that they’ve already maxed out their backup capacity.

    Services like Mozy make their money on the fact that the limit is not imposed by them but rather by the upload speed of your primary internet connection, coupled with greatly limiting the number of versions that are archived. For a file that changes daily, such as an Outlook email archive file, the UofM archive could provide you with 12 versions of that file (8x incrementals, grabbed daily, + 4x on the fulls). So you could tell them “give me the file from 3 weeks ago” and they can give it, where with Mozy they probably can’t.

    Also keep in mind their pricing includes 4 full replicas of your data-set, plus 8 incrementals the size of which depends on your data-change rate. Worst case is they need to keep 12 full copies of your data. This is how 128GB turns into 1.5TB of consumed space. Add in the fact that this space is almost definitely being stored on some kind of SAN-based disk-array, and that $150 drive from NewEgg turns into a $450 drive. Then I go back and re-read your post and see this…

    I can buy 500 GB hard drives for under $100, so I could buy 3 drives (1.5 TB of storage) a year for the cost of their backup system.

    So their prices are actually not that bad, assuming worst-case storage consumption :).
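    The commenter’s worst-case arithmetic can be spelled out in a few lines (a sketch, assuming the retention schedule quoted in the post, with the commenter’s count of 4 retained fulls and 8 retained incrementals, and the pessimistic assumption that every incremental is as large as a full copy):

    ```python
    FULLS = 4          # weekly fulls, held for one month
    INCREMENTALS = 8   # daily incrementals, held for two weeks (commenter's count)

    def worst_case_consumed_gb(primary_gb):
        """Worst case: each retained backup is a complete copy of the data."""
        return primary_gb * (FULLS + INCREMENTALS)

    print(worst_case_consumed_gb(128))  # 1536 GB, i.e. the ~1.5 TB figure above
    ```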

    In the greater sense, there are more and more companies offering full enterprise off-site backup solutions based completely over the internet. They ship you an array that you hook up to your network for the initial copy, then take it back to their data center, and all backups from there on are handled by an agent on the server sending net-change blocks/files to their service. Not a good idea for a GIS class that routinely throws around 3TB of files, but GREAT for the small to medium business file-server.

    Non-traditional backups are really looking to change the industry. The U just hasn’t gotten there yet.

  2. Catbert, Denier of Information Services strikes again.

    It’s another one of the “We don’t want to offer this service, so we’ll make it as inconvenient as possible so no one uses it.” type of deals.

    I mean, that’s a small window of availability there. I’d reckon that project storage requirements tend to be exponential at the 100G range.

    99.44% of projects will require << 100G of storage space. Of the projects that require more than 100G, I’d bet that the requirements that inflated them to 100G will also push them past 384G pretty quickly.
