Best Practices for Backup to the Cloud – Part 3

It can sound so easy to back up to the cloud. Just plug in a backup appliance and it does all of the work. While there is an element of truth to that, there are certain steps organizations should take to ensure they are getting the results they expect when implementing a backup appliance that backs up to the cloud. In this third and final segment of my interview series with STORServer’s Jarrett Potts, we discuss best practices for backing up to the cloud and recommend some steps that organizations should take to maximize backup and recovery times while minimizing costs.  

Jerome: Any best practices that you recommend users follow when deploying and using a STORServer Backup Appliance?

Jarrett: If they have an environment with 10 to 15 servers, a small headcount, and no secondary site to which to replicate data, then a public cloud is the obvious choice. From a best-practices perspective, STORServer allows them to do incremental-forever backups and to set up a routine that ensures their most important data, their production data, is protected first. Mission-critical devices and applications are backed up first, and all other data follows.

What we are really talking about is the ability to discern which data is the most important. One of the best practices for smaller companies backing up to the cloud is to get rid of data that they just do not need.

In customer environments, it is often said that somewhere between 40 and 60 percent of the data in an SMB (small and midsized business) account has not been used in the last year. Yet even data that has not been touched in a year still gets backed up. If STORServer can speed up recovery time by 40 percent by getting that inactive data off of the original system, it's a win/win for everybody.

This is one of the things STORServer suggests as a best practice: go out, find all of your old data, and archive it to the cloud. It is still there, so you can retrieve it at any time, but it is off your production and/or primary systems. The backups then no longer include the 40 percent, or whatever the amount is, of data that has not been touched in a year.

This means your backup is 40 percent faster and your recovery is 40 percent faster. Further, you just freed up 30 percent of your production disk so you might not have to buy storage for a longer period of time. So, it is always a best practice to get rid of old and aging data.
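The stale-data scan Jarrett describes can be sketched in a few lines of Python. This is only an illustration of the idea, not part of STORServer's product; the root path and the one-year threshold are assumptions:

```python
import os
import time

STALE_SECONDS = 365 * 24 * 3600  # one-year threshold (illustrative)

def find_stale_files(root, now=None):
    """Return (path, size) pairs for files not modified in roughly a year,
    i.e. candidates for archiving off primary storage before backup."""
    now = now if now is not None else time.time()
    stale = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            st = os.stat(path)
            if now - st.st_mtime > STALE_SECONDS:
                stale.append((path, st.st_size))
    return stale
```

Modification time (`st_mtime`) is used here because last-access time is often disabled (`noatime`) on production systems; a real archiving tool would likely consult both.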

Then, of course, there is another best practice that is not so much a technical one. When you pay for a cloud service, you normally pay by the TB, either by month or by year, so you have to be very careful about what you send to the cloud. This is especially true for small and midsized customers that do not have the infrastructure to sort through data. As they back up servers and perhaps workstations, they may be sending data to the cloud that does not need to be there.

For example, I run a small nonprofit organization out of my home. The data associated with this is located on two laptops that I back up to the cloud using STORServer. One of the things that I noticed was that the amount of data I was pushing was too high for what I was trying to do.

I looked to see what I was backing up and discovered that I was backing up my iTunes directory, which is 20 or 30 GB. I excluded that directory and immediately my backups became much faster because they were no longer churning through the MP3 files and pictures in my iTunes directory. The same is true for my email directory: I excluded it because that data lives on my server at work, so I do not need to back it up from my laptops.

Most people probably do not want to back up every file on their local machines and shared IT directories, and they probably do not want to back up PST files from Exchange. There are certain types of data to consider excluding or including when you go to the cloud, to make the backup a little more intelligent.
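An exclude list of the kind Jarrett describes can be modeled with simple pattern matching. The patterns below (media library, mail directory, PST files) are hypothetical examples drawn from the discussion, not STORServer's actual configuration syntax:

```python
import fnmatch

# Hypothetical exclude patterns for data already protected elsewhere or
# not worth sending to the cloud: media libraries, mail, Outlook PST files.
EXCLUDES = ["Music/iTunes/*", "Mail/*", "*.pst"]

def should_back_up(relpath, excludes=EXCLUDES):
    """Return False when a relative path matches any exclude pattern."""
    return not any(fnmatch.fnmatch(relpath, pattern) for pattern in excludes)
```

Trimming the backup set this way shrinks both the backup window and the per-TB cloud bill.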

Jerome: Does STORServer help companies with their backup appliance deployments?

Jarrett: STORServer sits down with a customer and they tell us what their most important systems are. They identify which ones are tier one, which are tier two, which data is mission critical and which is not. Then, there is even a tier three that has everything else.

Once the data is tiered, STORServer on the back side makes sure that the tier one data has the most resources when necessary. The tier one data is the most important data, so when that comes in for backup and recovery, it takes priority over everything else.

This is something that we set up inside of the appliance. But, we also have to consult with the customer because they know best which data is the most critical in their environment.
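The tiering Jarrett describes amounts to a priority ordering of backup jobs. A minimal sketch, with illustrative tier assignments rather than anything from the appliance itself:

```python
# Illustrative tier assignments: lower numbers are more critical.
jobs = [
    {"system": "file-share", "tier": 3},
    {"system": "erp-db", "tier": 1},
    {"system": "mail", "tier": 2},
]

def backup_order(jobs):
    """Order jobs so tier-one (mission-critical) systems run first."""
    return sorted(jobs, key=lambda job: job["tier"])
```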

In Part I of this interview series, we discuss how backup to the cloud is coming into its own.

In Part II of this interview series, we discuss how to choose the right backup appliance for cloud backup.