In part one of this blog series, we went over the history of backup appliances and their downsides in today’s environment. Proprietary backup appliances—dedicated hardware used for aggregating and pushing backup data to the cloud—were meant to solve problems with cloud-based backup, mostly around slow transfers over limited-bandwidth networks. When cloud backups first became viable, network speed was a major issue across the industry. Slow backups drained an organization’s resources and made backing up to the cloud unappealing, forcing IT professionals to rely primarily on on-premises backup.
However, many businesses don’t need backup appliances anymore. In fact, they can shed some of the “weight” of the backup box by choosing a solution that was built cloud-first.
Storing backups exclusively on premises has some downsides. Not only do you have to pay for the local storage hardware and its ongoing management, but you also risk permanently losing data if a site gets knocked out completely (say, by a natural disaster). The classic “3-2-1” strategy calls for three copies of your data on two different types of media, with at least one copy kept off site, yet setting up your own co-located secondary data center can be expensive.
Cloud storage can potentially eliminate these downsides. However, the laws of IT physics brought another challenge: local area networks (LANs) are typically faster than wide area networks (WANs). If you wanted fast backups (and, more importantly, fast recovery), you either needed to keep them on premises or come up with some other solution. Dedicated backup appliances were created to solve this problem. They acted as a convergence point for pushing backup data to the cloud during off hours, when the impact would be reduced, as well as a local place to store backup data.
However, this method doesn’t really solve the problem—it simply relocates it. This type of appliance takes some burden off the network, but adds a new expense and a new single point of failure. All hardware is subject to failure, and if a backup aggregator appliance fails, backups aren’t taken and your recovery point objective (RPO) suffers.
If we shift our mindset from the assumption of “local first,” and instead create a backup product that is specifically optimized for sending data to/from a cloud location via the WAN, everything changes.
Sometimes, the biggest breakthroughs come from constraints. If you don’t want aggregator hardware for your cloud-based backup, and you can’t make the WAN faster than the LAN, one solution remains: make the backup data smaller.
It makes sense—the smaller the amount of data being sent to the cloud, the shorter your daily backup time. The key is using a mixture of a few different techniques to make backups lightweight.
Compression reduces data sizes by using algorithms to represent redundant data in a more compact format. You’re familiar with compression if you’ve ever downloaded a .zip or .rar file. Compressing a file (or a group of files) makes it smaller and easier to send over the web. While most backup products offer some form of data compression, a product built from day one for the WAN will make efficient compression a key feature.
Deduplication removes redundant copies of files themselves. For example, if the same financial spreadsheet is stored on multiple PCs across the organization—maybe by the CFO and several accountants—file-level deduplication sends only one copy of that file to the cloud.
The real benefit, however, comes from a more granular approach: block-level deduplication. Here, the backup solution deduplicates the specific data blocks within files rather than whole files. This takes deduplication into overdrive; the system can simply look for block-level changes between backup versions.
For example, say you have a PowerPoint presentation you’re working on, and you simply want to swap out the footer on each slide. With file-level deduplication, the change would be recognized and the entire file backed up to the cloud (as it is technically a new version of the file). With block-level deduplication, only the blocks affected by your footer edits are sent. This makes the backup process far more efficient and puts far fewer blocks on the wire.
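To make that concrete, here is a rough sketch of how block-level deduplication can work. It’s written in Python, and the block size, function names, use of SHA-256 hashes, and zlib compression are all illustrative assumptions rather than a description of how any particular product is built:

```python
import hashlib
import zlib

# Illustrative block size; real products tune this (and often use
# variable-size blocks) for their workloads.
BLOCK_SIZE = 4 * 1024 * 1024


def read_blocks(path):
    """Yield (index, bytes) for each fixed-size block in a file."""
    with open(path, "rb") as f:
        index = 0
        while True:
            block = f.read(BLOCK_SIZE)
            if not block:
                break
            yield index, block
            index += 1


def block_hashes(path):
    """Return one SHA-256 digest per block, i.e., the backup's manifest."""
    return [hashlib.sha256(block).hexdigest() for _, block in read_blocks(path)]


def changed_blocks(path, previous_hashes):
    """Compare the file's current blocks against the previous backup's
    manifest and return only the new or modified blocks, compressed and
    ready to send over the WAN."""
    to_send = []
    for index, block in read_blocks(path):
        digest = hashlib.sha256(block).hexdigest()
        if index >= len(previous_hashes) or previous_hashes[index] != digest:
            to_send.append((index, digest, zlib.compress(block)))
    return to_send

# On the first backup every block counts as changed, so the whole file goes
# up. On later backups only the edited blocks (say, the slides whose footer
# you swapped) are compressed and uploaded.
```

The manifest of block hashes from the previous backup is all the client needs in order to decide what to send, which is why only your footer edits travel over the wire.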
The final piece of the puzzle is change tracking. Because the software works in the background to automatically track and note small block-level changes as they happen, a compressed, deduplicated copy of only the changed blocks is ready to go at backup (or recovery) time, speeding up the whole process.
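A very simple way to approximate change tracking is to keep a small manifest between backup runs and use cheap file metadata to decide which files deserve a closer, block-level look. The file name and format below are assumptions for illustration; production backup agents typically hook into the operating system (for example, a file system change journal) rather than rescanning everything:

```python
import json
import os

MANIFEST_PATH = "backup_manifest.json"  # hypothetical location for the manifest


def load_manifest():
    """Load each file's recorded size and modification time from the
    previous backup run (empty on the very first run)."""
    try:
        with open(MANIFEST_PATH) as f:
            return json.load(f)
    except FileNotFoundError:
        return {}


def files_needing_backup(root, manifest):
    """Use cheap metadata (size and mtime) to skip files that clearly
    haven't changed; only the rest need block-level comparison."""
    changed = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            entry = manifest.get(path)
            if entry is None or entry["size"] != stat.st_size or entry["mtime"] != stat.st_mtime:
                changed.append(path)
                manifest[path] = {"size": stat.st_size, "mtime": stat.st_mtime}
    return changed


def save_manifest(manifest):
    """Persist the updated manifest so the next run starts from it."""
    with open(MANIFEST_PATH, "w") as f:
        json.dump(manifest, f)
```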
Some backup products also include WAN optimization techniques. Once the amount of data to transfer has been reduced, WAN optimization helps ensure that data travels over the most efficient network paths available. This further speeds up the backup (and recovery) process.
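One common WAN optimization technique is keeping several transfers in flight at once, so a high-latency link isn’t left idle between round trips. The sketch below is hypothetical; the upload function is a placeholder, not a real API:

```python
from concurrent.futures import ThreadPoolExecutor


def upload_block(block_id, payload):
    """Placeholder for the real transfer; in practice this would be an
    HTTPS PUT of a compressed, deduplicated block to the backup service."""
    ...


def upload_in_parallel(blocks, max_workers=8):
    """Keep several transfers in flight at once so a high-latency WAN link
    stays busy instead of waiting on one round trip at a time."""
    with ThreadPoolExecutor(max_workers=max_workers) as pool:
        futures = [pool.submit(upload_block, block_id, payload)
                   for block_id, payload in blocks]
        for future in futures:
            future.result()  # surface any transfer errors
```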
Cloud-first backup doesn’t mean cloud-only
So far, we’ve covered both the limitations of proprietary backup appliances and some of the techniques modern backup solutions use to make those appliances unnecessary. That doesn’t mean you’ll never want an on-premises copy of your data. Some data is mission-critical, and you may want a local copy for the fastest possible restore, or in case your internet access is disrupted.
In this case, many solutions, like SolarWinds® Backup, are designed to enable users to keep a local copy of their data using any piece of hardware they have lying around. This could be an external hard drive, an old server you purchased years ago, or even something as simple as a USB drive (depending on how much data you have). This offers the value of a local backup copy without having to invest in an expensive proprietary backup appliance, keeping overhead low. This method also eliminates the single point of failure, as the backup process is not dependent on the local data storage device.
Ready to ditch the backup box?
For most businesses, it may be time to break free of backup appliances.
If you’re ready to try a “no-appliance required” backup solution, check out SolarWinds Backup. You can get started in minutes since you don’t have to purchase an appliance, and the management console is a hosted SaaS application. SolarWinds Backup uses the techniques mentioned in this blog post to help make backup and recovery fast. Once you get past the initial full backup, the system only tracks block-level changes rather than full files. Try it free today: https://www.solarwindsmsp.com/products/backup/trial
Carrie Reber is senior product marketing manager for SolarWinds MSP.
To find out how SolarWinds® Backup can help you improve backups for you and your customers, click here.
The SolarWinds and SolarWinds MSP trademarks, service marks, and logos are the exclusive property of SolarWinds MSP UK Ltd. or its affiliates. All other trademarks are the property of their respective owners.