
Diversifying your backups is the most important thing to do this year

To hedge against the risk of unexpected accidents, tragedies or disasters befalling us, we’ve developed sophisticated mitigation instruments like insurance. Yet while many businesses insure against these eventualities through building insurance, public liability cover and employers’ liability insurance, a surprising number of organizations don’t hedge against one of the most pressing business risks facing them today: data loss.

About the author

David Friend is the co-founder and CEO of Wasabi Technologies.

Today’s business environment depends upon teams consistently being able to access and iterate upon their data. Whether it’s customer records, website assets, project files or customer correspondence, business continuity for firms in every sector depends on shared repositories of data being secure and constantly available.

Last year IBM estimated that downtime from data breaches costs the average company $1.52m in lost business. Downtime, however, pales in comparison to a total data loss, which can destroy years of business output, especially if you put all your eggs in one basket. A fire or flood can easily destroy an entire server bank or data center, causing immeasurable harm to businesses that don’t have a backup stored elsewhere.

Understanding durability

The language of probability is useful for capturing the risk of total data loss. The gold standard of safe storage is generally considered to be “eleven nines” of durability, which refers to a 99.999999999 percent probability that a customer will not lose a given file in a given year. In other words, someone storing their data can expect to lose, on average, only one file in every hundred billion per year (that’s 1 in 100,000,000,000).

Eleven nines of durability covers both safeguarding against disasters and protecting against more mundane issues, such as the gradual degradation of hard disks or servers. In practice, this level of durability is achieved by storing multiple copies of the same data.

Imagine there's a 1 in 100 risk of a file being lost in a given year. By keeping a second copy, you can geographically separate the two files to hedge against disasters, while also comparing them to check for any degradation. Now the only way to 'lose' the file is for both copies to be destroyed by disaster or corruption in the same year, and assuming those two failures are independent, that cuts your risk from 1 in 100 to 1 in 10,000.
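
To make that arithmetic concrete, here's a minimal Python sketch of the same calculation. The 1-in-100 figure and the independence assumption are illustrative, not a claim about any particular storage system:

```python
import math

# Illustrative only: assumes each copy of a file fails independently, which
# real correlated events (a regional disaster, a shared software bug) can break.

def annual_loss_probability(per_copy_loss: float, copies: int) -> float:
    """Probability that every copy of a file is lost in the same year."""
    return per_copy_loss ** copies

def nines_of_durability(loss_probability: float) -> float:
    """Express durability as a 'number of nines', e.g. 11 for 99.999999999%."""
    return -math.log10(loss_probability)

print(annual_loss_probability(0.01, 1))   # 0.01    -> 1 in 100
print(annual_loss_probability(0.01, 2))   # ~0.0001 -> 1 in 10,000
print(nines_of_durability(annual_loss_probability(0.01, 2)))  # ~4 "nines"
```

Real storage services typically combine replication with erasure coding rather than keeping simple duplicates, but the underlying principle of multiplying independent risks together is the same.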

Introducing 3-2-1

In the cloud storage industry, a good vendor will be able to guarantee eleven nines of durability to their customers against everyday causes of data loss, such as gradual degradation or human error from in-house teams. However, that doesn’t price in the harder-to-quantify risk of their data center being hit by a natural or man-made disaster of some sort.

To ensure that companies price in this risk, it’s useful to follow what’s known in the data storage industry as the “3-2-1” rule: keep three copies of your data, store them on at least two different media formats, and keep at least one copy on a different site. Each of these measures decreases the risk of data loss and interruption arising from technical, human and geographic factors.
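
As a rough illustration of how a team might sanity-check a backup plan against the rule, here's a hypothetical sketch; the site and media names are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    location: str   # e.g. "on-prem" or "cloud-vendor-a" (hypothetical names)
    medium: str     # e.g. "disk", "tape", "object-storage"
    offsite: bool

def satisfies_3_2_1(copies: list) -> bool:
    """3-2-1 rule: at least 3 copies, on 2 media types, with 1 copy off-site."""
    return (
        len(copies) >= 3
        and len({c.medium for c in copies}) >= 2
        and any(c.offsite for c in copies)
    )

plan = [
    BackupCopy("on-prem", "disk", offsite=False),                  # primary
    BackupCopy("on-prem", "tape", offsite=False),                  # second medium
    BackupCopy("cloud-vendor-a", "object-storage", offsite=True),  # off-site copy
]
print(satisfies_3_2_1(plan))  # True
```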

In particular, keeping a copy of your data off-site or, in the case of cloud solutions, with a different vendor, is vital. Risks such as natural disasters or internal sabotage will always loom over data stored with a single provider, and they cannot be offset by keeping more copies within that provider’s bubble. Keeping off-site backups instead lets you put what’s called an “air-gap” between your primary systems and your last line of defense.

Immutable storage

It would be comforting if most cases of data loss came about as a result of rare natural disasters or physical wear-and-tear. In reality, this sadly isn’t the case: two out of every three data loss incidents aren’t due to hardware failure or disasters, but rather bad actors or human error. Humans are usually the rogue element in data storage mishaps, with software misconfigurations, sabotage or malware being the prime causes of data loss in organizations.

That means that even if you follow the 3-2-1 rule perfectly, there’s still a risk that you or your team will expose the business to a catastrophic data loss. Diversifying your backups should therefore only serve as the first step on the path towards hedging against risk.

To offset human risk, organizations should consider making use of immutable storage. Immutability is a data storage option that prevents anyone, whether from your company or the data center, from modifying or erasing data that’s on file. Once you’ve written data into an immutable storage bucket, it can’t be changed until a pre-set retention period has run its course; any attempt to overwrite or delete that data will simply generate an error message. This almost entirely removes the risk of human error or sabotage destroying your backups.
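
As an illustration, S3-compatible object stores generally expose immutability through an object-lock setting applied when data is written. The sketch below is a minimal example using boto3, assuming a bucket that was created with object lock enabled; the endpoint, bucket and key names are placeholders:

```python
from datetime import datetime, timedelta, timezone

import boto3

# Placeholder endpoint and bucket; the bucket must have been created with
# object lock enabled for these parameters to take effect.
s3 = boto3.client("s3", endpoint_url="https://s3.example-object-store.com")

retain_until = datetime.now(timezone.utc) + timedelta(days=90)

# Write a backup object that cannot be overwritten or deleted until the
# retention date has passed; attempts to do so are rejected with an error.
with open("db-dump.tar.gz", "rb") as body:
    s3.put_object(
        Bucket="example-backups",
        Key="nightly/db-dump.tar.gz",
        Body=body,
        ObjectLockMode="COMPLIANCE",
        ObjectLockRetainUntilDate=retain_until,
    )
```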

In general, hitting the gold standard of eleven nines of durability for your data hinges on using a diversified set of backups, with the 3-2-1 rule providing a clear guide for how to structure and distribute them. By then cutting out the risk of human error through immutability, you can be confident that you’ve done all you can to offset the risk of a potentially catastrophic data loss.


David Friend, co-founder and CEO, Wasabi Technologies.