System misconfigurations have contributed to the vast majority of high-profile data breaches at major cloud providers, as each security gap adds another potential access point for hackers.
Accidental exposure continues to plague organisations as they introduce new cloud services for shared storage, containers, databases and serverless functions.
Microsoft alone has suffered several such incidents: in December 2019, a server misconfiguration exposed more than 250 million customer service and support records spanning 14 years; a massive Xbox data leak followed in May this year; and 500GB of data from Microsoft’s private GitHub repositories was also leaked this year.
Most of these breaches ultimately came down to human error: misconfigurations exposed data to a wider audience than intended, meaning anyone with access to the S3 server could view, access or download the content.
According to Sophos' State of Cloud Security 2020 study, misconfigurations were exploited in 66% of attacks, while 33% of attacks used stolen credentials to get into cloud provider accounts.
John Shier, Senior Security Advisor at Sophos, told TechRadar Pro Middle East that system misconfigurations contribute to the vast majority of data breaches.
He said that no cloud provider will take responsibility for securing a customer's assets because there is always the danger of misconfiguration.
“AWS has a ‘Shared Responsibility Model’ which states that the security of the cloud [network, hardware and pieces] is the responsibility of AWS, while the customer is responsible for the security of data [apps, services] in the cloud. AWS is responsible for the integrity of its overall infrastructure and the customer handles their necessary configurations.
“It is the customers’ responsibility to do the patching and upgrading, and to maintain visibility into their own environment. Platform providers give you the ability to configure all of this, but customers have to do the configuration,” he said.
Hackers keep a close watch
Even when the data is encrypted, leaving access points public, or API and encryption keys discoverable, gives attackers an opening: Shier said hackers steal the encrypted data and then hunt for the encryption keys needed to decrypt it.
Roy Illsley, Chief Analyst for Enterprise IT at UK-based research firm Omdia, formerly Ovum, said that system configuration is critically important, which is why all the major cloud providers have picked up on it.
“Misconfiguration, up to a certain extent, is a customer issue, and that is a big challenge that people don’t realise. The cloud is a shared responsibility, and while the cloud providers carry some of that responsibility, customers fail to recognise where the line stops.
“Some customers feel that because the data sits in the cloud, they don’t need to back it up. Data backup is the customers’ responsibility; there is no guarantee that cloud providers will back it up for you. Customers should know clearly what the shared responsibilities are,” he said.
Now, with core workloads moving into the cloud, he said, providers are making sure their platforms are much more secure and lower-risk, and are giving customers tools to overcome these challenges, which was not the case earlier.
At the same time, he said, there are managed service providers who deal with everything, taking on all of the customers’ responsibilities.
At one stage, he said, AWS’s S3 buckets had “public access” as a default setting.
“Now they have taken that out and made it a selectable option. But you can still misconfigure that very easily today. It is protected by default now, compared to the past,” he said.
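The kind of S3 misconfiguration described above can be spotted mechanically. As a minimal illustration (this is not AWS tooling, just a sketch of the underlying check), the function below scans a bucket policy document for statements that grant access to a wildcard principal, the classic “public bucket” mistake; the bucket name and statement Sid are hypothetical:

```python
import json

def find_public_statements(policy_json: str) -> list:
    """Return the Sids of policy statements that allow access to everyone.

    A simplified check: a statement is flagged when its Effect is "Allow"
    and its Principal is a wildcard ("*" or {"AWS": "*"}), meaning anyone
    on the internet matches it.
    """
    policy = json.loads(policy_json)
    flagged = []
    for stmt in policy.get("Statement", []):
        principal = stmt.get("Principal")
        is_wildcard = principal == "*" or (
            isinstance(principal, dict) and principal.get("AWS") == "*"
        )
        if stmt.get("Effect") == "Allow" and is_wildcard:
            flagged.append(stmt.get("Sid", "<no Sid>"))
    return flagged

# Example: a (hypothetical) policy granting read access to anyone.
public_policy = json.dumps({
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "PublicRead",
        "Effect": "Allow",
        "Principal": "*",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::example-bucket/*",
    }],
})

print(find_public_statements(public_policy))  # flags "PublicRead"
```

Real scanners cover many more cases (ACL grants, Block Public Access settings, cross-account trust), but the principle is the same: an automated rule can catch the mistake before an attacker does.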
“AWS, being the biggest cloud provider, is the biggest target for hackers. Oracle has learnt from this and developed its Gen 2 cloud with security as a core design principle,” Illsley said.
The key difference between Oracle’s Generation 1 and Generation 2 clouds is where user code and control code live. Generation 1 places user code and data on the same computer as the cloud control code, with shared CPU, memory and storage, so the cloud provider can see the user’s data. Generation 2 puts customer code, data and resources on a bare-metal computer, while the cloud control code lives on a separate computer with a different architecture.
For enterprises looking for the best cloud security, Illsley said, Oracle is one of the leaders and is as well respected as Google.
“Google has got very good credentials. Microsoft is widely used for a broader set of workloads, but they have a weaker security response because they have such a wide range of customers, and that is why some of their data is hackable,” he said.
It is difficult to pinpoint who has the best, he said, but judging by the workloads being run and the attacks successfully executed against them, you generally hear more about AWS and Microsoft than about Google and Oracle.
Amazon and Microsoft did not respond to requests for comments on this story.
Getting configurations right and then monitoring them is critical, Illsley said, whether the data is stored in the cloud or on-premises.
“Oracle is increasingly becoming an influential enterprise-class cloud provider due to its reliability, high performance and security,” he said.
Lydia Leong, Distinguished VP Analyst at Gartner, predicts that Oracle’s dedicated cloud at customer, which delivers all of Oracle’s public cloud services, with the same SLAs as the public cloud, inside a company’s own data centre, will raise Oracle’s profile as an alternative to the big hyperscalers among enterprise customers and even among digital-native customers.
Manish Ranjan, Program Manager for Software and Cloud at IDC Middle East, Turkey and Africa, said that Oracle Cloud at Customer brings all of Oracle’s public cloud service offerings (SaaS, PaaS and IaaS) into a customer’s data centre to meet data residency and data sovereignty requirements.
Since its first announcement in 2017, he said, Oracle has made several advancements to Cloud at Customer.
“With the recent announcement of Dedicated Region Cloud at Customer, highly regulated organisations, such as government, banking and insurance, can enjoy exactly the same services that are available in the public regions of Oracle Cloud Infrastructure (OCI). This can be a game-changer for Oracle, especially within the Gulf region, where it does not have in-country data centres in countries such as Qatar, Oman, Kuwait and Bahrain,” he said.
Proactive monitoring is needed
“There are quality tools that enable you to identify various configurations and apply automation to bring them into compliance. All of the different settings, such as security patch levels and firmware levels, need to be up to date, and you can monitor that,” Illsley said.
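The compliance tools Illsley describes boil down to comparing each system’s actual settings against a desired baseline and reporting any drift. A minimal sketch of that idea, using entirely hypothetical setting names and values:

```python
# Minimal configuration-drift check: compare a host's actual settings
# against a compliance baseline and report every mismatch.
# All setting names and values below are hypothetical examples.

baseline = {
    "security_patch_level": "2020-09",
    "firmware_version": "3.2.1",
    "public_access": False,
}

def drift_report(actual: dict) -> dict:
    """Return settings deviating from the baseline as
    {setting: (expected, actual)} pairs."""
    return {
        key: (expected, actual.get(key))
        for key, expected in baseline.items()
        if actual.get(key) != expected
    }

host = {
    "security_patch_level": "2020-03",  # six months behind
    "firmware_version": "3.2.1",
    "public_access": True,              # the classic misconfiguration
}

print(drift_report(host))
```

An automated pipeline would run a check like this on a schedule and either alert on the report or feed it into remediation tooling, which is the "apply automation to bring them into compliance" step.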
The gap of about 150 days that it typically takes to patch a server, he said, gives hackers a window of opportunity.
“Having an autonomous way to automate helps, but you need to trust the software and the patching. If I do it, does it impact my applications? That is the balance enterprises have to make,” he said.
Going forward, he said that an autonomous approach is going to be needed to do the patching and the server updates.
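That 150-day window is exactly the kind of exposure automated monitoring can surface. A small sketch, with hypothetical server names and patch dates, that flags machines whose last patch falls outside the window:

```python
from datetime import date

PATCH_WINDOW_DAYS = 150  # the exposure gap cited above

def overdue_servers(last_patched: dict, today: date) -> list:
    """Return the names of servers last patched more than
    PATCH_WINDOW_DAYS ago, sorted alphabetically."""
    return sorted(
        name for name, patched_on in last_patched.items()
        if (today - patched_on).days > PATCH_WINDOW_DAYS
    )

# Hypothetical fleet inventory: server name -> date of last patch.
fleet = {
    "web-01": date(2020, 1, 10),
    "db-01": date(2020, 8, 1),
    "cache-01": date(2019, 11, 2),
}

print(overdue_servers(fleet, today=date(2020, 9, 1)))
```

An autonomous patching system would go one step further and schedule the updates itself rather than just reporting the stragglers.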
“When we move to the Kubernetes world, using containers and all the latest technologies, managing all of them manually is a difficult task; it has to be an autonomous solution. The strength of Oracle is its autonomous database technology,” he said.
In terms of autonomous database technology, Illsley said that Oracle is more mature than others and is the “gold standard” in the industry.
Nothing is 100% secure, Shier said.
However, he said that proactive monitoring of configurations by a security team can significantly reduce the likelihood of breaches.
“Cybercriminals use a wide range of techniques to get around defences. When one is blocked, they move on to the next until they find a weakness that can be exploited. Make sure to defend against all possible vectors of attack,” he said.