
The evolution of “trust” to a cloud environment

07 Dec 17

Today, consumers expect access to their data and applications 24/7, but don't want to compromise on security. That expectation has not changed since the early days of IT. What is changing at full speed is the trust model between infrastructure service providers and consumers.

This article will examine the evolution of trust that has had to happen to enable us to access infrastructure from any device, anytime, anywhere.

First evolution of trust

When companies grew in size and started expanding into new geographical locations, their on-premises server rooms had to be moved into separate physical facilities, provided by what we now call "data centres". This was the beginning of "trust" in a service provider context for everyone in IT. Questions like "Can we trust someone with our servers? What about the sensitive data we process on these servers? What if they decide to access these sensitive servers?" were regular conversations in organisations experiencing growth.

Organisations were being asked to trust a service provider with their most prized possession, their servers, while receiving in return only pre-scheduled physical access to the facility or remote connections to the servers.

To build a higher level of trust with security-conscious organisations, global and national standards were established by reputable bodies for data centres to operate against. Certification against an approved standard meant that an organisation which trusted the standard could trust the facility, and be more confident in using its services.

This was a good option for the organisation from both a monetary and a logistics point of view, as it meant it could reduce its server capacity and real-estate costs while still being able to reach the servers remotely over a secure connection within a few minutes.

Once big organisations started trusting these service providers, the SMBs soon followed. From this point on, a trust model was formed with data centres, laying the foundation for the first version of the 'shared responsibility model', which stipulates who in this trust relationship is responsible for what.

Second evolution of trust / the current trust model

As organisations grew larger and started opening offices in different countries, it became harder to manage these global data centres at a reasonable cost.

Then came the ‘outsourcers’, the service providers who offered to take care of your data centre servers on behalf of your organisation at a minimal cost, while maintaining 24/7/365 service, with a real person at the end of the support number.

There was a shift of trust at this point, where we were again asking ourselves the same questions we had asked of data centres. This time, in return, we would be given no direct access to the servers; instead, we would have to raise a support ticket, and pay, each time we wanted to change anything on a server.

This time, we decided trust could be achieved by onboarding these external service providers as employees of the organisation. This gave the reassurance that, in the case of an incident, the organisation could disable all the external users who had access to its servers. It also reduced the in-house sysadmin role to mostly managing and following up server requests, with limited direct access to the servers themselves, plus actioning any urgent request that required someone to be physically present.

Future/next trust model

Over time, as outsourcing companies, data centres and the organisations themselves all grew, the side effects of this trust model started to appear. Outsourcing companies took on more clients as a way of growing, while cost optimisation meant that even these companies kept only the minimum required employees in their workforce.

This meant that every request to these "outsourced", aka "vendor managed", support and service desks took a lot longer to fulfil. And when the staff at the other end of the phone or service desk are handling similar requests from multiple clients, the delay is understandable.

Then came the cloud, with its promise of low cost and access to resources within minutes. What cloud providers offer is an extension of data centre services. Like data centres, cloud providers must have their physical premises certified against similar, sometimes even more stringent, global standards in order to house servers for both the public and private sectors.

There is not a huge difference between what was asked of technology leaders under the first trust model and what is asked under the new trust model now marketed and championed by engineers, managers, start-ups and technology companies alike.

I believe that security and technology leaders in any organisation, especially one going through this change of trust, can establish a good trust model with the right security controls.
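To make the idea of "the right security controls" concrete, here is a minimal sketch in Python of the deny-by-default, least-privilege principle that underpins modern cloud trust models. The policy structure, identity names and helper function are all hypothetical illustrations, loosely modelled on cloud IAM policies rather than any specific provider's API:

```python
# Deny by default: an identity is trusted only for the specific
# (action, resource) pairs it has been explicitly granted.
POLICY = {
    "ops-team": [
        ("restart", "server:web-01"),
        ("read-logs", "server:web-01"),
    ],
    "vendor-support": [
        ("read-logs", "server:web-01"),  # support staff can view logs only
    ],
}

def is_allowed(identity: str, action: str, resource: str) -> bool:
    """Return True only for explicitly granted action/resource pairs."""
    return (action, resource) in POLICY.get(identity, [])
```

Under this model, `is_allowed("vendor-support", "restart", "server:web-01")` is denied while the same request from `"ops-team"` is permitted, mirroring the shift from trusting a provider wholesale to trusting them only for narrowly scoped, auditable actions.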

Article by Ashish Rajan, security architect, Versent.
