Safety in the clouds
Businesses are adopting cloud services at a rapid pace as they realize the benefits of scalability, flexibility, agility and better workload distribution – all at lower cost. Conventional wisdom, however, insists that the public cloud should be trusted only with less critical data and applications, while more sensitive or private data should be processed only in the enterprise’s private cloud.
Is this really the best choice? It is an understandable reaction but not a logical one. I predict the time will come when, instead of “risking operations to the cloud”, we will actually send them to the cloud for their greater security – but first we need to take a closer look at human perceptions of risk.
Safety above the clouds
How many times must we be told that flying is safer than driving before we can all relax and enjoy the flight? There is no simple answer, because the fear of flying has a strong emotional and instinctive basis, and such gut feelings take a long time to respond to logical arguments.
Our guts tell us that travelling at 800kph above the clouds is a lot more dangerous than walking at 6kph on the ground, and our guts are right. But that is exactly why global businesses and government regulators have invested billions over the past century to overcome that danger. In fact the biggest risk in travelling by air comes not in the plane but during the journeys to and from the airport.
That’s just how it is with cloud computing. Our guts tell us that it is safer to keep critical processes in our own data center, where we know where the data is and who can access it. But there is a multi-billion-dollar industry out there doing everything it can to make sure our guts are wrong, investing massively in systems to protect cloud data and operations because its reputation, and the future of cloud computing, depend upon it. Of course the cloud industry knows it is a target for cyber attacks – just as a plane is an attractive target for terrorists – but that is exactly why so much is spent to protect against those attacks: more than could ever be justified to protect a private cloud.
In fact the biggest risk is not in the cloud but in the journey to the cloud – just like the journey to the airport. Once that is understood, it makes sense to avoid trusting critical data to the public Internet and to use an Ethernet private line to connect to the cloud instead.
It’s an old business cliché that buying decisions rely heavily on emotions, and it remains partly true even for well-considered IT choices. What is needed is a clearer understanding of the actual, rather than gut-felt, risks of cloud computing, their relative importance, and how best to minimize them. The Cloud Ethernet Forum (CEF) is addressing this need by analyzing the security challenges under four categories of use cases:
- Security within the cloud
- Cloud networking
- Privacy concerns
- Security from the cloud
Security within the cloud
You don’t send data to the cloud to have it treated badly or lost. Cloud services must be implemented on a high-quality, well-maintained infrastructure. In fact this is one of the public cloud’s key attractions: instead of the ongoing burden of updating PC applications, operating systems and security, the cloud uses the latest technology to deliver state-of-the-art service on a “pay as you use” basis.
Data integrity and cloud service performance depend directly on the stability and normal operation of the cloud infrastructure, which must be protected from unexpected events such as human operator error, power outages and natural disasters.
A major concern for cloud customers is that their data must be accessible to legitimate users but not to unauthorized ones. Access management and user identity recognition are vitally important in the cloud. This is not just about protecting against data theft: cloud-assisted Machine-to-Machine (M2M) communication and the Internet of Things (IoT) require allocating access only to devices covered by existing contract agreements with the cloud information provider.
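In an M2M or IoT setting, that contract-based admission can be sketched as a simple registry lookup. Everything here (the registry contents, the `device_allowed` helper and the ISO-date expiry scheme) is a hypothetical illustration, not any provider's actual API:

```python
# Hypothetical registry: device ID -> contract expiry date (ISO format).
AUTHORIZED_DEVICES = {
    "sensor-0042": "2026-12-31",
    "gateway-07": "2025-06-30",
}

def device_allowed(device_id: str, today: str) -> bool:
    """Admit a device only if it appears in the contract registry and
    its agreement has not expired. ISO date strings compare correctly
    as plain strings, so no date parsing is needed."""
    expiry = AUTHORIZED_DEVICES.get(device_id)
    return expiry is not None and today <= expiry
```

An unknown device, or one whose contract has lapsed, is simply refused before any data exchange takes place.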
Then there is a particular example of “the fear of the unknown”: added vulnerability from the virtualization technology underlying much of today's cloud services. Distributed Denial-of-Service (DDoS) attacks that target VMs could exacerbate service degradation at the hypervisor, and traditional solutions for protecting non-virtualized systems may not be sufficient. Such attack traffic should be filtered before it reaches the target VM.
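One common way to filter such traffic before it reaches the target VM is per-source rate limiting at the hypervisor's virtual switch. A minimal sketch, assuming a fixed one-second window and an arbitrary packet budget (both values are illustrative, not any product's defaults):

```python
from collections import defaultdict

class PerSourceRateFilter:
    """Drops packets from any source that exceeds a per-second budget,
    so a flood is absorbed at the hypervisor rather than reaching the VM."""

    def __init__(self, max_packets_per_sec: int):
        self.budget = max_packets_per_sec
        self.window_start = defaultdict(float)  # src IP -> window start time
        self.counts = defaultdict(int)          # src IP -> packets this window

    def allow(self, src_ip: str, now: float) -> bool:
        # Start a fresh one-second window when the old one has elapsed.
        if now - self.window_start[src_ip] >= 1.0:
            self.window_start[src_ip] = now
            self.counts[src_ip] = 0
        self.counts[src_ip] += 1
        return self.counts[src_ip] <= self.budget
```

Real deployments layer this with token buckets, SYN cookies and dedicated scrubbing infrastructure; the sketch shows only the basic idea of dropping flood traffic upstream of the VM.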
Cloud networking security
It was suggested above that the journey to the airport can be riskier than the flight itself, and the same applies to the cloud. However safe and efficient the operation inside the cloud, it comes to nothing if the data transfer between clouds, or between the user and the cloud, becomes as uncertain as taking a car to the airport during rush hour. This can be quite a challenge when you consider the size of many data transfers and the low-latency demands of many users. Add the risk of cyber attacks on the network – the equivalent of being hijacked on the way to the airport – and a balance needs to be struck between how far you secure the network and still allowing fast, low-latency connections.
Encrypting every bit of data between clouds and customers adds a heavy computational burden, and not every bit is equally sensitive, so customer and provider need to agree on the security level of each type of data. The Internet service provider should guarantee the safety of the data transfer at the physical layer, and a Virtual Private Network (VPN) between cloud and customer is a more reassuring means of providing security and privacy in the data transfer process. Add to that the usual intrusion detection solutions: firewall, DPI, threat management, log management, anti-virus services and so on.
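Such a per-type agreement might look like the sketch below: only record types classified as sensitive are encrypted before transfer, sparing the bulk data the computational cost. The classification table and the `encrypt` callback are placeholders for whatever the customer-provider contract actually specifies:

```python
# Hypothetical sensitivity classification agreed between customer and provider.
SENSITIVITY = {
    "payment_card": "high",
    "customer_pii": "high",
    "telemetry": "low",
    "app_logs": "low",
}

def prepare_for_transfer(records, encrypt):
    """Encrypt only the record types agreed to be sensitive; everything
    else is sent as-is to avoid needless computation. Unknown types
    default to 'high' to err on the safe side."""
    prepared = []
    for record_type, payload in records:
        if SENSITIVITY.get(record_type, "high") == "high":
            prepared.append((record_type, encrypt(payload)))
        else:
            prepared.append((record_type, payload))
    return prepared
```

Defaulting unknown record types to the highest sensitivity level means a classification gap costs some performance rather than a data leak.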
Just as server virtualization adds an element of uncertainty within the cloud, network virtualization adds both positive and negative potential. SDN makes the network structure more flexible by separating the control and data planes, which restricts malicious access to the data plane unless attackers gain access to the control plane. That makes it all the more vital to defend the control plane against malicious access and attacks. Like the cloud hypervisor, the controller is a target for DoS and information disclosure attacks. Techniques such as rate limiting, event filtering, packet dropping and timeout adjustment should be deployed in the control plane to help defend against them.
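Of those techniques, event filtering combined with a timeout can be sketched as follows: once the controller has handled a packet-in event for a given flow key, identical events are dropped until the timeout elapses, so a flood of unknown-flow packets cannot saturate the control plane. The class and its parameters are illustrative, not taken from any particular SDN controller:

```python
class PacketInGuard:
    """Event filter for an SDN controller: after a packet-in for a flow
    key has been seen, further packet-ins for the same key are dropped
    until `timeout` seconds elapse, blunting control-plane floods."""

    def __init__(self, timeout: float):
        self.timeout = timeout
        self.last_seen = {}  # flow key -> time of last processed event

    def should_process(self, flow_key, now: float) -> bool:
        last = self.last_seen.get(flow_key)
        if last is not None and now - last < self.timeout:
            return False          # duplicate within window: filter it out
        self.last_seen[flow_key] = now
        return True               # first sighting: hand to the controller
```

The timeout value trades responsiveness for protection: a longer window drops more flood traffic but delays legitimate re-requests for the same flow.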
Privacy concerns
We have already addressed the obvious concern that no one wants their data leaked or tampered with in the cloud. Privacy takes this a stage further: can the user rely on the cloud to “solve a problem without knowing what the problem is”?
This applies especially to massive big-data or number-crunching operations. A bank, for example, might not have the internal number-crunching capability to solve a large-scale optimization problem in its portfolio management, and so might want to use cloud computing resources. But the actual data sets include sensitive information that must not be leaked to a third party or the public. Techniques such as homomorphic encryption should make it possible to process the data without revealing its actual content but, to justify this solution, the computational overhead of (a) encrypting the input, (b) decrypting the solution, and (c) verifying the correctness of the solution has to be much lower than the complexity of the computation task itself.
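As a toy illustration of the principle, textbook RSA (no padding) happens to be multiplicatively homomorphic: the product of two ciphertexts decrypts to the product of the plaintexts. The tiny primes below make this insecure and purely illustrative; practical systems would use schemes such as Paillier or lattice-based fully homomorphic encryption:

```python
# Textbook RSA with tiny primes: insecure, for illustration only.
def rsa_keypair():
    p, q = 61, 53
    n = p * q
    e = 17
    d = pow(e, -1, (p - 1) * (q - 1))  # modular inverse (Python 3.8+)
    return (e, n), (d, n)

def encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)

def decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)

pub, priv = rsa_keypair()
c1, c2 = encrypt(6, pub), encrypt(7, pub)

# The "cloud" multiplies the ciphertexts without ever seeing 6 or 7;
# the customer decrypts the result and recovers 6 * 7 = 42.
product_cipher = (c1 * c2) % pub[1]
assert decrypt(product_cipher, priv) == 42
```

Even this toy shows where the overhead comes from: every input must be encrypted, every output decrypted and checked, which is exactly the cost that has to stay small relative to the computation being outsourced.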
An added factor is the increasing concern from governments about data routing. Cloud data auditing needs to be efficient as well as easy to implement, and it must at the same time retain the privacy of the data being audited.
Cloud Managed Security (CMS)
This is a particularly promising development, best explained by an analogy with the water supply. A service provider that allows malware and junk mail to reach the customer premises, expecting each customer to buy and rely on their own defensive measures (firewall, anti-virus software, mail filtering and so on), is comparable to a water company delivering polluted water to the home. We should expect data and services to arrive clean, as safe to consume as good tap water.
Cloud Managed Security replaces traditional, complex and often inefficient distributed security methods with centralized, uniform and flexible cloud-enabled security. This “Security as a Service” could be built on the cloud’s seamless detection, management and monitoring solutions, and would need to meet several demands. It must be:
- robust enough to guarantee the connection between cloud and customer
- comprehensive enough to replace all local security facilities, such as antivirus software, filters and firewalls
- consistent across customers with different security needs and operations
- able to deliver high Quality-of-Service in terms of accurate, real-time, low-latency malware detection and disposal.
Transferring local security arrangements to the cloud in this way would cut overall costs and dramatically improve user experience and convenience. As a managed service from the cloud provider, it opens a whole new business opportunity in which the cloud, instead of being seen by the customer as a source of risk, becomes the means of avoiding risk. The service could take a number of forms.
With the perimeter-based technique, any traffic arriving at the customer’s perimeter is redirected to the cloud for security inspection, processed and returned to the customer. Appliance-based security solutions, such as proxy devices and firewalls in the customer gateway, could redirect traffic to the cloud using encryption and elastic methods such as IPsec VPN. For higher security demands, SSL can protect the communication regardless of the underlying connection type, and can also support portable device users.
A more straightforward solution transfers data to the cloud directly for security inspection and enforcement, by properly configuring the DNS server on the Internet and adopting a proxy mechanism. The cloud then acts as an inline checkpoint between the customer and the public Internet, examining the data bit by bit with a cluster of servers. SaaS services, GRE tunneling, proxy chaining, port forwarding and web browser proxy configuration are some of the methods that might be used to meet different demands and achieve sufficient efficiency.
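In its simplest form, such a checkpoint is a signature scan applied inline: traffic matching a known malware signature is quarantined, and the rest is forwarded untouched. The signature list and callback names below are invented for illustration; production services distribute this kind of scan across server clusters and combine it with heuristic analysis:

```python
# Hypothetical signature list; real services use large, frequently
# updated databases rather than a handful of hard-coded patterns.
MALWARE_SIGNATURES = [
    b"EICAR-STANDARD-ANTIVIRUS-TEST-FILE",
    b"\x4d\x5a\x90\x00",  # example binary pattern
]

def checkpoint(payload: bytes, forward, quarantine):
    """Inline checkpoint between customer and public Internet:
    quarantine traffic matching a known signature, forward the rest."""
    for signature in MALWARE_SIGNATURES:
        if signature in payload:
            return quarantine(payload, signature)
    return forward(payload)
```

Because the scan sits in the path rather than at the customer premises, the polluted traffic never reaches the home, in keeping with the clean-water analogy above.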
Software-defined network security (SDNS) is a third approach, and a fundamentally new paradigm that capitalizes on the latest network virtualization and SDN technology. Data traffic arriving at the customer’s home gateway, edge router or vSwitch is virtualized into virtual overlay flows for transmission into the cloud, where it is operated on by virtualized network security functions built around data inspection and security intelligence modules for filtering and anti-spam/anti-virus. The cloud also hosts management and control modules such as billing, security service subscription and management, incident and event aggregation, service provisioning, network provisioning, and directory and report services. After security inspection and enforcement, the customer’s data traffic is transmitted from the security management cloud to its target network.
Despite all that is proven about the greater safety and security when you are in an airplane high above the clouds, I doubt that even Howard Hughes would choose to spend the rest of his life flying between airports. But some people are choosing to retire to homes aboard giant cruise liners, enjoying the security as well as all the services of these vast vessels.
Done on such a scale, so much can be invested by the liner’s operators not only to provide outstanding service but also to minimize the risk of attack or accident, that it really begins to look like a safer and better deal than a private house on land.
That is what could happen to our perception of the cloud. It will come to be recognized as simply the best and safest means to store and process data – however private and however critical.
How changing your company's software code can prevent bias
Two-thirds of tech professionals believe organizations aren’t doing enough to address racial inequality. After all, many companies will just hire a DEI consultant, hold a few training sessions and call it a day.
Wanting to take a unique yet impactful approach to DEI, Deltek, the leading global provider of software and solutions for project-based businesses, reviewed and removed all exclusive terminology in its software code. By removing terms such as ‘master’ and ‘blacklist’ from company code, Deltek is working to ensure that diversity and inclusion are woven into every aspect of its organization.
Business Chief North America talks to Lisa Roberts, Senior Director of HR and Leader of Diversity & Inclusion at Deltek, to find out more.
Why should businesses today care about removing company bias within their software code?
We know that words can have a profound impact on people and leave a lasting impression. Many of the words used in a technology environment were created many years ago, and today those words can be harmful to our customers and employees. Businesses should use words that leave a positive impact and help create a more inclusive culture in their organization.
What impact can exclusive terms have on employees?
Exclusive terms can have a significant impact on employees. It starts with the words we use in our job postings to describe the responsibilities of the position and of course, we also see this in our software code and other areas of the business. Exclusive terminology can be hurtful, and even make employees feel unwelcome. That can impact a person’s desire to join the team, stay at a company, or ultimately decide to leave. All of these critical actions impact the organization’s bottom line.
Please explain how Deltek has removed bias terminology from its software code
Deltek’s engineering team has removed biased terminology from our products, as well as from our documentation. The terms we focused on first that were easy to identify include blacklist, whitelist, and master/slave relationships in data architecture. We have also made some progress in removing gendered language, such as changing he and she to they in some documentation, as well as heteronormative language. We see this most commonly in pick lists that ask to identify someone as your husband or wife. The work is not done, but we are proud of how far we’ve come with this exercise!
What steps is Deltek taking to ensure biased terminology doesn’t end up in its code in the future?
What we are doing at Deltek, and what other organizations can do, is to put accountability on employees to recognize when this is happening – if you see something, say something! We also listen to feedback our customers give us and have heard their feedback on this topic. Those are both very reactive things of course, but we are also proactive. We have created guidance that identifies words that are more inclusive and also just good practice for communicating in a way that includes and respects others.
What advice would you give to other HR leaders who are looking to enhance DEI efforts within company technology?
My simple advice is to start with what makes sense to your organization and culture. Doing nothing is worse than doing something. And one of the best places to start is by acknowledging this is not just an HR initiative. Every employee owns the success of D&I efforts, and employees want to help the organization be better. For example, removing bias terminology was an action initiated by our Engineering and Product Strategy teams at Deltek, not HR. You can solicit the voices of employees by asking for feedback in engagement surveys, focus groups, and town halls. We hear great recommendations from employees and take those opportunities to improve.