The Security of Cloud Services and SaaS, Part 3

Daniel Ayala will be posting articles about information security, privacy, and compliance in our blog. Throughout his 25-year career, he has led security and privacy organisations in banking and financial services, pharmaceutical, information, higher education, research, and library organisations around the world, and he both writes and speaks regularly on the topics of security, privacy, data ethics, and compliance. He also happens to be LabArchives' Chief Information Security Officer!

Part 3: Financial Implications

Greetings LabArchives Reader,

In Part 3 of this series, let’s take a look at the financial implications of cloud computing and the security benefits of centralised SaaS computing.

Feel free to peruse Part 1: The Value of Scale and Scope and Part 2: Technology Secured at Scale, and then come back for this installment.

Meet the Demands

If you have ever gone through a capital planning exercise, you know that one part of the equation asks, “What is the useful life of this asset?” For capital expenditures – the money spent to buy, run, and maintain large fixed-asset investments such as buildings and computers – the longer the life of the asset, the more benefit the organisation derives from it. In technology, however, the useful life of hardware goes down as processing demands go up: a server bought five years ago will be significantly less productive today. As such, purchasing computing hardware for a data centre may not be the best investment for organisations. Cloud services allow the service provider (in LabArchives’ case, Amazon Web Services) to invest in purpose-built hardware, at massive scale, and to provide and update the systems that businesses and institutions need more cost-effectively.
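To make the useful-life point concrete, here is a minimal straight-line depreciation sketch. The server price, salvage value, and lifetimes below are hypothetical numbers chosen purely for illustration, not real hardware costs or an accounting recommendation.

```python
def annual_depreciation(purchase_cost: float, salvage_value: float,
                        useful_life_years: int) -> float:
    """Straight-line depreciation: spread the net cost evenly over the asset's life."""
    return (purchase_cost - salvage_value) / useful_life_years

# A hypothetical $30,000 server with no salvage value: the shorter its
# useful life, the more of its cost each year of service must carry.
print(annual_depreciation(30_000, 0, 5))  # 6000.0 per year over 5 years
print(annual_depreciation(30_000, 0, 3))  # 10000.0 per year over 3 years
```

The same purchase becomes two-thirds more expensive per year of service when rising processing demands cut its useful life from five years to three.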

Couple this with the ability to dynamically scale the amount of computing an organisation needs, and another huge benefit of cloud computing becomes clear: buy the processing you need only when you need it, and remove instances when you do not. To accomplish this on-premises, an organisation would have to buy enough hardware to cover peak demand – which at a university typically occurs just twice a year, for one to two weeks each in November and May. The rest of the time the hardware would sit mostly unused, yet still cost the institution to depreciate and run.
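A back-of-envelope sketch shows why this matters. All of the numbers below – server counts, peak weeks, and the per-server-week unit cost – are assumed for illustration only; the point is the shape of the comparison, not the figures.

```python
# Assumed, illustrative capacity profile for a university workload:
BASELINE_SERVERS = 4        # capacity needed most of the year
PEAK_SERVERS = 20           # capacity needed during the two peak periods
PEAK_WEEKS = 4              # roughly 2 weeks each in November and May
COST_PER_SERVER_WEEK = 50   # hypothetical unit cost, same for both models

# On-premises: provision for the peak, then run (and pay for) it all 52 weeks.
on_prem = PEAK_SERVERS * 52 * COST_PER_SERVER_WEEK

# Cloud: run baseline capacity all year; add the extra instances only
# during the peak weeks, and remove them afterwards.
cloud = (BASELINE_SERVERS * 52
         + (PEAK_SERVERS - BASELINE_SERVERS) * PEAK_WEEKS) * COST_PER_SERVER_WEEK

print(f"on-premises, peak-provisioned: {on_prem}")  # 52000
print(f"cloud, elastically scaled:     {cloud}")    # 13600
```

Under these assumed numbers, peak-provisioned hardware costs nearly four times as much as elastic capacity for the same delivered service – and that is before counting the security cost of the idle machines discussed next.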

This article is about security, so here is your reminder that security professionals recommend turning off or decommissioning systems that are not needed or used. Out-of-use systems left running increase the organisation's attack surface and create unnecessary risk. Yet powering these systems down until the next peak takes time and effort, and while they are off they may not receive security patches – leaving them vulnerable until they are updated upon being turned back on at the next peak usage period.

As discussed in our previous installment, organisations benefit greatly from the dynamic scaling that can be part of cloud applications. Organisations often treat security spending as a cost rather than an investment; the major cloud service providers, by contrast, can invest large sums in securing cloud data because that is their core business. With this model, innovations in security can be applied quickly across every device using the application, delivering continuous improvement.

Pick ’Em

I had a colleague who would loudly decry: “It’s the app, stupid!” And do you know what? He was right (and still is). As I mentioned at the beginning of this series, neither on-premises nor cloud-hosted environments are worth anything if the application is not securely developed, designed with scalability and redundancy in mind, and appropriately operationalised by the support team. But as a CISO with experience in security and privacy in both corporate and academic settings, my preference for the underlying infrastructure environment is increasingly swinging toward cloud-hosted.

With centrally managed SaaS applications built on top of cloud services, features and security updates are installed centrally and deployed worldwide as soon as they are released, reducing the window of exposure and ensuring fixes roll out consistently. Rather than waiting for internal technology or security staff to patch or update the application, SaaS users are always up to date. A recent security issue with locally installed Microsoft Exchange servers was exacerbated by the fact that many organisations did not apply the patch in a timely manner and were exploited; eventually, law enforcement intervened to clean up a large number of compromised mail servers and stop the attacks. SaaS, as operated by LabArchives, is FORM: Fixed Once, Reaped by Many.

Thanks to the significant research and development that goes into securing cloud technology at Amazon Web Services, Google Cloud, and Microsoft Azure, along with the architectural assurance of availability and scale, cloud services are more security-minded than ever before. Information security is based on three tenets: confidentiality, integrity, and availability. Cloud services and SaaS applications exist to support all three of these in robust ways that can benefit your organization.

Coming Up

Security, privacy, and research data integrity are core to LabArchives. I will continue to share insights that help researchers understand and integrate these topics, and that help you build stronger relationships with the technology and security teams within your organisations and institutions. In the meantime, please reach out with questions, thoughts, or ideas for future posts. Stay tuned!

Daniel Ayala

Chief Information Security Officer

LabArchives
