Serverless Computing Briefing Paper


Business Brief

The term Serverless Computing refers to a form of utility computing. It is a cloud computing execution model in which the cloud provider charges the client based on the resources actually used rather than on a specified amount of capacity. This can greatly reduce cost from both standpoints. The service provider saves the resources that would otherwise sit idle on an underused server, and the client can tailor capacity to research, development, and testing needs, using more capacity when required without being forced to pay for resources that are not being used. Much like Infrastructure as Code, serverless computing saves developers a great deal of time once auto-scaling policies and systems are set up. The cloud provider is responsible for assessing a client's demand and scaling capacity accordingly. Serverless computing is also referred to as Function as a Service (FaaS), meaning the cloud provider runs individual functions on the client's behalf and handles the cloud-management tasks that surround them. Only simple functions are exposed to the outside world, so the client does not have to worry about concerns such as multithreading or handling HTTP requests. Serverless computing also saves the cloud provider the idle time when servers or virtual machines are not being used. Application containers became popular because of the microservices they support; FaaS is a similar concept, except that microservices are designed around processes while FaaS is tailored toward the implementation of individual functions. These functions are stateless and asynchronous, meaning that when additional functionality is required, a new function can be added without disruption.
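To make the FaaS model concrete, the sketch below shows what such a stateless function might look like in Python. It assumes the AWS Lambda handler convention (a handler that receives an event and a context); the greeting logic and field names are purely illustrative and not taken from any particular deployment.

 import json
 # Minimal sketch of a stateless FaaS function, assuming the AWS Lambda
 # Python handler convention. The provider invokes the handler on demand;
 # the developer does not manage servers, threads, or HTTP plumbing.
 def handler(event, context):
     # 'event' carries the request payload; 'context' carries runtime metadata.
     name = event.get("name", "world")
     return {
         "statusCode": 200,
         "body": json.dumps({"message": "Hello, " + name})
     }

The provider is free to run many copies of this handler in parallel and to discard them when idle; because the function keeps no state between invocations, new functions can be added alongside it without disruption.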

Technical Brief

Current server-based applications are typically designed with a three-tier architecture. The three levels are the presentation layer, the application layer, and the database layer. The presentation layer is the user interface of the system, where the user interacts with the application. The application layer is where all the application logic runs. The database layer is where the database is located. In the example of a web application, the user interacts with the application through a web browser or a mobile smartphone. Before the application code is written, the developer must perform several tasks to construct the architecture needed. These tasks include creating and managing servers, installing the proper operating systems, and ensuring the application is highly available and fault tolerant. There may also be a need for the application to be highly scalable depending on the traffic expected, which could involve load balancers and other costly resources. Serverless computing eliminates the need to perform all of these tasks except writing the application code. There are still servers running to process and fetch the required data; however, the configuration and management of those servers, along with operating system installation and software updates, are handled by the cloud provider. Developers can focus on the correct execution of their application logic without being as concerned with the infrastructure supporting it. This also makes the application highly available, as users are less likely to experience problems with the servers. Serverless computing also makes the application highly scalable: since resources are allocated dynamically according to demand, the application can scale up when traffic is high.
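As an illustration of how the application layer can collapse into managed functions, the hypothetical sketch below assumes an AWS Lambda function sitting behind an API Gateway endpoint, with DynamoDB standing in for the database layer. The table name, key, and route are assumptions made for the example only.

 import json
 import boto3
 # Hypothetical sketch: the application layer of a three-tier web app as a
 # single function. API Gateway provides the presentation endpoint, this
 # function holds the application logic, and a DynamoDB table ("Orders",
 # assumed here) plays the role of the database layer.
 dynamodb = boto3.resource("dynamodb")
 table = dynamodb.Table("Orders")
 def handler(event, context):
     # Assumes an API Gateway proxy integration for GET /orders/{id}.
     order_id = event["pathParameters"]["id"]
     item = table.get_item(Key={"orderId": order_id}).get("Item")
     if item is None:
         return {"statusCode": 404, "body": json.dumps({"error": "not found"})}
     # default=str handles DynamoDB Decimal values during serialization.
     return {"statusCode": 200, "body": json.dumps(item, default=str)}

Everything outside the handler body, such as servers, operating systems, patching, scaling, and load balancing, is the provider's responsibility in this model.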

Industry Use

Several companies have begun using serverless computing in their business. Any organization running multiple server instances with the need to update its systems in real time can benefit. In 2015 the U.S. shipping company Matson Inc. decided to close four of its data centres and shift its operations to the Amazon Web Services (AWS) cloud. The company experiences heavy server traffic because it offers customers the ability to track their shipping containers in real time. The decision to use Lambda functions began as an experiment. When the serverless idea was proposed, its purpose was to decrease spending on EC2 virtual server instances. The company began using Lambda functions to automatically create and run EC2 instances only during business hours. Matson also created an API gateway, as many clients adopting serverless architectures do; an API gateway gives clients easier access to the function services. Matson's API gateway was created to let its offshore contract team manage servers and to avoid the issue of creating too many AWS user accounts. Matson experienced positive results and decided to integrate the serverless architecture further into its systems. Major technology giants, including Amazon, Microsoft, IBM, and Google, have all released platforms that run serverless architectures. AWS Lambda has been one of the most widely used, in part because it was one of the first to offer automated functions.
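The kind of business-hours automation described above could be sketched as a scheduled Lambda function along the following lines. This is an assumption-laden illustration rather than Matson's actual code: it supposes that a scheduled rule invokes the function with an "action" field and that the relevant EC2 instances carry a hypothetical "Schedule" tag.

 import boto3
 # Sketch of business-hours EC2 automation, assuming a scheduled rule
 # (e.g. CloudWatch Events) invokes this function in the morning with
 # {"action": "start"} and in the evening with {"action": "stop"}.
 ec2 = boto3.client("ec2")
 def handler(event, context):
     # Select instances carrying the (hypothetical) business-hours tag.
     reservations = ec2.describe_instances(
         Filters=[{"Name": "tag:Schedule", "Values": ["business-hours"]}]
     )["Reservations"]
     ids = [i["InstanceId"] for r in reservations for i in r["Instances"]]
     if not ids:
         return {"action": event.get("action"), "instances": []}
     if event.get("action") == "start":
         ec2.start_instances(InstanceIds=ids)
     else:
         ec2.stop_instances(InstanceIds=ids)
     return {"action": event.get("action"), "instances": ids}

A function like this only runs for a few seconds twice a day, which is the usage pattern where paying per invocation is cheaper than keeping a dedicated management server running.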

Canadian Government Use

In 2017 the GC began working with AWS to verify its ability to meet the security assurance needed for Protected B, medium integrity, and medium availability (PBMM) workloads on the cloud. AWS already holds existing security accreditations, including ISO 27001, Service Organization Control (SOC), FedRAMP Moderate and High, and, most recently, the U.S. Department of Defense's Impact Level 5. This demonstrates AWS's ability to secure public cloud infrastructure, which makes it a viable option for running the GC's public cloud infrastructure. Delegating cloud services to AWS would remove significant resource burdens from the GC. It would also allow the GC to enter the space at a slower pace, giving AWS room to adjust the full scope of which services are offered and how they are offered. AWS data centres have already been built in Canada, which allows the GC to benefit from low-latency AWS services. With CloudFront edge locations, the GC can deliver web and application content to end users with low latency. Offering serverless infrastructure to GC departments and industry partners can reduce cost and, if done through a secure public cloud, alleviate significant concerns. Offering a secure public cloud within the GC can provide similar cost-saving benefits.

Implications for Departments

Shared Services Canada

Value proposition

The capability of offering serverless computing aligns with three areas of SSC's target reference architecture. Serverless architecture expands the capabilities of Infrastructure as a Service (IaaS): infrastructure deployment and server provisioning can still be done, but a serverless architecture gives both the client and the provider a more efficient way of doing so. Platform as a Service (PaaS) can still be offered under a serverless architecture; the client can still use auto-scaling and automated infrastructure configuration, but there is no longer idle time between tasks during which resources run unnecessarily. Software as a Service (SaaS) can also still be offered, since services can still be centrally hosted and purchased by clients on a subscription basis. Serverless computing does not disrupt the manner in which SSC has to adhere to the NIST reference model. Using a serverless architecture in both public and private cloud settings still allows SSC to play multiple roles as a consumer, provider, and broker of the technology.

Challenges

There are challenges in adopting a serverless architecture. Serverless computing requires the user's data to be stored on the vendor's servers, which can be a challenge since sensitive data can be subject to compromise. Latency can also be introduced when running SaaS applications: if the application requires data to be processed and returned within milliseconds, this can be a problem for serverless architectures. In addition, because of the auto-scaling nature of the model, the provider will sometimes "spin down" or shut off a server when it is not in use. This means that when the client attempts to perform the task again, the system has to re-create and reconfigure new instances, which can also introduce greater latency. Serverless computing is also not well suited to large, sustained computing needs.
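One common workaround for the "spin down" latency, sketched below under assumptions, is to have a scheduled rule invoke the function every few minutes with a warm-up payload so that the provider keeps an instance initialized. The "warmup" field is an illustrative convention, not part of any provider API.

 import json
 # Sketch of a keep-warm pattern: a scheduled event sends {"warmup": true}
 # periodically; real requests take the normal path. This reduces, but does
 # not eliminate, cold-start latency, and it trades some of the pay-per-use
 # savings for responsiveness.
 def handler(event, context):
     if event.get("warmup"):
         # Return immediately; the only goal is to keep the container warm.
         return {"statusCode": 200, "body": "warm"}
     # Normal request handling would go here.
     return {"statusCode": 200, "body": json.dumps({"message": "handled"})}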

Dept X

Content to be added by each department

Sources

https://www.cio.com/article/3244644/cloud-computing/serverless-the-future-of-cloud-computing.html

https://www.networkworld.com/article/3187093/cloud-computing/serverless-explainer-the-next-generation-of-cloud-infrastructure.html

https://www.forbes.com/sites/janakirammsv/2017/01/31/how-can-enterprises-leverage-serverless-computing-platforms/#49f526b53c35

https://www.stratoscale.com/blog/compute/will-serverless-computing-kill-infrastructure/

https://techbeacon.com/essential-guide-serverless-technologies-architectures

https://read.iopipe.com/voyage-into-the-world-of-serverless-computing-with-a-135-year-old-shipping-company-4a8380b90bf3

http://www.govtech.com/opinion/Serverless-Computing-Is-a-Growing-Trend-Heres-What-You-Need-to-Know.html

https://aws.amazon.com/blogs/publicsector/security-assurance-package-submitted-to-the-government-of-canada/

https://aws.amazon.com/blogs/publicsector/canada-central-region-now-open/

https://impaddo.com/blog/serverless-architecture/

https://en.wikipedia.org/wiki/Software_as_a_service