Technology Trends/Software Defined Anything



Status: Published
Initial release: November 6, 2019
Latest version: February 17, 2020
Official publication: Software Defined Anything.pdf
This page is a work in progress. We welcome your feedback. Please use the discussion page for suggestions and comments. When the page is approved and finalized, we will send it for translation.

Software Defined Anything (SDx), also often referred to as Software Defined Everything/Anything (SDE/SDA), is the concept of having a computing infrastructure that is completely virtualized. In this manner, the infrastructure can be delivered to customers as a service.



Business Brief

Because software is inherently more flexible than hardware, SDx gives IT a chance to respond faster to ever-changing business requirements. SDx minimizes costs by automating process control and replacing traditional hardware with software. Reduced network provisioning time, a simplified network environment, reduced service costs, and enhanced network management efficiency are the key factors behind the rapid growth of the SDx market.

The working scope of field engineers in an SDx environment is reduced. Management and maintenance tasks are simplified using SDx; however, there are still considerations to account for. Despite being decoupled from the hardware, systems built to be entirely virtualized must still be extensively tested on the hardware they will eventually run on. In network virtualization, the complexity of the infrastructure is a key factor of success. Controllers requiring high maintenance should be kept to a minimum; otherwise the result is more frequent failures and a higher vulnerability to security threats. The two largest benefits of SDE are cost optimization and manpower reduction; however, human effort is still required and cannot be overlooked in the process. The fundamentals of the underlying system, as well as current trends, should be understood so that the infrastructure can be programmed accordingly to meet these requirements.

Technology Brief

In general, SDx is achieved by decoupling software control functions from the hardware they run on, through Application Programming Interfaces (APIs) that enable this concept. In the SDx model, the infrastructure becomes virtualized, is completely controlled by software, and is delivered as a service to clients. Automation is achieved by removing control from the hardware, in the sense that aspects such as configuration, deployment, and provisioning are controlled and driven by software.[1]
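
To make this concrete, the following is a minimal, illustrative sketch (in Python) of the declarative idea behind SDx: the desired infrastructure is expressed as data, and software computes the provisioning actions needed to realize it. The service names, fields, and resource pool below are hypothetical examples chosen for illustration, not any particular platform's API.

```python
# Illustrative sketch only: "infrastructure as data" reconciled by software.
# The resource names and fields are hypothetical; real SDx platforms expose
# their own APIs and schemas for this kind of declarative provisioning.

# Desired state, declared in software rather than configured device by device.
desired = {
    "web-frontend": {"instances": 3, "vcpus": 2, "storage_gb": 100},
    "database":     {"instances": 2, "vcpus": 8, "storage_gb": 500},
}

# Current state, as reported by a (simulated) virtualized resource pool.
actual = {
    "web-frontend": {"instances": 1, "vcpus": 2, "storage_gb": 100},
}

def reconcile(desired: dict, actual: dict) -> list[str]:
    """Compute the provisioning actions needed to make reality match intent."""
    actions = []
    for service, spec in desired.items():
        have = actual.get(service, {"instances": 0})
        missing = spec["instances"] - have["instances"]
        if missing > 0:
            actions.append(f"provision {missing} x {service} "
                           f"({spec['vcpus']} vCPU, {spec['storage_gb']} GB)")
    return actions

for action in reconcile(desired, actual):
    print(action)   # a real controller would drive hypervisors/switches here
```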

In general, SDx is an umbrella term that includes three primary concepts:

  • Software defined networking (SDN): A network architecture that makes network devices programmable. SDN addresses the failure of traditional networks to support the dynamic, scalable computing and storage needs of today's applications.[2] SDN achieves that goal by separating network management from the underlying network infrastructure and components. This means network behavior can be controlled using software external to the physical networking devices. Networks are therefore more customizable from a service offering perspective, where services can be tailored to individual customers. An SDN is separated into three layers: the application layer, the control layer, and the infrastructure layer. Between the control and infrastructure layers is what is known as the control plane-data plane interface. The control layer is where the SDN software resides and performs its functions. Physical switches in the infrastructure layer use either a proprietary technology or the OpenFlow protocol. With this protocol, traffic is routed in a way that allows the control plane server to dictate where the switches send packets, which removes that responsibility/functionality from the data plane. The controller also runs several policy, traffic-engineering, and security applications to control the network elements via APIs. In addition, these APIs allow new functionality to be integrated quickly (a simplified sketch of this controller-based approach follows this list).
  • Software defined storage (SDS): This refers to computer data storage software that is independent of the underlying hardware.[3] Storage automatically adapts to new demands by pairing resource flexibility and programmability. Programmability includes policy-based management and the automated provisioning of resources. Software is decoupled from hardware. SDS allows you to leverage existing storage solutions such as Storage Area Network (SAN) and Network Attached Storage (NAS) on any industry standard hardware.
  • Software defined data centers (SDDC): This is one in which all elements of the data center infrastructure (networking, storage, CPU, and security) are delivered as a service. Control of the data center is automated by software. Basically, SDDC consists of three core components: network virtualization, server virtualization, and storage virtualization.[4]
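
To illustrate the control plane/data plane separation described in the SDN bullet above, here is a deliberately simplified sketch of a controller that owns the flow table and makes forwarding decisions on behalf of the switches. The rule fields and names are assumptions made for illustration only; real controllers program switches through OpenFlow or proprietary southbound protocols and expose their own northbound APIs.

```python
# Toy illustration of the SDN idea: forwarding decisions live in the control
# layer (controller), not in the individual switches. Rule fields and names
# are illustrative only; real controllers speak OpenFlow or proprietary
# protocols to program the switches in the infrastructure layer.

class SdnController:
    def __init__(self):
        self.flow_table = []   # ordered list of (match, action) rules

    def install_rule(self, match: dict, action: str) -> None:
        """An application (policy, traffic engineering, security) adds a rule
        through the controller's northbound API."""
        self.flow_table.append((match, action))

    def decide(self, packet: dict) -> str:
        """A switch asks the control plane where to send a packet it has no
        rule for; the matching decision is then pushed down to the switch."""
        for match, action in self.flow_table:
            if all(packet.get(field) == value for field, value in match.items()):
                return action
        return "drop"   # default action when nothing matches

controller = SdnController()
controller.install_rule({"dst_port": 443}, "forward:dmz-gateway")
controller.install_rule({"src_zone": "guest-wifi"}, "forward:captive-portal")

print(controller.decide({"dst_port": 443, "src_zone": "corp"}))   # forward:dmz-gateway
print(controller.decide({"dst_port": 23,  "src_zone": "corp"}))   # drop
```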

Other standalone or partially overlapping SDx domains and terms include:

  • Software-Defined Application Services (SDAS)
  • Software-Defined Edge (for IoT)
  • Software-Defined Compute (SDC)
  • Software-Defined Infrastructure (SDI)
  • Software defined computing: The purpose of software-defined computing is to remove intelligence from the hardware and abstract it to a far more standardized software layer. The technology functions are moved to a virtualized infrastructure, thereby presenting computing infrastructure as pools of virtual and physical resources.
  • Software defined environment: In an SDE environment, storage, data center infrastructure, and network management are automated by intelligent software rather than by hardware.
  • Software defined security: This is a new approach to improve security within a software-defined networking environment. Because software-defined security is entirely software-based, security policy is elastic and security is available on demand.[5]
  • Software-defined hypervisor (SDH): This refers to virtualizing the hypervisor layer and separating it from its management console.

Industry Usage

From now until 2020, the size of the digital universe will double every two years. The industry is under ever-growing pressure to replace its existing IT infrastructure with innovative models that can reduce costs. Consequently, companies are increasingly adopting SDx, as it provides a leaner business model, mostly through the virtualization and flexibility brought in by the hardware-controlling software. In this growing market, an abundance of technical solutions and products is becoming available to automate every aspect of IT service delivery. A market growth of 32% CAGR is predicted over the period 2016-2020. The shift towards the virtualization of IT infrastructure is intended to significantly reduce costs.[6]

The best known and most widely adopted examples of the SDx concept in use are the Platform as a Service (PaaS) and Infrastructure as a Service (IaaS) forms of cloud computing. Driven by virtualization and SDx, cloud computing has become a mainstream concept and the IT solution of choice in recent years, to the point where some of the major IT vendors have shifted their business strategy towards this concept. An example is Microsoft, which in 2011 committed 90 percent of its $9.6 billion R&D budget to its cloud strategy.[7] That investment has paid off well, as Microsoft has become only the third company globally to reach a $1 trillion market value, thanks to its strategic shift from traditional personal computing towards the cloud.[8]

Other competitors such as Amazon, Google, and Apple are also heavily investing and counting on a profitable return on investment from the cloud, especially since some major corporations[9] have already adopted a Cloud First strategy. The software-defined storage market accounted for $4.18 billion in 2016 and is expected to grow at a CAGR of 39.0% to reach $42.1 billion by 2023. The global software-defined networking market is expected to grow by USD 16.27 billion during 2019-2023.

The global software-defined networking market is expected to post a CAGR of close to 24% during the period 2019-2023, according to the latest market research report by Technavio. However, the market’s momentum will decelerate in the coming years because of the decrease in year-over-year growth. A key factor driving the global software-defined networking market size is the increasing demand for cloud solutions.

The global software-defined compute market is expected to post a CAGR of over 13% during the period 2019-2023, according to the latest market research report by Technavio.

The global Software-Defined Data Center (SDDC) market size is expected to grow from USD 33.5 billion in 2018 to USD 96.5 billion by 2023, at a Compound Annual Growth Rate (CAGR) of 23.6% during the forecast period. SDDC enables unified management and monitoring of data center resources, empowering faster allocation of computing, storage, and network resources using a single point of control delivered by the SDDC solution. However, traditional data centers face issues with the integration of servers, networks, and storage infrastructure through SDDC technologies. These factors may restrict the adoption of SDDC.

Canadian Government Use

The Government of Canada (GC) relies heavily on Information Technology (IT) to conduct its operations and daily business activities. IT plays an integral role in government operations while also being a key enabler in transforming the business of the GC. IT is an essential component of the GC's strategy to address digital transformation challenges and enhance services to the public for the benefit of citizens, businesses, taxpayers, and employees. (Government of Canada, 2018)

The GC Enterprise Security Architecture (ESA) Program, led by the Treasury Board of Canada Secretariat (TBS) and supported by Shared Services Canada (SSC) and the Communications Security Establishment (CSE), is a GC-wide initiative to provide a standardized approach to developing IT security architecture that ensures basic security blocks are implemented across the enterprise as the infrastructure is renewed. These three stakeholders formed the IT Security Tripartite to develop and maintain a consistent and cohesive enterprise IT security architecture vision, strategy, and designs under the ESA program.[10]

The GC ESA program is separated into eight individual Enterprise Security Focus Areas (ESFAs)[11], including: Identity, Credential and Access Management (ICA); Endpoint Security (END); Data Security (DAT); Application Security (APP); Network and Communications Security (NCS); Security Operations (OPS); and Compute and Storage Services Security (CSS).[12]

The CSS target architecture transition is focused on the virtualization of compute, storage, file, network, and database services within the framework of a Software-Defined Environment (SD-Environment).

In an SD-Environment, SD-computing infrastructure is transforming from a hardware-based set of standardized and consolidated infrastructures to virtualized services with full lifecycle management that are available on demand, independent of the location of the underlying services and physical assets. Because the control plane infrastructure is isolated from service requests and hardware assets, the computing infrastructure is implementable in software across cloud provider solutions and is well suited to a hybrid cloud implementation.

The roles of TBS, SSC, and CSE in carrying out the GC ESA program vary but complement one another. TBS's role is to develop the long-term vision and establish the priorities for the GC ESA program; it also leads the development of enterprise strategies and designs. SSC's role is to implement designs for consolidated IT infrastructure and provide service delivery. Lastly, CSE's role is to provide specialized technical expertise for enterprise designs and to contribute design support and review for critical components.

The GC is taking steps to transform its legacy systems and aging IT infrastructure into an integrated, secure, modern, and agile environment that will provide the GC, citizens, and partners with reliable and trusted access to GC programs and services.

Canadian government departments and agencies, directly or through proxies such as SSC, are in the process of adopting the SDx concept. The Treasury Board of Canada supports cloud computing and other SDx-based concepts through its initiatives and directives.[13]

Software defined networking (SDN), Software defined storage (SDS) and Software defined data centers (SDDC) concepts have already been adopted in the Government of Canada (GC) to a certain degree.

It is worth mentioning, however, that the majority of SDx adoptions in the GC at this stage are hybrid SDx solutions and implementations, such as software-defined WAN[14], which communicate and integrate with more traditional WAN networks that may be geographically separated or inhabit different network zones.

Implications for Government Agencies

Shared Services Canada (SSC)

Value Proposition

Each SDA topic (SDN, SDDC, SDAS, etc.) has its own specific benefits and business values. However, the overall business values of implementing SDA are: Improved Hybrid Flexible Infrastructure and Scalability; Improved Business Agility, Control, and Velocity; Increased Cost Savings through Automation; and Improved Security.

Improved Hybrid Flexible Infrastructure and Scalability

  • Software-defined infrastructure is highly flexible in that it improves cloud and edge enablement, which in turn allows workloads and application states to be moved more easily to and from cloud and on-premise data centers. As new hybrid environments proliferate, where customer data is often scattered across on-premises, colocation, edge, and multicloud environments, the ability to function seamlessly across environments is critical. SDA allows organizations to work between on-premise data centers and off-premise vendors with more accuracy.
  • Software-Defined Systems (SD-Systems) allow movement between one system and another, switching environments whenever desirable. Applications can be set up instantly over SD-Systems, and can be taken down just as quickly. SDA avoids the hassle of “rip and replace” hardware upgrades or hiring extra staff to manage an environment change. It allows an organization to scale on a very low margin. Since the management and control of the networking, storage, and/or data center infrastructure is automated by intelligent software rather than by the hardware components of the infrastructure, SDA can automatically scale up or down depending on client load requirements (a minimal scaling sketch appears at the end of this subsection). This means that organizations are better positioned to handle onboarding and offboarding of numerous clients without requiring new staff to handle the new traffic.

SDA's flexibility of deployment choices also helps I&O leaders enable hybrid cloud workflows, as data can be ingested, processed, and integrated across any deployment scenario.
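
As a minimal sketch of the automatic scale-up/scale-down behaviour described in the list above: software observes a load metric and decides the capacity the platform should converge to. The thresholds, bounds, and metric are arbitrary values chosen for illustration, not any particular product's policy language.

```python
# Illustrative scaling rule: software observes load and decides capacity;
# an SDx controller would then reconfigure compute/storage/network to match.
# Thresholds and bounds are arbitrary values chosen for the example.

SCALE_UP_AT = 0.80     # add capacity above 80% average utilization
SCALE_DOWN_AT = 0.30   # release capacity below 30% average utilization
MIN_INSTANCES = 1
MAX_INSTANCES = 20

def target_instances(current: int, utilization: float) -> int:
    """Return the instance count the platform should converge to."""
    if utilization > SCALE_UP_AT:
        return min(current + 1, MAX_INSTANCES)
    if utilization < SCALE_DOWN_AT:
        return max(current - 1, MIN_INSTANCES)
    return current  # load is within bounds; no change needed

# Example: a tenant's load spikes, then quiets down.
print(target_instances(3, 0.92))  # -> 4 (scale up)
print(target_instances(4, 0.15))  # -> 3 (scale down)
print(target_instances(3, 0.55))  # -> 3 (steady state)
```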

Improved Business Agility, Control, and Velocity

  • SDA improves agility; agility is the ease with which an organization's data computation can navigate complex environments quickly, according to its specific needs. Since SDA supports automation, orchestration, and event-driven activities, it enables faster deployment and changes in response to changing business conditions compared with traditional architectural approaches.
  • In a Software-Defined environment, management and control of the storage, networking, and/or data center infrastructure is automated by intelligent software, rather than by hardware. SDN provides a centralized view of the entire network, which makes it easier to centralize enterprise management. Since SDN abstracts the control and data planes, it can accelerate service delivery and provide more agility in provisioning both virtual and physical network devices from a central location. Infrastructure can be scaled up or down programmatically via SDA, without being tied to specific physical infrastructure components. This capability increases the velocity of delivery to clients.

Cost Savings through Automation

  • SDA decouples the software from the industry-standard hardware. This allows tech leads to maintain hardware for a longer period of time without having to upgrade and migrate data, and avoids hardware vendor lock-in. This also helps organizations reduce costly instances of major or large-scale upgrades (known as forklift upgrades), where large parts of the infrastructure must be overhauled and new hardware invested in.
  • The flexible pay-as-you-go purchasing model of SD-Systems can also help realize cost-efficiencies. Instead of choosing from a number of large vendors, SDA enables organizations to leverage emerging, smaller, and sometimes less expensive providers for specific business lines.
  • Although automation can be achieved with manual scripts, it is often easier with SDA due to greater extensibility. The ideal end-state is an intelligent, integrated framework that improves automation across the entire infrastructure layer. Many data center networking tasks are performed manually, which increases time, cost, and the likelihood of human error, and reduces flexibility. SDA incorporates automation into the network, data center, and infrastructure, and extends programmatic control. This greater control can even reduce overall energy consumption and result in energy savings.

Improved Security

  • One of the advantages of SDN is centralized security. The virtualization of machines has made network management more challenging. With virtual machines coming and going as part of physical systems, it is more difficult to consistently apply firewall and content filtering policies. Additionally, complexities such as securing BYOD devices compound the security challenges of today's networks.[15]
  • The SDN Controller provides a central point of control to distribute security and policy information consistently throughout the organization. Although centralizing security control in one entity (the SDN Controller) has the disadvantage of creating a central point of attack, SDN can be used effectively to manage security throughout the network if implemented properly.

Challenges

Information Technology enables the GC to conduct operations and deliver services to Canadians. It is strategically critical for increasing government productivity and enhancing services to the public for the benefit of citizens, businesses, taxpayers and employees. The GC invests a significant portion of its annual budget on IT and supporting infrastructure. However, rapidly developing technology, incompatible business practices and a fragmented approach to IT investments can undermine effective and efficient delivery of government programs and services.[16]

SDx is a great concept that brings many benefits. However, it faces various adoption challenges including managing programmatic control, monitoring ever-expanding environments, reconciling incompatible legacy systems, working with the immature SDx industry, and re-training existing personnel or hiring new personnel experienced in SDx.

If an organization decides to move in the SDx direction, enabling programmatic control over some or all infrastructure platforms can be a significant challenge. If programmatic control is a priority, this will influence infrastructure purchase and deployment decisions. The switch from manual to programmatic control can be a barrier to change. Additionally, whenever a new service starts in SDx, it deploys the necessary virtual infrastructure, and the number of monitored elements can grow rapidly with increased demand. This can outpace traditional monitoring and capacity management. Having service context is an expectation for IT today; the challenge is ensuring SDx systems can conduct performance monitoring in the context of a particular customer or tenant of the network. This can be difficult if programmatic automation and monitoring are not in sync.

Additionally, legacy and packaged applications may not be adequately service-enabled to participate in modern application architectures such as SDx. Traditional networks are difficult to automate because of the distributed nature of their decision-making process, with switching and routing functions co-located on the same device. Some legacy systems may never be able to shift to SDx architectures.

The SDx product offerings are also highly varied and constantly changing. There are established and emerging vendors that deliver different value propositions and product capabilities. IT leaders will often struggle to navigate SDx vendor solutions. The value shift to software is disrupting traditional business models. Without an effective software-defined guidance process to follow, IT professionals struggle to plan and implement a cohesive framework that spans from on-premises data centers to edge and hybrid-cloud environments.

Significant investments in personnel with specialized skills will be necessary to effectively develop and manage SDA within an organization. Most architects do not have the domain knowledge required to successfully deploy an extensive, flexible and modular set of infrastructure capabilities that support the development, deployment and operation of applications implemented using software-defined application services (SDAS). Implementing software-defined architectures can be challenging for technical professionals. A large burden of responsibility is put on IT professionals in that they must deploy, integrate, and orchestrate numerous SDx technology components, often from a variety of vendors, such as software-defined storage (SDS), software-defined networking (SDN) and software-defined compute (SDC). Alternatively, organizations have sought out converged solutions such as hyperconverged infrastructure (HCI) and hyperconverged integrated systems (HCIS) from a single vendor, where the vendor has integrated the SDx stacks into a cohesive solution offering. This is another challenge as a single vendor approach to SDx can address many integration concerns, but often cannot address the entire enterprise.

Considerations

SDx solutions promise the ability to manage and control IT systems and solutions for less in terms of time, money, and complexity. SSC, as the major digital/IT service provider to the GC, should proceed with caution in leveraging this concept in the future.

In recent years there has been increased demand for SDx solutions, and many vendors have been capitalizing on it. The key players in the SDx market are Cisco, HP, IBM, Microsoft, Citrix, EMC, and VMware. Others worth mentioning are Juniper Networks, NEC, 6Wind, Arista Networks, Avaya, Dell, Ericsson, Fujitsu, Big Switch Networks, Brocade, DataCore, Hitachi Data Systems, NetApp, Nexenta, Pertino, Pivot3, Plexxi, PLUMgrid, and SwiftStack.

The future of corporate networks, data centers, data storage, and security implementation is likely to be a hybrid system in which certain functions are translated into the commanding software layer, while many other aspects continue to exist within the hardware itself. Balancing the solution architecture will continue to be critical in order to achieve the desired results.

SSC must be cognisant of the fact that not all SDx products deliver on all benefits; use cases and strategic plans are required. SDx is in the early stages of maturity and will change significantly over the next five years. IT leaders must assess vendor lock-in and develop contingency plans for migration to alternative technology. Most software-defined offerings are limited to infrastructure silos and lack cross-silo interoperability, automation, and integration unless provided by a single provider. Other considerations for implementation include planning software-defined environments and capabilities beyond the data center to edge environments with IoT, and how that will affect the organization.

IT professionals will struggle in the planning and implementation of a cohesive framework that can span from on-premises data centers to edge and hybrid-cloud environments. It is important that organizations develop a phased SDx framework first: understand how the framework will be implemented, who will use it, and how it integrates into an overall enterprise-based hybrid-cloud strategy. Allowing stand-alone software-defined architectures may in fact create SD sprawl. If IT organizations do not adhere to a sound guidance framework for implementing SDx, the result may be a proliferation of SDx control planes (particularly from multiple vendors) and increased complexity and interoperability challenges.

A developed data center strategy for hybrid operations is another consideration. As IT organizations embrace hybrid environments between data centers, edge environments, and cloud services, a clear strategy to address distributed architectures and hybrid-based operations will be valuable. Without a clear strategy for the transformation based on the organization's needs, IT leaders can become absorbed in the technology and miss the bigger picture of how and when they need to reshape the data center for hybrid operations. The “perfect” data center is not one that implements every new technology available; rather, it is one that is ideally suited to an organization's specific, current, and future business needs while balancing technology innovation with real, quantitative business value.

The overall SDx transformation is likely to span multiple years, during which management and priorities may change. The CIO turnover rate is often three to five years. If a new CIO arrives, and goals, timelines and measurable benefits are not documented in a strategy, he or she may decide that SDx efforts already underway aren’t worth continued investment. Setting goals and expectations with management is critically important.

Retraining staff to become brokers of hybrid IT services is a paramount consideration. A key priority related to the hybrid-cloud transition is to retrain staff capable of running the development and operations for this new environment. In particular, staff will need to be more knowledgeable of multiple architectures and solutions in order to ensure the organization properly integrates SDx within the greater IT infrastructure.

SSC could consider conducting an options analysis of the ten costliest software offerings and areas currently provided to the GC that could be automated for programmatic scalability, in order to assess whether credible SDx solutions could be leveraged to realize greater operational efficiencies and cost savings. An organization should not prefer either traditional IT or SDx over the other; instead, it should evaluate service and business lines to determine where SDx could be leveraged. Wide-sweeping SDx initiatives and change-overs are to be avoided. Moving full-scale off traditional legacy and established products is not a prudent strategy for organizations that are not built in an agile or flexible way. It will be important to understand which legacy and current technologies SDx replaces, as well as how existing technologies will need to change as a result of the new architecture.

SSC should consider where SDx can deliver the biggest impact and address immediate needs, identify operational and business delivery “pain points,” and set SDx project priorities accordingly. Rather than attempting to implement SDx in a predefined order when it may not be appropriate, flexibility and assessing where to leverage SDx are key to a successful SDx environment. Some areas of SSC's infrastructure may never need to be part of the SDx transition. For example, if a specific storage configuration changes only once or twice a year, it is probably not worth software-defining that piece.

Ultimately, determining the right time to act and drive toward SDx shouldn’t be driven by vendor hype or budget availability. Rather, it should be driven by the need to meet business requirements.

References


  1. CIO QuickPulse. (2015, January). The Road to a Software-Defined. Retrieved from F5.
  2. Sadiku, M., Nelatury, S., & Musa, S. (2017, January). Software Defined Everything. Retrieved from oaji.
  3. Wikipedia. (2019, May 17). Software-defined storage. Retrieved from Wikipedia.
  4. Beal, V. (n.d.). SDDC - software-defined data center. Retrieved from Webopedia.
  5. Sadiku, M., Nelatury, S., & Musa, S. (2017, January). Software Defined Everything. Retrieved from oaji.
  6. Wood, L. (2016, January 12). Research and Markets: Global Software Defined Anything (SDx) Market Growth of 32% CAGR by 2020 - Analysis, Technologies, Opportunities & Forecasts 2016-2020. Retrieved from BusinessWire.
  7. CloudTimes. (2011, April 12). Microsoft Says to Spend 90% of R&D on Cloud Strategy. Retrieved from CloudTimes.
  8. Vena, D. (2019, April 25). How Microsoft Hit $1 Trillion: Cloud Computing, Steady Growth. Retrieved from The Motley Fool.
  9. Konrad, A. (2016, March 23). Why Coca-Cola Works With Both Google And Its Rivals In The Cloud And Warns Not To Worry About Price. Retrieved from Forbes.
  10. Government of Canada. (2018, January 7). Government of Canada Enterprise Security Architecture (ESA) Program. Retrieved from GCPedia.
  11. Shared Services Canada. (2016, October 4). Cyber and IT Security - July 7, 2014. Retrieved from ssc-spc.gc.ca.
  12. Government of Canada. (2018, March 29). Policy on Management of Information Technology. Retrieved from Government of Canada.
  13. Government of Canada. (2017, November 1). Direction on the Secure Use of Commercial Cloud Services: Security Policy Implementation Notice (SPIN). Retrieved from canada.ca.
  14. SDxCentral Staff. (2017, February 9). What’s the Difference Between Hybrid WAN and SD-WAN. Retrieved from sdxcentral.com.
  15. Data Center. (2017, August 8). 7 Advantages of Software Defined Networking. Retrieved from imaginenext.ingrammicro.com.
  16. Treasury Board of Canada Secretariat. (2019, August 2). Directive on Management of Information Technology. Retrieved from tbs-sct.gc.ca.