The parallels between manufacturing supply chain best practices and IT service delivery are clear. What hasn’t been clear is how to construct a working system that automates IT service delivery. The goal is a commerce engine for IT service consumption, with the business user in charge of that consumption. This article discusses how such a system would actually work.
Now that virtual machines can model almost any component in a datacenter (security systems, load balancers, apps, operating systems, compute, etc.), we have the opportunity to automate the creation of a complete IT service. Building an IT organization that follows the Demand Supply IT operating model is challenging because it is process intensive. If everyone in the IT organization has to interact manually to fulfill the model, simply maintaining process continuity puts tremendous strain on them. To succeed with the model, you have to automate the process of service delivery. Only then will you realize the cost savings and end-user experience you were hoping to achieve.
BOMs and Blueprints
In manufacturing, a finished good is understood by examining its Bill of Materials (BOM): the major and minor components that, when put together, make the finished good. When we examine an IT service consumed by a business user, we have a Service Blueprint. That blueprint describes all the IT components required to deliver the service.
Ecommerce: The Model for IT Service Delivery Automation
When you look at the Demand Supply IT model, what you are hoping to accomplish is a commerce engine that creates a supply of IT services for business users to consume as they see fit. But unlike Amazon, which picks actual products, packages them, and ships them to a customer, you are delivering an IT service. Services can be created entirely in software and therefore require a different supply chain construct.
Blueprints and Orchestration
When building a standardized set of components, we create a blueprint. That blueprint not only describes the virtual components but can also describe dynamic scaling when growth or contraction of a service stack is required. This blueprint will live in an orchestration engine which will have the ability to create through APIs the required virtual systems to build out the entire service based on that blueprint.
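To make the idea concrete, here is a minimal sketch of what a blueprint and its orchestration might look like. All names, fields, and the `deploy` function are hypothetical illustrations, not a real orchestration engine's API.

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """One virtual component in a service blueprint (names are illustrative)."""
    name: str           # e.g. "load-balancer", "app-server"
    image: str          # VM template or machine image to instantiate
    min_count: int = 1  # lower bound for dynamic scaling
    max_count: int = 1  # upper bound for dynamic scaling

@dataclass
class Blueprint:
    """Describes the full stack of virtual systems for one IT service."""
    service: str
    components: list = field(default_factory=list)

def deploy(blueprint):
    """Walk the blueprint and list the API calls an orchestration engine
    would issue to build out the entire service stack."""
    calls = []
    for c in blueprint.components:
        for i in range(c.min_count):
            calls.append(f"create_vm(image={c.image}, name={c.name}-{i})")
    return calls

# A hypothetical two-tier web application blueprint.
web_app = Blueprint("web-app", [
    Component("load-balancer", "lb-image"),
    Component("app-server", "app-image", min_count=2, max_count=6),
])
print(deploy(web_app))
```

The `min_count`/`max_count` pair is one simple way to capture the dynamic scaling the blueprint describes: the orchestration engine provisions the minimum and can grow the stack toward the maximum as demand requires.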
That blueprint of systems could land in a private cloud, a virtual private cloud, or a public IaaS provider environment. It’s essential to have an orchestration platform that can push those blueprints to a number of public providers or to your own private cloud environment. This will provide your service owners options to help manage costs and security concerns.
For those concerned about compliance or cost, a policy engine that decides where a blueprint should reside (private, virtual private, or public cloud) is essential. It may be a requirement to keep certain IT services within your private network; the policy engine serves as a compliance tool, assuring that the blueprint is deployed according to that kind of rule. Cost may also be a factor: a policy engine could dictate, for instance, that a test environment can only be deployed in a consumption-driven public cloud for a limited time before the VMs are turned down.
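A placement decision like the one just described can be sketched as a small rule function. The rule names and request fields below are assumptions for illustration; a real policy engine would be far more expressive.

```python
# Target environments the policy engine can choose between.
PRIVATE, VIRTUAL_PRIVATE, PUBLIC = "private", "virtual-private", "public"

def place(request):
    """Decide where a blueprint may reside, based on hypothetical
    compliance and cost rules."""
    if request.get("handles_regulated_data"):
        return PRIVATE          # compliance rule: keep inside the private network
    if request.get("environment") == "test":
        return PUBLIC           # cost rule: consumption-driven public cloud
    return VIRTUAL_PRIVATE      # default middle ground

print(place({"handles_regulated_data": True}))
print(place({"environment": "test"}))
```

In practice the same engine would also attach lifetime limits (e.g. turning down a test environment's VMs after a set period), but the core idea is simply rules mapping a request to a deployment target.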
Governance is about who has the authority to request services. It can also work in conjunction with the policy engine to determine which instance of a service someone can request. The governance engine may create an approval workflow to grant someone the service they are requesting. By leveraging directory services, service delivery automation can occur in compliance, accurately matching the right service with the right person and the right provider.
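The directory-driven authorization described above can be sketched as a lookup against group membership, with an approval workflow as the fallback. The users, groups, and entitlements here are all hypothetical.

```python
# Hypothetical directory service data (in reality this would come from
# something like LDAP or Active Directory).
DIRECTORY = {
    "alice": {"groups": {"sales"}},
    "bob":   {"groups": {"engineering", "service-owners"}},
}

# Which directory groups are entitled to which services (illustrative).
ENTITLEMENTS = {
    "crm":        {"sales"},
    "build-farm": {"engineering"},
}

def authorize(user, service):
    """Auto-approve when directory membership grants the service;
    otherwise route the request into an approval workflow."""
    groups = DIRECTORY.get(user, {}).get("groups", set())
    if ENTITLEMENTS.get(service, set()) & groups:
        return "approved"
    return "pending-approval"

print(authorize("alice", "crm"))
print(authorize("alice", "build-farm"))
```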
The catalog is where the service owner can request that their service be deployed in a cloud environment. It is also where a business user can request the services they require for their own productivity or job function.
Identity and Access Management Platform
When a business user requests a service, the same engine that deploys blueprints can also be built to work with an identity and access management system. The result is that the end user's request for services can be auto-provisioned, with access granted to the end user through single sign-on. The service can also be provisioned for that user with the appropriate level of authorization. Coupled with the policy engine, the right instance of a service (like Salesforce.com) can be deployed.
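The auto-provisioning flow can be sketched as a mapping from directory groups to an authorization level, returning an access grant alongside the service instance. The role map and the SSO token are placeholders, not a real IAM product's API.

```python
# Hypothetical mapping from directory groups to authorization levels.
ROLE_MAP = {
    "sales":          "standard-user",
    "sales-managers": "admin",
}

def provision(user, groups, service="crm"):
    """Auto-provision a service for a user at the authorization level
    their directory groups entitle them to."""
    role = next((ROLE_MAP[g] for g in groups if g in ROLE_MAP), None)
    if role is None:
        raise PermissionError(f"{user} has no entitlement to {service}")
    return {
        "service": service,
        "user": user,
        "role": role,                  # authorization level from the directory
        "sso": f"token-for-{user}",    # single sign-on grant (placeholder)
    }

print(provision("alice", ["sales"]))
```

The key point is that one request drives both deployment and access: the engine never hands a user a service without also granting the matching sign-on and authorization level.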
Bill of IT
The cost of IT is a major part of the Demand Supply IT model. Bill of IT is the ability for IT service consumption to be understood and passed back to the business user as a per-unit charge. When working with cloud providers like Amazon for IaaS, you get billed based on resource consumption. To accurately understand the cost of your consumption, a billing API needs to be exposed by the provider. The orchestration engine needs to be able to leverage that billing API to understand the true cost of the service consumption.
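The roll-up from raw consumption data to a per-service charge-back can be sketched as follows. `fetch_usage` is a stand-in for a real provider billing API, and the line items and prices are invented for illustration.

```python
def fetch_usage(provider):
    """Stand-in for a provider billing API call; returns raw usage
    line items (quantities and unit costs are illustrative)."""
    return [
        {"service": "web-app", "resource": "vm-hours",   "qty": 720, "unit_cost": 0.05},
        {"service": "web-app", "resource": "storage-gb", "qty": 200, "unit_cost": 0.02},
    ]

def bill_of_it(provider):
    """Aggregate raw usage into a per-service total the business user
    can be charged back for."""
    totals = {}
    for line in fetch_usage(provider):
        totals[line["service"]] = (
            totals.get(line["service"], 0) + line["qty"] * line["unit_cost"]
        )
    return totals

print(bill_of_it("example-iaas"))
```

With a fixed per-seat SaaS charge this aggregation is trivial; it is the variable, resource-based IaaS consumption that makes an exposed billing API essential.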
This is an area that all IaaS providers should be offering. SaaS providers generally charge per person/seat, a fixed unit cost that can be understood month over month. IaaS providers, however, use a resource consumption model, which makes it challenging to understand your costs. This is why billing APIs are so important.
ERP for IT is a System of Commerce
Having a single system that ties together the catalog, governance, policy, blueprints with the API logic to deploy properly in the cloud environment of choice, and Bill of IT is the essence of the modern ERP for IT: one single system of record to process business user requests, automate service delivery, and provide charge-back for actual consumption.
The implications for the IT organization are significant. Today people can buy product designs for 3D printers and produce the products they want at their discretion. Now, imagine providers that will sell you a blueprint based on a best-practice deployment. The need for in-house engineering resources will be further reduced: enterprise IT organizations can purchase blueprints and managed services to support them, while where the systems get deployed remains at the discretion of the enterprise.
This paradigm won’t be exclusive to enterprises. Service providers will use orchestration engines like this as well. Based on customer requirements, their services could be deployed in the provider’s public cloud or on the enterprise customer’s premises. The notion that SaaS services can only be delivered from a public cloud gets shattered here.
The field for these ERP for IT systems is still nascent. But when you look at solutions from Gravitant, ServiceMesh (now part of CSC), Jamcracker, and Virtustream, you can see the future of the enterprise IT landscape taking shape.