Looking at the latest vRA Cloud Template schema, I spotted something interesting in the definitions: a resource type of “codestream.execution”. This allows you to execute a Code Stream pipeline from within a cloud template. Once deployed, the deployment will feature a resource object, to which you can also link a custom day 2 action!
This opens a lot of future possibilities of creative ways to extend your automation.
The schema looks like the below, and you can follow the rest of this blog for a worked example.
"codestream.execution": { "description": "Request schema for running a pipeline", "properties": { "count": { "default": 1, "description": "The number of resource instances to be created.", "ignoreOnUpdate": true, "title": "Count", "type": "integer" }, "inputs": { "recreateOnUpdate": true, "type": "object" }, "outputs": { "computed": true, "type": "object" }, "pipelineId": { "recreateOnUpdate": true, "type": "string" } }, "type": "object" },
Working Example
Here is a high-level overview of the working example I will describe below.
- Pipeline
- Deploy a Kubernetes namespace and Nginx deployment to a cluster using the Kubernetes endpoint task.
- Cloud Template
- Execute the pipeline with user inputs.
- Custom Day 2 action
- Change the configured replicas in the deployment by running a vRO workflow that calls another pipeline.
Putting it together, the Cloud Template will execute the pipeline, and in the resulting deployment, a Day 2 action will be available.
Pipeline
This is a simple Code Stream pipeline which uses the Kubernetes endpoint task to create a namespace and a deployment configured to run Nginx.
You can find my example on GitHub.
Cloud Template
For the cloud template, you will notice you cannot drag the codestream.execution resource from the selection list on the right-hand side. Instead, you will need to create it manually.
Based on the provided schema, I created the following. You will need the pipeline ID, which can be found by looking at the URL when editing your pipeline, or by gathering it from a GET API call to “/pipeline/api/pipelines”.
https://www.mgmt.cloud.vmware.com/codestream/#/pipelines/4b5583fc-b82f-4f20-b00a-714ac92a2740
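If you prefer to grab the ID from the editor URL rather than the API, a small sketch like the one below can pull it out. This helper is hypothetical (not part of vRA or vRO), and simply matches the UUID segment after `/pipelines/`:

```javascript
// Hypothetical helper: extract the pipeline UUID from a Code Stream
// editor URL, as an alternative to calling the pipelines API.
function extractPipelineId(url) {
  // Match a UUID (8-4-4-4-12 hex groups) following "/pipelines/".
  var match = url.match(
    /\/pipelines\/([0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12})/i
  );
  return match ? match[1] : null;
}

// Example using the URL from this post:
var id = extractPipelineId(
  'https://www.mgmt.cloud.vmware.com/codestream/#/pipelines/4b5583fc-b82f-4f20-b00a-714ac92a2740'
);
// id === '4b5583fc-b82f-4f20-b00a-714ac92a2740'
```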
The inputs property needs to match the inputs of the configured pipeline. Finally, if you are planning on using custom day 2 actions, configure the outputs property.
From this configuration, you will see an object represented on the canvas.
```yaml
formatVersion: 1
inputs:
  namespace:
    type: string
    title: namespace
  replicas:
    type: string
    title: Number of replicas
resources:
  cs.namespace:
    type: codestream.execution
    properties:
      pipelineId: 4b5583fc-b82f-4f20-b00a-714ac92a2740
      namespace: '${input.namespace}'
      inputs:
        namespace: '${input.namespace}'
        replicas: '${input.replicas}'
      outputs:
        computed: true
```
Deploying the cloud template will result in a deployment like the one in the image shown at the start of the blog post.
If you look at the pipeline execution, you will see it has a tag and a comment identifying it as executed by the catalog service.
Custom Day 2 Resource Action
One of the most exciting things about this capability is the ability to link a Custom Day 2 resource action.
In my testing, I built a vRO workflow which reads the data passed in the below property when the workflow is run by vRA as a day 2 action, and then connects to vRA to execute a known pipeline.
```javascript
System.getContext().getParameter('__metadata_resourceProperties');
```
You can see the mapping below when it’s in action.
The pipeline is designed to take an input of a namespace, deployment name and replica count. It then updates the deployment configuration in the Kubernetes cluster.
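As a minimal sketch of the mapping step inside such a vRO workflow, the function below takes the resource properties handed over by vRA and builds the input map for the replica-update pipeline. The property names (`outputs.namespace`, `outputs.deployment`) and the function itself are assumptions from my setup, not fixed vRA names:

```javascript
// Sketch only: map the day 2 resource properties onto the inputs of the
// replica-update pipeline. Property names are assumptions from my setup.
function buildPipelineInputs(resourceProperties, requestedReplicas) {
  // Outputs captured from the original codestream.execution resource.
  var outputs = resourceProperties.outputs || {};
  return {
    namespace: outputs.namespace,       // namespace created by the first pipeline
    deployment: outputs.deployment,     // name of the Nginx deployment
    replicas: String(requestedReplicas) // the pipeline inputs are strings
  };
}

// In vRO, resourceProperties would come from:
//   System.getContext().getParameter('__metadata_resourceProperties');
var inputs = buildPipelineInputs(
  { outputs: { namespace: 'demo', deployment: 'nginx' } },
  3
);
// inputs => { namespace: 'demo', deployment: 'nginx', replicas: '3' }
```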
To create the resource action, the important part is to set the resource type to “codestream.execution”.
When you view your deployment resource objects, you will now see the custom day 2 resource action.
And below we can see the history of the action being executed.
Summary
This was a quick tour of the “feature” I stumbled upon. I managed to get it working quickly. The hardest part for me was getting the vRO workflow to function as a day 2 action with the appropriate inputs, as I wanted it to call another pipeline rather than build a full workflow that interacts with the Kubernetes cluster itself. Either method would suffice.
Regards