Thursday, July 28, 2016

Starting vRO Workflows with Log Insight Webhooks

Beginning with version 3.3 of Log Insight, alerts can be forwarded via a webhook.  Basically, any URL you designate will have an HTTP POST issued with the alert contents as a JSON body.  This feature provides some very basic capability and most use cases will require a bit more functionality.

For example, vRO workflows can be started from the vRO REST API, but you need to authenticate and prepare a JSON body with expected inputs at a minimum.
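For reference, here is roughly what that looks like when done by hand with the Python Requests library.  The hostname and credentials are the lab values used later in this post, the workflow ID is a placeholder, and vRO listens on port 8281 by default:

import requests

vro = "https://vro-01a.corp.local:8281"
workflow_id = "a1b2c3d4-0000-0000-0000-000000000000"   # placeholder - use your workflow's ID

# Every expected workflow input has to be supplied in the vRO parameter format.
body = {
    "parameters": [
        {"name": "alertName", "type": "string",
         "value": {"string": {"value": "DFW SSH drop alert"}}}
    ]
}

response = requests.post(
    vro + "/vco/api/workflows/" + workflow_id + "/executions",
    json=body,
    auth=("administrator@vsphere.local", "VMware1!"),   # or omit auth and rely on .netrc
    verify=False)                                       # lab only - use a trusted certificate otherwise

print(response.status_code)   # 202 means the workflow run was accepted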

I learned of a nice shim, written in Python by some of my peers here at VMware.  You can read more about the general capability of the shim at this link.  The shim had the basic capability I needed; it just didn't support a vRO endpoint.  The authors (Alan Castonguay and Steven Flanders) invited contributions via a pull request, so I added the vRO shim, and I want to provide a little more information about its usage in this blog post.  General install and usage instructions are on the Github page for the shim, so I won't cover those here.

Authentication

The "vrealizeorchestrator.py" shim does not include any authentication information.  By default, Requests library defaults to the user's .netrc file if no auth options are given.  So, you will need to set up a .netrc in the user home directory with the hostname, username and password, i.e.

machine vro-01a.corp.local
login administrator@vsphere.local
password VMware1!

This has the benefit of keeping the credentials out of the shim code itself, without requiring any additional authentication logic.

To Parse or Not to Parse?

The example code shows a workflow that accepts two inputs, both strings.  One is "value", which I will discuss in the next section.  The other, "alertName", is a value that can be retrieved via the parse() function in the main file "runserver.py", which does a nice job of returning a Python object from which you can pull various bits of the alert payload.

Do you need to use the parsed alert?  It depends.  In the use case I was writing this shim for, I needed the entire payload in vRO so I could parse it there.  This is because the alert itself has a lot of variability - it watches for NSX distributed firewall drops on port 22, by src/dst pair.  That src/dst pair could be different each time, and there could be any number of those pairs in a single alert - practically impossible to anticipate as workflow inputs.  On the other hand, if your alert watches a specific system for a specific event... well, you probably don't even need to pass inputs.  Or maybe you just need the system name.  My recommendation is to make your Log Insight alerts as specific as possible and then handle the remaining variables either in the Python code via the parse() function or, when there are multiple variables, by passing the entire alert payload on to vRO for evaluation.
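If all you need is one well-known field, pulling it out in the Python handler is trivial.  A minimal sketch - note that the "AlertName" field name is an assumption here, so check the payload you actually receive (or just use the parse() helper):

import json

def alert_name_from_payload(raw_body):
    # raw_body is the webhook body, e.g. request.get_data() inside the shim.
    # Assumption: Log Insight puts the alert name in an "AlertName" field.
    alert = json.loads(raw_body)
    return alert.get("AlertName", "unknown alert")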

JSON Payload Serialization

I ran into a fun problem when trying to pass a JSON string to vRO as an input.  Basically, it confuses vRO because the JSON string input value gets read as part of the parameter input body.  My workaround was to Base64-encode the JSON string from Log Insight, so that vRO just sees one long, opaque string input and is happy.

So, this is why you see the line:

"value": base64.b64encode(request.get_data())

As discussed above, you might not need to do this.  However, if you do, you'll want to grab the CryptoJS actions package for vRO.  It has a Base64 decode action, and I tested it with the shim.


Comments and feedback are welcome, of course.  The shim is provided as an example and should not be used in production.





Wednesday, March 23, 2016

Retooling the Infoblox vRA Plugin to Support Event Broker

In this blog post I will provide information on using the Infoblox "VMware Cloud Adapter" version 3.2 with the new Event Broker feature of vRA 7.

This does not require any modification of the Infoblox workflows and the adapter will continue to work as written by Infoblox.

This content is intended for readers comfortable with vRealize Automation and vRealize Orchestrator.  As always, I do not assume any risk nor do I provide support for using the modifications outlined below in your environment.

Summary

I will create a workflow wrapper for one of the IPAM workflows to allow it to be executed from an Event Broker subscription.

You can download an example of the workflow from VMware's Sample Exchange.

Background

I just finished a proof-of-concept with a customer where the Infoblox IPAM plugin for vRA/vRO was used to integrate DDI with vRA.  It's a great solution for customers who own or are considering Infoblox IPAM, with one drawback - in vRA version 7, it currently supports only the Workflow Stubs.

While that's not a huge problem, it lacks the flexibility and usefulness of the Event Broker's Event Subscription capability.  If you're not familiar with the Event Broker, you can learn more in this webinar I conducted on vRA 7 extensibility.  Briefly though, the Event Broker replaces the Workflow Stubs beginning in vRA version 7.  Workflow Stubs are still supported for integrations that have already been built around them, but customers and partners are encouraged to transition to the Event Broker.

Environment

I am using:

  • vRealize Automation 7.0.1
    • simple install
    • embedded vRealize Orchestrator instance
  • Infoblox DDI Evaluation
    • NIOS for vSphere 7.2.6
  • Infoblox VMware Cloud Adapter version 3.2
  • "Reserve an IP in a network" will be the methodology used
You can follow the instructions from Infoblox for installation of the VMware Cloud Adapter (which I will refer to as "the plugin" or "IPAM plugin" in this post).  The installation will set up the plugin to use Workflow Stubs, and that's fine.  If you wish you can skip the steps to create the Workflow Stubs but you will still want to create the property group (we will make some small edits later).

The Event Broker Wrapper Workflow

If you watched the video I linked above, you will be familiar with a "wrapper" workflow that I provide for usage with Event Broker subscriptions.  This wrapper will extract the properties provided by the Event Broker input, allowing you to capture any information you need as inputs for the IPAM workflows.
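At its heart, that extraction is a scriptable task that walks the Properties object the subscription passes in.  A minimal sketch (the wrapper's single input is named eventPayloadProperties):

// Log every key:value pair the Event Broker handed us.
for each (var key in eventPayloadProperties.keys) {
    System.log(key + ": " + eventPayloadProperties.get(key));
}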

For example, the IPAM workflow "Reserve an IP in network for VM" has the following inputs:



While the Event Broker provides a single input:


That single input contains most of what we need for the IPAM workflow.  There are a couple of exceptions, and I'll cover how to find those.  For now, let's start with the easy ones.

The Event Broker wrapper workflow simply prints the key:value pairs found in the eventPayloadProperties input, but it's easy enough to assign them to attributes for further processing.  For starters, I make a copy of the Event Broker wrapper workflow and add the IPAM workflow "Get IP in network for VM" to the schema as follows:



Next, I'll promote all of the inputs for the IPAM workflow as attributes:



Next, since I will use the script element "Get payload and execution context" from the EB wrapper, I need to map the output of the script element to those global attributes so that I can update them.



Now, I will make changes to the script in the "Get payload..." element so that I can set the attributes as needed.  Luckily, Infoblox uses the same parameter names as the EB wrapper script!  So, once I have made the attribute bindings the following are already retrieved by the script:

  • vcacVm
  • vCACVmProperties

However, you will need to make some changes.  For starters, the "vCACHost" object is retrieved by the script, but under the variable name "host".  All I did to fix this was search for instances of "host" in the script and replace them with "vCACHost", as below (the red boxes indicate the two occurrences).


Next, for the "virtualMachineEntity" I added a single line of code:


I inserted that just below the block in the last image:

 

For what it's worth, the IPAM workflow I am using takes the vSphere virtual machine object as an input, but it doesn't actually use it for anything - which is great, because at the state this is called from vRA, the VM doesn't yet exist in vSphere!  But I'm including the code anyway, since it will be helpful for other IPAM workflows.

Now, to get the "vCenterVm", I need to look it up by the virtual machine ID provided in the machine properties from the Event Broker, using the vSphere plugin.  The following code block will handle that just fine:
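As a sketch of that lookup - the machine property carrying the UUID and the way it is matched are assumptions, so adjust them for your environment:

// Assumption: vRA passes the VM's UUID in this machine property.
var vmUuid = vCACVmProperties.get("VirtualMachine.Admin.UUID");
vCenterVm = null;
if (vmUuid != null) {
    // Walk every VM known to the registered vCenter instances and match on UUID.
    for each (var vm in VcPlugin.getAllVirtualMachines()) {
        if (vm.config != null &&
            (vm.config.uuid == vmUuid || vm.config.instanceUuid == vmUuid)) {
            vCenterVm = vm;
            break;
        }
    }
}
if (vCenterVm == null) {
    System.log("vCenter VM not found - expected at BuildingMachine PRE, since the VM does not exist yet.");
}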


I inserted that into the script just below the last line I added:




That leaves "externalWFStub" and honestly,  you could just leave this alone and not assign a value.  It is only used in the IPAM workflow for logging information, as shown below in this example workflow log:



I promised no modification of the Infoblox workflows, so I'll stick to that.  Ideally, I could change the IPAM workflow to display "Workflow started from Event Broker" and provide the event information.  But, just to provide some logging info, I will grab the Event Broker payload properties and use those for the "externalWFStub" value, so at least that information is logged on each run.

So, I added the lines below to my wrapper script:
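Roughly along these lines - a sketch, since exactly how you format the string is up to you:

// No real workflow stub is involved, so repurpose externalWFStub to carry some context.
externalWFStub = "Event Broker subscription";
for each (var key in eventPayloadProperties.keys) {
    externalWFStub += " | " + key + "=" + eventPayloadProperties.get(key);
}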


I inserted these just below the last code block I added above.

With those changes, I now have a wrapper for the IPAM workflow that can be used with Event Broker subscriptions.  The other IPAM workflows can be used within the same wrapper, although you should verify that the inputs are handled by the wrapper script as I did for this IPAM workflow.

Preparing vRA 7

If you followed the installation instructions for the Infoblox VMware Cloud Adapter, then you should have one or more Property Groups in vRA.  These property groups are used to pass the property values that the IPAM workflows need.  However, these groups also activate the WFStubs, and since I am using the Event Broker, I don't want those WFStubs to execute.

I made a copy of the property group "Infoblox Reserve IP in Network" as below:



I renamed the new Property Group "EB InfoBlox Reserve IP in Network" and removed any properties that reference WFStubs and saved the Property Group.



Now I can associate this Property Group with a blueprint (and disassociate the Infoblox created Property Groups if needed).

Creating an Event Broker subscription is beyond the scope of this post, but you can find more information about that in the video recording I linked above.  A few things to note on the subscription:


  • I used the Lifecycle State Name "VMPSMasterWorkflow32.BuildingMachine" and Lifecycle State Phase "PRE", as this equates to the WFStubBuildingMachine stub.
  • I made the Event Broker subscription a blocking subscription with a suitable timeout.  Obviously, you need the IP information before the build should be allowed to continue.

There's a lot of duplicate logging, since both the workflow wrapper and the IPAM workflow spit out the machine properties.  If I were running this in production, I'd probably trim a lot of that logging out or enable it only for debugging.

Finally, keep in mind that this is only for the IP provisioning IPAM workflow.  You will need to wrap the IP deprovisioning IPAM workflow as well and create an Event Broker subscription for that.

**UPDATE** Some readers have reported that the workflow package I linked will not import into vRO.  So, here is a link to the scriptable task within the wrapper workflow:

http://pastebin.com/NJExZ6pW


Wednesday, December 2, 2015

Captain's Log, Supplemental - HOL-SDC-1602, Migrating to VDS Using Host Profiles

As a Lab Captain for VMware's Hands-on-Labs (HOL) this year, I enjoyed creating content for the lab "vSphere with Operations Management 6- Advanced Topics" (HOL-SDC-1602).  However, I had to cut some of the content due to the length of the lab.

One of the topics I had to cut was "Migrating to the vSphere Distributed Switch," where you learn how to migrate using the Host Profiles method.  In the official lab guide, you use the vSphere Web Client to perform the migration from a vSphere Standard Switch (VSS) to a vSphere Distributed Switch (VDS).  However, another method exists that uses Host Profiles; it works much better at scale and ensures that any new hosts will be set up with the appropriate networking.

You can find the supplemental guide here.  The HOLs are available here, and registration is free and easy.

Friday, November 20, 2015

Get A List of vR Ops Available Metrics

When building dashboards, reports, or even symptoms and alerts in vRealize Operations Manager 6, it may be helpful to have a handy reference of the metrics available in a given instance of vR Ops.  This depends on the solutions installed, of course, so it will vary from instance to instance.

Also, with the vRealize Operations Manager 6.1 Endpoint Operations capability, it may be useful to have a list of all metrics available after installing a new solution or plugin (such as SQL or IIS).

I have created a vRealize Orchestrator workflow and action to provide an HTML output of available metrics for a given solution.  In vR Ops API terms, solutions are "adapter kinds", so the workflow input and output use this nomenclature - but the two terms mean the same thing.
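Under the covers, this boils down to a call to the vR Ops suite API.  A sketch of the kind of request involved, using the vRO HTTP-REST plugin - the endpoint path reflects my reading of the 6.x suite-api, restHost is assumed to be a REST:RESTHost already configured for your vR Ops node, and adapterKindKey/resourceKindKey would be workflow inputs:

// List the stat keys (metrics) for one resource kind of one adapter kind.
var path = "/suite-api/api/adapterkinds/" + adapterKindKey +
    "/resourcekinds/" + resourceKindKey + "/statkeys";
var request = restHost.createRequest("GET", path, null);
request.setHeader("Accept", "application/json");

var response = request.execute();
if (response.statusCode != 200) {
    throw "vR Ops API returned HTTP " + response.statusCode;
}
System.log(response.contentAsString);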

You can download the solution from FlowGrab here.  Sample output below.

Thursday, November 5, 2015

Cannot Start a VM - A general system error occurred: Connection refused

The error referenced in the title is one I ran into today while trying to start some VMs.  I finally found a KB somewhat related to this issue, and its workaround did the trick for me.

Basically, the vCenter Workflow Manager service had stopped on my vCenter server.  In my case, I'm running the 6.0.0 vApp, so I had to open an SSH session and enter the following:

shell.set --enabled true
shell
service-control --status vmware-vpx-workflow
   (this command confirmed the service was stopped)
service-control --start vmware-vpx-workflow

Once running, I was able to start VMs again from vCenter.

Not sure why this was so hard to find in a web search but hopefully the search engines will pick up this blog post and save people some time!

See also:

VMware KB: VMware NSX for vSphere 6.x Controller deployment fails the error: Failed to power on VM NSX Controller
VMware KB: Stopping, starting, or restarting VMware vCenter Server Appliance 6.0 services

Friday, July 24, 2015

Using Postman to Explore the vRealize Operations Manager 6 API

If you are like me, REST was a concept you understood, but as a vAdmin you had no exposure to actually using it for automation tasks.  In short, the REST API provided with vR Ops 6 allows you to interact with vR Ops through scripts and workflows (e.g., vRealize Orchestrator).  But before jumping into that, it is helpful to understand, "How do I get to the REST API and what can I do?"

For that, I have prepared a video walking you through a set of Postman REST calls I put together in a collection.  I have grown to like Postman, which is a Google Chrome extension, as my REST client of choice.  Using Postman, I can formulate and test REST calls that I may wish to use with scripts and workflows, or just quickly grab information or make changes.

If you don't have Postman installed, it's easy enough to start.  Just install the extension; it's free (although there are additional costs for advanced features - I don't use any in this tutorial).

Next, you will need the link to my Postman collection for vR Ops 6.  Once you have that, you are ready to watch and then practice on your own.  You will, of course, also need your own instance of vR Ops to experiment with and I highly recommend you do this on a non-production system.
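To give you a flavor before you watch, a typical call to the vR Ops 6 REST API looks something like the request below - the hostname is a placeholder, and the credentials are a vR Ops local account passed via basic authentication:

GET https://vrops-01a.corp.local/suite-api/api/resources
Accept: application/json
Authorization: Basic <base64 of username:password>

In Postman, that means pasting the URL, adding the Accept header on the Headers tab, and choosing Basic Auth on the Authorization tab.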


Tuesday, April 14, 2015

Adding New Property Sets in vRealize Automation 6

Build Profiles in vRA are a great time saver for managing sets of custom properties that share a common purpose.  When you create Build Profiles, Property Sets let you add whole groups of custom properties without having to refer back to the vRA documentation.


It would be handy to create your own property sets - for example, I'm always looking up the properties needed to invoke the Guest Agent.  I just can't seem to remember them!

Fortunately, you can do this in vRA by importing an XML document containing the properties you want in a custom Property Set.

The format for the XML is -

<?xml version="1.0" encoding="UTF-16"?>

<Doc>
  <CustomProperties>
    <Property Name="propertyname" DefaultValue="somevalue" Encrypted="true_false" PromptUser="true_false"/>
  </CustomProperties>
</Doc>

So, in my case, I created an XML document for Guest Agent properties as follows -

<?xml version="1.0" encoding="UTF-16"?>

<Doc>
  <CustomProperties>
    <Property Name="VirtualMachine.Admin.UseGuestAgent" DefaultValue="true" Encrypted="false" PromptUser="false"/>
<Property Name="VirtualMachine.Admin.CustomizeGuestOSDely" Encrypted="false" PromptUser="false"/>
<Property Name="VirtualMachine.Customize.WaitComplete" DefaultValue="true" Encrypted="false" PromptUser="false"/>
<Property Name="VirtualMachine.Software0.Name" Encrypted="false" PromptUser="false"/>
<Property Name="VirtualMachine.Software0.ScriptPath" Encrypted="false" PromptUser="false"/>
  </CustomProperties>
</Doc>

From the Infrastructure tab > Blueprints > Build Profiles section in the UI, you can select the Manage Property Sets link to import the XML (scroll down to the bottom of the screen) -




Once imported, your Property Set is ready to use in Build Profiles -