Generate a GitHub README.md file from your PowerShell module help

When publishing a script or a module to a GitHub repository, it is a best practice to add a README.md file in the same folder as the script file. That way, anyone who stumbles upon the script on GitHub is presented with brief documentation of what the script does and how to use it.

I write exclusively PowerShell modules (.psm1) nowadays, as opposed to scripts (.ps1). The main reasons for that are : I want to build tools which behave just like native cmdlets and I want my tools to be easily reusable. Documentation is an important part of making a module reusable by anyone.

Even though I’m a big believer in documentation, sometimes, I have a hard time practicing what I preach. After working for a few hours on a module, I just want to upload it and be done with it. Exploring, tinkering and scripting in the shell and/or the ISE is the fun part, writing the documentation is the boring part. But wait, what do we do with the boring stuff ?
We AU… TO… MATE it !

If you look this up on the internet, you will find PowerShell scripts which do this, but they parse the text content of the script file to extract the desired information. Text-parsing in PowerShell… Ugh. It’s so against nature, it makes me feel dirty.

Usually, I already have comment-based help in the module, so let’s use it by importing the module and leveraging the object-oriented output of Get-Help. As a bonus, this approach will work with any PowerShell module (script or compiled) and any help (comment-based or XML-based).

PS C:\> Import-Module "C:\GitHub\Powershell-Utility\Example\Example.psm1"
PS C:\> $FullModulePath = Resolve-Path -Path "C:\GitHub\Powershell-Utility\Example\Example.psm1"
PS C:\> $Module = Get-Module | Where-Object { $_.Path -eq $FullModulePath }
PS C:\> $Module

ModuleType Version    Name                                ExportedCommands
---------- -------    ----                                ----------------
Script     0.0        Example                             {Get-Nothing, Set-Nothing}

To illustrate how to do this, we are working with a module called Example.

The README.md is a text file so, from a PowerShell perspective, it will be an array of strings. To prepare it, it’s as simple as that :

PS C:\> $Readme = @()

It is empty, but we are going to populate it as we go.

PS C:\> $Commands = Get-Command -Module $Module
PS C:\> $Readme += "##Description :"
PS C:\> $Readme += "`n`r"
PS C:\> $CommandsCount = $($Commands.Count)
PS C:\> $Readme += "This module contains $CommandsCount cmdlets :  "
PS C:\> $Readme
##Description :

This module contains 2 cmdlets :

First, we get the commands in the module and store them in $Commands.
“##” is a markdown tag to mark the following text as a second level heading, like h2 in HTML.
`n is the new line character and `r is the carriage return.
By the way, `n doesn’t work well in GitHub markdown. So here is a trick : to start a new line without starting a new paragraph, you need to add 2 trailing spaces at the end of the line.

Then, $Commands.Count gives us the number of commands in the module. We store it in the variable $CommandsCount, which allows us to document the number of cmdlets in the module.

Then, we add the name of each cmdlet :

Foreach ($Command in $Commands) {
    $Readme += "**$($Command.Name)**  "
}

"**" is a markdown tag. The text surrounded by "**" will be in bold.

Now, I want to document the PowerShell version required by the module. This is not in the help, but this is specified in the #Requires statement at the very beginning of the module, so we can get to it this way :

$FirstLine = $Module.Definition -split "`n" | Select-Object -First 1
If ($FirstLine -like "#Requires*") {
    $PSVersionRequired = $FirstLine -split " " | Select-Object -Last 1
}
If ($PSVersionRequired) {
    $Readme += "It requires PowerShell version $PSVersionRequired (or later)."
}
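If you would rather not split strings at all, the PowerShell parser (available since PowerShell 3.0) can read the #Requires statement for us. This is an alternative sketch, not what the module above does :

```powershell
# Alternative : let the PowerShell parser extract the #Requires version
$Tokens = $null ; $ParseErrors = $null
$Ast = [System.Management.Automation.Language.Parser]::ParseFile(
    "C:\GitHub\Powershell-Utility\Example\Example.psm1", [ref]$Tokens, [ref]$ParseErrors)

# ScriptRequirements exposes the parsed #Requires statement, if any
$PSVersionRequired = $Ast.ScriptRequirements.RequiredPSVersion
If ($PSVersionRequired) {
    $Readme += "It requires PowerShell version $PSVersionRequired (or later)."
}
```

This avoids any assumption about the #Requires line being first in the file.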

Now, let’s start the cmdlet-specific section :

PS C:\> $Readme += "`n`r"
PS C:\> $Name = $Commands[0].Name
PS C:\> $Readme += "##$Name :"
PS C:\> $Readme += "`n`r"
PS C:\> $HelpInfo = Get-Help $Name -Full
PS C:\> $Readme += $HelpInfo.description
PS C:\> $Readme
##Description :

This module contains 2 cmdlets :
It requires PowerShell version 4 (or later).

##Get-Nothing :

This cmdlet does absolutely nothing and does it remarkably well.
It takes objects as input and it outputs nothing.

We run Get-Help for the first cmdlet in our module, using the -Full parameter to get everything from the Help and store that in a variable $HelpInfo.
Then, the description of the cmdlet is easily retrieved from the description property of $HelpInfo.

Then, let’s get to the parameters section of our documentation.

For each parameter, we want to document its name, description, and if it has a default value, a user-friendly sentence explaining its default value. It turns out that everything we need is in the “parameters” property of our variable $HelpInfo :

PS C:\> $CommandParams = $HelpInfo.parameters.parameter
PS C:\> $CommandParams

-InputObject <PSObject[]>
    Specifies one or more object(s) to get.
    It can be string(s), integer(s), file(s), any type of object.

    Required?                    true
    Position?                    1
    Default value                (Get-Item *)
    Accept pipeline input?       true (ByValue)
    Accept wildcard characters?  false

-Filter <String>
    Specifies a filter in the provider's format or language. The value of this parameter qualifies
    the InputObject.
    The syntax of the filter, including the use of wildcards, or regular expressions, depends on
    the provider.

    Required?                    false
    Position?                    2
    Default value
    Accept pipeline input?       false
    Accept wildcard characters?  false


Let’s add this to the README :

PS C:\> $Readme += "###Parameters :"
PS C:\> $Readme += "`n`r"
PS C:\> Foreach ($CommandParam in $CommandParams) {
           $Readme += "**" + $($CommandParam.Name) + " :** " + $($CommandParam.description.Text) + "  "

           If ( $($CommandParam.defaultValue) ) {
               $ParamDefault = $($CommandParam.defaultValue).ToString()
               $Readme += "If not specified, it defaults to $ParamDefault ."
           }
           $Readme += "`n`r"
        }
PS C:\>

The default value for a parameter might not be a string, so we used the method ToString() which can convert any object to a string.

I usually give 2 or 3 usage examples in the comment-based help and this is the case in our Example module. So let’s add these examples into our README :

PS C:\> $Readme += "###Examples :`n`r"
PS C:\> $Readme += $HelpInfo.examples | Out-String
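All that is left is to write the array to disk. Here is a minimal way to finish (putting the file next to the module is my assumption, only the README.md file name matters to GitHub) :

```powershell
# Write the accumulated lines to a README.md next to the module file
$ReadmePath = Join-Path -Path (Split-Path -Path $FullModulePath) -ChildPath "README.md"
$Readme | Out-File -FilePath $ReadmePath
```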

And here is how it looks on GitHub :

README Screenshot

You can see it for yourselves here.

With these little tools in my belt, I wrote a PowerShell module named ReadmeFromHelp and it is available on GitHub. It contains one cmdlet, New-ReadmeFromHelp, and the README.md for this module has been automatically generated by… itself.

Configure ESXi host RAMdisks with PowerCLI

The symptoms of a full RAMdisk on an ESXi host can be pretty nasty and diverse. The possible causes are also very diverse (search “ramdisk full” in the VMware Knowledge Base, you will see many articles). Also, it can affect the “root”, “tmp”, or even the “hostdstats” RAMdisk, depending on the cause, so this is not easy to troubleshoot.

To help prevent this type of issue, we can increase the size of the ESXi RAMdisks by increasing their memory reservation and memory limit, and by setting their reservation as “expandable”, just like in resource pools.

This corresponds to settings in “System Resource Allocation” :

System resources allocation

Let’s see how we can configure this in a more automated way, with PowerCLI.

PS C:\> $ESXiHosts = Get-VMHost
PS C:\> $Spec = New-Object VMware.Vim.HostSystemResourceInfo

Here, we save all our ESXi Hosts into a variable for later use, because we want to configure all the ESXi hosts in the vCenter. We also create a new, empty HostSystemResourceInfo object, which we are going to populate with the memory settings we want.

Now, the tricky part is to use the appropriate key, depending on the RAMdisk we want to configure. This can be one of 3 possible RAMdisks that we might want to configure, so this is a good candidate for a Switch statement :

PS C:\> $RamDisk = "tmp"

PS C:\> switch ($RamDisk) {
             'tmp' {$Spec.Key = "host/system/kernel/kmanaged/visorfs/tmp"}
             'root' {$Spec.Key = "host/system/kernel/kmanaged/visorfs/root"}
             'hostdstats' {$Spec.Key = "host/system/kernel/kmanaged/visorfs/hostdstats"}
         }

As an example, we are going to configure the “tmp” RAMdisk.
Then, we create a new, empty ResourceConfigSpec object and store it into our Config property :

PS C:\> $Spec.Config = New-Object VMware.Vim.ResourceConfigSpec
PS C:\> $Spec.Config

Entity           :
ChangeVersion    :
LastModified     :
CpuAllocation    :
MemoryAllocation :
LinkedView       :
DynamicType      :
DynamicProperty  :

Even though the CPU allocation is not applicable to a RAMdisk, we need to create one and assign it to the CpuAllocation property of our ResourceConfigSpec. Why ? Because the vSphere API won’t let us apply the ResourceConfigSpec to a host if the CpuAllocation or the MemoryAllocation property is null.

PS C:\> $Spec.Config.cpuAllocation = New-Object VMware.Vim.ResourceAllocationInfo

Now, let’s set the memory reservation to 30 MB, the limit to 400 MB and the reservation as expandable. Expandable reservation means that more than the reservation can be allocated to the RAMdisk if there are available resources in the parent resource pool.

PS C:\> $Spec.Config.memoryAllocation = New-Object VMware.Vim.ResourceAllocationInfo
PS C:\> $Spec.Config.memoryAllocation.Reservation = 30
PS C:\> $Spec.Config.memoryAllocation.Limit = 400
PS C:\> $Spec.Config.memoryAllocation.ExpandableReservation = $True

Now, it’s time to apply the configuration to each individual ESXi host :

Foreach ($ESXiHost in $ESXiHosts) {
    $Spec.Config.ChangeVersion = $ESXiHost.ExtensionData.SystemResources.Config.ChangeVersion
    $ESXiHost.ExtensionData.UpdateSystemResources($Spec)
}

What is this ChangeVersion business ?

We get the version identifier of the current ESXi host configuration and we make sure the ChangeVersion property in our ResourceConfigSpec matches with it. This is to prevent problems in case the ESXi host configuration was changed between the moment we last read it and the moment we apply a new ResourceConfigSpec to it. For more information, you can refer to this documentation page.

Lastly, we apply the resource allocation settings contained in our $Spec, using the method UpdateSystemResources of our HostSystem view (we used the ExtensionData property above, but it is the same as a view).

Putting it all together :
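Condensed into one block, the procedure looks like this (same names and values as above; consider it a lab sketch rather than a production script) :

```powershell
# Sketch : resize the "tmp" RAMdisk on every host in the vCenter
$ESXiHosts = Get-VMHost
$RamDisk = "tmp"

$Spec = New-Object VMware.Vim.HostSystemResourceInfo
switch ($RamDisk) {
    'tmp' {$Spec.Key = "host/system/kernel/kmanaged/visorfs/tmp"}
    'root' {$Spec.Key = "host/system/kernel/kmanaged/visorfs/root"}
    'hostdstats' {$Spec.Key = "host/system/kernel/kmanaged/visorfs/hostdstats"}
}

$Spec.Config = New-Object VMware.Vim.ResourceConfigSpec
# The API refuses a null CpuAllocation, even though it is irrelevant to a RAMdisk
$Spec.Config.cpuAllocation = New-Object VMware.Vim.ResourceAllocationInfo
$Spec.Config.memoryAllocation = New-Object VMware.Vim.ResourceAllocationInfo
$Spec.Config.memoryAllocation.Reservation = 30
$Spec.Config.memoryAllocation.Limit = 400
$Spec.Config.memoryAllocation.ExpandableReservation = $True

Foreach ($ESXiHost in $ESXiHosts) {
    # Match the current config version to avoid clobbering concurrent changes
    $Spec.Config.ChangeVersion = $ESXiHost.ExtensionData.SystemResources.Config.ChangeVersion
    $ESXiHost.ExtensionData.UpdateSystemResources($Spec)
}
```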

Using these techniques, I wrote a function called Set-VMHostRamDisk and packaged it in a module available here.

As you can see below, it is fully parameterized and accepts one or multiple ESXi hosts from the pipeline :


I took the time to write a proper comment-based help, so if you need more information on how to use the function, Get-Help is your BFF.

A PowerCLI alternative to the Storage Reports feature

As you may know, the Storage Views and Storage Reports features have been removed from vSphere 6. Here is the official (and laconic) statement from the vSphere 6.0 release notes :

“vSphere Web Client. The Storage Reports selection from an object’s Monitor tab is no longer available in the vSphere 6.0 Web Client.
vSphere Client. The Storage Views tab is no longer available in the vSphere 6.0 Client.”

Quite a few customers were unhappy and asking what we were offering as a replacement/alternative.

To ease the pain of some customers, I wrote a PowerCLI alternative for the defunct Storage Reports feature. It is also a good way to showcase PowerCLI capabilities because it is a very typical usage of PowerCLI : extracting the information you need from different objects and grouping these pieces of information into custom objects.

The resulting PowerCLI module, Get-StorageViewsReport.psm1, made its way to a public knowledge base article with examples and screenshots of its usage. So, all you need to know to use this module is in the KB article and in the module Help accessible via Get-Help.

It obtains storage capacity and utilization information by datastore, by VM or by ESXi host. It provides the same information as the Storage Views reports.

It requires PowerCLI 5.5 or later and PowerShell 3.0 or later.

You can download the module from the KB article, but that copy is not kept up-to-date, so I would recommend getting it from GitHub to get the latest version. The GitHub version adds support for PowerCLI 6.0.

What’s the difference between ToolsVersionStatus and ToolsVersionStatus2 ?

Recently, I had a customer who wanted to check if the VMware Tools were installed and up-to-date using PowerCLI.

A relatively easy way to do this is with a VirtualMachine view or the ExtensionData of a VM object :

PS C:\> $VMView = Get-VM -Name Test-VM | Get-View

PS C:\> $VMView.Summary.Guest

GuestId             :
GuestFullName       :
ToolsStatus         : toolsNotRunning
ToolsVersionStatus  : guestToolsCurrent
ToolsVersionStatus2 : guestToolsCurrent
ToolsRunningStatus  : guestToolsNotRunning
HostName            :
IpAddress           :
DynamicType         :
DynamicProperty     :

Looking at the above output, his question was :

“Should I use ToolsVersionStatus or ToolsVersionStatus2 ? What is the difference between these two ?”

He was using vSphere 5.5, so a good place to start is the vSphere API Reference Documentation for vSphere 5.5.
But what are we looking for ?
To search in the API documentation, we first need to know the type of the object we are investigating :

PS C:\> $VMView.Summary.Guest.GetType()

IsPublic IsSerial Name                                     BaseType
-------- -------- ----                                     --------
True     False    VirtualMachineGuestSummary               VMware.Vim.DynamicData


So, in our case we are going to look for the data object type : VirtualMachineGuestSummary.

API documentation

After locating VirtualMachineGuestSummary in the long list of data object types, we can see the properties for this type :

toolsVersionStatus :
Deprecated. As of vSphere API 5.0 use toolsVersionStatus2. Current version status of VMware Tools in the guest operating system, if known. Since vSphere API 4.0

toolsVersionStatus2 :
Current version status of VMware Tools in the guest operating system, if known.
Since vSphere API 5.0

So “toolsVersionStatus” is deprecated and its younger brother should be used if we use the vSphere API 5.0 or later.

This leads to yet another question : What if the vCenter Server is 5.5 but it has some old 4.x ESXi hosts and I might connect PowerCLI directly to these hosts ? How can I verify which version of the vSphere API I am querying ?

It is pretty simple because we already have a VirtualMachine view ($VMView) and PowerCLI views have a property named “Version” which is nested in “Client” and provides the API version for the current view :

PS C:\> $VMView.Client

Version          : Vim55
VimService       : VimApi_55.VimService
ServiceContent   : VMware.Vim.ServiceContent
ServiceUrl       :
ServiceTimeout   : 100000
CertificateError : System.EventHandler`1[VMware.Vim.CertificateErrorEventArg]


So to check if the tools are up-to-date or not, taking into account a possible ESXi 4.x host, we could do something like this :

If ( $($VMView.Client.Version.ToString()) -like "Vim4*" ) {

    $Props = @{'Name'=$VM.Name
               'ToolsVersionStatus'=$VMView.Summary.Guest.ToolsVersionStatus}
}
Else {
    $Props = @{'Name'=$VM.Name
               'ToolsVersionStatus'=$VMView.Summary.Guest.ToolsVersionStatus2}
}
$CustomObj = New-Object -TypeName psobject -Property $Props


Automate the discovery of mandatory parameters

Sometimes, when trying out a cmdlet I rarely use, I get that :

Missing Mandatory parameter

This means I forgot to enter a parameter which is mandatory for this cmdlet. PowerShell is very forgiving and asks me nicely to enter a value for this parameter.

You see, learning PowerShell is not about rote knowledge of every single cmdlet. We, IT pros, have other things to do with our time than memorizing thousands of cmdlets and parameters. Thankfully, PowerShell has been designed to be highly discoverable. There are plenty of tools built into PowerShell which allow us to discover cmdlets and their parameters.

Let’s explore some of these tools and see how we can automate the discovery of mandatory parameters.

PS C:\> $CmdString = "Register-ScheduledJob"
PS C:\> Get-Command $CmdString | Select-Object -ExpandProperty Parameters

Key                                                Value
---                                                -----
FilePath                                           System.Management.Automation.ParameterMetadata
ScriptBlock                                        System.Management.Automation.ParameterMetadata
Name                                               System.Management.Automation.ParameterMetadata
Trigger                                            System.Management.Automation.ParameterMetadata
InitializationScript                               System.Management.Automation.ParameterMetadata
RunAs32                                            System.Management.Automation.ParameterMetadata
Credential                                         System.Management.Automation.ParameterMetadata
Authentication                                     System.Management.Automation.ParameterMetadata
ScheduledJobOption                                 System.Management.Automation.ParameterMetadata
ArgumentList                                       System.Management.Automation.ParameterMetadata
MaxResultCount                                     System.Management.Automation.ParameterMetadata
RunNow                                             System.Management.Automation.ParameterMetadata
Verbose                                            System.Management.Automation.ParameterMetadata
Debug                                              System.Management.Automation.ParameterMetadata
ErrorAction                                        System.Management.Automation.ParameterMetadata
WarningAction                                      System.Management.Automation.ParameterMetadata
ErrorVariable                                      System.Management.Automation.ParameterMetadata
WarningVariable                                    System.Management.Automation.ParameterMetadata
OutVariable                                        System.Management.Automation.ParameterMetadata
OutBuffer                                          System.Management.Automation.ParameterMetadata
PipelineVariable                                   System.Management.Automation.ParameterMetadata
WhatIf                                             System.Management.Automation.ParameterMetadata
Confirm                                            System.Management.Automation.ParameterMetadata

That was easy.
But hold on, what I want is only the mandatory parameters.
Also, I want everything I need to know to test a parameter and how it may work (or not work) with other parameters : its parameter set, the data type it expects, its position if it is positional, whether it accepts pipeline input and whether it accepts wildcards.

Let’s start with the parameter sets. If you don’t know what parameter sets are, essentially, they are a way to exclude 2 parameters of a cmdlet from each other to make sure that these 2 parameters won’t be used at the same time.

PS C:\> $CmdData = Get-Command $CmdString
PS C:\> $CmdData.ParameterSets.Name
ScriptBlock
FilePath

Here, we see that the cmdlet Register-ScheduledJob has 2 parameter sets : ScriptBlock and FilePath. Parameter(s) which are in one parameter set but not in any other set are what I call “exclusive parameters”. Exclusive parameters cannot be used in the same command as any other exclusive parameter from another set.

Here is how to identify these “exclusive” parameters :

Compare Parameter Sets

The name of each parameter set corresponds to the exclusive parameter in that set. Many cmdlets’ parameter sets are designed that way.
So, if we use the -FilePath parameter, this puts the cmdlet Register-ScheduledJob in the FilePath mode, which prevents us from using the -ScriptBlock parameter. And vice versa.

Now, we select only the mandatory parameters :

Foreach ($ParameterSet in $CmdData.ParameterSets) {
    $MandatoryParams = $ParameterSet.Parameters | Where-Object { $_.IsMandatory }
    $MandatoryParams
}

But, it turns out that the Name parameter is displayed twice. This is because it is mandatory in both parameter sets. We don’t want duplicate parameters, so let’s do it another way.

The nested property called “Attributes” has some juicy bits for us :

Parameter attributes

So here is how we filter only mandatory parameters :

$MandatoryParameters = $CmdData.Parameters.Values | Where { $_.Attributes.Mandatory -eq $True }

Then, adding the parameter position, accepted data type, and parameter set is pretty simple because these are properties of our current objects, or of the nested Attributes property. Let’s build a custom object from that :

Foreach ( $MandatoryParameter in $MandatoryParameters ) {

    $Props = [ordered]@{'Name'=$MandatoryParameter.Name
                    'Parameter Set'=$MandatoryParameter.Attributes.ParameterSetName
                    'Data Type'=$MandatoryParameter.ParameterType}

    $Obj = New-Object -TypeName psobject -Property $Props
    $Obj
}

Now, there are 2 more properties we want to add to our parameter objects : whether they accept input from the pipeline and whether they accept wildcards. To this end, we are going to use another invaluable discoverability tool : Get-Help.

PS C:\> Get-Help $CmdString -Parameter $MandatoryParameters[0].Name

-FilePath <String>
    Specifies a script that the scheduled job runs. Enter the path to a .ps1 file on the local computer. To
    specify default values for the script parameters, use the ArgumentList parameter. Every
    Register-ScheduledJob command must use either the ScriptBlock or FilePath parameters.

    Required?                    true
    Position?                    2
    Default value                None
    Accept pipeline input?       false
    Accept wildcard characters?  false

It seems we have what we need here, but is the property regarding pipeline input really named “Accept pipeline input?” ? Is the property regarding the wildcard characters really named “Accept wildcard characters?” ?

No, this is the result of the default formatting view for MamlCommandHelpInfo#parameter type. We override the default formatting to get the actual property names :

PS C:\> Get-Help $CmdString -Parameter $MandatoryParameters[0].Name | Format-List

description    : {@{Text=Specifies a script that the scheduled job runs. Enter the path to a .ps1 file on the
                 local computer. To specify default values for the script parameters, use the ArgumentList
                 parameter. Every Register-ScheduledJob command must use either the ScriptBlock or FilePath
                 parameters.}}
defaultValue   : None
parameterValue : String
name           : FilePath
type           : @{name=String; uri=}
required       : true
variableLength : false
globbing       : false
pipelineInput  : false
position       : 2
aliases        :

So the properties we are interested in are named : “pipelineInput” and “globbing”. Let’s add them to our custom object :

Foreach ( $MandatoryParameter in $MandatoryParameters ) {

    $ParameterHelp = Get-Help $CmdString -Parameter $MandatoryParameter.Name

    $Props = [ordered]@{'Name'=$MandatoryParameter.Name
                    'Parameter Set'=$MandatoryParameter.Attributes.ParameterSetName
                    'Data Type'=$MandatoryParameter.ParameterType
                    'Pipeline Input'=$ParameterHelp.pipelineInput
                    'Accepts Wildcards'=$ParameterHelp.globbing}

    $Obj = New-Object -TypeName psobject -Property $Props
    $Obj
}

As a bonus, we can make this work not just for cmdlets and functions, but for aliases as well. Aliases have a property named “Definition” which provides the name of the command the alias points to. So, if the user inputs an alias, we can use this to resolve the alias to the actual cmdlet or function, like so :

If ($CmdData.CommandType -eq "Alias") {
    $CmdData = Get-Command (Get-Alias $CmdString).Definition
}
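For example, with the built-in alias gcm (a standard alias for Get-Command) :

```powershell
# Resolve an alias to the cmdlet it points to
$CmdString = "gcm"
$CmdData = Get-Command $CmdString

If ($CmdData.CommandType -eq "Alias") {
    $CmdData = Get-Command (Get-Alias $CmdString).Definition
}
$CmdData.Name   # Get-Command
```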

Putting it all together :

The end result is a function using the techniques explained above :

function Get-MandatoryParameters {
    [CmdletBinding()]
    Param(
        [Parameter(Mandatory=$True,Position=0)]
        [ValidateScript({ Get-Command $_ -ErrorAction SilentlyContinue })]
        [string]$CmdString
    )

    $CmdData = Get-Command $CmdString

    # If the $CmdString provided by the user is an alias, resolve to the cmdlet name
    If ($CmdData.CommandType -eq "Alias") {
        $CmdData = Get-Command (Get-Alias $CmdString).Definition
    }

    $MandatoryParameters = $CmdData.Parameters.Values | Where { $_.Attributes.Mandatory -eq $True }

    Foreach ( $MandatoryParameter in $MandatoryParameters ) {

        $ParameterHelp = Get-Help $CmdString -Parameter $MandatoryParameter.Name

        $Props = [ordered]@{'Name'=$MandatoryParameter.Name
                        'Parameter Set'=$MandatoryParameter.Attributes.ParameterSetName
                        'Data Type'=$MandatoryParameter.ParameterType
                        'Pipeline Input'=$ParameterHelp.pipelineInput
                        'Accepts Wildcards'=$ParameterHelp.globbing}

        $Obj = New-Object -TypeName psobject -Property $Props
        $Obj
    }
}

Here is what the output of this function looks like :

Get-MandatoryParameters Output

Backup/restore vCenter tags and tag assignments

Tags were introduced in vSphere 5.5 ; they are very versatile and are gaining more and more adoption. Because of this, resetting the Inventory Service database is becoming more and more problematic.
Remember, whenever you reset the Inventory Service database, you lose all tags.

I tested this in my lab : I reset the Inventory Service DB using my PowerShell script Reset-ISDatabase.

Then, I checked back in the Web Client, and indeed, the tags, tag categories and tag assignments (which vCenter objects the tags are associated to) are all gone :

Tags are deleted

So, we need a solution to backup tags, categories and tag assignments to be able to restore them if we lose them all.
This script is great, but it doesn’t take the tag assignments into account (that’s not its purpose).

For the sake of reference, here is the current state of the tags, categories and assignments in my test environment:

Tags and Assignments

Now, let’s see how we can export this information.

$TagCategories = Get-TagCategory
$Tags = Get-Tag
$TagAssignments = Get-TagAssignment

# Grouping the tag categories, the tags and the tag assignments into an array
$ExportArray = @($TagCategories,$Tags,$TagAssignments)

Export-Clixml -InputObject $ExportArray -Path Exported.xml

We group the tag categories, tags and tag assignments into an array because even though the cmdlet Export-Clixml seems to take multiple objects without complaining, according to Get-Help, it is supposed to only take one input object at a time.

We chose to export these objects to XML, rather than CSV because XML format is better suited for complex objects : objects with multi-valued properties or properties containing nested properties.
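To see why XML is the better fit, here is a quick round-trip with a plain stand-in object mimicking a tag category (no PowerCLI needed, the names are illustrative) :

```powershell
# A stand-in object with a multi-valued property, like a tag category
$Category = [pscustomobject]@{
    Name       = 'Storage'
    EntityType = @('Datastore','DatastoreCluster')
}

$Path = Join-Path ([IO.Path]::GetTempPath()) 'CategoryDemo.xml'
$Category | Export-Clixml -Path $Path
$Restored = Import-Clixml -Path $Path

# Both values of the multi-valued property survive the round-trip
$Restored.EntityType
```

A CSV export would have flattened EntityType into a single string.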

Now, let’s see how to use this XML file to restore all the categories, tags and assignments :

We import the data from the XML file back into PowerShell objects, like so :

$Import = Import-Clixml -Path Exported.xml

Remember it is an array, so we need to access the first element in the array to get the categories, the second element to get the tags and the third element to get the assignments.
For example, here is how to get the categories :

$Import[0] | fl *

Description : 
Cardinality : Single
EntityType  : {Datastore, DatastoreCluster}
Id          : InventoryServiceCategory-5ec0a856-78a4-4c3c-afb3-1c7d62b92537
Name        : Storage
Uid         : /VIServer=administrator@
Client      : VMware.VimAutomation.ViCore.Impl.V1.VimClient

Description : 
Cardinality : Single
EntityType  : {VApp, VirtualMachine}
Id          : InventoryServiceCategory-39c5fc34-4d8c-441a-b991-cd18390753b5
Name        : Business Impact
Uid         : /VIServer=administrator@
Client      : VMware.VimAutomation.ViCore.Impl.V1.VimClient

So, here is how to create the categories from the imported XML data :

Foreach ( $category in $Import[0] ) {

    New-TagCategory -Name $category.Name -Description $category.Description `
    -Cardinality $category.Cardinality -EntityType $category.EntityType
}

To re-create the tags, we use the same technique :

Foreach ( $tag in $Import[1] ) {

    New-Tag -Name $tag.Name -Category (Get-TagCategory -Name $tag.Category) `
    -Description $tag.Description
}

We have to use Get-TagCategory -Name $tag.Category to convert the value of $tag.Category back to an object of the type : “VMware.VimAutomation.ViCore.Types.V1.Tagging.TagCategory”.
Why ? Because the parameter -Category of New-Tag can only take a value of that type (Get-Help New-Tag can confirm that) and the specific type for this value was lost during the export to XML.

Now, let’s restore the tag assignments :

Foreach ( $assignment in $Import[2] ) {

    $AssignTag = (Get-Tag -Name $assignment.Tag.Name)
    $AssignEntity = Get-VIObjectByVIView -MORef ($assignment.Entity.Id)

    New-TagAssignment -Tag $AssignTag -Entity $AssignEntity
}

The tricky part here is $AssignEntity. The parameter -Entity of the cmdlet New-TagAssignment expects a VIObject, which could be a VM, a host, a datastore… any kind of vCenter object. But our imported data represents the entity Id (aka MORef) as a string :

PS C:\> $Import[2].Entity.Id


So we need to convert that back to the original VIObjects and that’s what Get-VIObjectByVIView allows us to do with the parameter -MORef.

Now, we have the same tags, categories and assignments as before, but they have new UIDs, so from vCenter’s point of view, they are different objects. This might have an impact on other software relying on vCenter tags, like vCloud Director for example. So, as always, test this in a lab before messing with your production environment !

Based on that, I made 2 functions Export-TagAndAssignment and Import-TagAndAssignment, which I packaged in a nice little module called vCenterTagging.psm1.
The module is available here.

Configure Windows crash behaviour with PowerShell

When there is an OS-handled crash (a blue screen), there are some settings in the Startup and Recovery control panel which tell Windows how it should behave. For example, whether it restarts automatically or not, whether it writes a small memory dump (aka minidump), a kernel dump or a full dump, and where it saves the dump file :

Startup and Recovery Window

If you want to troubleshoot a recurrent crash, you may want to alter this behaviour. For example, you may need to set the system not to restart automatically, to be able to see the blue screen. You may also need a complete memory dump to facilitate your investigations or to send to Microsoft Support.

Here is how to do this using PowerShell :

These settings correspond to properties of the WMI class : Win32_OSRecoveryConfiguration, so let’s start by checking what we have in there :

C:\ > $CrashBehaviour = Get-WmiObject Win32_OSRecoveryConfiguration -EnableAllPrivileges
C:\ > $CrashBehaviour | Format-List *

PSComputerName             : DESKTOP
__GENUS                    : 2
__CLASS                    : Win32_OSRecoveryConfiguration
__SUPERCLASS               : CIM_Setting
__DYNASTY                  : CIM_Setting
__RELPATH                  : Win32_OSRecoveryConfiguration.Name="Microsoft Windows 7 Ultimate
__PROPERTY_COUNT           : 15
__DERIVATION               : {CIM_Setting}
__SERVER                   : DESKTOP
__NAMESPACE                : root\cimv2
__PATH                     : \\DESKTOP\root\cimv2:Win32_OSRecoveryConfiguration.Name="Microsoft Windows 7
                             Ultimate |C:\\Windows|\\Device\\Harddisk0\\Partition3"
AutoReboot                 : True
Caption                    :
DebugFilePath              : %SystemRoot%\MEMORY.DMP
DebugInfoType              : 2
Description                :
ExpandedDebugFilePath      : C:\Windows\MEMORY.DMP
ExpandedMiniDumpDirectory  : C:\Windows\Minidump
KernelDumpOnly             : False
MiniDumpDirectory          : %SystemRoot%\Minidump
Name                       : Microsoft Windows 7 Ultimate |C:\Windows|\Device\Harddisk0\Partition3
OverwriteExistingDebugFile : True
SendAdminAlert             : False
SettingID                  :
WriteDebugInfo             : True
WriteToSystemLog           : True
Scope                      : System.Management.ManagementScope
Path                       : \\DESKTOP\root\cimv2:Win32_OSRecoveryConfiguration.Name="Microsoft Windows 7
                             Ultimate |C:\\Windows|\\Device\\Harddisk0\\Partition3"
Options                    : System.Management.ObjectGetOptions
ClassPath                  : \\DESKTOP\root\cimv2:Win32_OSRecoveryConfiguration
Properties                 : {AutoReboot, Caption, DebugFilePath, DebugInfoType...}
SystemProperties           : {__GENUS, __CLASS, __SUPERCLASS, __DYNASTY...}
Qualifiers                 : {dynamic, Locale, provider, UUID}
Site                       :
Container                  :

The parameter -EnableAllPrivileges allows us to manipulate the properties of this WMI object, as long as the current PowerShell host was run as Administrator.

Here is how to prevent Windows from restarting automatically after a system failure :

 $CrashBehaviour | Set-WmiInstance -Arguments @{AutoReboot=$False}

Let’s check if our change is effective :

No auto restart highlight

Woohoo, it worked !

Now, we are going to configure the type of memory dump: small, kernel, or complete. As you can see above, the default value is 2, which corresponds to kernel dump.
Here are the possible values and their meaning :
0 = None
1 = Complete memory dump
2 = Kernel memory dump
3 = Small memory dump
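If you script around these values a lot, a small lookup table keeps them readable (just a convenience of mine, not part of the WMI class) :

```powershell
# Map DebugInfoType codes to their meaning
$DumpTypes = @{
    0 = 'None'
    1 = 'Complete memory dump'
    2 = 'Kernel memory dump'
    3 = 'Small memory dump'
}
$DumpTypes[1]   # Complete memory dump
```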

Here is how to configure Windows to save full dumps :

I had some difficulty making it work using the variable $CrashBehaviour, so here is how to do it :

 Get-WmiObject -Class "Win32_OSRecoveryConfiguration" -EnableAllPrivileges |
     Set-WmiInstance -Arguments @{DebugInfoType=1}

And how to verify our change :

DebugInfoType Full Dump

If you have recurrent crashes and you want to check if it is the same type of failure every time, you may want to keep any existing memory dump when writing a new dump to disk. Be careful though, this may consume a large amount of disk space, especially with full dumps.

 $CrashBehaviour | Set-WmiInstance -Arguments @{OverwriteExistingDebugFile=$False}


Now, if you want to change where the dump files will be written, perhaps on a volume with more free space, here is how to change the location of the memory dumps :

 $CrashBehaviour | Set-WmiInstance -Arguments @{DebugFilePath="E:\MEMORY.DMP"}

And how to verify the modification :


By default, an OS-handled crash will write an event in the System event log. I’m not sure why you would want to change this behaviour, but in case you do, here is how to do it :

 $CrashBehaviour | Set-WmiInstance -Arguments @{WriteToSystemLog=$False}