ARM – Recreating VM Off Existing VHDs

Note – This blog pertains to the November 2015 release of Azure PowerShell

At some point, I apparently told Azure Resource Manager to delete the VM that runs SQL for my SCOM environment running in Azure.  While I completely disagree with the portal’s interpretation of my button clicks, my VM is missing and I need it back.  Thankfully, when you delete a VM through the portal (accidentally or otherwise…or not at all and it just magically disappears), the disks are left behind in the storage account.  In this case, this is a good thing.  At least I can recover.

The documentation around the new AzureRM cmdlets still has a gap or nine, so I wasn’t able to easily dig up a script that creates a VM off an existing OS disk and attaches all of the data disks as well.  Using PS help and assuming the process was still somewhat like what we had to do under the older cmdlets, I put together the following script:

$om03 = New-AzureRmVMConfig -VMName om03 -VMSize Standard_D2
$om03 | Set-AzureRmVMOSDisk `
    -VhdUri "https://labtstazom0sa.blob.core.windows.net/vhds/om03osdisk.vhd" `
    -Name om03osdisk -CreateOption Attach -Windows -Caching ReadWrite

$StorageAccountURI = "https://labtstazom0sa.blob.core.windows.net/vhds/"
$numDisks = 4

For($i = 0; $i -lt $numDisks; $i++){
    $om03 | Add-AzureRmVMDataDisk -Name ("datadisk" + $i) `
        -VhdUri ($StorageAccountURI + "OM03datadisk" + $i + ".vhd") `
        -LUN $i -Caching ReadWrite `
        -CreateOption Attach -DiskSizeInGB 20
}

New-AzureRMVM -ResourceGroupName LabTSTAZRGOM `
    -Location "West US" -VM $om03 -Verbose

I needed to make sure the data disk naming and URIs matched and that I was provisioning to the right region, resource group, etc.  Executing the script resulted in the following:

image

I forgot to attach the network adapter.  Thankfully, the delete executed by the ARM gremlins left the interface, so it should be as simple as doing a reattach and then I should be good to go.  Herein lies a pretty significant issue.  It would seem that with this release of Azure PowerShell, the cmdlet Add-AzureRmVMNetworkInterface has been removed.  With that, I was not able to easily figure out how to attach an existing network interface to a new VM config through PowerShell using the AzureRM cmdlets.  You can do this with a template, but I want to just bang this out and get it done.  This leaves me with two options:

  1. Destroy the existing NIC and create a new one, or
  2. Do something somewhat hacky by stealing the network profile from one of the other VMs, modifying it, and then attaching that network profile to my OM03 config.

I am guessing there is probably an additional route invoking .NET methods to create a net new profile and then assign the NIC, but I will leave that research until later.  The first option would be straightforward; however, it might leave me with some cleanup in the lab afterwards (DNS, IPs, etc.).  Rather than messing with that, I decide to try the second option and steal the profile from the OM01 VM and modify it:

$nic = Get-AzureRmNetworkInterface -Name om03nic -ResourceGroupName LabTSTAZRGOM
$netprof = (Get-AzureRMVM -VMName OM01 -ResourceGroupName LabTSTAZRGOM).NetworkProfile
$netprof.NetworkInterfaces[0].ReferenceUri = $nic.id

$om03.NetworkProfile = $netprof

I get the existing OM03NIC, get the network profile from OM01 and then stuff the ID for the existing NIC into the profile.  Once I have this, I assign the network profile to the new VM Config.  This process actually successfully navigates the getters and setters.  After running this, I re-run the New-AzureRMVM cmdlet and I get a success:

image

Now, if it really worked, SQL would be running and I should be able to launch SCOM.  Logging in and popping open the console:

image

Success!

ARM, Azure, Hack Job

ARM – Visual Studio Deployment With Oct 2015 Azure PowerShell Preview

With the release of the updated Azure PowerShell 1.0 Preview (October 2015), I was curious to see how much of a change would be required for me to continue to use Visual Studio 2015 to provision dev/test environments into my Azure account.  When VS pushes an ARM template, it executes an auto-generated PowerShell script named Deploy-AzureResourceGroup.ps1.

image

Here are the changes I made in order to get the script to execute successfully.

Change 1 – Module Check

The old code checked for the existence of the AzureResourceManager module.  I updated the code to check for the new module named AzureRM.

if (-NOT (Get-Module -ListAvailable | Where-Object {($_.Name -eq 'AzureRM') })) {
    Throw "The version of the Azure PowerShell cmdlets installed on this machine are not compatible with this script."
}

Change 2 – Import the modules.  Because the script still needs access to the Azure Service Management cmdlets, both the Azure and AzureRM modules need to be imported.

Import-Module AzureRM -ErrorAction SilentlyContinue
Import-Module Azure -ErrorAction SilentlyContinue

Change 3 – Get the storage account key.  The old code switched between AzureResourceManager and AzureServiceManagement depending on how the storage account was provisioned.  The switching is no longer needed since both the ASM and ARM cmdlets can now be loaded and accessed at the same time.  The code below will retrieve the key in either case.

if ($StorageAccountResourceGroupName) {
    $StorageAccountKey = (Get-AzureRMStorageAccountKey -ResourceGroupName $StorageAccountResourceGroupName -Name $StorageAccountName).Key1
}
else {
    $StorageAccountKey = (Get-AzureStorageKey -StorageAccountName $StorageAccountName).Primary
}

Change 4 – Create a connection for ARM.  VS stores connection information for the old cmdlets and for ASM.  For the new AzureRM cmdlets, a new connection needs to be established.  To do this, I added the new Login-AzureRMAccount cmdlet, but only call it if a connection to ARM does not already exist.

try {
    $AzureRMContext = Get-AzureRMContext
} catch {
    Login-AzureRMAccount
}

Note – If you connect to more than one subscription or need to authenticate with more than one set of credentials, this cmdlet will have to be wrapped in additional logic.
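For example, a minimal sketch of that additional logic might select a specific subscription after login.  The subscription name below is just a placeholder, and this assumes Select-AzureRmSubscription is available in your version of the module:

try {
    $AzureRMContext = Get-AzureRMContext
} catch {
    Login-AzureRMAccount
}

## Placeholder subscription name - swap in the subscription you deploy to
Select-AzureRmSubscription -SubscriptionName "My Dev-Test Subscription"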

Change 5 – Deploy the resources.  In the prior version of the code, the deployment was done directly through the New-AzureResourceGroup cmdlet.  This cmdlet would update an existing resource group and force the replacement of resources, or deploy the RG from scratch if it did not exist.  In the new version, we need to use the Get-AzureRMResourceGroup cmdlet to see if the RG is already there and, if not, create it.  Then we need to use the New-AzureRMResourceGroupDeployment cmdlet to actually land the resources in the RG.

$ResourceGroup = Get-AzureRMResourceGroup | Where-Object {$_.ResourceGroupName -eq $ResourceGroupName}
if ($ResourceGroup -eq $null) {
    Write-Host "Provisioning Resource Group: $ResourceGroupName in Location: $ResourceGroupLocation"
    New-AzureRMResourceGroup -Name $ResourceGroupName -Location $ResourceGroupLocation
} else {
    Write-Host "Resource Group: $ResourceGroupName Exists"
}

New-AzureRMResourceGroupDeployment -Name $ResourceGroupName `
    -ResourceGroupName $ResourceGroupName `
    -TemplateFile $TemplateFile `
    -TemplateParameterFile $TemplateParametersFile `
    @OptionalParameters `
    -Force -Verbose

That’s the gist of the changes.  The only difference in the actual deployment through VS is that I now get prompted for the AzureRM credentials in order to connect if a connection does not already exist.

image

Happy cloud deploying!

Example Script

ARM, Azure, Infrastructure as Code

SCOM 2012 R2 – Create Task Pane Dashboard Manually

A recent customer requirement was to add additional dashboards to the Navigation pane in the SCOM console.  There is a tool on the MOM Team blog to assist with the creation; however, it was not working in this particular case.  Rather than spending a ton of time attempting to troubleshoot, I took a more manual approach, which ultimately yields similar results.

Tested version – SCOM 2012 R2 UR7

image

Step 1 – Create a new management pack to contain your dashboard

image

Step 2 – Under the Monitoring pane, locate the folder for your MP and create a new dashboard

image

Step 3 – Choose a layout.  In this case, I am just going to create a Grid Layout dashboard with a single cell in order to look at CPU utilization on a chosen Windows Computer.

A.  Choose Grid Layout

image

B.  Give it a name

image

C.  Choose a single cell for simple demo purposes

image

D.  Create

Step 4 – Export the management pack.  It is easier to move the dashboard under the Navigation pane and add the widgets after the fact.

image
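If you prefer PowerShell over the console for the export, something along these lines should work when run from the Operations Manager Shell (the management pack name and output folder below are examples for illustration):

Import-Module OperationsManager

## MP name and export path are examples - adjust to match your dashboard MP
Get-SCOMManagementPack -Name "Custom.Dashboard.Example" | Export-SCOMManagementPack -Path "C:\MPExports"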

Step 5 – Open the XML in your favorite editor.  There are a few pieces we need to tweak.

A. Add a reference to the Windows Library

image

B. Modify the <ComponentType> to include a Target

image

Note: the reference is correct.  You do not use the typical alias when dealing with the mpinstance notation.

C. Modify the Parent for the <ComponentReference> to point at the Navigation pane rather than the default folder that was created within the MP

image

D. Modify the <ComponentImplementation> to point at the same Target as the <ComponentType>

image

E. Save your management pack.  Optional – increment the version number

Step 6 – Import your new management pack

image
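The import can also be done from PowerShell if that is quicker for you (the file path is again just an example):

## Point this at the XML file you just edited
Import-SCOMManagementPack -Fullname "C:\MPExports\Custom.Dashboard.Example.xml"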

Step 7 – Locate your dashboard.  If it still shows under the default folder in the monitoring pane, close and reopen your console.

image

Step 8 – Open your dashboard and click “Click to add widget…”

A. Select Performance Widget

image

B. Give it a name

image

C. Find an object of the specific type you are targeting.  It is key that you select a specific item, not a group or an object of a different class.  In this case, Windows Computer.

image

D. Select the desired performance counter and add

image

E. Choose a Time range

image

F. Choose whether or not to show the legend, and then the desired fields if you choose to show it.

image

G. Create

Note – at this point you probably notice that the dashboard doesn’t work.  This is expected behavior; the code for the widget is not correct since it was authored through the console and not for specific use in the Navigation pane.

Step 9 – Export your management pack

Step 10 – Open the XML in your favorite editor

Step 11 – Find the <ComponentImplementation> for the widget and modify the <Base /> tag

image

Step 12 – Locate the <Bindings> section under the <ComponentOverride> for the Widget

image

Step 13 – Highlight and cut all of the <Binding> tags

image

Step 14 – Paste the <Binding> section between the <Base> tags within the <ComponentImplementation> for the widget

image

Step 15 – Under the PerformanceObjectCounters binding, locate the ManagedEntityIds binding

image

Step 16 – Modify the binding to accept the ID of the targeted object in the console rather than a specific instance.

image

Step 17 – Delete the rest of the <ComponentOverride> code for the widget

image

Step 18 – Locate the <DisplayString> for the widget component override and delete that as well

image

Step 19 – Save and import your management pack.  Post import, close and reopen your console

Step 20 – Test your dashboard.  Enjoy

image

 

Download

Management Packs, MP Authoring, SC Operations Manager

MS OMS – Performance Data Collection Now Live

Microsoft Operations Management Suite (OMS) is now able to collect performance data at a rate of up to every 10 seconds.

image

Configuration is super simple.

image

Simply add or remove the counters you wish to have collected and the data will start pouring in for review.  The only challenge I see here is how to collect SQL-related data since the SQL instance name comprises part of the object name.  In order to test how this is going to be handled, I added the following counters, just taking a guess (and hoping something was done to simplify the process) as to how OMS might handle this situation:

image
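As a side note, one way to confirm the instance-specific object names before adding the counters is to list the SQL counter sets directly on the SQL server; a quick sketch:

## List performance counter sets whose names start with MSSQL to find the
## instance-specific object names (for example, MSSQL$<instance>)
Get-Counter -ListSet "MSSQL*" | Select-Object CounterSetName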

I found a bunch of these events for MSSQL and MSSQL$ in the event log shortly after attaching the SQL machine to OMS:

image

A few minutes afterwards, I found the performance data with the object name MSSQL$OMDB inside Search within the OMS Portal:

image

This means the SQL data will have to be treated a little bit differently from a collection perspective, but it is manageable.  Next steps would be to explore if there is a way to programmatically interface with OMS in order to configure these counters.  For now, most performance counters are very easily configured and collected for analysis in OMS!

Uncategorized

SCOM 2012 R2 – PowerShell Based Console Task

From time to time, it is nice to take advantage of the agent and console tasks exposed in the SCOM console to more easily accomplish some sort of remediation or to retrieve some information.  I have authored agent tasks before, and they are pretty straightforward.  I do not believe I have ever authored a console task; however, when I received the request from one of my peers, I figured it would be a snap.  The ask was to be able to execute a PowerShell script against the SDK straight out of the console and display the results back to the user.  It turns out this is not too bad, but you do have to do some digging in order to see how these tasks are actually constructed.

Step one for me is to always try and find an example that I can reference.  A quick Bing search did not turn up much, so I exported all of the MPs from my SCOM environment via PS and then scanned them for console tasks.  I found an interesting one in the Microsoft.Windows.Server.Library management pack.
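For reference, the export-and-scan itself was roughly the following (the output folder is an example path); the interesting hit in the Microsoft.Windows.Server.Library management pack is shown just below:

Import-Module OperationsManager

## Export every MP in the management group, then scan the exported XML for console tasks
Get-SCOMManagementPack | Export-SCOMManagementPack -Path "C:\MPExports"
Get-ChildItem "C:\MPExports\*.xml" | Select-String -Pattern "<ConsoleTask" | Select-Object Filename -Unique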

<ConsoleTask ID="Microsoft.Windows.Server.Computer.OpenPowerShell" Accessibility="Public" Enabled="true" Target="Windows!Microsoft.Windows.Server.Computer" RequireOutput="false">
  <Assembly>Res.Microsoft.Windows.Server.Computer.OpenPowerShell</Assembly>
  <Handler>ShellHandler</Handler>
  <Parameters>
    <Argument Name="WorkingDirectory" />
    <Argument Name="Application">powershell.exe </Argument>
    <Argument> -noexit -command "Enter-PSSession -computer $Target/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$"</Argument>
  </Parameters>
</ConsoleTask>

A straightforward task that simply opens PowerShell and creates a remote session on the targeted Windows Server Computer.  This is essentially what I want to do, except I want to execute code against the SDK and display results rather than simply opening a remote session.

This is great.  I did notice the <Assembly> line (highlighted in yellow above) that seems to basically define the type of task you are trying to execute.  Searching the code further for this particular assembly, I find a resource at the very bottom of the management pack:

<Resources>
  <Assembly ID="Res.Microsoft.Windows.Server.Computer.OpenPowerShell" Accessibility="Public" FileName="Microsoft.Windows.Server.Computer.OpenPowerShell" HasNullStream="true" QualifiedName="Microsoft.Windows.Server.Computer.OpenPowerShell" />
</Resources>

I scoured my workstation, my MS servers, and the installation media for this file, and I was not able to find it.  I really want to track this down since it will potentially expose other console task types.  My best guess is that this code has been relocated into some other DLL, but I do not know for sure.  If I find the code, I will post an update.

With that, I felt I had most of the necessary pieces in order to get an example running.  For this example, I just wanted to do something simple like list all of the properties for a selected Windows Computer.  Here is the script to do the work:

Param([String]$computerFQDN)
$key = 'HKCU:\Software\Microsoft\Microsoft Operations Manager\3.0\User Settings'
$SDK = (Get-ItemProperty -Path $key -Name SDKServiceMachine).SdkServiceMachine

Import-Module OperationsManager
New-SCOMManagementGroupConnection $SDK

## Get Windows Computer class
$computerClass = Get-SCOMClass -Name "Microsoft.Windows.Computer"

## Get SCOM object
$computer = Get-SCOMClassInstance -Class $computerClass | Where-Object {($_.FullName -eq $computerFQDN) -or ($_.Name -eq $computerFQDN)}
$computer | fl *

This code connects to the SDK using whichever SDK service the machine running the script last connected to.  If multiple consoles are open and connected to multiple management groups, this approach will only work for Computer objects in the last console opened.  However, this is fine for demo purposes and my lab since I only have a single environment.
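For what it is worth, the script can also be tested outside the console by calling it directly; the file name and FQDN below are made up for the example:

.\Get-SCOMComputerInfo.ps1 -computerFQDN "om01.contoso.local"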

Since the OpenPowerShell module opens PowerShell and executes a scriptblock, I need to wrap the code above in a scriptblock and pass in the $computerFQDN value using a $Target variable:

<Parameters>
  <Argument Name="WorkingDirectory" />
  <Argument Name="Application">powershell.exe </Argument>
  <Argument><![CDATA[ -noexit -command "& {Param([String]$computerFQDN)
## (paste the rest of the script above, everything after its Param line, here)
}" ]]></Argument>
  <Argument>$Target/Property[Type="Windows!Microsoft.Windows.Computer"]/PrincipalName$</Argument>
</Parameters>

Dropping this into a <ConsoleTask> and adding the exact Assembly resource chunk from the bottom of the Microsoft.Windows.Server.Library management pack yields the following:

image

When I click on my “Console Task – Get Computer Info” task, it launches the script which returns the following:

image

There are all of the properties for the selected Windows Computer.  Results!

Management Packs, MP Authoring, PowerShell, SC Operations Manager