Datazen – Lab Installation

Per my previous post, Microsoft has acquired a BI solution capable of providing dashboards from many different types of data sources. From a System Center perspective, we have been waiting for a scalable solution that meets the performance needs of our customers. Since this solution has a very nice look and feel, I was interested in getting my hands on the software to take it for a test drive.

Download

Reading through the documentation, I noticed that there are many options for scaling this product for large-scale customers. For a lab, however, a single-server scenario seems like the way to go. To make sure this would perform at least reasonably well, I spun up a Basic_A3 VM in Azure (4 cores, 7 GB of RAM), downloaded the product, and kicked off the install.
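
For reference, a VM like that could be stood up with the classic (Service Management) Azure PowerShell cmdlets of the era. A rough sketch; the service name, VM name, credentials, and image filter below are placeholders, not the values I actually used:

```powershell
# Classic Azure Service Management cmdlets; all names below are placeholders
Add-AzureAccount

# Pick a Windows Server 2012 R2 gallery image (assumed label filter)
$image = (Get-AzureVMImage |
    Where-Object { $_.Label -like 'Windows Server 2012 R2*' } |
    Select-Object -First 1).ImageName

# Basic_A3 = 4 cores, 7 GB RAM
New-AzureQuickVM -Windows -ServiceName 'datazen-lab' -Name 'datazen01' `
    -ImageName $image -InstanceSize 'Basic_A3' `
    -AdminUsername 'labadmin' -Password '<password>' -Location 'East US'
```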

Since this is a single-server install, I leave all of the features selected. I don’t, however, know how the product will react to disk caching on the C: drive. I decide to leave it on the C: drive for lab purposes, but I would expect to move this to a different drive for a production install.

After reading a little bit about security in the included PDF, I decide to go with a domain account to run the core services. This is how you would approach a distributed deployment, and it better follows security best practices.

The Control Panel for the application (the administrative web portal) has a default admin user. The user name is “admin”, and here I set the password for that account.

I want to integrate with AD, so I configure this connection to use the same service account I created to run the core services. In a production install, you would most likely use separate accounts.

I copy off the encryption key and store it in a safe place.

I copy off the instance ID just in case I decide to add additional servers to the overall install in the future.

I choose to “Use Core Service credentials” in the next step for simplicity.

I want this server to host the websites, so I leave the host name field blank.

I configure the Exchange settings so that the service can send emails.

Install.

At this point, the install takes quite a while (10–15 minutes) as the necessary Windows features are added to the machine and the product is ultimately installed.

Happy dashboarding! The next post will cover getting started: setting up a hub, creating a data source, and building some views to populate the dashboards.

Presentation

Dashboards–MS Acquires BI Dashboard Solution

Dashboards!  Cloud ready, mobile ready and free for a bunch of customers.  Read more about it here:

Microsoft acquires mobile business intelligence leader Datazen

http://blogs.microsoft.com/blog/2015/04/14/microsoft-acquires-mobile-business-intelligence-leader-datazen/

As of today, SQL Server Enterprise Edition customers with version 2008 or later and Software Assurance are entitled to download the Datazen Server software at no additional cost.

From the perspective of Datazen:

http://www.datazen.com/blogs/post/datazen-joins-microsoft

Download here:

http://www.microsoft.com/en-us/server-cloud/products/sql-server-editions/sql-server-enterprise.aspx#sqlmobilebi

Publisher Application available via the Windows Store:

http://apps.microsoft.com/windows/en-us/app/datazen-publisher/a23f9233-bcda-49e3-bab4-5e8366b96a37

Happy dashboarding!

Presentation, SC Operations Manager, SC Orchestrator, SC Service Manager

SCOM vNext CTP – Module Changes

I recently installed the CTP for Operations Manager that was released in October on MSDN.  Using the same method I used to scan the R2 product for changes, I was able to find just a few nuggets of info.  This method simply looks at the out of the box management packs, snags all of the module types and looks at their definitions.  I then do a simple compare between the outputs (CTP vs. 2012 R2 UR4) to see if there are any new MPs, new modules, or changes to the parameters on existing modules.

Script for scanning: Here
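
The compare itself boils down to a plain Compare-Object between the two dumps. A minimal, self-contained illustration of the technique; the module lines below are made up for the demo, not actual dump contents:

```powershell
# Each dump line encodes "MP|ModuleType|Param1,Param2,..."
$r2ur4 = @(
    'Microsoft.Unix.Library|Microsoft.Unix.WSMan.Enumerate.DataSource|Uri,Interval'
)

# The CTP dump contains everything from R2 UR4 plus one net-new module
$ctp = $r2ur4 + @(
    'Microsoft.Unix.LogFile.Library|Microsoft.Unix.Invoke.Script.ProbeAction|TargetSystem,Script'
)

# Lines with SideIndicator '=>' exist only in the CTP dump (i.e., net new)
$new = Compare-Object -ReferenceObject $r2ur4 -DifferenceObject $ctp |
    Where-Object { $_.SideIndicator -eq '=>' } |
    ForEach-Object { $_.InputObject }

$new
```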

The volume of changes was minimal, and the changes only seemed to apply to xPlat.  There were no new management packs, and there did not seem to be any changes to the parameters on existing modules (I was hoping for another change like the one made to the ExpressionFilter).

Here are the net new changes:

MP: Microsoft.Unix.Library
Microsoft.Unix.WSMan.TimedEnumerate.LogicalDisk.DiscoveryData – DataSourceModuleType
“TargetSystem” “Uri” “Filter” “SplitItems” “Interval” “SyncTime” “ExcludeFileSystemName” “ExcludeFileSystemType” “ClassId” “InstanceSettings” “OutputErrorIfAny”

MP: Microsoft.Unix.LogFile.Library
Microsoft.Unix.Invoke.Script.ProbeAction – ProbeActionModuleType
“TargetSystem” “UserName” “Password” “Script” “ScriptArgs” “TimeOut” “TimeOutInMS”

Microsoft.Unix.Invoke.Script.DataSource – DataSourceModuleType
“Interval” “SyncTime” “TargetSystem” “UserName” “Password” “Script” “ScriptArgs” “TimeOut” “TimeOutInMS” “FilterExpression”

With the CTP, there are two new DataSources and one new ProbeAction out of the box.  The CTP was purely the base install with no additional MPs imported, so there could be other changes or net-new MPs/modules that provide new functionality.  That seems likely as the product moves forward, along with the integration out to Azure Operational Insights.

Management Packs, MP Authoring, SC Operations Manager, Uncategorized

SCOM 2012–Connect to the SDK from PowerShell in an MP

I’ve had a few scenarios in the past where the need to connect to the SCOM SDK and retrieve data, either as part of a monitor or as part of a discovery, comes up.  In most cases, the standard “Import-Module OperationsManager” gets the job done: the SCOM cmdlets are accessible, and whatever workflow I have executing functions just fine.  Recently, however, I had a customer scenario where this wasn’t working.  The discovery was running on a SCOM management server, and the module was available and functioning outside of a SCOM MP workflow.  For some reason, Import-Module simply wasn’t working in that specific environment when the PowerShell code was embedded into a management pack.

Troubleshooting wasn’t getting us anywhere, so I scanned the existing MPs to see whether other MPs were loading the OperationsManager module another way.  I found a few different approaches; inside the “Microsoft SystemCenter OperationsManager Summary Dashboard” management pack, a connection to the SDK is made like this:

$SCOMPowerShellKey = "HKLM:\SOFTWARE\Microsoft\System Center Operations Manager\12\Setup\Powershell\V2"
$SCOMModulePath = Join-Path (Get-ItemProperty $SCOMPowerShellKey).InstallDirectory "OperationsManager"
Import-Module $SCOMModulePath

While we didn’t nail down the root cause of the Import-Module failing, this route was successful in connecting to the SDK.
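
One way to make a workflow resilient to this without knowing the root cause is to combine the two approaches: try the standard import first, and fall back to the registry-derived path if it fails. This is just a sketch of that idea, not what we actually deployed:

```powershell
try {
    # Standard route; works in most environments
    Import-Module OperationsManager -ErrorAction Stop
}
catch {
    # Fallback: resolve the module folder from the SCOM setup registry key,
    # the same key the Summary Dashboard MP uses
    $key  = 'HKLM:\SOFTWARE\Microsoft\System Center Operations Manager\12\Setup\Powershell\V2'
    $path = Join-Path (Get-ItemProperty $key).InstallDirectory 'OperationsManager'
    Import-Module $path
}
```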

Management Packs, MP Authoring, PowerShell, SC Operations Manager

SCOM 2012–Parameterizing Operators Within MP

In order to create a generic monitor type based on a PowerShell data source, you may need to parameterize not only the operators within the expression filters but also the name of the name/value pair coming back from the property bag in your PowerShell script.

[Screenshot: monitor type configuration XML]

Notice that the UnhealthyOperator as well as the HealthyOperator specified within the configuration for the monitor type are of type CriteriaCompareType.  In order to access that type, the ExpressionEvaluatorSchema needs to be included.  Also notice that the Variable configuration element is meant to be the name of the value being passed back from the PS script, which needs to be run through the ExpressionFilter.  The ExpressionFilter is then populated using the $Config variables.  Here is a potential ConditionDetection for a healthy state:

[Screenshot: ConditionDetection XML]
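
A rough sketch of what such a ConditionDetection could look like; the HealthyValue parameter name and the module ID are assumptions for illustration, while the operator and variable parameterization follow the description above:

```xml
<ConditionDetection ID="HealthyFilter" TypeID="System!System.ExpressionFilter">
  <Expression>
    <SimpleExpression>
      <ValueExpression>
        <!-- $Config/Variable$ carries the property-bag value name from the monitor type config -->
        <XPathQuery Type="Double">Property[@Name='$Config/Variable$']</XPathQuery>
      </ValueExpression>
      <!-- The operator itself is parameterized via the CriteriaCompareType element -->
      <Operator>$Config/HealthyOperator$</Operator>
      <ValueExpression>
        <!-- 'HealthyValue' is an assumed name for the threshold parameter -->
        <Value Type="Double">$Config/HealthyValue$</Value>
      </ValueExpression>
    </SimpleExpression>
  </Expression>
</ConditionDetection>
```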

Here is the resulting monitor:

[Screenshot: the resulting monitor]

The monitor would raise an alert if the random number returned from the script is <=5.
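
For completeness, the kind of property-bag script driving a monitor like this might look as follows. A sketch only: the property name “RandomNumber” is an assumption and must match whatever the Variable element in the monitor type configuration references:

```powershell
# Runs on a SCOM agent/management server; the MOM.ScriptAPI COM object only exists there
$api = New-Object -ComObject 'MOM.ScriptAPI'
$bag = $api.CreatePropertyBag()

# Property name must match the monitor type's Variable configuration element
$bag.AddValue('RandomNumber', (Get-Random -Minimum 1 -Maximum 11))

# Emit the bag so the downstream ExpressionFilter can evaluate it
$bag
```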

Management Packs, MP Authoring, SC Operations Manager, Uncategorized