PowerShell Module: Set the ConfigMgr Limiting Collections

I recently wrote a short blog post describing how to semi-automatically update the limiting collection for every collection in a Configuration Manager console folder.

However, as is commonly the case, a one-time task became something that needed to be done at scale. We suddenly needed to embark on a mass reorganisation of our ConfigMgr objects, including a rethink of the limiting collections. Once the new design was put together it was clear that the limiting collection would need to be updated on almost all the collections.

This was an opportunity I had been waiting on to exercise my PowerShell skills and publish my first module.


Get the module

If you have PowerShell version 5 or above you can grab the module by running the following command:
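A minimal sketch of the install, assuming the package name shown on the PowerShell Gallery listing below:

```powershell
# Install the module from the PowerShell Gallery (requires PowerShell 5+)
Install-Module -Name CMLimitingCollection
```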

 

You can view the project on GitHub here:

https://github.com/markhallen/CMUtilities

 

You can view it on the PowerShell Gallery here:

https://www.powershellgallery.com/packages/CMLimitingCollection


Usage

To update the limiting collection for a console folder, simply copy the path from the address bar in the console.

Call the Set-LimitingCollectionForFolder function, passing the folder path and the target limiting collection name.
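For illustration, a call might look like the following; the parameter names here are assumptions, so check the module's help for the actual syntax:

```powershell
# Hypothetical parameter names - see Get-Help Set-LimitingCollectionForFolder -Full
Set-LimitingCollectionForFolder -FolderPath 'PR1:\DeviceCollection\Production\Workstations' `
    -LimitingCollectionName 'All Workstations'
```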

Bulk update ConfigMgr limiting collection

UPDATE: the requirement below grew after I wrote this post, so I have created a module to streamline the process; you can read about it and get it here.


This is the second time that I needed to perform this task so, even though it is quite simple, I thought I would put it out there in case it helps someone else.

First, all the collections were within the same console folder, so I opened the folder in the console, selected all the collections and pressed CTRL + C (when right-clicking the selection, ‘Copy’ is disabled).

I pasted the results into Excel, copied the collection name column into Notepad and saved it as collections.txt.

I know that the previous steps can be achieved programmatically but it would be overkill for this particular task.

Finally I executed a single line of PowerShell to update all of the collections listed in the txt file:
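The one-liner ran along these lines (a sketch using the ConfigMgr cmdlets; run it from a ConfigMgr PowerShell drive):

```powershell
# Point every collection named in the text file at the new limiting collection
Get-Content .\collections.txt | ForEach-Object { Set-CMCollection -Name $_ -LimitToCollectionId '<NewLimitCollId>' }
```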

Simply replace <NewLimitCollId> with the ID of your new limiting collection.

Automated creation of ConfigMgr operational collections

Query collections can be used in SCCM to target devices or to give a quick count of specific device states. For example, we could create a query collection that contains all Windows 10 workstations.

Manually creating these types of collections can be quite time consuming, especially if you manage multiple sites or hierarchies, so automatically generating these query-based collections with a script can save a lot of time.

Credit where it’s due

This script borrows heavily from the Set Operational SCCM Collections script by Benoit Lecours from systemcenterdudes.com. Benoit created a wonderful script to create the query collections that he commonly requires and shared it on the TechNet Gallery.

I wanted to use Benoit’s script on a shared hierarchy and to remove the collection details from the script to allow us to easily manage the collections and queries later without revisiting the script. I decided to use a separate XML file to describe the collections and have included a working sample XML in the GitHub repository.

Download

The script and the sample XML file can be found on GitHub here:

https://github.com/markhallen/configmgr/tree/master/New-CMOperationalCollections

Script details and usage

The script can be run multiple times on the same environment and will only recreate the collections that are missing.

.SYNOPSIS

Create collections in Configuration Manager that are useful for operational monitoring.

.DESCRIPTION

Will accept an XML file of the desired collections and queries and create the collections in Configuration Manager.

.PARAMETER SiteCode

3-character site code

.PARAMETER Path

Full or relative path to the XML file containing the collections and queries

.PARAMETER LimitingCollection

[Optional] Sets the LimitingCollection used for new collections.
[Default] “All Systems”

.PARAMETER Organization

[Optional] allows a default top-level directory to be defined. This is useful
for a hierarchy that is shared between distinct departments or organisational units.
The value is also prepended to the collection names to allow each organisation to
create collections with unique names.
[Default] “”

.PARAMETER FolderName

[Optional] This is the folder that will contain the operational collections. It will
be in the root of device collections or within the Organization folder if defined.
[Default] “Operational”

.PARAMETER RecurInterval

[Optional] Used in conjunction with RecurCount to set a collection update schedule.
Acceptable values are ‘Minutes’,’Hours’ or ‘Days’.
[Default] “Days”

.PARAMETER RecurCount

[Optional] Used in conjunction with RecurInterval to set a collection update schedule.
[Default] 7

.NOTES

Author: Mark Allen
Created: 18/10/2016
References: Benoit Lecours: Set Operational SCCM Collections
https://gallery.technet.microsoft.com/Set-of-Operational-SCCM-19fa8178

.EXAMPLE

.\New-CMOperationalCollections.ps1 -SiteCode PR1 -Path .\MyCollections.xml
Will create the folder ‘Operational’ in the root of the Device Collections node.
The collections will be created in the Operational Folder.
EG Device Collections > Operational > <Collections from XML>

.EXAMPLE

.\New-CMOperationalCollections.ps1 -SiteCode PR1 -Path .\MyCollections.xml -LimitingCollection “All Systems” -Organization “MyOrg”
Will create a “MyOrg” folder in the root of the Device Collections node; the ‘Operational’
folder will be created within it. Collection names will be prepended by <Organization>
EG Device Collections > MyOrg > Operational > <Collections from XML>
Collections will be prepended by “MyOrg …”

.EXAMPLE

.\New-CMOperationalCollections.ps1 -SiteCode PR1 -Path .\MyCollections.xml -RecurInterval “Days” -RecurCount “14”
A custom refresh interval will be set for all new collections.

XML Schema

Within the query tags we need to substitute the less than symbol < with the HTML equivalent &lt;

<Collections>
  <Collection>
    <Name></Name>
    <Query></Query>
    <Description></Description>
  </Collection>
</Collections>
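As an illustration, a populated entry might look like this (the name and WQL query here are hypothetical examples, not one of the shipped collections):

```xml
<Collections>
  <Collection>
    <Name>Clients | Windows 10</Name>
    <Query>select SMS_R_System.ResourceId from SMS_R_System inner join SMS_G_System_OPERATING_SYSTEM on SMS_G_System_OPERATING_SYSTEM.ResourceID = SMS_R_System.ResourceId where SMS_G_System_OPERATING_SYSTEM.Version like "10.0%"</Query>
    <Description>All devices running Windows 10</Description>
  </Collection>
</Collections>
```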

You can view the working example here: https://github.com/markhallen/configmgr/blob/master/New-CMOperationalCollections/CMOperationalCollections.xml

Import certificates into Java RTE using ConfigMgr Compliance Settings


We recently had a requirement from a customer to import two certificates into the Windows client’s Java Runtime certificate store.

The problem they were experiencing was that when they tried to connect to a particular custom web app they received a security prompt. Prompts such as this usually generate service desk calls, and conditioning users to accept security warnings may have repercussions in the future.

Microsoft have advice here on how to import certificates into the Java runtime certificate store using keytool.exe, which comes bundled with the Java Runtime and SDK. After validating the commands required, it was decided that we could use ConfigMgr Compliance Settings to import the certificates into the currently installed Java runtime, as well as ensuring that the certificates are imported into future versions that are installed.

Assumptions and requirements

The script assumes that the Java certificate store password has not been changed from the default of ‘changeit’. I may come back to the script later and swap this out for a variable; however, in most cases the default will still be in place, as that is how Java is distributed by Oracle.

Scripted actions

Both scripts utilise the Get-JavaHomeLocation function which was provided by my colleague Steve Renard from powershell-guru.com.

The Get-JavaHomeLocation function uses the registry to find the relevant Java home folder for the latest installed version of Java and returns the path or an error message.

If an error message is returned by Get-JavaHomeLocation we can catch and return it easily using Test-Path; in cases where the Java home is not found or is not valid, the reason will be available in the compliance report.
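The original function is not reproduced here, but a minimal sketch of the same idea, reading Oracle's registry keys (key path assumed for a 64-bit JRE), might look like:

```powershell
function Get-JavaHomeLocation {
    # Oracle's JRE records the current version and its JavaHome path in the registry
    $baseKey = 'HKLM:\SOFTWARE\JavaSoft\Java Runtime Environment'
    if (-not (Test-Path $baseKey)) { return 'Java Runtime Environment registry key not found' }
    $currentVersion = (Get-ItemProperty -Path $baseKey).CurrentVersion
    $javaHome = (Get-ItemProperty -Path (Join-Path $baseKey $currentVersion)).JavaHome
    if ($javaHome -and (Test-Path $javaHome)) { return $javaHome }
    return 'JavaHome path not found or not valid'
}
```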

 

Discovery Script

The customisable component of the discovery script is a simple array of the relevant certificate aliases. These are set upon import into the certificate store. If this is a new certificate then you can choose appropriate aliases yourself.

Finally we iterate through the array of certificate aliases and check if they are present in the certificate store. If one is missing the script will exit and return ‘Non-Compliant’.
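A sketch of that check, assuming $javaHome has already been resolved and the default store password; the alias values are placeholders:

```powershell
# Hypothetical alias list - replace with the aliases used in your environment
$certAliases = @('my-alias-1', 'my-alias-2')
$keyTool   = Join-Path $javaHome 'bin\keytool.exe'
$certStore = Join-Path $javaHome 'lib\security\cacerts'

foreach ($alias in $certAliases) {
    # keytool returns a non-zero exit code when the alias is not in the store
    & $keyTool -list -keystore $certStore -storepass 'changeit' -alias $alias | Out-Null
    if ($LASTEXITCODE -ne 0) { Write-Host 'Non-Compliant'; exit }
}
Write-Host 'Compliant'
```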

I’m going to assume that you know how to create configuration items and baselines, as there are loads of resources out there detailing how to do this (example). The key aspect of this script to be aware of is that it returns ‘Compliant’ when the relevant certificates are present. The condition on the configuration item compliance rule should be set to ‘Equals’ and ‘Compliant’.


Remediation Script

The remediation script requires a little more editing than the discovery script because it also needs to access the actual certificate files. ConfigMgr configuration items do not natively distribute files so they need to be made available on a network share. In our case we created a subfolder on an existing share and made it available for read-only to ‘Everyone’.

The hash table requires that the key of each entry is the alias and the value is the name of the certificate file. For example, ‘my-alias-1’ = ‘MyCertificate-1.cer’.

Because the error handling of the remediation action of configuration items is quite limited, the remediation script will create a log file in the <JavaHome>\lib\security folder named import-certificates.log. In addition to the path checks from the discovery script we will also test the path to the external store that will be used to store the certificate files.

Next we will iterate through the hash table of certificate aliases and certificates files importing each into the Java certificate store.

Finally we iterate through the certificates in the hash table to check that they have been added correctly.
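Pulling those steps together, a sketch of the import and verification loops; the share path and alias-to-file mapping are placeholders, and $keyTool and $certStore are assumed to be resolved as in the discovery script:

```powershell
# Hypothetical share and alias-to-file mapping
$certShare = '\\server\share\certificates'
$certificates = @{
    'my-alias-1' = 'MyCertificate-1.cer'
    'my-alias-2' = 'MyCertificate-2.cer'
}

foreach ($entry in $certificates.GetEnumerator()) {
    # -noprompt suppresses the trust confirmation so the import can run silently
    & $keyTool -importcert -noprompt -keystore $certStore -storepass 'changeit' `
        -alias $entry.Key -file (Join-Path $certShare $entry.Value)
}

# Verify each alias is now present in the store
foreach ($alias in $certificates.Keys) {
    & $keyTool -list -keystore $certStore -storepass 'changeit' -alias $alias | Out-Null
    if ($LASTEXITCODE -ne 0) { Write-Host "Import failed for $alias" }
}
```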

Download the scripts

I have added the scripts to GitHub and they are heavily commented; if you are not a confident scripter it is clearly marked where you can customise these scripts for your organisation.

The discovery script

The discovery script will check the Java RTE certificate store and return ‘Compliant’ or ‘Non-Compliant’.

https://github.com/markhallen/configmgr/blob/master/ConfigurationItems/JavaCertificate-DiscoveryScript.ps1

The remediation script

The certificates are imported into the certificate store; any errors will be recorded in import-certificates.log.

https://github.com/markhallen/configmgr/blob/master/ConfigurationItems/JavaCertificate-RemediationScript.ps1

Automated creation of a default Configuration Manager folder structure


TLDR: https://github.com/markhallen/configmgr/blob/master/New-CMFolders

We regularly create Configuration Manager hierarchies for new customers and we like to standardise the organisation of the different object types.

For this reason automation provides expediency, convenience and conformity.

The New-CMFolders PowerShell script will accept a CSV file of the desired folder structure and create the folders therein.

Mandatory parameters

The script accepts two mandatory parameters.

  1. -SiteCode indicates the Configuration Manager site code in order to initiate the connection to the site server.
  2. -Path is the path to the CSV file containing the desired structure.

Optional parameters

  1. -CustDirectory allows a default top level directory to be defined. This is useful for a hierarchy that is shared between distinct departments or organizational units.

-CustDirectory param omitted


-CustDirectory param set to PST


Limitations

Our current requirements only necessitate four directory levels be defined. A future development of the script might include a higher number of subdirectories handled dynamically.

Download and use

You can download the working script and example CSV file from GitHub here.

If the Configuration Manager hierarchy is not shared or divided into organisational units then you can execute the following command in PowerShell.
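For example (the site code and CSV file name are illustrative):

```powershell
.\New-CMFolders.ps1 -SiteCode PR1 -Path .\FolderStructure.csv
```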

If you have a shared hierarchy or maintain separation between organisational units then you can add the -CustDirectory parameter to the command line. This will create a top level folder in the root node of each object type and create the sub folders within.
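For example, with a customer-specific top-level folder (values illustrative):

```powershell
.\New-CMFolders.ps1 -SiteCode PR1 -Path .\FolderStructure.csv -CustDirectory PST
```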

 

Batch update Configuration Manager Software Metering Rules


I recently encountered an issue where a batch of Software Metering rules were not returning any results because the language settings were too specific.

To resolve the issue I used two PowerShell commands to batch update the language to “Any”.

The metering rules were all prefixed with a regular string so I first retrieved an array of these.
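A sketch of the retrieval, assuming the rules share a hypothetical ‘MyPrefix’ product-name prefix:

```powershell
# Collect all metering rules whose product name starts with the common prefix
$rules = Get-CMSoftwareMeteringRule | Where-Object { $_.ProductName -like 'MyPrefix*' }
```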

It is a good idea at this point to output the resultant array to check that you have captured the correct rules.

I then iterated through the resultant array to set the language to any using the language ID 65535.
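A sketch of the update loop, where 65535 corresponds to the ‘Any’ language; the rule ID property name is an assumption, so verify it against the objects returned in your environment:

```powershell
# Update each captured rule to the 'Any' language ID
$rules | ForEach-Object { Set-CMSoftwareMeteringRule -Id $_.RuleID -LanguageId 65535 }
```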

To confirm in PowerShell, we need to repopulate the $rules array and then output the results.
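For example, reusing the same hypothetical prefix:

```powershell
# Repopulate and display the rules to confirm the new language ID
$rules = Get-CMSoftwareMeteringRule | Where-Object { $_.ProductName -like 'MyPrefix*' }
$rules | Select-Object ProductName, LanguageID
```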

This method can be used to bulk update the following rule attributes:

  • Comment
  • FileVersion
  • LanguageId
  • NewProductName
  • OriginalFileName
  • Path
  • SiteCode

More detail can be found here: https://docs.microsoft.com/en-us/powershell/sccm/configurationmanager/vlatest/set-cmsoftwaremeteringrule

Windows 10 November 2015 Update disconnected me from the internet but there is an easy fix

I updated my home Windows 10 computer with the November 2015 (10586) update this week and when the process was complete I could not connect to the internet. I’m sharing my experience here in case someone else experiences the same issue because there is a very simple fix.

My initial thought was that the NIC drivers were updated but I checked and they hadn’t so my next step was to check Ethernet adapter device properties.

 

Network settings
Click the network icon in the system tray and then select “Network settings”

 

Change adapter settings
Select “Change adapter settings”

 

Ethernet adapter properties
Right click your primary Ethernet adapter and select “Properties”

 

When the Ethernet adapter properties window opened I noticed straight away that there was a problem: I would normally expect to see more than one item ticked for the primary network adapter.

Ethernet Properties
There is a distinct lack of ticks here

 

For internet connectivity you require IPv4 and IPv6, so I enabled these as a minimum and the internet sprang back into life.

Ethernet properties
Ticks are good

 

IPv4 and IPv6
IPv4 and IPv6 are required for internet connectivity.

 

This might not be a problem for all users as I suspect it is related to the multiple virtual network adapters on my computer.

I hope this post helps some home users to a quick resolution to the problem.

Planning the bulk import of applications into Configuration Manager


I love working to a plan; I like to think through the best way to build something before implementation. However, this isn’t always possible, and in some situations it isn’t desirable. I’m not talking about always planning the full life-cycle of a product or project, but at least planning the first iteration, release, milestone or phase; we can of course apply these labels interchangeably.

During my time at university I took the game development module, and my lecturer used to talk about the “fuzzy” period at the beginning of a project: the time when we haven’t fully decided what we want to create. This applies to any project. It is the period of confusion at the beginning where we haven’t fully dissected the problem; we don’t have a plan, we don’t know what tools to use and sometimes we don’t even know where a successful project will bring us. Luckily this last point is not always a problem, because for most projects we know precisely what the required result is; however, you can rarely determine in advance the most appropriate approach or tools for a project without first knowing the problem(s) intimately.

Sometimes people come to me halfway through a half-baked plan facing multiple issues; this is a frustratingly common occurrence. I can handle my own lack of planning, but handling someone else’s means figuring out what has been done so far.

The problem

We were recently approached by a customer faced with a hard deadline enforced by the end of life of ITCM on the 31st of January 2016. They have 3 months to migrate to a new deployment architecture.

They have hired a consultant to sequence all the applications that can be sequenced and the remaining packages will be imported as they are. This work has already commenced but when I asked if they had applied a consistent naming convention I was informed that they had not. Great.

Although this is both encouraging, for the fact that they have done something proactive, and frustrating, because they have implemented it badly or without pre-planning, it is not an insurmountable problem.

The first phase should be an application discovery and rationalisation exercise, in order to avoid wasting effort on applications that have overlapping functionality, are very old versions, or are simply no longer used within the organisation. You would be surprised how many applications can be excluded after this exercise. There are even scenarios where applications costing thousands of pounds in licence fees are simply not required.

 

This next phase is a little speculative, because I am not in possession of all of the facts and circumstances for this specific case, but for a mass migration of applications like this there are tools to facilitate the bulk of the grunt work. For instance you could use InstallShield AdminStudio’s batch packaging tool to create a library of crude MSIs; these MSIs could be imported into a tool like Citrix AppDNA (the customer already uses XenApp) or Dell ChangeBase both of which can automatically generate App-V packages and highlight further issues.

 

However, we were engaged too late in the process to affect any of this so I am just venting.

 

The case against the manual method

This one is pretty easy: an SCCM administrator will spend an average of 3 hours to import and configure each individual application.

Tasks, multiplied by 400
  1. Create the application in the SCCM console
  2. Customise the command line to include a log file (MSI only)
  3. Distribute the content to the distribution points
  4. Create the per application user collection with membership based upon an AD group
  5. Create the deployment to the new collection
Accumulated time
  • 400 applications x 3 hours = 1200 hours
  • or 150 days
  • or 30 weeks
  • or 7.5 months

Aside from the fact that this would be a horrendous experience for the administrator(s), it would be a massive waste of skilled resource.

Inspiration for Automation

David O’Brien has created a wonderful script to import MSI and legacy setup.exe applications based on an XML definition: here.

Deepak Singh Dhami has some useful examples for creating folders and moving applications into them here and another for distributing the content to the distribution point, creating a collection and creating the deployment here and here.

Armed with this inspiration I put together a proof of concept to demonstrate the possibilities. You can see the successful POC script here.

However PowerShell is not the only option. We can use the Configuration Manager SDK and C# to perform many of the same actions. You can see a basic example here and this is also something that we will look at for this mini project.

Preparing for automation

In our environment applications are stored in folders according to vendor name, therefore we require the vendor name for each packaged application.

Given the absence of a naming convention we have a conundrum; I know from experience that we can extract MSI details using WMI queries but it is a little trickier with App-V. We can extract information in the package (using something like this) but the application manufacturer is not stored as a separate entity within the package.

This unfortunately means that the process cannot be fully automated but we can delegate a more menial task; organising the directory structure.

We will define a strict naming structure for the directories that contain the packages; we can then traverse this to extract the relevant information. This also gives us a repeatable formula that can be reused in ongoing or future projects.

Proposed automated solution

As with any early stage planning this could change but we will set out with the following process in mind.


As a pre-requisite, all packages should be organised according to a pre-defined folder structure. A strict naming convention should be applied that clearly separates the vendor name, application name and version number.

  1. Packages scanned for structure and naming integrity.
    1. Report issues and break or continue.
  2. Packages scan creates an XML document detailing application properties ready for import.
  3. The XML is processed, each section will import an application into SCCM;
    1. Create the application.
    2. Move the application into the vendor folder.
    3. Create the deployment type.
    4. Distribute the content to the distribution point(s).
    5. Create the collection.
    6. Deploy the application to the collection.
  4. Report successful and unsuccessful imports.
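As an illustration of step 1, a naming-integrity scan over a hypothetical Vendor_Application_Version folder convention might look like this; the share path and regex are assumptions, not the final design:

```powershell
# Hypothetical layout: each package folder is named Vendor_Application_Version
Get-ChildItem -Path '\\server\packages' -Directory | ForEach-Object {
    if ($_.Name -match '^(?<Vendor>[^_]+)_(?<App>[^_]+)_(?<Version>[\d\.]+)$') {
        # Emit the parsed properties ready for the XML export in step 2
        [pscustomobject]@{
            Vendor      = $Matches['Vendor']
            Application = $Matches['App']
            Version     = $Matches['Version']
        }
    }
    else {
        Write-Warning "Naming convention violation: $($_.Name)"
    }
}
```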

Next steps?

Planning is one thing but plans are rarely static; they evolve as we experiment and then evolve some more during testing, but this is not a reason to omit planning. Building upon solid foundations ensures that the most value is extracted from the exercise. For instance I can already see possible uses for the code that we will create for this project; the import of future applications on an ad hoc basis could be automated giving further cost savings on an ongoing basis. For this reason we should develop the code with this in mind. A further advantage of automating future application imports would be in ensuring consistency in the environment.

Sometimes the first urge is to jump straight in and write the code, but putting in a little forethought is always worthwhile. I always try to write code with reusability in mind; it means that with every piece of code I write my job becomes a little easier, or in case a customer is reading this, I become more efficient.

Next we will develop the script in PowerShell or C#. I will write a followup blog post detailing the decision process and the result. Please let me know your thoughts so far in the comments section or if this post has been helpful for you.

Proof of concept to mass import, organise, distribute and deploy applications in Configuration Manager


In time I will update this post with a breakdown of how it works, but I am posting it in full now to support my blog post on Planning the bulk import of applications into SCCM 2012 R2.

It is heavily influenced by the following three articles:

This script is not intended for production use but everything works as intended.

Here is the XML file that I used for testing:

 

Introducing the Mark Allen IT – End User Computing blog


The reasons

I have been working in IT for 13 years now and have always been ready to assist my colleagues when necessary; however, I feel that something is missing. Helping a colleague resolve an issue they are struggling with is a satisfying experience, but I can do more. I can’t count the number of times I have found solutions to problems that are not documented on the internet, and every time I reach the solution I immediately jump into the next problem.

My work history

 


I am also motivated by selfish reasons, because things are rarely completely altruistic: I never take the time to really reflect on what I have done. I do think things over and realise how I could have completed a task in a better way, but it is not as thorough as the examination you need to give your work when publishing your methods to the world. Continuing on the selfish theme, I also feel that if my profile were raised I would get the opportunity to work on some more interesting problems.

 

 

The topics

So what do I do? I am a man of contradictions in this respect.


 

My bread and butter, my day job, involves proprietary enterprise software, mainly from Microsoft: App-V, SCCM, Windows Installer, Windows desktop management, PowerShell, Citrix XenApp & XenDesktop, VMware ThinApp and more.


 

 

Outside of work I love OpenSource solutions and the community. I have done an extensive amount of work in PHP and MySQL; I created my own custom CMSs in pure PHP back in 2002, and from about 2007 I used frameworks like CodeIgniter and CakePHP. This list is not exhaustive but forms the core of my experience.

 

 

 

The target audience

I’m hoping to connect with like-minded individuals across enterprise IT, the OpenSource community and freelance website developers; please share your thoughts and let me know if there are any topics you would like me to cover.

The aims


I want to share my experiences but also to perform more structured analysis of my own work; I want to scrutinise my own work before allowing others to do the same. I’m also hoping to contribute more to the OpenSource community; I’ve been a member of GitHub for 5 years and I have zero public commits. I aim to rectify this and document it here.

So this is my mission statement; I hope you enjoy my blog.