Defender for Endpoint Upgrade Script – FOR ALL!

Working with a customer on the MDE Unified Installer for Windows Server 2016/2012 R2, we ran into the issue that SCEP was installed and thus blocking the Unified Installer. Therefore, instead of the Install approach we really needed to perform an Upgrade, but would that mean we needed one approach for servers where SCEP had been installed vs. servers where SCEP was not installed? Answer: No!

MDE Unified Installer Upgrade Script

Microsoft has already published the Unified Installer Upgrade Script, which allows organizations to move from the SCEP + MMA MDE approach to using the Unified Installer (which includes a number of extra capabilities). However, the documentation is a bit vague about the necessary configuration of scripts, installers, etc., and about whether the script is only useful for upgrading, so I’ll cover that below.

What does the script do?

The upgrade script takes a few actions, starting with removing the OMS Workspace and Workspace ID (Lines 220-236 of the script)…assuming you provide the RemoveMMA parameter. If you don’t, no change will occur with MMA, so you could in theory end up reporting twice about the device (Note: I have not tested this scenario, as I think you should remove the OMS information from MMA when moving to the Unified Agent).
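For context, the MMA agent exposes a COM configuration object, and workspace removal amounts to detaching the agent from the workspace ID. A minimal sketch of that idea (the workspace GUID below is a placeholder, not a real value):

```powershell
# Sketch of removing an OMS/Log Analytics workspace from the MMA agent,
# similar in spirit to what the upgrade script does with -RemoveMMA.
# The workspace ID is a placeholder - substitute your own workspace GUID.
$workspaceId = '00000000-0000-0000-0000-000000000000'

$mma = New-Object -ComObject 'AgentConfigManager.MgmtSvcCfg'
$mma.RemoveCloudWorkspace($workspaceId)   # detach the agent from the workspace
$mma.ReloadConfiguration()                # apply the change without a reboot
```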

Next, the script checks the registry to determine if the SCEP client was installed (Lines 253-267) and performs an Uninstall of SCEP. On line 257 the script assumes that the installer/uninstaller for SCEP is located in the standard Program Files path, so a custom install path for SCEP may cause issues (Note: I have not tested to verify this).
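If you want to know ahead of time which servers will hit the SCEP uninstall path, a quick test against the standard install location (the same assumption the script makes) might look like this:

```powershell
# Check whether the SCEP client is present in the default install path,
# which is the location the upgrade script assumes (Line 257).
$scepSetup = Join-Path $env:ProgramFiles 'Microsoft Security Client\Setup.exe'

if (Test-Path $scepSetup) {
    Write-Output "SCEP found at $scepSetup; the upgrade script will uninstall it."
} else {
    Write-Output 'SCEP not found in the default path; the uninstall step will be skipped.'
}
```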

For Server 2012 R2 instances the script ensures that two hotfixes (KB2999226 and KB3080149) have been applied and, if not, applies them (Lines 269-327).

Now that the server is ready to have the unified agent installed, the script executes a quiet MSI (no UI) install of the agent.

Finally, if the OnboardingScript parameter is provided, the upgrade script executes the onboarding script (.cmd file) used in a standard Windows 10/Server 2019 Onboarding GPO, and the device onboards to MDE.

Considerations for using the Upgrade Script

Like the onboarding script used by Windows 10, 11, and Server 2019 the upgrade script (install.ps1) needs to be in a location where all of the machines that will use it can read it. I recommend following the same guidance for the upgrade script as outlined here in Step 2 that is provided for the Onboarding Script. I would also recommend you consider storing the upgrade script in the same location as the onboarding script.

As detailed in the above section there are several parameters (RemoveMMA, OnboardingScript) that control how the upgrade script executes, but one important consideration was overlooked: the location of the Unified Agent’s MSI file. Currently, the Unified Agent’s MSI is assumed to be stored in the same location ($PSScriptRoot) as the upgrade script (ref Lines 99-105). Therefore, when you are setting up your shares and file locations, be sure to place the md4ws.msi in the SAME folder as the install.ps1 script!

If you have any servers that have a configured MMA agent, include the RemoveMMA parameter to ensure that MMA and the Unified Agent are not trying to report the same/similar information to MDE. If the server is not running MMA, or is not reporting to the workspace, the script will detect this and skip removing the workspace. If the MMA agent is reporting to OMS and MDE, only the MDE workspace will be removed.

Use the OnboardingScript parameter! Although you could chain the upgrade script with the onboarding script, I don’t see a driving value in doing so. Using the OnboardingScript parameter causes the immediate onboarding of the device, so you don’t need to worry about applying multiple GPOs or chaining GPO tasks; the script handles the right actions at the right time.

Finally, because you are running a PowerShell script, be mindful of any execution policy that may be set on your servers. Although the script is signed, when I did initial testing I found that my execution policy was too restrictive to allow the script to run successfully.
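A quick way to review what you are working with before deploying broadly (a sketch; if your policy is enforced by GPO, Set-ExecutionPolicy will not override it):

```powershell
# Review the effective execution policy and where each scope gets its value.
Get-ExecutionPolicy -List

# If the machine policy allows it, permit signed scripts like install.ps1:
Set-ExecutionPolicy -ExecutionPolicy RemoteSigned -Scope LocalMachine

# Alternatively, bypass the policy for a single invocation (as in the GPO task):
powershell.exe -ExecutionPolicy Bypass -File \\sharelocation\install.ps1
```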

Example GPO

I created a GPO exactly like the directions for Windows 10/11 or Server 2019 for use with the upgrade script (Immediate Task, Runs as System, Run with highest privileges, etc.).

For the Task itself the command I used was:

Program/Script: PowerShell
Arguments: -ExecutionPolicy Bypass \\sharelocation\install.ps1 -OnboardingScript \\sharelocation\WindowsDefenderATPOnboardingScript.cmd -RemoveMMA MMA-MDE-Workspace-Guid

You can refer to my previous post about using a WMI Filter to target deployment to only Server 2012R2 and 2016 instances if your servers aren’t segmented into different OUs.

Defender for Endpoint Unified Package for Server 2016 and 2012 R2

Recently Microsoft announced the public preview of a unified EPP and EDR package that allows a similar onboarding approach for these servers as Server 2019, Windows 10, and Windows 11. A customer I support wanted to test this new method and perform deployment using the GPO methodology.

The documentation for how to set up and configure the GPO is available here and provides a great step-by-step guide. However, the guide only addresses linking the GPO to an OU, and for many customers having an OU per server version isn’t likely. This customer’s servers were grouped into a couple of OUs, but not by OS version, so we needed to find a WMI query that would target the correct set of machines.

Below is the WMI Filter for Server 2016 and 2012R2 that I was able to derive using resources listed below. I don’t claim this is perfect, but hopefully it is a good starting point for others.

Select * from Win32_OperatingSystem Where (Version like "10.0.14%" or Version like "6.3.96%") and ProductType="3"

Useful Resources

Wikipedia has a fantastic Windows operating system list that covers both user and server OSes. The Version Number column makes up the first two segments of the WMI operating system’s Version value, and the Latest Build column makes up the final, third, segment. However, when you get to the Windows 10 core OSes (Server 2016 and higher), only build numbers are listed in the Version Number column. You should refer to the WMI object that is returned by your machine; in this scenario Server 2016 reports a Version starting with 10.0, while Server 2012 R2 reports a Version starting with 6.3, which is exactly what the filter above matches.

Get-WmiObject was a key PowerShell command because it allowed for testing parts of the WMI filter on the machines. In this scenario, because we were working from Windows versions, the WMI object we needed was Win32_OperatingSystem, so the following command allowed for a quick review of the WMI object:

Get-WmiObject Win32_OperatingSystem

Adding the -Filter parameter allows for testing of the Where portion of the WMI Filter. If the filter matches the current machine then the WMI Object is returned, and if the filter fails to match then a Null result is returned.

Get-WmiObject Win32_OperatingSystem -Filter "(Version like '10.0.14%' or Version like '6.3.96%') and ProductType='3'"
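To see exactly which values your machines will present to the filter, it also helps to project just the relevant properties:

```powershell
# Show only the fields the WMI filter evaluates; ProductType 3 means a
# domain member server (1 = workstation, 2 = domain controller).
Get-WmiObject Win32_OperatingSystem |
    Select-Object Caption, Version, ProductType
```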

Finally, using the WMI Filter documentation to target End User OS vs. Server OS vs. AD Servers allowed us to avoid the overlap with End User OS’s and avoid automatic deployment on Domain Controllers.

Automate Accounts for Azure AD

Azure AD’s B2B capability is a really powerful way to leverage identities from outside of an organization, but is it the right solution for seasonal, temporary, or whitelisted employees?  Maybe, maybe not, and if not then the creation of cloud-only accounts may require a time-consuming (possibly manual) request > approval > provision process.

Recently I had a customer ask how we could automate an account provisioning process that allows for a request, an approval workflow, automated account provisioning, association of the account with a ‘manager’, automated actions if the ‘manager’ departed, and time-boxing of the account.  In order to minimize development and use as much out-of-the-box functionality as I could, I turned to Flow.

Start with SharePoint

So this is the benefit of experience: I actually started with Flow and discovered the template for Flow, SharePoint, and Azure AD.  Because I started with Flow I didn’t think about what data I wanted to capture first; I just wanted to get accounts creating and would add fields as I needed them.  This led to some issues, probably because I’m impatient, between adding fields and having them available in Flow.  Therefore, I recommend YOU think about the information you need to capture from a user, build your SharePoint list, and then proceed.

I decided that I would create a new site for tracking requests and host my request list in this location.  In a real-world environment this would allow an organization to have a single account request location, which I viewed as valuable.

I created a list as shown below (Title will be used as the last name)

SP List

All fields are Single Line of Text except for Review Status which is a Choice field with Pending, Approved, Rejected as the options with Pending as the Default value.

Create your workflow with Flow

I am by no means a Flow expert, and thanks to this demo I learned a little bit, but I really needed a simple place to start.  Fortunately, if you go to Flow, select Templates, and search for Azure AD, the second template is Create Azure AD User from SharePoint List.

Flow Templates

Once the flow is generated you need to update the first action with your SharePoint site Url and list name.

Flow Item Created

You can skip the second action as this will generate a random password for the account.

Next, you need to update the Create User step based on the fields you created in your list.  You can also use Expressions to customize the values you want to use when creating the user.  For example I use the following to create a username:

concat(triggerbody()['FirstName'], '.', triggerbody()['Title'], '')

Flow Create User

You will also notice that I’ve clicked on Show advanced options and updated the Business Phone, Department, Job Title, Mobile Phone, Office Location, and Preferred Language.

Account creation will fail if Preferred Language does not meet the expected format (such as en-US).  Business Phone can be an empty array, but cannot accept a null value.
e.g. [] – OK
[null] – failure

Next, update the Update item action to set the current item’s ReviewStatus value to Approved.  You will also notice the IsComplete field with a value of true; this field needs to be added to your SharePoint list or else the Update item action will fail.

Flow Update Item

Finally, update the Send an email action to utilize the values captured from the list.

Flow Send Email

Now you should be able to test your Flow by creating an item in the SharePoint list and observing the execution of your flow; if there are errors you can perform troubleshooting and resubmit.

Flow Runs

Add the Review

Now that the creation process is working update the flow to include the actual review phase and condition handling. Add the Start and wait for an approval (v2) action to your flow AFTER the Initialize variable step and configure it as shown.

The Initialize Variable cannot happen within the Condition portion of the workflow, so you may as well initialize this immediately after the flow starts.

Flow Wait for Approval

Next, add a Condition action to your flow.  Update the Condition so that the Outcome of the Start and wait for an approval action is equal to ‘Approve’.

Flow Condition

Finally, move (yes drag and drop does work) the Create User, Update Item, and Send an email actions into the If yes segment of the workflow. You should also add a Send an email to the If no segment of the workflow and send the user a notification that their request has been rejected.

Flow Condition Branches

I recommend testing again to make sure your approval process works as expected, and be sure to test both the Approve and Reject paths.

Collect Requests with Forms

Now that our flow works we need to set up a way for people to submit requests to be reviewed and approved/rejected.  Microsoft Forms is a simple way to create the request form you need and allow it to be shared outside of your organization.

Creating a Form is really easy so I won’t provide the full details, but create a new Form that captures the same information that the SharePoint list stores.  Don’t include the workflow type fields like Approval status and IsComplete field of course.  Here is an example of the Form I created.

Form Example

As you can see I provided friendly names for each of the user input fields and marked everything as required.

Now you need to allow this Form to be accessed by anyone with the link.  To do this, click the Share button in the upper right of the browser window and select Anyone with the link can respond.  This will allow you to copy the URL and send it to any external participants.

Form Share

Tie this all together

The final part is to pull our Form submission into our SharePoint list, and again we go back to Flow for this and use an existing Template.

Form Flow

After creating the new Flow from the Template you need to customize the When a new response is submitted Action and select the form you just created.

Form Flow New Response

In the Apply to each action update the Get response details and select the form you created.

Form Get response details

Finally, update the Create item by selecting the Site Address and List Name, then expand the Advanced Options so that all the fields from your list display.

Form Create Item

Save your flow, and go test your solution from Flow to Account Creation.

Wrapping Up

You should now be able to share your Form with people outside of your organization, have them submit the form, record the entry in SharePoint, have the approval process kick off, and have the account creation performed.

There are lots of Flow templates, and clearly the approval process doesn’t specifically require SharePoint to store the item, so there are probably hundreds of ways to approach this problem.  However, I like this method because I can see the data move from Forms to SharePoint to Azure AD, and creating tracking and reporting solutions is easy.

Incorporate Azure AD with your Angular App

I began my career as a software developer and I still love the opportunity to tinker with code from time to time.  Since I usually deal with authentication and identity, I occasionally need to demonstrate how customers can add their own custom applications to Azure AD and how the protections can be applied.  So, I spent a few days recently building and testing my own single-page custom application based on the latest version of Angular (TypeScript).

While I could detail what I did to get the project working, it is probably easier to provide the various links I used to learn Angular as well as the libraries I used and added to get the project working.

Getting Going

Since I had ZERO experience with TypeScript and the latest TS Angular, I started with the Tour of Heroes tutorial.

Second, I was able to find the MSAL Angular library available here on GitHub.  I recommend going there so you can read the friendly documentation, but use ‘npm install @azure/msal-angular’ to add it to your development project.

Third, I followed these directions to register my application in Azure AD.

Finally, I used the sample application found here to make my application.  This is where I found the most trouble so below I’ll focus on some of the issues I had.

Issues I Had

The first issue I ran into was that every time I logged in I would get an error about lacking some API permissions.  Searching for the error didn’t provide really relevant information, so I started to eliminate as much as I could.  What I discovered was that in the LoginPopup call, the sample code I copied and pasted into my app included ‘api://a88bb933-319c-41b5-9f04-eff36d985612/access_as_user’, which is unnecessary for login and user queries, so I removed it.

The second issue I ran into was that MsalService.getAllUsers() only returned my local user’s information, which is actually documented, but I wanted more than that.  Instead I had to call directly against the Graph services to get that information; you can find my solution here.

My App

If you are interested, here is the app I created.  Yes, there are still some issues which I’m working on, but it may be an easier starting point for others.


The Identity, Stupid!

James Carville’s strategy for Bill Clinton’s ’92 campaign was “The economy, stupid!” These three words left no doubt about what was important, what to focus on, and the fact that getting the economy right would make everything else possible.  Today, as we look at changes to the corporate IT network and infrastructure, we should adopt a similar slogan:

The Identity, stupid!

Identity is a core enabler of modern solutions, be it Collaboration, Security, or everything from IaaS to PaaS.  Companies in the past could rely on physical controls to secure information, but today the cloud and the interconnectedness of businesses have destroyed those controls. So where does this leave us in a world where we don’t control where information is accessed from, by what devices, or where the information is stored? We are left with one truth, unique to everyone and applicable to devices and data: Identity.

The funny thing is, we’ve known identity is important for a long time. If you ever took a class on journalism, the first thing they taught was the mantra “Who, What, When, Where, Why.” When you log into your computer today, the first thing it asks is: Who are you? When you go to buy a car, boat, or house, you have to tell them who you are.  You even have to tell the barista at Starbucks who you are!

Identity is important, so protect it!

Identity is the control mechanism for enabling technology today; if you secure the identity you’ve gone a long way toward securing your systems and your data.  Here are some methods to improve your organization’s identity strength without hampering users’ ability to do work.

Update your password policy

Recently even NIST updated their password guidance to reduce artificial complexity rules, change passwords only when compromise is suspected, and check against ‘dirty words’ and previously compromised passwords. At Microsoft the use of ‘Seattle’ and ‘Seahawks’ is rumored to be banned (I wouldn’t know, because I don’t live in Seattle and I’m not a Seahawks fan).

Beyond these recommendations think Passphrase not Password.  The longer the password the more difficult it is to guess so brute force and dictionary attacks are less likely to be successful.

All of these policies are easy to implement: prohibited words/phrases, detection of compromised passwords, password length controls, and even self-service password reset are built into Azure Active Directory.  Azure Active Directory can become the central hub for password management, with the ability to synchronize changes to your on-premises systems.

Enable MFA

I wrote about this in another post, but seriously: if you have any admin accounts that don’t have MFA enabled, stop reading this and GO TURN IT ON NOW!

MFA is one of the simplest solutions for interrupting account compromises, and it has become more familiar to users because it appears in banking apps and commercial email, and even Facebook recommends your account be protected with MFA.  At Microsoft we see it decrease account compromises by over 99%.  Clearly, this is the first step in enhancing the security of your identities.  It is already included in O365 E3 or Azure AD Premium licenses, and enabling it is just a few checkboxes, so there really is NO EXCUSE!

Use data

Monitoring accounts is critical, but there is a lot of information about what is happening in the world, like dark web sales of credentials, that may not show up in your organization’s own account monitoring. However, a service like Azure Active Directory, which handles millions of user accounts daily, gains insight not only about your accounts but about all accounts, so when an attack is detected anywhere, everyone can benefit from the awareness and the steps taken to block that type of attack.

Use AI and ML

Along with information about what is happening globally around authentications, it is also important to understand what is ‘normal’ and what is ‘abnormal’ for your users.  If users sign in Monday through Friday between 9am and 5pm for 15 years, then your identity system should recognize that a sign-in on Saturday at 2:30am is abnormal.  In this scenario the system may require extra identity validation (MFA), block the login attempt, or alert your other security monitoring tools and personnel.  This capability is part of Azure Active Directory Conditional Access, which natively learns user behavior patterns and can dynamically adapt the authentication experience.

Change Written Policy to Automated Action

If you want to protect identities (really, if you want to protect anything these days), then you need to take written policies and automate them in your identity system.  A written policy like “If a password is compromised, require the user to change it” requires a user to be notified and then to take action.  Instead, your identity tools should be able to detect the credential compromise and require a password reset (with MFA validation) on the next login attempt.  In Azure Active Directory this can be done with Identity Protection policies: if a user’s authentication event appears risky, flag the account for a password reset.

Lose the Password

I mention this one last because a zero-password world isn’t quite there for everyone, but we are close.  With Windows Hello and the Microsoft Authenticator app we are moving closer and closer.  Personally I don’t have a password for any of my Microsoft consumer accounts (Hotmail, OneDrive, etc.), and I very seldom use a password when accessing my Microsoft corporate resources.  Actually, on the rare occasion I am prompted for a password I usually have to perform a self-service password reset, because I honestly don’t remember it.

Azure Active Directory has added this ability, but it is currently in preview (maybe even private preview), so customers have to opt in to enable the capability.  This is coming, though, and I predict that by the end of 2019 this capability will be readily and easily available to customers.

The Identity, Stupid!

It is time for us to focus on what is most important to the success of modern IT, both for usability and security, and it is all about Who!  Like the ’90s campaign, use this motto/mantra/whatever you want to call it to help you focus on The Identity, Stupid!  If you get identity right, you can make everything else happen.

Azure AD MFA managed by User Account Administrator Role

Many organizations want to delegate enabling and disabling MFA for a user to their helpdesk, but the only RBAC role that allows per-user MFA management is Global Administrator, and no one wants to grant helpdesk technicians Global Admin access to their tenant.  However, there is a way around this RBAC limitation if your organization has Azure AD Premium.

General Concept

At a high level enabling and disabling MFA will be managed by adding and removing users from a security group.  The security group will be included in a Conditional Access policy which defines the MFA requirements.



For this approach you will need:

  1. An admin with the Conditional Access Administrator role
  2. Helpdesk user(s) with the User Administrator role assigned


Have a Helpdesk user create a security group in Azure Active Directory and assign the users your organization wants to require MFA for when accessing applications.  Make sure to give it a descriptive name like MFA Required Users.


Next, have the Conditional Access admin create a new Conditional Access rule with the Assignments target set to the group created by the Helpdesk user.


Next, select the Cloud apps you want to require MFA before allowing access, or select All Cloud Apps.


Next, choose the option to Grant Access and check Require multi-factor authentication.


Finally, Enable the policy and choose Create.



Now, when the Helpdesk (someone with the User Administrator role) needs to enable or disable MFA for a user, all they need to do is add (enable MFA) or remove (disable MFA) the user from your MFA security group.
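If your helpdesk prefers scripting over the portal, the same add/remove operation can be done with the AzureAD PowerShell module. A sketch, where the group and user names are examples:

```powershell
# Sketch using the AzureAD PowerShell module; group and user names are examples.
Connect-AzureAD

$group = Get-AzureADGroup -SearchString 'MFA Required Users'
$user  = Get-AzureADUser -ObjectId 'jane.doe@contoso.com'

# Enable MFA for the user by adding them to the CA-targeted group...
Add-AzureADGroupMember -ObjectId $group.ObjectId -RefObjectId $user.ObjectId

# ...and disable it again by removing them.
Remove-AzureADGroupMember -ObjectId $group.ObjectId -MemberId $user.ObjectId
```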

O365 MFA vs Azure AD MFA

As a Technical Solutions Professional at Microsoft who covers Identity and Security, I get a lot of questions about Office 365 MFA vs. Azure Active Directory MFA: the differences, the benefits, and what I suggest.  Customers always assume that because I concentrate on the EMS stack Microsoft offers (Intune, Azure AD, Azure Information Protection) I recommend Azure AD MFA over Office 365 MFA, but the reality is that when customers really compare the experiences they will almost always go with Azure AD MFA.

Before we talk about Office 365 vs Azure AD MFA let me make this position perfectly clear.

Use MFA! If you are not using, or haven’t implemented, MFA, stop reading and GO TURN IT ON, especially for your administrator accounts.

Why?  We, Microsoft, find that by enabling MFA on your accounts, your organization will reduce account compromise by OVER 99%!

Office 365 MFA

Office 365 E3, and up, subscriptions entitle an organization to enable multi-factor authentication for their users who will be accessing O365 resources (SharePoint, OneDrive, Office Pro Plus, etc.).  When a user is entitled and enabled to use MFA, they have three (3) options:

  1. Azure Authenticator App
  2. Text Message
  3. Phone Call + PIN

To enable Office 365 MFA you must turn the feature on for each user individually (user-by-user), and once MFA is required for a user, it is always required.  Therefore, when a user is authenticating to O365 resources from their work computer or home computer, using Office or a browser, they will be prompted for MFA verification.

Azure AD MFA

Azure AD MFA is available to organizations that purchase Azure AD Premium P1, or P2, licenses for their users, and this multi-factor authentication solution can be used with Office 365, Azure, on-premises applications, third-party (SaaS) applications, and custom-built line-of-business applications.  Like the O365 MFA offering, Azure AD MFA provides three (3) ‘native’ options:

  1. Azure Authenticator App
  2. Text Message
  3. Phone Call + PIN

Azure AD also offers customers the ability to use third-party MFA providers, including the following:

  1. RSA
  2. DUO
  3. Trusona
  4. (More to come)

This integration with third-party MFA providers means that any existing investment in MFA can continue to be leveraged, and MFA can be supported even in locations where mobile or office phone access is limited or prohibited.

The way an organization applies MFA with Azure AD is also different from Office 365.  With Azure AD, an organization applies MFA by creating Conditional Access (CA) rules.  CA rules for MFA can be very simple:

All Users + All Apps + MFA = Grant Access

Basically this is what the Office 365 MFA solution provides, though limited to O365 apps.  However, CA can do much more; it can actually allow you to address questions and policies intelligently:

  • Why prompt for MFA when a user is connecting from a corporate network and is using a corporate device?
  • Why prompt for MFA for a user connecting to their time card the same way you would if they were connecting to the corporate accounting line-of-business application?
  • Why MFA everyone all the time when we can target specific users when they are accessing sensitive information?

Using CA to drive MFA also allows your organization to integrate MFA easily with Windows Always On VPN solutions.  Now not only do you protect a user when their app connects to a service, but you also protect your corporate network when an endpoint device connects, and it’s all managed with the same CA, MFA, and identities.

What drives Azure AD MFA over Office 365 MFA

I find most organizations choose Azure AD MFA over Office 365 MFA for one of two reasons:

  1. They already invested in an MFA solution, maybe RSA, so the users know it, IT trusts it, and they can continue to use it.
  2. They don’t have to use an all-or-nothing approach; they can apply a who-what-when-where approach to their MFA policy and only require MFA when necessary.

To me, the greatest benefit of Azure AD MFA is the ability to target MFA scenarios.  I’ve seen many customers push MFA for everyone all the time, and within a short period of time they turn it off because “there was too much prompting.”

Azure – PowerShell Capabilities I Love

I use Azure for development and testing very heavily in my job as a consultant for Microsoft.  Since most of my work is done deploying systems on-premises, I usually have to build environments for testing deployment scripts and the like.  This means I have the option to go through the Azure Portal and create machine after machine, or I can use PowerShell to script these processes.  As such, I have gone through many of the IaaS PowerShell commands and thought I would share some of my commonly used commands.

IaaS Commands I Always Use


So, you create a VM and now you want to configure it before you actually log in, like making it a domain controller or joining it to a domain.  No problem: Set-AzureRmVMCustomScriptExtension allows you to push and run a script file on the Azure VM without needing to log in, and you can even pass arguments to the script. This command does require a bit of information (resource group name, storage account name, container, and others), but being able to create a VM AND set it up as a domain controller without ever logging in first…you can’t beat that.
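As a sketch (the resource names and script are illustrative examples), pushing a script from blob storage onto a VM looks something like this:

```powershell
# Run a script stored in blob storage on the VM, no login required.
# Resource names and the script file are illustrative examples.
Set-AzureRmVMCustomScriptExtension `
    -ResourceGroupName 'LabRG' `
    -VMName 'DC01' `
    -Location 'East US' `
    -Name 'SetupDC' `
    -StorageAccountName 'labscripts' `
    -ContainerName 'scripts' `
    -FileName 'New-DomainController.ps1' `
    -Run 'New-DomainController.ps1' `
    -Argument 'contoso.local'
```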


This command (Set-AzureStorageBlobContent) is a MUST KNOW because it allows you to move content from your local machine to an Azure Storage blob, and if you want to use Set-AzureRmVMCustomScriptExtension, your scripts have to be in an Azure Storage blob.  This command is actually pretty straightforward: give it the filename (blob), container, storage account context, and the local file path, and upload away.
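A sketch of the upload, where the account and container names are examples and the storage key is assumed to have been retrieved already:

```powershell
# Upload a local script to a blob so the custom script extension can reach it.
# Account/container names are examples; $storageKey is retrieved separately.
$ctx = New-AzureStorageContext -StorageAccountName 'labscripts' `
                               -StorageAccountKey $storageKey

Set-AzureStorageBlobContent -File 'C:\Scripts\New-DomainController.ps1' `
                            -Container 'scripts' `
                            -Blob 'New-DomainController.ps1' `
                            -Context $ctx
```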


Every time I create a new “environment” I create a new resource group, partly because I’m lazy, but also because I’m really picky.  I don’t like having 2, 3, 4, … environments inside of one resource group, because when I script things I really just want to say something like “Start my resource group xyz” and let the script handle the rest.  Also, when I’m done with an environment I can easily clean it up by using Remove-AzureRmResourceGroup, and poof, it’s gone.
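The one-environment-per-resource-group habit makes the lifecycle commands trivial. A sketch (the group name is an example):

```powershell
# Start every VM in an environment's resource group...
Get-AzureRmVM -ResourceGroupName 'Env-xyz' |
    ForEach-Object { Start-AzureRmVM -ResourceGroupName $_.ResourceGroupName -Name $_.Name }

# ...and tear down the whole environment when finished.
Remove-AzureRmResourceGroup -Name 'Env-xyz' -Force
```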


Need a new VM? Here you go.  This command isn’t as straightforward as it seems: to use New-AzureRmVm you must create the VM configuration object and all the necessary elements, but putting this inside a simple ForEach-Object loop can save you hours of entering information into the Azure Portal forms.
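A condensed sketch of the config-object dance inside a loop (VM names, sizes, and the NIC lookup are illustrative, and the network resources are assumed to exist already):

```powershell
# Create several similarly configured VMs; names/sizes are examples, and the
# NIC lookup assumes pre-created network interfaces named <VMName>-nic.
$cred = Get-Credential -Message 'Local admin account for the new VMs'

'APP01', 'APP02', 'SQL01' | ForEach-Object {
    $nic = Get-AzureRmNetworkInterface -ResourceGroupName 'Env-xyz' -Name "$_-nic"

    $vm = New-AzureRmVMConfig -VMName $_ -VMSize 'Standard_D2_v3'
    $vm = Set-AzureRmVMOperatingSystem -VM $vm -Windows -ComputerName $_ -Credential $cred
    $vm = Set-AzureRmVMSourceImage -VM $vm -PublisherName 'MicrosoftWindowsServer' `
              -Offer 'WindowsServer' -Skus '2016-Datacenter' -Version 'latest'
    $vm = Add-AzureRmVMNetworkInterface -VM $vm -Id $nic.Id

    New-AzureRmVM -ResourceGroupName 'Env-xyz' -Location 'East US' -VM $vm
}
```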

Runbooks – Stay under that spending limit

Azure Runbooks are one of my favorite capabilities available.  First, the interface is web based, so you can write and test your PowerShell directly in the Azure Portal, which is a really nice capability.  Second, you can schedule these runbooks to run, so if you forget to shut down an environment, the scheduler will do it for you.  Third, if there was a problem, your output from each run is available for review, so you can always go back, review the runbook output, and check the script health.  Finally, runbooks have access to variables stored outside of the runbook, so there is no need to include the admin account’s info in your PowerShell script; just save it in the runbook’s variables (as a Credential, so the password is stored securely) and write nice generic runbooks.

I highly recommend using runbooks to stop at least your development, and possibly test, environments on a daily basis.  My Stop-Daily runbook is configured to run every day at 6PM, so I know all of my VMs will be shut down.  I typically keep my runbook(s) in a separate resource group from the different development/test environments I create; this way I can destroy an environment without losing the runbooks.
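A minimal Stop-Daily runbook might look like the following sketch (the credential asset name and resource group names are examples):

```powershell
# Runbook sketch: authenticate with a stored credential asset, then stop
# every VM in the listed environment resource groups. Names are examples.
$cred = Get-AutomationPSCredential -Name 'AzureAutomationAccount'
Add-AzureRmAccount -Credential $cred

foreach ($rg in @('Dev-Env', 'Test-Env')) {
    Get-AzureRmVM -ResourceGroupName $rg |
        ForEach-Object { Stop-AzureRmVM -ResourceGroupName $rg -Name $_.Name -Force }
}
```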

Runbooks work within a single subscription, so if you have multiple subscriptions you will need to create runbooks for each.