Monday, March 04, 2024

Pure Capsaicin at Spiceworks!

As many readers know, I've regularly contributed to the Spiceworks online community (www.spiceworks.com/community) for a long time. As well as contributing, I've been a moderator for several forums, including PowerShell (obviously), Windows, Virtualisation, Azure, and of course DNS. I first joined in August of 2012 and found it was fun to contribute (and a great place to help IT Pros get the most out of PowerShell). I've been contributing ever since!

Contributors get points for doing things within the community, such as posting a new question (10 points), authoring articles that appear on the Community home page (50), and providing a best answer to a query (50). I'd point you to the pages that explain the scheme, but in the recent upgrade that has just taken place (as I write this article) those pages were lost, and the references inside Google no longer resolve. Hopefully, that will change.

To show a contributor's level of contribution, Spiceworks uses a level system. The level system is based on different peppers of increasing spiciness (per the Scoville Scale) - the more points, the higher your level of spiciness. New users (aka Spiceheads) start out at the Pimento level (0-100 points), quickly ascending to Sonora (100-250), Anaheim (250-500), and 12 more levels to reach the pinnacle, Pure Capsaicin (250,000 points), also known as PC. I'm told that the PC level was originally a theoretical level that no one was expected to ever reach. WRONG!

Reaching the PC level requires a significant contribution over a long time. I was looking at the numbers, and I joined the site just over 4200 days ago, averaging just under 60 points per day over that time period. I have certainly enjoyed the ride so far. I've met some awesome people, had fun helping others and learned a bit along the way. I confess I've also used the forums on a few occasions, to help me troubleshoot issues I encountered in real life.  
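
The back-of-the-envelope arithmetic behind that claim (using round numbers, so only approximate):

```powershell
# Roughly 250,000 points earned over roughly 4,200 days
$Points = 250000
$Days   = 4200
$PerDay = [math]::Round($Points / $Days, 1)
"$PerDay points per day"   # just under 60
```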

What a long strange trip it's been.


Friday, February 24, 2023

Having Fun with Get-Content

Most IT Pros know and use the Get-Content cmdlet to get the contents of a file into a variable/array. You can then process the array to do useful things. I use this cmdlet a lot - in my last book (which has 111 total scripts), I used the cmdlet in over 15% of the scripts.

Most usage cases involve relatively small files, but sometimes the files can get quite big. And when you import large files, you find that Get-Content is slow. There are a couple of reasons for this, as this blog post shows. One important reason is that Get-Content uses a PowerShell provider - but there is more!

As it turns out, you can't use Get-Content with the Registry, Certificate, or WSMan providers. And for all but the File System provider, the cmdlet does not return much of value (or information you could not easily get another way). I would argue almost all usage of the cmdlet is based on the file system provider.

As an alternative to using Get-Content, you can use the .NET IO.File class and invoke its ReadAllLines() method.

To test this out, I downloaded a large text file (War and Peace), then tested the two methods of retrieving the text. Here is what I see:

PS C:\Foo>  # 1. Get War and Peace
PS C:\Foo> $URI = 'http://textfiles.com/etext/FICTION/warpeace.txt'
PS C:\Foo> $WAP = Invoke-WebRequest -URI $URI
PS C:\Foo> $Outfile = '.\WarAndPeace.txt'
PS C:\Foo> $WAP.Content |  Out-File -Path $OutFile
PS C:\Foo> Get-ChildItem  -Path $Outfile

    Directory: C:\Foo

Mode                 LastWriteTime         Length Name
----                 -------------         ------ ----
-a---          24/02/2023    15:30        4434672 WarAndPeace.txt


PS C:\Foo> # 2. Get the contents into a variable using Get-Content
PS C:\Foo> $M1 = Measure-Command -Expression {
             $File1 = Get-Content $Outfile
           }
PS C:\Foo> "Using Get-Content took {0:n2} milliseconds" -f $M1.TotalMilliSeconds
Using Get-Content took 663.80 milliseconds
PS C:\Foo>
PS C:\Foo> # 3. Now with .NET
PS C:\Foo> $M2 = Measure-Command -Expression {
             $File2 =  [IO.File]::ReadAllLines($Outfile)
           }
PS C:\Foo> "Using Native .net {0:n2} milliseconds" -f $M2.TotalMilliSeconds
Using Native .net 91.14 milliseconds

As you can see, using the native method is a lot faster (over seven times faster in this run). But why is this?

Well, the first reason is that using a provider is just slower. But another reason Get-Content is so much slower is that it adds several properties to every line returned. You can see this as follows:

PS C:\Foo> # 4. look at output types
PS C:\Foo> "Get-Content produces a $($File1.GetType().FullName) object"
Get-Content produces a System.Object[] object
PS C:\Foo> ".NET produces a $($File2.GetType().Fullname) object"
.NET produces a System.String[] object
PS C:\Foo> 
PS C:\Foo> # 5. And look at what Get-Content does for us:
PS C:\Foo> $File1 | Get-Member -MemberType Properties

   TypeName: System.String

Name         MemberType   Definition
----         ----------   ----------
PSChildName  NoteProperty string PSChildName=WarAndPeace.txt
PSDrive      NoteProperty PSDriveInfo PSDrive=C
PSParentPath NoteProperty string PSParentPath=C:\Foo
PSPath       NoteProperty string PSPath=C:\Foo\WarAndPeace.txt
PSProvider   NoteProperty ProviderInfo PSProvider=Microsoft.PowerShell.Core\FileSystem
ReadCount    NoteProperty long ReadCount=1
Length       Property     int Length {get;}

PS C:\Foo> $File2 | Get-Member -MemberType Properties

   TypeName: System.String

Name   MemberType Definition
----   ---------- ----------
Length Property   int Length {get;}

As you can see, Get-Content returns an object array of strings, where each member (ie each line of the text file) has six additional properties over and beyond what is in a string array. So if you import a 56,859-line text file, Get-Content adds 341,154 properties to the array that pretty much NO one needs or uses. And that takes time.

So,  if you are using Get-Content to retrieve text from a file, and performance is important, consider using .NET.
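
If you only need to stream through a large file once, a further option worth benchmarking yourself is [IO.File]::ReadLines(), which yields lines lazily rather than materialising the whole array first. A minimal sketch, assuming the War and Peace file downloaded earlier:

```powershell
# Stream the file line by line without loading it all into memory first
$Path  = 'C:\Foo\WarAndPeace.txt'
$Count = 0
foreach ($Line in [IO.File]::ReadLines($Path)) {
  if ($Line -match 'Napoleon') { $Count++ }
}
"{0} lines mention Napoleon" -f $Count
```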





Thursday, February 02, 2023

My Latest (last?) PowerShell Book is published!

This week I got the news that my latest PowerShell book has been published and is available for order:

You can order it today from all the usual places, including https://smile.amazon.co.uk/Windows-Server-Automation-PowerShell-Cookbook/dp/1804614238.

This book updates earlier editions and covers, specifically, PowerShell 7.2 (an LTS release) and Windows Server 2022. It should also be useful if you are using PowerShell 7 on earlier versions of Windows Server.

Here is the table of contents:
  1. Installing and Configuring PowerShell 7
  2. Managing PowerShell 7 in the Enterprise
  3. Exploring .NET
  4. Managing Active Directory
  5. Managing Networking
  6. Implementing Enterprise Security
  7. Managing Storage
  8. Managing Shared Data
  9. Managing Printing
  10. Exploring Windows Containers
  11. Managing Hyper-V
  12. Debugging and Troubleshooting Windows Server
  13. Managing Windows Server with Windows Management Instrumentation (WMI)
  14. Managing Windows Update Services

A new addition is the chapter on WSUS. The WSUS module is one of the three modules you cannot use within PowerShell 7. You cannot load the module natively within a PowerShell 7 session since the .NET APIs that the module relies on are unavailable in .NET. Additionally, the normal Windows PowerShell compatibility mechanism does not work with this module, because the WSUS module is based on methods rather than actual cmdlets. With WSUS, you instantiate the WSUS server instance of the server you wish to manage, then use that object's methods. With the compatibility solution, you do not have access to the methods.

There IS a way around this - you can create a remoting session to a Windows PowerShell endpoint and do all the work within that session. It is a bit more work: you create the remoting session, create script blocks that perform WSUS management activities, then execute those script blocks within the session.
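
A sketch of that workaround - the server name WSUS1 and the port are assumptions for illustration:

```powershell
# From PowerShell 7, create a session against the Windows PowerShell 5.1 endpoint
$Session = New-PSSession -ComputerName WSUS1 -ConfigurationName microsoft.powershell

# Do the WSUS work inside the session, where the module and its methods work
$SB = {
  $Wsus = Get-WsusServer -Name WSUS1 -PortNumber 8530
  $Wsus.GetStatus()        # call methods on the returned server object
}
Invoke-Command -Session $Session -ScriptBlock $SB

# Tidy up
Remove-PSSession -Session $Session
```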

My publisher is looking for potential reviewers - you get a copy in exchange for writing a review. Contact me if you are interested.





Thursday, June 16, 2022

More Hyper-V Troubleshooting Woes - but a script to help

In a previous post, I noted some challenges I faced with Hyper-V. The TL;DR is that Windows Update breaks Hyper-V, often as a result of a new Windows Insider build. In that post, I explained the basic fix for this issue.

Jeffrey Snover once pointed out that when Linux guys face an issue a second time, they create a script. Well, I am now facing this Hyper-V problem more frequently. The root issue is that an update can break Hyper-V. The solution is to re-create the VMs, which can be a lot of work.

As I recovered from the previous failure, I started to write a script to automate many of the actions. This week, I had ANOTHER update (WSL) that broke Hyper-V. So to recover, I wrote the script below. I hope you find it useful.

# fixing hyperv
# a script in several parts

# part 1 - remove Hyper-V from Win 11
# run in elevated command prompt

# 1.1 View what is there
Get-WindowsOptionalFeature -FeatureName Microsoft-Hyper-V* -Online |
  Format-Table -Property *Name, State

# 1.2 Remove the Hyper-V optionalfeature  
Get-WindowsOptionalFeature -FeatureName Microsoft-Hyper-V* -Online |
  Disable-WindowsOptionalFeature -Online


# Note: this pops up a 'do you want to reboot' prompt - say yes.
#
# Note: the reboot may fail (trying, and not succeeding, to stop the Hyper-V service)


# Step 2. After the reboot from step 1

# 2.1 View Hyper-V Data Folder
$DataBase = "$env:ProgramData\Microsoft\Windows\Hyper-V"
Get-ChildItem -Path $DataBase

# 2.2 Remove the Hyper-V Data
Get-ChildItem -Path $DataBase -recurse |
  Remove-Item -Recurse -Force

# 2.3 Check the features
Get-WindowsOptionalFeature -FeatureName Microsoft-Hyper-V* -Online |
  Format-Table -Property *Name, State

# 2.4 Re-Add Hyper-V to Windows and Reboot
Get-WindowsOptionalFeature -FeatureName Microsoft-Hyper-V* -Online |
  Enable-WindowsOptionalFeature -Online

# Reboot


# Step 3 - Create New Switches

New-VMSwitch -Name Internal -SwitchType Internal

$NIC = Get-NetAdapter | Where-Object {($_.Status -eq 'Up') -and ($_.Name -notmatch 'vEthernet')}
New-VMSwitch -Name External -NetAdapterName $NIC.Name -AllowManagementOS:$true


# Step 4 - recreate the VMs

# At this point, you may need to merge any difference disks. So far I have not automated this
# So use the GUI



# Cookham1
$vmbase = 'D:\VMs\Cookham1'
$Vhdx   = "$VMbase\Virtual Hard Disks\Cookham1.vhdx"
New-VM -VHDPath $vhdx -Generation 2 -MemoryStartupBytes 8gb -Name "Cookham1" -Path $vmbase
Start-VM -VMName 'Cookham1'



# Reskit VMs
# DC1
$vmbase = 'D:\v9\DC1'
$Vhdx   = "$VMbase\dc1.vhdx"
New-VM -VHDPath $vhdx -Generation 2 -MemoryStartupBytes 8gb -Name DC1 -Path $vmbase
Start-VM -VMName dc1

# DC2
$vmbase = 'D:\v9\DC2'
$Vhdx   = "$VMbase\dc2.vhdx"
New-VM -VHDPath $vhdx -Generation 2 -MemoryStartupBytes 8gb -Name DC2 -Path $vmbase
Start-VM -VMName DC2

# SRV1
$vmbase = 'D:\v9\SRV1'
$Vhdx   = "$VMbase\SRV1.vhdx"
New-VM -VHDPath $vhdx -Generation 2 -MemoryStartupBytes 6gb -Name SRV1 -Path $vmbase
Start-VM -VMName SRV1


# SRV2
$vmbase = 'D:\v9\SRV2'
$Vhdx   = "$VMbase\SRV2.vhdx"
New-VM -VHDPath $vhdx -Generation 2 -MemoryStartupBytes 6gb -Name SRV2 -Path $vmbase
Start-VM -VMName SRV2


# UKDC1
$vmbase = 'D:\v9\UKDC1'
$Vhdx   = "$VMbase\UKDC1.vhdx"
New-VM -VHDPath $vhdx -Generation 2 -MemoryStartupBytes 6gb -Name UKDC1 -Path $vmbase
Start-VM -VMName UKDC1

# Step 5. reset networking

# On DC1/DC2 - turn OFF the DHCP service temporarily!
# Remove all Reservations and zones

# Then On a VM by VM basis:
#   Open the VM's Hyper-V settings and remove the NICs
#   In the VM, use Device Manager, view hidden devices, and remove ALL the Hyper-V NICs
#   In the VM's settings, add two NICs: Internal (Ethernet) and External (Ethernet #2)
#   Re-configure the Internal NIC's static IP address (see reskitnetowrk.ms)


# Recreate the DHCP Scopes and reservations and test

# Test internal communications (from VM to each DC and server)
Test-NetConnection DC1
Test-NetConnection DC2
Test-NetConnection UKDC1
Test-NetConnection SRV1
Test-NetConnection SRV2




Tuesday, May 24, 2022

Troubleshooting Hyper-V in Windows 11

I love Hyper-V inside Windows 11 (and inside Windows 8.x and 10 in their day). It is an utterly simple hypervisor (to use), but one that provides a lot of benefits. I have written 4, and shortly 5, books that made use of this feature. I have created a nice server farm (2 forests, 3 domains, multiple servers, etc.) which runs nicely on my boxes. The scripts to create the VMs and configure them are all published to GitHub.

Now, to run a VM at a reasonable speed, you need to give the VM resources. My old Precision T7500 did not and could not support the hardware necessary for Windows 11. I now have a Precision T7920 with two 16-core Xeon Gold processors, 128 GB of RAM, 1 TB of NVMe SSD, plus 2x2TB and 1x1TB SSDs. So a LOT of storage, a LOT of memory, and a LOT of CPU. My farm works great. Except when it doesn't.

I run this box as part of the Windows Insider program. That means I get a more-or-less weekly upgrade that brings new features and bug fixes. It is a lot of fun - except sometimes the upgrade does NOT work. Over the years I have had numerous issues with an upgrade and have generally been able to back up to an earlier build or take a newer build (with a fix). For the most part this has been an inconvenience to me, but useful for the program - I just love providing feedback.

One specific set of issues caused by an updated Insider build is that Hyper-V does not work. This has happened around 5 times over the years. NOTE: This was originally written in 2022. Since this blog post was first published, this issue has affected nearly every Windows Insider update I have taken since.

Backing up to an earlier build sometimes cures the problems. But all too often, I just could not get things going again. This happened again a few days ago - and this blog post notes some of the issues and some troubleshooting tips (at that time).

So after an upgrade, starting the Hyper-V MMC showed something like this:



As I hope you can see, the MMC is showing two VMs trying to start. One is stuck at restoring (10%), the other at just restoring. Trying to stop a VM from PowerShell was also not successful. 

Troubleshooting tip #1

As long as you have the VHDX(s), you can always rebuild the VM. And if you created the VM with a PowerShell script, re-creating it is trivial.

So with that in mind, I did the obvious thing: I uninstalled Hyper-V totally and rebooted. Then, I "hid" the folders containing my VMs and re-installed Hyper-V. My thinking was that this would clear everything away, and I could then unhide the VM folders and re-import the VMs. Except it didn't work.

Removing and then re-adding Hyper-V did NOT clear down the configuration. In fact, what I now saw was:

So even though I removed and re-added Hyper-V (and hid all the folders containing the VMs/VHDXs), Hyper-V still remembered the old VMs. This seemed illogical until a nice MS person explained it: the de-installation does not take everything related to Hyper-V off your box, so in case you want to reinstall it, your configuration magically reappears.


Troubleshooting tip #2 and #3

A conversation on Twitter led to the discovery. The list of VMs that the MMC uses when it starts up is contained in a file: $env:ProgramData\Microsoft\Windows\Hyper-V\data.vmcx, although the location was different in earlier versions of Windows.

The third troubleshooting tip: the file is in binary form, so hand editing is not really possible. I was also told that removing the file totally was a bad idea, since it contains other valuable data, too.

So how to recover? 

Troubleshooting tip #4

As noted, the details of the VMs and Hyper-V itself are, by default, contained in a folder (with sub-folders) at $env:ProgramData\Microsoft\Windows\Hyper-V. So to get back to a pristine state, remove the Hyper-V feature from Windows 11. After the reboot, remove this folder totally then re-install Hyper-V and reboot. After the reboot, you should see a nice clean MMC console!

With a pristine Hyper-V environment, I had to re-create the virtual switches and then re-create the VMs. It appears that removing the Hyper-V feature deletes all the VM data except the actual VHDXs themselves. So recreating the VMs meant just building a new VM that contained the old VHDX.

There was only one final problem - networking. If you remove a vNIC from a VM and then add a new NIC, Windows sees the new NIC as, well, a NEW NIC and creates a new configuration for it. But it also keeps the old configuration. This can cause some issues, including not letting you rename the connection inside ncpa.cpl - Windows claims another connection already has that name.

If you open Device Manager, you see just the most recently added vNIC, something like this:



Troubleshooting tip #5

If you enable hidden devices inside Device Manager, you can see the "removed" net adapters.

In Device Manager, click on View, then select Show Hidden Devices, and then you see:


You can then right-click each adapter and uninstall the device. If you then click Action, and then click Scan for hardware changes - you should see just the actual vNICs in your VM.

Troubleshooting tip #6

If you add a 'new' NIC to a VM, Windows sees it as a brand new NIC, so it sets the device to get its configuration from DHCP.

This might be fine if the host were DHCP configured, but not if you had configured it with static IP addresses (eg for a DC or a DNS server). Fortunately, you have PowerShell and can easily script the NIC configuration.
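
A minimal sketch of that (the adapter name and the addresses are assumptions - adjust for your own network):

```powershell
# Re-apply a static configuration to a freshly added vNIC
$NIC = Get-NetAdapter -Name 'Ethernet'   # the new vNIC, as Windows named it

# Set a static IP address and point DNS at the DC
New-NetIPAddress -InterfaceIndex $NIC.ifIndex -IPAddress 10.10.10.10 -PrefixLength 24
Set-DnsClientServerAddress -InterfaceIndex $NIC.ifIndex -ServerAddresses 10.10.10.10
```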

I hope this helps someone!

[Note: this issue has been with me for a long while. This post was first written in May 2022, but the underlying issue has been consistent since then. I have made some updates (and fixed typos) to this post in May 2023.]




Saturday, March 19, 2022

Configuring PowerShell 7 With Group Policy

Introduction

Group policy is a feature of Windows Server Active Directory which automagically deploys groups of policies to users and computers. A policy is some computer setting you wish to enforce, such as which screen saver to use, what desktop background to use, or what the default execution policy should be. 

Windows PowerShell has for a while allowed you to set certain group policies to control how PowerShell works. Windows PowerShell 5.1 provides five specific policy settings. PowerShell 7 provides all the Windows PowerShell policies, plus one more. I describe each policy below. 

One neat feature of PowerShell 7 is that you can enable independent policy values for PowerShell 7 and Windows PowerShell. Or you can enable a PowerShell 7 policy and take its values, such as the Execution Policy, from the Windows PowerShell policies.

You can set policies for a computer or a user. Both the Group Policy Editor and the PowerShell cmdlets make use of administrative templates stored in the C:\Windows\PolicyDefinitions folder on your DC (or in a central policy store shared by all DCs). The templates include an XML file, with the extension ADMX, that defines a policy or set of policies. Each template also has a localised set of strings stored in an ADML file. The ADMX file contains pointers to strings defined in the ADML file. Having both files enables the GP Editor to use localised languages.

After you apply a policy, the group policy agent on the computer creates entries in the user or computer's registry policy area. You can see the policy if you use registry editor or PowerShell and navigate to HKCU:\Software\Policies\Microsoft\PowerShellCore  for user settings, or you can navigate to HKLM:\Software\Policies\Microsoft\PowerShellCore for computer settings.

The group policy agent runs each time the computer starts and each time a user logs on. The agent also runs in the background roughly every 90 minutes (plus a random offset of up to 30 minutes). To invoke the agent immediately, you can use the gpupdate.exe console application.
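
For example (Invoke-GPUpdate requires the GroupPolicy RSAT module; the remote computer name is an assumption):

```powershell
# Refresh user policy on this machine right away
gpupdate.exe /target:user /force

# Or trigger a refresh on a remote host from a management station
Invoke-GPUpdate -Computer 'SRV1' -Force
```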

NOTE: This article started out simple, but as it grew, I've had to split it into two. In this article, I look at the GPO settings you can specify and which registry key(s) and value entries they use. Armed with this information, the next article looks at how you set each policy using PowerShell 7. 

PowerShell Group Policy settings

There are six PowerShell 7-related group policies you can deploy:

  • Execution Policy
  • Console Session Configuration (new with PowerShell 7)
  • Module Logging
  • Script block logging
  • Transcription
  • Updatable Help.

Setting Policies

Before setting any policy, note that each policy sets one or more keys and value entries in the user or computer section of the registry. Most policies are a single registry key with one or more value entries. Let's look at the individual policies, and the registry keys and value entries deployed.

Since these policies are registry settings, you can deploy them using group policy, or you can set the registry key manually. You could use a .REG file containing the settings you want to apply to a host, or use PowerShell to set the keys and value entries needed for the policy. 

In the following section, I show how you can create the key and value entries needed for user settings of a policy. In the next article, you can see how to create and deploy the group policy settings using PowerShell cmdlets. 

Execution Policy

The Execution Policy policy defines a particular execution policy or disallows all script execution. I would expect you to know that PowerShell's Execution Policy is not a security barrier - more a speed bump to stop a very inexperienced user from doing something silly. I would hope you know it is pretty easy to work around a restricted execution policy.

For most organisations, an Execution Policy of RemoteSigned is probably sufficient. Your IT Professionals might want to log in and have full access without having to work around a more restricted policy, so setting the policy to Unrestricted for members of the IT department might be useful. Mileage varies. 

The registry key set by the policy is:
HKCU:\Software\Policies\Microsoft\PowerShellCore
This is for the user policy setting. If you are setting computer policies, use HKLM: instead - as is the case with all the other policy keys described below.

There are two value entries for this policy:
  • EnableScripts - This value entry enables this policy. It is of type dword and has a value of 1 if the policy is enabled 
  • ExecutionPolicy - This is the execution policy to be applied. It is a string and can contain any valid PowerShell execution policy 
Here is how to set this policy using PowerShell, in HKCU:
# 1. Set Execution Policy
# Create Key
$Key = 'HKCU:\Software\Policies\Microsoft\PowerShellCore'
if (Test-Path $Key) {
  Write-Verbose "Registry Key exists [$key]"
}
Else {
  Write-Verbose "Creating registry key [$key]"
  New-Item -Path $Key
}
# Set value for execution policy 
Write-Verbose 'Setting Execution Policy On'
$CVHT1   = @{ Path  = $Key
              Name  = 'EnableScripts'
              Type  = 'Dword'
              Value = 1 }
Set-ItemProperty @CVHT1
Write-Verbose 'Setting Execution Policy to Unrestricted'
$CVHT2   = @{ Path  = $Key
              Name  = 'ExecutionPolicy'
              Type  = 'String'
              Value = 'Unrestricted'}
Set-ItemProperty @CVHT2


Console Session Configuration 

The Console Session Configuration policy specifies a remoting configuration endpoint to use. Remote sessions then run against that endpoint. You can specify any endpoint, including a JEA endpoint.

The  registry key set by this policy is:

HKCU:\Software\Policies\Microsoft\PowerShellCore\ConsoleSessionConfiguration

There are two value entries for this key:
  • EnableConsoleSessionConfiguration - this entry enables the policy. It is of type dword and has a value of 1 to indicate that the policy is applied.
  • ConsoleSessionConfigurationName - this entry is the remoting endpoint that is used. It is a string, such as "PowerShell.7". 
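
Following the same pattern as the Execution Policy example above, a sketch of setting this policy directly in the registry (the endpoint name PowerShell.7 is just the example from the text):

```powershell
# Create the key if needed, then set the two value entries
$Key = 'HKCU:\Software\Policies\Microsoft\PowerShellCore\ConsoleSessionConfiguration'
if (-not (Test-Path -Path $Key)) { New-Item -Path $Key | Out-Null }
Set-ItemProperty -Path $Key -Name EnableConsoleSessionConfiguration -Type Dword  -Value 1
Set-ItemProperty -Path $Key -Name ConsoleSessionConfigurationName   -Type String -Value 'PowerShell.7'
```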

Module Logging

The Module Logging policy requires PowerShell to log module usage for the modules you specify. If you set this policy, PowerShell logs pipeline execution events for the specified modules to the PowerShell Core event log.

Be careful with this setting, as you could generate a lot of log entries. If you are going to log module events, have a strategy for reviewing the events generated.

This policy requires the use of two registry keys. The first is:
HKCU:\Software\Policies\Microsoft\PowerShellCore\ModuleLogging 

There is one value entry for this key:
  • EnableModuleLogging - This entry enables the policy. It is of type dword and has a value of 1 to indicate that the policy is applied.

The second key used by this policy is:

HKCU:\Software\Policies\Microsoft\PowerShellCore\ModuleLogging\ModuleNames

There are multiple value entries you can specify for this key, each one a module for logging. Each value entry has a name and a value of the module name. So if you wanted to log events for the FOO42 module, you would have a registry value name FOO42 with an associated value of FOO42. You can specify as many modules as you like.
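
As a sketch, enabling logging for the (hypothetical) FOO42 module from the example above:

```powershell
# Create both keys (-Force creates the parent key too), then set the entries
$Key = 'HKCU:\Software\Policies\Microsoft\PowerShellCore\ModuleLogging'
New-Item -Path "$Key\ModuleNames" -Force | Out-Null
Set-ItemProperty -Path $Key -Name EnableModuleLogging -Type Dword -Value 1
Set-ItemProperty -Path "$Key\ModuleNames" -Name 'FOO42' -Type String -Value 'FOO42'
```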

Script block logging

You use the Script block logging policy to turn on PowerShell's logging of any important script blocks. PowerShell does not log all script blocks, only those that change something. Nevertheless, this policy can generate a lot of logging. In Windows PowerShell, setting this policy could result in performance degradation, although this has been much improved in PowerShell 7.

The registry key used by this policy is:

HKCU:\Software\Policies\Microsoft\PowerShellCore\ScriptBlockLogging

There is one value entry under this policy:
  • EnableScriptBlockLogging - This entry enables the policy. It is of type dword and has a value of 1 to indicate that the policy is applied.

Transcription

The Transcription policy allows you to automatically create a transcript of every PowerShell session.

The registry key set by this policy is:
HKCU:\Software\Policies\Microsoft\PowerShellCore\Transcription

There are two value entries used by this policy:
  • EnableTranscripting - This entry enables the policy. It is of type dword and has a value of 1 to indicate that the policy is applied.
  • OutputDirectory - This is a string and specifies the folder in which PowerShell writes session transcripts.
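
A sketch of setting the user transcription policy by hand (the transcript folder is an assumption):

```powershell
# Create the key if needed, then enable transcription and set the output folder
$Key = 'HKCU:\Software\Policies\Microsoft\PowerShellCore\Transcription'
if (-not (Test-Path -Path $Key)) { New-Item -Path $Key | Out-Null }
Set-ItemProperty -Path $Key -Name EnableTranscripting -Type Dword  -Value 1
Set-ItemProperty -Path $Key -Name OutputDirectory     -Type String -Value 'C:\Transcripts'
```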

Updatable Help

The Updatable Help policy allows you to configure a default value for Update-Help's SourcePath parameter. If you enable the policy, Update-Help uses the setting as a default location for new help information. You can always override this by specifying a different value for the source path when you run Update-Help.

The registry key set by this policy is:

HKCU:\Software\Policies\Microsoft\PowerShellCore\UpdatableHelp

There are two value entries under this policy:
  • EnableUpdateHelpDefaultSourcePath - This entry enables the policy. It is of type dword and has a value of 1 to indicate that the policy is applied.
  • DefaultSourcePath - This entry defines the location to be used as the default source path and is of type String. You would probably use a network share. Irrespective, you should note that in this string you must escape any backslash character with an additional backslash. If the default source path is \\dc1\help, you see it as "\\\\dc1\\help".
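
And a sketch for Updatable Help, showing the doubled backslashes described above (the \\dc1\help share comes from the example in the text):

```powershell
# Create the key if needed, then enable the policy and set the default source path
$Key = 'HKCU:\Software\Policies\Microsoft\PowerShellCore\UpdatableHelp'
if (-not (Test-Path -Path $Key)) { New-Item -Path $Key | Out-Null }
Set-ItemProperty -Path $Key -Name EnableUpdateHelpDefaultSourcePath -Type Dword -Value 1
# Note the escaped backslashes in the stored string
Set-ItemProperty -Path $Key -Name DefaultSourcePath -Type String -Value '\\\\dc1\\help'
```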

Using Windows PowerShell Settings

As I mentioned above, you can set a policy for PowerShell 7 but use whatever values you set for the corresponding Windows PowerShell policy. In most, or at least many, cases, if you have Windows PowerShell policies set, you probably want the same policies set for PowerShell 7 as well.

To set any of the six policies to apply to PowerShell 7 while taking the values from Windows PowerShell, you set two value entries in the policy keys reviewed above. The first is the Enable* value entry, which enables that policy. Then you add another value entry:
  • UseWindowsPowerShellPolicySetting - This value entry indicates that the policy details should come from the Windows PowerShell policies. It is of type dword and has a value of 1.

Summary

We looked at the six policies you can set, and the registry key and value entries used by each policy. In the next article, I show you how to use the Group Policy cmdlets and PowerShell 7 to create and deploy these policies.

Thursday, December 09, 2021

Viewing PowerShell Files in Preview within Windows Explorer

Windows Explorer has a nice feature that shows the contents of a selected file: click on a file, and in the right pane you can see its contents. This is a great feature, except it does not work for ALL file types out of the box. So should you wish to view a .PS1 file this way, you are out of luck.

By default, the preview pane is disabled. So you must first enable it. With the latest versions of Windows 11, it looks like this:


But even after enabling this, you still cannot view .PS1 files (or .PSM1 or .PSD1 files either) in the preview pane within Explorer.

But like just about every default, with PowerShell there is usually a way to override it. And it turns out that to make these files visible in the preview pane, you just need to run the following script:

# Define paths
$Path1 = 'Registry::HKEY_CLASSES_ROOT\.ps1'
$Path2 = 'Registry::HKEY_CLASSES_ROOT\.psm1'
$Path3 = 'Registry::HKEY_CLASSES_ROOT\.psd1'

# Set registry values to enable preview (run from an elevated session)
New-ItemProperty -Path $Path1 -Name PerceivedType -PropertyType String -Value 'text'
New-ItemProperty -Path $Path2 -Name PerceivedType -PropertyType String -Value 'text'
New-ItemProperty -Path $Path3 -Name PerceivedType -PropertyType String -Value 'text'

That's it. Once you run the script, you see things like this:


Another mystery solved.

Monday, October 25, 2021

Saving PowerPoint Deck as a PDF - with PowerShell 7

This week, I got into a Twitter discussion about a presenter posting their slides for delegates. My personal preference is to always share the slides. Often, especially when I was working for an employer, I'd share a PDF of the file rather than the slides. Some decks had had a LOT of effort put in (some by in-house graphics guys), and giving those animations away was sub-optimal for IP reasons.

My approach was to post whatever material was appropriate to a DropBox share and let people have that. I created a unique Bit.Ly short URL and put that on the first few and final slides. Since the shortened URL never changed, I could put whatever content was relevant - letting attendees know it would all disappear in a few weeks' time.

What I used to do was run a short little PowerShell script during the final development of the deck. If, like SO many presenters, I made a change in the speaker's lounge just before going on, I could run the script just before unplugging and heading to the talk. I could just share https://bit.ly/TFLPresentations and manage the content.

Converting your PowerPoint deck to a PDF is straightforward - I blogged about how to convert a PowerPoint deck into a PDF using PowerShell a long time ago. The script uses the PowerPoint objects that are part of Office and which Office setup conveniently stores in the .NET Framework's Global Assembly Cache (the GAC).

After the Twitter conversation, I dusted off this script and, to my surprise, it did not work. The reason is that PowerShell 7 uses .NET and not the .NET Framework. The former does not make use of the GAC, so when I tried to load the appropriate assembly I got this error:

      PS C:\Foo> Add-Type -AssemblyName microsoft.office.interop.powerpoint

      Add-Type: Cannot find path 'C:\Foo\microsoft.office.interop.powerpoint.dll' because it does not exist.

After a few minutes' contemplation, I realised what the issue was. By using Void Tools' most excellent SearchEverything product, I found the assembly at the snappy location of: C:\Windows\assembly\GAC_MSIL\Microsoft.Office.Interop.PowerPoint\15.0.0.0__71e9bce111e9429c\Microsoft.Office.Interop.PowerPoint.dll.

Now that I knew the location, I could do this to save a single file:

      # Define the files
      $PPTFile = 'C:\foo\test.pptx'
      $PDFFile = 'D:\Dropbox\AAA-TFL Presententations LIVE\test.pdf'
      # Add the DLL
      $DLLFile = 'C:\Windows\assembly\GAC_MSIL\Microsoft.Office.Interop.PowerPoint\' +
                 '15.0.0.0__71e9bce111e9429c\Microsoft.Office.Interop.PowerPoint.dll'
      # Add the type
      Add-Type -Path $DLLFile
      # Create save as PDF option
      $SaveOption = [Microsoft.Office.Interop.PowerPoint.PpSaveAsFileType]::ppSaveAsPDF
      # Open the PowerPoint Presentation
      $PPT = New-Object -ComObject "PowerPoint.Application"
      $Presentation = $PPT.Presentations.Open($PPTFile)
      # Save it as a PDF and close the presentation
      $Presentation.SaveAs($PDFFile, $SaveOption)
      $Presentation.Close()
      # Close PPT and kill the process to be sure
      $PPT.Quit()
      Get-Process 'POWERPNT' | Stop-Process

      This fragment adds in the necessary DLL, opens the presentation from wherever I have it stored today, and then saves the file as PDF into the DropBox folder. 

I kind of thought that the Quit() method would kill PowerPoint, but in my testing the process hung around - hence killing the process at the end of the fragment. Depending on how adventurous you wish to be, you can always wrap this code to get all the PPT files in one folder and save them away. 
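Here is a sketch of that batch idea. It assumes PowerPoint is installed, that the interop DLL lives at the GAC path shown earlier, and that C:\Foo\Decks is a placeholder folder holding your decks:

```powershell
# Sketch: convert every .pptx in a folder to PDF
# Assumes PowerPoint is installed; the GAC path is the one found earlier
$DLLFile = 'C:\Windows\assembly\GAC_MSIL\Microsoft.Office.Interop.PowerPoint\' +
           '15.0.0.0__71e9bce111e9429c\Microsoft.Office.Interop.PowerPoint.dll'
Add-Type -Path $DLLFile
$SaveOption = [Microsoft.Office.Interop.PowerPoint.PpSaveAsFileType]::ppSaveAsPDF
$PPT = New-Object -ComObject 'PowerPoint.Application'
# C:\Foo\Decks is a placeholder - point this at your own slide folder
Get-ChildItem -Path 'C:\Foo\Decks' -Filter '*.pptx' | ForEach-Object {
  # Save the PDF next to the source deck, with the same base name
  $PDFFile      = [System.IO.Path]::ChangeExtension($_.FullName, 'pdf')
  $Presentation = $PPT.Presentations.Open($_.FullName)
  $Presentation.SaveAs($PDFFile, $SaveOption)
  $Presentation.Close()
}
# Quit PowerPoint and, as before, kill any process that hangs around
$PPT.Quit()
Get-Process -Name 'POWERPNT' -ErrorAction SilentlyContinue | Stop-Process
```

Saving each PDF alongside its source deck keeps the mapping obvious; change the $PDFFile line if you want them all in one DropBox folder.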

      Tuesday, October 19, 2021

      Patching and PowerShell 7

 Many of you reading this blog know I'm a big supporter of PowerShell 7. I hope that many of you share my enthusiasm. After writing several books on PowerShell, two of them on PowerShell 7, I find it a pretty good product.

One of the challenges which PowerShell brings to the enterprise is updates. Like most software products, there are bugs and vulnerabilities in PowerShell. I was reminded again of this fact reading an article from my PowerShell Paper.li paper. Today the top article comes from Bleeping Computer: https://www.bleepingcomputer.com/news/microsoft/microsoft-asks-admins-to-patch-powershell-to-fix-wdac-bypass/.

      The team do a great job in updating the code as soon as the vulnerabilities are found. In some (many) cases, the issue is not in PowerShell itself, but in one of the components, such as .NET. Once the team releases an update, you need to ensure that the update is rolled out everywhere you use PowerShell 7.

      The method you use to keep PowerShell 7 up to date depends on how you installed PowerShell 7 in the first place. And here you have (at least) 3 options:

• Use Install-PowerShell.ps1 - this is a script you download from GitHub that enables you to install the latest production version of PowerShell, the latest preview version, and (for the very brave) today's daily build. If you use this, you must manually update the software yourself. I love this, as I am in control!
• Use the Microsoft Store - you get the released version directly from the store. This should automatically keep your version up to date as the PowerShell team releases new versions. At present, there is no Store application for Windows Server. You can also get the Preview version from the store (although I do not know how that plays if you also have the released version installed). 
• Use a third-party source - you could download PowerShell directly from GitHub and install it (and manually update it as new versions are pushed out). You can also use Chocolatey, although technically that is not supported.
So you have options. Personally, I update the daily build once or twice a week, and I update the preview and production versions once I get a warning about an updated version the next time I start PowerShell.  

      So no matter how you install PowerShell 7, make sure you have a patch strategy in place. And if you have read this far, make sure you have installed 7.1.5 (or 7.0.8 if you are still on 7.0).
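If you are unsure whether your installed version is current, a quick check against GitHub can tell you. This is a minimal sketch - it assumes network access to api.github.com, and it strips any -preview suffix so the running version parses as a simple version number:

```powershell
# Sketch: compare the running PowerShell 7 version with the latest GitHub release
$Uri     = 'https://api.github.com/repos/PowerShell/PowerShell/releases/latest'
$Latest  = [version] ((Invoke-RestMethod -Uri $Uri).tag_name.TrimStart('v'))
# Strip any '-preview...' suffix so the string parses as a simple version
$Current = [version] ($PSVersionTable.PSVersion.ToString().Split('-')[0])
if ($Latest -gt $Current) {
  "Update available: $Latest (you are running $Current)"
} else {
  "You are up to date ($Current)"
}
```

You could run something like this from your profile script as a lightweight nudge to patch.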

      Sunday, August 08, 2021

      A Great Book On Windows Terminal

       

      Review of Will Fuqua’s book “Windows Terminal Tips, Tricks, and Productivity Hacks”


       

      I have had the pleasure of reading through this recently published book (Smile.Amazon.Com). This is a timely book, full of tips and tricks for using the new Windows Terminal.

      Ever since Microsoft got into the OS business, we’ve had a console or shell built into the OS. The earliest, command.com, gave way to cmd.exe with NT 3.1. And later came Windows PowerShell, the PowerShell ISE and of course PowerShell 7 with VS Code. And now we have Windows Terminal.

      As the book clarifies, there really are two components at work when you use tools like PowerShell. The first is the shell, with the second being the terminal. As the book makes clear, the terminal is, essentially, “what you see” when you use a command-line tool. The terminal renders any text, draws any UI, and accepts kb/mouse input. The terminal sends input to the shell for the actual processing. The shell then processes the input and returns the results back to the terminal to display.

When you use cmd.exe or PowerShell, conhost.exe is the terminal, with the command prompt or Windows PowerShell/PowerShell 7 as the shell. The actual shell does not have a UI as such - it gets input from and sends output to the terminal. It is important to separate the two.

Conhost.exe is really pretty primitive - it works and does the job but could do so much more. That thought led to the development of a new, open-source, cross-platform terminal supporting just about any shell, including the shells in Linux distributions you can run under the Windows Subsystem for Linux (WSL2). The new Windows Terminal is not just a re-write of conhost.exe - it is so much more!

As a PowerShell user, I traditionally used PowerShell via either the command line (i.e. pwsh.exe, cmd.exe, powershell.exe) or, most often, via VS Code (and some ISE). The tool I chose at any given time reflected what I was about to do. I use a GUI to develop and test scripts, and sometimes to run code. On the other hand, when I only need to run code (for example, my Get-TodayInHistory.ps1 script (https://github.com/doctordns/GDScripts/blob/main/Get-TodayInHistory.ps1) that helps me pick music to play today), I choose the console as my terminal.

      The book begins with a look at both what the terminal is and how you install it, followed by a great chapter on the UI. If you are going to be using the new terminal, Chapter 2 has several important key sequences you need to work into your muscle memory.

      In the second section, the book looks at how you can configure Windows Terminal and the shells you use. The book contains lots of great tips for using the terminal and PowerShell and WSL2 Ubuntu via Windows Terminal.

      For the hardcore developer, the final section in the book looks at how you can use Windows Terminal in development. The book looks at using GIT and GitHub and building web applications (with React) and  REST APIs. The book finishes with a look at connecting to a remote host and managing hosts in the cloud by using Azure Cloud Shell or Google Cloud shell in the Windows Terminal.

      If you are a casual terminal user, Windows Terminal does everything you used to use consoles for – and a lot more. Windows Terminal is a great tool if you are a WSL2 user, perhaps creating APIs or web apps. For both audiences (and everyone in between), this book provides great guidance, tips/tricks, and best practices.

       

       

      Monday, June 21, 2021

      My Next Book is Announced!


      I am pleased that my publisher, Packt, have announced my next book on PowerShell. The new book is entitled Windows Server Automation with PowerShell Cookbook and it should be released in July.

      You can pre-order it from Amazon - in the UK, use the Smile.Amazon.Co.UK site. In the US, use the Smile.Amazon.Com site

I am looking for a few good book reviewers - you get a copy of the book in exchange for a review. If you are interested, send me some email at DoctorDNS@Gmail.Com.

      Tuesday, June 01, 2021

Resolving Hyper-V issues with Win10 Insiders builds

 I have been a member of the Windows Insiders group since 2014 and have enjoyed testing the updates and looking at some of the new features coming to a new Win10 release in due course. A very few Insiders builds were bad - they just did not work. But that was rare; for the most part, the new builds work well.

I recently encountered an issue whereby, after the upgrade, no Hyper-V VM would start. The error log shows an error message about VMCompute faults - which looks like this in the event log.

      Faulting application name: vmcompute.exe, version: 10.0.21354.1, time stamp: 0x1aa49b37

      Faulting module name: vmcompute.exe, version: 10.0.21354.1, time stamp: 0x1aa49b37

      Exception code: 0xc0000005

      Fault offset: 0x00000000001c065a

      Faulting process ID: 0x23a8

      Faulting application start time: 0x01d72d3e4d15903d

      Faulting application path: C:\WINDOWS\system32\vmcompute.exe

      Faulting module path: C:\WINDOWS\system32\vmcompute.exe

      Report ID: f8375064-c2fc-4a89-a8f1-af662c551008

      Faulting package full name: 

      Faulting package-relative application ID:

I reported this error multiple times, but to no avail. Every week, Windows Update would offer a new build that would fail to restart Hyper-V. I'd revert to a working build, report the issue - then wait a week and repeat. Despite multiple reports, I never got any response from MSFT. 

I decided to take a near-nuclear option - which worked! In the hope that someone else who encounters this issue finds this post via their search engine, here is my solution:

1. Shut down all VMs.
2. Take an inventory of the VMs.
3. Remove all un-needed VM snapshots - there were too many, and all are now gone! 
4. Export all the VMs to a backup drive (removing the snapshots helped reduce the time).
5. Remove the Windows 10 Hyper-V feature - all of it.
6. Reboot.
7. Accept the latest Insiders update and let it install.
8. Once the upgrade is complete, add the Hyper-V feature back in.
9. Reboot.
10. Open the Hyper-V console and recreate the necessary VM switches (two in my case).
11. Import the VMs, specifying the location of each VM on the host's disk (and not the backup!).
12. For some VMs, specify where the virtual hard disks are stored (I use a non-standard location).
13. Start the VMs and enjoy.
      Here is what I see now!




      Once you are up and running, make sure to remove the backups.

      From start to finish, it took me a couple of days elapsed time (the backup to an external USB2 drive was slow!). 

In step 2, I took inventory (including identifying the snapshots and working out how much data had to be backed up). One useful step was running this query and saving the output: 
      PS C:\Foo> Get-VM | Format-Table -Property Name, Path

      Name           Path
      ----           ----
      ***Cookham1    D:\VMs\Cookham1
      CH1            D:\V8\CH1
      DC1            d:\v8\DC1
      DC2            d:\v8\DC2
      HV1            D:\V8\HV1
      HV2            d:\v8\HV2
      Hyper-V Server D:\V7\HyperVServer\Hyper-V Server
      Old - Bridge   D:\V7\Bridge
      PSRV           D:\V8\PSRV
      SMTP-2019      D:\V8\SMTP-2019\SMTP-2019
      SRV1           D:\V8\SRV1
      SRV2           D:\V8\SRV2
      SS1            D:\V8\SS1

      After all the updating, in step 11, I pointed the MMC's import wizard at the path noted above and most of the VMs came in fine. For the "V8" VMs, I had to specify the same path to let the wizard know where each of the disks was stored. 

Hope you never encounter this issue, but if you do, this is a workaround. 








      Monday, January 11, 2021

      Packt Publishers Sale on Through Jan 13 2021

In my morning email was a message from Packt, the publisher of two (and soon three) of my books. My books are on PowerShell but, as many of you know, Packt publishes books, ebooks, and videos across the IT spectrum. In my experience, some of their books are better than others. I'd like to think that some books, including mine, are in the 'good' category - but, as ever, Caveat Emptor.

The reason for writing to me is that they have a sale on at the moment. The sale, which runs through Jan 13th 2021, offers certain Windows/Linux books and ebooks at a discount. 

      I apologise for the short notice, but I only heard about this from a mail I got this morning.

      Mastering Windows Server 2019 - https://www.packtpub.com/product/mastering-windows-server-2019-second-edition/9781789804539?utm_source=4sysops&utm_medium=&utm_campaign=5dollar2020&utm_term=system-administrators

      Mastering Windows Security and Hardening - https://www.packtpub.com/product/mastering-windows-security-and-hardening/9781839216411?utm_source=4sysops&utm_medium=&utm_campaign=5dollar2020&utm_term=system-administrators

Windows Server 2019 Administration Fundamentals, Second Edition - https://www.packtpub.com/product/windows-server-2019-administration-fundamentals-second-edition/9781838550912?utm_source=4sysops&utm_medium=&utm_campaign=5dollar2020&utm_term=system-administrators

      Mastering Linux Security and Hardening - https://www.packtpub.com/product/mastering-linux-security-and-hardening-second-edition/9781838981778?utm_source=4sysops&utm_medium=&utm_campaign=5dollar2020&utm_term=system-administrators

Windows Subsystem for Linux 2 (WSL 2) Tips, Tricks, and Techniques - https://www.packtpub.com/product/windows-subsystem-for-linux-2-wsl-2-tips-tricks-and-techniques/9781800562448?utm_source=4sysops&utm_medium=&utm_campaign=5dollar2020&utm_term=system-administrators

      And what list of great books on offer would be incomplete without my latest book: 

Windows Server 2019 Automation with PowerShell Cookbook - Third Edition - https://www.packtpub.com/product/windows-server-2019-automation-with-powershell-cookbook-third-edition/9781789808537







      Sunday, January 10, 2021

      My Wiley PowerShell Book - Issues with the code samples

      I recently posted (https://tfl09.blogspot.com/2020/12/my-latest-powershell-book-copies.html) about my latest PowerShell book, published by Wiley. I am proud of this book for many reasons. Most of the writing was done after the start of the Covid pandemic. Frankly, the writing helped keep me sane (well saner). 

Writing a book involves a lot of people. I write stuff, and then a good team of others helps to turn it into a finished product. In particular, I was very happy with the excellent editing done by the Wiley team. The development process was laborious and my editors were a pedantic nightmare - but all in the Best Possible Way! I appreciated the almost anal nitpicking and the total focus on quality. Everyone involved has my gratitude for their excellent work.

      But...

      Today I received a mail from a wonderful reader in Italy. He complained that cutting and pasting from the e-book failed. He provided me with some examples. Yes, he is right that what he showed me is NOT what I wrote and submitted.

      My apologies for this to him, and everyone reading my book. I can assure you that I did everything I could to avoid this very issue. Jim, my TE, was totally focused on helping me to avoid issues like this - every missing capital letter, every missing parameter name, and every other minor error was flagged and resolved. I did what I could, and am sorry for the resulting issues.

With that said, one of the things I insisted on was that I could publish these scripts on GitHub. You do not have to buy my book (although I hope you will) to use the scripts. You can find the scripts at my GitHub account - https://github.com/doctordns/Wiley20.

      So if you want to try out the book and the scripts contained in it, please do NOT cut/paste from an electronic version - use the scripts on Github. If you find errors there, PLEASE file an issue at https://github.com/doctordns/Wiley20/issues. I will do my best to fix anything that might be broken and republish the updated script.  You are welcome to fork this repo and I'll accept PRs that fix errors in the scripts.

Thanks to everyone who has either bought my book or uses my scripts.

      Saturday, December 19, 2020

      My Latest PowerShell Book - Copies Available

       

      After a long development process, my latest PowerShell book has been published by Wiley


This book began before the Covid pandemic, and progress was a lot slower than I would have liked as a result of the virus. But thanks to some great people we got it done, and this week I got my review copies.

As I announced on my Twitter (@DoctorDNS) feed, I have some copies of this book to give away. I am more than happy to sign and send off copies of the physical book. I also want to find some reviewers and can provide an electronic copy for review if you contact me offline. 

      I love the PowerShell community and nothing would please me more than to give a copy to anyone who asks. But that is not possible. So I am happy to give away some free copies, but to make it a bit more interesting, there is a little catch! :-)

 I will give away five signed copies of the physical book to the first FIVE people who ask and who donate to the Epilepsy Society (https://epilepsysociety.org.uk/). My daughter has severe epilepsy, and I support this charity. So to get a copy, confirm a donation and email me. My email address is DoctorDNS@Gmail.Com.

      [LATER]
      Owing to Covid restrictions announced today in the UK, physical copies may be delayed until I can get out to the Post Office. 

      Friday, December 11, 2020

      How To: Change the File Time for a Windows File

When using PowerShell in Windows, Get-ChildItem (and other cmdlets) return objects of type System.IO.FileInfo for files in the NTFS file system. Each of the returned objects contains six date/time properties:
      • CreationTime and CreationTimeUtc - the date/time that the file was created (in local time and UTC)
      • LastAccessTime and LastAccessTimeUtc - the date/time that the file was last accessed
      • LastWriteTime and LastWriteTimeUtc - the date/time that the file was last written
These properties allow you to check when a file (or, for that matter, a folder) was created, and when it was last written to and read from. This is very useful for helping IT professionals discover files that are underused and may be candidates for deletion. To cater for files being accessed in different time zones, the UTC variants provide time-zone-proof times for comparison.
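As a sketch of that idea, here is how you might list the files not read in the last year (C:\Foo is a placeholder path - point it at the folder you want to audit):

```powershell
# Sketch: find files not accessed in the last 365 days - archiving/deletion candidates
# C:\Foo is a placeholder; the UTC variant makes the comparison time-zone proof
$Cutoff = (Get-Date).ToUniversalTime().AddDays(-365)
Get-ChildItem -Path C:\Foo -File -Recurse -ErrorAction SilentlyContinue |
  Where-Object -FilterScript { $_.LastAccessTimeUtc -lt $Cutoff } |
  Select-Object -Property FullName, LastAccessTime
```

Note that on volumes where last-access updating is disabled, LastAccessTime may be stale, so treat the output as a starting point rather than proof of disuse.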

      One challenge is that you may wish to change these times. There is no cmdlet that can change these times directly. Fortunately, there is a simple bit of PowerShell magic. Assume you have a file for which you wish to change date/times. 

Here is how you can do it:

      PSH [C:\Foo]: Remove-Item -Path C:\Foo\Foo.xxx -ea 0  
      PSH [C:\Foo]: "FOO" | Out-File -Path C:\Foo\Foo.xxx
      PSH [C:\Foo]: #    Find file and display
      PSH [C:\Foo]: $File = Get-ChildItem -Path c:\foo\Foo.xxx
      PSH [C:\Foo]: $File| Format-List -Property name,*time*

      Name              : Foo.xxx
      CreationTime      : 11/12/2020 10:09:57
      LastAccessTime    : 11/12/2020 10:09:57
      LastAccessTimeUtc : 11/12/2020 10:09:57
      LastWriteTime     : 11/12/2020 10:09:57
      LastWriteTimeUtc  : 11/12/2020 10:09:57


      PSH [C:\Foo]: #    Get new date
      PSH [C:\Foo]: $NewDate = Get-Date -Date '14/08/1950'
      PSH [C:\Foo]: #    Change the date
      PSH [C:\Foo]: $File.CreationTime      = $NewDate
      PSH [C:\Foo]: $File.CreationTimeUTC   = $NewDate
      PSH [C:\Foo]: $File.LastAccessTime    = $NewDate
PSH [C:\Foo]: $File.LastAccessTimeUTC = $NewDate
      PSH [C:\Foo]: $File.LastWriteTime     = $NewDate
      PSH [C:\Foo]: $File.LastWriteTimeUTC  = $NewDate
      PSH [C:\Foo]: #    Recheck file to see changed date/time
      PSH [C:\Foo]: $File = Get-ChildItem -Path C:\Foo\Foo.xxx
      PSH [C:\Foo]: $File| Format-List -Property Name,*Time*

      Name              : Foo.xxx
      CreationTime      : 14/08/1950 01:00:00
      CreationTimeUtc   : 14/08/1950 00:00:00
      LastAccessTime    : 14/08/1950 01:00:00
      LastAccessTimeUtc : 14/08/1950 00:00:00
      LastWriteTime     : 14/08/1950 01:00:00
      LastWriteTimeUtc  : 14/08/1950 00:00:00
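You can wrap those six assignments into a small helper function. This is just a sketch - Set-FileTime is a made-up name, not a built-in cmdlet:

```powershell
# Sketch: set all six date/time properties of a file in one call
# Set-FileTime is a hypothetical helper - there is no built-in cmdlet for this
function Set-FileTime {
  param (
    [Parameter(Mandatory)] [string]   $Path,
    [Parameter(Mandatory)] [datetime] $Date
  )
  $File = Get-Item -Path $Path
  foreach ($Property in 'CreationTime', 'LastAccessTime', 'LastWriteTime') {
    $File.$Property        = $Date                    # sets the local-time view
    $File."${Property}Utc" = $Date.ToUniversalTime()  # sets the UTC view
  }
}
# Usage: create a scratch file and back-date it
$Scratch = Join-Path -Path ([System.IO.Path]::GetTempPath()) -ChildPath 'Foo.xxx'
'FOO' | Out-File -FilePath $Scratch
Set-FileTime -Path $Scratch -Date ([datetime] '1950-08-14')
(Get-Item -Path $Scratch).LastWriteTime
```

Since the local and UTC properties are two views of the same stored value, setting the UTC variant last keeps the pair consistent.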

      Thursday, November 26, 2020

      Creating a PowerShell Cmdlet Using C# and DotNet.eXE

      As part of a book I am writing, I wanted to show how simple it could be to create your own cmdlet in C#. This was to be a part of an advanced look at the things you can do with PowerShell 7 and DotNet Core. 

      Most IT professionals know about the .NET objects produced by cmdlets, how to discover details about those objects, and how to reach into the .NET BCLs to do things that cmdlets can't. You can also extend PowerShell using a .NET Language such as C# to create a class. You use Add-Type to import the class into your PowerShell session and use it just like any other .NET class. Creating a cmdlet turned out to be more complex.

The first question any sane person might ask is: WHY? Why would anyone want to write a cmdlet or create a class using C#? The obvious reason, per Jimmy The Swede, is because you can. As an IT professional, knowing how is just another tool on your tool belt. But aside from that, there really are some good reasons. Using C# is easier when you need to perform asynchronous operations or multi-threading, or if you wish to use Language-Integrated Query (LINQ). In those cases, C# is a lot easier than trying to use PowerShell. Another use case is taking some ASP.NET code, such as a page which creates a list of out-of-stock items, and re-purposing it for administrative use. Admittedly, there are not a lot of use cases in the wild - but as a PowerShell person, you should know!

If you are a hardcore Windows developer, you probably have Visual Studio and know how to use it. But in this article, I'm going to use a different approach: the .NET SDK and the dotnet.exe command. You can do most of this from the command line, although there seems to be no simple way to install the SDK that way. So here goes.

      1. Install the .Net Core SDK.  

      This is almost the hardest part of the job. You need to install the Software Development Kit in order to get the tools you need to create the cmdlet. This involves using the GUI.

      Navigate to the Dotnet download page: https://dotnet.microsoft.com/download/


      From here, click on the download button. This leads to the downloading page:


The download takes a few seconds/minutes. Eventually, you can see in the bottom-left corner that the download is complete, and you can click on Open file, as you can see here:



      Clicking on the open file link runs the installer to install the .NET Core SDK on your computer. Eventually, you see this:



      Click on Close and you now have the SDK loaded on your system. From here it's fairly simple - here's the script:

      # 2. Create the cmdlet folder
      New-Item -Path C:\Foo\Cmdlet -ItemType Directory -Force
      Set-Location C:\Foo\Cmdlet

      # 3. Creating a class library project
      dotnet new classlib --name SendGreetings

      # 4. Viewing contents of new folder
      Set-Location -Path .\SendGreetings
      Get-ChildItem

      # 5. Creating global.json
      dotnet new globaljson

      # 6. Adding PowerShell package
      dotnet add package PowerShellStandard.Library

      # 7. Create the cmdlet source file
      $Cmdlet = @"
      using System.Management.Automation;  // Windows PowerShell assembly.
      namespace Reskit
      {
        // Declare the class as a cmdlet
        // Specify verb and noun = Send-Greeting
        [Cmdlet(VerbsCommunications.Send, "Greeting")]
        public class SendGreetingCommand : PSCmdlet
        {
          // Declare the parameters for the cmdlet.
          [Parameter(Mandatory=true)]
          public string Name
          {
            get { return name; }
            set { name = value; }
          }
          private string name;
          // Override the ProcessRecord method to process the
    // supplied name and write a greeting to the user by 
          // calling the WriteObject method.
          protected override void ProcessRecord()
          {
            WriteObject("Hello " + name + " - have a nice day!");
          }
        }
      }
      "@
      $Cmdlet | Out-File .\SendGreetingCommand.cs

      # 8. remove class file 
      Remove-Item -Path .\Class1.cs

      # 9. Build the cmdlet
      dotnet build 

      # 10. Importing the DLL holding the cmdlet
      $DLLPath = '.\bin\Debug\net5.0\SendGreetings.dll'
      Import-Module -Name $DLLPath
At that point, you can use the Send-Greeting command, like this:
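Based on the WriteObject call in the C# source above, a call and its output look like this ('Jerry' is just an example name):

```powershell
# Assumes the module was imported from the built DLL, as in step 10
Send-Greeting -Name 'Jerry'
# → Hello Jerry - have a nice day!
```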


Thanks to James O'Neill, who got me started on this solution. Thanks also to Thomas Rayner's excellent PowerShell Summit talk at https://www.youtube.com/watch?v=O0lk92W799g&t=4s