Learn PowerShell in a series of free Live Meetings

On Monday 12th March, Ed Wilson, the Microsoft Scripting Guy, is starting a week of free Live Meetings to get beginners up to speed with Windows PowerShell. The live sessions are at 10am (Pacific) each day, so that’s 7pm to us, but they’ll also be recorded and made available at the TechNet Script Center’s Learn PowerShell page, where you can already find some great content.

The Windows PowerShell for the Busy Admin series covers the following:

Session 1 – PowerShell SmowerShell or: Why Bother to Learn Windows PowerShell

In this session, Microsoft Scripting Guy, Ed Wilson, discusses the fact that, in addition to being the management future for Microsoft products, Windows PowerShell offers a number of compelling reasons to learn it. It is powerful, providing the ability to collect and consolidate information from multiple remote systems into a centralised view of the data. It is safer than many other tools, offering the ability to prototype a command prior to execution. There is also a confirmation mode that allows a network administrator or other IT pro to step through a group of commands selectively, cherry-picking which commands to execute and which to ignore. Windows PowerShell also has built-in logging that documents not only which commands are executed, but the resultant output from those commands. In addition, Windows PowerShell contains numerous features to promote a high level of discoverability and intuitive usability. This session is heavy with practical tips and demonstrations.

Session 2 – Heard It Through the Pipeline or: How to Compound PowerShell Commands for Fun and Profit

One of the most basic and most powerful features of Windows PowerShell is the pipeline. By using the Windows PowerShell pipeline, one can take a basic set of cmdlets and build a nearly infinite assortment of useful commands. And yet, all of this boils down to using the pipeline for essentially four types of activity. The first is to use the pipeline to retrieve items and to work on them. The second is to use the pipeline to filter out data. The third is to use the pipeline to persist information. The fourth is to use the pipeline to format output. In this session, all four basic uses of the pipeline are covered with a heavy dose of demos.
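Those four activities can be strung together in a single pipeline. Here's an illustrative sketch of my own (the 50MB threshold and file path are arbitrary examples), not one of the session demos:

```powershell
Get-Process |                                       # retrieve items to work on
    Where-Object { $_.WorkingSet -gt 50MB } |       # filter out data
    Sort-Object WorkingSet -Descending |            # work on the items
    Select-Object Name, Id, WorkingSet |            # keep the interesting properties
    Export-Csv C:\temp\busy.csv -NoTypeInformation  # persist the information

Import-Csv C:\temp\busy.csv | Format-Table -AutoSize  # format the output
```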

Session 3 – Sole Provider? Not Hardly or: A Look at Windows PowerShell Providers

One of the revolutionary concepts in Windows PowerShell is the idea of PowerShell providers. Windows PowerShell providers offer a singular way to access different types of data that are stored in different locations. Default providers include the file system, registry, aliases, variables, functions, and environment variables. This means that one can use Get-Item to access content stored in any of these locations. Not only that, but these providers are extensible, which means that Microsoft teams (and non-Microsoft developers) can create additional providers.
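As a quick illustration (a sketch of my own, not from the session; the registry path is just an example), the same item cmdlets reach into every provider:

```powershell
Get-PSProvider                      # list the providers available in this session
Get-ChildItem C:\Windows            # FileSystem provider
Get-Item Env:\COMPUTERNAME          # Environment provider
Get-Item HKLM:\SOFTWARE\Microsoft   # Registry provider
Get-ChildItem Alias:                # Alias provider
Get-ChildItem Function:             # Function provider
```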

Session 4 – The Main Event or: PowerShell Does Event Logs

Regardless of one’s position, it seems that at some point or another everyone will be involved in looking at event logs. And why not…especially since Windows has such great logging support. Whether it is for security reasons, troubleshooting reasons, or general Windows health monitoring, the logs contain nearly all of the required information one seeks. In this session, Microsoft Scripting Guy, Ed Wilson, discusses the classic and the newer ETW style of logs, and looks at the tools that are used with each type of log.
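To give a flavour of the two log types (again a sketch of my own, not from the session; the operational log name is just an example):

```powershell
# Classic logs, via the older cmdlet:
Get-EventLog -LogName System -EntryType Error -Newest 10

# Get-WinEvent reads both classic and the newer ETW-style logs:
Get-WinEvent -ListLog * | Where-Object { $_.RecordCount -gt 0 }
Get-WinEvent -LogName 'Microsoft-Windows-GroupPolicy/Operational' -MaxEvents 10
```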

Session 5 – More than Remotely Possible or: Using PowerShell to Manage the Remote Desktop

Let’s face it: even though there are lots of commercial products out there that assist in managing desktops or servers, most are very complex and require a dedicated support team to manage them. Even in organizations where such tools exist, the team’s agenda and the front-line admin’s agenda often clash. For ad-hoc situations, using Windows PowerShell to manage remote machines fills in the gray area. In this session, Microsoft Scripting Guy, Ed Wilson, discusses using Windows PowerShell to manage remote machines.
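By way of illustration (a sketch assuming PowerShell remoting is enabled on the targets; the computer names are made up):

```powershell
# Run a command on several machines at once and get the results back:
Invoke-Command -ComputerName PC01, PC02 -ScriptBlock { Get-Service Spooler }

# Or drop into an interactive session on a single machine:
Enter-PSSession -ComputerName PC01
```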

I’d encourage anyone who hasn’t already begun, to learn PowerShell before it’s too late!

Windows 7 now the most used OS on campus

It’s been a while since I did these stats, and I don’t think I’ve ever blogged them before, but I was prompted to check this after hearing Microsoft say that Windows 7 has overtaken previous versions worldwide.

It turns out that even though there hasn’t been any particular institutional drive towards Windows 7, thanks to the efforts of pro-active staff across the University, it does now significantly outnumber all other operating systems in our Active Directory combined! In the last 3 months, the breakdown of computers is…

Windows 7: 7155
Everything else: 5385

There are an unknown number of machines that aren’t connected to Active Directory, but the spread of those won’t affect this a great deal.

Another interesting stat that I heard recently came from Dell. They now predict that 50% of all server workloads are virtualised. That’s probably about true here – I’ll see if I can get those stats later.

Incidentally, the breakdown of Windows servers in our AD sees roughly equal numbers of Server 2008 R2 and Server 2003 (just over 200 of each), and just over a hundred running Server 2008.

If anyone is interested in grabbing these stats in their own organisation (or OU), then it’s just a simple bit of PowerShell using the Quest AD cmdlets:

$computerObjects = Get-QADComputer `
-IncludedProperties pwdLastSet -SizeLimit 0
$recentComputers = $computerObjects | `
Where {$_.pwdLastSet -ge ((Get-Date).AddDays(-90))}
$recentComputers | Group OSName | `
Sort Count -Desc | Format-Table Count,Name -AutoSize

(That’s just 3 lines of code, but it could be fewer. The ` character extends the line in PowerShell)

Pre-staging Computers in Active Directory for WDS with PowerShell and Quest AD cmdlets

One of the most common issues when building computers with Windows Deployment Services (WDS, and RIS before that) is typos in the GUIDs used to net-boot the PCs. When you’re entering them by hand as you pre-stage the computer objects in Active Directory it’s very easy to make mistakes, especially when you’re entering a lot of them. It’s also extremely time consuming if you have to boot each machine to the point of PXE displaying the MAC and GUID – that’s why the smart move is to request that information from the supplier, preferably before they deliver the machines.

Anyone who has pre-staged a computer object before will be aware of the jiggery-pokery that goes on with switching round the first half of the GUID, so that when you view it later in ADUC, you see something significantly different to what you typed in. It appears that this conversion is done by the GUI when you create the object, so when you’re adding them programmatically, you need to change the format yourself.

Microsoft published a VBScript function to reformat the GUIDs so they could be added to AD by a script, but I haven’t seen similar in PowerShell, so here it is:

function flip-guid ([string]$g) {
$g = $g.replace("-","").replace(" ","")
(-join $g.substring(0,16).tochararray()[6,7,4,5,2,3,0,1,10,11,8,9,14,15,12,13]) + $g.substring(16,16)
}

The function takes the GUID as a string and first removes any dashes or spaces (since I’ve received them from suppliers with both at different times). Next it converts the first half into an array of characters, selects them back in the new order and uses the join operator to make them back into a string, to which it concatenates the second half, unchanged from the original. As with most things in PowerShell it could be reduced down to a single line, or expanded further to enhance readability.
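To see the reordering in action, here’s the function run against a made-up GUID:

```powershell
flip-guid "00112233-4455-6677-8899-AABBCCDDEEFF"
# The first three groups are byte-swapped; the second half is untouched:
# 33221100554477668899AABBCCDDEEFF
```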

So, given the ability now to change the format, I use Quest’s AD cmdlets (if you haven’t come across these before, take a look now!) to create the computer objects. Assuming that you have a CSV file containing the new PC’s name and GUID, just do this…

Import-Csv newpcs.csv | foreach {
New-QADComputer $_.name -ParentContainer "SomeOU" -ObjectAttributes @{netbootguid = ([guid](flip-guid $_.guid)).ToByteArray()}
}

That’ll leave you with a load of new computer objects ready for WDS. 🙂

NB. It’s likely that the code snippets above have been wrapped to fit the page layout. In the function there are only two lines – everything from “-join” to the end is the same line. In the foreach scriptblock that’s just a single line.

Must-have PowerShell snippets

Over the last few months my colleague Jon has been providing me with some very useful PowerShell snippets, which I thought I’d share. A number of them require the Quest ActiveRoles Management Shell for Active Directory.

Display Group memberships for a user.

(Get-QADUser username).MemberOf

Display the members of an Active Directory Group

Get-QADGroupMember "Groupname" | ft name,displayname -a

Bulk remove machines from Windows DNS

The text file contains a list of NetBIOS machine names.

gc computers.txt | %{dnscmd dnsservername /RecordDelete campus.ncl.ac.uk "$_" A}

Recurse through a directory structure and delete all files created more than 90 days ago.

The text file contains a list of UNC paths. Note the -WhatIf switch on the end, which previews the deletions – remove it to delete for real.

GC filecontainingpaths.txt | %{dir $_ -recurse | ?{!$_.psiscontainer -and $_.creationtime -lt ((get-date).adddays(-90))} | del -whatif}

Active Directory Spring Cleaning: Unnecessary Computer Objects

Yesterday, a PowerShell script I’d written sent an email to the members of the Active Directory security groups that are delegated control of computer objects within the OUs for various sections of the University under the “Departments” OU. These messages contained a list of all the computer objects in each departmental OU which haven’t contacted the domain to change their password for 90 days (by default a computer will change its password every 30 days) – that being an indication that the computer object may be unnecessary and could possibly be deleted.

In order to generate these reports, I use Windows PowerShell and the Active Directory cmdlets from Quest Software. Once you have those installed, you’ll find a “Quest Software” folder in the Start menu, which contains the “ActiveRoles Management Shell for Active Directory” – you should run this as a member of the admin group that has permissions on the OU you want to report on. Then it’s just a case of a couple of lines of PowerShell.

[If none of this makes any sense, then I’m going to recommend that you go and read the Getting Started chapter from the TechNet Script Center’s PowerShell Owner’s Manual]

First we’ll put the OU’s distinguished name in a string variable, just to reduce the amount of wrapping we’re going to have on the next line…

$OU = "OU=ISS,OU=Departments,DC=campus,DC=ncl,DC=ac,DC=uk"

Then we find the computer objects by using Quest’s Get-QADComputer cmdlet, filtering on the pwdLastSet property to find computers that last set their password more than 90 days ago…

Get-QADComputer -SearchRoot $OU -SearchScope Subtree `
-IncludedProperties pwdLastSet -SizeLimit 0 | where {$_.pwdLastSet -le $((Get-Date).AddDays(-90))}

That gives you a table of computer objects using the default formatting, but we can do better than that.

If we pipe the output of the filter to the Select-Object cmdlet, we can select interesting properties to look at. I’m going to select the computer object’s name, description and parentcontainerdn so we can see how we’ve labelled it and exactly where it is in our OU structure…

Get-QADComputer -SearchRoot $OU -SearchScope Subtree `
-IncludedProperties pwdLastSet -SizeLimit 0 | where {$_.pwdLastSet -le $((Get-Date).AddDays(-90))} | select name,description,parentcontainerdn

These might not be the most helpful properties for the computers you manage, so you can check the full list of properties of the computer objects by piping one into the Get-Member cmdlet.
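For example, limiting the search to a single object so the query comes back quickly:

```powershell
Get-QADComputer -SearchRoot $OU -SizeLimit 1 | Get-Member -MemberType Property
```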

We might have some useful data at this point, but there’s probably going to be some truncation going on, and it might be more useful if we could sort it. You could use the Sort-Object and Format-Table cmdlets to help, but I’m going to suggest that we might be better getting it out into Excel so you can order it and play with it in any way you want. To that end, we’ll pipe the whole lot into the Export-Csv cmdlet…

Get-QADComputer -SearchRoot $OU -SearchScope Subtree `
-IncludedProperties pwdLastSet -SizeLimit 0 | where {$_.pwdLastSet -le $((Get-Date).AddDays(-90))} | select name,description,parentcontainerdn | Export-Csv "C:\temp\computers.csv" -NoTypeInformation
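If you want to go a step further and email the report, Send-MailMessage can attach the CSV. A minimal sketch – the server and addresses below are placeholders, and my actual reporting script does rather more than this:

```powershell
Send-MailMessage -SmtpServer "smtp.example.com" `
    -From "ad-reports@example.com" -To "ou-admins@example.com" `
    -Subject "Stale computer objects" `
    -Body "Computer objects that haven't changed their password in 90 days." `
    -Attachments "C:\temp\computers.csv"
```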

I hope that helps. 🙂

VBUG Newcastle’s first IT Pro event

On Tuesday, the University hosted the first IT Pro event held by VBUG Newcastle. Going forward the aim is to host developer and sys admin events in alternate months. For the first set of sys admin content, I did a presentation entitled “PowerShell: 0-60 in One Evening”, which you can find the details of at: http://www.jonoble.com/blog/2009/3/26/powershell-0-60-in-one-evening.html

For a first event of a brand new group, I think a turnout in the high teens was ok, and from a speaker’s perspective the level of interaction was good, but we’d love to see double that number next time. We’ve got a great speaker planned for the May event and I’ll post the details here as soon as everything is confirmed.

Free PowerShell event: 24th March

I’ve been working with Andrew Westgarth, who runs the VBUG Newcastle events, to try to provide IT Pro (i.e. sys admin) content as well as their traditional developer events. The idea is that we’ll run free developer and IT Pro events alternate months on the Newcastle University campus.

The first of the IT Pro events will be held on the 24th March in Claremont Tower and I will be presenting “Windows PowerShell: 0-60 in One Evening”. The presentation will highlight a number of free tools to help you get up to speed quickly with PowerShell.

All the details are on the VBUG site at:

If you’d like to come along, please book your free place (just so we don’t run short of refreshments).

Forthcoming Events on Campus

I will be presenting a brief introduction to Windows PowerShell at this month’s Super Mondays event in the Beehive (Old Library Building) on the 23rd at 18:00. These events are growing rapidly and well worth attending – the topics are usually diverse enough that there should be something of interest to everyone. See SuperMondays.org for the full line-up and the Upcoming link if you’re attending. I hope to see you there.

The postponed VBUG Newcastle developer event on “Parallel Programming in .NET (VS2010)” with Eric Nelson has now been rearranged for Tuesday 24th February at 18:30 in Claremont Tower room 118. Sign up at:

‘Windows Server’ 7 aka Windows Server 2008 R2 Feature list

Last week at PDC Microsoft announced that Microsoft Windows Server 2008 R2 will be the server variant of Windows 7.

Here at TechEd we are seeing demonstrations of some of W7/R2’s features. Here is a quick run-through; more detail to follow.

  • Live Migration
  • Remote Desktop Services, which will supersede Terminal Services
  • BitLocker To Go
  • DirectAccess (a possible killer app for Server 2008 R2 and IPv6)
  • BranchCache
  • SMB enhancements
  • Offline Files enhancements, including a “usually offline” mode
  • Wake on Wireless LAN
  • Improved power management and increased control via Group Policy
  • Group Policy scripting with PowerShell
  • Programmatic interface into the performance and reliability systems
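The Group Policy scripting item, for example, surfaces as a GroupPolicy module of cmdlets on R2. A quick sketch of the sort of thing it enables (the backup path is just an example):

```powershell
Import-Module GroupPolicy
Get-GPO -All                          # list every GPO in the domain
Backup-GPO -All -Path C:\GPOBackups   # back them all up
New-GPO -Name "Test Policy"           # create a new GPO
```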

SQL Server 2008 arrives

At the moment WIT run a collection of SQL 2000 and 2005 servers that host around a hundred databases of varying size and importance to the institution. The lion’s share of those databases are currently on the older SQL Server 2000, so several months ago, with the end of mainstream support for that product approaching, we started making plans for migration.

We’ve been keeping a close eye on the development of the latest version, SQL Server 2008, since it was announced, and trialing pre-release versions. SQL Server 2008 offers a number of advantages over previous versions and the migration path from SQL 2000 to 2005 or 2008 is much the same, so we’ve opted to take those databases that are currently on SQL 2000 straight to 2008, rather than moving them twice.

SQL Server 2008

Last week, we were fortunate to have Microsoft’s Andrew Fryer spending a day with us, discussing our migration plans. Since none of our databases do anything especially odd (not that some of them aren’t complex), SQL Server 2008’s comprehensive Upgrade Advisor was able to tell us that we didn’t need to make any changes to the databases before moving them to the new version.

There are some things that Upgrade Advisor suggests for after the migration, such as re-writing DTS packages using the SSIS technology that replaced DTS in SQL Server 2005, but existing DTS packages will work in SQL Server 2008, so our advice is that the time to migrate from DTS to SSIS is when you need to alter a package.

This week SQL Server 2008 has been released to manufacture, so we’ll be moving forward with building production and test systems with the finished code. We’ve planned a setup which provides higher availability and better disaster recovery than we’ve previously implemented, and we’re looking forward to taking advantage of some of the new features (I’m especially looking forward to working with the SQL Server PowerShell functionality!).