Saturday, January 31, 2015

Office 365 Plans

Licensing with Office 365 is a bit different from normal on-premises software. With Office 365, rather than purchasing a SKU, you subscribe to a Plan. A plan represents the set of services the user of that plan receives. In the early days of Office 365, the plans and the ability to move between them were problematic. If you took out a small business type plan and then wanted to move that subscription up to an Enterprise plan – you couldn't. It was all a bit messy. Then Microsoft did two things: it simplified the plan structure, and enabled you to mix and match the plans.

The new plan structure for Office 365 contains 6 separate plans:

  • Office 365 Business Essentials
  • Office 365 Business
  • Office 365 Business Premium
  • Office 365 Enterprise E1
  • Office 365 Pro Plus
  • Office 365 Enterprise E3

The Office 365 Business/Pro Plus plans are just the fully installed set of Office software products that you would traditionally run on-premises. This includes Word, Excel, PowerPoint, Outlook, Publisher and OneNote; the Pro Plus plan adds Access. These plans are a way of subscribing to Office, versus buying it outright. They include just the software – there are no online services or server software. These plans appeal to organisations that are looking to spread the cost over time. One nice aspect of these plans is that you can load Office on up to 5 systems (e.g. laptop, desktop, home PC, etc.).

The Business Essentials and Enterprise E1 plans are, in effect, online Office: the Office Online apps, plus an Exchange mailbox, file and storage space, plus both SharePoint and Lync. The E1 plan has a few added features appropriate to larger firms, such as compliance, BI and enterprise management of applications.

Finally, the Business Premium and E3 plans are the combination of the first two: the full Office suite plus mail, SharePoint and Lync. As above, the E3 plan gets a bit more.

For fuller details of precisely what each plan offers, see https://products.office.com/en-gb/business/compare-more-office-365-for-business-plans.

What this now means is that any organisation can mix and match any of the plans. The restriction is that a given subscription is limited to 300 seats or fewer of the Business plans. So you could give the Business/Pro Plus subscription to the road warriors who need offline access, while giving Business Essentials to in-house staff who can use Office in the cloud. This offers a lot of flexibility.


Friday, January 30, 2015

PowerShell Patterns and Azure Cmdlets

As anyone learning PowerShell quickly finds out, there are a number of basic patterns of usage that work well across a huge range of PowerShell cmdlets/modules. One of the most basic pairs of patterns I try to teach, by way of the example here, is early vs late filtering. The following two commands accomplish the same result:

Get-Process Power*
Get-Process | Where-Object Name -like 'Power*'

I wrote a little script to measure the performance difference between these two (http://pastebin.com/4N2YYqnZ). Running each of these 1000 times, the result was that early filtering was around 3 times as fast as late filtering. So as a general rule – always filter early. BUT: sadly, some cmdlets that emit useful objects do not support early filtering, whereas the late-filtering pattern always works. I try to teach both of these patterns, and why they differ, since so many scripts can make use of them.
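Here's a rough sketch of that sort of comparison (this is not the pastebin script itself, and the timings will vary from machine to machine):

# A rough sketch comparing early vs late filtering
$early = Measure-Command { 1..1000 | ForEach-Object { Get-Process Power* } }
$late  = Measure-Command { 1..1000 | ForEach-Object { Get-Process | Where-Object Name -like 'Power*' } }
'Early: {0:N0} ms   Late: {1:N0} ms' -f $early.TotalMilliseconds, $late.TotalMilliseconds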

Which brings me to Azure. In much of my PowerShell use, there is a cmdlet to set some value for some object or perform some operation on an object. For example, to change the number of CPUs in a Hyper-V VM, you use the Set-VM cmdlet, like this:

[screenshot: using Set-VM to change the CPU count of a Hyper-V VM]
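A minimal sketch of that command, assuming a local Hyper-V VM named 'psh1':

# Change the CPU count of an on-premises Hyper-V VM
Set-VM -Name psh1 -ProcessorCount 2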

To do this in Azure is a bit more complicated. To change the number of processors in an Azure VM, you need to change a different property on the Azure VM object. Plus, the way you do it is different. With Azure, each VM runs at a given instance size, and the instance size determines the number of CPUs and the memory each Azure VM gets. So I would expect to do something like this:

Set-AzureVM -Name xxx -Service xxx -InstanceSize 'extrasmall'

Unfortunately, that cmdlet does not exist, and it wouldn't be the right way to do it anyway. The good news is that in Azure there is a specific cmdlet to change the instance size (Set-AzureVMSize). So for Azure VMs, the most common pattern is like this:

Get-AzureVM -Name psh1 -ServiceName psh1 -Verbose |
  Set-AzureVMSize -InstanceSize $VmSize -Verbose |
     Update-AzureVM -Verbose

This is a different pattern. Unlike Set-VM, Set-AzureVMSize takes an Azure VM object (not the name of the VM). Secondly, Set-AzureVMSize does not persist the updated instance size value – it just updates an in-memory object relating to the VM. To persist the value, you have to write it back to Azure using Update-AzureVM. While you can do it a step at a time and avoid using the pipeline, in this case using the pipeline seems easier.

In the Azure VM cmdlets, we see a great deal of use of pipeline-based patterns. For example, to create an Azure VM, you first create a VM config object (technically, New-AzureVMConfig produces an object of type Microsoft.WindowsAzure.Commands.ServiceManagement.Model.PersistentVM), which you then pipe to New-AzureVM (see http://tfl09.blogspot.co.uk/2015/01/managing-azure-vms-with-powershell.html). This general pattern of getting or creating an in-memory object, updating that in-memory object (typically via the pipeline), and finally persisting it to Azure is easy to apply.
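To make the shape of that pattern concrete, here is a minimal sketch – the VM/service names are illustrative, and $ImageName and $AdminPassword are assumed to hold an image name (from Get-AzureVMImage) and an admin password respectively:

# A sketch of the create-then-persist pattern
New-AzureVMConfig -Name 'psh1' -InstanceSize 'Small' -ImageName $ImageName |
    Add-AzureProvisioningConfig -Windows -AdminUsername 'tfl' -Password $AdminPassword |
        New-AzureVM -ServiceName 'psh1' -Location 'West Europe'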

When you start to use Azure, you quickly find that while many of the features of Azure are similar to those in Windows – in this case, Azure VMs – the methods you use to manage those features do differ. You have different patterns of usage you need to learn and understand. When I explained this to a class this week, I got the obvious question: why? Why is it different?

Since I was not on the dev team, I can't really answer that. But I suspect the main technical reason is that the Azure cmdlets are, in effect, a wrapper around the REST API exposed by Azure. I am guessing it was easier to write the cmdlets using this pipeline pattern (or the cmdlets' approach of passing a VM object vs a VM name). Having said this, incorporating the Get-AzureVM and Update-AzureVM calls inside the Set-AzureVMSize cmdlet would not have been that difficult. But that wasn't what MS implemented.

So go with the flow; learn the patterns and enjoy Azure!


Updated Azure PowerShell Module

I've been playing a lot with the Azure cmdlets and noticed that there's an updated version of the module. My recent Azure-related posts have been based on the version of the tools I loaded before the New Year:

[screenshot: the previously installed Azure module version]

If you go to the download page, http://go.microsoft.com/fwlink/p/?linkid=320376&clcid=0x409, and run the platform installer, you see that an updated version of the tools is available:

[screenshot: the Web Platform Installer showing the updated Azure PowerShell tools]

The download is around 13 MB and took around 45 seconds to download and install. After installation:

[screenshot: the newly installed Azure module version]

Sadly, there appear to be no release notes or readme.txt file in the installation folder. Given the huge amount of change, better release notes would be useful. AND it would be nice to have the latest module available via PowerShellGet. We can hope!


Thursday, January 29, 2015

Using Azure – Get a Free Month's Trial

I've been doing a lot of work around Azure of late and have been running Azure training around Western Europe. Most of the folks I see in classes are new to Azure – it's an unknown that is challenging their existing approach to computing (i.e. everything on-premises, each app running on its own physical hardware). In our classes, we give the students access to Azure and get them to use Azure Backup, let them create web sites, VMs, virtual networks, etc. In one class, we have the students build out an 'on-premises' environment (in Azure!), then use that environment to integrate with Office 365.

What I am seeing is that Azure is fairly easy, but very different in places. Certainly, the exposure to Azure in class gets the attendees over the initial learning curve and allows them to play. The key feature of all this playing is the free Azure Pass we give each student. But what if you were not able to attend the training but still want to play with Azure?

The answer is easy: if you live in the UK, go here: http://azure.microsoft.com/en-us/pricing/free-trial/?WT.mc_id=A2FC1A0FA and sign up for a free Azure Pass. From the UK, at least, you go there, sign up for free and get £125 worth of Azure credit. Signup does require a credit card, but Microsoft states: "We use the phone number and credit card for identity verification. We want to make sure that real people are using Azure. We do not bill anything on the credit card".

Should you exceed the credit amount, your new Azure account will be suspended. At any time, you can optionally upgrade the trial to a Pay-As-You-Go Azure subscription. BUT you will not be billed anything if you use your credits and let the subscription expire.

This trial is not just available here in the UK: it's available in 140 countries around the world. See the FAQ section of the free trial page to see the countries where it's available.

So if you have not used Azure and want to experiment, go and sign up. And join in the fun that is Azure. You can even use the scripts I've posted here to play with PowerShell and Azure! So what are you waiting for??


Tuesday, January 27, 2015

Azure Networking Fundamentals- MVA Course

I just noticed that the Microsoft Virtual Academy has a new course: Azure Networking Fundamentals for IT Pros. It's narrated by Aaron Farnell and consists of 4 modules taking around 120 minutes (including assessment time). The MVA rates the technical level as 300 – which is fairly deep, but not overly so!

The 4 modules are:

  • Intro to Azure Network Basics and VPN requirements
  • Plan and Design your Cloud network infrastructure
  • Configuring Azure and On-Premises
  • Testing Connectivity and Monitoring

This is a good start to understanding Azure networking! And like all MVA courses, you can download the MP4s of each module and take them on the road with you.


Monday, January 26, 2015

Azure and Compliance

Over the past 6 months, I've been conducting quite a lot of cloud technology training, particularly Azure and Office 365. I've been speaking to a number of European MSPs in the SMB space who are now looking to take on Azure as a platform for their customers, to some degree replacing their old favourite, Small Business Server. SBS (RIP) was a great platform for the small business – cheap, comprehensive and relatively easy to manage. But it's gone and not coming back.

When extolling the virtues of the cloud, I hear a number of objections – some valid, some possibly less so. One objection I hear to Azure revolves around compliance. For customers in compliance-affected businesses, compliance is not optional.

It's clear that Microsoft recognise the need to have Azure seen as a product that can comply with most, if not all, of the world's compliance regimes. It was comforting, therefore, to read Lori Woehler's recent blog article about Azure and compliance.

In her article, she notes that Azure has recently successfully completed an audit against ISO/IEC 27001:2013, carried out by the independent British Standards Institute (BSI) Americas. BSI also validated that Azure complies with ISO/IEC 27018, the code of practice for protection of Personally Identifiable Information (PII) in the cloud.

Woehler goes on to note that Azure has expanded the services in scope for SOC 1 and SOC 2; that the US Department of Health and Human Services has granted FedRAMP authority to operate to both Office 365 and Azure AD; and that Azure Government is one of the first cloud platforms to meet the U.S. Criminal Justice Information Services certification requirements for state and local governments. She mentions other, non-US compliance initiatives for Azure, including Singapore Multi-Tier Cloud Security (MTCS's first Level 1 end-to-end cloud service offering) and the Australian Government Information Security Registered Assessors Program (IRAP) accreditation.

These, and the other Azure initiatives mentioned, should help to bridge the confidence gap (as well as enabling Azure to be used in many compliance-bound industries). And this work just keeps going on, both to comply with new and additional compliance schemes and to re-certify on a regular basis. Azure is changing on what appears to be a weekly basis – the compliance certifications need to keep pace.

Hopefully, this continuing effort will go a long way towards assuaging at least some of the concerns of the SMB market space.


Friday, January 23, 2015

Creating Azure VMs with PowerShell

I've been playing for the past few weeks with Azure VMs. I can, obviously, do most things from the GUI, but somehow, as a PowerShell guy, that just seems wrong! So I've been honing my PowerShell skills. The Azure module contains over 500 commands. In the version on my home workstation, there are 509 commands, of which 35 are aliases.

As an aside, those aliases are used to alias the old Windows Azure Pack cmdlets, which have in effect been replaced with newer cmdlets. For example, the old Get-WAPackSubscription is now an alias for Get-AzureSubscription. This is a great idea for helping to ensure backwards compatibility, and I suspect we'll see more of it. For those of you with older Windows Azure Pack based scripts – you may want to consider upgrading those to use the new Azure module's native cmdlets.
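If you want to see those counts and aliases for yourself, here's a quick sketch (the numbers will differ depending on the module version you have installed):

# Count the commands and list the aliases in the Azure module
Import-Module Azure
(Get-Command -Module Azure).Count
Get-Command -Module Azure -CommandType Alias
Get-Alias -Name Get-WAPackSubscription    # resolves to Get-AzureSubscription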

The first thing you have to know about Azure VMs is that, while they are in the end 'just' Hyper-V VMs, they are a bit more complex than on-premises VMs. There are several Azure features to consider, including storage accounts, images, endpoints, locations and Azure services.

In Azure, each VM runs within an Azure service. I like to think of the Azure service as the load balancer: in the simplest case of a single VM, you have the VM and its service, and you use the name and IP of the service to access the VM. The service also allows you to provide more instances of the VM in an LBFO (load balancing and failover) fashion. But today, I just want to create a simple VM. The VM I am creating runs inside a new Azure service with the same name as the VM.

To create an Azure VM, you have two main options: use an Azure VM image as the starting point for your VM, or create the VM on-premises, ship its VHD up to Azure and then create a VM using that uploaded VHD. In this blog post, I'll concentrate on the first, leaving the second for another day.

A VM image is a sysprepped operating system image, often with additions and customisations. The image was built from some base OS, with some changes possibly made. You can get simple base OS images – in effect what would be on the product DVD. Others have been customised, some heavily. Azure images come from a host of places, including Microsoft and its partners. Once the reference image is created, the creator prepares it (with Sysprep) and publishes it to Azure.

You can easily create a VM from an existing VM image. To see the VM images in Azure, you simply use Get-AzureVMImage. As of writing there are 438 images; of these, 187 are Linux based and 251 Windows based. A given image is available in one, or more typically all, Azure locations. An image belongs to an image family and has a unique image name. With 125 image families to choose from, finding your image (and its specific image name) is a matter of running Get-AzureVMImage and piping the output to your normal PowerShell toolset!

One suggestion if you are experimenting: a Get-AzureVMImage call takes a while, as you are going out to the Internet (in my case over a slow ADSL line). Save the images to a variable, then pipe the variable to Where-Object/Group-Object/Sort-Object/Select-Object, thus avoiding repeated round trips to the Azure data centre.
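As a rough sketch of that approach (the counts will differ as the image catalogue changes):

# Fetch the image list once, then slice and dice it locally
$imgs = Get-AzureVMImage                             # the slow, over-the-wire call
$imgs.Count                                          # total number of images
$imgs | Group-Object -Property OS                    # the Linux vs Windows split
($imgs | Group-Object -Property ImageFamily).Count   # number of image families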

Today, I just want to create a VM running Windows Server 2012 R2 Datacenter. So to find the image, I do this:

[screenshot: using Get-AzureVMImage to find the Windows Server 2012 R2 Datacenter image]
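In essence, that lookup is the same filter the script below performs. As a sketch, you can either match the exact image label (as the script does) or take the most recent image in the family – note that the family name shown here is an assumption based on the label:

# Two ways to find the image
$imgs = Get-AzureVMImage
# a) exact label match - the approach the script below uses
$imgs | Where-Object Label -eq 'Windows Server 2012 R2 Datacenter, December 2014'
# b) the latest image in the family, whatever its current label
$imgs | Where-Object ImageFamily -eq 'Windows Server 2012 R2 Datacenter' |
    Sort-Object PublishedDate -Descending |
        Select-Object -First 1 ImageName, Label, PublishedDate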

Next, in creating the Azure VM, you need an Azure storage account in which to store the VHD for your VM. The VHD starts off as, in effect, a copy of the image (i.e. a sysprepped version of Windows) stored in Azure storage. You can pre-create the storage account, although in this case I let the cmdlets build the storage account for me.
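If you do want to pre-create one, a quick sketch of that step might look like this (the storage account and subscription names are illustrative; storage account names must be globally unique and lower case):

# Pre-create a storage account and make it the subscription's current storage account
New-AzureStorageAccount -StorageAccountName 'tflpshstore01' -Location 'West Europe'
Set-AzureSubscription -SubscriptionName 'MySubscription' -CurrentStorageAccountName 'tflpshstore01'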

So here's the script:

# Create-Azurevm.ps1
# This script creates an azure VM
# Set Values
# VM Label we are looking for
$label          = 'Windows Server 2012 R2 Datacenter, December 2014'
#vm and vmservice names
$vmname         = 'psh1'
$vmservicename  = 'psh1'
# vm admin user and username
$vmusername     = 'tfl'
$vmpassword     = '~+aQ8$3£-4'
# instance size and location
$vminstancesize = 'Basic_A3'
$vmlocation     = 'West Europe'

# Next, create a credential for the VM
# (not needed to create the VM itself, but handy later, e.g. for remoting into it)
$Username = "$vmname\$vmusername"
$Password = ConvertTo-SecureString $vmpassword -AsPlainText -Force
$VMcred = new-object system.management.automation.PSCredential $username,$Password

# Get all Azure VM images
$imgs = get-AzureVMImage 

# Then get the latest image's image name
$img   = $imgs | Where-Object Label -eq $label
$imgnm = $img.ImageName

# OK - do it and create the VM

New-AzureVMConfig -Name $vmname -InstanceSize $vminstancesize -ImageName $imgnm |
  Add-AzureProvisioningConfig -Windows -AdminUsername $vmusername -Password $vmpassword |
    New-AzureVM -ServiceName $vmservicename -Location $vmlocation

Once the VM is created, you can then start it and use it. Having said that, there are some prerequisites, like setting up endpoints and enabling remote management inside the VM. I'll cover these topics in later blog posts.


MVA Training For Azure

I see that Microsoft are continuing to post more great Azure training to the Microsoft Virtual Academy. If you look here, you will find some great videos that can ultimately prepare you for exam 70-533 (Implementing Microsoft Azure Infrastructure Solutions).

There are several great things about this training. First, it's pretty current – Azure changes literally weekly, which means some of the printed material from other training outlets can be woefully out of date! Second, it's free. And finally, being simple videos, it's easy to watch in your spare time (Ed: what is that?).


Monday, January 19, 2015

Azure VMs – Improving the Display XML

The cmdlets that come with Azure (all 508 of them) also come with a little bit of display XML. But as I play with the cmdlets, I find the display XML either non-existent or less than helpful. To that end, I've started developing some supplemental .format.ps1xml files.

Today, I was playing with the output of Get-AzureVM. By default, the output looks like this:

[screenshot: default Get-AzureVM output]

I have taken the liberty of creating some new display XML, and with it loaded, Get-AzureVM now produces better looking (IMHO) output:

[screenshot: improved Get-AzureVM output]

As you can see, there is some more information (including the IP address and FQDN) that is useful when you are troubleshooting.

To get the display XML, take a look at http://pshscripts.blogspot.com/2015/01/azurevmformatps1xml.html – there you can grab the XML, save it to a local file, then use Update-FormatData to load it.
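Loading it is a one-liner; a sketch, assuming you saved the XML to a path of your choosing:

# Load the supplemental display XML into the current session (the path is illustrative)
Update-FormatData -PrependPath 'C:\Scripts\AzureVM.format.ps1xml'
Get-AzureVM    # the output now uses the new view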

It's kind of cool to be able to improve released cmdlet sets! I wonder if anyone from the Azure team reads this. If so, you guys are welcome to the XML!

Friday, January 16, 2015

Fun with the PowerShell Pipeline

I've been having some fun with the PowerShell pipeline. This started with a PowerShell question posted over on Spiceworks (http://community.spiceworks.com/topic/737915-powershell-question-help). The OP wanted to stop the spooler, remove any temp files, then restart it. Simple enough – but the first response suggested using pipes, like this:

Stop-Service -Name Spooler |
Remove-Item C:\Windows\System32\spool\PRINTERS\* |
Start-Service -Name Spooler

I looked at this for a long while, wondering what the heck the pipeline would be doing here. More importantly, I could not believe it would work – surely the output of Stop-Service would be null, so the Remove-Item would fail, and so on. Then I tried it and, much to my surprise, it actually works!

Then I began to wonder two things – WHY does it work at all, and what is the performance impact? To understand why requires a good understanding of how the pipeline works. In this case, Jason Shirk has a great explanation of the underpinnings over on Stack Overflow: http://stackoverflow.com/questions/22343187/why-is-an-empty-powershell-pipeline-not-the-same-as-null. I highly recommend reading it to help you better understand what the pipeline is doing.

OK – so I know WHY it works, but what is the performance impact, if any? To answer this question, I first constructed a little script:

Function MeasureIt {
  $m1 = (Measure-Command -Expression {
      Stop-Service -Name Spooler |
      Remove-Item C:\Windows\System32\spool\PRINTERS\* |
      Start-Service -Name Spooler
  }).milliseconds

  $m2 = (Measure-Command -Expression {
      Stop-Service -Name Spooler
      Remove-Item C:\Windows\System32\spool\PRINTERS\*
      Start-Service -Name Spooler
  }).Milliseconds

  "{0,-10}{1}" -f $m1, $m2
}

1..20 | ForEach-Object { MeasureIt }

Now please read the script and guess which is going to be faster – with or without pipelines? The results absolutely astounded me:

15        307
2         273
2         270
254       280
2         277
10        267
2         271
2         269
2         276
2         268
2         268
2         277
2         272
2         268
2         268
2         268
10        267
2         269
2         267
2         271

I know why this works, but I sure as heck have NO idea how adding the pipeline, in this case, can offer up a 100-fold improvement in the speed of execution in most cases. Hmmm.

Thursday, January 15, 2015

Using Azure VMs for an Exchange DAG Witness Server

Last week, the Azure team released a cool new feature: support for a DAG witness server inside an Azure VM. With the latest version of Exchange, you can configure automatic data centre failover, but to do that you now require three physical sites. However, according to Microsoft, many customers only have two physical sites deployed. This is where Azure comes in, since these customers can use Azure as their third physical site. This provides a cost-effective method for improving the overall availability and resiliency of their Exchange deployment, and requires no up-front capital expenditure.

Of course, deployment of production Exchange servers is still unsupported on Azure virtual machines. But I can't help thinking that in due course we'll see this restriction changed. MS kind of hint at that by saying: "Stay tuned for future announcements about additional support for Azure deployment scenarios."

Yet another cool scenario available to Azure customers today. For more details on how to do it, see the TechNet article at: http://technet.microsoft.com/en-us/library/dn903504(v=exchg.150).aspx.

Thursday, January 08, 2015

Azure Storage Architecture

I've been doing a lot of work recently around Azure – and one interesting aspect is Azure Storage. Azure Storage enables you to store and retrieve large amounts of unstructured data, such as documents and media files, with Azure Blobs; structured NoSQL-based data with Azure Tables; reliable messages with Azure Queues; and SMB-based shares with Azure Files, which can help migrate on-premises applications to the cloud. The first three of these features have been offered by Azure for several years, while Azure Files is newer. Azure Files is still in preview – you can learn a bit more about the File service from the Azure Storage team's blog.

As an aside, the Azure module you can download enables you to manage Azure Storage features from PowerShell. The cmdlet set is rich (and richer still with Azure Files), with 48 storage-related cmdlets in the latest version of the Azure cmdlets (0.8.12). These cmdlets are fairly easy to use, but I've found the error handling is a bit harder than with equivalent on-premises cmdlets. Many errors, for instance, return little useful information to help you troubleshoot. There is room for improvement – and knowing the Azure team, it will come as fast as they can get to it!
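If you want to explore that cmdlet set yourself, here's a quick sketch (the count you get back depends on the module version installed):

# List and count the storage-related cmdlets in the Azure module
Get-Command -Module Azure -Noun AzureStorage* | Sort-Object Noun, Verb
(Get-Command -Module Azure -Noun AzureStorage*).Count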

In order both to scale and to provide great recovery from failure, the architecture of Azure Storage is a bit more complex than the NTFS file system we are used to. The Azure team published a detailed description of the Azure Storage architecture; although it's now a couple of years old, it's true level 400 stuff – and you can find it here. The paper does not cover the Azure File service – hopefully Microsoft will provide an update to this great white paper at some point in the future.

If you are planning on architecting solutions utilising Azure Storage, this paper is one you should read and absorb. And, in due course, apply when you implement your solution(s)!


Thursday, January 01, 2015

Happy New Year

Just a quick post to wish all the readers of this blog a very happy and joyous new year. I hope 2015 turns out to be a great year. For me, this year sees the start of my retirement, although the way things look, I'll be busier than ever. My best to you and yours.