Po$H Pete | Those who can… $cript
23 Aug 2012

SRM v1.1 – Getting Servers in Recovery Plan – AKA “The Missing Function from PSVMSRM”

I've had quite a few requests recently asking for a copy of a function I alluded to that can return the names of the servers in a recovery plan for VMware SRM.

I've parameterised this function now, but be warned: it queries the SRM SQL database directly, so you'll need some DBA skills to debug it if you run into problems.

Hope this helps those of you who have asked for it.

Function Get-SRMServersInRecoveryPlan($RecoveryPlanName,$DBServer,$DBName)
{
 
 
$vmList = @()
 
# Wrap the specific .Net errors with our own user friendly message.
try
{
    # Setup SQL connection using trusted authentication
    $SqlConnection = New-Object System.Data.SqlClient.SqlConnection
    $SqlConnection.ConnectionString = "Server=$dbServer; Database=$dbName; Integrated Security=True"
    $SqlConnection.Open()
}
catch
{
    throw ("Error connecting to the DB using '{0}'." -f $SqlConnection.ConnectionString)
}
 
# Query the database
try
{
 
    $SqlCmd = New-Object System.Data.SqlClient.SqlCommand
    $SqlCmd.Connection = $SqlConnection
 
    $query = "SELECT sv.shadowvmname AS shadowvm_name FROM $($DBName).pdsr_shadowvm sv,(SELECT sg.mo_id AS groupmoid, convert(varchar(255), g.string_val) AS shadowvmmoid FROM $($DBName).pdsr_shadowgroup sg LEFT OUTER JOIN $($DBName).g_string_array g ON sg.vmmoids = g.seq_id) sg, (SELECT rp.name AS plan_name, convert(varchar(255), g.string_val) AS shadowgroupmoid FROM $($DBName).pdsr_recoveryprofile rp LEFT OUTER JOIN $($DBName).g_string_array g ON rp.shadowgroupmoids = g.seq_id) rp WHERE sg.shadowvmmoid = sv.mo_id AND rp.shadowgroupmoid = sg.groupmoid AND rp.plan_name LIKE '" + $RecoveryPlanName + "'"
 
    $SqlCmd.CommandText = $query
    # Execute the query
    $data = $SqlCmd.ExecuteReader()
 
    while ($data.Read())
    {
        $vmList += $data.GetValue(0)
    }
}
catch
{
    throw "Error reading data from the database."
}
finally
{
    # Clean up the DB connections.
    $data.Close()
    $SqlConnection.Close()
}
 
# Return the results.
$vmList
 
}
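
A quick usage sketch - the plan name, SQL instance and database name below are placeholders, so substitute whatever your SRM environment actually uses:

# Placeholder values only - point these at your own recovery plan, SQL instance and SRM database.
$VMs = Get-SRMServersInRecoveryPlan -RecoveryPlanName "Tier1-Failover" -DBServer "SQLSRM01" -DBName "SRM_DB"
$VMs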
24 Jun 2012

cURL for Powershell – Finally I Can Get Some REST

Hi Guys,

It's been a while - I've started a new job and it's been taking up all my time, but I thought it was definitely worth sharing this nugget of information with you.

In my opinion, where Powershell is starting to lag behind many of the other scripting languages is in its lack of REST API and JSON data type support. I know this is now built into v3, but I really struggled to get the REST stuff to work, particularly with sites that require cookie-based auth. Admittedly, this was in the CTP release of Powershell. I had similar issues with the JSON support: literally the first query I ran returned JSON-P data (which is pretty standard) and it broke :-( I know the Powershell functions aren't as mature as their open source equivalents, but it really feels like we're falling behind here.

So, in the Linux world you'll be aware of cURL and wget, and on Windows we just don't have their advanced level of functionality natively. Fortunately, there's a port of cURL for Windows which you can then wrap with Powershell.

Download cURL for Windows (either x86 or x64) from here - http://curl.haxx.se/download.html - and extract the EXE into a directory you can run it from. Preferably download a version that has SSL support; I've been using the Win64 - Generic version with SSL. Fire it up in a command prompt and take a look at what you can do with it:

curl.exe -help

Now, most REST APIs give you a few options for how to return the data, i.e. JSON, RSS, HTML or..... XML! Something that Powershell can work with!
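
As a minimal sketch (the URL below is just a hypothetical endpoint that returns XML, not a real one), you can capture curl's output and cast it straight to XML, just like the function further down does:

# Hypothetical endpoint - substitute a real API URL; the element names depend on that API's schema.
[xml]$Result = .\curl.exe "https://example.com/api/items.xml" --insecure
$Result.items.item | Select name,id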

So, rather than give you a blow-by-blow of how to use it, I've put together a quick function which works with Twitter's API to extract EVERY tweet for a given screen name. I know the .NET web methods can deliver something similar, but I thought it was a good example:

Function Get-Tweets($ScreenName)
{
	#Get First lot of tweets
 
	$Command = ".\curl.exe 'https://api.twitter.com/1/statuses/user_timeline.xml?screen_name=" + $ScreenName + "&count=150' --insecure"
	[xml]$Res = invoke-expression $Command	
	Write-host "Found $($Res.statuses.status.count) tweets"
 
	$FinalTable = @()
	$FinalTable += $Res
 
	While($Res.statuses.status.count -ge 1)
	{
		$FinalID = $Res.statuses.status | sort id -desc | select id -Last 1
		$FinalID = [int64]$FinalID.id - 1
		write-host "Attempting to find tweets lower than $FinalID"
		$Command = ".\curl.exe 'https://api.twitter.com/1/statuses/user_timeline.xml?screen_name=" + $ScreenName + "&count=150&max_id=" + $FinalID + "' --insecure"
		[xml]$Res = invoke-expression $Command
		Write-host "Found $($Res.statuses.status.count) tweets"
		$FinalTable += $Res
		start-sleep -seconds 1
	}
 
	# Flatten the collected responses into a single table of tweets.
	$FT = @()
	ForEach($Table in $FinalTable)
	{
		write-host "Merging"
		$FT += $Table.statuses.status | Select created_at,id,text
	}
 
	Return $FT
}

For example, if you wanted all of my tweets:

$Tweets = Get-Tweets -ScreenName RossiPete

You can then do what you like with the returned object, perhaps... store them in a CSV?

$Tweets | Export-CSV -Path C:\MyPath\MyTweets.csv -NoTypeInformation

For more information on Twitter's REST API - https://dev.twitter.com/docs/api

Hope this helps, it's changing my life from an automation perspective... cURL that is, not Twitter ;-)

27 Jan 2012

Automated Datacentre Deployment – It’s Time to Let Go of Your Physical Devices

I work in the process automation arena, and have mainly spent the last 6 months automating server build processes. You might think "Why have you been doing that? There are plenty of tools out there for it", well, you might be right…

In fact, you are right. The problem I came up against is that server build automation tools assume either that there is a physical server with drives and networking already set up that they can build onto, or that there is a virtual environment/"cloud" that they can start orchestrating. Can you see where the problem is yet?

Best of breed datacentre environments use dedicated physical devices delivering what they're designed for. Devices such as network switches and routers, firewalls and IPS/IDS devices, SAN and NAS technology, fabric networks, the list goes on.

All of these devices are made by the industry leaders, such as Cisco, HP, Brocade, NetApp etc., all of which have their own configuration interfaces and run their own software/firmware, and many don't even have an API…

So, when it comes to deploying an enterprise environment which will consist of a physical and virtual hybrid, you're not going to just be putting operating systems and applications on servers, you're going to be:

- Configuring access ports on network switches
- Configuring core network routing
- Configuring SAN or NAS storage and presenting it
- Configuring fabric networks
- Configuring physical server hardware via ILO or DRAC

All of the above are generally done by subject matter experts via SSH or a web interface, or (if you're lucky) via a vendor-specific orchestration tool which will keep track of all the different command sets required by different firmware/software versions of the relevant devices.

The issue now comes in the fact that everyone is thinking "Cloud", with services such as "Auto Scaling" and "On-Demand", so they want virtual datacentre environments delivering the performance and security of the old-school environments but spun up quickly, very quickly; we're talking minutes here, not days or weeks.

This provides a big challenge in orchestrating the configuration of all of these physical devices. This is not to say it can't be done, it CAN be done, I've done it, but should we be doing it?

If you look at products like vCloud Director, CA (previously 3Tera) AppLogic and AWS you'll start to see what I'm talking about. These physical datacentre elements are now virtualised. For instance, you can drop a virtual F5 firewall, a load balancer or some S3 storage into your AWS environment, and you're done. Just "Drag and Drop", not "Rack and Stack".

All of these devices are now software, but delivered by industry leading vendors. This is not to say they're better than their physical predecessors, but they're a lot easier to deploy in a virtual datacentre environment, and more importantly, you can automate their configuration.

So, it really feels like we're on the cusp of a datacentre revolution, and it's one that a lot of enterprises and MSPs aren't ready for. If you're going to compete in this growing, agile, scalable and transient world of "The Cloud", it's time to lose the shackles of physical devices and embrace an even more virtual world.

19 Dec 2011

Embedding Binary Data in a Text File – or Just Saving Binary Data as Text

Morning All! A colleague of mine was recently doing some Powershell GUI work and mentioned to me that it would be cool if he could use an image in the form. Which got me thinking about how you would actually store binary data in a text file... encoding is the answer.

I've written the following two ConvertTo and ConvertFrom functions which you can use to encode either binary files or byte arrays to Base64 strings. You can then use them in here-strings in your scripts and write them out to binary files again.... Obviously, this does pose a potential moral issue of someone using these to store malicious binary files in their scripts, but I didn't intend the code for that! Honest!

Here's the ConvertTo function:

 
Function ConvertTo-EncodedText([string]$FilePath,[Byte[]]$ByteArray,[string]$SaveTo)
{
<#
.Synopsis
Converts Files or Byte Arrays into a Base 64 Encoded String
.Description
Allows you to pass in a file path or a byte array and have it encoded into a Base64 string. This can then either be output directly or written to a text file.
.Parameter FilePath
Optional: The path to the file you would like to encode
.Parameter ByteArray
Optional: A variable containing a byte array that you would like to encode.
.Parameter SaveTo
Optional: A path of where you would like to save the encoded text file to.
.Example
ConvertTo-EncodedText -FilePath c:\MyBinaryFile.Jpg
 
This will return a base 64 encoded string of the file c:\MyBinaryFile.Jpg
.Example
ConvertTo-EncodedText -FilePath c:\MyBinaryFile.Jpg -SaveTo c:\MyEncodedFile.Txt
 
This will create a base64 encoded string of c:\MyBinaryFile.Jpg and save it to c:\MyEncodedFile.Txt
.Example
ConvertTo-EncodedText -ByteArray $MyByteArray
 
This will create a base 64 encoded string of the ByteArray held in $MyByteArray
.Notes
Name: ConvertTo-EncodedText
Author: Peter Rossi
Last Edited: 19th December 2011
#>
	if($ByteArray)
	{
		$Data = $ByteArray
	}
	Else
	{
		[byte[]]$Data = Get-Content $FilePath -Encoding Byte
	}
	if($SaveTo)
	{
		[system.convert]::ToBase64String($Data) | Set-Content $SaveTo
		Return $Null
	}
	Else
	{
		Return [system.convert]::ToBase64String($Data)
	}
}

And this is the ConvertFrom function:

 
Function ConvertFrom-EncodedText($InputData,$FilePath,$SaveTo)
{
<#
.Synopsis
Converts Files or Base64 Encoded strings into Byte Arrays or binary files
.Description
Allows you to pass in a file path or a Base64 encoded string and have it decoded into a byte array. This can then either be output directly or written to a binary file.
.Parameter InputData
Optional: A variable containing a base64 encoded string
.Parameter FilePath
Optional: The path to the file you would like to decode
.Parameter SaveTo
Optional: A path of where you would like to save the decoded binary file to.
.Example
ConvertFrom-EncodedText -FilePath c:\MyEncodedFile.Txt
 
This will return a byte array of the file c:\MyEncodedFile.Txt
.Example
ConvertFrom-EncodedText -FilePath c:\MyEncodedFile.txt -SaveTo c:\MyBinaryFile.Jpg
 
This will decode c:\MyEncodedFile.txt and save the resulting binary file to c:\MyBinaryFile.Jpg
.Example
ConvertFrom-EncodedText -InputData $MyBase64String
 
This will create a byte array from the base 64 encoded string held in $MyBase64String
.Notes
Name: ConvertFrom-EncodedText
Author: Peter Rossi
Last Edited: 19th December 2011
#>
	if($InputData)
	{
		$Data = $InputData
	}
	Else
	{
		$Data = Get-Content $FilePath
	}
	if($SaveTo)
	{
		[system.convert]::FromBase64String($Data) | Set-Content $SaveTo -Encoding Byte
	}
	Else
	{
		Return [System.Convert]::FromBase64String($Data)
	}
}
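
To give you an idea of how the two hang together, here's a quick round trip - the file paths are just examples:

# Encode an image to a Base64 string (this is the string you'd paste into a script as a here-string).
$Encoded = ConvertTo-EncodedText -FilePath C:\Temp\Logo.jpg
# ...and later, write the binary file back out from that string.
ConvertFrom-EncodedText -InputData $Encoded -SaveTo C:\Temp\LogoCopy.jpg
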
15 Dec 2011

Extract Functions, Parameters and Help Info from a Module

Morning all! It's been a while since I've written a blog post, mainly due to swapping jobs around and moving house, but now I'm back and keen to get writing again, so here we go.

I've been working on various Powershell related projects which involve a central database storage solution for modules so we can share them easily internally in the business and use them as part of orchestration tools we've written. One of the things we needed to do was create an XML manifest of all functions in a module, their parameters and types, and then (if it existed) the help for each parameter, so we could pass it into the toolbox.

The following functions extract the data into nice Powershell objects so you can use them however you like.

The first function is Get-ModuleFunctionsAndParams and it does what it says on the tin.

Function Get-ModuleFunctionsAndParams($ModuleName)
{
	Import-Module $ModuleName
	$Functions = Get-Command -Module $ModuleName
	$FuncTable = @()
	ForEach($Func in $Functions)
	{
		$ARow = "" | Select Name,Params
		$ARow.Name = $Func.Name
		$Params = @()
 
		ForEach($PSet in $Func.ParameterSets)
		{
			ForEach($Param in $PSet.Parameters)
			{
				$BRow = "" | Select Name,Type
				$BRow.Name = $Param.Name
				$BRow.Type = $Param.ParameterType
				$Params += $BRow
			}
		}
 
		$ARow.Params = $Params
		$FuncTable += $ARow
	}
	Return $FuncTable
}

This is the Get-ParameterHelp function: you pass in the function name and the variable name you're interested in, and away you go....

Function Get-ParameterHelp($FunctionName,$VariableName)
{
	$Help = Get-Help $FunctionName
	$Table = @()
 
	ForEach($Param in $Help.Parameters.Parameter)
	{
		if(-not $VariableName)
		{
			$ARow = "" | Select Name,Description,Position,Required
			$ARow.Name = $Param.Name
			$ARow.Description = $Param.Description[0].text
			$ARow.Position = $Param.Position
			$ARow.Required = $Param.Required
			$Table += $ARow
		}
		Else
		{
			if($Param.Name.ToString().ToLower() -eq $VariableName.ToString().ToLower())
			{
				$ARow = "" | Select Name,Description,Position,Required
				$ARow.Name = $Param.Name
				$ARow.Description = $Param.Description[0].text
				$ARow.Position = $Param.Position
				$ARow.Required = $Param.Required
				$Table += $ARow
			}
		}
 
	}
 
	Return $Table
 
}
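
As a quick example of how these hang together (the module name and output path are just placeholders), you could dump a module's manifest out to XML and then pull the help for a single parameter:

# Placeholder module name and path - point these at whatever module you want to manifest.
$Manifest = Get-ModuleFunctionsAndParams -ModuleName "ActiveDirectory"
$Manifest | Export-Clixml -Path C:\Temp\ActiveDirectory-Manifest.xml
# Grab the help text for one parameter of a cmdlet that ships with help.
Get-ParameterHelp -FunctionName "Get-Process" -VariableName "Name"
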
26 May 2011

Get Volume Shadow Copy Information from Powershell

A colleague came and asked me yesterday if it was possible to get Volume Shadow Copy Information from a list of servers..... Powershell to the rescue!

The biggest problem that I found is that the data in the WMI class refers to the drive by its device ID, so I've built in a lookup to translate that to a drive letter.

Here's the function I came up with, hope it can help you too.

Function Get-ShadowCopyInfo([string]$Computer,[string]$UserName,[string]$Password,[switch]$MB,[switch]$PromptForCredentials,[switch]$NoCredentials)
{
<#
.Synopsis
Get volume shadow copy information
.Description
Obtains volume shadow copy information for local and remote servers
.Parameter Computer
The name of the computer you wish to run the command against, default is LocalHost
.Parameter UserName
The UserName for any credentials you wish to pass
.Parameter Password
The Password for any credentials you wish to pass
.Parameter MB
Return stats in MB rather than GB
.Parameter PromptForCredentials
This will prompt you to enter credentials for the server, rather than passing in the data in plain text.
.Parameter NoCredentials
Forces the function to not use any credentials.
 
.Example
Get-ShadowCopyInfo
 
This will return information about the shadow copy data on localhost
.Example
Get-ShadowCopyInfo -Computer ServerA -UserName "MyDomain\MyUser" -Password "MyPassword"
 
This will return information about the shadow copy data on ServerA using the supplied credentials
.Example
Get-ShadowCopyInfo -Computer ServerA -PromptForCredentials
 
This will return information about the shadow copy data on ServerA, but will prompt you to enter credentials.
.Example
Get-ShadowCopyInfo -Computer ServerA -NoCredentials
 
This will return information about the shadow copy data on ServerA using no additional credentials.
.Example
Get-ShadowCopyInfo -MB
 
This will return data about the shadow copy volume on your computer, but return the stats in MB rather than GB.
 
.Notes
Name: Get-ShadowCopyInfo
Author: Peter Rossi
Last Edited: 26th May 2011
#>
 
	if(-not $UserName)
	{
		if(-not $Password)
		{
			If(-not $PromptForCredentials)
			{
				$NoCredentials = $True
			}
		}
	}
 
	if(-not $Computer)
	{
		$Computer = "LocalHost"
	}
	if(-not $NoCredentials)
	{
		if(-not $PromptForCredentials)
		{
			$SecPass = convertto-securestring -asplaintext -string $Password -force
			$Creds = new-object System.Management.Automation.PSCredential -argumentlist $UserName,$SecPass
		}
		Else
		{
			$Creds = Get-Credential 
		}
	}
 
	if(-not $NoCredentials)
	{
		$ShadowInfo = gwmi -Class win32_shadowstorage -ComputerName $Computer -Credential $Creds
		$VolumeInfo =  gwmi -Class win32_volume -ComputerName $Computer -Credential $Creds
	}
	Else
	{
		$ShadowInfo = gwmi -Class win32_shadowstorage -ComputerName $Computer
		$VolumeInfo =  gwmi -Class win32_volume -ComputerName $Computer
	}
 
	$Final = @()
	ForEach($Shadow in $ShadowInfo)
	{
		if(-not $MB)
		{
			$ARow = "" | Select Computer,Drive,SizeGB,ShadowMaxSizeGB,ShadowUsedSizeGB,ShadowPercent,TimeChecked
		}
		Else
		{
			$ARow = "" | Select Computer,Drive,SizeMB,ShadowMaxSizeMB,ShadowUsedSizeMB,ShadowPercent,TimeChecked
		}
		$ARow.Computer = $Computer
		$ARow.TimeChecked = Get-Date
 
		ForEach($Vol in $VolumeInfo)
		{
			If($Shadow.Volume -like "*$($Vol.DeviceId.trimstart("\\?\Volume").trimend("\"))*")
			{
				$ARow.Drive = $Vol.Name
				if($MB)
				{
					$ARow.SizeMB = "{0:N2}" -f ($Vol.Capacity/1024/1024)
					$ARow.ShadowMaxSizeMB = "{0:N2}" -f ($Shadow.MaxSpace/1024/1024)
					$ARow.ShadowUsedSizeMB = "{0:N2}" -f ($Shadow.UsedSpace/1024/1024)
				}
				Else
				{
					$ARow.SizeGB = "{0:N2}" -f ($Vol.Capacity/1024/1024/1024)
					$ARow.ShadowMaxSizeGB = "{0:N2}" -f ($Shadow.MaxSpace/1024/1024/1024)
					$ARow.ShadowUsedSizeGB = "{0:N2}" -f ($Shadow.UsedSpace/1024/1024/1024)
				}
				$Percent = ($Shadow.MaxSpace/$Vol.Capacity) * 100
				$ARow.ShadowPercent = [int]$Percent
 
			}
 
		}
 
		$Final += $ARow
 
 
	}
 
	Return $Final
 
 
}
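
And since the original request was for a list of servers, here's a rough sketch of how you might loop it - the file paths are just examples:

# Example only - read server names from a text file, query each one and export the lot to CSV.
$Servers = Get-Content C:\Temp\Servers.txt
$Results = @()
ForEach($Server in $Servers)
{
	$Results += Get-ShadowCopyInfo -Computer $Server -NoCredentials
}
$Results | Export-Csv -Path C:\Temp\ShadowCopyReport.csv -NoTypeInformation
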
24 May 2011

IT Process Automation (ITPA) – How I See It

I thought for this blog, I'd write something slightly different rather than just the usual Powershell piece.

I have been working in the IT Process Automation (ITPA) arena for around 2 years now. It's a fascinating industry that's starting to emerge, and as more and more large IT companies realise its potential, the core concept is really starting to take hold.

The main idea behind it is automating processes which are generally highly repetitive, quite often long-winded and prone to user error. This does, though, put a large stumbling block in front of any manager when their staff ask: "But doesn't this mean we'll be out of a job?" It's the largest problem I've faced when trying to get various processes rolled out. Personally, I feel that by removing these time-wasting tasks, the staff you already have will get more time to work on interesting projects and further their skill sets, and thus their careers. This is still a tough sell....

The main product that I have worked with from a workflow perspective is NetIQ Aegis, but I've also seen Symantec's Altiris workflow product and Microsoft's Opalis. These are great products, although with one flaw. The way they are marketed is such that "any business process owner can put together a workflow". Incorrect. I'd really like to see a business manager attempt to throw together a technical workflow without any prior IT experience. To deliver this ease, the approach tends to be to abstract the code away from the user and present them with thousands of buttons and options to click. The reality is that most users who will design workflows are going to be either DevOps or developers, both of whom would much rather have keyboard access straight into editing variables etc. SQL SSIS seems to have gone slightly the other way with this, but I think I'd prefer that approach.

You may ask, "Well, if you want to just start typing out scripts, then why do you bother with workflow in the first place?". Well, workflow has a great deal to offer. It allows for a very robust process to be put together which in most cases can be easily watched, tracked and reported upon. This alone is worth a lot to most businesses.

The biggest topic next to workflow, is "How do you initiate workflow?"

The way most workflow products out there seem to work relies on a manual button press from someone, whether that's triggered through the built-in UI, via an API call or via a database trigger from a 3rd party app. This is great if you want to load up your service desk with a thousand potential buttons to press in the event of something happening, but in most cases, that's not the best move....

Complex Event Processing (CEP) - This is the area it's all moving towards, and it's one of the key reasons why I've been working with NetIQ Aegis for the past 2 years. Aegis cleverly wraps the StreamBase CEP engine, which allows you to watch for patterns in all of your data streams "while they are in flight", simultaneously. Therefore, you can configure a custom trigger to watch for something like:

In any 5 minute window
If you see a ping timeout from a server in data centre 1
AND you see a URL timeout for a domain in this group
AND you see a disk capacity warning for one of these servers

Trigger something.....

You can then have a workflow which specifically caters for that scenario. No users needed, it just gets on with it.

Unfortunately, there are a few occasions where you might want your workflow engine not to react i.e. during a planned maintenance window or during a major incident. The trick to this is designing your architecture so you can restrict the flow of data into your CEP engine at the right times, therefore, creating some form of data flow controller in front of it.... pie in the sky stuff though.

Either way, CEP is fantastic. Annoyingly, it's mainly marketed at the financial sector because it can watch for stock trade patterns etc. and allow traders to make choices quickly. However, as you can see, there are some fantastic uses for it in a data centre environment.

So, what's the best of both worlds?

Well, as I mentioned above, I've been using NetIQ Aegis, and it was the product we picked for a lot of reasons, but mainly because of the CEP engine integration. The downside is that the StreamBase engine is wrapped by the product and you therefore don't get a huge amount of access to how it actually works, which can be quite frustrating sometimes... although, I know it's a Java based product and it's not overly easy to integrate with .net, so I'm not too worried.

As you all know, I'm a massive Powershell advocate and believe that there are boundless time savings to be had from it in large IT enterprises. However, imagine Powershell bolted to a decent CEP engine: Powershell that ran automatically when it knew something needed to be done.....

You can sort of do this with Aegis, but I'd like a much closer-to-the-metal approach, and believe that this could easily be delivered by Microsoft's new StreamInsight product.

StreamInsight is Microsoft's new CEP engine, and it ships as part of SQL 2008 R2, although it's not actually a server product. If you get a license for it, you receive 10 or 12 DLLs which you can reference in a .NET C# project to build your own engine. You can connect to as many different data streams as you like, process them however you wish and then fire them off to any system you deem fit to deal with them, all in real time. If you had a service which fired off Powershell at the end, then you'd be done.....

I'd love to see a product like this come onto the market, but I think we're a bit of a way off, so I'm currently working on my own.

If you're interested in looking at Stream Insight or CEP in general, Alan Mitchell did a great session on StreamInsight at SQLBits in 2010 and 2011, you can see his deep dive video here.

Hope you found this post interesting, and give me a shout if you're interested in process automation or CEP.

13 May 2011

Get and Set – Active Directory User Thumbnail Photos

I was just asked how you can update the user thumbnail images in Active Directory (the ones that appear in Outlook when you select a user etc.). So I did some digging.

Basically, thumbnailPhoto is a property of an AD user which stores an array of bytes. So I've put together the following functions to allow you to easily get or set the images for a user:

Function Get-ADThumbnailPhoto()
{
<#
.Synopsis
Get Active Directory user thumbnail image
.Description
Extracts the current user thumbnail image from active directory
.Parameter UserName
The username of the person you're looking for
.Parameter Path
The path for the file you would like to output to, i.e. c:\test.jpg
.Example
Get-ADThumbnailPhoto -UserName PeterRossi -Path c:\PeterRossi.JPG
 
This will extract the thumbnail image for user PeterRossi and export it to c:\PeterRossi.jpg
.Notes
Name: Get-ADThumbnailPhoto
Author: Peter Rossi
Last Edited: 13th May 2011
#>
param(
        [Parameter(Mandatory=$True, Position=0, ValueFromPipeline=$False)]
        [string[]]$UserName,
		[Parameter(Mandatory=$True, ValueFromPipeline=$false)]
        [String]$Path
	)
	$ADSearcher = new-object DirectoryServices.DirectorySearcher("(&(SAMAccountName=$UserName))")
	$Users = $ADSearcher.FindOne()
 
	if($Users -ne $null)
	{
		[adsi]$TheUser = "$($Users.Path)"
		$Thumbnail = $TheUser.ThumbnailPhoto.Value
		[System.IO.File]::WriteAllBytes($Path,$Thumbnail)	
	}
	Else
	{
		Write-Warning "User $UserName could not be found in AD, is it the right username?"
	}
}
 
Function Set-ADThumbnailPhoto()
{
<#
.Synopsis
Set Active Directory user thumbnail image
.Description
Sets the current user thumbnail image from active directory
.Parameter UserName
The username of the person you're looking for
.Parameter JPGPath
The path for the file you would like to use as the new thumbnail
.Example
Set-ADThumbnailPhoto -UserName PeterRossi -JPGPath c:\PeterRossi.JPG
 
This will set the thumbnail image for user PeterRossi using c:\PeterRossi.jpg
.Notes
Name: Set-ADThumbnailPhoto
Author: Peter Rossi
Last Edited: 13th May 2011
#>
param(
        [Parameter(Mandatory=$True, Position=0, ValueFromPipeline=$False)]
        [string[]]$UserName,
		[Parameter(Mandatory=$True, ValueFromPipeline=$false)]
        [String]$JPGPath
	)
 
	if((Test-Path $JPGPath) -eq $true)
	{
		[byte[]]$Thumbnail = Get-Content $JPGPath -encoding byte
 
		$ADSearcher = new-object DirectoryServices.DirectorySearcher("(&(SAMAccountName=$UserName))")
		$UserSearch = $ADSearcher.FindOne()
 
		if($UserSearch -ne $null)
		{
			$User = [ADSI]"$($UserSearch.Path)"
 			$User.put("thumbnailPhoto",  $Thumbnail )
			$User.setinfo()
		}
		Else
		{
			Write-Warning "User $UserName could not be found in AD, is it the right username?"
		}
	}
	Else
	{
		Write-Warning "Can't find $JPGPath, does it exist?"
	}
}
6 May 2011

Creating a Throttle for Background Tasks Without Rewriting Your Code – V2

A while ago I posted some code which allows you to add a function called Threshold to your script to throttle the number of background tasks a loop can run at any one time.

The idea behind this was that if you are looping through a dataset and spawning a thread for each item, you may well run out of memory or hit some sort of other limit which might break your script, or return incorrect data.

This function allows you to have the best of both worlds: the consistency of serial running, with the benefits of background tasks.

NOTE - If you're using Invoke-Command you can always use the -ThrottleLimit switch, although this only works if you are passing in a parameter which contains a list of computers to run against.

In your loop, before you call a "Start-Job" or an "Invoke-Command" all you need to add is:

Limit-Jobs -MaxConcurrent 10 -PauseTime 10

This will call out to the Limit-Jobs function, which will:

- Check the number of currently running background jobs
- If the amount of currently running jobs is greater than or equal to the parameter passed in for MaxConcurrent it will wait for the number of seconds passed in under PauseTime. It will then check again.
- If the amount of currently running jobs is less than the parameter passed into MaxConcurrent it will allow your script to continue.

Therefore, the above command will allow 10 concurrent jobs to run at a time and add a pause of 10 seconds in before checking again.

Function Limit-Jobs {
 
	Param([int]$MaxConcurrent,[int]$PauseTime)
	# Count the currently running background jobs.
	$jobs = (get-job -state running | Measure-Object).count
	$RunningJobs = 0
	if($jobs -ne $null) {$RunningJobs = $jobs}
	# Wait until the running job count drops below the limit before handing control back to the caller.
	while($RunningJobs -ge $MaxConcurrent)
	{
		$jobs = (get-job -state running | Measure-Object).count
		$RunningJobs = 0
		if($jobs -ne $null){
			$RunningJobs = $jobs
		}
 
		Write-Warning "Current Running Jobs: $RunningJobs"
		start-sleep -seconds $PauseTime
	}
}
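
So a typical loop ends up looking something like this (the server list and script block are just placeholder examples):

# Placeholder example - throttle to 10 concurrent background jobs while looping through a server list.
$Servers = Get-Content C:\Temp\Servers.txt
ForEach($Server in $Servers)
{
	Limit-Jobs -MaxConcurrent 10 -PauseTime 10
	Start-Job -ScriptBlock { param($Name) Test-Connection -ComputerName $Name -Count 1 } -ArgumentList $Server
}
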
4 May 2011

Controlling Media Playback from Powershell

Ever wanted to control your media playback from Powershell?

Now you can send Play, Pause, Next, Previous and Stop commands directly to any media player you have open without leaving the Powershell console :-)

Feel free to chop and change the function names around. I wasn't feeling overly creative when I came up with them.

$SendKeyClass = @"
 
       [DllImport("user32.dll")]
        private static extern int keybd_event(byte bVk, byte bScan, int dwFlags, int dwExtraInfo);
 
        public static void SendKey(Keys key)
        {
            keybd_event((byte)key, 0, 0, 0);
        }
 
 
"@
 
function Set-PlayPausetrack
{
	# Use script scope so the compiled type is only added once per session.
	if($script:app -eq $null)
	{
		$script:app = Add-Type -MemberDefinition $SendKeyClass -Name Win32Window -Namespace PoshPete.WinAPI -ReferencedAssemblies System.Windows.Forms -Using System.Windows.Forms -PassThru
	}
	$script:app::SendKey([System.Windows.Forms.Keys]::MediaPlayPause)
}
 
function Set-StopTrack
{
	if($script:app -eq $null)
	{
		$script:app = Add-Type -MemberDefinition $SendKeyClass -Name Win32Window -Namespace PoshPete.WinAPI -ReferencedAssemblies System.Windows.Forms -Using System.Windows.Forms -PassThru
	}
	$script:app::SendKey([System.Windows.Forms.Keys]::MediaStop)
}
 
function Set-NextTrack
{
	if($script:app -eq $null)
	{
		$script:app = Add-Type -MemberDefinition $SendKeyClass -Name Win32Window -Namespace PoshPete.WinAPI -ReferencedAssemblies System.Windows.Forms -Using System.Windows.Forms -PassThru
	}
	$script:app::SendKey([System.Windows.Forms.Keys]::MediaNextTrack)
}
 
function Set-PreviousTrack
{
	if($script:app -eq $null)
	{
		$script:app = Add-Type -MemberDefinition $SendKeyClass -Name Win32Window -Namespace PoshPete.WinAPI -ReferencedAssemblies System.Windows.Forms -Using System.Windows.Forms -PassThru
	}
	$script:app::SendKey([System.Windows.Forms.Keys]::MediaPreviousTrack)
}
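
Once the functions are loaded into your session, you just call them straight from the prompt, for example:

# Toggle play/pause in whichever player is listening for the media keys, then skip to the next track.
Set-PlayPausetrack
Set-NextTrack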