21 October 2008

Scripting Corner Volume 4

I am always on the lookout for new and interesting scripts that are worth writing and worth writing about. Recently, I got two requests for scripts that I thought were useful, so I felt I would share them here. Both of these scripts were used to solve real-world customer requests, and both of them will be used multiple times in their environments.

Move-MailboxCSV

The customer scenario for this script was that the customer was regularly moving a wide range of mailboxes between servers and databases to support their mailbox organization structure, requests for additional mailbox quota, and so on. They had written their own script to read a CSV file of mailboxes to move and then do the moves using foreach with the move-mailbox cmdlet. Since they were piping the CSV input directly into the foreach, it was only operating on one mailbox at a time. This resulted in them losing the multi-threaded nature of the move-mailbox cmdlet.
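Roughly speaking, the original approach looked something like this (a minimal sketch; the column names alias, targetmbserver, targetmbsg, and targetmbdb are my assumptions, not taken from the customer's script):

# Each CSV row is handed to move-mailbox one at a time, so only one move runs at once
Import-Csv .\moves.csv | foreach {
    Move-Mailbox -Identity $_.alias -TargetDatabase "$($_.targetmbserver)\$($_.targetmbsg)\$($_.targetmbdb)" -Confirm:$false
}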

In reviewing the example CSV file the customer provided, I realized that they were moving from multiple source locations but were moving to only a handful of target locations. This pointed me in the direction of optimizing the mailbox moves based on the target database. Thus the script I was able to provide them reads in the CSV file, identifies all unique combinations of Server/Storage Group/Mailbox Database, and processes the moves in batches based on the unique target database.

Breakdown of the script

The actual structure of this one is very simple thanks to the select-object cmdlet. This cmdlet has a -unique parameter that will return only the objects whose values are unique based on the properties you have asked it to filter on.

So for the script function "SelectUnique", all we have to do is take the entire contents of the CSV file that we imported into $csv using import-csv and run it through select-object, asking it to pick only the unique combinations of targetmbserver, targetmbsg, and targetmbdb.
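In essence that boils down to something like this (a sketch of the idea, not the exact function body):

# Import the move list, then keep only the distinct target server/SG/database combinations
$csv = Import-Csv $file
$targets = $csv | Select-Object targetmbserver, targetmbsg, targetmbdb -Unique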

Once we have all of the unique combinations in the $targets array, we have everything we need to group the users together by target database. Piping the unique sets into the MoveMBX function, we can simply use a where statement to filter out the group of users for a given target database and move them in bulk.
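In sketch form, with the same assumed column names (the real MoveMBX function may differ in the details):

foreach ($target in $targets)
{
    # Gather every row in the CSV headed for this particular target database
    $batch = $csv | where {
        $_.targetmbserver -eq $target.targetmbserver -and
        $_.targetmbsg -eq $target.targetmbsg -and
        $_.targetmbdb -eq $target.targetmbdb
    }
    # Pipe the whole batch into a single move-mailbox call so it can work on several mailboxes at once
    $batch | foreach { Get-Mailbox $_.alias } |
        Move-Mailbox -TargetDatabase "$($target.targetmbserver)\$($target.targetmbsg)\$($target.targetmbdb)" -Confirm:$false
}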

For this script I was able to implement code that will ask the user for input if nothing is specified for the required -file script parameter. (Special thanks to Live Search and to Windows PowerShell in Action by Bruce Payette for helping me figure this out.)

[string] $file = $(if ($help -eq $false){Read-Host "CSV File"})

This uses the fact that you can specify a default value for a parameter and that you can use a script block to do so. As long as -help was not specified on the command line and no value was given for -file, the script will prompt the user for the CSV file.
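In context, the declaration sits in the script's param() block, something along these lines (a sketch):

param(
    [switch] $help,
    # If -file is omitted and -help was not requested, prompt interactively for the CSV file
    [string] $file = $(if ($help -eq $false) { Read-Host "CSV File" })
)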

One of the things I love about PowerShell is that every time I sit down to write a script I learn something new.

New-DBFromXML

I was contacted by one of our Premier Field Engineers with a script that her customer was working on for doing bulk Mailbox Database and Storage Group creation. Originally the script used a CSV file and required that you specify the SG Name, Log File Path, DB Name, and DB Path for each database by hand in the CSV file. Looking at their file, I realized that, like most customers, they had a naming scheme that used a prefix followed by a number to indicate each storage group and database. Since this information was unchanging from server to server, why keep working with it in the CSV file?

With a bit of playing with the XML type in PowerShell, I was able to come up with a script that, given a simple XML configuration file, will create a specified number of Storage Group/Database combinations on a server.

Breakdown of the script

The only function I created for this script was the showhelp function for outputting the usage information to the host. The actions needed in this script were very straightforward and only did one thing, so I felt there was no reason to do the work inside of a function and just put everything in the main body.

The key here is that the XML file has a list of all valid database drive locations and all valid log file locations. The only other things we need to specify are the prefixes we need for the names and paths, plus the number of databases that we want to have created.
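A configuration file along these lines would carry that information; the element names below are my own illustration, not the ones the script ships with:

<config>
  <server>EXCH01</server>
  <sgprefix>SG</sgprefix>
  <dbprefix>DB</dbprefix>
  <numberofdatabases>12</numberofdatabases>
  <dbdrives>
    <path>D:\Databases</path>
    <path>E:\Databases</path>
  </dbdrives>
  <logdrives>
    <path>F:\Logs</path>
    <path>G:\Logs</path>
  </logdrives>
</config>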

The nice thing about working with XML files in PowerShell is that you can use the same dot notation you are used to using when working with objects to get to the information in an XML file. If you are interested in using XML files in scripts, I would recommend importing an XML file into a variable and then playing around with the variable to get a feel for how it works in PowerShell.

[xml]$xml = get-content $file
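Once the file is loaded, dot notation walks straight down the tree. Against a configuration shaped like the sketch above (element names assumed), that looks like:

# Pull the drive lists and the database count out of the loaded XML
$dbdrives  = $xml.config.dbdrives.path
$logdrives = $xml.config.logdrives.path
$count     = [int]$xml.config.numberofdatabases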

You can even modify the XML while it is loaded into the variable and then, using the Save method, write it back to disk in its modified form.

$xml.save("C:\temp\myxml.xml")
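For instance, continuing with the hypothetical element names from the sketch above, you could change a value in memory and then persist it:

$xml.config.numberofdatabases = "16"   # change the value in the in-memory copy
$xml.save("C:\temp\myxml.xml")         # write the modified document back to disk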

Once I have the configuration XML file loaded, I pull the drive locations into some arrays to make them easy to work with. Next we set up a while loop that will do the work of creating the storage groups and databases. The only hard part here is constructing all of the paths and names into the form that the cmdlets need based on the information from the XML.

Once that is done, it is simply a matter of incrementing our indexes and running through the loop again to create the next Storage Group/Mailbox Database.
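Put together, the loop looks something like this sketch, continuing the hypothetical configuration above; the variable names, XML fields, and path layout are illustrative, not the shipped script:

$i = 0
while ($i -lt $count)
{
    # Build the names from the prefixes and the current index
    $sgname = "{0}{1}" -f $xml.config.sgprefix, ($i + 1)
    $dbname = "{0}{1}" -f $xml.config.dbprefix, ($i + 1)

    # Rotate through the available drive locations
    $dbroot  = $dbdrives[$i % $dbdrives.Count]
    $logroot = $logdrives[$i % $logdrives.Count]

    # Create the storage group, create the database inside it, then mount it
    New-StorageGroup -Server $xml.config.server -Name $sgname -LogFolderPath "$logroot\$sgname"
    New-MailboxDatabase -StorageGroup "$($xml.config.server)\$sgname" -Name $dbname -EdbFilePath "$dbroot\$sgname\$dbname.edb"
    Mount-Database -Identity "$($xml.config.server)\$sgname\$dbname"

    $i++
}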

For this particular customer implementation we didn't put in any logic to group the databases; instead we went with a simple rotation of file locations. This means that if I specified four database paths and then told the script to create twelve Storage Group/Mailbox Database combinations, I would end up with them laid out on disk like this:

Drive 1: Database 1, Database 5, Database 9
Drive 2: Database 2, Database 6, Database 10
Drive 3: Database 3, Database 7, Database 11
Drive 4: Database 4, Database 8, Database 12

This was an implementation the customer was happy with: since all of the folders and files were labeled clearly, it was still easy to find the databases, and using this method it was easy for the script to support creating a number of databases that was not a multiple of the locations provided.

Conclusion

Hopefully these scripts have shown you some of the things that real-world customers have started using PowerShell to do for them and have sparked your interest in creating your own scripts.

As always, if you see anything I missed or need to clarify, or you just want to comment, please don't hesitate; I welcome all feedback. Also, I need your script ideas. I have had a few submitted and I am hoping to include at least one of them in a future Scripting Corner.

Download the scripts here: http://msexchangeteam.com/files/12/attachments/entry450004.aspx

Please Note! These scripts are not officially supported by Microsoft. Please see the scripts for details!

http://msexchangeteam.com/archive/2008/10/20/450003.aspx
