As an administrator I'm constantly finding ways to use Windows PowerShell in my day-to-day administrative activities. Today's PowerShell tip is no different. For a while now Shane and I have been demonstrating how to use a simple PowerShell loop to back up all the site collections in your farm. It looks like this:
Get-SPWebApplication | Get-SPSite | ForEach-Object{$FilePath = "C:\Backup\" + $_.Url.Replace("http://","").Replace("/","-") + ".bak" ; Backup-SPSite -Identity $_.Url -Path $FilePath}
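Written out long-hand instead of as a one-liner, that same loop looks like this (just a sketch; it assumes the C:\Backup folder already exists and that you're in a shell with the SharePoint snap-in loaded):

```powershell
# Back up every site collection in the farm (sketch; assumes C:\Backup exists)
Get-SPWebApplication | Get-SPSite | ForEach-Object {
    # Turn the URL into a safe file name,
    # e.g. http://portal/sites/hr becomes portal-sites-hr.bak
    $filePath = "C:\Backup\" + $_.Url.Replace("http://","").Replace("/","-") + ".bak"
    Backup-SPSite -Identity $_.Url -Path $filePath
}
```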
Since I've demonstrated that one a couple of times I felt it would be cheating to use that as my PowerShell Tip O' The Week. A couple of weeks ago someone on the TechNet forums asked how to use PowerShell to back up all of their webs, not site collections. I thought this was a worthy challenge as it had a few different aspects to it. Here's what I ended up with:
Get-SPSite http://upgrade/* | Get-SPWeb -limit all | ForEach-Object { [string] $dt = Get-Date -Format yyyyMMdd; $filepath = "C:\Backup\" + $_.Url.replace("http://","").replace("-","--").replace("/","-") + "-$dt.export" ; Export-SPWeb $_ -path $filepath -IncludeUserSecurity -IncludeVersions all -UseSqlSnapshot}
You can see it's very similar to the first one, but with a few changes. First, I didn't want to back up the webs from every web application in the farm, so I seeded the command with "Get-SPSite http://upgrade/*", which returns all the site collections in that one web app instead. You can also use filters to control which webs get backed up. I made sure to use "-limit all" with "Get-SPWeb" to get all the webs instead of the default of 20. Next I changed how the file names are constructed: I added the date to them, so you can run this daily and keep multiple versions. Since Get-Date outputs a DateTime object, I had to cast it to a string. The format string yyyyMMdd produces a date that sorts correctly in the file system; today's date of April 5th, 2010 comes out as 20100405. I build the filename the same way as in the site collection example, but tack the $dt variable on at the end. You could put it at the beginning instead, if you wanted all of your backups sorted by date rather than by web name. To get the value of $dt instead of the literal string $dt I used double quotes, which, unlike single quotes, expand variables. You'll also notice I replace any existing hyphens in the URL with double hyphens before turning the slashes into hyphens; that keeps the generated file names unambiguous.

Finally, I export the web. Export-SPWeb gave me more options to consider, so I showed a few of them. I use -IncludeUserSecurity to include security, and I use -IncludeVersions All to include all versions. Just to show off a little, I throw in a -UseSqlSnapshot too. If you choose not to include the date in the filename, you'll get an error the second time you run this, because it will try to write to a file that already exists. Export-SPWeb does not have an -Overwrite switch, but it does have a -Force switch which remedies that situation.
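Putting all of those pieces together in long-hand form, the web export looks something like this (again a sketch; it assumes C:\Backup exists, and I've added the -Force switch mentioned above so reruns on the same day won't fail on an existing file):

```powershell
# Export every web in the http://upgrade web app (sketch; assumes C:\Backup exists)
# Format the date once, e.g. April 5th, 2010 becomes 20100405, which sorts
# correctly in the file system
[string]$dt = Get-Date -Format yyyyMMdd

Get-SPSite http://upgrade/* | Get-SPWeb -Limit All | ForEach-Object {
    # Double any existing hyphens, then turn slashes into hyphens,
    # and append the date so daily runs keep separate versions
    $filePath = "C:\Backup\" + $_.Url.Replace("http://","").Replace("-","--").Replace("/","-") + "-$dt.export"
    Export-SPWeb $_ -Path $filePath -IncludeUserSecurity -IncludeVersions All -UseSqlSnapshot -Force
}
```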
Since SPWeb exports are not full fidelity, you probably won't use them for actual disaster recovery. For instance, you will lose your users' alerts and any workflows. However, you might have some need for a backup of all your webs, and if you do, this is the script for you. Remember, you'll have to run this from the SharePoint 2010 Management Shell, or run "Add-PSSnapin Microsoft.SharePoint.PowerShell" in your PowerShell console. And of course you'll have to run this on a SharePoint server.
Thanks to Shane Young for the site collection script that started this, and thanks to Darrin Bishop for giving my PowerShell script a sanity check and suggesting some things I didn't think of.