Tag Archives: SharePoint 2010

SPPatchify – CU patch entire farm from one script

Patching can be tedious and time consuming.  Why not automate that?  Who wants to be awake all night clicking “Next” and watching the SP Config Wizard?  So I coded a single PowerShell script to manage the full end-to-end patching process.


The script will …

  • Enable PowerShell client remoting and connect to farm peer machines
  • Auto-detect the current user’s password from IIS application pools
  • Download Microsoft Cumulative Update (CU) media (EXE + CAB) and copy it to all servers.   Optionally, you can download to the \media\ sub folder manually.
  • Stop Distributed Cache (DC)
  • Stop SharePoint services
  • Run EXE binary in parallel
  • Wait for EXE to complete and reboot
  • Dismount-SPContentDatabase
  • Start SharePoint services
  • Run SharePoint Config Wizard serially
  • Mount-SPContentDatabase
  • Remove Microsoft Cumulative Update (CU) media from peer servers
  • Ensure IIS started
  • Launch Central Admin with IE
  • Reboot current PC
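A minimal sketch of the remoting pattern behind the stop-services steps (the server filter, service names, and error handling are illustrative, not the actual SPPatchify code):

```powershell
# Hypothetical sketch - stop core SharePoint services on every farm member
# via PowerShell remoting before running the CU binary.  EXE installs run
# much faster when these services are down.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Get-SPServer returns all farm members; SQL-only machines report role "Invalid"
$servers = Get-SPServer | Where-Object {$_.Role -ne "Invalid"} |
    Select-Object -ExpandProperty Address

foreach ($server in $servers) {
    Invoke-Command -ComputerName $server -ScriptBlock {
        # Stop services that block patching
        "SPTimerV4","SPAdminV4","IISADMIN","W3SVC" | ForEach-Object {
            Stop-Service -Name $_ -ErrorAction SilentlyContinue
        }
    }
}
```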


The end-to-end duration is surprisingly short.  I tested a four-server SharePoint 2013 farm and ran the entire process in just 45 minutes.  Stopping services and dismounting content databases speeds up patching significantly.  Automating the serial (one-at-a-time) Configuration Wizard runs ensures minimal “think time” between steps.  Removing the manual human process gives higher consistency while mitigating the risk of error.

The script uses Get-SPServer to auto-detect farm members.  That enables copying the CU media out to peers and stopping/starting SharePoint services across the farm.  Get-SPContentDatabase output is exported to a local CSV file to “snapshot” the farm before patching, and a later Mount-SPContentDatabase step presents the databases again.  The big goal isn’t zero downtime, but rather minimal downtime.
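The snapshot/remount idea might be sketched like this (the file path and selected columns are illustrative):

```powershell
# Sketch of the "snapshot" pattern: record content databases to CSV before
# patching, dismount them, then remount from the same CSV afterward.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

# Before patching - snapshot and dismount
Get-SPContentDatabase |
    Select-Object Name, NormalizedDataSource, @{N="WebApp";E={$_.WebApplication.Url}} |
    Export-Csv "C:\SPPatchify\contentdbs.csv" -NoTypeInformation
Get-SPContentDatabase | Dismount-SPContentDatabase -Confirm:$false

# After patching - remount everything from the snapshot
Import-Csv "C:\SPPatchify\contentdbs.csv" | ForEach-Object {
    Mount-SPContentDatabase -Name $_.Name -DatabaseServer $_.NormalizedDataSource -WebApplication $_.WebApp
}
```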

The entire farm patching process can be managed from one PowerShell window.  Enjoy!

Please leave a comment if you found this helpful.


Get Started

  1. https://github.com/spjeff/sppatchify
  2. Extract to “C:\SPPatchify” on any server in the farm
  3. RDP with farm account and run “C:\SPPatchify\SPPatchify.ps1”


Flow Diagram











NEW CodePlex project – SPUpgradeHelper

Today I am happy to announce that https://spupgradehelper.codeplex.com/ is available for download.  This project aims to smooth the “double hop” upgrade from MOSS 2007 to SP2010 to SP2013 by reading a CSV of database targets and automating the cmdlets needed for a consistent, repeatable, high-quality, and fast process.  Please leave a comment here or on CodePlex if you found this helpful.  Cheers!


Project Description

Migrating MOSS 2007 to SP 2013? This script takes a CSV of databases and runs upgrade Cmdlets in bulk (DB version/Mount/Dismount/Upgrade-SPSite/Claims auth)

Upgrading MOSS to SP2013 is a tedious process with many Cmdlets, especially if you have many databases. This script aims to help automate that process.

Given a CSV with SQL instance and content database names, this script offers Cmdlets to run upgrade steps across many databases all at once. No more TXT or XLS copy and paste madness. Simply populate the CSV, get familiar with UH* Cmdlets and upgrade with ease.

Key Features

  • Read CSV of databases
  • Load helper functions into memory
  • Enable admin to more easily run Cmdlets in bulk
  • Measure time duration for each step (# minutes)
  • Provide detailed LOG file of actions, result, and duration
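The CSV-driven bulk pattern behind UHReadCSV/UHMount might look roughly like this (this is not the actual SPUpgradeHelper source; the column names Set, SqlInstance, and Database are assumptions):

```powershell
# Illustrative sketch of running an upgrade cmdlet in bulk from a CSV,
# with a simple per-database duration measurement.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$global:databases = Import-Csv "C:\TEMP\COLLAB.CSV"

function Mount-UpgradeSet($set) {
    # Mount every content database belonging to the given upgrade "set"
    $global:databases | Where-Object {$_.Set -eq $set} | ForEach-Object {
        $duration = Measure-Command {
            Mount-SPContentDatabase -Name $_.Database -DatabaseServer $_.SqlInstance -WebApplication "http://portal"
        }
        Write-Output ("{0} mounted in {1:N0} minutes" -f $_.Database, $duration.TotalMinutes)
    }
}
```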

Quick Start Guide

  • Extract “SPUpgradeHelper.ZIP” to any SharePoint machine in your farm
  • Run “SPUpgradeHelper.ps1” to load helper functions
  • Type in full path to your CSV file (i.e. “C:\TEMP\COLLAB.CSV”)

Function Names

  • UHClaims – execute SPWebApplication function to upgrade Classic to Claims auth
  • UHCompatibility – execute Get-SPSite for “set” of databases to show GUI version (14/15)
  • UHDBVersion – execute TSQL for “set” of databases to show build number (12.0, 14.0, 15.0)
  • UHDismount – execute DisMount-SPContentDatabase for “set” of databases
  • UHMount – execute Mount-SPContentDatabase for “set” of databases
  • UHReadCSV – load CSV file into memory with upgrade “set”, SQL instance, and database names
  • UHUpgrade – execute Upgrade-SPSite for “set” of databases

NOTE – Upgrade “set” is meant for running parallel workstreams. For example, two servers with SP2010 and two servers with SP2013. That way overall upgrade can be expedited by running database set “A” through the first SP2010 and SP2013 server while database set “B” runs on the second server.

Microsoft Upgrade Process

InfoPath tip – detect Security Group with List Item permissions

Yes, everyone says InfoPath is dead … but we’re all still supporting it for a while, so I wanted to share one of my favorite tips.  Forms often need role-based security at the field level.  The table below shows an example security matrix.

How can InfoPath detect that?  Don’t some people query SOAP ASMX?  Or even a custom WSP with a custom ASMX?  There is a simpler way with List Item permissions.


  1. Create a new Custom List named “SecurityLevel” with only the default “Title” column.  
  2. Go create the needed SharePoint Groups under Site Permissions.  
  3. Back in “SecurityLevel” add one new item for each SharePoint Group, with identical names.   Hover over each item, pull down the menu, and set Item Level Permissions so that ONLY the matching group has “Read” on that item.
  4. With InfoPath Designer edit the XSN Form Template and create a new Data Connection to the list “SecurityLevel” with auto refresh.  
  5. Under Form Load create a rule: if count(SecurityLevel[“Analyst”]) > 0, then set the field SecurityLevel to “Analyst”.
  6. With that in place you are ready to apply formatting rules anywhere needed that use “SecurityLevel” to determine hide/show or read/write.
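Step 3 can also be scripted when the security matrix is large. A hedged sketch using the server object model (the site URL is a placeholder, and this assumes item Titles exactly match the group names):

```powershell
# Hypothetical sketch: for each item in "SecurityLevel", break inheritance
# and grant Read only to the SharePoint Group whose name matches the Title.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://portal/sites/forms"   # illustrative URL
$list = $web.Lists["SecurityLevel"]
$readDef = $web.RoleDefinitions["Read"]

foreach ($item in $list.Items) {
    $group = $web.SiteGroups[$item.Title]
    if ($group -ne $null) {
        $item.BreakRoleInheritance($false)   # $false = do not copy parent assignments
        $assignment = New-Object Microsoft.SharePoint.SPRoleAssignment($group)
        $assignment.RoleDefinitionBindings.Add($readDef)
        $item.RoleAssignments.Add($assignment)
    }
}
$web.Dispose()
```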



Now when a user opens the InfoPath form, the data connection “SecurityLevel” will only return the items they have access to (which mirrors their SharePoint Group membership!).   Works on MOSS 2007, SharePoint 2010, SharePoint 2013, and Office 365.

Hope this helps.  Please leave a comment if it did.


DOWNLOAD InfoPath Form – SecurityLevel.xsn >>
Upgrading SP2010 to 2013? Consider AAM redirection

This technique was introduced for upgrades from MOSS 2007 to SP2010 and appears to work nicely with SP2013 too.

The STSADM command still exists, creates HTTP 302 redirect output, and enables farm administrators to more easily manage content database attach upgrades for large Web Applications (i.e. multiple terabytes).   Considering that content database attach is the only upgrade method for SP2013, the AAM technique below can be a helpful way to gradually upgrade content databases from a SP2010 “source” farm to a SP2013 “destination” farm.   With the help of AAM redirection, upgrades can span multiple weekends and longer timelines than the one-day window typical of small Web Applications.
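The redirect itself is created with an STSADM AAM operation. A rough sketch of the command shape (the URLs and zone are placeholders; verify the exact syntax against the Microsoft whitepaper before running):

```powershell
# Rough shape of the AAM URL redirection command - URLs and zone are
# placeholders; confirm the exact parameters in the whitepaper.
# Requests hitting the zone URL receive an HTTP 302 to the redirection URL.
stsadm -o addzoneurl -url http://portalold.contoso.com -urlzone Intranet -redirectionurl http://portal.contoso.com
```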

Please keep in mind that AAM redirect (HTTP 302) is a temporary solution, best used for as short a time as possible.    The sooner the upgrade is complete, the fewer support tickets we are likely to have.  See the whitepaper below for how to configure it.


Using Alternate Access Mapping (AAM) URL redirection as part of the upgrade process (SharePoint Server 2010) (white paper)

Special Considerations

  • Custom Code (DLL/JS)  -  Hardcoded URLs may fail on the “source” web application when it is given a new AAM and temporary URL.
  • InfoPath Forms (XSN)  -  Hardcoded URLs in data connections, hyperlinks, or code may fail.
  • Workflow (XOML)  -  Hardcoded URLs in email body HTML may fail.
  • Incoming Email  -  While AAM handles HTTP browser traffic, it does not facilitate SMTP (port 25) delivery for inbound email-enabled lists and libraries.  One possible option here is a PowerShell script in Task Scheduler on the “destination” (new SP2013) farm that moves old EML files over to the \\sp2010-wfe\c$\inetpub\mailroot\drop\ folder if they are not picked up locally.   A time threshold needs to be decided (i.e. 30 minutes) to give the local SP2013 farm enough time to process while not burdening users with “missing” email that takes too long to appear.
  • Local HOSTS File -  It may be necessary to enable both the original URL (portal.contoso.com) and the temporary URL (portalold.contoso.com) on the “source” (old SP2010) system.   This means adding IIS bindings and local HOSTS loopback entries so that if SharePoint 2010 attempts to open http://portal.contoso.com/ via loopback for one of the above data connections on a site which hasn’t migrated yet, it can still do so.   Even though end-user DNS will point “portal.contoso.com” at SP2013, the SP2010 machines may need to resolve “portal.contoso.com” locally to better accommodate SP2010 sites which haven’t migrated.
  • This method might not be officially supported by Microsoft for SP2010 to SP2013 upgrades.  However, quick tests in the lab show the HTTP 302 redirects can be created and user traffic successfully redirects.
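The incoming-email workaround described above might look something like this as a scheduled task script (the paths and the 30-minute threshold are illustrative):

```powershell
# Hypothetical Task Scheduler script for the incoming-email consideration:
# move EML files the new farm has not picked up within a threshold over to
# the old SP2010 WFE drop folder.  All paths are illustrative.
$localDrop  = "C:\inetpub\mailroot\drop"
$remoteDrop = "\\sp2010-wfe\c$\inetpub\mailroot\drop"
$threshold  = (Get-Date).AddMinutes(-30)   # give SP2013 time to process first

Get-ChildItem -Path $localDrop -Filter *.eml |
    Where-Object {$_.LastWriteTime -lt $threshold} |
    ForEach-Object {
        Move-Item -Path $_.FullName -Destination $remoteDrop
    }
```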

How to point to a custom 404 error Web page

Recently I was working on a custom 404 page with JavaScript logic for retiring an old SharePoint farm.   Replacing the “oldportal” DNS name with “newportal” helped avoid any end-user 404 message by automatically handling all redirects.   Microsoft has a great KB article with implementation steps:  http://support.microsoft.com/kb/941329   However, I had two important changes and wanted to blog about them in case this might help others.

  1. Custom JavaScript redirect logic.   A simple replace statement helps change “oldportal” to “newportal” and seamlessly transition user traffic to the new system.
  2. PowerShell instead of a Visual Studio EXE.   Not everyone has Visual Studio or the developer skills needed to compile a command-line EXE.   PowerShell is a simple text script you can run on MOSS 2007/SP2010/SP2013 because it references the “Microsoft.SharePoint” assembly directly and works on older SharePoint server versions.

Enjoy!


PowerShell Code – apply web application setting

# Load the SharePoint assembly first so this also works on older versions without cmdlets
[void][System.Reflection.Assembly]::LoadWithPartialName("Microsoft.SharePoint")
$webapp = [Microsoft.SharePoint.Administration.SPWebApplication]::Lookup("http://oldportal")
$webapp.FileNotFoundPage = "Custom404.htm"
$webapp.Update()   # persist the setting


HTML Code – Custom404.htm

<!-- _localBinding -->
<!-- _lcid="1033" _version="" -->
<meta HTTP-EQUIV="Content-Type" content="text/html; charset=utf-8" />
<meta HTTP-EQUIV="Expires" content="0" />
	<meta http-equiv="refresh" content="0; url=/_layouts/spsredirect.aspx?noscript=1" />
<script language='javascript' src="/_layouts/1033/init.js"></script>
<script language='javascript' src="/_layouts/1033/core.js"></script>
<script language='javascript'>
// Original KB redirect, commented out:
// var requestedUrl = escapeProperly(window.location.href);
// STSNavigate("/layouts/.?oldUrl=" + requestedUrl);

// Custom logic - replace the old DNS name and redirect seamlessly
var dest = window.location.href.replace("oldportal", "newportal");
window.location.href = dest;
</script>

Launch IE as multiple users for testing

When testing role-based security, it can be helpful to right-click IE and “Run as different user.”   The PowerShell script below automates this process, making it easier to launch multiple IE windows for testing.   Keeping a copy on Windows 7/8 workstations helps testers jump in easily and with fewer password-typo lockouts.    If you find this helpful, please leave a comment.


# Launch multiple IE windows as different test user accounts.  Great for testing role-based security systems.
# Update line 3 with user names, line 6 with the Active Directory domain, and line 9 with the SharePoint URL.
$users = ('sptest1','sptest2','sptest3','sptest4')
foreach ($u in $users) {
	Write-Host $u
	$username = ('domain\' + $u)
	$password = 'pass@word1'
	$cred = New-Object System.Management.Automation.PSCredential -ArgumentList @($username,(ConvertTo-SecureString -String $password -AsPlainText -Force))
	Start-Process "c:\Program Files\Internet Explorer\iexplore.exe" -LoadUserProfile -Credential $cred -ArgumentList "http://sharepointUrlHere"
}


Here is a screenshot of the script in action.


PowerShell – Notify all site users for outage / maintenance

The PowerShell below gathers every user of every site collection on your system into a CSV so they can be emailed.   I find this helpful when planning maintenance outages to alert users of the downtime, impact, and changes being performed.


Get-SPSite -Limit All |% {$url = $_.url;$_.RootWeb.SiteUsers | select UserLogin, DisplayName, Email, @{Name="SiteUrl";Expression={$url}}} | Export-Csv SiteUsers.csv
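The one-liner only gathers the addresses; a hypothetical follow-up could read the CSV back and send the notice (the SMTP server, sender, and message body are placeholders):

```powershell
# Possible follow-up (not part of the original one-liner): read the exported
# CSV and send the outage notice to each unique email address.
Import-Csv "SiteUsers.csv" |
    Where-Object {$_.Email} |                       # skip accounts with no email address
    Select-Object -ExpandProperty Email -Unique |
    ForEach-Object {
        Send-MailMessage -SmtpServer "mail.contoso.com" `
            -From "sharepoint-admin@contoso.com" `
            -To $_ `
            -Subject "Planned SharePoint maintenance outage" `
            -Body "SharePoint will be unavailable Saturday 10 PM - 2 AM for patching."
    }
```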



View disabled Health Check rules – PowerShell one liner

Scripting this was harder than I thought it should be, so I wanted to share this crazy-long PowerShell one-liner.  If anybody knows a shorter way, please leave a comment or Tweet me.   First, we get the web applications, including a special switch to get Central Admin (which is normally skipped for safety).   Then we get the site collections, but since the Help site collection shows up too, we need to filter it out.  Then we grab the Health Analyzer Rule Definitions list.  Finally, we filter for disabled items and display them as a table.

When working on a new farm that someone else installed (or heck, even one I installed but just can’t remember how), this is a good way to confirm your assumptions about any red or yellow Health warnings you see (or don’t see because they’re suppressed).


(Get-SPWebApplication -IncludeCentralAdministration |? {$_.IsAdministrationWebApplication -eq $true} |
Get-SPSite |? {$_.Url -notlike '*help'}).RootWeb.Lists["Health Analyzer Rule Definitions"].Items |?
{$_["HealthRuleCheckEnabled"] -eq $false} | select Title,ID | ft -AutoSize


I would also recommend creating a filtered view in the Central Admin GUI for an easy way to quickly see these going forward.  There are legitimate times when a rule should be disabled; that’s why we have the checkbox.  However, it pays to know the current settings and not forget those hidden items.

Once all of the health rules are set the way you like, run a quick SPTimerJob refresh to see the outcomes of those rules.


Get-SPTimerJob |? {$_.name -like '*-health-*'} | Start-SPTimerJob




Document Library Site Collections (DLSC)

Splitting a large site collection into many can make backups, storage scaling, and even development easier.   This is a pattern I like to call the DLSC, or “Document Library Site Collection.”

A site collection starts out small, gets popular, explodes in size seemingly overnight, and the SharePoint admins scramble to support it.  Sound familiar?

It can be helpful to peel out a large Document Library into a dedicated site collection.   Then you have more options via PowerShell and SQL to manage backups.   Have five document libraries with 100 GB each?  Fine, split them out into five site collections in five SQL databases.  Very manageable.


How To – Split a Document Library into a new Site Collection

1)  Make a new SQL content database (optional).   If you know LOTS of content will be coming, then give it room to grow now.

2)  Make a new blank site collection.   The below PowerShell command can help direct the site creation to a new database.   I like to use STS#1 for “Blank Site” because it has a minimal footprint.  Less is more.

New-SPSite http://sharepoint.com/sites/doclib -OwnerAlias "DOMAIN\Admin" -ContentDatabase WSS_Content_DocLib -Name "Document Library Site Collection" -Description "DLSC" -Template "STS#1"

3)  Export the source Document Library to CMP format

http://spdeploymentwizard.codeplex.com/  can be used for a friendly safe GUI to export with all the right options.  I like to enable checkboxes for ALL security, ALL versions, and ALL user info.

4)  Import the source Document Library from CMP format

http://spdeploymentwizard.codeplex.com/ can be used for the import too.

5)  Delete the original Document Library.

6)  Update and test navigation.  The biggest challenge with the DLSC model is navigation.  Things must be easy for users so they can’t tell the difference.   Be sure to update the Top Link bar navigation.   I like to add a CEWP (Content Editor Web Part) with quick JavaScript to redirect folks back to the main “application” site.

Click here to return to AppSite
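If you prefer scripting over the Deployment Wizard GUI, steps 3 and 4 can alternatively be done with the built-in cmdlets (the URLs and paths are illustrative; note the wizard offers options the cmdlets do not):

```powershell
# Scripted alternative for steps 3-4: export one document library to CMP
# with all versions and security, then import into the new site collection.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

Export-SPWeb -Identity "http://sharepoint.com/sites/app" `
    -ItemUrl "/sites/app/Shared Documents" `
    -Path "C:\TEMP\doclib.cmp" `
    -IncludeVersions All -IncludeUserSecurity

Import-SPWeb -Identity "http://sharepoint.com/sites/doclib" `
    -Path "C:\TEMP\doclib.cmp" `
    -IncludeUserSecurity
```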

Downsides and Caveats

  • Manage two sets of security.  Yes, you’ll have to grant people permissions in two places.  Even using SPGroups won’t help because those are scoped to one site collection.  AD groups can help; they are a great way to grant security to many site collections at once.   It’s all about scale.  With only two sites AD groups probably aren’t worth the trouble, but with twenty they sure would help.
  • Sandbox code solutions.  These run in only one site collection, so if you’re using them to manage documents and you move those document libraries, you could see issues.   The workaround would probably be to upgrade the code to a farm WSP solution deployed via Central Admin.
  • Dependency.   For some migrations you’ll need to copy ALL site collections.  If you copy just one, you may have broken links or missing dependencies.


Hope this helps somebody else out there.

Site Assets – enable Version History to sleep well at night

One thing I like to do on a newly created site is enable Version History for the “Site Assets” document library.  With many CSS and JS files supporting front-end development, those code files often land here.  SharePoint Designer can be used to quickly check this setting.

“Site Pages” has Version History enabled by default on newly created Team Sites.  That covers ASPX pages, but it can be smart to enable versioning on both libraries for safety.
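If you'd rather script it than open SharePoint Designer, here is a quick sketch using the server object model (the site URL and version limit are placeholders):

```powershell
# Enable versioning on the Site Assets library via PowerShell.
Add-PSSnapin Microsoft.SharePoint.PowerShell -ErrorAction SilentlyContinue

$web = Get-SPWeb "http://portal/sites/team"   # illustrative URL
$list = $web.Lists["Site Assets"]
$list.EnableVersioning = $true
$list.MajorVersionLimit = 10    # optional: cap the number of stored versions
$list.Update()
$web.Dispose()
```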
