Friday, October 16, 2009

Remove and Add Back Multiple SharePoint Content Databases with PowerShell

The TechNet documentation for deploying software updates for SharePoint Server 2007 suggests, among other things, detaching your content databases to reduce downtime. This is easy enough to do through Central Admin when you have a few databases and a small number of web applications, but how about in a large farm with many web applications and dozens of content databases across multiple SQL Server instances? PowerShell to the rescue!

First: Remove/Detach the Databases – Create a CSV file with the column headings webapp and url, where webapp is a friendly name and url is the associated URL for the web application. CSV example below:



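The CSV example didn't survive the posting, but a webapps.csv along these lines (the web app names and URLs are illustrative placeholders) is what the script expects:

```
webapp,url
Portal1,http://portal1.contoso.com
Portal2,http://portal2.contoso.com
TeamSites,http://teams.contoso.com
```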
Use the PowerShell script below to loop through all of your web applications using the above CSV file, enumerate the content databases via STSADM, and write each content database name and SQL Server server/instance name to a new CSV file named after the webapp value. An internal loop reads from the new CSV file and detaches the databases (I commented the delete line out below – uncomment when ready to run).

#loop through a csv file that contains webapp and url, run enumcontentdbs, export results to csv
Import-Csv .\webapps.csv | ForEach {
    $strWA = $_.webapp
    $strURL = $_.url

    #enumcontentdbs returns XML -- cast the raw stsadm output so the databases node can be walked
    [xml]$sitesxml = stsadm -o enumcontentdbs -url $strURL | Out-String

    $sitesxml.databases.contentdatabase | Select-Object server,name | export-csv .\$strWA.csv -noType

    #loop through contentDBs from csv created in above line and delete
    Import-Csv .\$strWA.csv | ForEach {
        $strServer = $_.Server
        $strName = $_.Name

        write-output "WebApp: $strURL ContentDB: $strName"

        #uncomment the next line when ready to actually detach
        #$delDB = stsadm.exe -o deletecontentdb -url $strURL -databasename $strName
    }
}



Second: Add the Databases Back – After you’ve run all of your updates, you need to add your content databases back. Use the PowerShell script below to loop through all of your web applications and read in the values from the CSV files created by the previous script to add the content databases back.

#loop through webapps from original csv -- same as remove script
Import-Csv .\webapps.csv | ForEach {
    $strWA = $_.webapp
    $strURL = $_.url

    #loop through contentDBs and add based on csv created by webAppDbs_Remove.ps1
    Import-Csv .\$strWA.csv | ForEach {
        $strServer = $_.Server
        $strName = $_.Name

        #write-output "WebApp: $strURL ContentDB: $strName"

        $AddDB = stsadm.exe -o addcontentdb -url $strURL -databasename $strName -databaseserver $strServer -sitewarning 450 -sitemax 500
    }
}


Save the scripts in appropriately named PowerShell .ps1 files (e.g., webAppDBs_Remove.ps1 and webAppDBs_Add.ps1) on an Application or Web Front End server where PowerShell is installed and call them from a .bat file with commands similar to the below:
powershell.exe Set-ExecutionPolicy RemoteSigned

REM command to run script from the ps1 file to remove

powershell.exe -noexit .\webAppDBs_Remove.ps1

REM command to run script from the ps1 file to add back

REM powershell.exe -noexit .\webAppDBs_Add.ps1

Friday, August 7, 2009

Handy VBScript for STSADM looping

Before getting into PowerShell, I used to do a lot of administrative scripting for SharePoint using VBScript.  In fact, we still use a VBScript-based automated farm installation that a consultant assembled for us (Thanks, V-man!) – I intend on switching over to PowerShell in the future.

The VBScripts are tried-and-true, though.  Here’s a little one I like to use when I need to loop through an stsadm command multiple times.  I use it for things like adding multiple managed paths, renaming sites to clear orphans, and running database repair to clear up list and document library orphans on multiple databases.

'script to add managed paths based on values in paths.csv file stored on same drive as script

'paths.csv file entry example:




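The paths.csv example didn't survive the posting; given how the script splits each line (host first, managed path second), illustrative entries would look like:

```
portal.contoso.com,/sites/sales
portal.contoso.com,/sites/marketing
teams.contoso.com,/projects/alpha
```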

Const ForReading = 1

Const cStsadmPath = """C:\Program Files\Common Files\Microsoft Shared\web server extensions\12\BIN\stsadm.exe"""

Dim retval
Dim objshell
Dim cmd

Set objFSO = CreateObject("Scripting.FileSystemObject")
Set objTextFile = objFSO.OpenTextFile(".\paths.csv", ForReading)

Do Until objTextFile.AtEndOfStream
    strNextLine = objTextFile.Readline
    arrPath = Split(strNextLine, ",")

    Set objShell = CreateObject("WScript.Shell")

    'grab webapp from CSV which is the 1st value and the managed path from the 2nd value in the base 0 array to build the addpath command
    cmd = cStsadmPath & " -o addpath -url http://" & arrPath(0) & arrPath(1) & " -type wildcardinclusion"

    Wscript.Echo cmd
    Set retval = objShell.Exec(cmd)

    'Wait until command is done running before running next command
    Do Until retval.Status = 1
        Wscript.Sleep 250
    Loop

    Set retval = Nothing
    Set objShell = Nothing
Loop

objTextFile.Close

I have a script that enumerates the sites and throws the output to an XML file – and then the VBScript uses the XMLDOM to read in the values, but the output gets corrupted (probably because I’m enumerating close to 10,000 site collections) and can’t reliably be read.  I intend on using PowerShell and getting the sites through the SharePoint object model – look for that in a future post.

Wednesday, July 15, 2009

Points about SharePoint Search

Have a growing index – at about 3 million items – not humongous but enough to cause crawl headaches.

So here are some lessons learnt:

  1. Use a dedicated web front end server for crawl – preferably one that doesn’t get user traffic.  The caveat here is that in a farm where you load balance your web front ends, with multiple web apps that each have their own IP assigned in IIS, you have to set the value in the Central Administration website (CA) on your Index server to “Use all web front ends for crawling.” – Say what? – Yes, if you pick a dedicated crawl server in CA, SharePoint does this thing where it automatically adds the host entries to the hosts file on your Index server – and it grabs the default IP from that server and associates it with all of your web apps (a timer job overwrites the value, so don’t bother trying to correct the IPs to point to your load balanced member IPs).  So what you do is set it to crawl all web front ends in CA – which will stop this host entry re-write madness – and then go to the Index server and enter the correct IPs in the hosts file (even though you set CA to crawl all front ends, the Index server will use whatever is in the hosts file instead).  So enjoy the added maintenance task of updating the hosts file every time you add a new host header site collection.  To their credit, Microsoft documents this.  Also, the MS enterprise search team mentions that too many crawl hosts can starve your crawl – which reinforces the dedicated crawl server scenario.
  2. Know the stsadm commands osearch and spsearch.  We had an admin reset all the search services in an attempt to resolve crawl issues, and he really screwed things up – he’s no longer an admin.  He turned on WSS Help search through CA, but CA doesn’t let you pick an index location for WSS search (only Office Server Search – MOSS search – has this option in the CA interface), so we started getting alerts about the C:\ drive running out of space.  I learned that you can only set this value through the stsadm -o spsearch -indexlocation switch.  We have an automated farm install that shells out to stsadm, which set all of these search settings over a year ago when we built this farm – the guy that created the script knew about this and accommodated for it – the admin did not.
  3. Make sure your search crawl account has FULL READ only in the Policy for Web Application setting in CA.  An admin set it to FULL CONTROL, which causes search to crawl unpublished items – again, he’s not an admin any longer.
  4. If you’re fortunate enough to have a dedicated Index server, set Indexer Performance to Maximum.  Keep an eye on it with some performance monitors and tune with crawler impact rules (part of the MS search team’s post from the link in item 1 above).  Right now I have a crawler impact rule against all sites to request 64 documents at a time.  The performance monitors are okay with this for now, but I’m watching it.
  5. Make sure you’re using RAID 1+0 on all disks related to search (Index, Query, and DB).  We designed the search DB to reside on its own dedicated disk, but someone came along and put a bunch of content DBs on the same disk, so we had to go through the extra maintenance task of moving the search DB file.
  6. Read the search team’s post about crawl performance (see link in item 1 above).  The guidance in that post prompted me to stop all of our content source crawls, which were starving, and rerun a full crawl of each individually (we have over 5,000 site collections in this farm and use only host header URL site collections for portals, so we have content sources broken out to crawl managed path based team sites in one web app, portals in 4 separate web apps, my sites, and some individual site content crawls that have a higher SLA for crawl).  The thought is that this should speed up future incremental crawls and will also help in configuring a better crawl schedule.
  7. On your dedicated crawl front end server, disable Internet Explorer Enhanced Security.  We were getting thousands of errors with the message “An unrecognized http status was received…”  An MS KB says to turn off proxy server settings in IE – we don’t have proxy server settings in IE.  We disable IE Enhanced Security on all of our WFE servers anyway, but this one snuck by.  I disabled it through Add/Remove Programs > Windows Components, did a new, full crawl on an 850,000 item content source, and watched errors go from over 20,000 to 1,000 with 1.3 million items now attributed to this content source.
  8. Be careful with metadata property mappings.  Someone added a mapping that made the “This List” contextual search stop functioning properly.  MS KB968476 addresses this.  However, I think this KB is missing steps.  If the KB doesn’t correct the search issue, perform the extra step of going to Metadata Properties > browse to find the “Path” managed property and remove all mappings except for Basic:9 (text).
  9. Don’t stop search services or add new query server while crawl is running.  Pause or stop crawl before making changes to search configurations.
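The index location fix from item 2 above is a one-liner once you know the switch; a hedged example (the path is a placeholder for a data drive on your search server):

```
stsadm -o spsearch -indexlocation "D:\Data\WSSHelpSearchIndex"
```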

If you get a chance, check out the Microsoft Enterprise Search Team Blog.

Thursday, July 9, 2009

SharePoint Load Balancing with F5 BIG-IP LTM

NOTE: Since originally posting, I realized you may want to do this by binding to individual ports in IIS instead of individual IPs (this saves the need for all of those extra IPs and the steps to add and configure them as presented below; i.e., the members can be any port but the VIP would still be port 80).

In the MOSS 2007 deployments a co-worker and I designed for my company, the F5 BIG-IP Local Traffic Manager is the network device used to load balance Web Applications.  In the F5 configuration (sample below), each client facing MOSS Web Application requires a VIP (Virtual IP Address).  The VIP has several host (AKA member) server IP addresses associated to it to make up a Virtual Server Pool.  The VIP is what we associate to the end-user URL in DNS.

In sample, you see the MOSS Web Front Ends (WFE) listed as Hosts.  One WFE host can be included as a member in many Virtual Server Pools; however, one WFE host IP address cannot be a member of more than one Virtual Server Pool (Huh?..You may need to re-read that last sentence a few times – graphic at bottom of post should help).

So for the WFE hosts, get as many IP addresses as there are MOSS Web Applications (consider getting even more in case the need to add or extend web apps in the future arises…it will).  The IPs are added to the Network Adapter on each WFE (steps with screenshots here).  Once the IPs are added to the Network Adapter, they become available in IIS so a unique IP can be assigned to each IIS Website (Note:  While you’re in the IIS website properties, remove the host header from IIS website that was created when you first added the web app, otherwise, you’ll never be able to reach host header site collections created in this web app by adding a DNS alias to the VIP – don’t worry about this if you’re going with managed path based sites…I’ve said too much already…moving on).  As an example, in the figure below, there are 4 client facing web applications (Portal 1, Portal 2, Team Sites, and My Sites) and each of the 5 WFE Hosts has 4 unique IP addresses.
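For Windows Server 2003 WFEs, adding the extra IPs to the adapter can also be scripted with netsh rather than clicking through the Network Connections GUI (the adapter name, addresses, and mask here are illustrative placeholders):

```
REM one additional IP per client-facing web application on this WFE
netsh interface ip add address "Local Area Connection" 10.10.1.21 255.255.255.0
netsh interface ip add address "Local Area Connection" 10.10.1.22 255.255.255.0
netsh interface ip add address "Local Area Connection" 10.10.1.23 255.255.255.0
netsh interface ip add address "Local Area Connection" 10.10.1.24 255.255.255.0
```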

Further, the design takes load weights into consideration.  For high availability, the deployments consist of multiple WFEs.  With multiple WFEs, you can distribute the load to maximize performance and availability.  MOSS installs all web applications and their corresponding application pools on every WFE host within the farm.  However, to maximize CPU, not every application pool runs on every host (this requires configuration of application pool performance settings – Joel Oleson has guidance here).  If you’re dealing with load weights using multiple front ends and web apps, this can get complex – I’ll post my PowerShell script for configuring IIS on multiple servers with multiple websites by reading in settings from a CSV file in a future post.

Using load weights, traffic load is distributed only on a select number of hosts allowing hosts to concentrate processes on a few application pools instead of all application pools.  The design leaves us with spare IPs.  This provides flexibility to add or remove WFEs to the Virtual Server Pools as needed.



For more information about load balancing and F5, the F5 folks have a SharePoint deployment guide here.  They also have a nice Load Balancing 101 paper here.

Friday, July 3, 2009

SharePoint and Novell eDirectory LDAP

Setting up an LDAP authentication source in SharePoint is well documented, so I'm not going to reinvent the wheel here, but I will cover some of the nuances of how I deployed it and glue the documentation together.

My company has invested in Novell eDirectory for its identity management deployment. With eDirectory, we're able to consolidate and sync multiple LDAP directories so users get a single sign-on experience. It also provides account self-provisioning and management for users via an externally facing website.

First the links:
  • Microsoft documents multiple authentication providers for SharePoint and setting up an LDAP membership provider with examples here.
  • Nick Kellett did a great job documenting every step required to connect the eDirectory LDAP here.
  • Wen He provided a piece Nick missed to expose the LDAP through the SharePoint people picker here.
  • My own blog details some steps to push out the web.config changes to all of your farm web apps here.

Now here's what I did:

  • Planned out the membership provider attributes to use in the web.config.
  • Downloaded and installed the free Softerra LDAP browser recommended in Nick Kellett's post on a Web Front End Server in my test Farm.
  • Obtained the Novell LDAP server name and port information from the team at my company that manages Novell eDirectory.
  • Entered server information and port into the LDAP Browser and started browsing objects.
  • Drilled down the browser to the user container object, right clicked on the container, and took the LDAP path from the properties screen to use in the provider.

  • Started browsing the available attributes of users.
  • Decided on using the mail attribute for the login in the provider, setting userNameAttribute='mail' so when people use this form of authentication, they will use their e-mail address as their username. We have both employees and non-employee content managers using this authentication to SharePoint sites in our DMZ. With this, our employees can log in using the same password they use to log in to our network, in conjunction with their e-mail address. External users can also log in with their e-mail address and a password set and maintained by visiting an externally facing site.
  • Tested and discovered that useDNAttribute="false" has to be set -- which kind of renders some of the other attributes useless -- but a value of "true" makes your entire provider useless.
  • My provider looks like this:
  • <add name='IDM'
    type='Microsoft.Office.Server.Security.LDAPMembershipProvider, Microsoft.Office.Server, Version=, Culture=neutral, PublicKeyToken=71E9BCE111E9429C'
    server='[ldap server name]'
    port='389'
    useSSL='false'
    useDNAttribute='false'
    userDNAttribute='cn'
    userNameAttribute='mail'
    userContainer='[Ldap path discovered using ldap browser]'
    userObjectClass='person'
    userFilter='(ObjectClass=*)'
    scope='Subtree'
    otherRequiredUserAttributes='sn,givenname,cn' />

  • The usage will be in a DMZ, so I had the firewall opened up on ports 389 and 636 (636 used for SSL) bi-directionally between the internal NIC IP addresses of all of the web front end servers and the eDirectory LDAP server IP.
  • Through SharePoint Central Administration site, extended web apps to Custom zone (our default zone uses Windows Based Authentication).
  • Manually added the peoplepicker and provider values to the Central Admin site web.config.
  • Ran Powershell script detailed in my previous blog to update the web.config files for all of my portal web applications (see link above).
  • Updated the zone provider for all the portals via a batch file -- for this, I threw in a stsadm command like the following for each of the newly extended sites (the batch file saves my Operations team from having to perform the same steps through Central Admin -- it also prevents the chance of them putting in a wrong value):
  • stsadm -o authentication -url http://[extended site host header value] -type forms -membershipprovider [membership provider name] -allowanonymous
  • In Central Administration, added site collection administrator for one of the root portal sites (browsed through people picker and was able to select my e-mail address).
  • Opened IIS and browsed to the new website, where I was prompted with the standard SharePoint forms login page. Entered my e-mail address and network password and successfully signed in.

Saturday, June 27, 2009

Don't upload Microsoft Access Databases into SharePoint

My favorite word is No. But I like to follow No with But.

Often, site owners ask me to allow upload of Microsoft Access files by disabling the block of the .mdb file extension. Here's my template "No...but" response.

We will continue to block MDB files because unsafe code can be included within an Access database. However, Office Access 2007 allows code to be either verified as safe or disabled, so the ACCDB extension can be uploaded into SharePoint.

Here are the steps to convert a database from MDB to the safe ACCDB format: open the database in Access 2007, click the Microsoft Office Button, point to Save As, and then click Access 2007 Database.

Alternatively, if you can’t convert to Access 2007, consider publishing any data in Access 2003 databases (like flat tables) into SharePoint lists.

We've deployed the Office 2007 clients globally so it's easier for me to say no in this case. My goal is to get people to adopt SharePoint...and with deeper integration and more Access-like features coming in SharePoint 2010, they're going to be better off moving away from a strictly Microsoft Access app.

Monday, June 22, 2009

Move SharePoint Databases without a DBA

Did you ever get a request to move 198 SharePoint databases? Every time I think about working with SQL Server, I remember a session at The March 2008 SharePoint Conference titled, "SharePoint Admin -- the Reluctant DBA." I live the life of the reluctant DBA every day, so I didn't attend that session, but I remember the title.

I inherited a SharePoint development farm. I didn't build it and up until 7 weeks ago, I didn't even know the server names. Now I'm the contact for making decisions about these servers.

Apparently, our SAN guys gave the database server running SQL a new 650GB LUN to present as a disk where content databases will reside, and now they want the old 250GB LUN back. So I get an e-mail from one of the SAN guys in India asking to move the content from the old disk to the new disk. Fair enough. I go to see what is on the disk and see 198 SQL MDF data files and 198 SQL log files (design side note: don’t put data and log files on the same disk – this is a DEV farm where it’s not as critical to follow this rule – definitely follow the rule in your Stage and Prod environments).

Did I mention I work for a large, global corporation? Well we got DBAs. However, to get the DBAs to work for you, you need to submit a Request For Service (It's a Microsoft Word Form) providing an overview of what needs to be done to a Group Mailbox. The people who maintain the mailbox then assign it to the responsible group (I cross my fingers at that point because I once submitted one of these requests to create a new Service Account and they assigned it to my boss who called me and asked what I needed done -- it eventually got assigned to the AD Design guy who sits 2 aisles away from me). After the request is assigned, you wait in queue. This is where I become the reluctant DBA -- I hate to wait in lines.

Okay, so how do you move 198 databases? Find an account that has Sysadmin rights in SQL -- I got lucky and the guy that built the farm used one of my accounts to install. Kick your developers out -- be nice. Verify the database backup ran.

This will take approximately one hour -- Somebody time me:

1. Log on to the server running SQL Server 2005 (1 minute).

2. Open up SQL Server Management Studio and connect to your SQL Instance where the Databases reside (1 minute).

3. Run the following query to generate T-SQL to set all your DBs OFFLINE (again, I got lucky -- I ran a Google search for "sql server detach multiple databases" and got this post as the first hit. Thanks, vidhya sagar at SQL -- it's a great approach. Using SQL Server 2005's "ALTER DATABASE...MODIFY FILE" capability, I modified it to just set the databases Offline and then bring them back Online after the files are copied to the new location) (1 minute):

set nocount on

declare @dbname as varchar(80)
declare @server_name as varchar(20)
select @server_name = @@servername
declare rs_cursor CURSOR for select name from master.dbo.sysdatabases where name not in ('model','master','msdb','tempdb','alert_db','mssecurity') and filename like 'E:\MSSQL2K5\Data%'
open rs_cursor
Fetch next from rs_cursor into @dbname
if @@FETCH_STATUS <> 0
    PRINT 'No database to backup...Please check your script!!!'
while @@FETCH_STATUS = 0
begin
    print 'ALTER DATABASE ' + @dbname + ' SET OFFLINE'
    print 'go'
    print 'print ''Setting of ' + upper(@dbname) + ' database to OFFLINE successfully completed'''
    print 'go'
    FETCH NEXT FROM rs_cursor INTO @dbname
end
CLOSE rs_cursor
deallocate rs_cursor
print ' '
print 'print ''SERVER NAME : ' + upper(@server_name) + '--> All databases successfully set OFFLINE'''

4. Run the following query to generate T-SQL to set the new file locations and bring your DBs back Online (modify for your environment) (1 minute):

set nocount on

declare @dbname as varchar(80)
declare @server_name as varchar(20)
select @server_name = @@servername
declare rs_cursor CURSOR for select name from master.dbo.sysdatabases where name not in ('model','master','msdb','tempdb','alert_db','mssecurity') and filename like 'E:\MSSQL2K5\Data%'
open rs_cursor
Fetch next from rs_cursor into @dbname
if @@FETCH_STATUS <> 0
    PRINT 'No database to backup...Please check your script!!!'
while @@FETCH_STATUS = 0
begin
    print 'USE [master]'
    print 'GO'
    print 'ALTER DATABASE ' + @dbname
    print 'MODIFY FILE'
    print '(NAME = '''+ @dbname + ''', FILENAME = ' + '''G:\MSSQL2K5\Data\' + @dbname + '.mdf''' + ')'
    print 'GO'
    print 'ALTER DATABASE ' + @dbname
    print 'MODIFY FILE'
    print '(NAME = '''+ @dbname + '_log'''+ ', FILENAME = ' + '''H:\MSSQL2K5\Log\' + @dbname + '_log.ldf''' + ')'
    print 'GO'
    print 'ALTER DATABASE ' + @dbname + ' SET ONLINE'
    print 'GO'
    print 'print ''Setting of ' + upper(@dbname) + ' database to ONLINE successfully completed'''
    print 'go'
    FETCH NEXT FROM rs_cursor INTO @dbname
end
CLOSE rs_cursor
deallocate rs_cursor
print ' '
print 'print ''SERVER NAME : ' + upper(@server_name) + '--> All databases successfully set ONLINE'''

5. Copy the output from Step 3 and execute it. All the content db's will be set Offline!!!! (5 minutes)

6. Copy your files to the new location -- WARNING: don't do a cut and paste. COPY FILES -- I've witnessed attaches through SQL GUI fail with the result of data files magically disappearing (Don't get me started about what it takes to get our off-shore Backup guys to restore files from tape) (10 minutes).

7. Copy the output from Step 4 and execute it. All the content db's will be brought back on-line pointing to new file locations!!!! (5 minutes)

8. Verify databases are accessible. (31 minutes)

9. Delete the Old Files. (5 minutes)
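For the verification in step 8, a quick sanity check (assuming the new drive letters used above) is to query sys.master_files and confirm every moved file now points at the new paths and is online:

```sql
-- list files that now live on the new data and log drives, with their state
SELECT DB_NAME(database_id) AS database_name, name, physical_name, state_desc
FROM sys.master_files
WHERE physical_name LIKE 'G:\MSSQL2K5\Data%'
   OR physical_name LIKE 'H:\MSSQL2K5\Log%'
```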

Saturday, June 20, 2009

Update SharePoint web.config with Powershell

Had a need to update 48 web.config files in a SharePoint farm. The purpose was to add Novell eDirectory as an authentication provider for a SharePoint farm in our DMZ consisting of 4 web applications (8 IIS websites by virtue of extending) load balanced across 6 Web Front Ends (more on that in a future post).

To add the provider, there are several web.config additions...and I'm sick of writing complex checklists for our operations folks to carry out the changes. I was also a little worried about them missing some of the additions. I prefer to have them perform one step -- double-click "webconfigupdate.bat" -- and be done rather than manually having them open, edit, and save 48 web.config files.

Powershell and the SPWebConfigModification Class to the rescue.

Credit needs to go to the following -- I was able to merge the ideas from the code in the first two posts to get my final PowerShell script:

  • This is the post by Raymond Mitchell that got me started. It's a great sample PowerShell script for a single web app, although I needed to loop through every web app in my farm.
  • This is the post by Gary Lapointe where I found how to use Powershell to loop through web apps on a farm – it's not the first time Gary's blog helped me understand some administration scripting.
  • Mark Wagner has some nice guidance on this topic. Helped me understand errors I received.
  • Reza Alirezaei has a great post on SPWebConfigModification's Top 6 Issues which helped me confirm much of the oddness that resulted when testing.
  • Gotta thank "The Groom," a team member who got the SPWebConfigModification class working in a Solution he deployed. I tried the SPWebConfigModification class once before but abandoned it because of issues I was seeing based on Reza's post. The Groom's success inspired me to give it another try.

Before starting, note that the SPWebConfigModification class requires XPath values for its "Path" and "Name" properties. The Path and Name values are how the class finds the location in the web.config to add, update, and remove values.

The BIT-101 XPath Query Tool is a nice, free, online tool for finding the correct XPath to enter. I used the tool by simply copying my web.config – with the new elements and attributes I wanted in my final web.config -- and pasting it into this Java-based tool. I then queried for the new elements and attributes to determine the XPath I would use in my script.

There are some nice examples here to help you understand XPath if you're not familiar.

Here's the code I threw into a Powershell .ps1 file I named webconfigchange.ps1:

$farm = [Microsoft.SharePoint.Administration.SPFarm]::Local
$websvcs = $farm.Services | Where-Object -FilterScript {$_.GetType() -eq [Microsoft.SharePoint.Administration.SPWebService]}
#use powershell method for suppressing errors for reason on next line -- updates will run successfully even though errors are thrown -- comment out erroractionpreference variable when debugging
#If more than one call to the ApplyWebConfigModifications() method is made in close succession you will get the following error due to timer job:
#A web configuration modification operation is already running.
$erroractionpreference = "SilentlyContinue"

#loop through each portal web app
foreach ($websvc in $websvcs) {
  foreach ($webapp in $websvc.WebApplications) {
    #update web.config for all web apps in farm excluding mysites and ssp
    if ($webapp.Name -like "*my*") {write-output "skipping mysites..."}
    elseif ($webapp.Name -like "*ssp*") {write-output "skipping SSP..."}
    else {
      #add peoplepicker node for IDM – IDM is name we will use for Authentication Provider for Novell Identity Management
      #here's where you'll need to understand XPath – the constructor takes the Name and Path values as arguments – later in script, I'll optionally use the Name and Path properties instead of ArgumentList
      $ppMod = New-Object -TypeName "Microsoft.SharePoint.Administration.SPWebConfigModification" -ArgumentList "add[@key='IDM']", "configuration/SharePoint/PeoplePickerWildcards"
      $ppMod.Sequence = 0
      $ppMod.Owner = "SimpleSampleUniqueOwnerValue"
      $ppMod.Type = "EnsureChildNode"
      #this is actual value which will be added or modified
      $ppMod.Value = "<add key='IDM' value='*' />"

      #add membership node
      $memberMod = New-Object -TypeName "Microsoft.SharePoint.Administration.SPWebConfigModification" -ArgumentList "membership", "configuration/system.web"
      $memberMod.Sequence = 0
      $memberMod.Owner = "SimpleSampleUniqueOwnerValue"
      $memberMod.Type = "EnsureChildNode"
      $memberMod.Value = "<membership defaultProvider='IDM'></membership>"

      #add providers node
      $providerMod = New-Object -TypeName "Microsoft.SharePoint.Administration.SPWebConfigModification" -ArgumentList "providers", "configuration/system.web/membership"
      $providerMod.Sequence = 0
      $providerMod.Owner = "SimpleSampleUniqueOwnerValue"
      $providerMod.Type = "EnsureChildNode"
      $providerMod.Value = "<providers></providers>"

      #add IDM LDAP attributes – here, we're using Path and Name properties instead of ArgumentList
      $ldapMod = New-Object Microsoft.SharePoint.Administration.SPWebConfigModification
      $ldapMod.Path = "configuration/system.web/membership/providers"
      $ldapMod.Name = "add[@name='IDM'][@type='Microsoft.Office.Server.Security.LDAPMembershipProvider, Microsoft.Office.Server, Version=, Culture=neutral, PublicKeyToken=71E9BCE111E9429C'][@server='<your Ldap server>'][@port='389'][@useSSL='false'][@useDNAttribute='false'][@userDNAttribute='cn'][@userNameAttribute='mail'][@userContainer='<your Ldap path>'][@userObjectClass='person'][@userFilter='(ObjectClass=*)'][@scope='Subtree'][@otherRequiredUserAttributes='sn,givenname,cn']"
      $ldapMod.Sequence = 0
      $ldapMod.Owner = "SimpleSampleUniqueOwnerValue"
      $ldapMod.Type = "EnsureChildNode"
      $ldapMod.Value = "<add name='IDM' type='Microsoft.Office.Server.Security.LDAPMembershipProvider, Microsoft.Office.Server, Version=, Culture=neutral, PublicKeyToken=71E9BCE111E9429C' server='<your Ldap server>' port='389' useSSL='false' useDNAttribute='false' userDNAttribute='cn' userNameAttribute='mail' userContainer='<your Ldap path>' userObjectClass='person' userFilter='(ObjectClass=*)' scope='Subtree' otherRequiredUserAttributes='sn,givenname,cn' />"

      #save changes and apply to farm
      [void]$webapp.WebConfigModifications.Add($ppMod)
      [void]$webapp.WebConfigModifications.Add($memberMod)
      [void]$webapp.WebConfigModifications.Add($providerMod)
      [void]$webapp.WebConfigModifications.Add($ldapMod)
      $webapp.Update()

      $method = [Microsoft.Sharepoint.Administration.SPServiceCollection].GetMethod("GetValue", [Type]::EmptyTypes)
      $closedMethod = $method.MakeGenericMethod([Microsoft.Sharepoint.Administration.SPWebService])
      $services = $webapp.Farm.Services
      $service = $closedMethod.Invoke($services, [Type]::EmptyTypes)
      $service.ApplyWebConfigModifications()
    }
  }
}

I call the above script to run from a .bat file with below commands:

powershell.exe Set-ExecutionPolicy RemoteSigned
REM command to run script from the ps1 file
powershell.exe -noexit .\webconfigchange.ps1

So there it is. With a double-click of a batch file, 48 web.config files are modified. If I ever want to remove the values, I can run the same ps1 but replace every "WebConfigModifications.Add" entry with "WebConfigModifications.Remove." Another thing I like about the SPWebConfigModification class is that all the changes are stored in the SharePoint configuration database, so if I ever add new front ends, the web.config values are automatically populated.
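A minimal sketch of that removal variant, assuming it runs inside the same web app loop as the script above (so $webapp and $service are already populated) and that the same Owner value was used when adding:

```powershell
#find this script's entries by Owner, remove them, and push the change out to the farm
$myMods = @($webapp.WebConfigModifications | Where-Object {$_.Owner -eq "SimpleSampleUniqueOwnerValue"})
foreach ($mod in $myMods) { [void]$webapp.WebConfigModifications.Remove($mod) }
$webapp.Update()
$service.ApplyWebConfigModifications()
```

Filtering by Owner is the reason the script sets a unique Owner on every modification -- it gives you a clean handle for backing the changes out later.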