Category: sharepoint

  • SharePoint 2010 Image Sync to AD

    Similar to previous identity-management issues with SharePoint (I hate you, FIM!), I ran into another issue the other day.  There is a requirement to have SharePoint 2010 be the place where users can manage their profile information, but the most important thing is to have profile images sync to AD so they can be used in Lync and Outlook.

    The guys on the ground were pulling their hair out: they had followed the instructions from two corroborating sites, but even with all of that setup, images were not being successfully written to AD.

    Knowing how much fun FIM is, I did a bit of digging prior to arriving on site and found an article describing symptoms very similar to the ones they were having.  It turned out to be the answer, but I’m going to duplicate a bit of it here just in case it disappears.

    1. You have correctly configured FIM to sync the images as per the TechNet articles linked above (“sites”).
    2. Looking in the IIS logs of the mysite (or whatever name is accurate) web app, you see 401.1 errors with sc-win32-status 2148074254 and/or 2148074252 on anonymous users accessing the thumbnail JPEGs.

    What you need to do is log into the box where FIM is running as the FIM sync account.  From there, add your mysite URL to the Local intranet zone in IE.  Re-run the sync and it should work.

    The reason for the IIS log errors is that FIM is not passing credentials when it is challenged.  By adding the mysite to the intranet zone, IE will automatically send credentials instead of waiting to be prompted (unless a GPO has overridden this setting).
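
    If you would rather script the zone change than click through IE’s options, the intranet-zone mapping lives under the current user’s ZoneMap registry key.  This is a sketch, assuming a host name of mysite — substitute your actual mysite host, and run it while logged in as the FIM sync account:

    #Add the mysite host to the Local intranet zone (zone 1) for the current user.
    #"mysite" is a placeholder host name; the property name is the URL scheme.
    $domains = "HKCU:\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains"
    $site    = "mysite"

    New-Item -Path "$domains\$site" -Force | Out-Null
    #DWORD value 1 = Local intranet zone
    Set-ItemProperty -Path "$domains\$site" -Name "https" -Value 1 -Type DWord

    Note that this only sets the current user’s zone mapping, which is exactly the scope that matters here since FIM makes the request under the sync account.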

  • SharePoint 2010 User Profile Sync: stopped-extension-dll-exception

    Well, it’s good to see that User Profile Sync can be better in 2010 than it was in 2007.  However, there are definitely some issues still.

    The first one, which we just noticed, was that the User Profile Sync jobs were constantly failing.  Unfortunately, there isn’t really a good way to know without going into the MIISClient program to look at the errors.  Basically, if you think, for whatever reason, profile sync is not working, open up MIISClient.exe (under Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell) as the farm account and take a look to see if everything is a success.
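
    If you’re logged in under a different account, you can launch the client under the farm account’s credentials instead of logging out.  A sketch — the path matches a default install, and DOMAIN\spfarm is a placeholder for your farm account:

    #Launch MIISClient.exe as the farm account (you will be prompted for its password)
    $miisClient = "C:\Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell\miisclient.exe"
    Start-Process -FilePath $miisClient -Credential (Get-Credential "DOMAIN\spfarm")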

    For us, we were seeing all the MOSS-{guid} jobs failing with the error stopped-extension-dll-exception.

    Based on the lovely error message, I’m still amazed that I was able to isolate this issue (the event logs reported that CA was being accessed via a non-registered name).  It turns out it is because of the alternate access mappings (AAMs) for the Central Administration (CA) website.  Normally, SharePoint sets up the AAM for CA as the name of the machine you first install SharePoint on.  However, we had changed the AAM to a friendlier name.

    When you update the “Public URL for Zone” for the CA website, the change does not propagate into the MIISClient.  This leaves the MIISClient unable to correctly reach the CA APIs used by the user profile sync (or at least that is my working theory).

    Fix it with the following steps:

    1. Open MIISClient.exe as the farm account.
    2. Tools > Management Agents (or click the Management Agents in the bar)
    3. Right-click on the MOSS-{guid} management agent and select Properties
    4. Go to the Configure Connection Information section in the left-hand pane
    5. In the connection information box, change the Connect To URL to be the same URL as listed as the “Public URL for Zone” for your CA in the AAM configuration.
    6. Re-enter the farm account username and password for good measure
    7. Save the configuration
    8. Run a full profile sync from CA
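
    To confirm what the “Public URL for Zone” actually is before typing it into the MIISClient, you can query the CA web application from the SharePoint 2010 Management Shell.  A sketch using standard cmdlets:

    #Find the Central Administration web application
    $ca = Get-SPWebApplication -IncludeCentralAdministration |
          Where-Object { $_.IsAdministrationWebApplication }

    #The Default-zone public URL is what the MIISClient "Connect To" field should match
    Get-SPAlternateURL -WebApplication $ca -Zone Default | Select-Object IncomingUrl, PublicUrl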

  • SharePoint 2010 Synthetic File Data

    Still trying to work through creating synthetic data for an out-of-the-box SharePoint performance test.  The script below generates the test data, creates a new site collection (so it doesn’t interfere with anything else and is easy to clean up), and uploads all the test data into it.  The biggest downside right now is that the data is generated on disk and then uploaded, which requires enough free disk space to hold it all.  Not a huge issue for me, but possibly for you.

    The general idea for the upload came from a few places; the file creation was worked out locally.

    #USER Defined Variables
    #Specify the extension types of files you want uploaded
    $strDocTypes = @(".docx", ".xlsx", ".pptx", ".pdf")
    #The max amount of data generated in MB
    $maxSize = 50
    #The max size any one file can be in MB
    $maxFileSize = 10
    #Intermediate folder where the test data is placed
    $fileSource = "F:\TestData"
    #New content database (for easy removal)
    $dbName = "Portal_ContentDB2"
    #New site collection template
    $template = "SPSPORTAL#0"
    #Site collection owner
    $siteOwner = "TEST\Administrator"
    #Web application address
    $webApp = "https://portal"
    #Site collection address
    $siteCollection = "/sites/1"
    #Do not edit anything beyond this line
    
    #Create all the test data using FSUTIL
    if (-not (Test-Path $fileSource)) { New-Item -Path $fileSource -ItemType Directory | Out-Null }
    $rand = New-Object System.Random
    $fileTotalBytes = 0
    #Data generation keeps going until the amount of data is > $maxSize
    do {
    	$guid = [guid]::NewGuid().ToString()
    	$fileName = $guid + $strDocTypes[$rand.Next(0, $strDocTypes.Length)]
    	#Multiplying three random doubles skews the file sizes toward the small end
    	[int]$fileSize = 1048576 * $rand.NextDouble() * $rand.NextDouble() * $rand.NextDouble() * $maxFileSize
    	FSUTIL FILE CREATENEW "$fileSource\$fileName" $fileSize
    	$fileTotalBytes = $fileTotalBytes + $fileSize
    	$fileTotal = $fileTotalBytes / 1024 / 1024
    }
    while ($fileTotal -le $maxSize)
    
    #Creation of the new content database and site collection
    $siteCollectionURL = $webApp + $siteCollection
    New-SPContentDatabase $dbName -WebApplication $webApp
    New-SPSite -Url $siteCollectionURL -OwnerAlias $siteOwner -Name "Test Doc Library" -Template $template -ContentDatabase $dbName
    
    #Upload all the generated data into the $siteCollectionURL/Documents folder
    $spWeb = Get-SPWeb -Identity $siteCollectionURL
    $spFolder = $spWeb.GetFolder("Documents")
    $spFileCollection = $spFolder.Files
    Get-ChildItem $fileSource | ForEach-Object {
    	$stream = $_.OpenRead()
    	$spFileCollection.Add("Documents/$($_.Name)", $stream, $true) | Out-Null
    	$stream.Close()
    }
    $spWeb.Dispose()
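
    Since the whole point of the separate content database and site collection is easy cleanup, tearing everything down afterwards looks roughly like this (a sketch, reusing the same values as above):

    #Remove the test site collection and its dedicated content database
    Remove-SPSite -Identity "https://portal/sites/1" -Confirm:$false
    Remove-SPContentDatabase -Identity "Portal_ContentDB2" -Confirm:$false
    #Delete the intermediate files from disk
    Remove-Item -Path "F:\TestData" -Recurse -Force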
  • SharePoint 2010 Load Testing Kit

    Was looking for ways to generate synthetic test data for a SharePoint out-of-the-box install today, and ran into the SharePoint 2010 Load Testing Kit.  While it doesn’t help me in this stage of the project, I could see it being useful later or on other projects.

    There appear to be a lot of dependencies, though:

    • Migration from 2007 to 2010
    • As it collects info from your log files, you’ll need to have everything migrated for the scripts to work
      • Data
      • Apps
      • Site Collections
      • Etc.

    Could be hot though!

  • Search Schedule Script

    To set up the crawl configuration for the default local sites, you can use the script below:

    $ssaName = "Search Service Application"
    $context = [Microsoft.Office.Server.Search.Administration.SearchContext]::GetContext($ssaName)
    
    #Incremental crawl: daily, starting 23 Oct 2011 at 00:00,
    #repeating every 720 minutes (12 hours) over a 1440-minute (24-hour) window
    $incremental = New-Object Microsoft.Office.Server.Search.Administration.DailySchedule($context)
    $incremental.BeginDay = 23
    $incremental.BeginMonth = 10
    $incremental.BeginYear = 2011
    $incremental.StartHour = 0
    $incremental.StartMinute = 0
    $incremental.DaysInterval = 1
    $incremental.RepeatInterval = 720
    $incremental.RepeatDuration = 1440
    
    #Full crawl: weekly, starting 23 Oct 2011 at 06:00
    $fullCrawl = New-Object Microsoft.Office.Server.Search.Administration.WeeklySchedule($context)
    $fullCrawl.BeginDay = 23
    $fullCrawl.BeginMonth = 10
    $fullCrawl.BeginYear = 2011
    $fullCrawl.StartHour = 6
    $fullCrawl.StartMinute = 0
    $fullCrawl.WeeksInterval = 1
    
    $contentsource = Get-SPEnterpriseSearchCrawlContentSource -SearchApplication $ssaName -Identity "Local SharePoint Sites"
    
    $contentsource.IncrementalCrawlSchedule = $incremental
    $contentsource.FullCrawlSchedule = $fullCrawl
    $contentsource.Update()
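
    After saving the schedules, you may want to kick off the first full crawl immediately rather than waiting for the scheduled slot; the content source object exposes this directly.  A sketch, reusing $contentsource from above:

    #Start a full crawl of the content source right away
    $contentsource.StartFullCrawl()
    #Check progress; a CrawlStatus of Idle means the crawl has finished
    $contentsource.CrawlStatus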