The best way to do this is to run the command
netstat -lnptu | grep :<port#>
This will give you the PID of the listening service (you may need to run it as root for the PID column to be populated), and then you can run
kill <PID>
or
kill -9 <PID>
if a regular kill doesn’t work.
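The kill half of this can be sketched end-to-end with a throwaway process — here a background sleep stands in for the service whose PID netstat would have reported:

```shell
# Demo only: a background sleep plays the role of the stuck service.
sleep 300 &
pid=$!                     # in real life, this PID comes from the netstat output
kill "$pid"                # polite SIGTERM first
wait "$pid" 2>/dev/null    # reap it so the PID is fully gone
if ! kill -0 "$pid" 2>/dev/null; then
    echo "process $pid is gone"
fi
```

If the process survives a plain kill, that is when kill -9 comes in.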
As I can’t find this nicely spelled out anywhere, here are the economy fare class codes that you need to book to ensure you can use your system wide upgrades on any flight: Y, B, M, E, U, H, Q, V, W. And for Business: J, C, D. This is especially true for those of us flying in and out of Australia…
[1] http://pss.united.com/web/en-US/content/mileageplus/awards/upgrade/default.aspx
[2] http://www.united.com/web/en-US/content/booking/flight/fareClass.aspx
The other day I was playing around with nopCommerce. There was some talk about it internally, and I thought I’d see what it was all about. I didn’t get very far before realizing the installation instructions were definitely missing a few steps. The guys over there have outlined most of the steps in the documentation, but they’ve forgotten a few:
There are other OWASP and scalability best practices that I may go into later if I really dig down further, but three immediately stand out:
I’ve added an rsync job to my crontab file in order to back up all the websites I serve from Dreamhost (including this one). The job is set to run every night at midnight, starting last night. Unfortunately, it didn’t run.
This is because the crond service needs to be recycled in order to pick up new jobs (also, don’t update your DSM, because that seems to blow the jobs away). As this is a non-standard Linux distro, you need to restart crond the following way:
/usr/syno/etc.defaults/rc.d/S04crond.sh stop
/usr/syno/etc.defaults/rc.d/S04crond.sh start
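For reference, a nightly rsync backup entry in the crontab looks something like this — the host, remote path, and local target are made-up examples, not the real job:

```shell
# Hypothetical crontab line for a midnight rsync backup.
# Field order: minute hour day-of-month month day-of-week command
CRON_LINE='0 0 * * * rsync -az -e ssh user@example.dreamhost.com:~/sites/ /volume1/backup/sites/'
echo "$CRON_LINE"
```

After adding a line like this, crond has to be restarted before it will notice the new job.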
This one wasn’t completely obvious, but I think I’ve managed to figure it out. It at least appears to be working correctly, and I’m assuming it will work a bit better still once the media scan is complete.
Steps for the Server:
The steps for the client on a Samsung TV with SmartHub are broken into two options: an installer hosted on your own server, or one hosted by someone else. It doesn’t matter where you get the installer from, as you can specify the Plex Server after the application is installed.
Hosted on your NAS:
Hosted by someone else:
These install instructions were taken from the Plex forums.
Update 1/6/2012: The crawler has completed, and it does actually work! I also found out that it only supports TV shows right now, not music or photos. Looking into it, it’s just a webpage with a lot of JavaScript. If I have time, I may look at adding music in, as having one solution for everything is a lot better than running both this and DLNA!
Update 8/19/2012: Instead of going through all of this, just grab the Plex app from the Samsung App Store!
Well, it’s good to see that User Profile Sync can be better in 2010 than it was in 2007. However, there are definitely some issues still.
The first issue, which we just noticed, was that the User Profile Sync jobs were constantly failing. Unfortunately, there isn’t really a good way to know without going into the MIISClient program to look at the errors. Basically, if you think, for whatever reason, that profile sync is not working, open up MIISClient.exe (Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell) as the farm account and take a look to see if everything is a success.
For us, we were seeing all the MOSS-{guid} jobs failing with the error stopped-extension-dll-exception as you can see below.
Based on the lovely error message, I’m still amazed that I was able to isolate this issue (event logs reported that CA was being accessed via a non-registered name). However, it turns out it is because of alternate access mappings (AAMs) for the central admin (CA) website. Normally, SharePoint sets up the AAM for CA as the machine name you first install SharePoint on to. However, we changed the AAM to be a more friendly name.
When you update the “Public URL for Zone” for the CA website, it does not propagate the change into the MIISClient. This causes the MIISClient to not correctly access the CA APIs for the user profile sync (or at least I am imagining this is the case).
Fix it with the following steps:
These instructions were ripped verbatim from Kenneth Larsen’s blog because it just worked! You can either use vi or nano to edit the files.
Update 8/19: Just to follow up on this, there is a much better way to get this working now (and it is what I use). Check it out at http://pcloadletter.co.uk/2012/01/30/crashplan-syno-package/
I’ve been meaning to do this for a while, but I hadn’t found a suitable replacement until recently. I am decommissioning the server at home. It’s loud, large, and sucks down a lot of power for what I use it for (Windows Home Server). It was nice because I could quickly and easily spin up some VMs and poke around, but I’ll still be able to do that.
Instead, I picked up a Synology DS1511+ NAS. This little appliance is pretty darn slick. It can pretty much do everything I was doing, in a smaller, quieter, and cooler form factor. Since it uses an Atom processor, it runs a fairly familiar flavor of Linux, so you can do quite a bit with it. Plus, a lot of the default stuff it comes with is quite nice!
I’ll be throwing up a few copy/pastes on the site so that I can quickly re-reference. Oh, and there’s another SharePoint article in the works too. Busy, busy!
Still trying to work through creating synthetic data for an out-of-the-box SharePoint performance test. To create the data, the script creates a new site collection (so it doesn’t interfere with anything else and is easy to clean up) and uploads all the test data into it. The biggest downside right now is that the data is generated on disk and then uploaded, which requires enough local disk space to hold it all. Not a huge issue for me, but possibly for you.
The general idea for the upload came from a few places, and the file creation was put together locally.
#USER Defined Variables
#Specify the extension types of files you want uploaded
$strDocTypes = @(".docx", ".xlsx", ".pptx", ".pdf")
#The max amount of data generated in MB
$maxSize = 50
#The max size one file could be in MB
$maxFileSize = 10
#Intermediate folder where the test data is placed
$fileSource = "F:\TestData"
#New content database (for easy removal)
$dbName = "Portal_ContentDB2"
#New site collection template
$template = "SPSPORTAL#0"
#Account owner
$siteOwner = "TEST\Administrator"
#Web application address
$webApp = "https://portal"
#Site collection address
$siteCollection = "/sites/1"

#Do not edit anything beyond this line

#Create all the test data using FSUTIL
#Generation keeps going until the amount of data exceeds $maxSize
$rand = New-Object System.Random
do {
    $guid = [guid]::NewGuid().ToString()
    #Build the full path so the files land in $fileSource rather than the current directory
    $fileName = Join-Path $fileSource ($guid + $strDocTypes[$rand.Next(0, $strDocTypes.Length)])
    $rand1 = $rand.NextDouble()
    $rand2 = $rand.NextDouble()
    $rand3 = $rand.NextDouble()
    [int]$fileSize = 1048576 * $rand1 * $rand2 * $rand3 * $maxFileSize
    FSUTIL FILE CREATENEW $fileName $fileSize
    $fileTotalBytes = $fileTotalBytes + $fileSize
    $fileTotal = $fileTotalBytes / 1024 / 1024
} while ($fileTotal -le $maxSize)

#Create the new content database and site collection
$siteCollectionURL = $webApp + $siteCollection
New-SPContentDatabase $dbName -WebApplication $webApp
New-SPSite -Url $siteCollectionURL -OwnerAlias $siteOwner -Name "Test Doc Library" -Template $template -ContentDatabase $dbName

#Upload all of the generated data into the $siteCollectionURL/Documents library
$spWeb = Get-SPWeb -Identity $siteCollectionURL
$spFolder = $spWeb.GetFolder("Documents")
$spFileCollection = $spFolder.Files
Get-ChildItem $fileSource | ForEach-Object {
    $spFileCollection.Add("Documents/$($_.Name)", $_.OpenRead(), $true)
}
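The size-generation idea in the script above — multiply a few uniform randoms so file sizes skew small, and loop until the running total passes a cap — can be sketched in plain shell. The byte caps and filenames here are tiny made-up demo values, not the script’s real parameters:

```shell
# Demo of the random-size loop: sizes are a product of three randoms
# (biased toward small files); generation stops once the total passes the cap.
max_total=10240   # cap on total bytes (demo value)
max_file=4096     # cap on a single file's size (demo value)
total=0
count=0
while [ "$total" -le "$max_total" ]; do
    # product of three uniform randoms in 1..100, scaled into 1..max_file
    r=$(( (RANDOM % 100 + 1) * (RANDOM % 100 + 1) * (RANDOM % 100 + 1) ))
    size=$(( r * max_file / 1000000 + 1 ))   # +1 guarantees forward progress
    head -c "$size" /dev/zero > "testfile_$count.bin"
    total=$(( total + size ))
    count=$(( count + 1 ))
done
echo "created $count files, $total bytes total"
```

The multiplication is what makes the distribution realistic: most documents end up small, with the occasional large one near the per-file cap.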
Was looking for ways to generate synthetic test data for a SharePoint out-of-the-box install today, and ran into the SharePoint 2010 Load Testing Kit. While it doesn’t help me in this stage of the project, I could see it being useful later or on other projects.
There appear to be a lot of dependencies, though:
Could be hot though!