Proxy Other Ports to Port 80 on Synology

OK, so you want to access services running on arbitrary ports on your Synology NAS over the standard web ports (80 and 443).  To do that, you need to do the following:

  1. Enable Web Station from the Web Services icon in Control Panel.  This gives you Apache.
  2. Add the following lines to the end of /usr/syno/apache/conf/httpd.conf-user (you must re-do this after every DSM update).  Replace <accessPath> with a name and <port> with the port the service is running on; for example, a service at http://localhost:8080/party would use an <accessPath> of party and a <port> of 8080 (a filled-in example appears after these steps).
    LoadModule proxy_module modules/mod_proxy.so
    LoadModule proxy_http_module modules/mod_proxy_http.so
    ProxyRequests Off
    ProxyPreserveHost On
    
    <Location /<accessPath>>
    ProxyPass http://localhost:<port>/<accessPath>
    ProxyPassReverse http://localhost:<port>/<accessPath>
    </Location>
    
  3. Restart Apache with "/usr/syno/etc.defaults/rc.d/S97apache-user.sh restart"

Just make sure whatever service you are doing this with has a base url of <accessPath>.
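
To make this concrete, here is a minimal sketch of steps 2 and 3 for the http://localhost:8080/party example above, run as root over SSH (the proxy directives are exactly the ones from the list, just with the placeholders filled in):

cat >> /usr/syno/apache/conf/httpd.conf-user <<'EOF'
LoadModule proxy_module modules/mod_proxy.so
LoadModule proxy_http_module modules/mod_proxy_http.so
ProxyRequests Off
ProxyPreserveHost On

<Location /party>
ProxyPass http://localhost:8080/party
ProxyPassReverse http://localhost:8080/party
</Location>
EOF

/usr/syno/etc.defaults/rc.d/S97apache-user.sh restart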

UPDATE:  Just updated to DSM 5.0 and a few things have changed:

  1. /usr/syno/apache/ has moved to /etc/httpd/
  2. /usr/syno/etc/rc.d/S97apache-user.sh is gone, so just use 'httpd -k stop|start|restart'
  3. You may need to create the folder /var/services/web/internal for httpd to restart (see the sketch below)
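
A minimal sketch of the DSM 5.0 version of the restart step, using only the commands mentioned above:

# DSM 5.0: create the folder httpd expects, then restart it
mkdir -p /var/services/web/internal
httpd -k restart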

My Perfect Windows 8 Tablet

With Computex in full swing and Windows 8 Release Preview being released last week, I think it’s time I start thinking about a new tablet.  A few of my colleagues and I have been really excited about having a tablet that can also double as a media center.  In fact, one of them currently uses his Samsung Slate in just such a fashion.

  • 10″ or 11″ display with >=200ppi
  • Minimum of 8 hour battery life
  • Ivy Bridge CPU
  • Active Digitiser for pen support (yay OneNote!)
  • TPM Module
  • WIDI or WHDI
  • USB 3 port
  • HDMI port
  • 64-128GB SSD
  • >=4GB RAM (would like it to be upgradeable)
  • Ideally, the ability to drive 2 24″ external monitors

Based on what I’ve seen so far, this definitely looks like it could be possible in the very near future!

Update: Well, after the MSFT announcement, it looks like we have a tablet that is definitely a front-runner (Pro).  There are some outstanding questions, but it looks like it ticks most of the boxes…

  • 10″ or 11″ display with >=200ppi – 207 ppi!
  • Minimum of 8 hour battery life – No idea yet…
  • Ivy Bridge CPU – Yes!
  • Active Digitiser for pen support (yay OneNote!) – Yes!
  • TPM Module – We think so!
  • WIDI or WHDI – No idea yet…
  • USB 3 port – Yes!
  • HDMI port – DisplayPort gives the same flexibility!
  • 64-128GB SSD – Yes!
  • >=4GB RAM (would like it to be upgradeable) – Not sure, but should be 4GB!
  • Ideally, the ability to drive 2 24″ external monitors – No, but a Matrox DualHead2Go would work.

Find PID of Application Using a Port

The best way to do it is to run the command

netstat -lnptu | grep :<port#>

This will give you the PID of the service, and then you can run

kill <PID>

or

kill -9 <PID>

if just a regular kill doesn’t work.
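
Putting the two together, a quick sketch for a hypothetical service listening on port 8080 (the port and the PID are made-up examples):

# The PID/program name is in the last column of the matching line
netstat -lnptu | grep :8080
# Suppose the output shows PID 12345; try a normal kill first
kill 12345
# Only if it refuses to die
kill -9 12345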


nopCommerce Install

The other day I was playing around with nopCommerce.  There was some talk about it internally, and I thought I’d see what it was all about.  I didn’t get very far before realizing the installation instructions were missing a few steps.  The guys over there have outlined most of the steps in the documentation, but they’ve forgotten a few:

  1. Ensure that your worker process (the identity the AppPool runs under) has the ability to create a database if you check the “Create database if it doesn’t exist” box.
  2. How to access the installation page: browse to http://site/views/install/default.aspx

There are other OWASP and scalability best practices that I may go into later if I dig down further, but three issues immediately stand out:

  1. Unencrypted DB connection string info
  2. compilation debug="true" being set in the web.config
  3. Single DB

Synology DS1511+ and Crontab

I’ve added an rsync job to my crontab file in order to back up all the websites I have served from Dreamhost (including this one).  The job is set to run every night at midnight, starting last night.  Unfortunately, it didn’t run.

This is because the cron daemon needs to be restarted in order to pick up new jobs (also, don’t update your DSM, because that seems to blow the jobs away).  As this is a non-standard Linux distro, you need to restart cron the following way:

/usr/syno/etc.defaults/rc.d/S04crond.sh stop
/usr/syno/etc.defaults/rc.d/S04crond.sh start
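
For reference, the kind of crontab entry I am talking about looks roughly like this (a sketch only: the host and paths are placeholders, and I am assuming the system crontab format with a user column):

# Every night at midnight, pull the Dreamhost sites down to the NAS
0 0 * * * root rsync -az user@dreamhost.example.com:~/sites/ /volume1/backup/sites/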


Synology Plex Media Server and Samsung Smart TV Client

This one wasn’t completely obvious, but I think I’ve managed to figure it out.  It at least appears to be working correctly, and hopefully it works even better once the media scan is complete.

Steps for the Server:

  1. Grab the spk from http://www.plexapp.com/linux/linux-pms-download.php.
  2. Log into DSM and in Package Installer, install the downloaded spk.
  3. After it is installed, visit the website at http://<nas-server>:32400.  It doesn’t look like the shortcut that is created works (a quick reachability check is sketched after this list).
  4. Add in the locations to your media.
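
If you want to confirm the server is actually answering before setting up any clients, a quick sketch from another machine on the network (replace nas-server with your NAS hostname or IP):

# The management page from step 3 should respond on port 32400
curl -I http://nas-server:32400/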

The steps for the client on a Samsung TV with Smart Hub are broken up into two options: the installer hosted on your own server, or hosted on someone else’s.  It doesn’t matter where you get the installer from, as you can specify the Plex server after the application is installed.

Hosted on your NAS:

  1. In Control Panel, enable Web Station under Web Services
  2. Copy the installer (link) to the web share that was created in step 1
  3. Copy the widgetlist.xml (link) to the web share that was created in step 1
  4. Edit the widgetlist.xml to contain the IP of your NAS (or the URL where the installer is located); a command-line sketch of steps 2-4 follows this list
  5. On the TV, open the Smart Hub
  6. Log in as a different user (A/red button)
    • User: develop
    • Password: 123456
  7. Click the Settings button (D/blue button)
  8. Select Development
  9. Set the Server IP to that of your NAS
  10. Select User Application Synchronisation
  11. Once the installation is finished, restart your TV
  12. Visit SmartHub and Plex is installed.
  13. Point Plex at your Plex Media server.
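
For steps 2-4, a rough command-line sketch run on the NAS over SSH; the file names are placeholders for whatever the links above give you, and /volume1/web is assumed to be the Web Station share:

# Copy the installer and widget list into the Web Station share
cp SamsungPlexClient.zip /volume1/web/
cp widgetlist.xml /volume1/web/
# Edit the widget list so its installer URL points at your NAS IP
vi /volume1/web/widgetlist.xml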

Hosted by someone else:

  1. On the TV, open the Smart Hub
  2. Log in as a different user (A/red button)
    • User: develop
    • Password: 123456
  3. Click the Settings button (D/blue button)
  4. Select Development
  5. Set the Server IP to 92.50.72.58
  6. Select User Application Synchronisation
  7. Once the installation is finished, restart your TV
  8. Visit SmartHub and Plex is installed.
  9. Point Plex at your Plex Media server.

These install instructions were taken from the Plex forums.

Update 1/6/2012: The crawler has completed, and it does actually work!  I also found out that it only supports TV shows right now, and not music or photos.  Looking into it, it’s just a webpage with a lot of JavaScript.  If I have time, I may look at adding music, as having one solution for everything is a lot better than running both this and DLNA!

Update 8/19/2012: Instead of going through all of this, just grab the Plex app from the Samsung App Store!


SharePoint 2010 User Profile Sync: stopped-extension-dll-exception

Well, it’s good to see that User Profile Sync can be better in 2010 than it was in 2007.  However, there are definitely some issues still.

The first one, which is something we just noticed, was that the User Profile Sync jobs were constantly failing.  Unfortunately, there isn’t really a good way to know without going into the MIISClient program to look at the errors.  Basically, if you think, for whatever reason, profile sync is not working, open up MIISClient.exe (Program Files\Microsoft Office Servers\14.0\Synchronization Service\UIShell) as the farm account and take a look to see if everything is a success.

For us, we were seeing all the MOSS-{guid} jobs failing with the error stopped-extension-dll-exception.

Based on the lovely error message, I’m still amazed that I was able to isolate this issue (the event logs reported that CA was being accessed via a non-registered name).  However, it turns out it is because of the alternate access mappings (AAMs) for the Central Administration (CA) website.  Normally, SharePoint sets up the AAM for CA as the machine name you first install SharePoint onto.  However, we changed the AAM to be a more friendly name.

When you update the “Public URL for Zone” for the CA website, it does not propagate the change into the MIISClient.  This causes the MIISClient to not correctly access the CA APIs for the user profile sync (or at least I am imagining this is the case).

Fix it with the following steps:

  1. Open MIISClient.exe as the farm account.
  2. Tools > Management Agents (or click Management Agents in the toolbar)
  3. Right-click on the MOSS-{guid} management agent and select Properties
  4. Go to the Configure Connection Information section in the left-hand pane
  5. In the connection information box, change the Connect To URL to be the same URL as listed as the “Public URL for Zone” for your CA in the AAM configuration.
  6. Re-enter the farm account username and password for good measure
  7. Save the configuration
  8. Run a full profile sync from CA

Synology DS1511+ and CrashPlan

These instructions were ripped verbatim from Kenneth Larsen’s blog because it just worked!  You can either use vi or nano to edit the files.

  1. Download the latest release of the Linux CrashPlan client from the CrashPlan website, along with the client for your operating system if you use an operating system other than Linux.
  2. Upload the Linux client to the NAS and log in to your NAS as root using SSH.
  3. You need ipkg installed for the next steps. If you haven’t done so already, you can follow this guide: http://forum.synology.com/wiki/index.php/Overview_on_modifying_the_Synology_Server,_bootstrap,_ipkg_etc#What_is_a_Synology_Server
  4. You will need to install a few extra packages on your NAS from the command line:
    1. ipkg update
    2. ipkg install nano
    3. ipkg install cpio
    4. ipkg install bash
    5. ipkg install coreutils
  5. Move the uploaded client package to /opt
  6. Unpack the crashplan client: tar -xvf name_of_downloaded_archive_file
  7. Modify the install.sh script in the newly created directory to use bash as your shell. The first line in the script should be replaced with this one: #!/opt/bin/bash
  8. Install CrashPlan using the options below. When asked about Java, allow it to be downloaded:
    • CrashPlan will install to: /opt/crashplan
    • And put links to binaries in: /opt/bin
    • And store data in: /opt/crashplan/manifest
    • Your init.d dir is: /etc/init.d
    • Your current runlevel directory is: /usr/syno/etc/rc.d
  9. Modify /opt/crashplan/bin/run.conf by adding -Djava.net.preferIPv4Stack=true as an additional option at the end of the two configurations (this was already added when I did the install)
  10. Remove the command-line options for the ps process in the /opt/crashplan/bin/CrashPlanEngine file, since ps doesn’t accept parameters on the Synology NAS: sed -i 's/ps -eo /ps /' CrashPlanEngine; sed -i 's/ps -p /ps /' CrashPlanEngine
  11. Modify line 1 of /usr/syno/etc/rc.d/S99crashplan to: #!/opt/bin/bash
  12. Modify line 1 of /opt/crashplan/bin/CrashPlanEngine to: #!/opt/bin/bash
  13. Modify line 14 of /opt/crashplan/bin/CrashPlanEngine to use the full path for nice: /opt/bin/nice
  14. Start the CrashPlan service: /usr/syno/etc/rc.d/S99crashplan start
  15. Validate that the service is running: netstat -an | grep ':424.' should show two listeners:
    • tcp        0      0 0.0.0.0:4242            0.0.0.0:*               LISTEN
    • tcp        0      0 127.0.0.1:4243          0.0.0.0:*               LISTEN
  16. Edit /etc/rc.local and add "/usr/syno/etc/rc.d/S99crashplan start" (without the quotes); that seems to make it load after a restart (a sketch of steps 14-16 follows this list).
  17. Install your desktop client and point it towards the headless service you just installed. Follow the instructions on the CrashPlan website for this (http://support.crashplan.com/doku.php/how_to/configure_a_headless_client)
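
A small sketch of steps 14-16 from the shell (the grep pattern and expected listeners come straight from the list above):

# Start the engine and confirm both listeners (4242 and 4243) are up
/usr/syno/etc/rc.d/S99crashplan start
netstat -an | grep ':424.'
# Have it start automatically after a reboot
echo "/usr/syno/etc/rc.d/S99crashplan start" >> /etc/rc.local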

Update 8/19: Just to post a reply to this: there is a much better way to get this working now (and it is what I use).  Check it out at http://pcloadletter.co.uk/2012/01/30/crashplan-syno-package/


New Home Server Setup

I’ve been meaning to do this for a while, but I hadn’t found a suitable replacement until recently.  I am decommissioning the server at home.  It’s loud, large, and sucks down a lot of power for what I use it for (Windows Home Server).  It was nice because I could quickly and easily spin up some VMs and poke around, but I’ll still be able to do that.

Instead, I picked up a Synology DS1511+ NAS.  This little appliance is pretty darn slick.  It can pretty much do everything I was doing, in a smaller, quieter, and cooler form factor.  Since it uses an Atom processor, it runs a fairly familiar flavor of Linux, so you can do quite a bit with it.  Plus, a lot of the default stuff it comes with is quite nice!

I’ll be throwing up a few copy/pastes on the site so that I can quickly re-reference.  Oh, and there’s another SharePoint article in the works too.  Busy, busy!

SharePoint 2010 Synthetic File Data

Still trying to work through creating synthetic data for an out-of-the-box SharePoint performance test.  The script below creates the data, creates a new site collection (so it doesn’t interfere with anything else and is easy to clean up), and uploads all the test data.  The biggest downside right now is that the data is created and then uploaded, which requires enough disk space to generate the data first.  Not a huge issue for me, but possibly for you.

General idea came from a few places for the upload, and locally for the file creation.

#USER Defined Variables
#Specify the extension type of files you want uploaded
$strDocTypes = @(".docx", ".xlsx", ".pptx", ".pdf")
#The max amount of data generated in MB
$maxSize = 50
#The max size one file could be in MB
$maxFileSize = 10
#Intermediate folder where the test data is placed
$fileSource = "F:\TestData"
#New Content Database (for easy removal)
$dbName = "Portal_ContentDB2"
#New Site collection template
$template = "SPSPORTAL#0"
#Account owner
$siteOwner = "TEST\Administrator"
#Web Application address
$webApp = "https://portal"
#Site Collection Address
$siteCollection = "/sites/1"
# DO not edit anything beyond this line

#Create all the test data using FSUTIL

#Make sure the intermediate folder exists; FSUTIL will not create it
New-Item -ItemType Directory -Path $fileSource -Force | Out-Null

$rand = New-Object system.random
$fileTotalBytes = 0
do {
	$guid = [guid]::NewGuid()
	$guid =  $guid.ToString()
	$fileName = $guid+$strDocTypes[$rand.next(0,$strDocTypes.length)]
	$rand1 = $rand.nextdouble()
	$rand2 = $rand.nextdouble()
	$rand3 = $rand.nextdouble()
	[int]$fileSize = 1048576*$rand1*$rand2*$rand3*$maxFileSize
	#Create the file directly in the intermediate folder so the upload step can find it
	FSUTIL FILE CREATENEW "$fileSource\$fileName" $fileSize
	$fileTotalBytes = $fileTotalBytes + $fileSize
	$fileTotal = $fileTotalBytes/1024/1024
}
#Data generation keeps going until the amount of data is > $maxSize
while ($fileTotal -le $maxSize)

#Creation of the new content database and site collection
$siteCollectionURL = $webApp + $siteCollection
New-SPContentDatabase $dbName -WebApplication $webApp
New-SPSite -url $siteCollectionURL -OwnerAlias $siteOwner -Name "Test Doc Library" -Template $template -ContentDatabase $dbName

#uploading of all the generated data into the $siteCollectionURL/Documents folder
$spWeb = Get-SPWeb -Identity $siteCollectionURL
$listTemplate = [Microsoft.SharePoint.SPListTemplateType]::DocumentLibrary
$spFolder = $spWeb.GetFolder("Documents")
$spFileCollection = $spFolder.Files
Get-ChildItem $fileSource | ForEach {
	$spFileCollection.Add("Documents/$($_.Name)",$_.OpenRead(),$true)
}
#Dispose of the SPWeb object when finished
$spWeb.Dispose()