
Smart Home Adventures

Ever since I bought my own home nearly a year ago, I've become increasingly interested in making it smart.

Right off the bat, I feel like I should clarify what that means to me. The ability to turn some lights on or off with an app is not smart, in my opinion – the smart way of controlling lights is by flicking a switch conveniently located in the room you wish to illuminate.

A smart home needs to be much more intelligent. It's about automation. It's about the home being able to notify me if something is happening that I need to know about. It's about being able to accomplish things with minimal difficulty, not adding complexity and more steps.

That's where off-the-shelf "smart home" solutions really started to fall down for me. I could spend hundreds or maybe even thousands of dollars, and for what? The ability to turn on my living room lights while I'm still at the office? Why would I ever need to do that?

Nevertheless, the lack (in my opinion) of a pre-packaged, useful, holistic solution that accomplished my vision of what a "smart home" should be didn't deter me from tackling things bit by bit. It started with our burglar alarm, which has internet connectivity: it sends me alerts in the event that something unexpected is happening, and it lets me arm or disarm the system from my phone – which I actually do find useful.

Next up was our thermostat. The one that was installed when we bought the house was an old-fashioned one with a simple mercury switch inside. You set the temperature, and that was it. We replaced that about a month ago with something programmable (it doesn't need to be as warm in here at night as it does during the day; it doesn't need to be as warm if nobody's home), and I took the opportunity to get one with WiFi so I can set the temperature remotely. That's not useful in and of itself, but if you take that functionality and look at it in the context of my wider vision, then the thermostat is certainly something I'd like to be able to control programmatically.

It was around this same time that I discovered Home Assistant, and now my dream is starting to come alive.


Home Assistant is an open-source project that runs on a variety of hardware (I was originally running it on a Raspberry Pi, and I've since switched to running it in a Docker container on our home server). It has a ton of plugins ("components") that enable it to support a variety of products, including our existing alarm, thermostat, and streaming media players (and, somewhat ironically, the colour-changing lightbulbs we have in our family room). It can run scripts and automations, it uses our cellphones to know our locations, and it can send us push notifications.

My initial setup was all about notifications. If we both leave the house but the burglar alarm isn't set, then it tells us (and provides an easy way to fix the issue). If we leave one of the exterior doors open for more than five minutes, it notifies us (or just one of us, if the other is out). I also created a dashboard (that you may have seen in my last post) to display some of this stuff on a monitor in my office.
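For a flavour of how these are wired up, here's a minimal sketch of the door alert as a Home Assistant YAML automation. The entity and notifier names are invented for the example; yours will depend on your own setup:

    automation:
      - alias: "Exterior door left open"
        trigger:
          platform: state
          entity_id: binary_sensor.back_door   # hypothetical door sensor
          to: "on"                             # "on" means open for a door sensor
          for:
            minutes: 5
        action:
          service: notify.our_phones           # hypothetical notification target
          data:
            message: "The back door has been open for five minutes."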

Since installing the thermostat I've added more automation. The time we go to bed isn't always predictable, but when we do go to bed we set the alarm. So, if it's after 7pm and the alarm goes from disarmed to armed, the thermostat gets put into night mode. If nobody is home, then the temperature gets gradually turned down based on how far away we are.
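The night-mode switch, for example, is a single automation. A minimal sketch in the same hedged style – the entity names are invented and the setpoint is just an example:

    automation:
      - alias: "Night mode when the alarm is armed after 7pm"
        trigger:
          platform: state
          entity_id: alarm_control_panel.house   # hypothetical alarm entity
          from: "disarmed"                       # fires on arming, home or away
        condition:
          condition: time
          after: "19:00:00"
        action:
          service: climate.set_temperature
          data:
            entity_id: climate.thermostat        # hypothetical thermostat entity
            temperature: 16                      # example night-time setpoint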

If nobody is home at dusk then it turns on some lights and
streams talk radio through the family room speakers to give the impression that
someone is.
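That one hangs off a sun trigger. Roughly, with all names invented and a placeholder stream URL:

    automation:
      - alias: "Simulate occupancy at dusk"
        trigger:
          platform: sun
          event: sunset
        condition:
          condition: state
          entity_id: group.household              # hypothetical presence group
          state: "not_home"
        action:
          - service: light.turn_on
            entity_id: light.family_room          # hypothetical light group
          - service: media_player.play_media
            data:
              entity_id: media_player.family_room # hypothetical speaker
              media_content_id: "http://example.com/talk-radio-stream"
              media_content_type: "music"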

This stuff meets my definition of smart, and I'm barely scratching the surface. The open nature of the platform not only means that I'm not tied to a particular vendor or technology, but also means that I can add on to the system in a DIY way.

Which is exactly what I'm going to do. I've bought some NodeMCU microcontrollers: WiFi-enabled, Arduino IDE-compatible development boards designed to be the basis for DIY electronics projects.

Watch this space, because over the coming months I'll be connecting our doorbell, garage door and laundry appliances to Home Assistant. I'll be learning as I go, and I'll share the hardware and software.


TorrentApp

I have a small app on my computer that I wrote myself. It's simple, and it's the default application for opening BitTorrent files on our computers. When I download one of these files, the app moves it to a folder on the server. That folder is watched by my torrent client of choice, which runs on the server and immediately starts the download when it sees the file.

The app then pops up a notification asking the user if they want to be directed to the Deluge web interface to see the download progress.

I rewrote the app about a year ago. The original version was written in RealStudio, but the location of the watched folder and the URL for Deluge's web interface were hard-coded in: a reasonable design decision given it was just a small app for my use only, but still a poor one – when a change I made to my network configuration required me to adjust these variables, I no longer had a copy of RealStudio available.

I wrote a new version in Visual Basic 2010 Express, and this time I did a little extra work to take the configuration variables out of the source code and put them into an .ini file.
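For the curious, the .ini file is nothing fancy – something along these lines, though the key names and values here are invented for illustration (the real ones are in the source):

    [TorrentApp]
    ; where downloaded .torrent files get moved to
    WatchedFolder=\\server\torrents\watch
    ; the Deluge web interface to offer to open
    DelugeWebUI=http://server:8112/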

Why am I telling you all this?

Well, not that I think you'd need the app, but I have today made the source code (and the compiled executable, for good measure) publicly available on my brand new GitLab account!

I've been using Git for a while (and I've written about it once or twice before), but I really haven't been taking advantage of its feature set.

I'm working on something right now that's big and complex, and I value having version control and branches to work with. I already have Git installed on my servers (both my home server and my public webserver), but I've downloaded a Windows Git client to complement that setup and opened a GitLab account to use as an external repository and a means to eventually make a finished product public.

Why have I chosen GitLab over the more ubiquitous GitHub? GitHub makes you pay to host private repositories, and I want somewhere I can both host code that's a work in progress (and not ready for public distribution) and distribute completed code that's ready for download, public review and maybe even improvement by the wider community. GitLab gives me free private repositories for partially-completed things that I can later make public once I'm ready.

I've already created a couple of public repositories, mostly to test the platform out, and TorrentApp is one of them.

So use it if it's a tool that might be useful to you, improve upon it if you have the expertise, and send me a merge request so I can incorporate your changes into the code!


SharePoint on a Domain Controller Revisited

On Tuesday I wrote about installing SharePoint Foundation 2010 on my home Windows server, which also acts as a domain controller, and I concluded by saying that I'd encountered performance issues as a result of that (non-recommended) setup.

Turns out, the performance issues were a complete coincidence,
and everything is now running just fine.

The problem I was experiencing was that two of my three DNS forwarders weren't working correctly. Now that my service provider has corrected their issue, everything is great.


For a small setup like mine, I'd say go ahead and install SQL Server Express and SharePoint on the domain controller. It works great!


Installing SharePoint Foundation on a Domain Controller

It's been a long time since I blogged about SharePoint, and that's largely because I haven't had a need to develop anything custom on top of the platform for quite some time.

If you've been following along for a long time, you may recall that back at the start of last year I installed SharePoint Foundation on a Windows 7 virtual machine at home for testing purposes. And, while I didn't blog about it explicitly, when I upgraded my home server last August I replaced that Windows 7 virtual machine (which ran on my laptop) with an always-on Windows Server 2008 R2 VM, again running SharePoint Foundation.

As my home network continued to evolve I turned that Windows Server VM into a domain controller, and this broke my SharePoint installation – but by then it wasn't all that important and I didn't need it for work anymore, so I simply uninstalled it.

Recently, I've been missing having SharePoint's functionality at home. In particular, I wanted a shared calendar for Flo and me, and a place for shared documents. We can achieve much of this with Google Calendar and our existing shared folders (and I already have a tool deployed that makes our network shares available from outside our home network), but it all feels a little kludged together, and it's lacking features like NTLM-based SSO and an easy way to edit files from the web interface, both of which SharePoint provides out of the box. I looked at a couple of alternative solutions and wasn't satisfied.

Previously I'd deployed SharePoint Foundation in standalone mode, which installs and runs all the required components on a single machine. It's not recommended for a full-scale deployment, but it's perfect for our home network. The problem is that standalone mode simply isn't an option if you install it on a domain controller; instead you have to install a server farm. Googling around, the consensus online seemed to be that it wasn't possible to install SharePoint on a single server if that server was also acting as the domain controller.

Not so.

It is possible, and in fact it's pretty easy. I made a couple of missteps by attempting to follow along with what some other people had done, but the solution was actually extremely simple: first you need to install SQL Server 2008 R2 SP2 Express (and it has to be at least this version), then you install SharePoint Foundation 2010. For all the discussion online, I actually didn't have to do anything other than accept the default options to install SQL Server. When I installed SharePoint it didn't give me the option to install in standalone mode, but I simply pointed it at the SQL Server instance I had installed and that was all there was to it.

That being said, things are not all that rosy. Just because this setup is possible doesn't mean it's recommended – and this is certainly not Microsoft's recommended way of doing things.

Microsoft advocates a separation of server duties, with different, unrelated services running on different machines. Now that I've entirely eschewed that philosophy I can see why it's important: SharePoint is running well and performance is snappy, but general internet performance on our home network has suffered, and I believe the problem is that my single Windows server is also the DNS server for the network – DNS lookups are slow.

I may try to solve this by installing a slave DNS server on the Linux server we have, but if that doesn't work then I think SharePoint will have to go away in the interests of DNS performance.

Or, maybe I just add a second physical server and move a few of the VMs to that? We'll see.


Scott Forsyth’s Blog – Windows Server 2008 R2 DNS Issues

I use a service at home to unlock region-locked web content, particularly internet video. As I've mentioned previously, I run a Windows 2008 R2 server on our home network which is our domain controller, and (as a result) our DNS server too.

The service I use for unlocking content requires that you set the DNS server on the network to the values it specifies. That's not viable for me, because the client machines need to use the internal DNS server in order to find the domain controller. No problem, though – the Windows Server VM can act as the DNS server just fine, handling requests relating to the internal network domain itself and forwarding everything else off using the forwarders I specify (which come right from my content-unlocking service).
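If you want to do the same, the forwarders can be set through the DNS Manager GUI, or from the command line with dnscmd. A sketch – the addresses here are placeholders, not the real values from my unlocking service:

    dnscmd /ResetForwarders 203.0.113.1 203.0.113.2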

This worked great until a few weeks ago, and then it suddenly stopped working.

I don't know why, and I'm not quite technical enough to fully grasp the details, but the problem was EDNS (whatever that is). The blog post I've linked above talks about it in more depth, but the bottom line for me is that once I turned EDNS off, everything worked fine.
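For reference, turning EDNS off boils down to a single dnscmd command on the server, followed by a restart of the DNS service:

    dnscmd /Config /EnableEDnsProbes 0
    net stop dns
    net start dns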



Home Server Setup – Useful Links

I've mentioned in a couple of previous posts that I've refreshed my home server recently. In setting everything up I spent a lot of time googling around, looking for information on how to do things that, for the most part, I'd done before on my previous server.

To help me avoid future googling if/when I need to go through this process again, I've been creating a whole bunch of bookmarks this time around. I thought I'd share them in case they're useful to anybody else.

oVirt Hypervisor

Linux Server

Windows Server


New Home Server Setup

As I mentioned briefly in a previous post, my home server is in desperate need of an update. Last weekend I took the plunge and bought the hardware necessary to build a replacement.

I don't need anything especially powerful – the chief function of this device is as network-attached storage – but I do want room to grow and do things with this new server that weren't possible with the hacked Pogoplug device I was using previously.


I bought a Celeron-powered Intel NUC, a 750GB hard drive, 8GB of RAM and an 8GB USB drive. My intent was to use the USB drive as the boot device, keeping the hard drive entirely free for storage purposes.

I chose the Intel NUC primarily because of its small size and low power consumption (the Celeron model I opted for uses particularly little electricity). I'm a big fan of this platform, and when the time comes to replace the media-centre PC that lives underneath the TV in our living room I will probably buy another one of these. That said, the platform is not without its problems. Read on to learn how I set mine up.

Hardware Installation

First things first is the hardware installation, which was especially simple. You remove four screws from the base of the computer and the lid slides off. Inside you'll find a metal chassis for the 2.5″ HDD, and you lift that out to reveal the motherboard.


In my model of NUC there's a single SODIMM slot for the RAM, so I slotted that in. Next up is the HDD itself. The chassis includes brackets to hold the drive in place, and the power and data connectors are already positioned. The drive just slots in, and you insert a couple of screws to hold it in place.


And that's really all there is to it! You put the cover back on and tighten the four screws on the base of the unit. Done.

BIOS Update

The first thing to do is update the system's BIOS, and this really is an essential step. This thing comes with Intel's Visual BIOS, and the version it ships with has some issues.


Updating isn't difficult: head on over to Intel's website, download the latest firmware and put it on a USB drive, then boot into the BIOS and hit F7.

Even with the latest version installed, the BIOS is where this thing falls down, in my opinion. If you're planning on installing Windows 7 or 8 then you probably won't run into any problems. My plan was to install an alternative OS, though, and I ran into a whole bunch of issues. I believe this was because of bugs in the BIOS and its implementation of EFI, but I don't know enough to say for sure.

Software Installation

My plan was to install vSphere Hypervisor and use this thing to host a couple of virtual servers. vSphere has a hardware compatibility list and none of my hardware is on it, but I'd done some reading and learned that I could slipstream drivers for the HDD and network card into the install. Nevertheless, I never did manage to install vSphere – the install just froze every time, and I couldn't get through it no matter what I tried.

The next hypervisor I tried was Proxmox VE. The install completed just fine, but I couldn't get the server to boot. While the problems I had with vSphere may well have related to my use of unsupported hardware, I firmly believe my problems installing Proxmox were down to the BIOS, or at least to an incompatibility between the EFI implementation in Proxmox's version of the Linux kernel and the BIOS's EFI implementation. I never did manage to get this working either, except briefly with a kludgy workaround that involved booting from a live CD and entering the relevant commands to make GRUB boot the OS installed on the HDD instead.

After a day of frustration and failed attempts to install an OS, I moved on to my third hypervisor, oVirt. With vSphere's proprietary solution the OS and the hypervisor are closely intertwined, and while it's possible to install Proxmox on top of an existing Debian install, it's not the recommended way of doing things and the process seems complex. oVirt, by contrast, seems designed to be installed on top of an existing CentOS installation. An all-in-one install image is offered, but after the previous day's failures I didn't even bother with this – I did a (successful!) minimal install of CentOS and then used the yum package manager to add oVirt on top.
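If you want to reproduce this, the broad strokes look something like the following. Treat it as a sketch rather than a recipe – the exact repository RPM varies by oVirt release:

    # add the oVirt release repository (the RPM URL varies by release)
    yum install http://resources.ovirt.org/pub/yum-repo/ovirt-release.rpm
    # install the engine, then run its interactive setup
    yum install ovirt-engine
    engine-setup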

With the hypervisor up and running, I used the web interface to install Ubuntu Server into one VM and Windows Server 2008 into another. I plan on adding two more virtual machines, one Linux and one Windows, for testing and playing around.



Home Server Refresh

I get paid every other Thursday, or 26 times a year. That mostly means I get paid twice a month, but twice a year there’s a month where I get paid three times, and this is one of those months.

Since a big chunk of our expenses is monthly, getting paid three times in a month means I have some extra cash left over to play with, and this month I'm going to use it to replace our home server.

Our existing home server is a pogoplug device that I've hacked to run Debian Linux. Its primary function is as network-attached storage for all the other devices in the house, with a 2TB USB hard drive attached. It also runs a LAMP stack for development, and some torrent software so it can handle downloading duties without any of the other computers in the house needing to be kept on.

It only has 256MB of RAM though, and occasionally, when it's under heavy load, things fall down. The torrent daemon is usually the first victim: sometimes I go to check on the status of a download only to find that the downloading process has run out of resources and shut itself down.

My requirements for a replacement are that it handle all the tasks the existing server does (without the problems caused by the limited memory), use electricity sparingly, and give me room to grow and try new things that I haven't even thought of yet.

My plan is to replace it with a machine I build based on an Intel NUC.

This thing is a barebones machine (it doesn't include any RAM or storage in the basic package), but it could be useful for many scenarios, including mine (I think it would make a great HTPC, for example).

I’m going to max it out with 8gb of RAM, and this 32-fold increase over what I have now should allow for a whole host of new possibilities.

I’m going to take advantage of the extra resources by attempting to install vSphere Hypervisor on it and split it into a number of virtual servers. One will be linux-based to replicate the functionality of the existing server, one Windows-based VM, and maybe two extra VMs (one linux, one Windows) so I can separate my playing around from mission-critical server duties.

I’ll be posting more as I work my way through the process.