New Code Projects: Backblaze B2 Version Cleaner & VBA SharePoint List Library

It’s been a while since I’ve posted code of any description, but I’ve been working on a couple of things recently that I’m going to make publicly available on my GitLab page (and a mirror repository).

Backblaze B2 Version Cleaner

I wrote last week about transitioning my cloud backup to Backblaze’s B2 service, and I also mentioned a feature of it that’s nice but also slightly problematic to me: it keeps an unlimited version history of all files.

That’s good, because it gives me the ability to go back in time should I ever need to, but over time the size of this version history will add up – and I’m paying for that storage.

So, I’ve written a script that will remove old versions once a newer version of the same file has reached a certain (configurable) “safe age.”

For my purposes I use 30 days, so a month after I’ve overwritten or deleted a file the old version is discarded. If I haven’t seen fit to roll back the clock before then my chance is gone.
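The core decision logic can be sketched like this. This is a minimal Python illustration of the approach rather than the actual script; the `SAFE_AGE` constant and the tuple layout are assumptions for the sketch, and actually removing a version would go through B2's `b2_delete_file_version` API call.

```python
from datetime import datetime, timedelta

SAFE_AGE = timedelta(days=30)  # illustrative "safe age" threshold

def versions_to_delete(versions, now):
    """Given every version of ONE file as (file_id, upload_time) tuples,
    ordered newest first, return the file_ids that are safe to discard.

    A version becomes deletable once the version immediately newer than
    it has existed for at least SAFE_AGE -- i.e. a month after a file is
    overwritten or deleted, the superseded copy goes."""
    doomed = []
    for (newer_id, newer_time), (older_id, _) in zip(versions, versions[1:]):
        if now - newer_time >= SAFE_AGE:
            doomed.append(older_id)
    return doomed
```

Because upload times only decrease down the list, checking each version against its immediate newer neighbour is enough: if any newer version has passed the safe age, everything older than it gets flagged on its own pass through the loop.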

Get the code here!

VBA SharePoint List Library

This one I created for work. Getting data from a SharePoint list into Excel is easy, but I needed to write Excel data to a list. I assumed there’d be a VBA function that did this for me, but as it turns out I was mistaken – so I wrote one!

At the time of writing this is in “proof of concept” stage. It works, but it’s too limited for primetime (it can only create new list items, not update existing ones, and each new item can only have a single field).
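For the curious: one common way to write to a SharePoint list from code is the `UpdateListItems` operation on the `Lists.asmx` SOAP web service, which is the kind of request a VBA implementation could send via MSXML. Here's a rough sketch of the envelope involved, written in Python purely for readability; the field loop also shows how the one-field-per-item limitation could eventually be lifted. The list and field names are placeholders.

```python
def build_update_batch(list_name, fields):
    """Build the SOAP envelope for SharePoint's UpdateListItems web
    service, creating one new list item whose column values are given
    in `fields` ({internal_field_name: value})."""
    field_xml = "".join(
        '<Field Name="{}">{}</Field>'.format(name, value)
        for name, value in fields.items()
    )
    return (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        '<soap:Body>'
        '<UpdateListItems xmlns="http://schemas.microsoft.com/sharepoint/soap/">'
        '<listName>{}</listName>'
        '<updates><Batch OnError="Continue">'
        '<Method ID="1" Cmd="New">{}</Method>'  # Cmd="New" creates an item
        '</Batch></updates>'
        '</UpdateListItems></soap:Body></soap:Envelope>'
    ).format(list_name, field_xml)
```

POSTing that body to `http://yoursite/_vti_bin/Lists.asmx` with the appropriate `SOAPAction` header is the whole trick; updating an existing item is the same batch with `Cmd="Update"` and an `ID` field.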

Out of necessity I’ll be developing this one pretty quickly though, so check back regularly! Once it’s more complete I’ll be opening it up to community contributions.

I have no plans to add functions that read from SharePoint to this library, but once I have the basic framework down that wouldn’t be too hard to add if you’re so inclined. Just make sure you contribute back!

Get the code here!


Late Night Links – Sunday January 24th, 2016


Raspberry Pi Whole Home Audio: The Death of a Dream?

If you’ve been following my blog for a while, you’ll know that I’ve written a whole series of posts on my efforts to take a few Raspberry Pis and turn them into a DIY whole home audio solution.

If you’ve ever looked at the product offerings in the whole home audio space, you’ll know that setting such a thing up is either cripplingly expensive, involves tearing the walls apart to run cables, or both.

Where we left off, I’d put together a solution that was glorious when it worked, but that was rare. Typically the audio was either out of sync between the devices right from the get-go, or quickly got that way.

Getting the Pis to play the same music was relatively simple, but getting it perfectly in sync so that it could be used in a multi-room setup eluded me to the end, and eventually I gave up.

The bottom line is that synchronizing audio between multiple devices in a smart way requires specialized hardware that can properly account for the differences in network latency between each of the endpoints. The Pi doesn’t have that, and it’s not really powerful enough to emulate it in software.

So is my dream of a reasonably priced whole home audio solution dead? Hell no.

In October I wrote about Google’s announcement of the Chromecast Audio. At the time it didn’t have support for whole home audio, but Google had promised that it was coming. It’s here.

The day they announced that it had arrived was the day I headed over to my local Best Buy and picked up four of these things. I plan to add two more, and I couldn’t be happier with the results.

Plus, it frees up the Pis for other cool projects. Watch this space!


Cloud Backup, Episode III

I’ve written a couple of times before about what I do to back up all my important data.

My last post on the topic was more than a year ago though, so I’ll forgive you if you’ve forgotten. Here’s a recap: originally I was using a fairly traditional consumer backup service, ADrive. This worked well because they’re one of the few services that provides access via rsync, which made it easy to run scripted backup jobs from Linux. Their account structure didn’t really meet my needs, however: you pay for the storage available to you, not what you use. When I hit the upper limit of my account the next tier up didn’t make financial sense, so I went looking for an alternative.
About 15 months ago I moved my backups over to Google’s Cloud Platform. This gives me an unlimited amount of storage space, and I just pay for what I use at a rate of $0.02/GB/month. This has been working well for me.

In December I came across Backblaze’s B2 service. They offer a service very similar to Google’s (or Amazon S3, or Microsoft Azure, or any of the other players in this space that you may have heard of), except it costs a quarter of the price at $0.005/GB/month. There’s even a free tier, so you don’t pay anything for the first 10GB. When I first looked at them their toolset for interacting with their storage buckets really wasn’t where I needed it to be to make them viable, but they’ve been iterating quickly. I checked again this week, and I’ve already started moving some of my backups over.
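To put that price difference in concrete terms, here’s a quick sketch of the monthly bill at each provider using the rates quoted above (actual Google Cloud pricing varies by storage class and region, so treat the Google figure as the rate I’m paying rather than a universal one):

```python
def monthly_cost_google(gb):
    """Google Cloud Storage at the rate quoted above: $0.02/GB/month."""
    return gb * 0.02

def monthly_cost_b2(gb):
    """Backblaze B2: the first 10GB is free, then $0.005/GB/month."""
    return max(gb - 10, 0) * 0.005
```

A 110GB backup works out to about $2.20/month on Google versus roughly $0.50/month on B2, and anything under 10GB on B2 costs nothing at all.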

In time, I plan to switch all my backups over. So far I’ve moved my documents folder and backups of my webserver, which total about 2.5GB. That’s nice, because it means I’m still within the free tier. The next thing to tackle is the backups of all our photos and music, which come to around 110GB. Transferring that much data is going to be a painful experience. I’m still thinking about the best way to do it, but the direction I’ll probably go is to spin up a VPS and have it handle the download of the backup from Google and the upload to Backblaze, so it doesn’t hog all the bandwidth I have on my home internet connection.
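For a rough sense of why a VPS is appealing, here’s a back-of-the-envelope transfer-time estimate. The 5 Mbps home uplink below is an assumption purely for illustration; your speeds will differ.

```python
def transfer_hours(gb, mbps):
    """Hours to move `gb` gigabytes over an `mbps` megabit/s link,
    assuming decimal units and a fully saturated link -- a best case."""
    megabits = gb * 1000 * 8
    return megabits / mbps / 3600.0
```

At 5 Mbps, 110GB ties up the uplink for roughly two days solid; a VPS with a gigabit port could shuttle the same data between providers in about a quarter of an hour.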

The only other thing to think about with Backblaze is versioning. Google offers versioning on their storage buckets, but I have it disabled. With Backblaze there is no option (at least not that I’ve found) to disable this feature – meaning previous versions of files are retained and count toward my storage bill.

I’m torn on this. The easy thing to do would be to disable it, assuming a future iteration of Backblaze’s service does in fact add the ability to turn it off. I’m thinking, though, that the smarter thing to do is make use of it.

For me and my consumer needs, that will most likely mean putting together a PHP script or two to manage it more intelligently. Some past versions are nice, but some of the files in my backup change pretty frequently, and I definitely don’t need an unlimited history.

Still, I’m very much pleased with the price of B2, and watching the feature set rapidly improve over the past couple of months gives me confidence that I can move my backups over and keep them there for the long term, because the transition from one service to another is not something I want to put myself through too often.


Late Night Links – Sunday January 17th, 2016

It’s time for late night links again! Did you know that so far I haven’t forgotten to post this even once in 2016? We’re on a roll here. Let’s keep it going.

…and we’re done! Until next week then, internet friends!


Late Night Links – Sunday January 10th, 2016

It’s that time again!

And we’re done for another week! Have a good one, and I’ll catch you next week – same time, same place.


Late Night Links – Sunday January 3rd, 2016

It’s the first late night links of 2016!! Are you excited for all the improvements I’m making? You shouldn’t be. Nothing is changing.

And we’re all done for another week! Until next time then, folks!