(Return of the Mac) Come on, (You know that I’ll be back) Here I am

Oh, Apple.  What an enormous pain in the bottom you are at times!  Nearly four months away, and I return to discover that your software is just as buggy as when I left.  But, when it works, it is infinitely better than that offered for and by Windows/Microsoft.  The last straw was when I created a spreadsheet in Excel 2016 on the Dell UHD laptop, only to find that, due to dodgy Windows scaling, the row sizes were all over the shop when the same file was opened on a Mac also running Excel 2016.

  • Importing 8,163 photos and videos into Photos caused Photos to crash halfway through.  Thankfully, as Adobe Lightroom organises photos by year, I imported one year at a time and everything is now inside the Photos ecosystem.
  • Restoring iTunes to a new machine (regardless of platform) while you have an Apple Music subscription is the biggest load of nonsense I have ever encountered from any software company, ever.  All seemed to go well – iTunes picked up the freshly copied Windows iTunes folder and organised/consolidated it as it should.  But, alas, while Apple Music was switched on and signed in, iTunes told me otherwise.  A workaround was to browse and/or play something directly from the Apple Music catalogue (within iTunes – iTunes essentially acts as a glorified browser), after which offline stuff could play.  In an attempt to fix the problem once and for all, I turned iCloud Music Library (and Apple Music) off and switched them back on.  Big mistake.  As soon as that happened, iTunes attempted to re-upload and match my music, resulting in the duplication of all Apple Music albums and tracks – at least 2,000 of them.  Sorting out the duplicates in the 450-track multi-disc Gilbert & Sullivan album was, to say the least, [censored] annoying.  This was even after nuking the entire iTunes library and letting everything (inc. matched non-Apple tracks and iTunes-purchased tracks) download again from Apple’s servers.  This incident has made me extremely nervous about ever having to restore an iTunes library from a backup.  Maybe Apple is promoting Apple Music’s strength as an online service that you never really need to back up to anything other than their servers?  *shrugs*
  • Playing Team Fortress 2 using the AMD Radeon R9 370X is fine and dandy, but things went a bit wonky straight after the Steam/TF2 installation, with TF2 and Steam quitting immediately as soon as the game started.  Restarting macOS seems to fix it.
  • Switching to Apple’s two-factor authentication was a pain too.  If you had two-step verification, you have to disable that, create a new set of security questions, then wait a bit before the two-factor authentication settings pop up on the iPhone or iPad.  Getting the Apple TV to recognise HomeKit involved logging in and out about six times before it finally worked.  Lots of logging in and out across all devices overall.  The Apple Watch needed a reboot to get the macOS unlock functionality working, otherwise the system complained that it couldn’t find the watch.

Otherwise, I am enjoying the Retina display, the quad-core processor, and the super-fast SSD.  I’ve come to the conclusion that Windows is not ready for 4K/UHD and above displays.  Not until software developers start making proper use of them.

But I will remain a Mac/iPhone/iPad user for the foreseeable future.  The alternative is good, but for me – and despite all the problems with Apple’s software division – it’s not enough. Apple have won.  I surrender.

The Smartphone Games: Catching Fire

Catnip Everready prepares to fight the evil President Lith-Ion in the sequel to the ever-popular The Smartphone Games: The Smartphone Games.

– Description of my new novel, The Smartphone Games. LOL.

Samsung has made available an IMEI checker that tells you if your phone is affected by the battery defect.  But then again, it may just be a list of phones that haven’t been returned to them yet.  In any event, this is what happened when I typed in my phone’s IMEI:


The Galaxy Note 7 was a truly lovely phone, but with more reports coming in (including reports that other Samsung phones may be affected too), I thought it best to return the unit to Carphone Warehouse and get a refund.  Which I did.  Amazingly, despite the recall and the press, they told me that this was still a phone very much in demand.  Unfortunately, I think the brand’s reputation is now sufficiently tarnished that if I were to go travelling with it, it’d attract too much attention.

Update: As if exploding batteries weren’t enough, the S7 and S7 Edge are suffering with a caching bug which is causing all manner of problems.

So I’ve now gone for the iPhone 7 Plus.  It seems a safe(r) bet than many flagship Android phones at the moment anyway.  I was especially encouraged after reactivating my Apple Music subscription.  It’s proving to be a much smoother experience than last time (I think the key thing here is the lack of iTunes Match) – indeed, I downloaded a 456-track, 21-disc version of the D’Oyly Carte Company’s recordings of Gilbert & Sullivan without any issues at all.  In one sitting.  So very promising.

Generally speaking, Apple isn’t a bad company at all.  I still have reservations about their cloud services and the dependency that their operating systems have on them, but ultimately, provided one takes backups of everything on a regular basis, it really shouldn’t be a big concern.  I still say Apple should offer an AppleCare+-like product for iCloud, however.

Back to the Windows (Future): Part Two

Settling in reasonably well with Windows 10.  Next month we’ll all be getting the Anniversary Update which will make some changes to the Start menu (which I think is for the better based on my experience from the preview builds I’ve been testing with at work) as well as a few other bits and bobs.

Windows as a Service (WaaS) is the way forward.  There will be no Windows 11.  And depending on how technically adventurous you are, you can switch to using Insider Builds, which provide you with the latest and greatest new features and bug fixes before they’re unleashed on the public.  Even so, I stick with the regular builds at home and only use the Insider Builds on virtual machines that I run at work.

One thing that had been bugging me over the past couple of weeks was finding a local backup utility to store copies of my files on my local NAS (network-attached storage), a WD MyCloud (6TB) which sits on my gigabit switch hooked up to the Sky Q Hub.  I tried CrashPlan, which also backs up to its own servers, but found it to be too slow (and CrashPlan’s high-resolution display support isn’t great).  I also tried Acronis True Image 2016, but found that to be far too slow as well – and found that it didn’t recover very well if the backup was interrupted; the UI froze a lot.

I then remembered a product I used way back when I was on Windows, before the great migration to the Mac: SyncBack Pro.  But, alas, it has the worst high-resolution display support of any of the backup products, and I had to remove it.  I mentioned this to the developers, who told me I could create a file that would help improve things – but I’d have to re-create it with each new update.  Why this couldn’t be handled via the UI, I don’t know.  So I gave up on that one.

It turns out that I had the solution under my nose all the time: Kaspersky’s Total Security 2016.  I bought a multiple-device licence – one for my Android device and the other for the Windows desktop.  It’s very good indeed, and I hadn’t realised that it comes with a backup/restore function.  So I’ve been backing up to the NAS using something I already had.

For online backups I still use Backblaze.  It provides unlimited backups, but versioning only up to 30 days – so if you delete a file and try to retrieve it after 30 days, you’ll probably be out of luck.  Hence the local backups.  I’d have preferred to use CrashPlan, which allows for unlimited versioning across any number of days, weeks and months, but as I’ve said, the main thing holding me back is its lack of high-resolution display support.

I do hope Microsoft consider doing more work to improve high-resolution display scaling.  If Apple can do it successfully with OS X (or macOS, as it will be called), I can’t see why Microsoft can’t.  It’s time to ditch legacy and look to the future of Windows.  It can’t be too long before 5K monitors and beyond are the norm.  Windows needs to be ready for this, along with all Windows developers.

Meanwhile, I’m selling my Xbox One in preparation for the Xbox One S.  Ultra HD Blu-ray support PLUS game streaming to a PC and PC integration (the controller can be used with a PC for gaming via Bluetooth) for less than £350?  Yes please.  The Xbox One (S) effectively runs its own version of Windows 10, so that’ll be getting the Anniversary Update too.

Brings a whole new definition to the word “clean-up”

Info Insecurity

I shouldn’t laugh at a fellow web hosting company’s misfortune, but when I heard about the almighty muck-up of 123-reg inadvertently nuking customers’ virtual private servers (source: BBC) during routine maintenance, I couldn’t help but try to stifle a chuckle.

But on a more serious note, it highlights a couple of problems (not least of which is the need to be very, very sure about what you’re doing on the underlying host platform):

  • Virtualisation = multi-tenant hosting: a single dedicated server will be home to quite a few clients, all doing their own thing.  Unless you’re using some form of shared storage for the virtual server images, or can quickly hot-swap the drives out to a standby chassis, then if the server goes TITSUP (see below), many people will be affected – and for quite some time!
  • Backups.  I can’t believe people aren’t making multiple backups.  Especially if you’re not paying the hosting provider for the privilege.  NEVER assume that your hosting provider is taking backups of your data.  There are many options available to ensure that you have sufficient coverage in the case of a failure.  Some hosting providers offer something (at a cost), but it’s always recommended that you store backups away from the hosting platform, in a different datacentre, with at least one copy preferably held away from the hosting company altogether.  Why not use a third-party utility such as rclone to make sure you’re backing up valuable data to another service?  I’ve written a guide for cPanel server users here.
  • Redundancy.  If your business is truly that important, you’ll be looking at high-availability options that can include, but are not limited to, load balancing (multiple web front ends, multiple DB and file backends).  If one or more servers go TITSUP (Total Inability To Support Usual Performance), others can take over.  Failover options are well worth investigating.  Note: it’s rarely cheap, but if you really value the uptime of your business, it’s a must.

I think the best attitude to have in this situation is to ask yourself what you would do WHEN these things go wrong – not IF.  Aside from all of the above, your web site may be affected by malware (especially if you’re running legacy versions of the server components, or if your CMS or web site is itself based around legacy components – make sure you keep it up-to-date!), denial-of-service attacks, or a combination of both.

Running a web site and managing your email is fun, fun, fun!

Using rclone to backup your cPanel backups to a remote destination

cPanel/WHM has a robust backup system that can create .tar.gz archives of your accounts, combining email, web files, databases, etc. into a single archive that can be used to restore the account in the case of emergency, or to move to another server.

What it isn’t so good at is putting them somewhere off the server, to ensure that if your server dies a horrible death (multiple hard drive failures, spontaneous combustion, human error, etc.) you can restore all your accounts.  Much of the backup system depends on third-party remote mounts, Amazon S3 or FTP servers.

Worry no more!  For one of the directors of Memset, the company that employs me to do things, has created a multi-purpose transfer tool called rclone.  It can be set up to copy or sync data to a variety of destinations, including:

  • Google Drive
  • Amazon S3
  • Openstack Swift / Rackspace cloud files / Memset Memstore
  • Dropbox
  • Google Cloud Storage
  • Amazon Cloud Drive
  • Microsoft OneDrive
  • Hubic
  • Backblaze B2
  • Yandex Disk
  • The local filesystem

Since this site is hosted on a Memset server, it makes sense to back up my cPanel accounts over to my Memstore account, an object storage system that uses the OpenStack Swift protocol.  While we have custom FTP and SFTP proxies, it’s important to note that you can’t upload a file greater than 5GB in size through them.  Thankfully, rclone speaks native Swift and can handle sizes beyond 5GB.

The following assumes a basic knowledge of Linux and access to SSH as root.

So the first thing to do is download a copy of rclone for your server.  Most people will be running 64-bit Linux, so you’ll need to download the tarball for that.  The next step is to unpack the archive and install the binary and manpage as per the instructions.  Skip the sudo parts if you’re on cPanel – you’re already root, so it’s not needed:
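For illustration, those steps might look something like this (the download URL and archive name are examples only – check the rclone site for the current release):

```shell
# Grab the current 64-bit Linux build (filename/URL are examples)
cd /tmp
curl -LO https://downloads.rclone.org/rclone-current-linux-amd64.zip
unzip rclone-current-linux-amd64.zip
cd rclone-*-linux-amd64

# Install the binary (no sudo needed – we're already root on cPanel)
cp rclone /usr/local/bin/
chmod 755 /usr/local/bin/rclone

# Install the manpage
mkdir -p /usr/local/share/man/man1
cp rclone.1 /usr/local/share/man/man1/
mandb
```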

Now run:
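That is, rclone's interactive configuration wizard:

```shell
rclone config
```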

You’ll see something like this:

Press ‘n’ for New Remote.  You’ll then be prompted to give this a name.  You can call it whatever you like.  In this example I’ll be using the name ‘memstore’.  Once you’ve given it a name, you’ll be prompted for the storage type.  In our example, it’s OpenStack (number 10):

I’ve created a user within my Memstore/Memset account control panel called “cpanel” that I’ll be using to connect to the Memstore container “cpaneldemo” that will hold my backups:


I then assign read and write permissions for user “cpanel” to the container “cpaneldemo”:


Now to configure rclone:

So the username is the right-hand part of the Memstore username, and the tenant is the left-hand part (e.g. msdrakeab2.cpanel becomes user = cpanel, tenant = msdrakeab2).  The key (or password) will be displayed in plain text at all times, and is stored within the /root/.rclone.conf file.  Make sure that only root has permission to read this file – it should do by default, e.g.:
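To check – and, if need be, tighten – the permissions on that file (0600, root-only, is what you want):

```shell
# Check who can read the config – should be root only (mode 0600)
ls -l /root/.rclone.conf

# If it's any more permissive than that, lock it down:
chmod 600 /root/.rclone.conf
```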

So we’re ready to rock and roll.  We don’t have any data in the container yet, but we can run a quick test to make sure we’re able to connect:
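A simple connectivity test is to list the containers visible to the remote – for example:

```shell
# List the containers on the 'memstore' remote
rclone lsd memstore:
```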

All seems to be working.  So let’s manually move some backups to Memstore.  Memset configures cPanel backups to be dumped to /backup on your server.  So based on that, the initial upload will look like this:
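Based on the /backup path and the container name used in this walkthrough, a sketch of that initial upload:

```shell
# Copy the whole /backup tree into the 'cpaneldemo' container
# (-v gives verbose progress output)
rclone copy -v /backup memstore:cpaneldemo
```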

When we look at the contents of the container through the Memset account control panel:


How do I retrieve backups?

Very easily done.  Let’s say we want to grab the account called ‘mice’ that was backed up on the 31st.  In the cPanel backup hierarchy, it’ll look like this:
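With cPanel's default daily layout (the date directory here is an example), the path would be along the lines of:

```shell
/backup/2016-08-31/accounts/mice.tar.gz
```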

So to get that back from Memstore, we’d do this:
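A sketch of the retrieval command (the date directory is illustrative):

```shell
# Copy the single archive from the container back to /tmp
rclone copy memstore:cpaneldemo/2016-08-31/accounts/mice.tar.gz /tmp
```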

where /tmp is the local destination for the file – it can be anywhere on the filesystem.  You can leave out the file and have the entire contents of the ‘accounts’ directory transferred too (although in this example, there is only one file in ‘accounts’):
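For example (again, the date directory is illustrative):

```shell
# Transfer everything under 'accounts' for that day instead
rclone copy memstore:cpaneldemo/2016-08-31/accounts /tmp
```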

How do I automate the backups?

Simple, just add it as a cron job:
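A sketch of the crontab entry (whether you use copy or sync is your choice – sync will also delete remote files that no longer exist locally):

```shell
# m h dom mon dow  command – runs nightly at 01:30
30 1 * * * /usr/local/bin/rclone sync /backup memstore:cpaneldemo >> /var/log/rclone.log 2>&1
```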

which will run at 1:30am and will dump the output to /var/log/rclone.log.

Other ideas

You could use rclone to create historical backups within Memstore – handy if you keep a set of daily backups that you’d like to keep around longer than cPanel keeps them on your server’s filesystem.  To do this, ensure you have a destination container to sync to.  So let’s create one:
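Creating the container is a one-liner:

```shell
# Create the new 'cpdev' container on the Memstore remote
rclone mkdir memstore:cpdev
```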

then sync the contents of one container to another.  Note that all of this is done on Memstore – no data is transferred from your server to Memstore (or vice versa).

The following example demonstrates a sync of the existing data in the “cpaneldemo” container to the new container “cpdev”.  I could automate this by adding a cron job to sync data from “cpaneldemo” to “cpdev” on a weekly basis, for example.
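A sketch of that container-to-container sync, plus a possible weekly cron variant:

```shell
# One-off: mirror 'cpaneldemo' into 'cpdev'
rclone sync memstore:cpaneldemo memstore:cpdev

# Or weekly via cron, e.g. Sundays at 02:00:
# 0 2 * * 0 /usr/local/bin/rclone sync memstore:cpaneldemo memstore:cpdev >> /var/log/rclone.log 2>&1
```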