Dead Wood

Next to our house is a large oak (I think?) tree which has, over the last three years, slowly died.

It’s not on our property, but it had grown over the house. We had some high winds earlier this year which brought a decent number of the smaller branches down onto our roof and garden. No damage, but it was clear the tree was becoming more brittle and heading towards dangerous territory.

The owners recently arranged to have it cut down. This was a sad moment; however, I asked for a slice of wood from the tree. Our house sign rotted away a while ago and I’d had my eye on the wood from this tree. I’m hoping to use this slice to make a new sign at some point as a nice little tribute to the beast that has stood guard over our property for decades.

All I need to do now is figure out how to properly do this, or find someone to do it for me!

The neighbour will be using the wood in their log burner but has offered some to us, too. Doesn’t look like we’re going to be cold next winter, or perhaps the winter after depending on how long it needs to dry for.

One silver lining: it’s now feasible for us to get solar panels!

A leak here and a leak there

We’re slowly eradicating our mould problems and it seems the universe has taken offence.

First off, our shower is leaking. The leaky pipes are inside the wall and happen to be directly above a power outlet on the other side. The paint on that wall has bubbled up as the moisture slowly found its way out and down towards the source of electricity. We can no longer shower without risking further damage to the wall, and potentially a fire.

When we noticed the paint bubbling, we popped the side panel off the bath (our shower is above the bath tub) and found the concrete floor soaked; the underside of the bath and the wood that helps support it were literally caked in mould. The water has also soaked into the concrete and found its way under the wall into the kitchen. This appears to have been happening for a while, as the kitchen units along that wall are also caked in mould across the back.

We will likely need to replace the kitchen units, which isn’t too much of an issue as we needed a new kitchen anyway, as well as dismantle half the bathroom to fix the source of the problem, which is also not too much of an issue as we needed a new bathroom too. Maybe £10k to do both to a decent standard.

To add insult to injury, the cess pit decided to spring a leak. The tank itself is, thankfully, fine; there’s a broken inlet pipe somewhere letting rainwater in. We don’t have sewage leaking out, but every time it rains the cess pit fills up pretty quickly. We could dig up the pipes and try to fix it, but we need to replace the cess pit with a water treatment plant anyway. This is not a cheap thing to do – we’re looking at a starting figure of about £8k, though this will likely increase due to some issues with water run-off and our lack of land.

They say it comes in threes. They’re wrong.

My car’s engine warning light came on. Hopefully this is just the sensor – clearing the fault hasn’t worked, so the next step is to replace the sensor itself. I’m not sure what the sensor is called (the knock sensor, perhaps?) but it monitors engine vibration and alerts the driver if it detects something amiss. If the sensor isn’t faulty, it’s potentially a new engine, which with my old-ass car likely means a new car.

A day after the engine warning light came on, my windscreen cracked. Luckily my insurance covers the cost of repair minus £125, which I have to pay. Not too bad.

Then, a few days after that, my partner and the baby were in a car crash. Both are fine! The car, however, is likely a write-off. The other party admitted fault right away and their insurance will pay out the value of the car, but until that happens we’re without a second car, which makes it difficult for me to fix mine. Not impossible, just difficult.

I guess it comes in sixes? Maybe two sets of three.

I’m excited to get these problems sorted. Though they’ve all come at once, besides the written-off car they’re all things we planned on doing anyway. They’re the last renovation jobs we need to do before we can start on the detailing – primarily, sorting out the network (allowing me to finally post something technical on here again).

The fact that these things are now active problems means we’re more focused on fixing them. We could have paid to update our bathroom at any point but have been putting it off. Now, though, we’re looking at getting it done sooner. The downside is that we don’t have the money to do it all in one go so we do need to pick our battles in the right order (the cess pit first!)


We’ve had mould problems ever since moving into this property. We found it under the coving, under the old carpet, behind the wallpaper and in the kitchen cupboards.

We’ve slowly been eradicating it – we had most of the issues resolved (except in the kitchen, but more on that in the next post…) by the time we finished decorating. However, it kept coming back in our bedroom and in the bathroom, along the edges where the ceiling meets the walls. We had put this down to humidity: at night we breathe out moisture whilst sleeping, and in the bathroom… well, that one is obvious. I was getting frustrated with the constant mould recently and called up a specialist company.

They refused to take my money.

Instead, they gave me some (free) tips which, in hindsight, would have been quite obvious had I thought about it. I appreciate that they did this instead of charging to send someone out to do the work for me!

First off, they told me that high humidity on its own does not result in mould – mould forms where condensation forms. Condensation appears where a surface is cold enough (below the dew point) for moisture in the air to turn back into liquid water when it comes into contact with that surface. If you can remove the moisture from the air (essentially impossible in a home) OR warm up those surfaces, condensation won’t form as easily, and therefore mould will have a harder time growing.
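To put a rough number on “cold enough”: the dew point is the surface temperature below which condensation will form. Here’s a quick sketch using the Magnus approximation (my own illustration, not something the specialists gave me – the constants and example figures are just one common parameterisation):

```python
import math

def dew_point_c(temp_c, rel_humidity_pct):
    """Approximate dew point in degrees C using the Magnus formula.

    These constants are one common parameterisation, accurate to
    roughly 0.1C at ordinary indoor temperatures.
    """
    a, b = 17.62, 243.12
    gamma = math.log(rel_humidity_pct / 100.0) + (a * temp_c) / (b + temp_c)
    return (b * gamma) / (a - gamma)

# A 20C room at 70% relative humidity:
print(round(dew_point_c(20.0, 70.0), 1))  # 14.4
```

So in that example, any surface colder than about 14°C – say, an uninsulated strip of ceiling – will collect water.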

Heating a room does not achieve this on its own. As the mould was forming along the edges where the ceiling and walls met (on externally facing walls, too!), the specialist company surmised that the insulation in the loft was not covering those areas. Heat was escaping through the plasterboard ceiling with no insulation above to hold it in, cooling those edges down. Water vapour in the warm air would rise, hit the cold ceiling and condense there, dripping down the walls if enough formed.

Up to the loft I went, to discover that this was indeed the case. In some areas (not coincidentally the same areas where we had the most issues with mould) up to two feet of ceiling was exposed. Luckily, there was a spare roll of insulation up there already. I used it to fill in the edges about a week ago and since then we’ve not had any more mould appear in those trouble spots. Early days of course, and we’re still getting mould around the windows, but I have some thermal insulating paint to test there.

Ideally we’d replace our windows but can’t really afford to do so at the moment – they’re old and the seal has broken on some of them, which we will get repaired.

Is it a bird? Yes

This poor lil’ tweeter flew into our kitchen window!


We have a cat, so we quickly plucked the little bird off the floor and put him up on a bird table. After a short rest he flew off, happy as can be.

As I was carrying it to the bird table it snuggled right into my hand and wouldn’t get off. Must have liked the warmth. Cute!

Switched ISP

I’ve changed ISP. We used to be with Zen, who were, and remain, awesome. However, we have recently changed to Andrews &amp; Arnold.

Due to our location we’re unable to get FTTC; we’re stuck with ADSL. The main reason for the switch was initially that A&A could provide us the same service for nearly £10 cheaper. I’m a fan of smaller organisations and use them over behemoths where possible. The bigger ISPs can go even cheaper than what we pay now, but I’m happy to pay more to a smaller company.

Before switching I called up A&A with a few questions, and they answered everything – they know what they’re talking about. I work in IT (though not at an ISP or in networking) and could tell they knew their stuff, and I was only talking to their customer service team. They also sent me a new router (as I signed up for 12 months), which is nice – I had been running on a DrayTek 2925n which, whilst generally fine, is a bit old now; the lack of 5GHz Wi-Fi was hampering us somewhat. The new router seems good, though I’ve not poked around in it much yet. We still don’t get a wireless signal in all corners of the house, but I have plans to fix that… eventually.

One cool thing A&A do is provide access to quite detailed line status and configuration settings via their Control Pages. I like the openness and control they give customers. A static IPv4 address and IPv6 support should be standard from all ISPs by now but frustratingly aren’t; A&A provide both as standard at no extra cost.

An upside to switching to A&A is that our average download speed has increased slightly. We used to see 2.5Mbps download with Zen but this has increased to about 2.9Mbps with A&A. We’re still occasionally buffering when streaming (though Netflix is often fine, we’ve had issues with some other online streaming services) though the frequency at which this occurs has reduced a bit.

Overall, happy so far. The line has gone down a few times, which we didn’t really experience with Zen (or perhaps didn’t notice – I can now see when it goes down on the Control Pages, plus I get an email if it’s down for a little while, which is awesome).

I would recommend anyone looking for an ISP to avoid the larger companies and look at either Zen or A&A. You can’t go wrong with these.

Baby ProTip #4713

Little baby gotta take medicine? Aww, poor thing 🙁 Odds are you’ll be given a syringe which, whilst it works, is a bit of a pain when it comes to actually delivering the liquid to the child.

Instead, measure out the dosage using the syringe, then squirt it into an upside-down clean bottle teat for the baby to drink from! No need to attach the bottle, just carry the teat over – careful not to spill any – and let the baby suck it from there. Make sure it’s a low-flow teat, or it’ll come out on its own as you carry it to your baby.


I finally have a GitHub account.

I’ve been learning Git at work via an internal GitLab installation for my PowerShell/PHP scripts and it’s going quite well!

I have been told about Hacktoberfest – a few people I know have signed up and I thought “hey, why not give it a go?” It’s a perfect excuse to start contributing to projects I make use of. Though I’m no coder, I can probably do some documentation work, and maybe some basic PowerShell stuff too if I’m feeling brave.

Eventually I’d like to put some personal projects on there and give back to the world a little bit, but for now I’ll just be logging issues or fixing small, easy problems in other people’s repos 😀

You never know, I might get a T-shirt!

SIMS(.Net) is slow

I should preface this technical post by saying that I am in no way a database dude. I understand what databases are and how they work at a basic level, and sure, I’ve written some basic SQL queries in PHP and queried some stuff in MSSQL, but as a typical “jack-of-all-trades” type I am no expert by any stretch of the imagination.

This is also relevant for any database, not just SIMS.Net.

What is SIMS?


For those of you who don’t know, SIMS(.Net) is a Capita application used by a whole bunch of schools to record data about their staff, students and the students’ parents. It’s a school MIS (management information system).

It’s quite renowned for being very slow. There are many posts on the forum complaining about this, and many suggested ways of fixing it. Okay, that’s not entirely true: lots of these fixes may grant you a slight boost, but the backend of SIMS was written a very long time ago (decades) and, at the end of the day, it’s just a slow bit of software. That said, there are some things that can be done to help, and this post outlines one of them.

The Database File

SIMS is a Windows client-server application. The server side runs from a single database, though there are some additional features which have their own databases.

Typically, the school’s IT department will not install the SIMS server themselves. It’s quite finicky and likes things set up in a very particular way, so installation is often handled by a third-party organisation.

The client application uses a shared network drive for some data storage. Typically, at least in every school I have seen, this share is set up on the same drive of the server that the database is stored on. This is generally bad practice when it comes to databases – you should put your database file on a disk/drive that has nothing else on it. No OS, no file storage, no nuthin’!

There are a couple of reasons for this. The most obvious is that while a file on that same disk is being read or written, the disk can’t be serving the database – your application must wait its turn before the database can be queried. Schools don’t really care about this, as the slowdown isn’t noticeable to end users given the volume of non-database data being read and written, but on systems where every milli- or microsecond counts this can be a big bottleneck, depending on how frequently the other files on the disk are used.

The second reason is related to the first, and it has to do with how the database file itself is initially set up. You can define the size of a database file when you create it, and you should really set it to be bigger than the size you expect the database to reach by the time you migrate to a new system or replace it. For example, if you know that in three years you’re going to migrate away from the system and it will hold approximately 10GB of data by then, you’ll probably want to add 20-50% on top of that when you first create the database, to give the data room to grow into the file. So you make a 15GB database file and can sit happy knowing you’ll likely never touch the sides.

You can also set the database up to grow dynamically once it fills up. This is called “autogrowth”. You start with, say, a 20MB database, but this quickly fills up; MSSQL is configured to allow it to grow, so it increases the file size in chunks. These chunks can be a fixed size (add 1MB each time) or a percentage of the current size (eg: grow by 10% of the current file size – a 100MB database grows into a 110MB database, which then grows into a 121MB database, and so on).

The default growth increment for a database file is 1MB. This means that if you have a heavily used database with lots of data going in often, MSSQL will need to grow the file by 1MB over and over again. This obviously adds some overhead to operations, but there’s another side effect when other files are being used on the same drive as the database.
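As a back-of-the-envelope illustration of how much work those 1MB increments create (a sketch of my own – the 20MB starting size is made up), compare how many growth events it takes to reach a 26GB file, which is the size ours eventually hit, under different settings:

```python
def growth_events(start_mb, target_mb, fixed_mb=None, percent=None):
    """Count the autogrowth events needed to grow a database file
    from start_mb until it reaches at least target_mb."""
    size, events = float(start_mb), 0
    while size < target_mb:
        size += fixed_mb if fixed_mb else size * (percent / 100.0)
        events += 1
    return events

target = 26 * 1024  # 26GB expressed in MB

print(growth_events(20, target, fixed_mb=1))    # 26604 - the 1MB default
print(growth_events(20, target, percent=10))    # 76
print(growth_events(20, target, fixed_mb=512))  # 52
```

Every one of those growth events is a chance for the new chunk to land somewhere else on the disk, so the 1MB default hands you tens of thousands of opportunities to fragment.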

If you have a 50MB database file, then write some other files to the drive, then add enough data that the database needs to grow, you’re going to end up with a 50MB chunk of database data on the disk, a bunch of unrelated files next to it, and then, next to those, another chunk of database data. You see a single 51MB file, but at the disk level there’s one 50MB chunk, some random file data, then another 1MB chunk. This is called fragmentation, and it means that for the hard drive to pull data out of the database it needs to move a little needle (if you’re using old spinning disks, like us. What about SSDs? Glad you asked.) a greater distance, and more often, to get at the data you want. This slows things down.

What the Frag!?

Where I work, I recently found that the database had been set up with unlimited growth in 1MB increments. We went from a near-zero-sized database at initialisation to what is currently a 26GB file, and each time new data pushed the file to full, MSSQL would only ever add 1MB to its size.

This, combined with the use of a network drive which has been configured on the same disk, has given us a horribly fragmented database!

There’s a Sysinternals tool called contig.exe which lets you query an individual file or a whole filesystem and see how fragmented it is.

Following this guide I queried the database to see how fragmented it was. Keep in mind that the author of that guide made a 3.5GB test database and purposely fragmented it 2000 times to show a “heavily fragmented” example.

Here are the results from my test.

PS C:\temp\contig> .\Contig.exe -a "E:\MS SQL SERVER 2012\MSSQL11.SIMS2012\MSSQL\DATA\SIMS.mdf"

Contig v1.8 - Contig
Copyright (C) 2001-2016 Mark Russinovich

E:\MS SQL SERVER 2012\MSSQL11.SIMS2012\MSSQL\DATA\SIMS.mdf is in 102951 fragments

     Number of files processed:      1
     Number unsuccessfully procesed: 0
     Average fragmentation       : 102951 frags/file

Yeah. 102951 fragments for a single file. That’s insane. Every time the database is queried it likely needs to navigate around the disks dozens of times to get all the relevant data, slowing things down considerably.

Fixing Fragmentation

We can use the contig.exe tool to fix this. It requires the database to be offline, so I had to wait until this weekend to do it.

I took the database offline (via the MSSQL Server Management Studio GUI) and attempted to defrag the database file.

PS E:\MS SQL SERVER 2012\MSSQL11.SIMS2012\MSSQL\DATA> C:\temp\contig\Contig.exe .\SIMS.mdf

Contig v1.8 - Contig
Copyright (C) 2001-2016 Mark Russinovich

     Number of files processed:      1
     Number of files defragmented:   1
     Number unsuccessfully procesed: 0
     Average fragmentation before: 102951 frags/file
     Average fragmentation after : 3 frags/file

After waiting anxiously for 40 minutes the results of the operation came through. We went from a database split into 102951 fragments to a database split into just 3.

I revived the database by bringing it back online, verified it was working, then carried on with my weekend. All in all, it only took me about an hour and a half on a Saturday to sort this out.


Before I did the work over the weekend, I ran some rudimentary speed tests on the SIMS application to determine whether the change actually had any effect.

I performed two tests, three times each. They wouldn’t stand up in a court of law, but they’re good enough for my purposes.

I timed how long it took to log in – from hitting ‘Enter’ on the username/password window to the logged-in GUI finishing loading – and I also ran a report on our on-roll students containing their forename, surname, year and registration group code. A simple report.

Both of these tests were performed at approx 11am on a weekday. Nothing else was going on at the time (no large reports) but SIMS was actively being used by a few dozen staff. The results are in seconds.

Before defrag – Friday

Logging in

  • 16.7
  • 7.1
  • 9.9

Running the report

  • 18.6
  • 15.7
  • 18.7

After defrag – Monday

Logging in

  • 10.6
  • 5.7
  • 5.8

Running the report

  • 14.1
  • 11.9
  • 14.3

As you can see, the defrag has likely (assuming no other unknown reason for the initially slow results) shaved a number of seconds off each run. The first login of the day is always the slowest, as some stuff gets cached on your machine, but even that was sped up (by over 6 seconds!)
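Crunching the averages on those timings (just the numbers from the tables above):

```python
# Timings in seconds, copied from the tables above.
before = {"login": [16.7, 7.1, 9.9], "report": [18.6, 15.7, 18.7]}
after = {"login": [10.6, 5.7, 5.8], "report": [14.1, 11.9, 14.3]}

def mean(xs):
    return sum(xs) / len(xs)

for test in ("login", "report"):
    b, a = mean(before[test]), mean(after[test])
    print(f"{test}: {b:.1f}s -> {a:.1f}s ({(1 - a / b) * 100:.0f}% faster)")
    # login:  11.2s -> 7.4s  (34% faster)
    # report: 17.7s -> 13.4s (24% faster)
```

That works out at roughly a 34% reduction for logging in and 24% for the report.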

Overall, whilst neither conclusive nor scientifically sound, I am happy with this approximate 30% speed boost and have begun looking into our other databases to see if they’re suffering from the same fragmentation.

I would advise you to take a look at your SIMS database, too. Hell, any database you have. If you’re a database guru you likely know about this already and will laugh at my realisation, but it was genuinely surprising news to me, though obvious in hindsight. I knew about file fragmentation – I just didn’t think about it in the context of automatic database growth.

Being able to say “I’ve increased the speed of one of our most critical applications by 24-40%” sure feels good. Though it should never have been set up this way in the first place…

Behind the Times? Git Gud

I’ve recently decided to learn Git.

Yes, yes, I know: I’m over a decade late to the party. I haven’t looked at source control since I first played with SVN many (many) moons ago. I haven’t bothered for a few reasons. Mainly, I’ve not had a use for it. Though I have written some scripts for work and whatnot, I’ve not needed the collaborative advantages of the tool, nor have I really needed the version control side.

Don’t get me wrong, it probably would have been useful; I just haven’t missed it or wanted the features it boasts until recently.

However, times are a-changin’. Some of the techs at work have started using my scripts over the last year, and they’re beginning to identify issues or quirks which I would ignore or simply didn’t encounter. I wrote these scripts, so I know how to use them almost instinctively. Those issues just don’t rear their head for me, or when they do it’s sort-of by design and I don’t hesitate to work around them.

Since I’m now making small changes sporadically, and looking ahead I’m beginning to automate even more things now that I occasionally have a sliver of free time, I’ve decided to jump head first into Git.

I’ve built an Ubuntu 18.04 VM at work and installed GitLab onto it. (Slight tangent, but their installation guides are very good.)

The Continuous Integration and Continuous Delivery stuff fascinates me, but I’ve disabled them from running on each project by default as I need to focus on learning Git first. I’m eager to learn more about this, though.

I’m going to make you cringe, but I’ve opted to use a GUI front end on my machine instead of relying on the CLI. This is because it seems to be a bit of a pain in the ass to run Git from the command line on a Windows machine (we’re a Windows network) and, although I will learn the commands eventually, I want to focus on the best practices of using Git rather than messing about with command-line syntax. The syntax is very simple, but I don’t trust my brain to remember it right now.

I’ve chosen to use the GitHub GUI for now, and it works pretty well. I’ve moved six of my currently active scripts (all PowerShell) onto GitLab and have pushed commits to the projects.

I’ve also created a project for our network switch configs. I don’t know if this is something GitLab can do or if some other kind of automatic deployment technology is needed, but it’d be cool to make a change to the repository and have that change automagically applied to the relevant hardware. I can think of ways to script that myself, but is there a purpose-built tool out there?

I’ve got lots of questions to ask and lots of avenues to explore. For now, though, I’m keeping it simple with version control & branching.

I’m considering eventually creating a public GitHub repo to put some code out into the world. It would take some work to de-work-ify the existing scripts and remove any workplace data, but I could eventually upload those, too.

Boiler High Heat

We had a bit of a warm spell recently. One side effect for us was that some of our site’s boiler control systems started to panic in an unexpected way.

Each boiler room is fitted with a fire alarm sensor which links into the site-wide system. Unfortunately, it got so hot in a couple of the storage-heater boiler rooms that the temperature sensors started flipping out, thinking there were flames causing the heat. Luckily, we caught the alarm within the three-minute silent-alarm window, before the emergency services were automatically contacted and the site-wide alarm sounded, prompting an evacuation.

Overriding a fire alarm long-term is not a good idea, so we had to find a way to reduce the temperature in these small, cramped, dusty rooms.

Our solution was simple, but I liked it a lot – turn off the heating element, then run around the building turning on all the hot water taps.

This refilled the hot water storage tanks with cold water, and the running taps carried the heat away with them; within minutes the rooms had cooled significantly. It is a bit of a waste of water, which is a shame, and there was no hot water in the building for the afternoon, but that beats evacuating our cool offices into the high-thirties-Celsius outdoors.

The simple fixes are the best.