Golang Database Migration Tool
I have created a database migration tool modelled on the migration tool in PHP's Yii2 framework. It's a command-line tool that makes it easy to create and run migrations, whether run manually or as part of an automated process.
Once set up, migrations can be created with ./main create new_user_table
and then run with ./main up
More about the tool can be found on the GitHub page.
Backlight not working in Kubuntu 21.04
It seems (at least for me) that a fresh install of Kubuntu (and probably Ubuntu) doesn't recognise the backlight hardware or the brightness keypresses on Lenovo X1 Carbon laptops (mine is a gen 5).
xev doesn't register the brightness up (XF86MonBrightnessUp) or brightness down (XF86MonBrightnessDown) keystrokes, and an attempt to change the brightness manually with xbacklight -dec 10 returns the error No outputs have backlight property.
However, simply creating the following file with these contents resolved both issues:
/etc/X11/xorg.conf.d/20-intel.conf
Section "Device"
Identifier "card0"
Driver "intel"
Option "Backlight" "intel_backlight"
BusID "PCI:0:2:0"
EndSection
After that you'll need to log out and back in to restart X. Hopefully this helps the next person.
Crypto Watcher with Python
I've had an urge for a while to create something new with Python, so I created this small tool that retrieves your crypto balances from the BTC Markets exchange and calculates their value. It's pretty simple but was fun to build.
It works by using your BTC Markets API keys to retrieve the balances of the various currencies you're holding there, then using the last sell price of each to determine its current value.
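The core of the calculation is simple. Here's a minimal sketch of the idea in Python with made-up balances and prices; the real tool fetches these figures from the BTC Markets API using your keys, which isn't shown here:

# Sketch of the valuation logic only, with hypothetical figures.
# The real tool retrieves balances and last sell prices from BTC Markets.
balances = {"BTC": 0.05, "ETH": 1.2, "AUD": 250.00}    # coin -> amount held
last_prices = {"BTC": 60000.00, "ETH": 4000.00}        # coin -> last sell price in AUD

total_aud = balances.get("AUD", 0.0)
for coin, amount in balances.items():
    if coin == "AUD":
        continue
    value = amount * last_prices[coin]
    print(f"{coin}: {amount} @ {last_prices[coin]:.2f} = {value:.2f} AUD")
    total_aud += value

print(f"Total value: {total_aud:.2f} AUD")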

The repo can be downloaded from GitHub.
A Working Example of SQL Injection
I recently started a small project to create a PHP-based web page that is deliberately vulnerable to SQL injection, to better understand how a site can be compromised and what someone can do once they've exploited the vulnerability. SQL injection is possible when a software developer doesn't properly handle data sent by a user's browser through a form or in the URL. By running this example you will learn that it is quite easy to gain shell access to a server when data is handled poorly.
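The flaw itself is language-agnostic. The project is PHP/MySQL, but here's a small illustrative sketch in Python (using an in-memory SQLite table and a made-up login lookup) of how concatenating user input into a query changes its meaning, and how a parameterised query avoids it:

import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

user_input = "' OR '1'='1"  # attacker-supplied value from a form or URL parameter

# Vulnerable: the input is concatenated straight into the SQL string,
# so the injected quote rewrites the query's logic and returns every row.
query = "SELECT * FROM users WHERE username = '" + user_input + "'"
print(conn.execute(query).fetchall())

# Safe: a parameterised query treats the input purely as data.
query = "SELECT * FROM users WHERE username = ?"
print(conn.execute(query, (user_input,)).fetchall())  # returns no rows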
The project, which can be forked on GitHub, steps you through setting up and running a virtual machine, abusing the SQL vulnerability and eventually gaining shell access. Once the vulnerability has been found, it takes only five steps to get a shell.
There are several examples of what else can be done along the way, and it's really quite simple. If you're interested in web application security I suggest giving it a go; it shouldn't take more than an hour to get through.
View the project on GitHub.

CORS Debug Page
I have been building RESTful APIs for years in many teams, and on every project we seem to encounter a CORS issue. CORS is an acronym for Cross-Origin Resource Sharing, and it often comes into play with RESTful APIs and single-page applications built with frontend frameworks such as AngularJS or React.
CORS is a mechanism your browser uses to prevent pages from making requests to a location they don't have permission to access. The browser tests this permission by first sending an OPTIONS (preflight) request to the server and checking for the special CORS headers in the response. If the headers are present and valid for the domain the application is hosted on, the browser then makes the actual request to the API.
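You can simulate a preflight yourself and inspect the headers the server returns. A minimal sketch using Python's requests library (the endpoint and origin below are placeholders):

import requests

# Send a preflight request the way a browser would before a cross-origin GET.
resp = requests.options(
    "https://api.example.com/v1/users",          # placeholder endpoint
    headers={
        "Origin": "https://app.example.com",     # placeholder frontend origin
        "Access-Control-Request-Method": "GET",
    },
)

print(resp.status_code)
print(resp.headers.get("Access-Control-Allow-Origin"))   # should match the origin (or be *)
print(resp.headers.get("Access-Control-Allow-Methods"))  # should include GET
print(resp.headers.get("Access-Control-Allow-Headers"))

If those Access-Control-* headers are missing or don't match the requesting origin, the browser will refuse to make the real request.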
However, if the API is not configured properly it may not respond with valid CORS headers, preventing the application from working. Sometimes the frontend web application is at fault instead, and it can be difficult to get to the root of the problem.
The API developer may use a tool such as Postman to test the API as they build it, but Postman doesn't make cross-origin requests, so any CORS issue on a given endpoint can go unnoticed.
The CORS Debug Page is a stand-alone page that a developer can use to isolate the problem by proving whether the API succeeds or fails with a cross-origin request. Load the page in your browser, fill in the required fields including the endpoint in question, and hit submit.
To better understand what is happening, open the network tab in your browser's developer tools and watch the requests and their responses. If the request succeeds, the web app is probably at fault; if it fails, the API is the likely culprit.
Download or fork the project on GitHub.

AWS CloudFormation
As I’m building more complex applications to be hosted on AWS, it’s becoming tedious to manually configure the same thing over and over again. This is especially the case when a project requires multiple environments such as production, staging and development.
For this, Amazon provides an automation tool named CloudFormation, which lets you script your architecture to make orchestration easier and less repetitive. A template can script almost anything that can be done manually through the web console, but commonly includes things such as the following (a minimal sketch appears after the list):
- EC2 image settings such as type, size and startup scripts
- Load balancer configuration
- Security groups
- Databases (along with users and passwords)
- S3 buckets
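To give a sense of the format, here is a minimal, illustrative YAML template covering a couple of these resource types; the names and values are invented and it's a sketch rather than a production-ready template:

AWSTemplateFormatVersion: '2010-09-09'
Description: Example stack with an S3 bucket and a web security group
Parameters:
  EnvName:
    Type: String
    Default: staging
Resources:
  AssetsBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: !Sub 'myproject-assets-${EnvName}'
  WebSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow inbound HTTP and HTTPS
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 80
          ToPort: 80
          CidrIp: 0.0.0.0/0
        - IpProtocol: tcp
          FromPort: 443
          ToPort: 443
          CidrIp: 0.0.0.0/0

A stack built from a template like this can be created through the console or with the aws cloudformation CLI, and the same template can be reused for each environment by changing the parameters.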
Like any infrastructure-as-code language or configuration management tool, CloudFormation templates are great for ensuring things are built exactly as intended without forgetting anything, and they can also be reused for other projects.
The learning curve is a little steep, but there are plenty of examples online to learn from, and knowing how to build things through the console makes the transition to CloudFormation straightforward.
I think that if you're building systems on AWS regularly, it's worth getting your head around CloudFormation as it will streamline your work and help ensure the integrity of your systems.
Value in Gaining AWS Certification
I've been building systems with Linux for 20 years now, but things have really changed as the cloud has evolved. Most of my experience with creating hosting solutions was to host web applications that I was working on, either personal or commercial.
Over this time I have moved from requesting that a data centre provision a new server for me, and then waiting days, to opening a web console and creating new servers on the fly with a few clicks. Technology has advanced so much that Amazon, Google and Microsoft now all provide cutting-edge cloud environments to host software applications.
I have been using Amazon's AWS for over four years but had failed to learn a lot of what it offers. This is partly because web applications only require a small subset of services, and partly because Amazon keeps adding new ones. In order to stay current and to demonstrate my skill at building environments on AWS, I thought it was a good idea to get certified. Studying for the exam reinforced what I already knew, but I also learnt about the things I hadn't had the time or need to explore.
It was a worthwhile experience and can give potential employers and clients the comfort that I know what I’m doing.
Complex Document Management System Build
A new client, whose job is to audit water management systems, approached me to build a complex document management system to automate an existing process that required a lot of manual labour. The audits occur at varying intervals and result in written reports that need to be kept for future reference, and notifications need to be sent when the system determines that a report is late. In point form, the major requirements of the new system were:
- Allow a new site (such as a building) to be added/updated with the following attributes:
  - List of auditing companies for the given site
  - Report types and their required intervals
  - Email credentials
- Include an interface to create a report scanner, which will be used to determine the report type by scanning an incoming report (PDF, XLS or DOC file) and scraping metadata such as the report date and report type. This has to be done for each report type of each auditing company.
- Automatically download from the mail server any audit reports emailed by any number of reporters for the given site (e.g. a building)
- Scan the report using the report scanner, give it a filename that includes the relevant metadata and move it to a CDN.
- Have a reporting page for each site showing the latest reports and whether they were received on time, late or are overdue.
- Have another server hosting a Nextcloud instance that makes the reports available to users through their preferred Nextcloud client (web browser or mobile app), so any facility manager can easily see and search for reports for the locations they are responsible for.
- Include a cronjob to look for any outstanding reports and send notifications to the facility manager for the given location.
Taking in all the requirements, I determined AWS was the best platform as I could easily host both the application and Nextcloud, use S3 for the CDN, use SNS for sending notifications and place the servers behind load balancers.
Ansible was used to provision the local development environments as well as the EC2 instances on AWS for the application and Nextcloud servers, all of which run Debian.
I chose the Yii2 framework to build the application for several reasons, but mainly because I could easily scaffold each form and add the validation and business logic required. A custom component was written to retrieve new documents from the email server and run them through the scanner app to extract their metadata. The documents were named according to their metadata and then moved across to an S3 bucket; if a document's type could not be determined, it was copied to a folder to be fixed manually. A cronjob runs daily to scan the S3 bucket for report types, check them against each location's report interval settings to determine whether any reports are overdue, and send notifications if any are found.
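The overdue check itself comes down to simple date arithmetic. Here's a rough sketch of the idea in Python; the real implementation is a Yii2/PHP console command, and the report types, intervals and dates below are invented:

from datetime import date, timedelta

# For each report type at a location: the required interval and the date of
# the most recent report found in the bucket (values invented for illustration).
report_intervals = {"water-quality": 30, "valve-test": 90}   # days between reports
latest_reports = {"water-quality": date(2018, 1, 3), "valve-test": date(2017, 9, 20)}

today = date.today()
for report_type, interval_days in report_intervals.items():
    last_seen = latest_reports.get(report_type)
    due_by = last_seen + timedelta(days=interval_days) if last_seen else None
    if due_by is None or due_by < today:
        # In the real system this is where a notification is sent to the
        # facility manager for the location.
        print(f"{report_type} is overdue (was due by {due_by})")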
The client is really happy with the application as it has taken away the manual work and now notifies them when reports are late, which wasn't available before.
Easily Generating New Ansible Playbooks with a Python Script
Because the team is continually starting new projects with different stack requirements, we decided to build a Python script that reads a configuration file specifying the OS (CentOS, Debian or Ubuntu), webserver (Apache or Nginx), database (MySQL or MariaDB) and PHP version (set to the latest 7.x). The file also contains the project hostname (local dev name only), IP address and local output path.
When the script is run, it creates a full Ansible project in the output path with the correct playbooks for the chosen stack. The same playbooks can be used to provision remote servers such as AWS EC2 instances. It really helps in getting a project started quickly.
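The idea boils down to reading the config and copying only the playbooks that match the chosen stack into the output path. A rough sketch of that idea is below; the actual config keys, file format and playbook layout in the repo may well differ from this:

import json
import shutil
from pathlib import Path

# Hypothetical config format - the real project's keys and layout may differ.
# e.g. {"os": "debian", "webserver": "nginx", "database": "mariadb",
#       "php": "7.4", "hostname": "myproject.local",
#       "ip": "192.168.33.10", "output_path": "../myproject-ansible"}
config = json.loads(Path("project.json").read_text())

output = Path(config["output_path"])
output.mkdir(parents=True, exist_ok=True)

# Copy only the roles that match the chosen stack.
for role in ("common", config["webserver"], config["database"], "php"):
    shutil.copytree(Path("roles") / role, output / "roles" / role, dirs_exist_ok=True)

# Write the local dev hostname and IP into an inventory file.
(output / "hosts").write_text(f"{config['hostname']} ansible_host={config['ip']}\n")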
The project can be found here: https://github.com/doublehops/ansible-builder
Migrating From Rackspace to Amazon AWS
Over the Christmas break I took on the task of migrating my websites from Rackspace Opencloud to Amazon's AWS. There were several reasons for doing so, but the main one was the ever-increasing number of AWS services I'm using through work that I want to include in my own projects; I feel I've been missing out. Also, the more my head is in the AWS ecosystem, the more I'll learn and be able to pass on to my clients.
As my projects (roughly 7, including this blog) are all rather small, I host them all on the one server instance. This could have been a nightmare to migrate, but fortunately I had scripted everything with Ansible, making the process fast and straightforward. I first had to tweak my scripts to use PHP7, as I had not yet upgraded my Rackspace instance. To start, I created an EC2 instance running Debian Jessie, updated .ssh/config with the right credentials and ensured that I could SSH in to the new server. Once I'd verified that all was OK, I ran the Ansible script over the new server, which automatically installed:
- Required services such as Nginx, MariaDB, PHP7 and miscellaneous tools such as htop, git, vim, etc.
- All the Nginx host records
- Any Basic Auth protection I had created for some hosts and paths
- Each database and database users for each project (I’m not using RDS for these small projects)
- Cronjobs and associated scripts that the projects require, including the onsite backups
- A second user that only has privileges to retrieve the backups for offsite storage
From there, it was a simple matter of using mysqldump to export all databases from the Rackspace server, scp them to the new server and import them. I then zipped up the web root directories and scp'd them across to the new server as well. Lastly, I moved the SSL certificates across. Before long I had a fully functioning server, created from scratch, that included all sites, their data and full backups. I updated the DNS records to point at the new IP address and I was done.
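For reference, the export and import steps were roughly the following shell commands; the hostnames, users and paths are placeholders, so adjust them to your own setup:

# On the old Rackspace server: dump all databases and archive the web roots.
mysqldump -u root -p --all-databases > all-databases.sql
tar czf webroots.tar.gz /var/www

# Copy both across to the new EC2 instance (hostname is a placeholder).
scp all-databases.sql webroots.tar.gz admin@new-aws-server:~/

# On the new server: import the databases and unpack the web roots.
mysql -u root -p < all-databases.sql
sudo tar xzf webroots.tar.gz -C /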