
Crypto Watcher with Python

0 Comments | This entry was posted on Jul 14 2019

I’ve had an urge for a while to create something new with Python so I created this small tool that retrieves and calculates your crypto balances from BTC Markets crypto exchange. It’s pretty simple but was fun to build.

It works by using your BTC Markets API keys to retrieve your balances for the various currencies you hold there, then using the last sell price of each to determine the current value.
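The core calculation is simple. Here is a rough sketch of the idea in Python; the data structures are illustrative and not the exact shape of the BTC Markets API responses:

```python
# Sketch of the balance calculation. The dict shapes here are illustrative;
# the real tool builds them from the BTC Markets API responses.

def portfolio_value(balances, last_prices):
    """Sum the AUD value of each holding using its last sell price.

    balances:    dict of currency code -> amount held
    last_prices: dict of currency code -> last trade price in AUD
    """
    total = 0.0
    for currency, amount in balances.items():
        if currency == "AUD":
            total += amount  # cash needs no conversion
        else:
            total += amount * last_prices[currency]
    return total
```

For example, holding 0.5 BTC at a last price of 12,000 AUD plus 100 AUD in cash values the portfolio at 6,100 AUD.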

The repo can be downloaded from GitHub.

A Working Example of SQL Injection

0 Comments | This entry was posted on May 10 2019

I started a small project recently: a PHP-based web page that is vulnerable to SQL injection, built to better understand how a site can be compromised and what someone can do once they've exploited the vulnerability. SQL injection is possible when a software developer doesn't properly handle data sent from a user's browser through a form or in the URL. By running this example you will learn that it is quite easy to gain shell access to a server when data is handled poorly.
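To see the flaw in miniature, here is an illustrative Python/SQLite sketch (the project itself is PHP) of a vulnerable query next to its parameterised fix:

```python
import sqlite3

# Minimal demonstration of the flaw. This is illustrative, not the project's
# PHP code, but the vulnerable pattern is the same in any language.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_vulnerable(name, password):
    # User input is concatenated straight into the query -- injectable.
    query = f"SELECT * FROM users WHERE name = '{name}' AND password = '{password}'"
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # Parameterised query: the driver handles escaping the values.
    return conn.execute(
        "SELECT * FROM users WHERE name = ? AND password = ?", (name, password)
    ).fetchall()

# The classic payload bypasses the password check entirely:
rows = login_vulnerable("alice", "' OR '1'='1")
```

The payload turns the WHERE clause into `name = 'alice' AND password = '' OR '1'='1'`, which is always true, so the attacker is "logged in" without knowing the password. The parameterised version returns no rows for the same input.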

The project, which can be forked on GitHub, steps you through setting up and running a virtual machine, abusing the SQL vulnerability and eventually gaining shell access. Once a vulnerability has been found, it only takes five steps to gain a shell.

There are several examples of what can be done, and you're walked through gaining shell access. It's really quite simple. If you're interested in web application security I suggest giving it a go; it shouldn't take more than an hour to get through.

View the project on GitHub.

CORS Debug Page

0 Comments | This entry was posted on May 01 2019

I have been building RESTful APIs for years in many teams, and with every project we seem to encounter a CORS issue. CORS is an acronym for Cross-Origin Resource Sharing, and it often comes into play with RESTful APIs and single-page applications built with frontend frameworks such as AngularJS or React.

CORS is a mechanism your browser uses to prevent requests to a location it doesn't have permission to access. The browser tests the permission by first sending an OPTIONS preflight request to the server and checking the special CORS headers in the response. If the headers are present and valid for the domain the application is hosted on, the app will then make the actual request to the API.
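The check the browser performs can be sketched in a few lines of Python. This is a simplification (real browsers also consider request headers, credentials and preflight caching), but the header names are the actual CORS ones:

```python
# Simplified model of the browser's preflight decision. Real browsers also
# check Access-Control-Allow-Headers, credentials and Access-Control-Max-Age.

def preflight_allowed(response_headers, origin, method):
    """Return True if the preflight response permits `origin` to make
    a request with `method`."""
    allow_origin = response_headers.get("Access-Control-Allow-Origin")
    allow_methods = response_headers.get("Access-Control-Allow-Methods", "")

    origin_ok = allow_origin in ("*", origin)
    method_ok = method.upper() in [m.strip().upper() for m in allow_methods.split(",")]
    return origin_ok and method_ok
```

If either check fails, the browser blocks the real request and the web app sees a CORS error.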

However, if the API is not configured properly it may not respond with valid CORS headers, preventing the application from working. Sometimes the frontend web application is at fault instead. It can be difficult to get to the root of the problem.

API developers may use a tool such as Postman to test their API as they build it, but Postman does not make cross-origin requests, so any CORS issue on a given endpoint may go unnoticed.

The CORS Debug Page is a stand-alone page a developer can use to isolate the problem by proving whether the API works with a cross-origin request. Load the page in your browser, fill in the required fields, including the endpoint in question, and hit submit.

To better understand what is happening, open the network panel in your browser's developer tools and watch the requests and their responses. If the request succeeds, the web app is probably at fault; if it fails, the API is likely at fault.

Download or fork the project on GitHub.

AWS CloudFormation

0 Comments | This entry was posted on Apr 17 2019

As I’m building more complex applications to be hosted on AWS, it’s becoming tedious to manually configure the same thing over and over again. This is especially the case when a project requires multiple environments such as production, staging and development.

For this, Amazon provides an automation tool named CloudFormation, which scripts the architecture to make orchestration easier and less repetitive. You can script almost anything that can be done manually through the web console, commonly including things such as:

  • EC2 image settings such as type, size and startup scripts
  • Load balancer configuration
  • Security groups
  • Databases (along with users and passwords)
  • S3 buckets

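As a taste, here is a minimal CloudFormation template covering two of the items above; the resource names and bucket name are placeholders:

```yaml
AWSTemplateFormatVersion: '2010-09-09'
Description: Minimal example - an S3 bucket and a security group for web traffic.

Resources:
  AssetsBucket:
    Type: AWS::S3::Bucket
    Properties:
      BucketName: my-project-assets   # bucket names are global; change this

  WebSecurityGroup:
    Type: AWS::EC2::SecurityGroup
    Properties:
      GroupDescription: Allow inbound HTTP and HTTPS
      SecurityGroupIngress:
        - IpProtocol: tcp
          FromPort: 80
          ToPort: 80
          CidrIp: 0.0.0.0/0
        - IpProtocol: tcp
          FromPort: 443
          ToPort: 443
          CidrIp: 0.0.0.0/0
```

Running the same template against production, staging and development stacks gives you three identical environments with no manual clicking.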
Like any infrastructure-as-code language or configuration management tool, CloudFormation scripts are a great way to ensure things are built exactly as intended without forgetting anything, and they can also be reused for other projects.

The learning curve is a little steep, but there are a lot of examples online to learn from, and knowing how to build things using the console makes the transition to CloudFormation straightforward.

I think that if you're building systems on AWS regularly, it's worth getting your head around CloudFormation, as it will streamline your work and ensure the integrity of your systems.

Value in Gaining AWS Certification

0 Comments | This entry was posted on Feb 20 2019

I’ve been building systems with Linux for 20 years now but things have really changed since the cloud has evolved. Most of my experience with creating hosting solutions was to host web applications that I was working on, either personal or commercial.

Over this time I have moved away from asking a data centre to provision a new server for me, then waiting days, to opening a web console and creating new servers immediately with a few clicks. Technology has advanced so much that Amazon, Google and Microsoft all now provide cutting-edge cloud environments for hosting software applications.

I have been using Amazon's AWS for over four years but had failed to learn much of what it offers. This is partly because web applications only require a small subset of services, and partly because Amazon keeps adding new ones. To stay current, and to demonstrate my skill at building environments on AWS, I thought it was a good idea to get certified. Studying for the exam reinforced what I already knew, and I also learnt about the things I hadn't had the time or need to explore.

It was a worthwhile experience, and it gives potential employers and clients the comfort of knowing that I know what I'm doing.

Complex Document Management System Build

0 Comments | This entry was posted on Jun 21 2018

A new client whose job is to audit water management systems approached me to build a complex document management system, replacing an existing process that required a lot of manual labour. Audits occur at varying intervals and result in written reports that need to be kept for future reference, and notifications need to be sent when the system determines a report is late. In point form, the major requirements of the new system were:

  • Allow a new site (such as a building) to be added/updated with the following attributes:
    • List of auditing companies for the given site
    • Report types and their required intervals
    • Email credentials
  • Include an interface to create a report scanner, which determines the report type by scanning an incoming report (PDF, XLS or DOC file) and scraping metadata such as the report date and report type. This has to be done for each report type of each auditing company.
  • Automatically download from the mail server any audits emailed by any number of reporters for the given site (e.g. a building)
  • Scan the report using the report scanner, give it a filename that includes the relevant metadata and move it to a CDN.
  • Have a reporting page for each site to show the latest reports and whether they were received on time, late or overdue.
  • Have another server hosting a Nextcloud service that makes the reports available to users through their preferred Nextcloud client (web browser or mobile app), so any facility manager can easily see and search for reports for the locations they are responsible for.
  • Include a cronjob to look for any outstanding reports and send notifications to the facility manager for the given location.

Taking in all the requirements, I determined AWS was the best platform to host the application, as I could easily host the application and Nextcloud, use S3 for the CDN, use SNS for sending notifications and place the servers behind load balancers.

Ansible was used to provision the local development environments as well as the EC2 instances on AWS for the application servers and Nextcloud servers which were all run on Debian.

I chose the Yii2 framework to build the application for several reasons, but mainly because I could easily scaffold each form and add the validation and business logic required. A custom component was written to retrieve new documents from the email server and run them through the scanner app. Each document is named according to its metadata and then moved across to an S3 bucket; if a document's type is not recognised, it is copied to a folder to be fixed manually. A cronjob runs daily to scan the S3 folder for report types, checks them against each location's report interval settings to determine whether any reports are overdue, and sends notifications if any are found.
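The overdue check at the heart of that cronjob can be sketched like this; the report type names and data shapes are made up for illustration (the real component is PHP inside the Yii2 app):

```python
from datetime import date, timedelta

# Illustrative sketch of the daily overdue check. Report type names and
# dict shapes are hypothetical, not the application's actual schema.

def overdue_reports(last_received, intervals, today):
    """Return report types whose latest report is older than its interval.

    last_received: dict of report type -> date of most recent report (or absent)
    intervals:     dict of report type -> allowed interval in days
    """
    overdue = []
    for report_type, interval_days in intervals.items():
        last = last_received.get(report_type)
        # Never received, or the gap exceeds the allowed interval -> overdue.
        if last is None or (today - last) > timedelta(days=interval_days):
            overdue.append(report_type)
    return overdue
```

Anything this function returns triggers an SNS notification to the facility manager for that location.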

The client is really happy with the application, as it has removed manual work and notifies them when reports are late, which wasn't possible before.

Easily Generating New Ansible Playbooks with a Python Script

0 Comments | This entry was posted on Nov 16 2017

Due to the team continually starting new projects with different stack requirements, we decided to build a Python script that reads a configuration file specifying which OS (CentOS, Debian or Ubuntu), webserver (Apache or Nginx), database (MySQL or MariaDB) and PHP version (set to the latest 7.x) to use. The file also contains the project hostname (local dev name only), IP address and local output path.

When the script is run, it creates a full Ansible project in the output path with the correct playbooks for the chosen stack. The same scripts can be used to provision remote servers such as AWS EC2 instances. It really helps in getting a project started quickly.
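The idea can be sketched in a few lines of Python; the configuration format and role names here are illustrative rather than ansible-builder's exact layout:

```python
import json

# Illustrative sketch of the approach: validate a stack config and decide
# which playbook roles to copy into the output path. The config format and
# role names are hypothetical, not ansible-builder's actual ones.

SUPPORTED = {
    "os": {"centos", "debian", "ubuntu"},
    "webserver": {"apache", "nginx"},
    "database": {"mysql", "mariadb"},
}

def select_roles(config):
    """Validate a stack config and return the Ansible roles to include."""
    for key, allowed in SUPPORTED.items():
        if config[key] not in allowed:
            raise ValueError(f"unsupported {key}: {config[key]}")
    # PHP is always included; the version defaults to the latest 7.x.
    return [config["os"], config["webserver"], config["database"], "php7"]

config = json.loads('{"os": "debian", "webserver": "nginx", "database": "mariadb"}')
```

From there the script only has to copy the matching role templates into the output directory and render an inventory with the hostname and IP address from the same file.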

The project can be found here: https://github.com/doublehops/ansible-builder

Migrating From Rackspace to Amazon AWS

0 Comments | This entry was posted on Jan 05 2017

Over the Christmas break I took on the task of migrating my websites from Rackspace OpenCloud to Amazon's AWS. There were several reasons for doing so, but the main one was the ever-increasing number of AWS services I'm using through work that I want to include in my own projects; I feel I've been missing out. Also, the more my head is in the AWS ecosystem, the more I'll learn and be able to pass on to my clients.

As my projects (roughly seven, including this blog) are all rather small, I host them all on one server instance. This could have been a nightmare to migrate, but fortunately I had scripted everything with Ansible, making the process fast and straightforward. I first had to tweak my scripts to use PHP 7, as I had not yet upgraded my Rackspace instance. To start, I created an EC2 instance running Debian Jessie, updated .ssh/config with the right credentials and ensured that I could SSH in to the new server. Once I verified that all was OK, I ran the Ansible script against the new server, which automatically installed:

  • Required services such as Nginx, MariaDB and PHP 7, plus miscellaneous tools such as htop, git, vim, etc.
  • All the Nginx host records
  • Any Basic Auth protection I had created for some hosts and paths
  • The database and database users for each project (I'm not using RDS for these small projects)
  • Cronjobs and associated scripts that the projects require, including the onsite backups
  • A second user that only has privileges to retrieve the backups to store offsite

From there, it was a simple matter of using mysqldump to export all databases from the Rackspace server, SCP them to the new server and import them. I then zipped up the web root directory and SCP'd it across to the new server as well. Lastly came the SSL certificates, which I also needed to move across. Before long I had a fully functioning server created from scratch that included all sites, their data and full backups. I updated the DNS records to point to the new IP address and I was done.

Offensive Security Training

0 Comments | This entry was posted on Sep 11 2016

I've always been interested in computer security, and although it's something I consciously think about when building web applications, it's not something I've ever given solid time to solely focus on. However, over the last three months I spent all my spare time in the evenings and on weekends working through the Offensive Security certification, which is taught by the developers of the Kali Linux distro.

The certification takes a very hands-on approach to learning how to compromise computer systems. Along with a guide that goes into detail on many of the ways vulnerabilities can be found and exploited, you are given access to a VPN with about two dozen vulnerable machines where you can explore and hone your skills. You start out by scanning networks and profiling each server: its operating system, its open ports and the applications (and their versions) running on it. I found it can be a tedious exercise but very interesting at the same time. SQL injection was fun, though maybe because I've played around with it before and already had a good understanding of how it works. The buffer overflow exploits, although tough, were made much easier than I would have guessed by the tools available today, which make each attempt quite transparent.

You quickly learn to write your own scripts to automate things you find yourself repeating; as a result, the course improved my skills in both Python and Bash. These scripts were mostly for things like scanning a network for webservers or for servers with open MySQL ports. I found sqlmap a useful tool, as it takes the tedious guesswork out of finding applications that don't properly escape user data before running it through an SQL query. The certification introduces the student to many useful tools, ranging from discovery through to exploit execution.
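A typical helper of the kind described is a simple TCP port check; a minimal Python sketch follows (illustrative only, and of course you should only scan hosts you're authorised to test):

```python
import socket

# Check whether a single TCP port on a host accepts connections.
# A real scan loop would call this over a range of hosts and ports
# (e.g. 80/443 for webservers, 3306 for MySQL).

def port_open(host, port, timeout=1.0):
    """Return True if a TCP connection to (host, port) succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Connection refused, timed out or unreachable.
        return False
```

Wrapping this in a loop over a subnet reproduces the basic discovery step that tools like nmap perform far more thoroughly.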

It was an exhausting exercise to take outside of my day job but very rewarding. I learned a lot about a topic that has fascinated me for over 20 years and I can use these new skills to build and test that the applications and environments that I build are as secure as possible. In 2017, I will start working with IoT devices and build the APIs that they will communicate with. These devices will need to be secure and not become part of the growing botnets that we read about. Keeping on top of security issues is an ongoing task that I’m glad to be a part of.

Presentation on Building an API with Yii2 at PHP Melbourne

0 Comments | This entry was posted on Mar 22 2016

Last week I gave a presentation at the Melbourne PHP usergroup, phpMelb. The presentation was a live demonstration of how someone would go about creating an API with Yii2 from a clean install. I went through the steps of creating a migration and building the model with Gii, then followed the Yii2 guide on how to turn a controller into a RESTful API controller. The steps are very straightforward and very quick.

I discussed using the Chrome extension Postman to create and submit the payload and view the results. I showed how to add behaviours to the models and controllers, but could only touch on authentication because of time constraints. I also demonstrated that with the advanced template you can have a backend, which I again built with Gii, to show how easy it is to create a web-based admin section for the API/site.

You can view the presentation here: https://doublehops.com/presentations/yii2-api-presentation.