
Setting Up Development Environments With Vagrant and Ansible

0 Comments | This entry was posted on Feb 19 2014

One of the reasons I love running Linux on my main laptop/workstation is that it gives me an ideal environment for developing web projects. However, software development has been moving away from this model, towards an approach I have since grown to love: running your dev environments in virtual machines.

Instead of running Apache (or Nginx), MySQL and PHP natively on my dev machine, I have found it's now easier to set up and run dev environments in virtual machines configured specifically for a given project, which can be automated through server management scripts. Initially this sounds like additional work, and it is, but it has several advantages:

  • Custom environments for each project
  • Easily deployable for other developers in your team
  • No environment setup knowledge required from other team members
  • Scripts can be reused for staging and production environments
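
These advantages come from the whole environment being described in code. As a rough sketch (the box name, IP address and playbook path here are my own assumptions, not from a real project), a Vagrantfile that wires in Ansible provisioning looks something like this:

```ruby
# Vagrantfile sketch: a Debian 7 guest provisioned by Ansible.
Vagrant.configure("2") do |config|
  config.vm.box = "debian/wheezy64"                 # assumed Debian 7 base box
  config.vm.network "private_network", ip: "192.168.33.10"

  # The project directory is synced to /vagrant inside the guest by
  # default, so you keep editing files in your IDE on the host.
  config.vm.provision "ansible" do |ansible|
    ansible.playbook = "provisioning/site.yml"      # assumed path
  end
end
```

With something like this checked into the project, a teammate only needs to run vagrant up to get an identical environment.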

What are Vagrant and Ansible?

Vagrant is software that allows you to easily build reproducible development environments for various operating systems. It runs on top of virtual machine platforms such as VirtualBox and, among other things, creates a synced folder that is accessible from your local file system, allowing you to use your IDE as you normally would without needing to transfer files to the machine.

Ansible, like Puppet or Chef, is a server configuration management tool. However, its learning curve is much gentler and it doesn't require any agent software on the remote servers: it configures the hosts over SSH.

By combining Vagrant with Ansible, developers on any common operating system can have a working development environment within minutes, without manually configuring anything to suit their own machine.

I have created a Vagrant/Ansible setup script, which can be found on GitHub. It configures a development virtual machine with the latest versions of Nginx, MariaDB and PHP installed on Debian 7.
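
The provisioning side is just a YAML playbook. The snippet below is only an illustrative sketch in current Ansible syntax; the package names and play layout are my assumptions, not a copy of the script on GitHub:

```yaml
# Illustrative playbook: install the web stack on the Vagrant guest.
- hosts: all
  become: yes
  tasks:
    - name: Install Nginx, MariaDB and PHP-FPM
      apt:
        name: [nginx, mariadb-server, php5-fpm, php5-mysql]
        state: present
        update_cache: yes

    - name: Make sure the services are running
      service:
        name: "{{ item }}"
        state: started
      loop: [nginx, mysql, php5-fpm]
```

The same playbook can later be pointed at a staging or production host, which is where the reusability advantage comes in.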

I think it’s worthwhile for any development team to investigate using virtual machines like this, especially where complex environments are required.

New AFL Websites Built with Yii

1 Comment | This entry was posted on May 14 2013

I was tasked with building the new AFL websites Mark of the Year and Goal of the Year for the 2013 season for the Australian Football League. After gathering the requirements, I was excited to find that Yii would be perfect for building these sites.

I love to sink my teeth into a project by starting with the database design. The requirements were clear and not too extensive, so the design didn’t take long. From there I was quickly able to build the models, views and controllers with Yii’s great scaffolding tool, Gii.

From this point I could start on the basic page layout, add the business rules to the models and controllers, and tweak the views to meet the required functionality. Once the site was fully functional I could add all the design elements to match the page layouts and styles supplied by the client.

The tricky part was integrating Telstra’s proprietary video embedding code into the site.

It was a great experience and really highlighted how easy and fun it is to build functional websites with Yii.

Websites:

www.markoftheyear.afl.com.au
www.goaloftheyear.afl.com.au


NGINX config for CakePHP 1.3 (& PHP 5.4)

1 Comment | This entry was posted on May 05 2013

This afternoon I set up a virtual host in NGINX for a CakePHP 1.3.x project in readiness for starting work with a new client tomorrow. However, once I had a configuration that looked correct, CakePHP would complain that friendly URLs were not set up correctly. I am running PHP 5.4.14 on my laptop and CakePHP 1.3 for the site, as this is what the current project runs.

There seem to be no examples on the web of how to get these two versions running together, so here is the configuration I got working, for anyone else who’s stuck:

server {
    listen 80;
    server_name cakephp;
    root /var/www/cakephp/app/webroot/;

    access_log /var/log/nginx/cakephp/access.log;
    error_log /var/log/nginx/cakephp/error.log;

    location / {
        index index.php index.html index.htm;

        if (-f $request_filename) {
            break;
        }

        if (-d $request_filename) {
            break;
        }

        rewrite ^(.+)$ /index.php?url=$1 last;
    }

    location ~ .*\.php[345]?$ {
        fastcgi_pass unix:/var/run/php5-fpm.sock;
        fastcgi_index index.php;
        fastcgi_param SCRIPT_FILENAME /var/www/cakephp/app/webroot$fastcgi_script_name;

        include fastcgi_params;
    }
}
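
As a side note, on current Nginx versions the two if blocks can usually be collapsed into a single try_files directive. This is an untested sketch of the equivalent location block, not the config I ran:

```nginx
location / {
    index index.php index.html index.htm;
    # Serve the file or directory if it exists, otherwise hand the
    # request to CakePHP's front controller, as the rewrite above does.
    try_files $uri $uri/ /index.php?url=$uri;
}
```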

Preparing for Massive Load with OpenStack

0 Comments | This entry was posted on Feb 25 2013

In November last year I again updated and hosted the website for the NAB Australian Football League Draft Tracker and flew up to the Gold Coast to sit in on the event to ensure it all went well. The website (http://drafttracker.afl.com.au/) was built as a live tracker so the public can watch the picks as they happen.

It was designed to be lightweight for both server and browser: clients pulled in all site assets on the initial page load and then only tiny JSON files every ten seconds to check for updates [1]. When the admin added drafted players as they happened, PHP updated records in the database and wrote out static JSON files for clients to pull down and update the page. NGINX was used as the webserver. All this allowed the server to handle the busy night with minimal effort.

However, the trade period weeks earlier showed that online interest in the AFL had lifted significantly and that I should prepare further to ensure load would not be a problem. As I host on Rackspace with their OpenStack cloud hosting, I was able to take advantage of their versatile system to easily build a solution for the potential demand. I created an image of the server and made four new server instances from it to become slaves. I then modified the site so that updates on the master would copy (via rsync) any changed JSON files to the slaves. Next I created a load balancer within the same interface with a few clicks, added the four slaves behind it, and finally pointed the domain name at the load balancer’s IP address. Another benefit of this design was that the site administrator could make updates from an instance experiencing no load, unhindered by traffic.
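
The master-to-slave push is simple to script. The sketch below shows the idea; the hostnames and paths are placeholders I've made up, and the commands are echoed rather than executed so it is safe to run as-is:

```shell
#!/bin/sh
# Push any changed JSON files from the master to each slave.
# Remove the leading "echo" to perform the real transfer.
SLAVES="slave1 slave2 slave3 slave4"
for host in $SLAVES; do
    echo rsync -az --delete /var/www/draft/json/ "deploy@${host}:/var/www/draft/json/"
done
```

Hooked into the admin’s update step, something like this keeps every slave serving identical static files within moments of a change.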

The draft went for about 1.5 hours and saw 100,000 unique visitors, each of which would poll the site every ten seconds; if everyone were connected at once, that would be in the order of 10,000 requests per second. Monitoring the servers showed that my solution was complete overkill and that one server alone was probably enough. But it’s better to plan for the worst, and it was a great experience putting the solution together.

COSTING

Each of the four slaves ran with 512MB of memory at $0.03 per hour, or $0.15 per hour in total including the master. The load balancer costs a base of $0.015 per hour, scaling up with the number of connections. So the 1.5 hours of the event itself cost well under a dollar; I did have the set-up running for quite a few days beforehand, but even so the overall cost was negligible.

[1] The site was designed before the days of NodeJS and WebSockets, which were not a solution for older browsers anyway.

Gillette AFL Trade Tracker

0 Comments | This entry was posted on Oct 16 2012

My most recent appointment required me to build a CMS and front-end for the Australian Football League for the trade period. The CMS was built to allow editors to add news items, trades and free agency movements between the 18 clubs. The front-end was to display the inserted items while allowing the end-user to filter them according to given rules. Again I chose Yii to build this, as it’s a great framework for rapid development but also robust and a pleasure to work with.

After designing the database I started building the models, views and controllers, then modified the forms to create an easy to use and intuitive CMS. For the main news feed section, the front-end results could be filtered by club, date and result type, e.g. trades only or general comments. These filters work together for fine control over the results shown. As each filter is used, the results are returned and populated by AJAX requests, with filters being cleared by selecting Live Feed. The challenging part here was deciding how to make the filters work together in the browser; I ended up building the URL that would be passed in the AJAX request. Sessions could have worked too, but they are a problem with load balancing and caching, as I’ll point out later.

The second view was a breakdown of trades in and trades out by club. The results for this view were pulled from the same data as the main feed, to save repetition when adding content. With the filters again loading via AJAX, this came together quickly. I’m impressed by the way Yii lets you reload content for partial views with just a few extra lines of code, writing the jQuery for you.

The third view shows the players that fans most want traded. This data is pulled from another website, trademachine.afl.com.au, where the results are user generated. I could build this view quickly too by connecting a second database, which is easy to do in Yii.

The site went live on October 1 and demand was a lot greater than I was expecting. This resulted in the server becoming overwhelmed and some slow or failed page loads. Being a little unprepared, I quickly made new instances of the server and put them all behind a load balancer to meet demand. Cloning servers and putting them behind a load balancer couldn’t be easier than with Rackspace; this was quick and saved me a lot of pain early on. I then spent some time adding and fine-tuning the built-in caching that Yii provides. I had not used caching in Yii before, and I was surprised at how easy and effective it is. Although the live feed content is only cached for 60 seconds, the resources being used on the server dropped dramatically.

This is an example of adding caching to a given part of the site with Yii:

if ($this->beginCache('main-content', array(
    'duration' => 60,
    'varyByParam' => array('filter', 'club', 'dateRange'),
))) {
    $this->renderPartial('_entryList', array(
        'dataProvider' => $dataProvider,
    ));
    $this->endCache();
}


This caches the view for 60 seconds, and the varyByParam parameter tells the cache to take the GET variables filter, club and dateRange into account, ensuring each unique request is cached and returned as expected. This is essential, as the view has a single URL but its content changes depending on which GET variables are supplied. If I had used sessions to keep track of which filters the browser had selected, it would have failed behind the cache and the load balancer, so sessions were not an option here.
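
For reference, fragment caching in Yii 1.x depends on a cache application component being configured. A minimal file-based example in protected/config/main.php looks like this (any of Yii’s cache backends, e.g. CMemCache, can be swapped in):

```php
// protected/config/main.php (excerpt): the 'cache' component that
// beginCache()/endCache() use. CFileCache stores cache entries on
// disk under protected/runtime.
return array(
    // ...
    'components' => array(
        'cache' => array(
            'class' => 'system.caching.CFileCache',
        ),
    ),
);
```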

Overall this was a fun project that required me to provide a solution for an event I have a lot of interest in. The result is an easy to use CMS with a great user experience on the front-end as well.


Defcon 2012

0 Comments | This entry was posted on Aug 15 2012

Last month I was one of 15,000 people who attended the Defcon computer security convention in Las Vegas. It was a fantastic four-day event, with presenters talking about their findings and projects on all things security.

Upon paying the $200 entry fee we were given the badge required for entry. This year’s badge was electronic, and a puzzle of sorts: through onboard lights and a light sensor, the badges would communicate with each other as they passed by. Via a USB port we were also encouraged to program hacks so that they behaved differently.

Defcon 2012 Badge


One of the most interesting events is Capture The Flag, where teams are set against each other to hack into their opponents’ servers and capture so-called flags. Each team hardens its own servers before beginning to attack the others. From what I could gather they do this non-stop throughout the event, and the team that has gathered the most flags is deemed the winner.

My highlights were sitting in on talks by Kevin Mitnick on social engineering and Kevin Poulsen discussing the exploits he got up to in his past. Having read books by both presenters, I was keen to see what they had to say.

I would love to attend again next year. Anyone feel like sponsoring my trip?

OSCON 2012

0 Comments | This entry was posted on Jul 26 2012

For some years now I’ve been inspired to travel to the United States to attend the Open Source Convention (OSCON) in Portland. I hoped to learn what new open source tools and resources developers from around the world are using to get their work done.

This year I made the journey and it was well worth it. About 3000 people attended over the five days and they are all so passionate about open source software. Most are developers but all are working with open source software in one way or another. Everyone is very willing to share their skills and experience.

A main focus of the conference was OpenStack (http://www.openstack.org/), an open source alternative to Amazon’s cloud services and the primary thing I hoped to learn about when leaving Melbourne. OpenStack is being embraced by many businesses, and the founders from NASA have moved on to build their own businesses using OpenStack technologies. As some speakers discussed, there is still a lot of work to do before OpenStack has all the features required to be a complete cloud services platform, but it’s looking very promising.

I also got a lot out of talks about PHP, Vim, Twitter’s Bootstrap and system performance tuning.

I also met lots of interesting people. Sitting down to lunch I found myself next to Sebastian Bergmann, who created PHPUnit, and on another day next to OpenStack founder Josh McKenty. I also met some Ubuntu community members and some of the people behind MySQL (and MariaDB), Linode, Rackspace and many more.

Everyone is pushing the open source movement in the same direction: forward. It was a fantastic event and I hope to attend next year. However, tomorrow is day one of Defcon, which I’m very excited about.

runQuery: A New Tool To Query MySQL Databases

0 Comments | This entry was posted on May 07 2012

Recently I have been working on sites that don’t allow SSH access, and I always find installing phpMyAdmin overkill and unnecessary for my needs. So I wrote a single-file script that you upload to your webserver, log in to the database with, and immediately start writing queries.

It allows you to write complex queries with joins or just simple inserts, updates or deletes and returns the result set quickly with the number of rows found or affected on a page that’s clear and easy to read.


In the future I would like to add support for PostgreSQL and the ability to add or modify data through a form.
The code is available to download from GitHub. Please try it and let me know what you think.

Why Use A No-SQL Database Like Redis

0 Comments | This entry was posted on Dec 01 2011

For the last three months I have been working on different websites I inherited that rely heavily on Redis. Redis is a NoSQL database that stores data as key/value pairs, but it does not give you the flexibility to write queries like you’re used to with relational databases such as MySQL and Postgres.

Interestingly, these sites also use MySQL. I had never had a chance to use or learn about NoSQL databases before, and the idea of using two types of databases for one application sounded like a convoluted and unnecessary solution. However, the more I use Redis (especially in these applications) the more I love it.

Redis is used in these applications for caching. When a request is made, rather than PHP sending queries to MySQL, it requests the data directly from Redis, which serves it straight from RAM. This results in much faster response times and requires fewer resources from the server.

As there are already some good tutorials on what Redis is and how to use it, I will show you some great tricks I found through my travels that I did not see in the general documentation. Start the Redis client (redis-cli) and try the following two tricks:

List All Keys In The Database:

keys *

Show Variable Type:

type <variable-name>

Monitor Queries:
Monitor the queries being sent to Redis by using telnet to connect to Redis on the port it’s running on (default 6379) and typing monitor (redis-cli accepts the same monitor command). This is very helpful when the queries are being sent from an application and you need to see exactly what they are.

Running Multiple Instances Of Redis:
As I am running multiple applications that require Redis, I needed to learn how to run multiple instances of it. This is because you don’t define separate databases like you would with MySQL, for example: there are no logins and no clear way to separate data between applications. An excellent description of how to run multiple instances of Redis can be found at chrislaskey.com.
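
The short version of that approach: each instance gets its own config file with a distinct port, pidfile and data directory, and you start redis-server once per file. The values below are illustrative:

```
# /etc/redis/redis-6380.conf -- config for a second, separate instance
port 6380
pidfile /var/run/redis-6380.pid
dir /var/lib/redis-6380
daemonize yes
```

Start it with redis-server /etc/redis/redis-6380.conf and point the second application at port 6380 (redis-cli -p 6380 to inspect it).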

Start Learning:
To get a better understanding of Redis I recommend using the online practical guide found at try.redis-db.com. This guide explains what different types of variables are available and how to access them.

Conclusion:
Redis has shown me what’s available in NoSQL databases and that relational databases are not always the answer. I can see that as I use Redis more in my own projects I will find it useful for other purposes. One possibility is storing variables that I might previously have put into sessions.

Yii Framework Issue With Nginx And SSL

0 Comments | This entry was posted on Oct 09 2011

During the process of moving my websites from one host to another, and from Apache to Nginx, I came across an issue that had me frustrated yet intrigued. I have a Yii application that I use for my business and run over SSL. This worked fine under Apache, and for the most part under Nginx. The problem only appeared in Nginx when a form was posted: the browser would be redirected from HTTPS to HTTP. Why?

After some Google searching, investigation of the Yii core framework and some trial and error, I found that Yii relies on what I determined to be a non-standard server variable, ‘HTTPS’, with a value of ‘on’. Nginx does not set this variable, so on a form post Yii would assume the browser was not in HTTPS mode.

The core Yii method that determined the protocol is:

public function getIsSecureConnection()
{
    return isset($_SERVER['HTTPS']) && !strcasecmp($_SERVER['HTTPS'],'on');
}

This can be overcome by adding a parameter to the Nginx virtual host, but I see that as a workaround rather than a real solution. I thought a simple change from the non-standard ‘HTTPS’ variable to ‘SERVER_PORT’ (checking that it’s set to 443), which is sent by both Apache and Nginx, would be a better solution. I have sent a bug report off to the Yii project with my suggestion, so maybe we’ll see it in a future release.

Update: 18 October 2011
My suggested fix was added to the Yii code base but was then reverted, because a site running on port 443 is not necessarily running over SSL. There is no standard header that browsers send when running over SSL, so a non-ideal solution must be used. I believe the best solution is to add the HTTPS parameter to the Nginx config as suggested.
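
For anyone hitting the same issue, the workaround amounts to a one-line fastcgi_param in the SSL virtual host; the server name and socket path below are illustrative:

```nginx
server {
    listen 443 ssl;
    server_name example.com;

    location ~ \.php$ {
        include fastcgi_params;
        # Mimic Apache's behaviour so Yii's getIsSecureConnection()
        # sees $_SERVER['HTTPS'] == 'on'
        fastcgi_param HTTPS on;
        fastcgi_pass unix:/var/run/php5-fpm.sock;
    }
}
```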