Posts Tagged ‘PHP’
A Working Example of SQL Injection
I started a small project recently to create a PHP-based web page that is vulnerable to SQL injection, to better understand how a site can be compromised and what someone can do once they've exploited the vulnerability. SQL injection is possible when a software developer doesn't properly handle data sent by a user's browser through a form or in the URL. By running this example you will learn that it is quite easy to gain shell access to a server when data is handled poorly.
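To make the class of bug concrete before diving into the project, here is a minimal, hypothetical sketch (not code from the project itself) of a vulnerable lookup next to a parameterised one; the table and column names are made up.

    <?php
    // Hypothetical login lookup; table and column names are invented for illustration.
    $pdo = new PDO('mysql:host=localhost;dbname=demo', 'demo', 'secret');

    // Vulnerable: user input is concatenated straight into the SQL string,
    // so an attacker can close the quote and append their own SQL.
    $unsafe = "SELECT id FROM users WHERE username = '" . $_POST['username'] . "'"
            . " AND password = '" . $_POST['password'] . "'";
    $result = $pdo->query($unsafe);

    // Safer: a prepared statement keeps the input as data, never as SQL.
    $stmt = $pdo->prepare('SELECT id FROM users WHERE username = ? AND password = ?');
    $stmt->execute([$_POST['username'], $_POST['password']]);
    $user = $stmt->fetch(PDO::FETCH_ASSOC);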
The project, which can be forked on Github, steps you through setting up and running a virtual machine, abusing the SQL vulnerability and eventually gaining shell access. Once a vulnerability has been found, it only takes five steps to gain shell access.
There are several examples of what can be done, and you're also walked through gaining shell access. It's really quite simple. If you're interested in web application security I suggest giving it a go. It shouldn't take more than an hour to get through it.
View the project on Github.

Complex Document Management System Build
A new client, whose job it is to audit water management systems, approached me to build a complex document management system to automate an existing process that required a lot of manual labour. The audits, which run at varying intervals, result in written reports that need to be kept for future reference, and notifications need to be sent when the system determines that a report is late. In point form, the major requirements of the new system were:
- Allow a new site (such as a building) to be added/updated with the following attributes:
- List of auditing companies for the given site
- Report types and their required intervals
- Email credentials
- Include an interface to create a report scanner, which will be used to determine the report type by scanning an incoming report (PDF, XLS or DOC file) and scraping metadata such as report date and report type. This has to be done for each report type for each auditing company.
- Automatically download from the mail server any audits emailed from any number of reporters for the given site (eg. building)
- Scan the report using the report scanner, give it a filename that includes relevant metadata and move it to a CDN.
- Have a reporting page for each site to show the latest reports and whether they were received on time, late or overdue.
- Have a separate server hosting Nextcloud to make the reports available through the user's preferred Nextcloud client (web browser or mobile app), so any facility manager could easily see and search for reports for a location they were responsible for.
- Include a cronjob to look for any outstanding reports and send notifications to the facility manager for the given location.
Taking in all requirements, I determined AWS was the best platform to host the application, as I could easily host the application and Nextcloud, use S3 for the CDN, use SNS for sending notifications and place the servers behind load balancers.
Ansible was used to provision the local development environments as well as the EC2 instances on AWS for the application and Nextcloud servers, which all ran Debian.
I chose the Yii2 framework to build the application for several reasons, but mainly because I could easily scaffold each form and add the validation and business logic required. A custom component was written to retrieve new documents from the email server, scan them and run the metadata through the scanner app. The documents were named according to their metadata and then moved across to an S3 bucket. If a document's type could not be determined, it was copied to a folder to be fixed manually. A cronjob runs daily to scan the S3 folder for report types, check them against the location's report interval settings to determine whether any reports are overdue, and send notifications if any are found.
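As a rough illustration only (not the client's actual code), a daily overdue check along those lines might look something like this with the AWS SDK for PHP; the bucket name, key layout, interval and topic ARN are all placeholders.

    <?php
    // Illustrative sketch of a daily overdue-report check. Bucket, key layout,
    // interval and topic ARN are placeholders; the real component is part of the Yii2 app.
    require 'vendor/autoload.php';

    use Aws\S3\S3Client;
    use Aws\Sns\SnsClient;

    $s3  = new S3Client(['region' => 'ap-southeast-2', 'version' => 'latest']);
    $sns = new SnsClient(['region' => 'ap-southeast-2', 'version' => 'latest']);

    // Assume each key encodes its metadata, e.g. "site-123/backflow/2017-05-01.pdf".
    $objects = $s3->listObjectsV2(['Bucket' => 'example-reports', 'Prefix' => 'site-123/']);

    $latestByType = [];
    foreach ($objects['Contents'] ?? [] as $object) {
        [, $type, $file] = explode('/', $object['Key']);
        $date = new DateTime(pathinfo($file, PATHINFO_FILENAME));
        if (!isset($latestByType[$type]) || $date > $latestByType[$type]) {
            $latestByType[$type] = $date;
        }
    }

    // Assume a 90-day interval here; the real value comes from the site's settings.
    foreach ($latestByType as $type => $lastReceived) {
        if ($lastReceived < new DateTime('-90 days')) {
            $sns->publish([
                'TopicArn' => 'arn:aws:sns:ap-southeast-2:000000000000:example-overdue-reports',
                'Message'  => "Report type '$type' for site 123 is overdue (last received {$lastReceived->format('Y-m-d')}).",
            ]);
        }
    }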
The client is really happy with the application, as it has removed the manual work and notifies them when reports are late, which wasn't possible before.
Easily Generating New Ansible Playbooks with a Python Script
Because the team is continually starting new projects with different stack requirements, we decided to build a Python script that reads a configuration file specifying the OS (CentOS, Debian or Ubuntu), webserver (Apache or Nginx), database (MySQL or MariaDB) and PHP version (set to the latest 7.x). The file also contains the project hostname (local dev name only), IP address and local output path.
When the script is run, it creates a full Ansible setup in the output path with the correct playbooks for the chosen stack. The same scripts can be used to provision remote servers such as AWS EC2 instances. It really helps in getting projects started quickly.
The project can be found here: https://github.com/doublehops/ansible-builder
Yii2: Second Generation Yii
I found enough time recently to finally look into Yii2. I decided that a good test project would be to build a cryptocurrency exchange tracker. It would download the latest prices of all markets from both Cryptsy and Mintpal and then display the data in charts so I could quickly scan trends across all currencies.
Yii2 and its dependencies can be installed and managed through Composer, which I enjoy. It prevents you from needing to keep any third-party packages in version control and makes installs, upgrades and deployments easier. The Yii2 documentation is again great and the community is already solid. Any questions I had that were Yii2 specific were answered on the forum in good time. Some things are different, and migrating projects from Yii 1.x to 2.x will take a lot of work. Yii2 uses namespaces, which means they need to be declared at the top of views and other files where that wasn't previously necessary. Getting instances of records is slightly different and was changed several times during Yii2's evolution; however, this is now stable.
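To give a feel for the differences, here is a small hypothetical Yii2 controller showing the namespace declarations and the newer ActiveRecord calls; the Market model and its columns are invented for the example.

    <?php
    // Hypothetical Yii2 controller: namespaces up top and the 2.x ActiveRecord API.
    namespace app\controllers;

    use yii\web\Controller;
    use app\models\Market;

    class MarketController extends Controller
    {
        public function actionView($id)
        {
            // Yii 1.x would use Market::model()->findByPk($id); Yii2 does this instead:
            $market = Market::findOne($id);

            // The query builder style for more involved lookups:
            $recent = Market::find()
                ->where(['exchange' => 'mintpal'])
                ->orderBy(['updated_at' => SORT_DESC])
                ->limit(20)
                ->all();

            return $this->render('view', ['market' => $market, 'recent' => $recent]);
        }
    }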
Many things are still the same. Migrations, scaffolding, commands and nearly everything else work as before. In my opinion, Yii2 is still the best PHP framework and I can't wait to start a production project with it. Yii2 is still in beta, but the code base has mostly settled with only bug fixes remaining. My next task is to incorporate AngularJS into my Yii projects.
Setting Up Development Environments With Vagrant and Ansible
One of the reasons I love running Linux on my main laptop/workstation is that I have an ideal environment for developing web projects. However, there have been many developments in software that move away from this model I have grown to love, towards running your dev environments in virtual machines.
Instead of running Apache (or Nginx), MySQL and PHP natively on my dev machine, I have found it's now easier to set up and run dev environments in virtual machines that are configured specifically for a given project, which can be automated through server management scripts. Initially this sounds like additional work, and it is, but it has several advantages:
- Custom environments for each project
- Easily deployable for other developers in your team
- Other team members need little knowledge of the stack to get a working environment.
- Scripts can be reused for staging and production environments.
What are Vagrant and Ansible:
Vagrant is software that allows you to easily build reproducible development environments for various operating systems. It runs on top of virtual machine platforms such as VirtualBox but, among other things, creates a synced folder that is accessible from your local file system, allowing you to use your IDE as you normally would without needing to transfer files to the machine.
Ansible, like Puppet or Chef, is a server configuration management tool. However, the learning curve is a lot gentler and it doesn't require any software running on the remote servers; it configures the hosts over SSH.
By combining Vagrant with Ansible, it's very easy to create development environments within minutes for developers running any common operating system, without them having to manually configure their dev environments to suit their operating system.
I have created a Vagrant/Ansible setup script which can be found on Github. It will configure a development virtual machine with the latest versions of Nginx, MariaDB and PHP installed on Debian 7.
I think it’s worthwhile for any development teams to investigate using virtual machines like this, especially where complex environments are required.
Preparing for Massive Load with OpenStack
In November last year I again updated and hosted the website for the NAB Australian Football League Draft Tracker and flew up to the Gold Coast to sit in on the event to ensure it all went well. The website (http://drafttracker.afl.com.au/) was built as a live tracker so the public can watch the picks as they happen.
It was designed to be lightweight for both server and browser: clients pulled in all site assets on the initial page load and after that only tiny JSON files every ten seconds to look for updates [1]. When the admin added drafted players as picks happened, records were updated in the database via PHP, which then wrote out static JSON files for clients to pull down and update the page. Nginx was used as the webserver. All this allowed the server to run with minimal effort during the busy night.
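The update-to-static-file pattern is simple enough to sketch; the following is an illustrative outline only (the table, columns and paths are placeholders, not the tracker's actual code).

    <?php
    // Illustrative outline of the "write static JSON on update" approach.
    // Table name, columns and paths are placeholders.
    $pdo = new PDO('mysql:host=localhost;dbname=draft', 'draft', 'secret');

    // Admin records a new pick...
    $stmt = $pdo->prepare('INSERT INTO picks (pick_number, player, club) VALUES (?, ?, ?)');
    $stmt->execute([$_POST['pick_number'], $_POST['player'], $_POST['club']]);

    // ...then the full pick list is re-exported as a static file that Nginx serves
    // directly, so the ten-second polling never touches PHP or MySQL.
    $picks = $pdo->query('SELECT pick_number, player, club FROM picks ORDER BY pick_number')
                 ->fetchAll(PDO::FETCH_ASSOC);

    $target = '/var/www/drafttracker/public/picks.json';
    file_put_contents($target . '.tmp', json_encode($picks));
    rename($target . '.tmp', $target); // atomic swap on the same filesystem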
However, the trade period weeks earlier had shown that online interest in the AFL had lifted significantly and that I should prepare further to ensure load would not be a problem. As I host on Rackspace with their OpenStack cloud hosting, I was able to take advantage of their versatile system to easily build a solution to meet potential demand. I created an image of the server and made four new server instances from it, which were to become slaves. I then modified the site so that updates on the master would copy (via rsync) any changed JSON files to the slaves. Then I created a load balancer within the same interface with a few clicks, added the four slaves behind it and finally pointed the domain name to the load balancer's IP address. Another benefit of this design was that the site administrator could make updates from an instance that was experiencing no load and was therefore unhindered by the traffic.
The draft went for about 1.5 hours and saw 100,000 unique visitors, each of whom would poll the site every ten seconds. Monitoring the servers showed that my solution was complete overkill and that one server alone would probably have been enough. But it's better to plan for the worst and it was a great experience putting the solution together.
Costing
Each of the four slaves ran with 512MB of memory, which costs $0.03 per hour, or $0.15 per hour in total including the master. The load balancer costs a base of $0.015 per hour and scales up with the number of connections. So for the 1.5 hours, the set-up would have cost just a few dollars. Of course I had it running for quite a few days beforehand, but overall the cost was negligible.
[1] The site was designed before the days of NodeJS, and websockets were not a solution for older browsers.
OSCON 2012
For some years now I've been inspired to travel to the United States to attend the Open Source Convention (OSCON) in Portland. I hoped to learn what new open source tools and resources developers from around the world are using to get their work done.
This year I made the journey and it was well worth it. About 3000 people attended over the five days and they are all so passionate about open source software. Most are developers but all are working with open source software in one way or another. Everyone is very willing to share their skills and experience.
A main focus of the conference was OpenStack (http://www.openstack.org/), which is an open source alternative to Amazon's cloud services and the primary thing I hoped to learn about when leaving Melbourne. OpenStack is being embraced by many businesses, and the founders from NASA have moved on to build their own businesses that use OpenStack technologies. As some speakers discussed, there is still a lot of work to do before OpenStack has all the features required to be a complete cloud services platform, but it's looking very promising.
I also got a lot out of talks about PHP, Vim, Twitter's Bootstrap and system performance tuning.
I also met lots of interesting people. Sitting down to lunch I found myself next to Sebastian Bergmann, who created PHPUnit, and on another day with OpenStack co-founder Josh McKenty. I also met some Ubuntu community members and some of the people behind MySQL (and MariaDB), Linode, Rackspace and many more.
Everyone is pushing the open source movement in the same direction: forward. It was a fantastic event and I hope to attend next year. However, tomorrow is day one of Defcon, which I'm very excited about.
runQuery: A New Tool To Query MySQL Databases
Recently I have been working on sites that don't allow SSH access, and I always find installing phpMyAdmin overkill and unnecessary for my needs. Therefore I wrote a single-file script that you upload to your webserver, log in to the database and immediately start writing queries.
It allows you to write complex queries with joins or just simple inserts, updates or deletes and returns the result set quickly with the number of rows found or affected on a page that’s clear and easy to read.
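The core idea behind such a tool is small; here is a rough sketch using mysqli (not the actual runQuery code) showing how a result set and an affected-row count can be reported differently.

    <?php
    // Rough sketch of the idea behind a single-file query runner (not the actual
    // runQuery code): run whatever SQL is submitted and report rows found/affected.
    $db = new mysqli('localhost', $_POST['user'], $_POST['pass'], $_POST['database']);
    if ($db->connect_error) {
        die('Connection failed: ' . htmlspecialchars($db->connect_error));
    }

    $result = $db->query($_POST['sql']);

    if ($result instanceof mysqli_result) {
        // SELECT, SHOW, DESCRIBE: render the rows and the count found.
        echo '<p>' . $result->num_rows . ' rows found</p><table>';
        while ($row = $result->fetch_assoc()) {
            echo '<tr><td>' . implode('</td><td>', array_map('htmlspecialchars', $row)) . '</td></tr>';
        }
        echo '</table>';
    } elseif ($result === true) {
        // INSERT, UPDATE, DELETE: report how many rows were affected.
        echo '<p>' . $db->affected_rows . ' rows affected</p>';
    } else {
        echo '<p>Error: ' . htmlspecialchars($db->error) . '</p>';
    }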
In the future I would like to add support for PostgreSQL and the ability to add or modify the data through a form.
The code is available to download from Github. Please try it and let me know what you think.
Why Use A NoSQL Database Like Redis
For the last three months I have been working on different websites I inherited that rely heavily on Redis. Redis is a NoSQL database that stores data as key/value pairs, but it does not give you the flexibility to write queries like you're used to with relational databases such as MySQL and Postgres.
Interestingly, these sites also use MySQL. I had never had a chance to use or learn about NoSQL databases before, and the idea of using two types of databases for one application sounded like a convoluted and unnecessary solution. However, the more I use Redis (especially in these applications) the more I love it.
Redis is used in these applications for caching. When a request is made, rather than PHP sending queries to MySQL, it requests the data directly from Redis, which serves it straight from RAM. This results in much faster response times and requires fewer resources from the server.
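As a generic illustration of that caching pattern (not code from those sites), a cache-aside lookup with the phpredis extension could look like this; the key name, table and TTL are placeholders.

    <?php
    // Generic cache-aside illustration (not code from those sites).
    // Key names, table and TTL are placeholders.
    $redis = new Redis();
    $redis->connect('127.0.0.1', 6379);

    $key = 'article:' . (int) $_GET['id'];

    // Try Redis first; the value is served straight from RAM.
    $cached = $redis->get($key);
    if ($cached !== false) {
        $article = json_decode($cached, true);
    } else {
        // Cache miss: fall back to MySQL, then prime the cache for five minutes.
        $pdo = new PDO('mysql:host=localhost;dbname=site', 'site', 'secret');
        $stmt = $pdo->prepare('SELECT id, title, body FROM articles WHERE id = ?');
        $stmt->execute([(int) $_GET['id']]);
        $article = $stmt->fetch(PDO::FETCH_ASSOC);

        $redis->setex($key, 300, json_encode($article));
    }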
As there are already some good tutorials on what Redis is and how to use it, I will show you some great tricks I found through my travels that I did not see in the general documentation. Start the Redis client (redis-cli) and try the following two tricks:
List All Keys In The Database:
keys *
Show Variable Type:
type <variable-name>
Monitor Queries:
Monitor the queries being sent to Redis by using telnet to connect to Redis on the port it's running on (default 6379) and typing monitor. This is very helpful when queries are being sent from an application and you need to see exactly what each query is.
Running Multiple Instances Of Redis:
As I am running multiple applications that require Redis, I needed to learn how to run multiple instances of Redis. This is because you don't define separate databases as you would with MySQL, for example; there are no logins and no way to clearly separate data between applications. An excellent description of how to run multiple instances of Redis can be found at chrislaskey.com.
Start Learning:
To get a better understanding of Redis I recommend the online practical guide found at try.redis-db.com. This guide explains what data types are available and how to access them.
Conclusion:
Redis has shown me what's available in NoSQL databases and that relational databases may not always be the answer. I can see that as I use Redis more in my own projects I will find it useful for other purposes. One possibility is storing values that I may have previously put into sessions.
Yii Framework Issue With Nginx And SSL
During the process of moving my websites from one host to another and from Apache to Nginx, I came across an issue that had me frustrated yet intrigued. I have a Yii application that I use for my business and that runs over SSL. This worked fine under Apache, and for the most part under Nginx. The problem only appeared under Nginx when a form was posted: the browser would be redirected from HTTPS to HTTP. Why was this?
After some Google searching, Yii core framework investigation and some trial and error, I found that Yii relies on what I determined is a non-standard server variable, 'HTTPS', with a value of 'on'. Nginx does not set this variable by default, so on a form post Yii would assume the browser was not in HTTPS mode.
The core Yii method that determines the protocol is:
    public function getIsSecureConnection()
    {
        return isset($_SERVER['HTTPS']) && !strcasecmp($_SERVER['HTTPS'], 'on');
    }
This can be overcome by adding a parameter to the Nginx virtual host record, but I see this as a workaround and not a real solution. I thought a simple change from the non-standard 'HTTPS' variable to 'SERVER_PORT' (checking that it's set to 443), which is set by both Apache and Nginx, would be a better solution. I have sent a bug report off to the Yii project with my suggestion, so maybe we'll see it in a future release.
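One way to apply that check within the application itself, without patching the framework, is to override the method in a custom request component. A rough Yii 1.x sketch follows (the class name is my own, and it assumes the standard skeleton autoloads protected/components):

    <?php
    // Rough Yii 1.x sketch: apply the SERVER_PORT check locally by overriding
    // the request component rather than patching the framework. Class name is mine.
    class SecureAwareHttpRequest extends CHttpRequest
    {
        public function getIsSecureConnection()
        {
            // Keep the original check, but also treat port 443 as HTTPS.
            return parent::getIsSecureConnection()
                || (isset($_SERVER['SERVER_PORT']) && (int) $_SERVER['SERVER_PORT'] === 443);
        }
    }

    // In protected/config/main.php:
    // 'components' => array(
    //     'request' => array('class' => 'SecureAwareHttpRequest'),
    // ),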
Update: 18 October 2011
My suggested fix was added to the Yii code base but was then reverted, because a site running on port 443 does not necessarily mean it's running over SSL. There is no standard header that browsers send when running over SSL, so a non-ideal solution must be used. I believe the best solution is to set the HTTPS variable in the Nginx config as suggested.