Validating HTML5 using Laravel and PHPUnit

Photo by: slavik_V.

Working as a Business Analyst in the Student Systems team at Flinders University means that I don’t get to work on programming code as much as I used to.

This is a good thing. It means I can exercise my ‘softer’ skills such as systems analysis, design, writing documentation and requirements gathering. I also get to write some truly epic SQL queries.

That said, I’ve also been working on a new web application using Laravel that I hope will make a positive impact on my team. I’m a firm believer in using unit testing to improve the quality of my code, and this post is about one such unit test.

I wanted to ensure that the HTML generated by the web application is valid according to the HTML5 standard. This means I needed a way to identify markup that is incorrect in some way, for example invalid nesting of tags or unclosed tags.

I started with the Markup Validation for PHPUnit extension written by Kevin Weber. This extension makes it easy to include an assertion in your unit tests that checks the validity of your HTML.

The extension in turn uses the (X)HTML5 Validator to do the actual validation. The one drawback of this, from my perspective, is that the HTML is sent to an external third party for validation. I wanted the validation to occur locally in my development environment.

This is how I achieved my goal of validating the HTML locally on my development machine using unit tests.

First, I wrote a custom connector class to use a local install of the validator like this:
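In outline, the connector looks something like this. This is a simplified sketch rather than the original class: the class name and the port are illustrative assumptions, and the real class plugs into the connector API of the PHPUnit extension, which is omitted here.

```php
<?php

use GuzzleHttp\Client;

// Sketch of a connector for a local instance of the Nu (X)HTML5 validator.
// The class name and the port number are illustrative assumptions.
class LocalValidatorConnector
{
    private $url;

    public function __construct($url = 'http://localhost:8888/')
    {
        $this->url = $url;
    }

    /**
     * Return the validation errors for the given HTML, if any.
     */
    public function validate($html)
    {
        // The Nu validator returns its report as JSON when out=json is set.
        $client = new Client();
        $response = $client->post($this->url . '?out=json', [
            'headers' => ['Content-Type' => 'text/html; charset=utf-8'],
            'body'    => $html,
        ]);

        $report = json_decode((string) $response->getBody(), true);

        // Keep only messages of type 'error'; warnings are ignored here.
        return array_filter($report['messages'], function ($message) {
            return $message['type'] === 'error';
        });
    }
}
```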

Second, I wrote a test case using code like this:
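Again, this is a sketch of the structure rather than the original code. It assumes the hypothetical connector above, and a Laravel version whose HTTP test helpers return a testable response from $this->get().

```php
<?php

use GuzzleHttp\Client;
use GuzzleHttp\Exception\GuzzleException;

class HtmlValidationTest extends TestCase
{
    const VALIDATOR_URL = 'http://localhost:8888/';

    /**
     * Check whether the local validator is running, so that tests can be
     * skipped gracefully when it isn't.
     */
    private function canValidate()
    {
        try {
            $client = new Client();
            $client->get(self::VALIDATOR_URL, ['timeout' => 2]);
            return true;
        } catch (GuzzleException $e) {
            return false;
        }
    }

    /**
     * Request a route from the Laravel application and validate the HTML
     * that comes back.
     */
    private function validateRoute($route)
    {
        $response = $this->get($route);
        $response->assertStatus(200);

        $connector = new LocalValidatorConnector();
        $errors = $connector->validate($response->getContent());

        $this->assertEmpty($errors, "Invalid HTML returned by route '{$route}'");
    }

    public function testBasicHtmlValid()
    {
        if (!$this->canValidate()) {
            $this->markTestSkipped('Local HTML validator is not available.');
        }

        // Create any test users and data the route needs here, then call it.
        $this->validateRoute('/');
    }
}
```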

The canValidate() function uses the Guzzle PHP HTTP client to see if the local validator is available. The value returned from this function determines whether the tests should be skipped. In this way, if the validator is not available, the other test cases can still complete successfully.

The validateRoute() function is responsible for making a request to the Laravel application and validating the response. The other functions in the class, such as testBasicHtmlValid(), set up the web application for testing, for example by creating test users and data before calling the required route.

Lastly, to use a local version of the validator application, download and unzip the latest release from the repository on GitHub. Once extracted, start a local instance using the following command:
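This assumes the vnu.jar distribution of the validator; the port is arbitrary, but it needs to match the URL that the connector and tests use.

```
java -cp vnu.jar nu.validator.servlet.Main 8888
```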

When the unit tests are run, either the HTML for each route that you need to test is validated, or the tests are skipped if the validator is not available.

I hope this post helps others who, like me, are looking for a way to validate the HTML that is returned by Laravel routes. If you have any thoughts about this approach, contact me on Twitter.

Today I crossed over to the dark side and joined Facebook

Photo by: tookapic.

Earlier today I joined Facebook. I feel like I’ve crossed over to the dark side. My Facebook profile is available here.

I’ve resisted joining Facebook for many years; in fact, I resisted joining social networks for more years than I care to count. I was happy with my blog and using Twitter. My Twitter profile is available here.

Earlier this year my friend Mark Drechsler convinced me to set up a LinkedIn profile, which is available here. That was the start of the slippery slope into social media.

After I completed the signup process I tweeted about it.

My friend Kahiwa Sebire wanted to know why, after all this time, I’ve succumbed to the lure of Facebook. Essentially it comes down to the following three reasons.

First, I’ve been noticing that there is more and more content that is exclusively available on the Facebook platform, which I’m missing out on, including the pages of some of my favourite podcasts.

Perhaps now that I have an account, I’ll be able to contribute to the vibrant communities around these podcasts and other content.

I’ve also noticed that some of the activities that my little girl is involved in are promoted on Facebook. I increasingly feel like I’m missing out on that aspect of her life.

Second, many of my friends use Facebook to communicate and keep in touch with each other. In the past I’ve been very bad at using email, and especially the phone, to keep in touch with friends. My hope is that through Facebook we can keep in contact more easily and more successfully.

This third thing is what tipped me over the edge.

My little girl, who is four, loves to talk to her Granny and Pa in Queensland on the phone. The trouble is that she hasn’t quite grasped the idea that they can hear her but can’t see her. So she spends time trying to show them her favourite toy, latest craft masterpiece, or Lego construction.

We’ve tried various video chat services in the past, such as Google Hangouts and Skype, with limited success. We even tried 3G video calling, but as they have an Android-based phone and I have an Apple iPhone, it wasn’t possible.

I want her to have a close relationship with her grandparents, and video calling seems like the best alternative to actually being with them.

Granny is already on Facebook, and is familiar with using Facebook to chat. My understanding is that Facebook video calling is super easy to use. My hope is that we can use this service where other services have failed.

So in essence the main reason for having an account, the thing that pushed me over the edge, is video calling between my little girl and her grandparents.

It’s an experiment, and I’ll post here on my blog about the results.

Two factor authentication, a grab for our mobile phone numbers?

Photo by: succo.

The other day I signed up for a new online service, which one isn’t relevant to this post.

When signing up for new online services I try to provide as little information as possible. This is because I am a firm believer in the tenet that ‘If you’re not paying for the product, you are the product’, as outlined in this post by Charles Stross.

I worry about what my personal information is being used for, beyond what is strictly required to provide me with the service that I’m signing up for. So when I signed up for this online service, I didn’t add my mobile phone number to my account information. It was an optional field on the account settings page, and isn’t required to provide the service.

Once my account was set up, I went looking for the two-factor authentication settings. I always enable two-factor authentication, also known as 2FA, because it improves the security of my account.

Using a username and password to access an online service is an example of one-factor authentication. You prove that you are the registered user by providing your username together with something that only you know: your password.

Two-factor authentication uses a second factor, which is something you have. Typically this is your mobile phone. The basic premise is that only you know your password, and only you have your mobile phone.

Asserting that your mobile phone is with you is achieved through one of two main means:

  1. Sending a code to your phone via SMS.
  2. Using an app on your phone to generate a code.

One of the most popular applications used to generate codes is Google Authenticator. There are others, including apps specific to online services, and popular password managers such as 1Password.
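As an aside, the codes these apps generate typically come from the time-based one-time password (TOTP) algorithm described in RFC 6238. A minimal sketch in PHP, assuming a raw binary shared secret (real apps use a Base32-encoded secret issued by the service), looks like this:

```php
<?php

// Sketch of TOTP (RFC 6238). The secret here is a hypothetical raw binary key.
function totp($secret, $timeStep = 30, $digits = 6)
{
    // Counter = number of time steps since the Unix epoch, as 8 bytes big-endian.
    $counter = pack('J', (int) floor(time() / $timeStep));

    // HMAC-SHA1 of the counter, keyed with the shared secret (RFC 4226).
    $hash = hash_hmac('sha1', $counter, $secret, true);

    // Dynamic truncation: the low nibble of the last byte picks an offset.
    $offset = ord($hash[19]) & 0x0F;
    $binary = ((ord($hash[$offset]) & 0x7F) << 24)
            | ((ord($hash[$offset + 1]) & 0xFF) << 16)
            | ((ord($hash[$offset + 2]) & 0xFF) << 8)
            | (ord($hash[$offset + 3]) & 0xFF);

    // Reduce to the requested number of decimal digits, zero-padded.
    return str_pad($binary % pow(10, $digits), $digits, '0', STR_PAD_LEFT);
}

echo totp('12345678901234567890'); // changes every 30 seconds
```

Note that nothing in this computation involves a phone number: the phone is ‘something you have’ purely because it holds the shared secret.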

To enable two-factor authentication for the online service I needed to provide my mobile phone number. The service used it to send me a code to confirm the phone was mine. Once they had done that, I was able to configure two-factor authentication using their app on my phone.

What bothered me was that I needed to set up SMS authentication first, before I could use their app on my phone. Additionally, I can’t delete my mobile phone number from the account, because doing so would turn off two-factor authentication.

There is no need for this service to have my mobile phone number, it isn’t required for two-factor authentication because I can use their app. It isn’t even required as a backup option, for example if the app breaks on my phone, or I replace my phone and keep the same number, as I have backup codes stored securely in my password manager.

In essence it is personal information that I was forced to provide to ostensibly enhance the security of my account, when it isn’t required. That annoys me, and got me thinking.

The rise of two-factor authentication is a good thing. Anything that increases the security of our online accounts is a step in the right direction, especially if it is easy to set up and maintain.

With my cynical hat on, which may or may not be made of tinfoil, I can’t help but wonder whether some online service providers view the collection of more personal information as more important than improving account security.

I’m interested to hear what you think, let me know via Twitter if you feel so inclined.

Reflections on open source, security and trust

Photo by: succo.

Last week I was listening to a tech news podcast. They were discussing the recent Patreon data breach. In light of the breach, and the need to change passwords, the topic of password managers came up. The two main benefits of password managers are that they:

  1. Make it easy to generate long complex passwords.
  2. Store the long complex passwords securely so that you don’t have to remember them.

The question was raised: how do you choose which password manager to use? In response, the guest that day suggested that using an open source application from GitHub would be best because ‘you can read the source code’ and can therefore know that it is OK.

This got me thinking. Just because, as a user, I can download the source code for an application, does that make it any more secure?

My career to date has involved writing web applications in a variety of languages, on a number of platforms, for many different use cases. Almost always I have used open source software, and in some cases I have released the code that I’ve written as open source as well.

While working at Blackboard, where I spent a little over two years, I needed to review the source code of Moodle plugins for inclusion in the Enterprise Moodle or Joule platforms. This was to ensure that the code contained in a plugin met the high standards necessary for an enterprise-class system. The security of the code was one aspect of the review.

Things get ‘interesting’ when a plugin written with the assumption that a typical course contains 10–100 enrolments is used in a course with 1000 enrolments. Especially if there is a need to keep page generation times to a minimum… but I digress.

One thing I have learnt in doing this type of work is that reviewing software code is a skill that is difficult to master. In my opinion, it is harder than writing code, and uses a different skill set. Additionally, code varies greatly in quality and clarity, which makes the task of reviewing it even harder.

I would go further and argue that reviewing code for security related issues is a specialisation that goes deeper than a standard code review. In short, writing code is one thing, reviewing it is another, and reviewing it for security related issues is an entirely different thing again.

I believe that the point the guest was making was an extension of Linus’s Law, which states:

given enough eyeballs, all bugs are shallow

The premise is that the more developers look at the code, the more bugs are found and resolved, and as a result the software improves. In the context of my reflections, it also means that the software is more secure. However, this may not be the case, as the Heartbleed and Shellshock bugs have shown: a security-related bug can exist in open source code for a significant amount of time.

Pondering this issue further, I came to the conclusion that open source software is not inherently more secure than proprietary software, just because the source code can be read by anyone.

This led me to wonder why open source software is considered more secure than proprietary software. After all, both types of software contain bugs, including security-related issues, and both are patched regularly.

The answer I’ve come up with is that open source operates on a different trust model than proprietary software. By virtue of being open, the developers of the software are implying that they have nothing to hide. By refusing to provide the source code, proprietary developers could be seen as having something to hide. Indeed, an argument could be made that they are practising security through obscurity, which never ends well.

Ultimately, whether the source is open or closed, there are still going to be bugs, some of which will put users’ security and privacy at risk.

Having gone through this thought exercise, I can summarise my conclusions as follows:

  1. Open source software is seen as more secure, because it is more trustworthy, as the source is available for all to see.
  2. Closed source software is seen as less secure, because it is not trustworthy, as the source is closed and access to it is restricted.

I believe this to be flawed logic. As I mentioned earlier, not every developer is going to be good at reviewing code, and not every developer undertaking code reviews is going to be good at spotting security issues.

Perhaps there is an opportunity here for developers of closed source software to commission an independent third-party code review of their proprietary systems and, more importantly, publish the results. Such reviews are taken into account by the EFF in their Secure Messaging Scorecard, for example.

This has been an interesting thought exercise, kicked off by a somewhat offhand comment by a contributor to a tech news podcast. I wonder what my next reflection will be about. I’m interested in hearing your thoughts, especially if you disagree with me. You can contact me via the various means listed on my about page.


Leaving Blackboard and going to Flinders University

Fork in the Road. By: Bs0u10e0.

Friday, October 9, 2015 was my last day at Blackboard. I started with Blackboard in July 2013, working in the Adelaide, Australia office¹.

I was employed as a Senior Software Engineer, and worked as part of the client development team. I supported clients in their use of the Enterprise Moodle platform. As the name suggests, this was an enterprise-class system built on top of the open source Moodle Learning Management System (LMS).

Earlier this year I made the transition to the product development team at Moodlerooms, which is also part of Blackboard, working as a software engineer on Joule. The Moodlerooms Joule platform is also built on top of Moodle. The focus of my role was on defect resolution and enhancements to the core product.

In the nearly two and a half years that I was at Blackboard I saw a lot of change, and grew both professionally and personally. I enjoyed working on open source code, and helping clients meet their needs through customisations to Moodle.

Reflecting on my career in the past few months, it became clear to me that I am happiest working in the higher education sector. The corporate culture necessary to sustain a commercial enterprise is not for me.

Additionally, I came to realise that I am not the typical software developer or software engineer. The aspect of my work that I find most fulfilling is working directly with users, whereas many people in this type of role prefer not to deal with end users, and possibly people in general².

To that end I am starting as a Business Analyst at Flinders University on Monday, October 19, 2015. I’ll be working in the Student Systems area, supporting users of the university’s Student Information System (SIS).

I’m looking forward, with great anticipation, to returning to the higher education sector, where I have worked for most of my career. I’m also excited about the new challenges that await me, and the new things that I will learn.

The topics that I will post about here on my blog will change, although I have a few posts rattling around in the back of my mind already.

I’m excited to see where this new change in direction for my career will take me.

¹ At the time I started, the company was known as NetSpot, which had been acquired by Blackboard in March 2012.

² I’m over-generalising in jest.

XXXG-00W0CV Wing Gundam Zero Honoo

Over the weekend I completed a new model, specifically the XXXG-00W0CV Wing Gundam Zero Honoo. More information about how this Gundam fits within the Gundam universe is available here.

Below is a series of photos of the model with the different armaments.

I’m really pleased with how this model turned out. The finish is improving as my sanding technique improves with practice. I’ve kept the panel lining on this model to a minimum, focusing mostly on exhaust and intake ports. That’s the one area where I think I need to improve the most.

The things that attracted me to this model were the translucent orange elements, as well as the really big swords.

SD-237S Star Winning Gundam

Another of the Gundam models that I’ve completed in the past few months. This one is the SD-237S Star Winning Gundam; more information about it is available here.

This is a much smaller model than, say, the AMX-107, and overall an easier build. I did some panel lining on this model as it fits within the overall style; on other, more highly detailed models my attempts at panel lining look too thick and heavy.

AMX-107 ‘Bawoo’

Below is a gallery of photos of my latest Gundam model, the AMX-107 ‘Bawoo’. More information about it is available here.

I’m still learning how to assemble Gundam models, after my friend rurisu hoshino introduced me to them earlier this year. With each model the finish is getting better.

Panel lining is still something that I’m not very good at, so I’m focusing on exhaust ports and air intakes. I’m also sanding the individual pieces to remove the nubs left when a piece is cut from the sprue. It takes a lot longer, but I think I get a better finish, and I don’t cut myself as often.

If I keep taking photos of my models, I’m going to need to come up with a better lighting rig. My lamp gave all of the photos a distinct yellow tinge.

Reflections on employability and higher education

Thinking outside the box quote

Photo by: kaboompics.

I’ve been reflecting for a few weeks on the ‘Employability, entrepreneurship and the future of Higher Education’ post by Mark Drechsler on his blog. There is one main point that I keep coming back to in my reflections.

I’m not convinced that I want higher education institutions teaching students to be ‘work ready’. In my reflections, I’ve defined ‘work ready’ as meeting the selection criteria outlined in job advertisements, for these are the most visible indicator of what employers require. The advertisements I’ve seen lately, in my area of software engineering and development, are very specific.

Not only are employers looking for prospective employees with experience in specific languages, they want experience in specific frameworks. For example, it isn’t enough to have experience with PHP; a candidate must also have experience with Laravel. Likewise, it isn’t sufficient to have experience with JavaScript; a candidate must also have experience with AngularJS.

The other type of listing includes such a wide array of technologies and skills that I don’t believe anyone has them all. Either way, my concern is that these are lists of very specific skills.

I don’t think higher education institutions, universities, should be in the business of imparting such specific skills and knowledge to students. Rather, they should be teaching higher-order skills, so that students can learn these specific skills for themselves. More importantly, students need to be prepared for a lifetime of learning such skills.

For the past two years I’ve been working on Moodle at Blackboard. I was fortunate that the managers at Blackboard valued my years of experience in the higher education sector over my lack of direct Moodle experience.

During this time I’ve used all of my experience and skills in understanding the Moodle code base, and delivering value to clients. As a software engineer the ‘soft skills’, such as communication and time management, are just as important as the technical skills. I’d also argue that for some positions, such as client development, the soft skills are even more important than the technical ones.

My point from all of this reflection and rambling, is this:

In my mind, someone working in information technology who has the capability to learn and build on their experiences including their softer skills, is more valuable to an organisation than one who has experience in a narrow skill set.

Exploring Vagrant and Chef for Testing Moodle with PostgreSQL

Elephant statue.
Photo by: byronv2.

As part of my work as a software engineer at Moodlerooms, part of Blackboard, I work on a variety of different tasks. Today I had to work on an issue relating to an SQL query used by a Moodle plugin that works fine in MySQL but was reported to be broken in PostgreSQL. Unfortunately the version of PostgreSQL being used was not reported, so to fully investigate the issue I needed to test the SQL query against all of the versions of PostgreSQL supported by Moodle 2.8.

I’ve been exploring the use of virtual machines for development lately. For my day-to-day tasks on the Moodlerooms platform, I use an Ubuntu-based virtual machine, hosted by VirtualBox, that is managed by Vagrant and Chef. All of the team use this same setup, which ensures that we are all starting from a common platform.

For those not familiar with the tools:

  • VirtualBox is software that supports the creation and management of virtual machines.
  • Vagrant is a tool used to help manage virtual machines.
  • Chef is a tool used to automatically configure the virtual machine managed by Vagrant.

Using Vagrant and Chef I was able to set up a relatively simple virtual machine that ran a PostgreSQL database server. I was then able to connect the instance of Moodle in my development virtual machine to the virtual machine running the PostgreSQL server. Using Chef I was then able to change the version of PostgreSQL while keeping the rest of the configuration exactly the same.

The basic workflow was:

  1. Configure my Moodle development virtual machine to use the database on the PostgreSQL virtual machine.
  2. Install the Moodle database.
  3. Test the required functionality.
  4. Destroy the PostgreSQL virtual machine.
  5. Provision the PostgreSQL virtual machine with a different version of PostgreSQL.
  6. Repeat steps 2–5 as often as needed.

That’s the nice thing about Vagrant, it makes it easy to create and destroy virtual machines. Using Chef makes it easy to ensure the virtual machine is configured exactly the same each time it is created. In this way it was easy to have a PostgreSQL database server that is exactly the same, except for the version of PostgreSQL of course.
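In terms of commands, each iteration of the loop looked roughly like this. This is a sketch only; how the PostgreSQL version is selected depends on the cookbook and the attributes it exposes.

```
# Create and provision the PostgreSQL virtual machine.
vagrant up

# ...point Moodle at the virtual machine, install the database, test...

# Throw the virtual machine away.
vagrant destroy --force

# Edit the Chef attributes to select the next PostgreSQL version,
# then bring up an otherwise identical virtual machine and repeat.
vagrant up
```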

Using virtual machines makes it easy to test this type of issue. It’s a technique I look forward to using again.