Methodology for Testing Mobile Apps and Websites

In an earlier post, from a few months ago, I wrote about how I was investigating issues with the Moodle Mobile app that our clients at NetSpot had reported. Since that original post I’ve refined my methodology, and wanted to take this opportunity to write an update. I’ll break the post into two parts: first the hardware, and then the software.

The Hardware

The computer hardware that I use for testing can be seen in the photograph of my desk at NetSpot below.

My desk at NetSpot.

The hardware specific to testing mobile apps is an iPad Mini, an iPhone (used to take the picture) and a USB-powered TP-Link TL-MR3020 WiFi router. I’ve flashed the router with the latest OpenWrt firmware, which makes the device easier for me to manage.

The router is configured to use a network in the 172.16.1.* range, a range that is not used either at work or at home. Indeed, I’ve never seen this range used in a real-life network. The devices that I connect to the WiFi network are configured to use a proxy server at a statically defined address. This address is assigned to the USB network adapter connected to my MacBook Pro.

In this way I have a private network that I can connect my devices to, and one that I have complete control over. Importantly, the only traffic on the network is from my devices. Lastly, all HTTP and HTTPS traffic must go through a proxy server running on my MacBook, which makes it much easier to capture the traffic for analysis.

The Software

I use two different proxy servers, depending on the type of issue I’m investigating. I usually start with Tinyproxy, as it is really simple to set up and works really well. As with many of these types of applications, I installed it using Homebrew. Using the Tinyproxy logs I can see which URLs are being requested.
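For reference, the Tinyproxy configuration needed for a setup like this is small. A minimal sketch follows; the listen address assumes the MacBook’s USB adapter holds 172.16.1.1 on the private network described above, and the log path assumes a Homebrew install, so adjust both to suit.

```
Port 8888
Listen 172.16.1.1
Allow 172.16.1.0/24
LogFile "/usr/local/var/log/tinyproxy.log"
LogLevel Info
```

With `LogLevel Info` each requested URL appears in the log file as the devices browse.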

This is important, because an app running on the iPhone or iPad Mini is a black box which you don’t have much insight into. By using a proxy server I can at least see the web requests that the app makes. I prefer to test the app, or mobile theme, on a real device. In my experience, while emulators do a fine job, they are no substitute for testing on an actual device.

One thing that Tinyproxy doesn’t do is allow me to see the content of the requests or responses. This is especially true when the requests are made using HTTPS.

To see the content of the requests and responses, including those made using HTTPS, I use mitmproxy. Using mitmproxy I can capture the requests made by the app, and the responses from the web server in real time. This is very useful.

Additionally it is possible to add a certificate to the device so that HTTPS traffic can be captured without any warnings being displayed, or the app that I’m testing throwing an error.

Final Thoughts

Using this hardware and software I have been able to successfully debug issues with mobile websites and mobile applications, such as the Moodle Mobile app. As one of my colleagues noted, this solution is somewhat over-engineered; it is possible to have my iPhone connect to a proxy server on my MacBook Pro via the corporate network. There are, however, three additional considerations behind my setup.

First, running services on my MacBook that others can connect to, on the corporate network, makes me uncomfortable. Second, in the past I have used Wireshark to capture network packets. The thought of capturing packets that I shouldn’t also makes me uncomfortable.

Lastly, to get things up and running all I need to do is plug the WiFi router in, connect the device to the WiFi network, and I’m good to go, both at work in the office and when I’m working at home.

My solution isn’t for everyone, as I’m sure there are alternative ways of doing things. What’s important is that it works for me.

 

New Utility Script: GerritPush

“Peer Review” by AJ Cann

Today I uploaded a new script to the Techxplorer’s Utility Scripts repository on GitHub. The new script is called GerritPush, and is documented in the repository wiki.

At NetSpot we use Gerrit to support our peer review process for all the code that we write. Any code that goes into our repository must be reviewed by at least one member of the development team. If it doesn’t pass peer review, the code change must be updated to address any concerns raised.

Occasionally a bug slips through the net, but on the whole code quality has increased, which is a good thing. Another side effect of peer review is learning between the teams, for example learning about a better way to write code to achieve a certain outcome, or discovering new ways of doing the same thing.

Code review is a very good thing, and I’m glad that we’re using it at NetSpot. It is something that I wish I had used on some of my previous projects.

The deplorable state of customer service

“beelzebubbles’ll ‘ave your arm…” by andrea

I’d like to take a moment to share a story about the deplorable state of customer service. We get our gas and electricity from one of the big energy suppliers. I won’t name them here.

We’ve had a significant number of issues since moving house late last year. This latest incident is beyond anything that I could have previously envisaged.

A few weeks ago I got a text message saying that our gas bill had been delayed. We’ve signed up for monthly gas bills, and over the past six months this is the second time one has been ‘delayed’. There was no indication of what had caused the delay, and I forgot about it.

Last week I got a call from the gas company’s automated system, telling me that my account was overdue and pressuring me to pay by credit card. There were a number of issues with this:

  1. I don’t give my credit card details over the phone to people I don’t know. Especially an automated service that I can’t verify is legitimate.
  2. The service knew my name and birthday, but couldn’t pronounce my name. It was the first time I’ve ever heard my last name pronounced as ‘walrus’.
  3. At no time during the call was there any opportunity to talk to a representative of the company.
  4. The call ended with a number of attempts by the service to get me to provide my credit card details.

Being called by an automated service is not pleasant. Especially one that can’t even pronounce my name, and provides no way to verify that the service is actually legitimate. This put me in a bad mood.

What really tipped me over the edge is that we aren’t late in paying the bill. You see we never got the bill. We can’t pay a bill that we never received!

So in a really foul mood I placed a call to the customer service line. I explained my situation, and I’m ashamed to admit I got a little ‘shouty’. But as I kept saying:

I can’t pay a bill that I never received!

So being asked to pay by an automated service, including a ‘late payment fee’, is ridiculous.

The customer service representative confirmed that all of my contact details are correct, and insisted that I owed them the money. Eventually we reached a point where she decided to send me via email a copy of the outstanding bill.

This would have been fine, except the bill she sent me was last month’s bill! Further discussions ensued where I kept reiterating my points that:

  1. I can’t pay a bill I never received.
  2. The copy of the ‘latest’ bill she sent me via email was last month’s bill.

Eventually she decided to escalate my call to another department, and put me on hold.

Bad on-hold music would have been bad enough for the interminable amount of time I was on hold. The really annoying thing, though, was that a voice-over kept telling me about:

  1. How easy it was to switch to monthly billing
  2. How convenient being emailed a bill is, compared to receiving a paper one
  3. That they’re here to help
  4. How great their charity work is

“Sea Wolf Missile Firing” by UK Ministry of Defence

My current call disproved the first three points, and I can only hope that their charitable efforts are managed better than their billing is.

Eventually another customer service representative came on the line. By this time my mood had descended to the point that I was prepared to launch thermonuclear weapons of mass destruction at their corporate headquarters.

Unfortunately I got a bit more shouty at this point. To give the customer service representative credit she handled my complaint really well.

She explained that the billing system was ‘showing an error’ and that the overdue bill couldn’t be sent. She put a hold on my account, to stop any more of the ‘pay now via credit card over the phone to an automated system’ nonsense, and escalated my issue to another department.

I apologised for getting shouty, and explained that the whole thing was ridiculous and that the automated system that started this whole thing off was awful. She agreed, and I got the sense that I’m not the only one to get upset by the automated system.

The upshot of it all is that I lost half an hour of my life that I’m not getting back, and I’m to wait ‘5 to 10’ business days for my bill to arrive so that I can pay it.

If we weren’t signed up for a two year contract with this company I’d bail right now and go with someone else. But I can’t help wondering, in a world where it is OK to have a brain-dead automated system call you out of the blue for your credit card details, whether any other energy company would be any better.

The moral of this story is that:

  1. To a large corporation you’re not a person; you’re just a thing that costs them money and periodically sends them money, which apparently isn’t enough for them to treat you as a human being.
  2. The customer service people are at a distinct disadvantage in dealing with grumpy and cranky people, when the higher ups insist on silly things like the automated system that called me.
  3. You need to stick to your guns when dealing with a large corporation.

I know it’s silly to vent at the Internet like this, but strangely it has helped. Thanks for reading.

 

Rant Warning: Turning off debugging is not a solution

“Mr Grumpy hits Grumpy Care Bear” By Len Matthews

In my role as a Senior Software Engineer at NetSpot, the majority of my time is taken up with forensic programming tasks related to Moodle. I’ve written about forensic programming here on my blog before.

To help the technical teams investigate issues we have debugging output turned on for our test and stage environments. This helps us in our investigations by providing additional information. It also helps our clients identify issues when they’re testing our patches and code fixes.

Debugging output in Moodle includes warnings generated by PHP when code is doing something that is suboptimal. One of the most common warnings we see is generated by code that does something like this:
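The snippet below is an illustrative sketch of the pattern; the $options variable and the property names are my own, not taken from any particular Moodle file.

```php
<?php
// $options is never initialised before its properties are used.
// The first line forces PHP to create a default object (typically a
// "Creating default object from empty value" warning, depending on the
// PHP version), and the second line reads the undefined content
// property (an "Undefined property" notice), leaving PHP to pick the
// default values itself.
$options->text = 'Some text';
$options->content .= $options->text;
```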

This code generates an uninitialised variable warning. The code is problematic because it relies on PHP determining an appropriate default value for at least the text property, and in some cases even the content property.

The code can easily be rewritten so that the warning is not displayed. For example the first time the content property is required it should be instantiated correctly like this:
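Again using illustrative names, the rewritten version looks something like this:

```php
<?php
// Explicitly create the object and initialise both properties with
// values of a type that suits the rest of the code, before they are
// first used. This version produces no warnings or notices.
$options = new stdClass();
$options->text = '';
$options->content = '';

$options->text = 'Some text';
$options->content .= $options->text;
```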

The key here is that the two properties are explicitly initialised with values of a type that is appropriate for the rest of the code. Importantly, it also doesn’t generate a warning.

This matters because the warnings can break things. For example, they appear in the page and provide a negative user experience. Additionally, in scripts that are designed to output JSON, or some other content that is parsed by JavaScript as part of an AJAX request, they can make parts of the page silently fail.

More importantly, issues can appear in production environments, caused by code that generates a warning. Even though the warning isn’t displayed, the code doesn’t behave as the original author intended. This also leads to a negative user experience, and an issue that we need to investigate.

I will admit that the example above is trivial. I have seen other examples that were not: cases where the way PHP had determined the default value of variables caused unforeseen issues in other, unrelated PHP code, and code generating debugging errors that caused issues in seemingly unrelated areas of Moodle. These types of issues can take a significant amount of time to investigate and resolve.

So, you may be wondering, what’s the point?

The point is that the other day I saw a post on the Moodle forum that suggested a problem could be solved by turning off debugging output. This is, in my opinion, totally wrong and disrespectful to those who need to maintain the code.

Rather than turning off debugging, the solution needs to be to fix the code so that debugging messages are not output.

So when you’re next writing code, whether in PHP for Moodle, or in some other language or for a different project entirely, take a moment to consider those who will need to follow after you and support your code. Show some respect to them, and the wider community, and ensure your code doesn’t output debugging messages.

New Utility Script: GitFetchReset

“Reset” by orse

Today I uploaded a new script to the Techxplorer’s Utility Scripts repository on GitHub. The new script is called GitFetchReset, and is documented in the repository wiki.

The purpose of the script is simple. When invoked it fetches the most recent changes from the remote Git repository. Once that has successfully completed, the script updates the currently checked out branch to the remote HEAD commit.

At NetSpot we have a number of teams all working on the same code base. Making sure that your changes are always based on the latest version is very important. Using this script makes the task easier, with less typing.

Now, to make it even easier, and require even less typing, all I need to do is create a shell alias, which is a job for another day.

The script is part of my Techxplorer Utils repository which now has 13 scripts in it. When I first started developing these scripts I never thought it would grow to be this large. As with all of my scripts, I hope that this script proves useful to others as well.

New Utility Script: GitResetRepo

“Reset” by orse

At NetSpot we have a single Git repository for all of our Moodle development. Each of our clients has their own dedicated branches. We also have branches for product development and our common code base.

This structure works well for managing the code, centrally keeping track of changes, as well as merging code between our common code base and the client branches. But it doesn’t work well with my preferred development workflow.

I’ve found that my workflow works best when I have a dedicated directory for each client that I’m working on at the time. I also find it helps to have a dedicated virtual host for each client. It helps me keep the different tasks and clients separate in my mind.

To help in setting up a new client copy of the repository I’ve written the GitResetRepo.php script. This script automates a number of different maintenance tasks when I start a new client directory. For a full list of the tasks that it automates, check out the wiki page about the script.

The script is part of my Techxplorer Utils repository which now has 12 scripts in it. When I first started developing these scripts I never thought it would grow to be this large. As with all of my scripts, I hope that this script proves useful to others as well.

New Utility Script: GitMergeContents

“merge” by Pedro Moura Pinheiro

This morning I couldn’t sleep and so decided to write a quick utility script. This new script is called GitMergeContents.

The purpose of the script is to list the contents of a merge commit, in a Git repository, in a format that is easy to copy and paste into other systems.

Why is this useful you may ask?

It is useful, for me at least, because at NetSpot we undertake periodic merges from our common codebase branch into our client branches. In this way, a client’s codebase is kept up to date with all of the common fixes and updates to our Moodle codebase.

We use JIRA to keep track of the changes to a client’s codebase at a higher level than just individual commits. A merge from our common codebase into the client’s codebase is a new JIRA item for that client, which must list all of the other JIRA items that the merge contains.

Copying and pasting this list by hand can get quite tedious, especially if the client has not had a merge in a while. This script is designed to strip out all of the unnecessary information and output it in a format that is easy to copy and paste into JIRA.

As with all of the scripts that are part of the Techxplorer’s Utility Scripts repository, I hope that this script proves useful to others as well.

Three Utility Scripts Updated

“deBUG” by Franz & P

Today I updated three of my utility scripts. More information on the scripts is available in the Techxplorer’s Utility Scripts repository on GitHub.

The three scripts that I updated are:

  1. JiraGitBridge – for acting as a bridge between JIRA and a Git repository
  2. MdlUserListCreator – for creating randomly generated lists of users for upload into Moodle
  3. ServiceMgmt – making it easier to automate services managed with launchctl

More information on the exact changes is available in the Change Log sections of the wiki pages for each of the scripts on GitHub. The main page of the wiki links to all of the individual wiki pages for each of the scripts.

I maintain a wiki with documentation on the scripts because I believe documentation is just as important as the code.

If you’re using these scripts, I hope they prove useful.

New Utility Script: ServiceMgmt

“Launch Control” by wbeem

Today I released a new script in the Techxplorer’s Utility Scripts repository on GitHub called ServiceMgmt.

The intention of the script is to make it easier to start, stop, restart and checkup on the services that I use on a daily basis. This includes the nginx web server, the PHP-FPM server for executing PHP scripts, and the Memcached server.

There are two things that these servers have in common. They are:

  1. Installed using homebrew
  2. Managed by launchd on OS X, which can be accessed via launchctl

A list of services is stored in a JSON file. The script uses this file to determine a list of services that it needs to work with. More information on how to use the script is contained in the wiki page for the script.
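As an illustration, the services file might look something like the sketch below. The exact keys the script expects are documented in the wiki page; the service names and launchd labels here are hypothetical Homebrew-style examples, not taken from the script itself.

```json
{
    "services": [
        { "name": "nginx",     "label": "homebrew.mxcl.nginx" },
        { "name": "php-fpm",   "label": "homebrew.mxcl.php-fpm" },
        { "name": "memcached", "label": "homebrew.mxcl.memcached" }
    ]
}
```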

If you find the script useful, please let me know.

Is MOOC Too Masculine?

“0046 Le Boss” by Artemedion

This is an edited repost of an earlier post from an earlier version of my blog.

While working at a university nearly a year ago, I had a discussion with a colleague of mine in the computer science school tea room about online learning. In particular we were discussing the impact of MOOCs on the higher education sector, and their potential impact here in Australia. The discussion was in the context of what we’d want to see in a MOOC if we were in control of one.

For those who aren’t familiar with the term, MOOC stands for Massive Open Online Course, and is defined by Wikipedia as:

an online course aimed at large-scale interactive participation and open access via the web. In addition to traditional course materials such as videos, readings, and problem sets, MOOCs provide interactive user forums that help build a community for the students, professors, and teaching assistants

The key items from the point of view of the higher education sector are:

  1. They’re large scale
  2. They have content other than just slides, documents and video recordings of lectures
  3. They are accessible via the web
  4. Most importantly they’re open access, or to put it another way they’re free to access

At the time I had to cut the discussion short because I needed to make the trek across campus to the humanities school for a meeting with another colleague of mine. As an interesting side note, who knew working on a cross-discipline project involved so much walking around campus?

While in the humanities school I happened to see a list of items for discussion and further investigation on their whiteboard, and I noticed that one of them was MOOCs. I mentioned that I’d just been discussing the topic before our meeting. That’s when she said something quite interesting.

She said that she was a member of a feminist collective in the school, and that the collective was against the term MOOC as it was deemed to be too masculine. I enquired how it was too masculine; she wasn’t sure, but thought it perhaps had something to do with the word ‘massive’. We left the discussion there and moved on with the rest of the meeting.

This comment stuck in my mind. A few days later I was back in the computer science tea room, involved in a discussion on online learning again, when I happened to mention the discussion I’d had in the humanities a few days earlier.

The idea that the term MOOC was too masculine for the feminist collective was met with laughter and a comment that they’d just have to get used to it. Now I should clarify at this point that both colleagues were women.

I felt this was entirely the wrong response. The key issue for me isn’t whether MOOC is too masculine or not. The key issue is that a not insignificant number of the user community where a MOOC may be deployed felt that the term was too masculine. My thoughts on the matter of the masculinity of MOOCs are immaterial; what is important is their thoughts on it.

It is important because if a MOOC were introduced, they would need to use the system. They would need to be the ones uploading the content, in many instances creating new content, and most importantly being advocates for the system.

If they don’t like the term MOOC, getting them to use and participate in the system is going to be that much more difficult. Not only would their expectations of the new system need to be managed carefully, they’re also less likely to engage with a system described by a term that they don’t like.

The key issue here is that if they don’t engage with the system then the system is a failure. For in my mind if a system doesn’t meet the needs of users, they won’t use the system. Or if they do use the system it will be under duress and therefore they won’t use it to the fullest extent possible. Lastly, and perhaps most importantly, they won’t be good advocates for the system.

The response to the feminists shouldn’t be “just get over it”; it should be “what term would be better?”. That way they’re involved in the development of the system and, more importantly, starting to engage with it.

While this is an amusing anecdote, the key message that I want to impart is this: when developing a new system, getting the users to engage with the system is critical, and if that means changing the language that is used to describe the system, then that’s what you’ve got to do.