August 2010


Quite a varied week this week, though as the title suggests, it did involve some nightmares, and as the map should suggest it involved a bit of travel.

Supposedly my main focus was to get the web-based admin interfaces for the revision apps sorted out. As I mentioned last week, I found I was using the wrong version of a plugin, so I tried the right one and it worked better. As it happened I still wasn’t entirely happy with the interface offered, so I’ve kept using my own form fields for some parts and used the defaults elsewhere. By the end of Monday I’d got enough of the admin interfaces done to declare them “finished” and moved on to the next task. That task is to allow uploading of three large blocks of text and have the code parse the text to work out what the questions should be. This is to allow the teacher I’m working with to upload questions from his existing Word documents without having to make any changes. I got that nearly finished this week but got a bit sidetracked by a few other issues. I’m hoping to have it finished today and get a few other bits done so that the whole first milestone is finished this weekend.
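To give a rough idea of what that parsing involves, here’s a simplified sketch. The format is made up for illustration (numbered questions followed by lettered answers, with the correct answer marked by an asterisk) rather than being the teacher’s actual document layout:

# Sketch only: assumes a made-up format of numbered questions followed by
# lettered answers, with the correct answer marked with a leading "*".
def parse_questions(text)
  questions = []
  text.each_line do |line|
    case line
    when /^\s*\d+[\.\)]\s*(.+)/                 # e.g. "1. What is 2 + 2?"
      questions << { :text => $1.strip, :answers => [] }
    when /^\s*[a-d][\.\)]\s*(\*?)\s*(.+)/i      # e.g. "b) *4"
      questions.last[:answers] << { :text => $2.strip, :correct => $1 == '*' }
    end
  end
  questions
end

The real documents are messier than that, of course, which is largely where the time is going.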

On Tuesday I took a trip down to London. The main reason I was going was to meet the people at 7digital. The iPhone app I was working on earlier this year was for them, so we decided that I should visit and finally meet them in person. The meeting was fairly short but went well, and we discussed a few of the features they’d like to get into the next version. The first version of the app is actually still in the review process for the App Store (as far as I know) but hopefully it’ll get through soon.

While in London I also met up with the guy that I’ve been working on a chess app with. We discussed some of the features in the current version of the app and highlighted a few hopefully fairly simple steps that we can take to get the app to version 1.0 and ideally get me paid!

While on the trip to London I also had a play with Twitter’s OAuth support. Twitter have decided that Basic Auth is too insecure and leads to leaked passwords, so everyone needs to use OAuth in future. This is definitely a good idea but it meant that I needed to change the way I was accessing Twitter from mapme.at. I found a fork of the main twitter4r library with OAuth support and managed to get it working quite easily in a small standalone script. When it came to incorporating this into mapme.at though, I came across some issues. The first, very basic, script I modified worked fine but the next one gave me errors. After trying to find the problem over the course of a few days (while trying to do other work) I finally found the issue: the fireeagle library that I was using was “monkey patching” the OAuth library. All it was doing was wrapping one method call so that it could change a few properties on the object that was returned. Unfortunately the fireeagle changes were based on an old version of the OAuth library and so ended up breaking the main OAuth library. Once I’d fixed that I had a few more issues, as I remembered I had in the past made a few changes of my own to the twitter library, both by modifying the library in place and by monkey patching it in my own code. I ended up creating my own branch on GitHub that has fixes for the broken friendship methods, an extra friendship method and a few other small fixes.
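For anyone who hasn’t come across the term, “monkey patching” just means reopening someone else’s class at runtime and changing its behaviour. Here’s a simplified, self-contained illustration of the wrap-a-method-and-tweak-the-result pattern that was causing my problems (made-up classes, not the actual fireeagle or OAuth code):

# Made-up classes purely to illustrate the pattern; not the real libraries.
class TokenFetcher
  Token = Struct.new(:key, :secret)
  def fetch_token
    Token.new("abc", "original-secret")
  end
end
# The "monkey patch": reopen the class elsewhere and wrap fetch_token so a
# property on the returned object can be changed before it's handed back.
class TokenFetcher
  alias_method :fetch_token_without_patch, :fetch_token
  def fetch_token
    token = fetch_token_without_patch
    token.secret = "patched-secret"
    token
  end
end
puts TokenFetcher.new.fetch_token.secret   # => "patched-secret"

The trouble comes when a patch like this is written against one version of a library and the wrapped method later changes underneath it, which is exactly what had happened here.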

Room 29

So I managed to miss last week’s weeknotes. I’m not really sure why; I think because over the weekend I sometimes feel I don’t want to do “work”, i.e. the weeknotes, but then during weekdays I want to do proper work, not weeknotes. Oh well, here I am now so I might as well get on with it.

Week 107 was actually pretty good: fairly relaxed but quite productive too. I finally started work on the new version of my GCSE & A-Level revision iPhone apps. I spent the whole of Monday brainstorming the feature set I’m aiming for, building this into a list of milestones and breaking the first few milestones down in more detail. Once I’d done that I entered all the information into the Trac setup that I’ve previously mentioned. Trac is well suited to handling milestones and versions of apps, and it has worked out really well for managing this project.

Although these are iPhone apps, the first thing I needed to do was create a web app for managing the questions. So far I’ve just been taking the questions from the teacher as a collection of Word documents and then, in a part-manual, part-automated process, turning these into XML files for the iPhone app to read. I wanted to build a web-based system that the teacher could use himself to upload questions, so that he can sort out everything to do with the questions while taking up less of my time. Along with properly planning this project out and setting milestones, I also decided I really needed to try building it using “Test Driven Development”, i.e. writing a set of automated tests for functionality and then building out the functionality until the tests pass. This is actually the first time I’ve tried doing this for a whole project and it’s been an interesting ride so far. I’ve found that it’s really slowed me down, partly just from having to learn about the various Ruby modules and Rails plugins and get myself up to speed with them, but I do think I’m getting better quality code and structure as a result, so I’m going to continue doing it.
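To give a flavour of what that looks like in practice (this is just an illustrative test, not lifted from the actual project), each feature starts life as a failing test along these lines, and the model code gets written until it passes:

require 'test_helper'
# Illustrative only: the Question model and validations assumed here aren't
# necessarily what the real app uses.
class QuestionTest < ActiveSupport::TestCase
  test "is invalid without any text" do
    question = Question.new(:text => nil)
    assert !question.valid?
    assert question.errors.on(:text)
  end
  test "knows which test module it belongs to" do
    question = Question.new(:text => "What is 2 + 2?")
    assert_respond_to question, :test_module
  end
end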

The first features I completed were the ability to import the existing XML files into the database and then export “test modules” from the database in the same XML format. This is about making sure we can manage questions from the database in the future and that, if necessary, I can continue to use the existing iPhone code to launch apps. I then started work on the admin interfaces. I’ve done so many admin interfaces over the years, especially back when I was doing PHP, so I was hoping the state of things had changed and it would be nice and easy to set something up to do all the work for me, especially considering Rails has scaffolding, which I haven’t used before but which I’ve read is supposed to do this sort of thing. I happened to find out about the Ruby Toolbox that week, and its Admin Interfaces section suggested active_scaffold was the most popular plugin to use.
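Going back to the import step for a second, it basically boils down to walking the XML and creating records from it. Something like this is the general shape, though the element names here are made up rather than being the real file format:

require 'rexml/document'
# Sketch with made-up element/attribute names, not the real file format.
# Question and Answer are assumed to be the ActiveRecord models.
doc = REXML::Document.new(File.read("biology_module_1.xml"))
doc.elements.each("module/question") do |q|
  question = Question.create!(:text => q.elements["text"].text)
  q.elements.each("answer") do |a|
    question.answers.create!(:text    => a.text,
                             :correct => a.attributes["correct"] == "true")
  end
end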

I started playing with active_scaffold this week, with mixed results. It was really simple to get set up, but then some of the features didn’t seem to work. I eventually worked out that I needed another plugin to support it – recordselect. I got that installed but it still didn’t seem to work right: the ajaxy bits just didn’t do what they were supposed to and I was getting nowhere. I ended up dropping the ajaxy bits but did post a message to the active_scaffold mailing list. I’ve now had a response telling me that I need to use a different fork of the recordselect plugin; the main version only supports Rails 3 whereas I’m still using Rails 2. Hopefully once I do that things will go better.
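The appeal of active_scaffold is how little you have to write. The setup I’ve ended up with is roughly along these lines (simplified, and the column names are just examples rather than my exact configuration):

# Simplified example of an active_scaffold controller; the model and column
# names are illustrative.
class QuestionsController < ApplicationController
  active_scaffold :question do |config|
    config.label = "Questions"
    config.columns = [:text, :test_module, :answers]
  end
end

One declaration and the plugin generates the listing, create/edit forms and so on, which is exactly the sort of thing I was hoping for.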

I got sidetracked on Thursday this week by having a look at a few bugs in the chess app I’ve been working on. It seems I’d missed out some functionality of the PGN format and it was tripping up when trying to read some more complicated files. I spent all day Thursday and some of Friday on this but finally got it sorted on Saturday morning. I’ve now sent a new version over to the client and I’ll find out what they think when I meet them on Tuesday.

Other interesting things that happened in the past two weeks… in week 107 I had feedback on mapme.at requesting support for xAuth in the API and for XML output. On Saturday morning I had some time so I had a look into it. The XML output turned out to be trivial. The JSON output is quite simple: I construct native data structures like arrays and hashes and then call .to_json on the resulting object. It turns out that if I call .to_xml on the same object I get perfectly reasonable XML out. The xAuth support was a little more tricky. xAuth is a mechanism whereby mobile and desktop apps can take a username and password from a user and then make a call to a server to exchange them for a token that can be used to make requests. This is better because the app then doesn’t need to store the username and password, and they’re also not being passed over the network repeatedly. It also means that you don’t have to go through the convoluted OAuth process of app -> web browser -> app. I managed to get both these features out on Saturday morning and the person who had requested them was quite surprised and happy. Hopefully he’ll be using them in a Windows Mobile 6.5 Twitter app soon. One other thing I tried to do was to convert mapme.at from using Basic Auth with Twitter to using OAuth instead; Twitter are turning off Basic Auth support at the end of the month so I really need to switch soon. Unfortunately it looked like the twitter4r library that I use doesn’t yet support OAuth, so I decided to postpone the change. I’m really going to have to do something this week though: either find an OAuth compatible version of twitter4r or use an alternative library.
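To show quite how little there was to the output side (with made-up data rather than the real mapme.at structures), it’s essentially just this:

require 'rubygems'
require 'active_support'
# Made-up data; the point is that the same plain Ruby structure can be
# serialised either way using ActiveSupport's to_json and to_xml.
checkin = {
  :location => { :name => "Liverpool", :lat => 53.4084, :lon => -2.9916 },
  :tags     => ["office", "duke-street"]
}
puts checkin.to_json
puts checkin.to_xml(:root => "checkin")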

When looking back at my previous post I noticed that a good chunk of it was about how the mapme.at server went down and took ages to come back up. Yesterday I upgraded some packages on the server again and rebooted it, happy in the knowledge that the disk check had completed last time and so should have found whatever the problems were. Unfortunately at 12:30 this morning (just before going to bed) I got a tweet, an SMS and an email from Pingdom telling me that mapme.at was down. I looked into it and saw that yet again it had mounted the disk as read-only. I began a disk check and went back to bed. It’s still running now but hopefully will be done in the next two hours. I think I’m going to have to get in touch with Hetzner and demand new drives, as I can’t have the site going down this often for this long.

Again to finish on some good news: I have an office! You can see it in the photo above, along with Adrian McEwen practising his DJing (or perhaps pretending to sit at a desk). A bunch of us freelancers in Liverpool have been umming and ahhing over getting one for months now. Hakim happened to be walking past an office building on Duke Street with a sign mentioning office space to let. We had a look at a selection of the offices they had available and settled on one with enough room for four of us and plenty of natural light (though hopefully not too much, so we can still see our screens). We received the keys on Thursday and most of us moved in properly on Friday. It’ll be interesting to see how this goes for me; I’ve predominantly worked from home for the past 5 years so I’m not sure how I’ll find working in an office again. Right now it’s looking like being a lot of fun and good to see people every day, but considering none of us really got much work done on Friday we’re going to have to be more careful in the future. I’m sure as things calm down and we get used to it we’ll be really productive, and hopefully able to make use of each other’s slightly differing talents.

I’ll be dropping some stuff off in the office today and then heading to How? Why? DIY! to see what it’s all about. If you’re in Liverpool you should come along!

Ignite 3 featured in the Liverpool Daily Post

Not quite so stressful a week this week, though it did have its moments. I spent Monday morning catching up on some much delayed tasks. Nothing particularly interesting, just bits and pieces like polishing off the print styles for case studies on the Marketing PRojects website and adding a blog feed to the homepage.

A new bug was reported on the music iPhone app though, which caused some consternation. The app allows you to log into a site and download information about music that you’ve previously bought. It seemed though that for some users it would display nothing, even though the information was being downloaded. It didn’t take me too long to track it down to something relating to an SQLite database I was using. I would parse the response from the server and then add all the information to the database. I would then close the database in one function, reopen it in another function and read the data back in. For some reason though, when I tried to read the information back it was finding no rows in the database. The bug seemed only to occur on iPhone OS 3 and only for certain users. In the end I couldn’t track down the actual cause and had to just save the information to the database but then ignore the database and use the information already in memory. That’s obviously a more efficient way of doing things anyway; there’s not really much point reading the information back from the database when you already have it in memory. I guess I was doing it that way to make sure that I always read the current data from the database, so it was always in the same format. Now the database is only used when you first load the app and everything works nicely. Annoyingly though, I wasted a day trying to sort that out.

I also worked on adding some features to a property management database that I wrote last year for The Restaurant Group. This is a site built using the Zend Framework to give a fairly lightweight API onto a database, with a rich JavaScript interface built with jQuery and the Multimap API. Most of the features were fairly simple and were sorted within a day. The most troublesome one though was adding the ability to log into the system using a username and password from a Microsoft Active Directory system. I managed to get the authentication step working without too much difficulty using the Zend LDAP Authentication class, but I also needed to check the groups that the user was in to determine whether they were allowed to make changes to the system or only allowed to view the information. After a few hours of getting nowhere we eventually decided to try an alternative library: adLDAP, an LDAP wrapper specifically tailored to accessing Active Directory systems. I used this and again got the authentication working easily, but the groups still caused some issues. In the end we found that the AD system I was testing against was set up in a non-standard way. After a few tests against the real live system we found that the library was working fine, and I eventually managed to get it all working.

I also managed to spend a little time on the chess app. I didn’t make too much progress, but I’ve managed to get the full moves view to slide in from the side and slide out again when you’ve selected a new move to go to (or clicked the “X” in the corner).

The final thing that caused issues this week was the mapme.at server. On Thursday morning I decided to upgrade the kernel as Ubuntu was telling me there was a new one available with some security fixes. The upgrade went fine and I carefully made sure that all the services were running and that nothing had been broken. A few hours later I just happened to go to the site, only to find I was getting errors. I logged in to try to track down what was happening; at first this was difficult because, oddly, I was getting no errors in the log file. It was only when I noticed that the filesystem had mounted itself as read-only that I realised something fairly major was wrong. This had happened once before and at the time I just rebooted the system and it came back fine; I rebooted now and it didn’t come back. I was hoping it was doing a disk check so I left it for a few minutes before eventually requesting console access from my hosting provider, hetzner.de. When that came through I found the system was complaining that it couldn’t find the superblock, or the disk, or something along those lines, which was when I realised something was quite wrong. I put in a support request for them to take a look and waited for them to get back to me. After an hour or so of no response I sent them another email; two and a half hours after the original request I was told it was just a filesystem error and that they had booted into the rescue system to run a disk check. After over 12 hours of waiting for more information I emailed them again, only to be told that the disk check was waiting for input and that I should just run it myself, so it had essentially been doing nothing for all that time! I logged into the rescue system and ran the check myself. It took nearly two hours, but after that I finally got the machine back up and running. I realise now that I should really have been able to handle most of the “repair” myself, but I’m still very disappointed with Hetzner’s responses. Taking hours to give any response and then failing to anticipate that the disk check would need user input (“fsck -y” would have fixed this) are pretty crap. At least now I’m more prepared if anything similar happens in the future. Not only am I more confident about how to fix the problem (though I’m still not 100% sure how to get into the “rescue system”) but I’ve also invested in the services of Pingdom. When mapme.at went down it actually took me a few hours to notice, but now that I have Pingdom set up it should alert me ASAP if there are any further problems.

To end on a positive note, I went to the third Ignite Liverpool on Thursday night. We had another really fun night, with talk subjects ranging from cows to cannibalism, and Batman talking about using the “Systems Failures Approach” to analyse why his arch enemies were so unsuccessful (see photo above). Social events in Liverpool are just getting better and better these days. At the end of this month we’ve got the second Social Media Cafe Liverpool and Jelly Liverpool to look forward to (both on the Open Labs calendar). There’s also How? Why? DIY! which is not, as it may sound, going to consist of a Sunday afternoon putting up shelves, but will offer a day of interesting sessions aimed at helping people use the full potential of the technology, community and social media facilities available to them. Take a look at the link for that one as there are some interesting sessions planned, and I’ll be going along to try to get people interested in mapping too!

Well, as you may have noticed, I missed last week’s notes. I’ve had a pretty harried few weeks trying to close out the two iPhone projects I’ve been working on. Mainly this was simply because I’d finished the time that I’d assigned to the projects and so any further work was not making me any money, but also July is the end of my financial year and I really wanted to get this work invoiced so that I could include it in this year’s earnings (the second year for my company). Not only was I trying to get these two client projects finished but I also wanted to make some progress on the chess app that has remained untouched for a few months now.

It’s difficult to say for sure, but it looks like the two client projects are just about finished. One has zero live bugs and the other just has a few with questions attached, which I’m hoping should result in no more work. One thing I’m finding though is that I’m definitely getting my time estimates for projects wrong. These two client projects I originally estimated at 5 and 10 days respectively. The 10-day project was eventually increased to 13 days (I submitted a “complete” app only to be sent wireframes for how the app should work; not sure if there was a communication issue there), but all in all I’ve probably spent at least 50% more time than estimated on each one. I suppose what I’ve been doing is thinking of the time it’ll take for me to get an app together that fits the spec, without considering time for client testing and for the client to make the various modifications they’re likely to request. I’m definitely going to be more careful about any estimates I submit in the future.

I did spend some time getting a “full game view” working on the chess app. This view is intended to show the complete description of a chess game including all commentary and variations. As the game is progressed using the controls the current move should be highlighted and if any move is clicked then the visible game should be altered to show that move. I actually managed to get this pretty much done and working in half a day last week with another half day spent on some other tweaks. I sent it off to the guy I’m working with on this and he tried it out. There did seem to be a couple of issues with handling of incompatible file types (it was crashing rather than giving an error message) and a few issues with handling very large files but in general it seems to be working. I now need to experiment a bit with the placement of the full game view and then I think we’ll just be working out what other tweaks we need to do to it before we can release it.

Last week I also spent an interesting two days with Dave Verwer of Shiny Development. He’s asked me to help him out with some iPhone training, so we spent the two days going through his materials and making sure I understood everything and was clear on how he presented it. Dave has been developing in Objective-C for many years, initially for the Mac and with the iPhone SDK since it was released, so it was really good just spending the time with him and learning even more about the iPhone SDK. I’m basically going to act as the overflow, taking on courses when Dave is already booked up; the first course is booked in for October so it’s going to be really interesting to see how that goes. If you’re interested in getting trained up in Objective-C and the iPhone SDK then you should definitely give Dave a call. Whether you want in-house training for a company or you’re an individual who wants to join one of the public training sessions, you’ll have a great experience and be building iPhone apps in no time.

The final thing to mention is something I finally got set up today. I’ve been using SVN for a long time for various personal projects, including mapme.at, but more recently I’ve also been using Git. Git is great as a distributed VCS, giving me the ability to check code in on the train and branch projects with ease, but the lack of an enforced server-side component has led to me being a bit sloppy about how often my code gets backed up to a server. Today I finally worked out how to properly host Git repositories on a server. I’m doing it over SSH so it’s fairly simple; I just need to create a “bare” repository on the server:

mkdir foo.git
cd foo.git
git --bare init

Then the following commands link my local repository to the remote one and push all my changes up:

git remote add origin ssh://git.example.com/path_to_repository/foo.git
git push origin master

I think when I’d tried this previously I was using an old version of Git that didn’t properly support --bare, so I’m glad that everything worked easily this time. I then took this one step further by setting up Trac as a bug tracking system. It wasn’t too hard to set up on Ubuntu using these instructions. They’re a bit out of date though, so once I’d set up mod_python I found I needed to redo that bit and use WSGI instead, following these instructions. I’ve also installed the Git plugin so that I can browse my Git repository from within Trac and, most importantly, resolve and update bugs from my commit messages using the post-receive script on that page (which must be named post-receive and put in the repository.git/hooks/ folder).

So, a busy few weeks. As I come to the end of these projects I’m getting the post-project comedown that I tend to have after being so busy and then suddenly going quiet again. I’ll still need to finish off any final bugs on the two client projects and get the chess app more complete. I also have emails from clients with various bits and pieces that need doing, but I should finally be able to get started on some new projects; I just have to work out which ones!