Performance vs. responsiveness

Striking the balance between performance and application responsiveness isn’t particularly easy. Heck, even engineering a solution that offers some kind of balance isn’t that straightforward. But before I go into that, what’s the difference between responsiveness and performance? Well, responsiveness refers to how quickly the application responds to your input — whether it beachballs and stutters or runs smoothly. Performance, on the other hand, is how quickly the internal engine does its allotted task. Typically, the harder you work the engine, the worse responsiveness becomes. This is because the CPU time that would be spent dealing with your mouse clicks and updating the user interface is being eaten up by the engine. Conversely, if you want a super responsive application you typically have to throttle back the engine to make sure there are enough resources left for the interface to run smoothly.

In the context of NewsMac Pro the engine is the part of the program that downloads and parses RSS and Atom feeds. Up until now the engine has been allowed to pretty much kill the application’s responsiveness in order to get things done fast. But this doesn’t really offer the best user experience; it means that while lots of downloads are occurring you more or less have to just leave the app alone. This probably doesn’t affect most people that badly — it only really hits when you have lots of scheduled folder updates going or choose to reload a folder with lots of channels.

Still, I see the ability to deal with large numbers of channels gracefully as one of the key goals of NewsMac Pro. You should be able to load 200 channels at once and not have to wait seconds for your mouse clicks to be registered. So with that in mind I decided to try and figure out a way of throttling back the engine — of adding a bottleneck somewhere that would give the app a major responsiveness boost under heavy load.

So the first thing I needed to do was identify exactly which part of the whole process was hammering the CPU so badly. I was pretty sure the RSS/Atom parsers doing their thing were the cause. The way NewsMac Pro works is that there can be up to 10 feeds downloading at once, and as soon as any of those 10 processes finishes downloading it starts parsing. This means that up to 10 parsers can be running at any one time. Now, parsing 10 RSS feeds at the same time and indexing their headlines in an object-oriented framework like Cocoa is surprisingly CPU intensive. Lots of objects get created, stored, sorted, removed and compared. Running 10 at a time was bringing the app to its knees.

To see how much overhead those 10 downloads were generating without the parsing, I simply commented out the bit of code that invoked the parser. No noticeable slowdown. Hmm, well OK then, the solution is fairly straightforward: download everything as quickly as we can, then queue it up and parse one channel at a time. This way downloads can happen really fast and are not dependent on the parser finishing before getting on with the next channel, and there is only one CPU-intensive parsing operation going on at a time.
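The idea above — let downloads run concurrently but funnel all the raw XML through a single parser — can be sketched in a few lines. This is a minimal illustration, not NewsMac Pro’s actual code (which is Cocoa); the `download_feed` and `parse_feed` stubs are hypothetical stand-ins for the real work.

```python
import queue
import threading

def download_feed(url):
    # Hypothetical stand-in for the real network fetch.
    return f"<rss>raw xml for {url}</rss>"

def parse_feed(raw_xml):
    # Hypothetical stand-in for the CPU-heavy parse/index step.
    return raw_xml.upper()

parse_queue = queue.Queue()
results = []

def parser_worker():
    # A single worker drains the queue, so only one CPU-intensive
    # parse runs at any moment, regardless of download concurrency.
    while True:
        raw = parse_queue.get()
        if raw is None:          # sentinel: no more work coming
            break
        results.append(parse_feed(raw))

def downloader(url):
    # Each finished download just enqueues its raw XML for later
    # parsing instead of parsing immediately.
    parse_queue.put(download_feed(url))

urls = [f"http://example.com/feed{i}.xml" for i in range(10)]
parser = threading.Thread(target=parser_worker)
parser.start()
threads = [threading.Thread(target=downloader, args=(u,)) for u in urls]
for t in threads:
    t.start()
for t in threads:
    t.join()
parse_queue.put(None)            # all downloads finished
parser.join()
print(len(results))              # prints 10
```

The key design point is the single consumer thread: the downloads can finish in any order and as fast as the network allows, while the parse workload is serialized behind the queue.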

Bingo, responsiveness! The downside is of course that batch channel updates take longer to finish processing, but at least you can pleasantly read the headlines from the channels that have downloaded while you wait. I’m sure as time goes by I’ll be able to further refine this and make it reasonable to have several parsers running together, but for the time being this is a step in the right direction.

Download engine diagram

The diagram above shows you how the ‘engine’ works in NewsMac Pro.

1. The user initiates a download, e.g. by clicking on a channel.
2. The request is placed into a queue.
3. As soon as one of the 10 download slots becomes free the request is pulled off the queue and the RSS/Atom feed is downloaded.
4. The downloaded raw XML is placed into another queue to await being parsed.
5. When the parser is free from its last task it pulls another raw XML file off the queue to process.
6. Finally the processed headlines get added to the headline database and indexed for searching. From here they are accessible to the user.
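The six steps above can be sketched end to end. Again this is an illustrative Python sketch under stated assumptions, not the real Cocoa implementation: the 10 download slots become a thread pool with `max_workers=10`, and the headline database is just an in-memory dict.

```python
import queue
import threading
from concurrent.futures import ThreadPoolExecutor

def download_feed(url):
    # Step 3: fetch the raw XML (hypothetical stub).
    return (url, f"<rss>{url}</rss>")

def parse_feed(raw_xml):
    # Step 5: extract headlines from raw XML (hypothetical stub).
    return [f"headline from {raw_xml}"]

headline_db = {}            # step 6: the headline "database"
xml_queue = queue.Queue()   # step 4: raw XML waiting to be parsed

def parser_worker():
    while True:
        item = xml_queue.get()
        if item is None:                     # sentinel: downloads done
            break
        url, raw = item
        headline_db[url] = parse_feed(raw)   # step 6: store and index

# Step 1: the user requests 25 channels at once.
urls = [f"feed{i}" for i in range(25)]

parser = threading.Thread(target=parser_worker)
parser.start()

# Steps 2-3: requests queue up inside the pool; at most 10
# downloads run concurrently, matching the 10 download slots.
with ThreadPoolExecutor(max_workers=10) as pool:
    for result in pool.map(download_feed, urls):
        xml_queue.put(result)                # step 4: hand off to parser

xml_queue.put(None)
parser.join()
print(len(headline_db))                      # prints 25
```

Even with 25 requests, only 10 downloads and one parse are ever in flight at once, which is exactly the throttling the diagram describes.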
