Daily Archives: December 5, 2006

Rules for Bug Lists

I identified a problem with a project today: there seemed to be an ongoing misalignment between what the developers thought was left to fix before shipping and what the project sponsors thought needed to be done. To stop this happening again in the future, here are my rules for maintaining lists of bugs.

  • Bugs should be recorded as they are discovered, with as much information as possible entered into the bug details.
  • There should be only one list of bugs per project, although it may be sliced and diced in different ways for reporting purposes.
  • The list of bugs should be stored in a central location, for example an Excel file on a network share, or if you have access to such a tool, something like Team Foundation Server/Work Item Tracking.
  • Each bug should be issued a unique identifier, and new bugs should be checked against the list of known bugs to ensure that a duplicate entry is not being made (see the sketch after this list).
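
To make the last two rules concrete, here is a rough Python sketch of a single project bug list that issues each entry a unique identifier and checks new entries against the known bugs before adding them. The Bug and BugList names and the title-based duplicate check are my own illustration for this post, not the API of Excel, Team Foundation Server or any other tool.

```python
import itertools
from dataclasses import dataclass


@dataclass
class Bug:
    """A single bug entry, recorded with as much detail as possible."""
    bug_id: int
    title: str
    details: str
    status: str = "Open"


class BugList:
    """The one central list of bugs for a project."""

    def __init__(self):
        self._bugs = []
        self._next_id = itertools.count(1)   # source of unique identifiers

    def find_duplicate(self, title):
        # A naive title comparison; a real tool would search the details too.
        for bug in self._bugs:
            if bug.title.strip().lower() == title.strip().lower():
                return bug
        return None

    def add(self, title, details):
        # Check against the known bugs so a duplicate entry is not created.
        duplicate = self.find_duplicate(title)
        if duplicate is not None:
            print(f"Possible duplicate of bug #{duplicate.bug_id}: {duplicate.title}")
            return duplicate
        bug = Bug(bug_id=next(self._next_id), title=title, details=details)
        self._bugs.append(bug)
        return bug

    def report(self, status=None):
        # Slice and dice the one list in different ways for reporting purposes.
        return [b for b in self._bugs if status is None or b.status == status]


if __name__ == "__main__":
    bugs = BugList()
    bugs.add("Crash on save", "Saving a large document throws an exception.")
    bugs.add("Crash on save", "Reported again by a second tester.")
    print([b.bug_id for b in bugs.report("Open")])   # only one entry was created
```

The point is not the code itself, but that everything goes through the one add call on the one list – which is what keeps developers and sponsors looking at the same picture.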

Dual cores on the desktop were easy; 2 + n cores is more of a challenge.

I came across this post on the APC site today (posted 25th August 2006); the general gist of the article is that quad core processors are overkill for most users at the moment.

Personally I think that it is one of those “it depends” things. To understand why quad core processors might not make much of a difference to your current stable of desktop applications, you need to understand what dual core processors did for the desktop – from there you can begin to see the kinds of applications that will be able to leverage an increasing number of cores going forward.

Why do dual cores help desktop applications?

Desktop applications on modern operating systems are multi-threaded beasts, which means that they have multiple logical threads of execution, and the runtime and operating system decide when each of those logical threads gets a slice of dedicated processor (or core) time.

Usually with a desktop application today there is one thread dedicated to serving up the user interface, one or more threads dedicated to managing memory and performing background housekeeping functions (depending on whether you are running on a runtime, and which runtime it is), and zero or more threads specifically started by the developer to perform background processing tasks.
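
As a minimal sketch of that pattern (in Python purely for illustration, with made-up function names), here is one thread standing in for the UI and one developer-started worker thread grinding through a background job. These are exactly the logical threads of execution that the runtime and operating system have to schedule onto whatever cores are available.

```python
import threading
import time


def background_task(results):
    """A developer-started thread performing a long-running background job."""
    results["total"] = sum(i * i for i in range(5_000_000))   # stand-in for real work


def ui_loop(worker):
    """Stand-in for the UI thread: it keeps 'responding' while the work continues."""
    while worker.is_alive():
        print("UI thread: still responsive...")
        time.sleep(0.1)
    print("UI thread: background work finished.")


if __name__ == "__main__":
    results = {}
    worker = threading.Thread(target=background_task, args=(results,))
    worker.start()      # the operating system decides when this thread gets core time
    ui_loop(worker)     # meanwhile the 'UI' thread keeps getting its own time slices
    worker.join()
    print("Result:", results["total"])
```

In a native desktop application a second core gives the scheduler somewhere to put the background work so the UI thread is not fighting it for the same processor (Python’s own global interpreter lock muddies this particular example, which is why it is only a sketch of the idea).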

On a single core machine you can only do one of these at a time, so if memory management is taking a long time the user feels it because the UI stops responding. A dual core machine instantly reduces this contention for processing resources, so desktop applications benefit from the two cores almost by default, simply by having somewhere to spread the load (not to mention sharing that resource with every other part of the operating system).

Quad Core and Diminishing Returns

So if dual core machines make desktop applications better, why doesn’t a quad core processor have double the effect? Well, the reality is that it probably does in terms of raw capacity, but even with a single core processor things were OK – a dual core allowed us to do two things at once, and with a quad core we are running out of useful things to do with the extra capacity. But that is desktop applications.

What kind of software can take advantage of the increasing number of cores? Well, there are a couple of obvious ones, like computer games with a large number of independent actors – those kinds of programs need oodles of processing power, and the way they are implemented today is through a number of less than obvious hacks which make the code harder to maintain.

Scientific applications like Mathematica will also be able to take advantage of multiple cores, because their algorithms break down an expression and operate on each of its elements in parallel. In essence this is what guys like Joe Duffy are trying to bring to mainstream development.
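
As a rough sketch of that “break the work into independent pieces and operate on each in parallel” idea, here is a Python illustration using a process pool. The expression, its terms and the evaluate_term function are invented for the example and have nothing to do with Mathematica’s actual implementation; the point is simply that each independent piece can be handed to a different core, so more cores means more pieces evaluated at once.

```python
from multiprocessing import Pool


def evaluate_term(term):
    """Hypothetical evaluation of one independent piece of an expression."""
    base, exponent = term
    return base ** exponent


if __name__ == "__main__":
    # An expression broken down into independent terms: 2^20 + 3^15 + 5^10 + 7^8
    terms = [(2, 20), (3, 15), (5, 10), (7, 8)]

    # Each term can be evaluated on a separate core at the same time.
    with Pool() as pool:
        partial_results = pool.map(evaluate_term, terms)

    print("Sum of terms:", sum(partial_results))
```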

Personally I’m looking forward to a day when counting the number of cores you get on a processor is about as useful as quoting the size of a disk in bytes. I’m sure we will be able to come up with something to do with all that processing power :)

In the meantime quad core processors will make our computers as a whole more responsive, even if they don’t make a huge difference when we focus solely on a single application.

WPF/E, Expression Blend, Expression Design and Expression Web

It was a big night overnight! Scott Guthrie announced the availability of WPF/E (December CTP) on his blog and also linked to a couple of demonstration pages. They are actually pretty slick – I really like the Page Turner. Of course, what really matters are the tools that we get to use, and Microsoft (finally) made a big public announcement about where they are heading with the Expression Suite. In particular, the Expression Suite products have some new names:

  • Expression Interactive Designer is now Expression Blend
  • Expression Graphic Designer is now Expression Design
  • Expression Web Designer is now Expression Web

Exciting times! Microsoft sure knows how to pile on the new technology; I just hope people are good at finding their new cheese. It looks like Microsoft is doing one of their classic shape-shifting moves and becoming a vendor of what the market needs – and right now the market needs tools and platforms that allow developers (and designers) to produce beautiful user experiences.

The Mac crowd had better watch out – they don’t have a monopoly on cool anymore :)

Update: I just noticed Long Zheng’s post on this subject. Long posted pictures of me using a build of Blend at the Ready Summit in Melbourne – a demo I had done accidentally. We tried to keep the leak low key, but what can you do when something is picked up by Mary Jo Foley? I got the Blend bits from Chuck, who unfortunately had to answer to the product team about the accidental leak.

The good thing is that it is all out in the open now and we can start to enjoy this cool new product. I had an opportunity to get quite familiar with the new Blend interface and it is light years ahead of the way things were with EID.