
Software architecture with Grady Booch

Posted by Julian Seidenberg on 5 February 2007 | 0 Comments


I recently attended a round-table discussion with Grady Booch. Yes, the Grady Booch. What, you've never heard of him? If you studied Computer Science you are sure to have at least one book of his. He is one of the gurus of software development. He is now working as "chief scientist" for IBM.

Read his blog here and another blog of his here.

You can also watch his recent Turing Lecture on "the promise, the limits and the beauty of software". It is very interesting.

Here are some tidbits from the discussion with him:

Functional programming languages (like LISP, Scheme and SML) failed largely because they made it very easy to do very difficult things, but made it too hard to do the easy things.

The current buzzword for revolutionizing the software industry is SOA: Service Oriented Architecture. Grady calls it "Snake Oil Oriented Architecture". It is just re-branded "Message Oriented Architecture". The idea is to expose services and describe them using WSDL. This decreases coupling between systems. The service interface becomes the thing to test against; the rest of the software application becomes a black box. A meta-architecture emerges: no software is an island unto itself.

It is a good idea, but the hundreds of WS* standards are so complicated and ill-defined that Microsoft's and IBM's implementations end up being incompatible. Lesser companies have no hope of ever implementing these crazy so-called standards. Just another scheme by the big companies to lock people into their software.

Bill Higgins' REST style of SOA is much more promising. It builds on something as simple as HTTP instead of the complex transfer protocols of the WS-Vertigo world.
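To make the contrast concrete: in the REST style, the whole service contract is just HTTP verbs applied to a URL space, with no WSDL document or SOAP envelope in sight. Here is a minimal, hypothetical sketch (the "orders" resource and the `route` function are my own illustration, not anything Booch or Higgins presented):

```python
# Minimal sketch of a REST-style service interface (hypothetical example).
# Instead of a WSDL contract, each resource is a URL and each operation
# is a plain HTTP verb; the contract is visible in the URL space itself.

def route(method, path, resources):
    """Dispatch a (verb, URL path) pair against a plain dict-of-dicts store."""
    parts = [p for p in path.split("/") if p]      # "/orders/42" -> ["orders", "42"]
    if not parts or parts[0] not in resources:
        return 404, None                           # unknown resource collection
    collection = resources[parts[0]]
    if method == "GET" and len(parts) == 2:        # GET /orders/42 -> one item
        key = parts[1]
        return (200, collection[key]) if key in collection else (404, None)
    if method == "GET" and len(parts) == 1:        # GET /orders -> list of ids
        return 200, sorted(collection)
    return 405, None                               # verb not supported here

# A toy "orders" service.
store = {"orders": {"42": {"item": "book", "qty": 1}}}

status, body = route("GET", "/orders/42", store)   # -> 200 with the order data
```

The point of the sketch is that a lesser company *can* implement this: there is nothing here but URLs, verbs, and status codes, which is exactly the appeal over the WS-* stack.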

But back to software architecture...

The next big challenge in software architecture is concurrency. Raw clock speed has just about reached its physical limit. Chip companies are now putting multiple copies of the same CPU onto a single chip. The result is that applications can no longer just be run faster. They have to be run in parallel in some way. For example:

Dreamworks' computer-animation studio uses 10,000 servers in a production pipeline to render movies like Shrek 3. They will soon switch to multi-core processors, but will have trouble distributing the workload to take advantage of all those cores.

The game company EA has the same problem. The PlayStation 3 uses the Cell processor, which has an 8-core CPU. How does one take advantage of all 8 cores? EA segments their games into simple concerns: graphics on one core, audio on another, AI on yet another, etc. But the company admits that they are using only about 10% of the processor's capacity. So much potential computing power is wasted, because it is really difficult to parallelize something as complex as a video game.
A typical Google node (and there are many around the world) consists of about 100,000 servers, but Google has a relatively "easy" problem: search is "easy" to parallelize.
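The difference between EA's situation and Google's is roughly task parallelism versus data parallelism. Rendering frames or searching documents splits into independent chunks, so a plain parallel map soaks up as many cores as you have. A hedged sketch of that "easy" case (the `render_frame` stand-in is my own illustration, not Dreamworks' actual pipeline):

```python
# Sketch of the "easy" data-parallel case: independent work items
# (frames, documents, index shards) fanned out across a pool of workers.
from concurrent.futures import ThreadPoolExecutor

def render_frame(frame_number):
    """Stand-in for an expensive, fully independent unit of work."""
    return f"frame-{frame_number:04d}.png"

def render_in_parallel(frame_numbers, workers=4):
    # Frames don't depend on each other, so a parallel map needs no
    # locking or coordination; results come back in input order.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(render_frame, frame_numbers))

frames = render_in_parallel(range(8))
```

A game loop has no such luck: physics, AI, and rendering all share mutable state every tick, which is why carving the game up by subsystem (one concern per core) leaves most of the cores idle.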

The perfect architecture doesn't exist. Good architectures evolve over time. The first version of Photoshop wasn't very good, but it has undergone many rebirths. Amazon's computer systems can handle the loss of an entire data-center without a shopper ever noticing. It certainly wasn't always that way, but by gradual refinement they have built (and are continuing to build) a better and better architecture.

A typical EA game costs about $15 million in development cost alone (that is without the cost of licensing, marketing, or distribution). Two kids in a garage can no longer create amazing software. They can have a great idea, but it has to evolve into something much more complex to be truly useful. (On that note: Google is a company most seriously in need of adult supervision; way too much money in the hands of kids. They will soon face a mid-life crisis just like IBM did in the past and Microsoft is in the middle of right now - just look at the state of Windows Vista.)

Some principles for a good architecture:

  • Crisp and resilient abstractions: use an object-oriented view of the world, rather than an algorithm-based view. Think about things instead of processes (this idea dates back to Plato).
  • Good separation of concerns: that is in one sense obvious, but it is also really hard to get right. It is very tempting to put bits of logic in the wrong places in the architecture.
  • Balanced distribution of responsibilities: no part of the system should dominate the entire architecture.
  • Simple systems: the holy grail; very few software companies get to this point. The best systems are ones that actually decrease their amount of code over time. Good developers find ways to do the same functions more efficiently.

How do you tell a good architecture when you see one? Ask the following questions:

  • Do you have a software architect? (or, at most, 2 - 3 people sharing the role)
  • Do you have an incremental development process? (not waterfall, but releasing a new version every week or so)
  • Do you have a culture of patterns? (design patterns are beautiful and the best thing for creating good software)

If the answer to all three questions is "yes", then chances are you have a good architecture - or, if you do not have one at the moment, you will gradually evolve towards one.

4+1 Architecture

Want to learn about good architecture? A good place to start is the 4+1 view model of software architecture. Software needs to be envisioned from multiple different perspectives simultaneously. Just as there can't be a single 2D diagram outlining the plan for a house, there can't be a single view of a software application. [I might add that there can't be just a single view of the Universe. The Vedic literature therefore describes the Universe from 4 different viewpoints simultaneously.]

As for Web 2.0: it is a meme, an idea, a flagpole that you can hang almost anything off of.

As for the Semantic Web? Developers don't understand normal software architecture properly, so what chance is there for them to understand something as complicated as semantically aware software? So, in Grady's opinion, the semantic web is a long, long way off.

