December 15, 2010

Time To Look Seriously At Software Developer Apprenticeships

We have a problem in this country.

Well, we have many problems, but I want to talk about one in particular.

When I do my software craftsmanship spiel at conferences and wotnot, I start by putting software into context with a very brief history lesson.

In 1943 there was one electronic programmable computing device in the world. You can see a working replica of it at the National Museum of Computing at Bletchley Park.

Since that first quantum leap into the computing age, the growth of computing devices has been exponential. Today, we estimate there are more than 500 billion computing devices in operation - everything from desktop computers to the microcontroller that stops your toast from burning.

If the trend continues, and it shows no signs of abating, by 2050 computing devices could outnumber insects. And by 2100, there could be more computing devices on Earth than living organisms. Including bacteria. They'll be swimming through your bloodstream, deciding which cells to kill.

And the programs that these computing devices run will be written by...

Scared yet?

Another sobering trend is the complexity of the software these computing devices will be running. Take Microsoft Word as an example. Word 1.0 contained 30,000 lines of code, or thereabouts. Word 10 contains a staggering 10 million LOC. We see this trend everywhere. The first microcontrollers used in cars were relatively simple, but today a luxury family saloon can be running anything up to 100 million LOC, if you throw in the MP3 player, the digital radio, the SatNav and all the other stuff cars seem to require these days.

Modern civilisation is powered by computers, and the sole purpose of computers is to run programs. Think of the FTSE 100 here in the UK. Is there a single company on that list that could survive even a few weeks without its software? It's no coincidence that for decades now disaster recovery plans have focused on protecting data from floods, famines and nuclear wars.

The UK, and the developed world in general, is much deeper into the Information Age than maybe we realise. Computers and software have become as central to our society as clean water and antibiotics. And in the decades to come, they will play an increasingly pivotal role.

This makes the people who write computer software as fundamentally important to Digital Britain as the people who built railways and ships were to Steam Britain.

Like many of you, I've witnessed large, established organisations brought to their knees by their inability to create and adapt the software they need to drive their business. I've watched financial organisations prevented from rolling out new products and services while competitors steal a march on them. I've watched dotcoms ground into dust by their inability to respond to user feedback fast enough. I've watched traditional bricks-and-mortar businesses that have been around for over a century disappear almost overnight because their whole game changed with the release of a single computer system that made them instantaneously obsolete. Take the record industry: right now it's floating aimlessly around like a ghost that hasn't realised it's dead yet. All that damage inflicted on a 100-yr-old multi-billion pound industry by a simple digital audio file format.

This is just the beginning. As computers and software become more and more integrated into every aspect of our lives, the ability to produce working software and to adapt software based on feedback will become the primary limiting factor on not just our "digital economy", but our entire economy and our entire standard of living.

Whether or not programmers take steps to make their code reliable and easy to change will show up on the balance sheets of businesses of all sizes. It will be increasingly evident in the price of gas and electricity. It will show up in the success or failure rates of schools and hospitals. It will pervade every aspect of modern life.

As programmers, we must rise to the challenge. The tools and techniques we use to create software have changed little in 50 years. It still ultimately boils down to someone typing code into a text editor. And while the end product is orders of magnitude more complex than the programs being written in 1960, the underlying principles that govern software have stayed pretty much the same. Practices that were known to produce more reliable software, and that make software easier to adapt and evolve, have been understood for decades. OO programming is much older than you think. 40 years old, in fact. Unit testing is even older. Even something we might consider "shiny and new" like refactoring turns out to have been around for longer than today's computing undergraduates have been alive.

And yet, still, the vast majority of programmers don't know these basic practices and fewer still apply them. We have a lot of catching up to do. It's not acceptable that 90%+ of programmers don't refactor. (That's being generous - it's probably 99%+.) It's not acceptable that 90%+ of programmers don't write tests for their code. It's not acceptable that 90%+ of programmers can't tell clean code from dirty code and think that m_XfrtYort_obj is a perfectly good name for a variable.
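To make that last complaint concrete, here's a hypothetical before-and-after, sketched in Python. The names and the tax calculation are invented purely to illustrate the gap between dirty and clean code; they're not from any real codebase.

    # Dirty: the reader has to reverse-engineer what the code means.
    def calc(m_XfrtYort_obj, x):
        return m_XfrtYort_obj * x * 0.2

    # Clean: the same logic, but the names tell the story.
    VAT_RATE = 0.2

    def sales_tax(unit_price, quantity):
        return unit_price * quantity * VAT_RATE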

It's not acceptable, and not just for reasons of professional ethics or pride. It's not acceptable because our economy and our quality of life will suffer. Indeed, it almost certainly already does.

For Digital Britain to have a viable future, the 90%+ that doesn't needs to be transformed into the 90%+ that does. And this is where our challenge really begins. Let's face it, it's not going to happen. Not on anything like the scale that it should.

Firstly, it's not going to happen because the change is just too great. 90% of UK software developers adds up to hundreds of thousands of people. All of whom are earning good money not doing it. If they cared, they'd be doing it. It's pretty simple, really.

Secondly, in the UK, computing education seems to be in decline. I mean real computing, as opposed to learning how to use a spreadsheet. This is an alarming paradox. Year on year, computing becomes more central to our lives. And year on year, intake into computing degrees gets weaker and weaker.

Thirdly, our computing education in this country is preparing students for a career in a version of computing most of us don't recognise. Students devote the majority of their time learning theory and skills that they almost certainly won't be applying when they get their first proper job. Computing schools are hopelessly out of touch with the reality of computing in the real world. While employers clamour for TDD or refactoring skills, academics turn their noses up at them and focus on things like formal specification and executable UML and compiler design, along with outdated and thoroughly discredited "software engineering" processes.

When you leave university with your computer science or software engineering degree, prepare to discover that you have just wasted 3-4 years of your life learning stuff we don't need you to know.

So here is our challenge: where will the next generation of good programmers come from? They're not in their bedrooms right now writing games on their Sinclair Spectrums or Acorn Micros like my generation was. They're not in universities learning the practical skills professional programmers need. And, with the exception of a small minority that probably includes you (since you're reading this), they're not at work constantly striving to improve themselves and the quality of their code.

Where are the railway builders and bridge builders and ship builders of the Information Age going to come from? And what can we do to help them find their way?

This will be my primary concern in 2011.

The recent furore about university fees may present us with an opportunity. I happen to believe that "software development" exists in its own educational and professional space, distinct from "IT" or "computer science". To the best of my knowledge, software development is not being addressed as a practical, vocational discipline, like carpentry or nursing. But it should be. The only way I've found to learn to be an effective programmer is to write programs. Lots of them.

Although there's undoubtedly useful theory to pick up along the way, the best programmers I've met have learned by doing it and have thousands of hours of practical, hands-on experience. They also have good motivations - they WANT to be better and are their own harshest critics - and they have good influences. They seek out the better programmers and learn from them. One day, when they have improved enough, they become influences themselves and warmly embrace those who seek them out, generously sharing their knowledge and experience.

It's high time, then, that we looked seriously at the idea of software developer apprenticeships as an alternative to the traditional educational routes. Someone leaving school at 17-19 with decent qualifications in maths, English and science could be given the choice to pursue a career as a software developer - a proper one, a good one - by starting an apprenticeship that may last 4-7 years, during which time they will progress and earn money instead of getting deeper and deeper into debt.

I know we can cover the theory along the way, and do it in a more practical and engaging way. For example, we can create katas for the fundamental data structures and algorithms and apprentices can learn to roll a binary heap or solve the travelling salesman problem in a test-driven way, ensuring not only a practical appreciation of the theory but also a reliable, clean implementation - something most computer science graduates seem to struggle with.
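To give a flavour of what I mean, here's a minimal sketch of a binary heap kata in Python, using nothing but the standard unittest module. The MinHeap class and the test here are my own illustration of the idea, not a prescribed syllabus - the point is that the apprentice grows the data structure test-first.

    # A minimal, illustrative "binary heap" kata - grown test-first.
    import unittest

    class MinHeap:
        """Array-backed binary min-heap."""

        def __init__(self):
            self._items = []

        def push(self, value):
            # Append at the end, then sift up to restore the heap property.
            self._items.append(value)
            i = len(self._items) - 1
            while i > 0:
                parent = (i - 1) // 2
                if self._items[parent] <= self._items[i]:
                    break
                self._items[parent], self._items[i] = self._items[i], self._items[parent]
                i = parent

        def pop(self):
            # Remove the smallest item (the root), then sift the last item down.
            items = self._items
            smallest = items[0]
            last = items.pop()
            if items:
                items[0] = last
                i = 0
                while True:
                    left, right = 2 * i + 1, 2 * i + 2
                    smallest_child = i
                    if left < len(items) and items[left] < items[smallest_child]:
                        smallest_child = left
                    if right < len(items) and items[right] < items[smallest_child]:
                        smallest_child = right
                    if smallest_child == i:
                        break
                    items[i], items[smallest_child] = items[smallest_child], items[i]
                    i = smallest_child
            return smallest

    class MinHeapTest(unittest.TestCase):
        def test_pops_items_in_ascending_order(self):
            heap = MinHeap()
            for value in [5, 1, 4, 2, 3]:
                heap.push(value)
            self.assertEqual([heap.pop() for _ in range(5)], [1, 2, 3, 4, 5])

    if __name__ == "__main__":
        unittest.main()

The travelling salesman kata could be grown in exactly the same way: a failing test first, the simplest code that passes, then refactoring.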

Rigour is not incompatible with practicality. Indeed, they really ought to finally - after a long and difficult engagement - tie the knot and do the decent thing so more of us can benefit from both.

It may even be possible to structure an apprenticeship such that it leads to academic qualifications, should any of the more progressive institutions wish to branch out and engage with us practitioners in a way we find more meaningful. Although it shouldn't be necessary. Who would you hire? A recent CS graduate, or someone who had spent the years from 18 to 23 serving a full-time apprenticeship at, say, 8th Light or Eden Development (or even, dare I say it, Codemanship)?

But I think it's possible to have our cake and eat it. I see no incompatibility between academic rigour and on-the-job experience, provided everyone involved approaches apprenticeships in the right spirit and with an open mind (or an open heart, or with an open beard and open-toed sandals, or whatever).

If each practising software craftsman were to take on one apprentice, we could double our numbers every few years. Ultimately that may only lead to a few thousand more software craftsmen practising in the UK, but that's a few thousand more than we probably would have had. And never forget that the best programmers can be ten times as productive as the worst. This could be a few thousand craftsmen doing the work of a few tens of thousands of bog-standard mortgage-driven developers, in terms of their value to the economy.



