May 17, 2013
Straw Man TDD
A lot of the criticisms of Test-Driven Development I hear are really attacks on a mythical version of TDD that no right-minded advocate ever put forward.
Nevertheless, being a TDD trainer and coach, I do still devote time to answering these straw man criticisms and objections. I thought it would be useful to collect some of the most common misconceptions in one place that I can point people to when I'm just too tired and/or drunk to answer them any more.
1. TDD means not doing any up-front thinking about design
Nobody has ever suggested this. It would be madness. Read books like Extreme Programming Explained again. You'll see sketches. You'll see CRC cards. You'll even see UML. (Gasp!)
The question really is about how much up-front design is sufficient. And the somewhat glib answer is "just enough". I tend to qualify that as "just enough to know what tests you need to pass". So, if your approach is focused on roles, responsibilities and interactions, then I'd want to have a high-level idea of what those are before diving in to code. If it's more an algorithmic focus, I'd want to have a test list that can act as a roadmap for key examples that - taken together - explain the algorithm. And so on.
I'd stop at the point where I'm asking questions that are best answered in code (e.g., is this an interface? Should this method be exposed?). Code is for details.
2. TDD takes significantly longer because you write twice as much code
Once you've got the hang of TDD - and that can take months of practice - we find it doesn't take significantly longer. Mostly because the bulk of our time isn't spent typing, it's spent thinking and, when we don't take care, fixing problems. Fixing problems, we find, generally takes more time than avoiding them. So much so, in fact, that working in the very short feedback loops of TDD and testing thoroughly as we go can turn out to be a way of saving time.
Most developers and teams who report a loss of productivity when they try TDD are actually reporting the learning curve. Which can be steep. This is why it can make good commercial sense to seek help in those early stages from someone who's been there, done that and got the t-shirt.
3. TDD leads to mountains of test code that make it harder to change your source code
There are three key steps in TDD, but most developers miss out or skimp on the third one - refactoring. So, when they report that they tried TDD for a few months, but found after a while that they couldn't change their source code without breaking loads of unit tests, I'm inclined to believe that this is what's really happened.
Test code is source code. If the test code is difficult to change, your code is difficult to change. So we must apply as much effort to the maintainability of test code as to the code it's testing. It must be easy to read and understand. It must be as simple as we can make it. It must be low in duplication. And, very importantly, it must be loosely coupled to the interfaces of the objects it's testing.
Think of UI testing. Imagine we've written thousands of lines of script that click buttons and populate text boxes and all that sort of thing, binding our UI tests very closely to the implementation of the UI itself. So if we want to change the UI design - and we will - a whole bunch of dependent tests break.
Better to refactor our UI test scripts so that interactions with the concrete UI are encapsulated in one place and invoked through meaningfully-named helper functions, so we can write test scripts in the abstract (e.g., submitMortgageApplication() instead of submitButton.click()).
The same applies to unit tests. If we repeatedly invoke the same methods on an object in our tests, better to encapsulate those interactions behind abstract and meaningful interfaces so it all happens in one place only.
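As a sketch of that encapsulation (the domain, class and method names here are hypothetical, invented purely for illustration):

```python
class FakeUI:
    """Stands in for a concrete UI toolkit so the example is runnable.
    It simply records every widget interaction it's asked to perform."""
    def __init__(self):
        self.actions = []

    def fill(self, field, value):
        self.actions.append(("fill", field, value))

    def click(self, button):
        self.actions.append(("click", button))


class MortgageAppDriver:
    """The one place that knows about concrete widgets. If the UI
    design changes, only this class changes - not the tests."""
    def __init__(self, ui):
        self.ui = ui

    def submit_mortgage_application(self, applicant, amount):
        # All the button-clicking detail lives here, behind a
        # meaningfully-named abstraction the tests can call.
        self.ui.fill("applicantName", applicant)
        self.ui.fill("loanAmount", str(amount))
        self.ui.click("submitButton")


# Test scripts now read in terms of the domain, not the widgets:
ui = FakeUI()
driver = MortgageAppDriver(ui)
driver.submit_mortgage_application("Jane Doe", 150000)
```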
4. TDD does not guarantee bug-free code
This isn't a straw man, per se. But to say that "we don't bother doing X because X is not completely perfect" isn't much of an argument against doing X when no approach guarantees perfection. When people throw this one at me, I'm naturally keen to see their bug-free code.
Let's face it, the vast majority of teams who don't do TDD would benefit from doing something like TDD. They'd benefit from working towards more explicit, testable outcomes. They'd benefit from shorter and less subjective feedback loops. They'd benefit from continuous refactoring. They'd benefit from fast, cheap regression testing. Their software would be more reliable and easier to maintain, and - once they've worked their way up the learning curve - it won't cost them more to achieve those better results. There are, of course, other approaches than TDD that can achieve these things. But, by Jiminy, they don't half feel like TDD when you're doing them (which I have).
5. You are not designing domain abstractions, you are designing tests
This is a new addition to the fold, courtesy of some chap on That Twitter who obviously thinks I don't know one end of a domain model from a horse's backside.
Now, I've spent a fair chunk of my career modeling businesses - back in the good old days of "enterprise architecture", when that was where the big bucks were. So I do know a thing or two about this.
What I know is that those domain abstractions have to come from somewhere. How do we know we need a customer, and that a customer might have both a billing address and a shipping address, which may be the same address, and that a customer may be a person or a company?
We know it because we see examples that require it to be so. If we don't see examples on which these generalisations are based, then our domain model is pure conjecture based on what we think the world our systems are modeling might look like (probably). I design software to be used, and it has been considered a good idea to drive the design from examples of usage for longer than I've been alive. Even when we're not designing software, but simply modeling the domain in order to understand it - perhaps to improve the way our business works - it works best when we explore with examples and generalise as we go. In TDD, we call this "triangulation".
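To make triangulation concrete, here's a minimal sketch using the customer example from above (the class and the specific addresses are hypothetical, made up for illustration): each example pins down a behaviour, and the generalised model is only as clever as the examples demand.

```python
class Customer:
    """Generalised from the two examples below: a customer has a
    billing address and a shipping address, which may be the same."""
    def __init__(self, billing_address, shipping_address=None):
        self.billing_address = billing_address
        # Triangulated from the "same address" example: if no separate
        # shipping address is given, the billing address is used.
        self.shipping_address = shipping_address or billing_address


# Example 1: a customer who ships to a different address
mail_order = Customer("1 High St", "2 Low Rd")

# Example 2: a customer whose billing and shipping addresses coincide
walk_in = Customer("3 Main Ave")
```

Only when a third example demands it (say, a corporate customer with many shipping addresses) would we generalise further - otherwise we'd be committing Speculative Generality.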
I will very often sketch out the concepts that play a part in a collection of scenarios - or examples - and create a generalised model that satisfies them all as a basis for the tests I'm about to write. (See Straw Man #1, of which this is just another example.)
When we generalise without exploring examples, we tend to find our domain models suffer from a smell we call "Speculative Generality". We can end up with unnecessarily complex models that often turn out not to be what's needed to satisfy the needs of end users.
Good user-centred software design is a process of discovery. We don't magic these abstractions and generalisations out of thin air. We discover the need for them. At its very essence, that's what TDD is. I can't think of a single mainstream software development method of the last few decades that wasn't driven by usage scenarios or examples. There's a very good reason for that. To just go off and "model the domain" is a fool's errand. Model for a purpose, and that purpose comes first.
If you practice TDD, but don't think about the domain and the design up-front, then you're doing TDD wrong. It's highly recommended you think ahead. Just as long as you don't code ahead.
6. TDD doesn't work for the User Interface
Let's backtrack a little. Remember those good old days, about 10 minutes ago, when I told you that you should decouple your test code from the interfaces that it tests?
Those were the days. David Cameron was Prime Minister, and you could buy a pint of beer for under £4.
Anyhoo, it turns out - as if by magic - that it's not such a bad idea to decouple the logic of user interactions from the specific UI implementation in the architecture of your software. That is to say, your knobs and widgets in the UI should do - to use the scientific parlance - "f**k all" as regards the logic of your application.
The workflow of user interactions exists independent of whether that workflow is through a Java desktop application or an iOS smartphone app.
A tiny sliver of code is needed to glue the logical user experience to the physical user experience. If more than 5% of your code is dependent on the UI framework you're using, you're very probably doing it wrong.
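A minimal sketch of that separation, in the style of a humble presenter (the class names and the lending rule are invented for illustration, not a real mortgage calculation): the workflow logic depends only on an abstract view, and the concrete toolkit glue is the thin sliver.

```python
class MortgageView:
    """Abstract view: the workflow depends only on this interface,
    never on Swing, iOS, or HTML widgets."""
    def show_decision(self, message):
        raise NotImplementedError


class MortgagePresenter:
    """The logical user experience: pure logic, trivially unit-testable
    without any UI framework in sight."""
    def __init__(self, view):
        self.view = view

    def apply(self, income, loan_amount):
        # A made-up lending rule, purely for illustration
        if loan_amount <= income * 4:
            self.view.show_decision("Approved")
        else:
            self.view.show_decision("Declined")


# The "tiny sliver" of glue would implement MortgageView once per
# concrete toolkit; for testing, a fake view is all we need:
class FakeView(MortgageView):
    def show_decision(self, message):
        self.message = message


view = FakeView()
MortgagePresenter(view).apply(income=30000, loan_amount=100000)
```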
And for that last 5%... well, you'd be surprised at how testable it really is. It may take some ingenuity, but it's often more do-able than you think.
Take web apps: all it takes is a fake HTTP context, and we've got ourselves 100% coverage. (Whatever that means.) Java Swing is equally get-at-able. As are .NET desktop GUIs. You just have to know where to stick your wotsit.
If you'd like to see a few other TDD myths debunked, while getting some hands-on practice in an intensive and fun workshop, join us in London on July 13th.
May 15, 2013
How Can You Attract And Retain Great Developers?
Companies often wonder how they can attract and retain great software developers.
Well, here's the thing about great software developers. They don't approach what they do as just a job. To them, it's a passion, a calling. They do it because they love to do it.
To make sense of this, let's change the context of the question.
You have started a band. How can you attract and retain great musicians for your band?
Here's how you might not do it:
1. Constantly remind them that this is your band and that they must do as you tell them
2. Get them playing awful, tacky music (e.g., that song from Four Weddings & A Funeral, anything by Meatloaf) at weddings and school proms
3. Force them to play with crappy musicians who make tonnes of mistakes, and don't give them time and space to help those musicians improve. And then blame them if it sounds crap.
4. Consistently approach the band's musical output with a "that'll do" attitude. "Yeah, the vocal's off-key in the chorus, but we're on a deadline so let's just print the CDs already"
5. Make unrealistic demands of them. "We're going to be playing 2 shows a day for the next 6 months", "We've got 5 days to rehearse this 90-minute set"
6. When they do something amazing, ignore it. Focus on you. You're the band leader, after all.
7. Routinely remind them that they are dispensable. Great musicians grow on trees, remember?
8. Discourage a musician-led culture in your band, and restrict time to practice, learn and grow as musicians. You're there to make money. Who gives a shit about day trips to NAMM or time off to attend guitar clinics?
9. Most important of all, remember: when the band's a success, it's because you're a great band leader. When it's a failure, it's because your musicians suck.
Now, ask me again: how can you attract and retain great software developers?
Legacy Code Without Automated Tests Is Not An Excuse For Less Rigour
While I'm on the subject of bad ideas when refactoring legacy code, I feel I should draw attention to what appears to be a common misunderstanding - even among us experts.
I watch a lot of screencasts where folk demonstrate how they would refactor legacy code. Typically, they start by stating bluntly that you shouldn't refactor code without automated tests.
Then they go on to do exactly that so that they can write their first automated unit test in order to make the code initially testable - usually to introduce some kind of dependency injection.
Some excuse themselves from the need to re-test while they do these initial refactorings because they were using automated refactoring tools. I'm not quite sure how this urban myth got started, but let me burst that bubble right here.
At the time of writing, no refactoring tool is that reliable. Even if you're expert at using the tool, and at selecting all the right options for more complex refactorings - which most of us aren't - every once in a while the tool screws up our code. And when I say "once in a while", I mean regularly.
I've learned from the school of Hard Knocks to re-test my code even after using the simplest automated refactorings. Even if those tests run slowly. Even if I have to follow manual test scripts and click the buttons myself.
We fear legacy code because it's difficult to change, and it's difficult to change because it's easy to break. The time to give ourselves a hall pass on regression testing is not right at the start, when the code is probably at its most brittle.
In these early stages, when our priority is probably getting fast-running automated tests (i.e., unit tests) in place to enable the kind of architectural refactoring we want to do, we must approach the code with utmost care. That means we need to apply the greatest rigour.
May 10, 2013
Making The Untestable Testable With Mocks - Resist Temptation To Bake In A Bad Design
Just a quick note before my next pairing session about using mock object frameworks to make untestable code testable.
Mocking frameworks have grown in their sophistication, for sure. But I fear they may have mutated into testing tools, rather than the design aids that their originators intended.
Say, for example, you're trying to write unit tests for some legacy code that depends on a static method which accesses the file system. We want unit tests that run quickly, and reading and writing files means slow unit tests. So we need some way to invoke the methods we want to test without them calling that static method.
Enter stage right: UberMock (or whatever you're using). UberMock solves this problem with some metaprogramming jiggery-pokery that makes it possible to specify that a mock version of a static method be invoked at runtime. We write unit tests that set up expectations on that mock static method call. That is to say: we expose an internal detail that the static method - in mock form - should be invoked.
That's a legacy code "gotcha". We now have unit tests. Hoorah! But these unit tests depend on this internal design detail. And make no mistake - it's a design flaw we'll want to get rid of later.
If we decide, after we've got some tests around it, to refactor this horrid code so that we're observing the Open-Closed Principle (the "O" in "SOLID" - meaning that classes should be open for extension but closed for modification, which isn't possible when we depend on static methods that can't be substituted with overridden implementations without the aforementioned metaprogramming jiggery-pokery), we cannot do so without rewriting our tests.
The tests we write that depend on internal design details of legacy code effectively bake in that legacy design, making refactoring doubly difficult at the very least.
If our ultimate aim is to invert that dependency on a static method, so that the code now relies on some dependency-injected abstraction, it tends to work out easier in the long run to put that abstraction in place first, and then use mocks to unit test that code.
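A minimal sketch of putting the abstraction in first (the class names are hypothetical, invented for this example): once the file-system access sits behind an injected interface, a plain hand-rolled fake does the job - no metaprogramming jiggery-pokery, and no test coupled to a static method we intend to delete.

```python
class FileStore:
    """The injected abstraction that replaces a hard-wired static
    file-system call in the legacy code."""
    def read(self, path):
        with open(path) as f:
            return f.read()


class ReportLoader:
    """The legacy logic, now depending on an injected abstraction
    rather than a static method it can't substitute."""
    def __init__(self, store):
        self.store = store

    def load(self, path):
        # Illustrative "business logic" applied to the file contents
        return self.store.read(path).upper()


# A plain hand-rolled fake - the tests know nothing about the old
# static method, so refactoring it away breaks nothing:
class FakeFileStore(FileStore):
    def read(self, path):
        return "quarterly figures"


loader = ReportLoader(FakeFileStore())
```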
Don't bake in a design that you'll later need to change
It's a little chicken-and-egg, I grant you. Ideally, we'd want unit tests around that code before we tried to introduce the abstraction, but how do we do that - without baking in the old design - until the abstraction's in place?
It's one of those situations where, I'm afraid, the answer is that you're going to have to be disciplined about it. There's usually no quick fix. You may have to rely on slow and cumbersome system tests for a while. Or even - gulp - manual testing.
But experience has taught me that, in the final reckoning, it can be well worth it to avoid pouring quick-drying cement on an already rigid and brittle design.
Ah, and I hear my next pairing session calling.
May 5, 2013
Music By Programmers - Week #1 Update
The album to raise money for maths and programming workshops at Bletchley Park and The National Museum Of Computing has been out for a week now.
And what a week it's been! Things kicked off on the previous Friday with an article that ended up being the BBC Tech News number two story, getting a link on the news home page, which generated a lot of interest.
Then on Monday we were assisted by a generous tweet from Stephen Fry.
Together with bags of other social media activity, and coverage by PC Pro, The Register and other noteworthy websites, the buzz was enough to propel Music By Programmers into the download bestseller charts on Amazon, iTunes and Google Play.
The limited edition CD went on sale around lunchtime on Monday, and sold out on Tuesday.
The week ended with interviews for other web news sites, as well as BBC local radio, about these computer programmers who were "storming the charts".
Naturally, the web having the short attention span that it does, we've tailed off quite spectacularly since Friday - though as of writing we're still in Amazon's Top 40 Dance & Electronica albums. As the saying goes, we've had our 15 megabytes of fame. Now the real work starts!
Of course, being pop stars for a week isn't really the point of it all - gratifying though it is to see something you've helped create up there among the Daft Punks and the will.i.ams for a short while. Something to tell the grandchildren. (If I never have grandchildren, I'll borrow someone else's and tell them.)
A huge "thank you" if you bought the album and spread the word. With no marketing budget and no label behind us, we're relying completely on word-of-mouth. Without your support, none of this would be possible.
How this all translates into sales, and therefore money raised, we shall have to wait and see. It can take months to get sales figures - and money - from the online retailers. My feeling is that we're well on our way to achieving our target, though.
But I doubt we're there yet, so we still need your support to make our goals happen. If you've not bought your copy yet, please consider downloading it today. Roughly £4-5 from every sale goes directly to these educational projects, so every download counts.
April 29, 2013
Music By Programmers - Help Start A Programming Club
If you've not heard yet, myself and five other programmer-type dudes have been working on an album of electronic music to raise the money to start a computer programming club at The National Museum Of Computing and parent-child maths workshops at Bletchley Park.
The album, Music By Programmers, goes on sale today. You can download it from iTunes, Amazon and Google Play.
Every penny of the profits goes directly to these projects, and every download is essentially a donation of £4-5, depending on where you buy it. Be assured: every download makes a difference.
You'll also be able to buy a very limited edition CD version, featuring bonus tracks in a spiffy full-colour digipak from the Bletchley Park online shop later today. Only 50 of these exist.
Your support is vital to making these projects possible. If electronica's not your cup of tea, you can donate instead. There's a link on the website.
You can find out more by visiting the Music By Programmers website.
April 19, 2013
Dark Life & New Ways Of Seeing
This article in the Guardian about how some astrobiologists theorise that there could be a "hidden biosphere" that has evolved on Earth in parallel with the tree of life from which we sprang reminded me of that age-old problem of how we can expect to find things we're not looking for.
We similarly overlooked a big chunk of the mass of the universe because we looked at electromagnetic radiation, seeing only that which emits or reflects electromagnetic waves.
In software development, we too can naively interpret our inability to see something as the non-existence of the thing we can't see. Typically, we can't see it because we're not looking for it.
These could be, for example, the bugs nobody tested for. Like dark matter, and dark life, the bugs are still there, and they can still bite. But I've seen too many teams apply the strategy of not looking, as if that somehow means those bugs don't exist. This is like covering our faces and assuming that, because we can't see other people, they can't see us.
New ways of seeing are therefore vitally important. We can "see" dark matter by measuring its gravitational effects. And we could see dark life by applying tests for a wider set of biological possibilities. Then a whole new world (or universe) emerges out of the shadows, and our understanding is expanded.
Developers may believe their multithreaded code has few bugs, but that may be because they haven't tested it in multithreaded scenarios. They may believe their software is easy to use, but that may be because they haven't tested it with users who weren't involved in the design. They may believe their software is performant, but that may be because they haven't tested it under a high load. They may believe their classes are loosely coupled, but that may be because they haven't looked at a graph of class dependencies.
New ways of seeing offer up new possible understandings. And I can't help feeling we, as an industry, invest far too little in expanding our senses so we can expand our understanding of software. Too much of it is about "looking at text files", and I find that limits our vision and restricts our understanding.
April 6, 2013
Science & Software
Pairing with my apprentice-to-be, Will, on Friday, we got to chatting about the increasingly intimate relationship between software development and science.
I was reminded of something I heard at university (back in the days when we wrote our dissertations with quills). A PhD student who I hung out with had been working on a large-scale simulation of atoms in a crystal lattice to try and crack the problem of why washing powder clogs. His code was written in FORTRAN and was designed to run on the HP minicomputer - a monster of a computer, almost as powerful as my Android phone!
His research was building on work done by a previous PhD student, who had also written code to simulate the same crystals. His PhD was predicated on the assumption that he'd be able to take the existing code and adapt it for his research - in much the same way that an algorithm to traverse a tree and search for something could be adapted to traverse the same tree and search for something else.
The problem was that the existing code was impossible to understand, and the person who wrote it was long gone. So he lost several months rewriting the simulation from scratch.
Now, this was more than two decades ago. Much physics was still done with pen and paper or in the lab. But, increasingly, more and more research was done almost entirely using software to either search through large amounts of data collected from experiments, or to simulate physical systems that would be too expensive - or even impossible - to recreate in the lab.
I'm told by friends who carried on with physics that software-based research is much more common these days. And they often share anecdotes about the trouble software causes them that sound jolly familiar. I wouldn't be surprised if a lot of research time and money isn't being lost to the kinds of problems we come up against daily in business when software's involved.
When I studied physics, they encouraged us to keep a diary of the lab work we did. The idea was that if, for example, we got run over by a bus, someone else could read our lab diary and continue our research. Hoorah - progress continues unabated.
The lab diary codifies your method. These days, I suspect, the method may - in at least some key cases - be codified as software. If you can't understand the software, you can't understand the method, and you can't continue the research.
Similarly, if the software is buggy, then your method is buggy, and your results and their conclusions are suspect. Cue faster-than-light neutrinos. The interactions of subatomic particles at CERN are interpreted by software. My first instinct on hearing the sensational news was "I'd like to read their code".
As science becomes more and more reliant on software, the integrity of our science will rely more and more on the integrity of our software. As yet, this is a fringe topic in physics. Universities may teach computer programming and computational maths, but they don't really help or encourage students to write software to a high-enough standard.
I can't help feeling that some element of the discipline of writing good software would benefit science students. But a science degree is already a big ask in terms of time commitment. Throwing in a day a week of "software craftsmanship" or "software engineering" may be the straw that broke the camel's back.
I do think, though, that the model of apprenticeship I'm proposing to trial with Will (and A.N.Other, if I can find the right person) could present a solution.
This is something I'm going to give more thought to.
March 20, 2013
Music By Programmers Release Date, April 29th
Just a quick post for those of you who want to support maths and programming education, or who just like electronica.
The official release date for the Music By Programmers album is Monday April 29th.
It'll be available for download from all the usual outlets (iTunes, Amazon etc), and every penny of the proceeds will go directly towards parent-child maths workshops at Bletchley Park and programming workshops at The National Museum Of Computing.
These are very worthwhile programmes, and your support is vital to helping more children get to grips with maths and computing.
The new web site's up, so you can find out more and hear track previews at http://www.musicbyprogrammers.com
March 9, 2013
Music By Programmers - Raising Money To Educate New Programmers
As some of the more eagle-eyed among my Twitter brethren may have noticed, for the last few months I've been up to something in rare spare moments.
Finally, I can reveal what it is.
Over the last 4 years, I've spearheaded various shenanigans to raise money for the Bletchley Park Trust, and I've also been plenty busy rattling cages on the subject of getting kids programming.
I'm also a bit of an amateur musician (very amateur, some might say).
My latest wheeze has been to combine all these passions of mine, and the end product is called Music By Programmers.
Six software developers - real ones, coding day-to-day - who make music in our spare time have recorded a compilation album of electronic music which evokes that classic era when the likes of Kraftwerk, Jean Michel Jarre and Tangerine Dream were at their peak. When I began learning to program, this kind of music was always on in the background; the soundtrack to a golden age of kids learning to code.
We've got nine tracks from Chris Whitworth, Yuriy O'Donnell, Peter Camfield, Lance Walton, Brian Hogan and me. And, even if I say so myself, they're jolly spiffy.
Here's a sneak preview:
Made using only software (including the mastering, with Nagasaki Sound in Las Vegas generously donating their time and considerable expertise to make it sound - y'know - proper professional, like), we're going to try and sell as many downloads of the Music By Programmers LP as we can, and every penny of the profits will go directly to educational programmes at The National Museum Of Computing and Bletchley Park.
This is very new territory for all of us, and we have no idea how much we might make, but we've set a target to sell 2000 downloads and raise £10,000 in total. This may prove to be naive, foolish optimism. Or it may be we're underestimating the potential audience lurking out there. But, having raised similar sums several times before, we feel this might be a realistic goal.
£5,000 could sponsor programming events for kids at TNMOC, or codebreaking camps at Bletchley Park. It could buy a crate-load of Raspberry Pi's, or build web-based cryptography games for schools to use. Heck, we could do some genuine good with £50! It all helps.
The album will be released as a download in late April, with a bit of a fanfare to let the world and her husband know it's available, but in the meantime, we could really use some help spreading the word and building a buzz - well, you know how it is with this "Pop Music" that they have nowadays.
So, please check out the preview video, follow @ProgrammerMusic on Twitter and/or like us on Facebook.
Please spread the word - re-tweet, share, tell your friends, get it tattooed on your private parts and wave it at visiting royalty*, paint it on the side of passing asteroids**, and whatever else you can do to help us get the word out.
If we can reach the right people and meet our target of raising £10,000, that would do a lot of good for a lot of children. And you get an iron-clad excuse to do some cheesy 80's dancing, too. You can even roll up the sleeves of your jacket (the one with the giant shoulder pads), if you like. That's a Win-Win, in my book.
* Don't, obviously
** No seriously, though - don't