Has test-driven development fallen out of vogue?

If so, why?

Thanks for your help,
Rob

Test-driven development has not fallen out of vogue.

What has fallen out of vogue somewhat (thank God) is EXTREME!!!1111 Programming, yet another doomed-to-failure methodology cult that included strong test-driven development as one of its precepts.

(Note: There’s nothing wrong with the particular methodology that XP entails; it’s just another thing that Management fails to recognize is merely a tool, and not a commandment. Also, pair programming is annoying as hell.)

What are the features of XP? TDD, continuous integration, frequent releases, and pair programming? What does management do wrong?

Thanks,
Rob

12 practices of XP (per Kent Beck):
[ul]
[li]small releases (lots of small project iterations with usable code at the end of each – we do 1 week iterations)[/li][li]planning game (engineers estimate size and capacity of jobs, customer prioritizes)[/li][li]refactoring (early and often)[/li][li]testing (usually TDD)[/li][li]pair programming (2 engineers working together)[/li][li]sustainable pace (a consistent brisk pace is better than “crunch time”)[/li][li]team code ownership (spreading the expertise and knowledge around, anyone can change any code)[/li][li]agreed coding standards[/li][li]simple design (don’t build it if you don’t need it)[/li][li]metaphor (a common frame of reference for thinking about the system)[/li][li]continuous integration (pairs must integrate at least once a day, often more frequently)[/li][li]on-site customer (access to an actual user / customer is easy and constantly available)[/li][/ul]
Where does management go wrong? Thinking this is a universal panacea or silver bullet: XP requires dedicated, motivated, and trained, competent engineers. Treating people as interchangeable resources and figuring you can throw any old schlubs onto the team and it’ll still work. Or thinking you can pull people off the team, wreck its velocity, and XP will still make it all better. Not providing appropriate customers: you cannot just give the engineers whoever you can spare, and being a customer is a lot of work too. Agile projects succeed best when there is a “customer team”, with a broad range of experience relevant to the business problems being solved, available to work with the engineering team.

Some people love it, some hate it… I’m in the latter camp. :slight_smile:

Fortunately, as a UI / interaction designer I have a foot in each of the engineering and customer camps; I do most of my technical work alone (after consulting with the customers) and then work with the engineers to integrate changes.

I’m ambivalent about it, actually. I had some good experiences doing it when I was in college, but that was before pair programming was “Pair Programming.” We just did it that way because we had to work with a partner on the labs.

When it was mandated by a manager who got bitten by the XP bug, the conversation went something like this:

Manager: friedo, I want you to add this feature.
Me: OK.
Manager: And you’re going to use Pair Programming. It’s EXTREME!!!11.
Me: Uhh, OK.
Manager: So work with Steve on it.
Me: But Steve is a database programmer. He doesn’t know about the front-end stuff.
Manager: Exactly. This way he’ll learn.
Me: But he doesn’t even know Perl!
Manager: That’s OK. This new methodology is z0mg EXTREME!!!1111oneone
Me: Given that Steve has absolutely no procedural or object-oriented programming experience, by the time I teach him object-oriented Perl, the structure of our front-end app, how to use all nine billion CPAN modules it depends on, how our release process works, and how to use version control, we will have lost all our clients for lack of any new features.
Manager: Sounds good. Remember, red, green, refactor!
Me: :: facepalm ::

As it turned out, Steve thought this was also a horrible idea, and I ended up getting paired with one of the other application developers instead, where we proceeded to become exactly half as productive as before.

Benevolent deities save us from incompetent managers and the “new new thing”. :slight_smile:

The manager of the engineering team I work with is an Agile / Scrum / XP evangelist (one might even stretch to “zealot”), but at least he knows what the heck he’s doing, and why. We also have a couple of good senior engineers with a solid handle on the methodology and what to do and avoid. But when the team was initially established a couple of years back, the resource plan for the development of our new browser-based, Java back-end product was 20 engineers. We got 20 people assigned, true, but several were business analysts, one was a DBA, and another a security expert! All of whom were expected to pair program and write Java code, even if they had no experience with Java or any other part of the new tech stack. :rolleyes:

I am interested in TDD, but the examples that I see are too trivial to really get across the kinds of tests you should write and how simple they should be. Also, I was never clear on how to write tests for several interdependent features, like a CRUD service: I need to use the read function in the create test, even though the read function has not yet been tested itself. What do others do here?

Thanks,
Rob

Some useful things to remember:

  1. Your tests should test your interfaces (or contracts, if you like), not the underlying code. You want to be able to completely change the code inside, leaving the interface the same, and have all the tests still pass. Everything should be a black box.

  2. You don’t want to test code that isn’t yours. (By “yours” I mean the project you’re working on.) That means all the third-party libraries and classes you’re using for your app, the database system you’re talking to, etc. Simulate these things with mock objects if you can.

  3. Your tests should exercise as much of the underlying code as they can (test coverage). For example, you test a method by passing n=42 and getting the expected return value of 23. But you forgot that this method has a conditional that checks whether n is greater than 50, and does something different if it is, so that code remains untested. Therefore you need at least two tests. Perl has cool coverage utilities like Devel::Cover which will give you all sorts of stats on which blocks are covered by tests and which aren’t. There are probably similar tools for your platform.
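To make point 3 concrete, here’s a minimal sketch in Java of the n=42 scenario above. The method and its values are hypothetical, invented just for illustration; the point is that a single test happily passes while leaving a whole branch unexercised:

```java
// Hypothetical method with a hidden branch. A lone test at n = 42
// passes, but never touches the n > 50 path.
public class CoverageExample {
    static int transform(int n) {
        if (n > 50) {
            return n - 50;   // this branch is missed by a single n = 42 test
        }
        return 65 - n;       // transform(42) == 23, as "expected"
    }

    public static void main(String[] args) {
        // One test per branch is the minimum for full coverage here.
        if (transform(42) != 23) throw new AssertionError("below-threshold branch broken");
        if (transform(60) != 10) throw new AssertionError("above-threshold branch broken");
        System.out.println("both branches exercised");
    }
}
```

A coverage tool (Devel::Cover for Perl, or JaCoCo and friends on the JVM) is what tells you the first test alone only covered half the method.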

The order of the tests shouldn’t matter; each one should be able to run completely independently. For a CRUD app, my recommendation would be to simulate the database or ORM layer with mock objects, then test that your Read function is requesting the right data from the mock object, that your Write function is sending the right data, etc. This saves you from having to use a real database to test against, which is outside the scope of unit testing. (BUT: That doesn’t mean you shouldn’t test your database. Just that it shouldn’t be part of your TDD. That’s more of an integration testing and regression testing process.)
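As a sketch of that mock-object approach, here’s a hand-rolled mock in Java standing in for the database layer. All the class and method names (`UserStore`, `fetchRow`, `UserService`) are hypothetical, not from any real ORM; the shape is what matters: the mock records what was requested and returns canned data, so the Read test needs no real database and no previously tested Create:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical data-access interface the CRUD service depends on.
interface UserStore {
    Map<String, String> fetchRow(String id);
}

// The service under test: Read just delegates to the store.
class UserService {
    private final UserStore store;
    UserService(UserStore store) { this.store = store; }
    String readName(String id) { return store.fetchRow(id).get("name"); }
}

public class MockReadTest {
    public static void main(String[] args) {
        // Hand-rolled mock: canned response, plus a record of the request.
        final StringBuilder requestedId = new StringBuilder();
        UserStore mock = id -> {
            requestedId.append(id);              // remember what was asked for
            Map<String, String> row = new HashMap<>();
            row.put("name", "Alice");            // canned data
            return row;
        };

        UserService service = new UserService(mock);
        String name = service.readName("42");

        // Assert the service requested the right row and surfaced the data.
        if (!requestedId.toString().equals("42")) throw new AssertionError("wrong id requested");
        if (!name.equals("Alice")) throw new AssertionError("wrong name returned");
        System.out.println("read test passed against mock store");
    }
}
```

In a real project you’d use a mocking library (JMock, Mockito) instead of rolling your own, but the principle is identical: the unit test verifies the conversation with the store, not the store itself.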

Do you know of good examples of this? I am a Java developer, btw. At work, our ORM is JDO, but Hibernate will do fine.

Thanks,
Rob

A colleague pointed me to this article about using JMock with Hibernate. You could probably do pretty similar stuff with JDO.