http://qs321.pair.com?node_id=329222


in reply to Yet another meditation on testing

I'm just confused. I guess what I'm really looking for is "real world" examples of tests for complex things, things you can't just test by calling a function and comparing its output to a constant.

If you approach development by considering the test first--thinking through how to structure a method, class, or API such that it's possible to call functions and compare output--you'd be surprised how far you can get. Part of the trick is being clear about what you're testing. You want to test a small number of things at a time, faking whatever infrastructure those things live on top of.
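For the simplest case--call a function, compare its output--the standard Test::More module is all you need. Here's a minimal sketch; the add_tax() function and its 8% rate are invented for illustration:

```perl
use strict;
use warnings;
use Test::More tests => 2;

# Hypothetical function under test: adds a flat 8% tax (assumed rate)
# and formats the result to two decimal places.
sub add_tax {
    my ($amount) = @_;
    return sprintf '%.2f', $amount * 1.08;
}

# Call the function, compare its output to a constant.
is( add_tax(100), '108.00', 'tax applied to a round amount' );
is( add_tax(0),   '0.00',   'zero stays zero' );
```

The point of test-first design is that most code can be pushed toward this shape, even when it doesn't look like it at the outset.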

Take your example. You're building a web application that manipulates a specific kind of server. The "obvious" way to test such a thing is to simulate an input (a link click or a button press), then verify that the right HTML comes back. But these kinds of tests end up testing too much at once, and get messy quickly. It's far easier to break things down and test parts in semi-isolation. Not knowing how your application is structured, I can speculate that you have a layer that abstracts communication with the remote server. This layer can be tested in isolation, even in isolation from a real remote server, if there's a way to swap in a "mock" object for the remote server (e.g., by using a fake socket that's under the control of the test code). Then you can write test cases that verify that if you tickle the abstraction API, the right bits get delivered to the socket. And you can test that the server abstraction correctly handles various simulated responses from the remote server.
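A sketch of the fake-socket idea, with all names (FakeSocket, ServerLink, the LOGIN protocol) invented for illustration--your real abstraction layer will look different:

```perl
use strict;
use warnings;

# A fake socket under the test's control: records what was written,
# and hands back a canned reply when read.
package FakeSocket;
sub new     { bless { sent => '', reply => $_[1] }, $_[0] }
sub print   { my $self = shift; $self->{sent} .= join '', @_; 1 }
sub getline { my $self = shift; delete $self->{reply} }
sub sent    { $_[0]->{sent} }

# The (hypothetical) server-abstraction layer; it only knows it has
# something socket-like to talk to.
package ServerLink;
sub new   { bless { sock => $_[1] }, $_[0] }
sub login {
    my ($self, $user) = @_;
    $self->{sock}->print("LOGIN $user\n");
    my $reply = $self->{sock}->getline;
    return defined $reply && $reply =~ /^OK/;
}

package main;
use Test::More tests => 2;

# Tickle the abstraction API; check the right bits hit the "socket",
# and that a simulated server response is handled correctly.
my $sock = FakeSocket->new("OK welcome\n");
my $link = ServerLink->new($sock);
ok( $link->login('monk'),        'simulated OK response accepted' );
is( $sock->sent, "LOGIN monk\n", 'correct bytes written to socket' );
```

No network, no real server--just the abstraction layer and a test double, so the test is fast and deterministic.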

Then, you can test whatever layer drives the remote-server abstraction by swapping in a mock implementation of the abstraction layer. And so on up the chain, small piece by small piece, until you're at the level of delivering test-driven URIs to the top layer of code, and are verifying that it's emitting the HTML you expect.
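The usual way to make each layer swappable is to pass the lower layer in through the constructor rather than hard-wiring it. A sketch, again with invented names (ServerControl, MockLink, the RESTART command):

```perl
use strict;
use warnings;

# Hypothetical driver layer: takes its server-link collaborator as a
# constructor argument, so tests can hand it a mock instead.
package ServerControl;
sub new {
    my ($class, %args) = @_;
    bless { link => $args{link} }, $class;
}
sub restart {
    my $self = shift;
    return $self->{link}->send_command('RESTART');
}

# Mock implementation of the abstraction layer: logs every command
# and always reports success.
package MockLink;
sub new          { bless { log => [] }, $_[0] }
sub send_command { push @{ $_[0]{log} }, $_[1]; 'OK' }
sub log          { @{ $_[0]{log} } }

package main;
use Test::More tests => 2;

my $mock = MockLink->new;
my $ctl  = ServerControl->new( link => $mock );
is( $ctl->restart,    'OK',      'restart reports success' );
is( ($mock->log)[0],  'RESTART', 'restart issued the right command' );
```

In production you'd construct ServerControl with the real link object; the code under test is identical either way, which is what makes the substitution safe.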

The trick is to figure out how to structure the code so that you can swap in mock implementations of its underpinnings. This is a lot easier to do if you approach development by figuring out the test case first, but it can still be done on legacy code with some amount of restructuring.

Replies are listed 'Best First'.
Re: Re: Yet another meditation on testing
by flyingmoose (Priest) on Feb 17, 2004 at 20:14 UTC

    Automated testing, a tenet of the buzzword-compliant "XP" (Extreme Programming), can be seductively dangerous, though. Beware: you may write tests first, but they won't encompass the whole problem... or worse, you will write tests after the fact and unknowingly write tests that all pass. A good test suite is very difficult to build, and the need for manual testing can never be eliminated.

    Far too often in the software industry, an emphasis on automated testing and formal test organizations (i.e., "testers by occupation") results in poor manual unit testing. A developer really has to understand all of the corner cases before manually unit testing, and to write effective unit tests he has to be even sharper.

    Anyhow, be warned -- automated testing is great stuff -- but it is not a substitute for the real thing. Your test cases passing doesn't mean there are no bugs!

      Anyhow, be warned -- automated testing is great stuff -- but it is not a substitute for the real thing.

      First, we weren't talking about "automated" testing. Second, what do you mean by "the real thing"? Actual use? Ad hoc testing?