Tuesday, 16 June 2009

TDD Problems for Practice

Every time I watch or listen to a TDD lecture, I see the same examples. While it's hard to go very deep in a 30-60 minute lecture, the commonly used examples don't really reflect a real-life scenario. How many of us spend our daily job coding a calculator?

When I'm asked (which happens 90% of the time) if I have more "real life" examples, I redirect people to the TDD Problem Site:

The aim of this site is to contain a growing collection of software problems well-suited for the TDD-beginner and apprentice to learn Test-Driven Development through problem solving.

and the really cool thing about those problems is that they follow these rules:

  • they are real-world, not just toys
  • they are targeted towards learning TDD (that is: they are small and easy enough to work out in say half a day)
  • they don't involve any of the harder-to-test application development areas: GUI, database or file I/O. (since those topics are considered too hard for the TDD-beginner)
  • they have been solved by a TDD-practitioner previously, proving their appropriateness for this site

If you want to practice TDD and feel that it's too hard to do on your production system, these examples are a great place to start practicing real-life TDD.
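To show the rhythm you'd practice on such problems, here is a minimal test-first sketch in Python. The `word_wrap` kata here is my own invented stand-in, not one of the site's actual problems: the tests are written first, and the implementation is just enough to make them pass.

```python
import unittest

# A hypothetical kata in the spirit of the site's problems: wrap text
# at a given column without breaking words. Tests first, then the
# simplest implementation that makes them pass.

def word_wrap(text, width):
    """Greedily wrap `text` so that no line exceeds `width` characters."""
    words = text.split()
    lines, current = [], ""
    for word in words:
        if not current:
            current = word
        elif len(current) + 1 + len(word) <= width:
            current += " " + word
        else:
            lines.append(current)
            current = word
    if current:
        lines.append(current)
    return "\n".join(lines)

class WordWrapTest(unittest.TestCase):
    def test_short_text_is_untouched(self):
        self.assertEqual(word_wrap("hello", 10), "hello")

    def test_wraps_at_word_boundary(self):
        self.assertEqual(word_wrap("hello world", 6), "hello\nworld")

if __name__ == "__main__":
    unittest.main(exit=False)
```

Notice that, exactly as the site's rules demand, there is no GUI, database or file I/O involved - just a small, well-defined behavior you can drive out test by test.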

Sunday, 7 June 2009

Myth Busted - NOT

In a recent post Scott Ambler starts by claiming

Agile Myth: High Quality Costs Less than Low Quality - Busted! (at scale)

later on, when reading the actual content, I learnt that he refers to very specific cases:

For example, in situations where the regulatory compliance scaling factor is applicable, particularly regulations around protecting human life (i.e. the FDA's CFR 21 Part 11), you find that some of the URPS requirements require a greater investment in quality which can increase overall development cost and time.

and

This is particularly true when you need to start meeting 4-nines requirements (i.e. the system needs to be available 99.99% of the time) let alone 5-nines requirements or more. The cost of thorough testing and inspection can rise substantially in these sorts of situations.

In my opinion he went a little off the charts with his claim.

First, what exactly is the "Myth", so to speak? Is it a simple "High quality costs less"?

Well, actually it's a little more subtle than that. What the agile community has found time and again (as Scott mentions) is that it costs less to work in a high-quality mode when you need to reach and sustain an acceptable level of quality. After all, quality is infectious. In general it costs less to produce crappy systems, but mostly those just fail when the need for quality catches up.

But back to the examples.

I don't have experience with life-critical systems. However, is there a possibility that what actually costs is the regulations themselves and not the underlying quality? Is there a way to reach the necessary life-safety quality without following those costly regulations (at lower cost)? I don't know. What I do know is that the FDA regulations are costly to implement and originate in a time before the Agile paradigm shift.

High Availability (HA) systems, on the other hand, I do understand. In fact I was in charge of developing an HA solution for a big billing company, and here Scott's argument falls short.

Reaching 4 and especially 5 nines has nothing to do with the quality of the developed system. In order to get to that level of availability you must have an integrated solution, and there lies the cost of 5-nines systems.
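To put those "nines" in perspective, here is a quick back-of-the-envelope calculation (mine, not from Scott's post) of the downtime budget each level allows per year:

```python
# Downtime budget per year for each availability level ("nines").
# 4 nines leaves under an hour of downtime per year; 5 nines under
# six minutes - budgets that code quality alone cannot guarantee,
# which is why redundancy and failover (an "integrated solution")
# are where the real cost lies.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600

def downtime_minutes_per_year(availability):
    """Minutes of allowed downtime per year at the given availability."""
    return (1 - availability) * MINUTES_PER_YEAR

for nines, availability in [(3, 0.999), (4, 0.9999), (5, 0.99999)]:
    minutes = downtime_minutes_per_year(availability)
    print(f"{nines} nines ({availability:.5f}): {minutes:7.1f} min/year")
```

Roughly 526, 53 and 5 minutes a year respectively - the jump from each level to the next is what forces clustering, failover and the rest of the integration work.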

So what myth has been busted?

Yes, there are cases in which specific aspects of quality will drive costs up. But taking a very specific example and generalizing it to "Busted (at scale)" - that is an obvious mistake in my book.

Wednesday, 3 June 2009

Evolving Design - It's a must!

A couple of days ago I read Gil's post about "Good Design Is Not Objective", in it he writes

So are there "good" or "bad" designs? Every person has an opinion. Opinions change over time. Problems and technologies change over time, and a design can help or interfere with the project, but you'll know that in hindsight.

Yesterday, at one of my clients, a simple question turned into a related discussion. One of the software architects asked me if I had ever seen (and what I think of) a system which uses a global variable repository: one that every part of the system can use at will to store data, and which is used as a way of communicating between various parts of the system. Apparently that kind of approach is used in their system and he wanted to know what I would say.

After thinking a little, I said that the only system which seemed similar to me was the Windows registry, and that I had never encountered such a system in a project I was working on (at least not on the scale he was talking about). I also said that in my past, each time I used such a global variable (in a similar manner) I regretted it. And lastly, that it really sounds like a very "old fashioned" approach to design, something that kind of contradicts object-oriented design.

He told me that the part in question is very old and was written some 15 years ago when the company was at its startup stage. It appears that the entire system evolved on top of it, which made it very hard to replace. In fact the cost was so high that each time someone considered replacing it they decided to leave it as is.

Now I think this is a very interesting case that demonstrates how our continual search for the Best Design is just stupid.

There is no such thing as the Best Design!

Like most things, design also carries a context. In this case the mechanism they have was (as the guy admitted) the quickest and cheapest way of reaching a product (one which people would buy). At that time (15 years ago) the technology (they were writing in C) and software engineering approaches were in line with such a mechanism, and it was a reasonable design at that point in time.

Is it still so today? Absolutely not!

The company is no longer a startup; what was good at that time is really bad today. The needs have changed over time, the technology has changed over time, and the product has changed over time.

As Gil said it:

Time Changes everything

The Best Design today probably won't be the Best Design tomorrow. A developer who accepts that fact can focus on producing a "good enough" design and evolving it as needed over time.

In my opinion that was the actual mistake made. It's not that the design was wrong; it's the fact that they assumed it wouldn't need to change. That assumption made the cost of actually changing it high, resulting in the unsuitable design they have today.

Tuesday, 5 May 2009

How NOT to write Code Examples

While working on one of my pet projects, I needed to extend the Visual Studio IDE. Before starting out I wanted to get a better understanding of the IDE extensibility model, so I allocated some time to do a spike in order to get a feel for the basic capabilities and the effort that would be involved. Usually when doing spikes I like to dirty my hands as soon as possible, so I thought the best way would be to create my own sample add-in and play with it.

So the first order of business was to create my own add-in project. That was fairly easy to accomplish, since there's a built-in wizard for creating an add-in project (it hides under Other Project Types -> Extensibility):

addin-wizard

The next step for me was to try to add a custom menu command, so I could insert some custom logic and check out its behavior. It didn't take me long to find the following article on MSDN: How to: Expose an Add-in on the Tools Menu (Visual C#).

After reading the initial description:

When you create an add-in by using the Add-In Wizard and select the option to display it as a command, the command is on the Tools menu by default. If you skip this option when you create the add-in, however, you can simply run the Add-In Wizard again, check that option, and then copy your existing code to the new add-in.

If doing this is not possible, though, the following procedure produces the same result.

I was encouraged. This was exactly what I needed.

So I went ahead and followed the instruction titled:

To add a menu command to an existing add-in

The first step was to copy-paste a given piece of code:

1. Replace or change the add-in's Connect class and OnConnection() procedure code to the following:

(actual code can be found in the article itself)

Followed by copy-pasting two new methods:

2. Add the following two required procedures, QueryStatus and Exec

(actual code can be found in the article itself)

and that's it.

Easily enough I did those things and went on to compile and test it. (A nice thing about the add-in wizard is that it takes care of deploying the add-in, and it puts in the debug command line what you need in order to safely open a new IDE instance with the updated add-in.)

Did it work?

Of course not (otherwise I wouldn't be writing this post would I?)

Actually it was far worse than not working at all. When trying to figure out what was wrong, I got this irritating, inconsistent behavior that didn't point me in any way to what was causing it. At first I assumed I had done something wrong, so I went ahead and did everything again (from scratch). Naturally that didn't help. Then I went and tried other examples from the web for the OnConnection method, but still nothing.

To make a long story short: after several frustrating hours of debugging, I took a much-needed break and went back to reread the MSDN article, this time paying extra attention to all its parts. At the end of the article there was the following paragraph:

Each time you implement IDTCommandTarget, you must add these two procedures. A quick way to do it is to select IDTCommandTarget in the Class Name drop-down box at the top-left corner of the editor. Select each procedure in turn from the Method Name drop-down box in the top-right corner. This creates the necessary empty procedures with the correct parameters to which you can then add code.

The Exec procedure is called when a user clicks your menu command, so insert code there that you want to execute at that time.

Don't get me wrong, I had read the entire article twice before, but on the third time it hit me: the Connect class needed to implement the IDTCommandTarget interface. And since the wizard which created the class didn't do so, and nowhere in the article was this mentioned explicitly, I didn't add it myself.
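The failure mode makes sense in hindsight: a command object that doesn't implement the expected interface simply never gets called, with no error to point you at the cause. Here is a rough analogy sketched in Python (a hypothetical dispatcher, not the actual Visual Studio add-in API, which is C#/COM):

```python
from abc import ABC, abstractmethod

class CommandTarget(ABC):
    """Hypothetical stand-in for an interface like IDTCommandTarget."""
    @abstractmethod
    def exec(self):
        ...

class Dispatcher:
    """Hypothetical host that only wires up recognized command targets."""
    def __init__(self):
        self.targets = []

    def register(self, obj):
        # Only objects implementing the interface are wired up;
        # anything else is silently skipped - no error, no command.
        if isinstance(obj, CommandTarget):
            self.targets.append(obj)

    def run(self):
        return [t.exec() for t in self.targets]

class BrokenConnect:              # forgot to implement CommandTarget
    def exec(self):
        return "ran"

class FixedConnect(CommandTarget):  # declares the interface, so it's found
    def exec(self):
        return "ran"

d = Dispatcher()
d.register(BrokenConnect())   # silently ignored
d.register(FixedConnect())    # wired up correctly
print(d.run())
```

The method bodies can be identical; what matters to the host is the declared interface - which is exactly why copy-pasting the article's methods without the interface declaration produced "nothing happens" rather than a compile error.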

Lesson Learnt?

Well the obvious one would be

when everything else fails read the f***** manual

but that would be a cliché.

The actual message I'm trying to send out here is meant for all those writing API documentation, articles and other code examples.

Please take the extra step and try your own examples before publishing them - and not on your personal machine. Try them out on a clean system where you mimic (as closely as possible) the steps that will be taken by those using the examples. Several hours would have been spared me (and probably others as well) if that had been done in this case.

Monday, 4 May 2009

The future of scrum

Lately I've heard rumors about SCRUM starting to define the developer role inside the scrum team. Yesterday, at the 3rd Israeli scrum forum, those rumors were confirmed when Danko announced that the Scrum Alliance is contemplating adding a Scrum Developer certification course. When I asked for details, he said that it's just the beginning, but the Scrum Alliance is now starting to discuss the various engineering practices and decide which of them to adopt as inherent parts of SCRUM.

It's about time.

A good Development process

One of the things that has always bothered me about SCRUM is the idea that one can define a DEVELOPMENT process without actually going into how software is developed. This has led me to describe SCRUM as a management process which is very suitable for managing software development projects, rather than as a full development process. Therefore, when I consult for companies trying to adopt scrum, I always make it a point to introduce some of the engineering practices included in the XP process (mainly TDD, CI, pair programming and refactoring). Others (Dave Rooney, Martin Fowler, Ron Jeffries and more) have also expressed their concern that SCRUM by itself is not enough, and that in order to sustain the benefits and become a hyper-productive team one must adopt engineering practices that enhance the SCRUM process.

Are XP and SCRUM merging?

Although it is very early in the process, I do believe that we are finally starting to witness one of the most anticipated moves in the agile community: the merging of the SCRUM and XP methodologies into a unified methodology that will encompass all aspects of software development. I'm guessing this is only natural, following the joining of key figures of the XP community to the Scrum Alliance.

The birth of ScrumP

For me there was never a place for two sister methodologies in the software world. In fact, most of the places I've seen have chosen to develop using an XP-SCRUM hybrid methodology. Those places which did not do so mainly felt it was too risky to take it all on at once, and thought it better to focus on specific aspects of the process to begin with, adding more as they got better.

So let's welcome the newborn baby and help him take his first steps in the world. I hope that when he grows up he will help us all do a better job.

 