Practicing XP in the extreme: the VB/Ruby tradition plus unit testing
What am I doing with all this “no framework” stuff?
I am starting a project that is the first of many loosely connected small projects.
It is TDD taken to its extreme. Everything the books preach is literally our reality: little official design, consciously anti-framework, on-cubicle customers ...
You may also say it is just the traditional VB style with a TDD label stuck on it! Yes, I totally admit it, with one caveat: the whole team is painfully realizing that we need regression testing, that QA cannot deliver it, that this is not their fault at all, and that it cannot be "fixed" in any foreseeable future. So we all know that developers must take responsibility for regression testing. To deliver that feature, we quite naturally arrive at automated unit testing.
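Concretely, that means tests like the following; a minimal NUnit sketch, where OrderCalculator and its discount rule are invented purely for illustration:

    Imports NUnit.Framework

    Public Class OrderCalculator
        ' Hypothetical business rule: orders of 1000 or more get 10% off.
        Public Function NetTotal(ByVal gross As Decimal) As Decimal
            If gross >= 1000D Then Return gross * 0.9D
            Return gross
        End Function
    End Class

    <TestFixture()> _
    Public Class OrderCalculatorTests
        ' Each shipped business rule gets a test like this, so QA no
        ' longer has to re-check it by hand on every release.
        <Test()> _
        Public Sub Discount_Applied_AtOneThousand()
            Dim calc As New OrderCalculator()
            Assert.AreEqual(900D, calc.NetTotal(1000D))
        End Sub

        <Test()> _
        Public Sub Discount_NotApplied_UnderOneThousand()
            Dim calc As New OrderCalculator()
            Assert.AreEqual(999D, calc.NetTotal(999D))
        End Sub
    End Class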
I have been theorizing (material for pattern-writing?) about OTTEC. Now it is time to put the theory to use, tentatively, of course.
I mention Ruby because in today's web development it means the same thing VB meant years ago in Windows forms development: productivity at a level where implementation is done the moment the analysis is done; therefore the developers are the analysts, same person, same time.
9 Comments:
Traditional VB coding as TDD. WinForms that are used by developers for NUnit tests and then by business analysts for Fit tests. Traditional VB applications are just a combination of BLL, DAL, NUnit, and Fit tests, all rolled into one. Where is all this leading us?
Just kidding. :) I have to read this WinForms data-binding book, which created a mid-life crisis for a tough person like you.
Please read it. I know you are re-reading CSLA; the two are very good to read together.
I am trying to find a few rules (only a few; they are useless if there are too many), hopefully FxCop-enforceable, to ensure the code is indeed unit testable (and that authorization and data-access auditing stay centralized).
I need someone who wants to, and is able to, say no to the dataset, yet keeps a pragmatically open mind (that is you) to answer this question: is it time to say the dataset is just one variation of O/R mapping alongside custom classes, and that we can/should use both in one project?
I hope you say yes; but I want someone who is capable of saying no.
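For what it is worth, here is the shape I have in mind, as a sketch only; DataGateway and everything in it is hypothetical. The rule "only the gateway opens a connection" is exactly the kind of thing an FxCop rule can check, and it keeps authorization and auditing centralized no matter which shape the data takes on the way out:

    Imports System
    Imports System.Collections.Generic
    Imports System.Data
    Imports System.Data.SqlClient

    Public Class DataGateway
        Private ReadOnly _connectionString As String

        Public Sub New(ByVal connectionString As String)
            _connectionString = connectionString
        End Sub

        ' Every read goes through here, so the authorization check
        ' and the audit record cannot be forgotten or bypassed.
        Public Function GetTable(ByVal user As String, ByVal sql As String) As DataTable
            Authorize(user, sql)
            Audit(user, sql)
            Using conn As New SqlConnection(_connectionString)
                Dim table As New DataTable()
                Dim adapter As New SqlDataAdapter(sql, conn)
                adapter.Fill(table)
                Return table
            End Using
        End Function

        ' Custom-class consumers map rows from the same gateway, so
        ' "dataset vs. custom class" becomes a per-screen choice.
        Public Function GetList(Of T)(ByVal user As String, ByVal sql As String, _
                ByVal map As Converter(Of DataRow, T)) As List(Of T)
            Dim result As New List(Of T)()
            For Each row As DataRow In GetTable(user, sql).Rows
                result.Add(map(row))
            Next
            Return result
        End Function

        Private Sub Authorize(ByVal user As String, ByVal sql As String)
            ' Placeholder: the real check against roles goes here.
        End Sub

        Private Sub Audit(ByVal user As String, ByVal sql As String)
            ' Placeholder: record who ran what, and when.
        End Sub
    End Class

A caller either keeps the DataTable or passes a mapping delegate to GetList; either way the same authorization and audit code runs.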
Hi Survic,
you have a way of forcing me to write a post that has been on my mind for some time.
I never liked the dataset as a mechanism for transferring data between different layers, for the following reasons:
1. I seriously read Rocky's writings, and you know how highly he thinks of the dataset (tongue in cheek).
2. Performance. I was instrumental in getting rid of the dataset from the architecture of two of my projects. I am going to do some benchmark tests on .NET 2.0; I know that in .NET 1.0 there was a fourfold difference.
3. More code. You have to write more code to package the data into a dataset and retrieve it again. Remoting and web services handle serialization and deserialization to a great extent. Even before generics, getting rid of the dataset was how I eliminated thousands of lines of code and gained performance.
Recently I did some benchmark tests on generics from my office computer (which has less memory) and got more encouraging results. I will post them soon on my blog.
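In the meantime, a sketch of the kind of comparison I mean (not my actual harness; the Customer class is invented): serialize the same 10,000 rows as a typed list and as a DataSet, then compare payload sizes. The 2.0 RemotingFormat property is what makes the dataset side fair, since 1.x always serialized it as XML.

    Imports System
    Imports System.Collections.Generic
    Imports System.Data
    Imports System.IO
    Imports System.Runtime.Serialization.Formatters.Binary

    <Serializable()> _
    Public Class Customer
        Public Id As Integer
        Public Name As String
    End Class

    Module SerializationBenchmark
        Sub Main()
            ' Shape 1: a typed list of plain objects.
            Dim list As New List(Of Customer)()
            For i As Integer = 1 To 10000
                Dim c As New Customer()
                c.Id = i
                c.Name = "Customer " & i
                list.Add(c)
            Next

            ' Shape 2: the same rows in a DataSet, using the new 2.0
            ' binary remoting format instead of the 1.x XML diffgram.
            Dim ds As New DataSet()
            Dim table As DataTable = ds.Tables.Add("Customers")
            table.Columns.Add("Id", GetType(Integer))
            table.Columns.Add("Name", GetType(String))
            For i As Integer = 1 To 10000
                table.Rows.Add(i, "Customer " & i)
            Next
            ds.RemotingFormat = SerializationFormat.Binary

            Console.WriteLine("List(Of Customer): {0:N0} bytes", PayloadSize(list))
            Console.WriteLine("Binary DataSet:    {0:N0} bytes", PayloadSize(ds))
        End Sub

        Private Function PayloadSize(ByVal graph As Object) As Long
            Using stream As New MemoryStream()
                Dim formatter As New BinaryFormatter()
                formatter.Serialize(stream, graph)
                Return stream.Length
            End Using
        End Function
    End Module

Timing is measured the same way, with a stopwatch around the Serialize call.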
Having said that, I may still use a DataSet/DataTable in the following situations:
1. When the columns of a custom collection are variable. A DataTable/DataSet IMO offers the cleaner solution (see the sketch after this list).
2. Disconnected rich client architecture
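To make case 1 concrete, a small sketch (the column names are invented; imagine they arrive from configuration or a user-built query):

    Imports System
    Imports System.Data

    Module VariableColumnsDemo
        Sub Main()
            ' These could have come from configuration or a user query.
            Dim requestedColumns() As String = {"Region", "Q1Sales", "Q2Sales"}

            ' The DataTable absorbs whatever columns show up at run
            ' time; a custom collection would need a new property (and
            ' a recompile) every time the column set changed.
            Dim report As New DataTable("SalesReport")
            For Each columnName As String In requestedColumns
                report.Columns.Add(columnName)
            Next
            report.Rows.Add("East", "120", "135")

            For Each col As DataColumn In report.Columns
                Console.Write(col.ColumnName & " ")
            Next
            Console.WriteLine()
        End Sub
    End Module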
http://forums.asp.net/thread/1337006.aspx
http://forums.asp.net/thread/1216039.aspx
http://forums.asp.net/thread/1352578.aspx
You know my posting handle.
1. I heard that performance is fine now in 2.0. I will just blindly believe it, until I get burned, or not.
2. You said “more code”. This strikes me. The thing that drives me away from the custom-class framework approach is the fact that it has so much repetitive code that it has to depend on code generation. To be fair, the dataset also uses code generation, but that comes from M$; I know, you will ask what the difference is, but you have to admit it means the generation is “standardized”.
3. Most importantly, when people use the dataset, they tend to use it in a messy way, so its bad reputation is almost a self-fulfilling prophecy. Of course, the “base” must be worth it. For 2.0, I feel it is worth a real try, instead of just half-hearted hacking.
The determining factor is whether we can do unit testing easily or not.
So, I am re-reading the data-binding book to examine and design (if you will) the following:
1. (How bad can it be?) How much of the formatting/parsing-validation logic do we have to duplicate, and how can we transfer some CSLA techniques to it? (See the sketch after this list.)
2. (How good can it be?) How can we use it as a way to avoid custom collections?
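For point 1, the direction I am exploring, as a sketch only (the helper, the column, and the form are all invented): hang shared Format/Parse handlers on the Binding once, instead of duplicating the logic form by form.

    Imports System.Data
    Imports System.Globalization
    Imports System.Windows.Forms

    Public Class BindingHelper
        ' Wire one shared handler pair to any money column; every
        ' form reuses it instead of re-implementing the logic.
        Public Shared Sub BindMoney(ByVal box As TextBox, _
                ByVal source As DataSet, ByVal dataMember As String)
            Dim b As New Binding("Text", source, dataMember)
            AddHandler b.Format, AddressOf FormatMoney
            AddHandler b.Parse, AddressOf ParseMoney
            box.DataBindings.Add(b)
        End Sub

        ' Dataset -> screen: render the decimal as currency.
        Private Shared Sub FormatMoney(ByVal sender As Object, ByVal e As ConvertEventArgs)
            If TypeOf e.Value Is Decimal Then
                e.Value = CDec(e.Value).ToString("C")
            End If
        End Sub

        ' Screen -> dataset: parse it back; this is also the natural
        ' place for a centralized CSLA-style validation hook.
        Private Shared Sub ParseMoney(ByVal sender As Object, ByVal e As ConvertEventArgs)
            e.Value = Decimal.Parse(CStr(e.Value), NumberStyles.Currency)
        End Sub
    End Class

On a form it is then one line per bound control, e.g. BindingHelper.BindMoney(balanceTextBox, ds, "Customers.Balance").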
Note that my attention was actually drawn to this effort by Rocky's blog. I do not know what has happened to the specific book he mentioned; however, the data-binding book I am reading is definitely in the exact same spirit: a diligent, strictly three-layer effort for the dataset approach.
Here is Rocky's blog:
http://www.lhotka.net/weblog/CommentView,guid,ad5be814-6063-43e0-b703-932771444b98.aspx
Of the two examples in the binding book, the first (around Listing 4.6) is intentionally twisted, and the second (around Listing 4.7) is done properly; together they are very instructive. If we use the dataset as in 4.6, then surely we are in trouble; if we always do it the 4.7 way, perhaps it is not that bad.
The thing is, we need to find a few techniques and small (not big) utility tools so that 80% of the time we can do it nice and sweet. Further, when it cannot be done, we should know that clearly and therefore be able to push back, gently. If users insist, then we can use custom classes with collections (but without the portal and its limitations), and we will know how to use the two together nice and sweet. One such small tool is sketched below.
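For example (a sketch only, and the helper is my own invention): an NUnit assertion that diffs two DataTables, so regression tests over dataset code stay one-liners.

    Imports System.Data
    Imports NUnit.Framework

    Public Class DataTableAssert
        ' Compare two tables cell by cell and fail with the exact
        ' row and column name on the first mismatch.
        Public Shared Sub AreEqual(ByVal expected As DataTable, ByVal actual As DataTable)
            Assert.AreEqual(expected.Rows.Count, actual.Rows.Count, "row count")
            For r As Integer = 0 To expected.Rows.Count - 1
                For c As Integer = 0 To expected.Columns.Count - 1
                    Assert.AreEqual(expected.Rows(r)(c), actual.Rows(r)(c), _
                        String.Format("row {0}, column {1}", r, expected.Columns(c).ColumnName))
                Next
            Next
        End Sub
    End Class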
OK, is that just a dream, or is it doable immediately?
Very soon ...