Saturday, April 30, 2005

.NET headaches!

I hate Variants, but unfortunately I have to use them. Since the components I am writing are supposed to enhance DataSnap somehow, I had to figure out how to "package" info (to be sent from client to server and vice versa) as OleVariants (in variant arrays of varByte, so that it would work under any circumstances). I implemented a couple of classes that do this kind of work (is marshaling the correct term?). These classes used a lot of Move and VarArrayLock calls, plenty of pointers, and (with a little help from Manuel Parma) I achieved what I wanted in the first place.
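For the record, the Win32 packing boils down to something like this (a minimal sketch with a hypothetical helper name, not my actual class):

uses Variants; // VarArrayCreate, VarArrayLock etc. live here in Delphi 6+

function BytesToVariant(const Buffer; Size: Integer): OleVariant;
var
  P: Pointer;
begin
  // create a variant array of bytes and blit the raw buffer into it
  Result := VarArrayCreate([0, Size - 1], varByte);
  P := VarArrayLock(Result);
  try
    Move(Buffer, P^, Size);
  finally
    VarArrayUnlock(Result);
  end;
end;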
And then a brilliant idea came: I should at least try to port this code to .NET! That brought a big disappointment. (I am a newbie to .NET, and I don't think the starting point for learning .NET should be how to use IntPtr and Marshal.) Besides, OleVariants (variant arrays of varByte, at least) have a new format there (they can apparently just be cast to TBytes, if I've figured it out right).
At first all my code was stuffed with IFDEFs, and that was a real pain. Then I read this excellent article by Chad Hower and decided to follow his advice: I tried to use polymorphism and to isolate all the IFDEFs in a single unit. But maybe the most difficult part was getting rid of all those pointer operations (redesigning and reimplementing the related classes). I am in the middle of this, but I am thinking of wrapping a TMemoryStream to do all the work I need (both in Win32 and in the CLR).
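To make the direction concrete, here is a rough sketch of what I have in mind. TDataPacker is a hypothetical name, the CLR branch assumes the TBytes cast and the array-of-Byte Write overload work the way I described (I haven't verified either yet), and the Win32 branch reuses the BytesToVariant helper sketched above:

type
  TDataPacker = class
  private
    FStream: TMemoryStream;
  public
    // Read/Write methods touch only FStream, so they compile unchanged
    // on both platforms; the platform split is confined to methods like this:
    procedure LoadFromVariant(const Data: OleVariant);
  end;

procedure TDataPacker.LoadFromVariant(const Data: OleVariant);
{$IFDEF CLR}
var
  Bytes: TBytes;
begin
  Bytes := TBytes(Data);   // the "new format": just a cast, no pointers
  FStream.Clear;
  FStream.Write(Bytes, 0, Length(Bytes));
end;
{$ELSE}
var
  P: Pointer;
begin
  P := VarArrayLock(Data);
  try
    FStream.Clear;
    FStream.Write(P^, VarArrayHighBound(Data, 1) - VarArrayLowBound(Data, 1) + 1);
  finally
    VarArrayUnlock(Data);
  end;
end;
{$ENDIF}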
I think I'll also follow another piece of advice I've read somewhere (I don't remember where, maybe in the same article) and stick to the RTL. For example, I decided to convert all my TList objects to TObjectList and this way avoid all those hard casts like:
TObject(FooList[i]).Free;
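With a TObjectList from Contnrs (and OwnsObjects left at its default of True), both the cast and the manual Free disappear; a minimal sketch, with TFoo as a placeholder class:

uses Contnrs; // TObjectList

var
  FooList: TObjectList;
begin
  FooList := TObjectList.Create;   // OwnsObjects defaults to True
  try
    FooList.Add(TFoo.Create);      // TFoo stands in for any TObject descendant
    // work with the list; no TObject(FooList[i]).Free needed anywhere
  finally
    FooList.Free;                  // frees the owned items as well
  end;
end;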
.NET is a new world for me, and even though it seems fascinating in many respects, trying to build a cross-platform app (component) is a real headache! I would certainly appreciate any kind of help (maybe I should buy a book or something), but I certainly don't like others doing my homework. So I'll just keep trying on my own!

Monday, April 11, 2005

Black box programming

Black box programming is a commonly used term, and it has been used in both good and bad senses.
It is considered good because, for example, a function should do what it is expected to do without any further side effects, and a developer using it should be able to rely on it without knowing how it works internally (inside the black box). It is as if we are stepping on a rock: we shouldn't have to worry about falling, or about how the rock is "thinking" of handling our weight.
On the other hand, it has been used in a bad sense because it doesn't let developers have full control over how something is achieved. An example is how ADO (plain ADO) handles applying updates. Some call it "automagic"; some think it is black box programming (very... very black).
Where is the fine line that makes black box programming techniques useful and at the same time lets users have full control?
I believe an example that walks this thin line perfectly well is Delphi itself. It is a RAD tool (automating a lot of procedures), but at the same time it gives a developer full control of what is going on and how it is done. It performs things "automagically", but lets developers participate in the magic (aren't Delphi programmers magicians?!!).
On the other hand, let's consider ADO .NET. A lot of people complained about how ADO handled all the applying-updates logic in a black box, not letting them intervene. With ADO .NET, MS tried to balance on the thin line I am talking about, and it is true that it gives developers full control (more than DataSnap, I have to say). But on the other hand, a lot of code has to be written manually (even though the wizards of various RAD tools, including VS .NET and Delphi 2005, simplify things a lot). I believe MS overdid it.
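To show the kind of hand-written plumbing I mean, here is a rough Delphi for .NET sketch of wiring up a DataAdapter's update logic by hand (Conn is an assumed, already-open SqlConnection; a CommandBuilder or a wizard can generate this, but this is what lives underneath):

uses System.Data, System.Data.SqlClient;

var
  Adapter: SqlDataAdapter;
begin
  Adapter := SqlDataAdapter.Create('SELECT OrderID, Freight FROM Orders', Conn);
  // unlike DataSnap's resolver, the UPDATE statement is spelled out by hand...
  Adapter.UpdateCommand := SqlCommand.Create(
    'UPDATE Orders SET Freight = @Freight WHERE OrderID = @OrderID', Conn);
  // ...and every parameter is mapped to its source column explicitly
  Adapter.UpdateCommand.Parameters.Add('@Freight', SqlDbType.Money, 0, 'Freight');
  Adapter.UpdateCommand.Parameters.Add('@OrderID', SqlDbType.Int, 0, 'OrderID');
end;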
I think that if ADO is an example of a black box, and DataSnap and ADO .NET are examples of boxes made of glass, then the ADO .NET box is more fragile than DataSnap's (you have to touch both to make them work, but more manipulation is necessary in the case of ADO .NET, and thus the risk of breaking is bigger).

DataSnap still better in some aspects

I believe DataSnap is still better in some aspects:
  1. The Cds.Delta property automatically contains whatever needs to be sent to the server. With ADO .NET, a developer has to define manually what is to be sent (through the GetChanges method or custom methods like Select). GetChanges has some problems though: it returns a dataset that is different from the original, and the whole procedure is much more complicated than the way DataSnap handles things (see the sketch after this list).
  2. I read (p. 488 of ADO .NET by David Sceppa) the paragraph titled "Working with AutoIncrement Values and Relational Data". It ends by saying: "Thanks to the functionality of DataRelation objects, cascading the new autoincrement values throughout your hierarchy is the simplest part of the process". However, the default behavior is to "submit the new orders, retrieve the new autoincrement values for the new orders (master), apply these values to the appropriate line items (detail) (automatically, through cascades), and then submit the new line items to the database". This means at least two calls to the server must be made. Using my method of applying updates, this is performed automatically inside a single call to the app server, with automatic wrapping in a transaction (in ADO .NET the developer has to define transactions explicitly; this has its pros and cons).
  3. The Reconcile method of DataSnap merges the changes propagated from the server using the internal RecNo property, and this way it finds where in the delta to "merge" the changes. ADO .NET relies on primary keys to achieve the same thing (using the Merge method). This means more manual work from the developer's perspective: (p. 499) "...removing the new orders (with the 'dummy' OrderID values) from the main DataSet just before we merge in the DataSet returned by the Web service", or alternatively "Change the primary key... call them PseudoKey". I believe this is handled much better in DataSnap and simplifies the developer's work a lot (if only it weren't for QC 11761...).
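A minimal sketch of the contrast from point 1. On the DataSnap side, one line does it, since the change log travels as Cds.Delta behind the scenes: ClientDataSet1.ApplyUpdates(0). The ADO .NET equivalent (Delphi for .NET syntax; Ds and Adapter are assumed to be already set up) looks roughly like this:

var
  Changes: DataSet;
begin
  Changes := Ds.GetChanges;            // a NEW DataSet holding only the changes
  if Assigned(Changes) then
  begin
    Adapter.Update(Changes, 'Orders'); // resolve the copy against the database
    Ds.Merge(Changes);                 // merge server results back by primary key
    Ds.AcceptChanges;                  // then clear the change log
  end;
end;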

Sunday, April 10, 2005

Cds versus .NET dataset

I followed Bill Todd's advice in a question I posted on the Borland newsgroups about comparing the cds and the .NET DataSet, and finally purchased a copy of the ADO .NET book by David Sceppa.
I've been reading for a couple of days now and my first impressions are:
  • Even though ADO .NET is radically different in design from DataSnap, there are a lot of common concepts that could help a developer using one technology move to the other.
  • Terminology differs (as expected). For example, Cds.MergeChangeLog is similar to DataSet.AcceptChanges, and DataSet.Merge has similarities with RefreshRecord. That makes things rather complex for a developer who would like to use both technologies. What I've been thinking is that I (with my components) contribute to this chaos. For example, I use the term hierarchical cds to refer to self-referenced datasets that can be used with tree-view-like components, whereas in the book I'm reading the equivalent term refers to nested dataset structures (master-detail-detail...). In addition, maybe it would be better to rename the IsClientServer property of my components to something like AutoIncrementStep and to introduce an AutoIncrementSeed property as ADO .NET does, so that a developer used to ADO .NET can get accustomed to my components more easily (see the sketch after this list).
  • I have already posted that I am trying to mimic some .NET DataSet features while building my components. My main problem at this point is supporting field types other than TIntegerField. TNumericField descendants would be quite easy, but TStringField-type fields require some special consideration regarding how autoincrement behavior should be handled. I'll continue reading this excellent resource and may get some ideas.
  • In ADO .NET, when changes are updated back to the database and a primary key value changes (through a value generated by the back-end database), the new value is propagated back to the "client", and then the built-in mechanisms of the .NET DataSet enforce referential constraints. That reminds me of the first implementation of my components. As implemented at this point, all of this "work" is moved to the "server side" and is performed by my provider component. I don't know if this is a better solution; I just followed this (more complicated) approach by intuition, because I think it gives me better control over the procedure of applying updates (not proved, though). For example, I thought that this way I would be able to handle better such situations as updating self-referenced (hierarchical) datasets or handling circular references (at this point I am definitely not satisfied with the way I support these cases).
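The renaming idea from the second bullet would look roughly like this (a sketch only; FSeed and FStep are hypothetical backing fields, and the defaults mirror ADO .NET's convention of negative seed/step values for client-generated keys):

type
  TKTClientDataSet = class(TClientDataSet)
  private
    FSeed: Integer;  // hypothetical backing fields
    FStep: Integer;
  published
    // ADO .NET-style replacements for my IsClientServer switch: the first
    // temporary key equals AutoIncrementSeed, each next one adds AutoIncrementStep
    property AutoIncrementSeed: Integer read FSeed write FSeed default -1;
    property AutoIncrementStep: Integer read FStep write FStep default -1;
  end;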
I'll keep on reading this excellent book, and maybe I'll post a (simplistic) summary of what I think are the most significant things a developer should consider, so that moving from DataSnap to ADO .NET (and vice versa) becomes easier. I am not aware of any resource that already does that.

Monday, April 04, 2005

What now?

I think I've made two components that are quite functional at this point. As I review how they are coded and designed, I conclude that (implicitly) all I did was try to mimic the features of the .NET DataSet that client datasets lack (data relations in particular). In this post I'll just summarize what I think is left to be done so that they may become even more useful:
  • Figure out a way of incorporating the UndoLastChange technique I show in my Interbase example application into the component code (maybe add a unit-level procedure, something like UndoCdsKTList, even though this will be very restrictive in that all CDSs would have to be placed inside a single form or datamodule, and I don't know if it is thread-safe).
  • Use Cds.SavePoint in conjunction with the previous method so that I can provide three procedures (probably also at unit level): StartLocalTransaction, CommitLocalTransaction and RollbackLocalTransaction (see the sketch after this list).
  • The above two methods can be combined with reintroducing Cds.UndoLastChange, GetSavePoint and SetSavePoint (since none of them is declared virtual) so that they become non-functional (care should be taken not to use, for example, the standard TClientDataSet actions with my components; I should note that in my help file).
  • Maybe I should provide a (probably bi-directional) way of setting the primary key property in relation to the pfInKey flag of a TField. I should also make the primary key field read-only.
  • I am not planning to provide support for composite primary and foreign keys. That is much too complicated for me to handle.
  • I should probably support all field types somehow (not just TIntegerField). What puzzles me, though, is how to provide AutoInc (or AutoDec) capabilities for such fields. I should probably let the user set the primary key value and not change it while applying updates (just use these fields to enforce client-side constraints), or alternatively I could make the temporary values behave as integers (casting to and from an integer is rather easy) and let the user/developer define the permanent value in my OnGetGenerator event.
  • Add support for circular references (and probably self-referenced datasets), and drop support for hierarchical datasets as they are implemented right now, since the current implementation is very restrictive. What I think is very difficult to accomplish is supporting the way applying updates is handled in the presence of circular references. For example, if I insert two records into two circularly referenced datasets, I should first "apply" the one whose foreign key has TField.Required set to False, then the other, and then issue an "UPDATE TableName ..." SQL statement to fix up the first table's foreign key value. This is very difficult to accomplish and will make the code very complex (any ideas accepted).
  • A lot of discussion has been going on recently about VCL .NET and how useful it is (primarily for migrating existing Win32 applications to .NET). I should probably provide support for DataSnap .NET; however, my experience with .NET is very limited (for example, I can't figure out how to provide a property editor in .NET).
  • Maybe I should convert my ForeignKeys property from a string property to a collection property (making the code clearer and helping with the migration to .NET).
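To make the local-transaction idea concrete, here is a minimal sketch assuming a single TClientDataSet (the real unit-level version would track a list of datasets and their savepoints; all names are tentative):

var
  SavedPoint: Integer = -1;

procedure StartLocalTransaction(Cds: TClientDataSet);
begin
  SavedPoint := Cds.SavePoint;   // remember the current change-log position
end;

procedure CommitLocalTransaction;
begin
  SavedPoint := -1;              // keep the changes, forget the savepoint
end;

procedure RollbackLocalTransaction(Cds: TClientDataSet);
begin
  if SavedPoint <> -1 then
  begin
    Cds.SavePoint := SavedPoint; // assigning the property rolls the log back
    SavedPoint := -1;
  end;
end;

And hiding the inherited members would follow the standard reintroduce idiom, for example:

type
  TKTClientDataSet = class(TClientDataSet)
  public
    procedure UndoLastChange(FollowChange: Boolean); reintroduce;
  end;

procedure TKTClientDataSet.UndoLastChange(FollowChange: Boolean);
begin
  // Exception lives in SysUtils
  raise Exception.Create('Use RollbackLocalTransaction instead');
end;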
All of the above is quite a lot of work (not to mention testing and bug fixing). However, I don't know if I can manage all of it on my own, I have to say (the .NET implementation in particular). Besides, my time will be limited from now on (I am starting a "real" job, and medicine is very demanding in terms of time and personal resources). So this will depend on feedback, and as long as feedback is minimal I am thinking of "freezing" the whole effort.
After all, the way my components work at this point covers (not to say exceeds) my personal needs, and I am very happy about that. (I use Win32 and "surrogate" keys (see also the section on surrogate keys), and I've been able to build a rather complicated medical-record database application. Having 10,000 drugs in a KTClientDataSet as a lookup table and being able to "download" new "definitions" the way antivirus products do is a design that satisfies my needs.)

Friday, April 01, 2005

Where are bugs coming from?

As I keep on testing (I use a rather complex application with 23 tables, multiple relationships and combinations with nested datasets), I found a rather odd bug. I had promised (primarily to myself) that there would be no need to Refresh datasets for the primary and foreign key values to end up in the right place. But I... failed! Something certainly wasn't on the right track. Being a person of low self-esteem (as far as development efficiency goes), I reviewed my code, and indeed I found some problems (mainly related to foreign keys having null values) and fixed them (or at least I think I did). But the mess with the nested dataset foreign key values remained! I got disappointed at this point.
Then a thought came by: try to reproduce the same problem using plain client datasets and dataset providers. If the bug remained, that would mean it was a Midas bug, not something wrong in my code. And it did. I put together a test case application and posted it as QC# 11761. (I debugged into the source code but was not able to figure out where the problem occurs. Provider.pas seems to do everything correctly and returns all the propagated values in the right order. I believe the problem is somewhere in the implementation of Reconcile_MD in midas.dll.)
Anyway, the generic question is how a developer reacts to bugs and decides where they come from (his own code, or a bug in the library code base he is using). My opinion is that it depends a lot on the developer's character (arrogance, self-esteem), capabilities and experience. In my case, it was more or less intuition driven by disappointment.

Tuesday, March 29, 2005

Primary keys in nested datasets

A main issue I had to solve, so that I could cover cases of nested datasets (with nesting level > 2), was how to handle primary key values when the key belongs to a nested dataset. Client datasets consider two nested datasets of the same "depth level" to be different, which means that, for example, the values 1, 2, 3 of one nested dataset and 1, 2, 3 of another represent different records of the detail table (in the back-end database). Of course, under normal circumstances this is not a problem, since these values are temporary and are replaced by permanent values from the database back-end.
However, this became a major problem when these values were used in my components as primary key values referenced by foreign keys of another Cds (one not belonging to this delta tree). I had to implement custom autoincrement (or autodecrement) behavior for these primary keys, so that the values they take are unique among all nested datasets of the same "depth". This also made applying updates (with my custom solution) safer, and made building complicated file-based apps (with many CDSs related to each other) feasible.
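The core of the idea fits in a few lines; a minimal sketch, assuming one shared counter per "depth" and integer keys (the names are hypothetical, and my actual implementation is more involved):

var
  // one counter shared by ALL nested datasets of the same depth, counting
  // downwards so temporary keys can never collide with real database keys
  TempKeyCounter: Integer = 0;

function NextTempKey: Integer;
begin
  Dec(TempKeyCounter);   // yields -1, -2, -3, ... unique across datasets
  Result := TempKeyCounter;
end;

// wired up in each nested dataset's OnNewRecord event:
procedure TDataModule1.CdsDetailNewRecord(DataSet: TDataSet);
begin
  DataSet.FieldByName('ID').AsInteger := NextTempKey;
end;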

Saturday, March 26, 2005

Found a workaround for D2005

I continued testing, this time on Delphi 6, and the same bug that occurs on D2005 came up. I tried to figure out what the problem was, and since in all cases I used the D2005 midas.dll, the problem had to be somewhere inside DBClient.pas. I compared the relevant code and found that only in D7 plus the supplemental database update is a call to CancelRange made (in the CloneCursor implementation, when the Reset parameter is True). I just added this call in my overridden CloneCursor method, and this way fixed the problem.
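The fix, as a sketch (the exact CloneCursor signature varies a little between Delphi versions, so treat it as approximate):

type
  TKTClientDataSet = class(TClientDataSet)
  public
    procedure CloneCursor(Source: TCustomClientDataSet; Reset: Boolean;
      KeepSettings: Boolean = False); override;
  end;

procedure TKTClientDataSet.CloneCursor(Source: TCustomClientDataSet;
  Reset: Boolean; KeepSettings: Boolean = False);
begin
  inherited CloneCursor(Source, Reset, KeepSettings);
  // the D7 supplemental-update version of DBClient.pas also does this
  if Reset then
    CancelRange;
end;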
What I don't understand, though, is why this change was not included in the D2005 version of DBClient.pas. Maybe a call to CancelRange causes some other problem I can't imagine.