I successfully wrote a small test of a COM server and a VB client, following the CVF Professional documentation and using the COM Server Wizard. I have a couple of questions about some things that seem to work but that I don't exactly understand.
In the sample, instance data is passed to individual methods via a pointer, called ObjectData, to a user-defined data structure. This would be unsatisfactory for us since the whole idea of our project is to recast a large amount of legacy code in the form of a COM server. We do not want to rewrite all this legacy code.
I found that I could define instance data in the form of members of a COMMON block. This seems to work just fine: instance data are stored in COMMON blocks, and multiple instances of the server are nicely independent of each other. This is what we need.
Also, I found that in the subroutines that implement my methods, local variables (which I declared static, as specified in the CVF documentation) are independent across instances and behave the way static variables in a subroutine should. This is just what we need, because our legacy code often depends on local variables in subroutines being static, so that they retain their values when a subroutine is re-entered.
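To make the comparison concrete, here is a minimal sketch of the two ways of holding state being discussed. The type and routine names are made up for illustration; they are not taken from the wizard-generated sample.

```fortran
MODULE SERVER_TYPES
   TYPE INSTANCE_DATA
      REAL    :: TEMPERATURE
      INTEGER :: STEP_COUNT
   END TYPE INSTANCE_DATA
END MODULE SERVER_TYPES

! Approach 1 (the wizard sample): instance data travels with each
! method call through the ObjectData argument, so each COM object
! can carry its own copy of the state.
SUBROUTINE ADVANCE_METHOD (ObjectData)
   USE SERVER_TYPES
   TYPE (INSTANCE_DATA) :: ObjectData
   ObjectData%STEP_COUNT = ObjectData%STEP_COUNT + 1
END SUBROUTINE ADVANCE_METHOD

! Approach 2 (the legacy style): the same state kept in a COMMON
! block. /STATE/ is allocated once per process, so every routine --
! and every object created in that process -- sees the same storage.
SUBROUTINE ADVANCE_LEGACY ()
   REAL    TEMPERATURE
   INTEGER STEP_COUNT
   COMMON /STATE/ TEMPERATURE, STEP_COUNT
   STEP_COUNT = STEP_COUNT + 1
END SUBROUTINE ADVANCE_LEGACY
```

The crucial difference is where the state lives: the ObjectData structure is allocated per COM object, while the COMMON block is allocated once per process image.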
1. Is the passing of instance data through the ObjectData pointer simply done for convenience, and storing instance data in common blocks is equally valid? Or is there something else I need to know?
2. Are my variables in COMMON, and my local variables, stored in a heap (or some other sort of memory structure) that is created when my COM server is instantiated? If not, how are they stored?
Your VB client is probably running a single instance of the COM server in-process. Running multiple copies of the VB client is not a good test: each VB client has its own in-process server, so you wouldn't expect to see any data collision.
To test the server properly you need to create multiple instances within the same process, e.g. a Windows MDI application (one instance per child window) or an ASP web page (all clients running from a single server process).
I have also done a large legacy conversion to a COM server using the Fortran COM Server Wizard. I can definitely say that COMMON blocks and SAVE variables do act as global data when tested properly (i.e. they collide). In my project an important step was to pull out all the COMMON block and SAVE definitions and implement them as instance data. The worst part was that all the global variable references in the code had to be changed from "XXXX" to "ObjectData%XXXX", which is enough to test anyone's patience when there are 100,000 lines of code!
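An illustrative before/after of that conversion (the variable and type names here are invented, not from my project):

```fortran
MODULE SIM_TYPES
   TYPE SIM_DATA
      REAL :: X, V, DT
   END TYPE SIM_DATA
END MODULE SIM_TYPES

! Before: state lives in COMMON and is referenced bare, so it is
! shared process-wide across all server instances.
SUBROUTINE UPDATE_OLD ()
   REAL X, V, DT
   COMMON /SIM/ X, V, DT
   X = X + DT*V
END SUBROUTINE UPDATE_OLD

! After: the same variables become fields of the instance-data type
! passed to each method, and every reference in the body gains the
! ObjectData% prefix.
SUBROUTINE UPDATE_NEW (ObjectData)
   USE SIM_TYPES
   TYPE (SIM_DATA) :: ObjectData
   ObjectData%X = ObjectData%X + ObjectData%DT*ObjectData%V
END SUBROUTINE UPDATE_NEW
```

The declarations move once, but every executable reference in every routine has to be rewritten, which is where the pain comes from.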
Thanks for this information. I will be testing the COM server on ASP and ASP.NET very shortly.
Windows.NET has the ability to integrate with native Windows DLLs. Suppose we abandon the COM Server Wizard, compile our legacy code as a DLL or DLLs, and integrate it with an ASP.NET web application. Do you think we could then avoid converting all the variables into members of a single Instance Data structure?
We have 330,000 lines of code. I have written a preprocessor that gets fairly deep into the syntax of this code, including creating symbol tables, and converts it to f90. Perhaps it could be extended to handle the conversion of variables. Did you convert 100,000 lines of code by hand?
No! I didn't do all the conversion by hand. I wrote various editor macros to assist. It's not a straight 'search and replace': you have to lexically analyse the code to make sure it is a variable name you are changing.
I'm not sure about calling a plain DLL from .NET. I know there is COM to .NET interoperability but I'm not qualified to offer a further opinion (in a few months maybe...) Perhaps someone else would like to comment.
When we have a Fortran.NET compiler (Compaq/Intel, please note!) we may be able to build our legacy code as native .NET components?
Well, last month I looked at the sales hype from some of the other Fortran compiler vendors -- Lahey, Fujitsu, and Salford -- regarding their plans for .NET products. These may not support certain legacy Fortran features, for example EQUIVALENCE (our code is heavily littered with EQUIVALENCEs). So the advent of Fortran.NET compilers may not help much with the problem of migrating legacy Fortran code to current platforms.
Perhaps the "integration with native DLLs" capability provided by .NET Platform Invocation Services will come to the rescue. I hope to report results on this in the near future.
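For what it's worth, exporting a legacy routine from a plain DLL so that a .NET client can reach it through Platform Invocation Services might look roughly like this. The routine name and body are hypothetical; the !DEC$ directives are Compaq Visual Fortran attributes.

```fortran
SUBROUTINE LEGACY_STEP (N, X)
   !DEC$ ATTRIBUTES DLLEXPORT, STDCALL :: LEGACY_STEP
   !DEC$ ATTRIBUTES ALIAS : 'LEGACY_STEP' :: LEGACY_STEP
   INTEGER N
   REAL    X(N)
   INTEGER I
   ! Stand-in for real legacy work on the caller's array.
   DO I = 1, N
      X(I) = 2.0*X(I)
   END DO
END SUBROUTINE LEGACY_STEP
```

On the managed side, a DllImport declaration would map this entry point into the web application. One caveat: any COMMON or SAVE data inside such a DLL is still allocated once per process, so multiple user sessions served from a single ASP.NET process would collide on it just as they would in an in-process COM server.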
If not, converting our code to use a single global structure for all variables will be a challenging problem.
We want to migrate a large legacy application to .NET, where the result will be the ability of a single server to handle multiple user sessions of the Web application, all accessing a common database. I wonder how large the market is for tools to accomplish this?
I can tell you that we have no current plans to offer a Fortran compiler that generates .NET managed code, but that may change in the future. Our focus is on providing tools that facilitate integrating Fortran code in a .NET environment through COM. You can use CVF's COM Server Wizard to create a COM server accessible by managed code, and we're working on a tool that is similar to the Module Wizard for facilitating calls to .NET objects from Fortran compiled code.
If you go here, download the Add-In, and get MASM 6.11d as distributed with the Win98 DDK from Microsoft, you'll be able to:
1. Statically link your Fortran code with VB and limit the changes to your Fortran to those required to access VB procedures and methods, without the complications of COM.
2. By turning your VB into an AXDLL (server), export whatever entry points you want to expose to your Fortran code (client), again with minimum changes to your Fortran code.
Either way you don't have to mess with/up 100-300K lines of code that is presumably worth salvaging. A side benefit is that the dog (F) now wags the tail (VB) and not the converse.