Date: Mon, 16 Mar 1998 18:23:39 -0800 (Pacific Standard Time)
From: Miles Logsdon
To: Jeff Richey, Emilio Mayorga, Bill Gustafson
Subject: a framework for CODE

Jeff, Emilio, Bill

This e-mail is old news to some of you ... a bad attempt at summarizing long discussions for others ... and maybe not necessary. But before going off and starting something that might "suggest" more WORK, I wanted to sum up a group of low-key discussions we've had concerning building a "product/project framework" for a group of C code, algorithms, file format conversion, I/O, etc. that Emilio and Bill have written, which supports the kind of analysis we tend to do. I've talked to you all and listened to others. It is time to bring this idea to the "table" and talk about how far to develop it.

In terms of an overview: We've got CASM (i.e. Milton's products), we've got standard accepted products/algorithms from others (i.e. MaxNDVI, albedo, etc.), we've got 1-byte to 2-byte conversion, we've got .bil to netCDF and HDF, and we've got from Emilio a whole suite of analysis tools like correlation, differences, up-stream basins, etc. ... all available once a dataset is chosen. Right now those tools are at a variety of different stages of "usability" by others. Some only Milton can run, others Bill has to organize, and Emilio has much of his organized and accessed by writing a simple script that calls a variety of C functions ... but so much more is possible!

What I'm suggesting is that we give ourselves a nicely defined task that organizes what we've already agreed to do for EOS, package it as a "product", and view it as a foundation for our LBA work ... Let me explain: If we saw ourselves, at some point in the future, handing out a set of executable code (i.e. software) to a group of researchers that share an image data collection meeting a limited set of requirements (common spatial extent, rows & columns, etc.)
and by using this collection of tools they could perform file conversions for various commercial software, explore the dataset, prepare data for their model, view and explore the model output, and prepare basic analyses of correlation, covariance, trends, etc. between model input and output ... then we would be describing ourselves RIGHT NOW as needing this!!!

This is do-able if we don't try to become "software developers". My point would be that we think of putting all our "home-grown" tools into a common environment, so that when we develop tools that we use ourselves we can share them among ourselves and ... maybe others. The key is ... we're doing this now, just without the "shared framework". With basic I/O functions, conversion functions, etc. in place, the various analysis tools can be added and adapted from a variety of sources.

This is what Emilio has been trying to get me to see ... And this is what Bill has been suggesting ... And this is "kind-of" what I see the CASA team has developed ... And this is where we are often "defeated" when wanting to work together. If we come into LBA as a research team already carrying a toolbox for regional analysis, then all the better.

Let me make a point. I'm NOT talking about a model, or building CASA into this ... I'm talking about a common environment/framework for working on "building" a model. An environment that would let you get the data ready, compare it, view it, etc. So that if you had specific software where you want to do analysis, you could ... or if you write or steal a function from somewhere, the I/O and everything would already be there.

So, with this e-mail I just wanted to kick this off and have you people shoot it down or suggest the next step. It means coordinating. It means being focused on doing just what needs to be done. It means creating a product (something that people can say ... this works, that doesn't). But it also means getting a "LOT" of mileage out of what you people have already done ...
"mileage" that could be seen as a tangible product, checked off a to-do list, cited as progress, etc.

This deserves a frank and open discussion.

Miles .....