I need to store a large number of files (compressed sparse matrices) in a highly structured, efficient way. The complication is that the files are produced by a large number of heterogeneous scripts written in different languages: some use C/Fortran libraries, others use MATLAB.
I've researched the topic and come up with a number of ideas, but most of the existing work I can find focuses on image storage.
Idea 1: Write a C library that stores the output file in the database directly. It could be called from MATLAB and C, though I don't think it would work from Fortran.
Idea 2: Write a standalone program that can be called from any of the languages above; it takes a file location as input and does the rest.
Idea 3: Have each script write to a specific temp folder (did I mention this has to scale?) and have another program collect the output and store it.
Suggestions?