I'm on a university IT team that runs a group of computer labs for several engineering departments. One requirement for these labs is that every computer be able to run fairly large software suites: things like ProE, AutoCAD, etc. The machines themselves don't have much disk space; some now have 500GB drives, while most have only 80GB drives.
We've been limping along with the Tcl-implemented 'modules' paradigm: http://modules.sourceforge.net/
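For anyone unfamiliar with it: a modulefile is a small Tcl script that sets up PATH and related variables for one package, and users have to `module load` the right ones in every session before an application will start. A minimal sketch of the kind of thing involved (the paths and version here are made up for illustration, not our actual layout):

    #%Module1.0
    ## Hypothetical modulefile for a large CAD suite served from an NFS share.
    ## The application root and version below are illustrative only.
    module-whatis "Pro/ENGINEER (example modulefile)"

    set approot /nfs/apps/proe/wildfire5

    # Put the application's binaries and libraries on the user's paths.
    prepend-path PATH            $approot/bin
    prepend-path LD_LIBRARY_PATH $approot/lib

Multiply that by every package and version we carry, and add the `module avail` / `module load` dance users have to do each time, and you have our current setup.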
Users, frankly, hate using modules. We're asking ourselves (and now I'm asking you): what would be a better way of providing software to our users, given the following?
* The total amount of installed software is (likely) larger than a single client's disk
* All software must be available on all computers (no Mechanical Engineering lab vs. Electrical Engineering lab vs. ...)
* We'd like the software to run as smoothly as possible (modules live on NFS, and engineering programs run over NFS can be quite sluggish)
* We really dislike using modules
Any input is greatly appreciated. Thanks!