Hi gang,
I have server A. It receives email and sftp uploads from users adam, brian, carol and diane. Each user has an email account - adam@example.com - and an sftp account. These user accounts only receive email and accept sftp uploads (details below).
Server A, upon receipt of an email (in, say, /home/adam/mail/new/), opens the email via a Perl routine and puts the attachments somewhere (say, /home/adam/email_processing/). If user diane sftps a file (to, say, her chrooted /home/diane/datadir/), the same routine grabs the file from there and moves it (to, say, /home/diane/ftp_processing/).
It also logs the arrival of the email and (some work still to be done here) the arrival of the sftp file. Let's call all of the above the "receiving" stage.
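In case it helps frame the question, here's a rough Python sketch of what that receiving stage does. The real routine is Perl; the paths are the examples above and the log location is just an assumption:

    #!/usr/bin/env python3
    """Sketch of the "receiving" stage: pull attachments out of new
    Maildir messages, and move sftp'd files into ftp_processing/."""
    import email
    import logging
    import shutil
    from email import policy
    from pathlib import Path

    # Assumed log location; the real routine's logging lives elsewhere.
    logging.basicConfig(filename="/var/log/receiving.log", level=logging.INFO,
                        format="%(asctime)s %(message)s")

    def receive_mail(user: str) -> None:
        """Extract attachments from each new message into email_processing/."""
        home = Path("/home") / user
        dest = home / "email_processing"
        for msg_path in (home / "mail" / "new").iterdir():
            msg = email.message_from_bytes(msg_path.read_bytes(),
                                           policy=policy.default)
            for part in msg.iter_attachments():
                name = part.get_filename()
                if name:
                    (dest / name).write_bytes(part.get_payload(decode=True))
                    logging.info("email attachment %s arrived for %s", name, user)

    def receive_sftp(user: str) -> None:
        """Move anything the user uploaded into ftp_processing/ and log it."""
        home = Path("/home") / user
        for f in (home / "datadir").iterdir():
            shutil.move(str(f), str(home / "ftp_processing" / f.name))
            logging.info("sftp file %s arrived for %s", f.name, user)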
Another process then checks the processing folders - /home/../email_processing/ and /home/../ftp_processing/ - works out what to do with the files, and writes the details of the actions into a database. Let's call this the "pre-processing" stage.
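A minimal sketch of that pre-processing step, assuming a SQLite database at a made-up path and a made-up actions table; the suffix-to-action mapping is purely illustrative:

    #!/usr/bin/env python3
    """Sketch of the "pre-processing" stage: scan each user's processing
    folders and queue one action per file."""
    import sqlite3
    from pathlib import Path

    DB = "/var/lib/pipeline/actions.db"   # hypothetical location

    def action_for(path: Path) -> str:
        # Hypothetical mapping; the real rules live in your pre-processor.
        return {".csv": "import_csv", ".txt": "import_tab",
                ".xls": "import_excel", ".xlsx": "import_excel"}.get(
                    path.suffix.lower(), "inspect")

    def queue_actions() -> None:
        con = sqlite3.connect(DB)
        con.execute("""CREATE TABLE IF NOT EXISTS actions (
                           id     INTEGER PRIMARY KEY,
                           path   TEXT UNIQUE,
                           action TEXT,
                           status TEXT DEFAULT 'pending')""")
        for pattern in ("*/email_processing/*", "*/ftp_processing/*"):
            for f in Path("/home").glob(pattern):
                # UNIQUE path + INSERT OR IGNORE means re-scans are harmless.
                con.execute(
                    "INSERT OR IGNORE INTO actions (path, action) VALUES (?, ?)",
                    (str(f), action_for(f)))
        con.commit()
        con.close()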
Then another process reads the actions from the database and carries them out - in reality, cracks the files open and sucks the data out of them. Let's call this the "processing" stage.
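And the processing side, reading pending actions back out of that same assumed table. The csv module covers the CSV/tab-delimited cases; Excel files would need a third-party library such as openpyxl:

    #!/usr/bin/env python3
    """Sketch of the "processing" stage: work through pending actions."""
    import csv
    import sqlite3

    DB = "/var/lib/pipeline/actions.db"   # same hypothetical database as above

    def process_pending() -> None:
        con = sqlite3.connect(DB)
        pending = con.execute(
            "SELECT id, path, action FROM actions "
            "WHERE status = 'pending'").fetchall()
        for row_id, path, action in pending:
            delimiter = "\t" if action == "import_tab" else ","
            with open(path, newline="") as fh:
                for record in csv.reader(fh, delimiter=delimiter):
                    pass  # suck the data out: load each record into the real store
            con.execute("UPDATE actions SET status = 'done' WHERE id = ?",
                        (row_id,))
        con.commit()
        con.close()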
Once they're processed, the files get moved to /home/../processed_files/. Eventually they're archived. The "archiving" stage.
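Something like this for the move-and-archive steps, with an assumed 30-day cutoff before files get rolled into a tarball:

    #!/usr/bin/env python3
    """Sketch of the move-then-archive steps, assuming the layout above."""
    import shutil
    import tarfile
    import time
    from pathlib import Path

    def move_to_processed(path: Path, user: str) -> Path:
        dest = Path("/home") / user / "processed_files" / path.name
        shutil.move(str(path), str(dest))
        return dest

    def archive_old(user: str, days: int = 30) -> None:
        """Roll files older than `days` (an assumed cutoff) into a tarball."""
        folder = Path("/home") / user / "processed_files"
        cutoff = time.time() - days * 86400
        old = [f for f in folder.iterdir() if f.stat().st_mtime < cutoff]
        if not old:
            return
        with tarfile.open(folder / time.strftime("archive-%Y%m%d.tar.gz"),
                          "w:gz") as tar:
            for f in old:
                tar.add(f, arcname=f.name)
                f.unlink()  # archived, so drop the loose copy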
The users, once a file is delivered to server A, have no further contact with it. Everything after the delivery stage is handled by user root.
I'm thinking that there should be a server B that does the pre-processing bit and stores the processing database. Not particularly powerful. It could also handle the archiving stage. And a server C that does the actual processing - the most powerful of all (the files are all Excel/tab-delimited/CSV/database-y kinds of files).
All servers are on someone else's hardware - probably Digital Ocean.
Now here's the question. How should I allow each server to access the files? NFS share? NFS via private networking? rsync the files from server to server? Something else?
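For what it's worth, if rsync wins, the push from A to B could be as simple as this (hostname, user and destination paths are placeholders; it assumes key-based ssh between the boxes, ideally over the private network). The --remove-source-files flag makes it a move rather than a copy:

    #!/usr/bin/env python3
    """Sketch of pushing the processing folders from server A to server B
    with rsync over ssh. All remote names are placeholders."""
    import subprocess
    from pathlib import Path

    def push_to_b(user: str) -> None:
        for stage in ("email_processing", "ftp_processing"):
            src = Path("/home") / user / stage
            subprocess.run(
                ["rsync", "-az", "--remove-source-files",  # move, don't copy
                 f"{src}/",  # trailing slash: send the contents, not the dir
                 f"pipeline@server-b.internal:/srv/incoming/{user}/{stage}/"],
                check=True)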
Any ideas gratefully accepted.
Cheers,
---=L