Hey there Shelley --
Your users could have (for example) access to some network share or somewhere on the server that they could save their files. And your script could start (before the Rscript part) by doing a "find" command for any files sitting in that folder. That's just an idea ... but you could definitely have it run on a single cron job.
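Roughly what I mean, as a sketch (the folder paths and file pattern here are made up -- swap in whatever your setup uses):

```r
# Cron-driven R script: check a shared "inbox" folder for new files
# before doing any processing. Paths below are placeholders.
inbox     <- "/srv/shiny-data/inbox"
processed <- "/srv/shiny-data/processed"

pending <- list.files(inbox, pattern = "\\.csv$", full.names = TRUE)

for (f in pending) {
  dat <- read.csv(f)
  # ... do the actual processing on `dat` here ...

  # move the file out of the inbox so the next cron run skips it
  file.rename(f, file.path(processed, basename(f)))
}
```

Then a single crontab entry like `*/5 * * * * Rscript /srv/scripts/process_inbox.R` (path again hypothetical) runs it every five minutes.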
Either make sure their data gets stored as a file in a specific location on the server, or (my recommendation) set up a MySQL server and use DBI's dbWriteTable() function (that's the one -- not DbCopyTable) to throw it straight into the database. Much easier to access that way.
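The database route is only a few lines. Connection details here are obviously placeholders, and I'm assuming the upload arrives as a data frame called `user_data`:

```r
library(DBI)

# Placeholder connection details -- use your own credentials
con <- dbConnect(RMySQL::MySQL(),
                 dbname   = "shiny_uploads",
                 host     = "localhost",
                 user     = "appuser",
                 password = "secret")

# Append the uploaded data frame to a staging table;
# append = TRUE creates the table on first use, then adds rows
dbWriteTable(con, "uploads", user_data, append = TRUE, row.names = FALSE)

dbDisconnect(con)
```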
If your front end app could simply accept their data and place it as a file on the server or into a database, the cron script could run and just check those locations for anything that needs to be processed. You'll just need a way of defining which user is waiting for the results from which files - either putting this information as metadata inside the file name, or having a database table that links users to their files. That's a discussion we could have another time. (If you're interested in the database method, I can explain how we do ours in private emails.)
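For the file-name metadata idea, the convention is entirely up to you -- something like this (the naming scheme below is just an example I've invented):

```r
# Hypothetical convention: "<user>_<timestamp>.csv",
# e.g. "shelley_20240517T101500.csv"
fname <- "shelley_20240517T101500.csv"

parts     <- strsplit(tools::file_path_sans_ext(basename(fname)), "_")[[1]]
user_id   <- parts[1]  # who gets the results
submitted <- parts[2]  # when the file arrived
```

The cron script can then parse every inbox file the same way and know exactly who to send results back to.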
The big issue you're facing is making sure that if they don't wait for processing to finish before closing their browser, the data doesn't vanish into oblivion. So a reliable system will definitely need a few moving parts: a Shiny front end works great, paired with a cron-style R back end.
We used to have our data stored as files, but the more files you accumulate, the harder they are to manage. Databases are great because we can index each dataset and flag it as new, processed, processed but not reported yet, etc.
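The flagging is just a status column on the datasets table. Something like this, roughly (table and column names are made up for illustration, and `con` is an open DBI connection):

```r
library(DBI)

# Find everything the cron job still has to do
todo <- dbGetQuery(con, "SELECT id FROM datasets WHERE status = 'new'")

# ... process dataset 42 ...

# Then flip its flag so it isn't picked up again
dbExecute(con, "UPDATE datasets SET status = 'processed' WHERE id = 42")
```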
Learning the dplyr, DBI and pool packages will get you properly sorted for database input and output.
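To give you a flavour of how those three fit together in a Shiny app (connection details are placeholders again):

```r
library(pool)
library(dplyr)

# pool keeps a set of database connections alive for the Shiny app,
# so you don't open/close one per request
pool <- dbPool(RMySQL::MySQL(),
               dbname   = "shiny_uploads",
               host     = "localhost",
               user     = "appuser",
               password = "secret")

# dplyr verbs get translated to SQL and run on the server;
# collect() pulls the result back into R as a data frame
results <- pool %>%
  tbl("datasets") %>%
  filter(status == "processed") %>%
  collect()

poolClose(pool)
```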
It's been a long week, so I'm probably not making any sense at this point ha ha. But feel free to ask any questions you like.
Have a great weekend.