I've noticed that there's a 100 MB limit on total deployment size. I fully understand why there has to be some limit, but it's been a bit inconvenient for my app, which comes in at 108 MB.
Rather than ask for the limit to be raised (which would be a temporary solution, at best), I thought I'd ask the community if they have any ideas for the following situation:
I'm really new to building Shiny apps and I've only managed to get a handful working so far, but I'm seeing the following pattern in my work:
1. Grab a large public dataset, clean it, format it, calculate derived metadata, and then put it into some on-disk storage format (I've found SQLite particularly useful).
2. Build a Shiny app that lets users search, filter, and aggregate the cleaned dataset.
3. Provide plots, reports, or links to other resources based on the user-generated subset (see the sketch below).
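To make that concrete, here's a stripped-down sketch of the pattern, collapsed into a single app.R. The table name, columns, and file names are made up for the example; the real cleaning code is much longer. Step 1 runs once, offline:

```r
# Step 1 (run once, offline): clean the raw data and write it to SQLite.
library(DBI)
library(RSQLite)

raw <- read.csv("raw.csv", stringsAsFactors = FALSE)
# ... cleaning, formatting, derived columns go here ...
con <- dbConnect(RSQLite::SQLite(), "mydata.sqlite")
dbWriteTable(con, "observations", raw, overwrite = TRUE)
dbDisconnect(con)
```

Then the app itself just runs user-defined queries against that file:

```r
# Steps 2-3: the app filters the table on the fly and plots the subset.
library(shiny)
library(DBI)
library(RSQLite)

con <- dbConnect(RSQLite::SQLite(), "mydata.sqlite")
onStop(function() dbDisconnect(con))  # tidy up when the app shuts down

ui <- fluidPage(
  textInput("pattern", "Name contains:"),
  plotOutput("hist")
)

server <- function(input, output, session) {
  subset_df <- reactive({
    # Parameterised query: only the matching rows are pulled into R
    dbGetQuery(con,
               "SELECT * FROM observations WHERE name LIKE ?",
               params = list(paste0("%", input$pattern, "%")))
  })
  output$hist <- renderPlot(hist(subset_df()$value))
}

shinyApp(ui, server)
```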
The problem is that the actual Shiny code is tiny, maybe a handful of lines, but in order to let users define their own filters and subsets on the fly, the whole dataset has to be available to the Shiny server. Right now that means tiny ui.R and server.R files and a giant database file that pushes me over the 100 MB deployment limit.
I only need read access to the database: I'm not altering anything in it, and the output returned to the user is tiny. Because the data lives in a file on disk rather than in memory, the app is pretty light on RAM and CPU as well.
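For what it's worth, the connection can even be opened explicitly read-only, something like:

```r
# Open the SQLite file read-only; the app never writes to it
con <- dbConnect(RSQLite::SQLite(), "mydata.sqlite",
                 flags = RSQLite::SQLITE_RO)
```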
Does anyone have alternative schemes or ideas for Shiny apps that fit this general pattern of read-only access to a large, static database, but without taking up as much deployment space?
Cheers,
D