Large database file as part of Shiny deployment: alternatives?

637 views

Daniel Buijs

unread,
Jan 8, 2015, 3:18:26 PM
to shinyap...@googlegroups.com
I've noticed that there's a limit of 100 MB on the total deployment size. I fully understand why there has to be some limit, but it's been a bit inconvenient for my app, which comes in at 108 MB.

Rather than ask for the limit to be raised (which would be a temporary solution, at best), I thought I'd ask the community if they have any ideas for the following situation:

I'm really new to building Shiny apps and I've only managed to get a handful working so far, but I'm seeing the following pattern in my work:

1. Grab a large public dataset, clean it, format it, calculate derived metadata, and then put it into some on-disk storage format (I've found SQLite particularly useful).
2. Build a Shiny app that lets users search, filter and aggregate from the cleaned dataset.
3. Provide plots, reports, or links to other resources based on the user-generated subset. 
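A minimal sketch of step 1 (the raw file, table, and column names here are all hypothetical, just to illustrate the write-once pattern):

```r
library(DBI)
library(RSQLite)

# Hypothetical cleaning step: raw CSV in, derived metadata out.
raw <- read.csv("public_dataset.csv", stringsAsFactors = FALSE)
clean <- raw[complete.cases(raw), ]
clean$year <- as.integer(substr(clean$date, 1, 4))  # derived column

# Write once to an on-disk SQLite file that the app will query later.
con <- dbConnect(RSQLite::SQLite(), "data.sqlite")
dbWriteTable(con, "records", clean, overwrite = TRUE)
dbDisconnect(con)
```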

The problem is that the actual Shiny code is tiny, maybe a handful of lines, but in order to let users define their own filters and subsets on the fly, the whole dataset has to be available to the Shiny server. Right now this means tiny ui.R and server.R files and a giant database file that pushes me over the 100 MB deployment limit.

I just need read access to the database. I'm not altering anything in it, and the user output is tiny too. Because I'm storing it in a file instead of in memory, it's pretty light on RAM and CPU usage as well. 
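That read-only setup can be sketched as follows, assuming a bundled data.sqlite file with a hypothetical records table (names and columns are illustrative, not from the actual app):

```r
library(shiny)
library(DBI)
library(RSQLite)

# Open the bundled database read-only; the app never writes to it.
con <- dbConnect(RSQLite::SQLite(), "data.sqlite",
                 flags = RSQLite::SQLITE_RO)

ui <- fluidPage(
  textInput("pattern", "Filter by name"),
  tableOutput("results")
)

server <- function(input, output, session) {
  output$results <- renderTable({
    # Parameterised query: the user-supplied filter never touches the SQL text.
    dbGetQuery(con,
      "SELECT * FROM records WHERE name LIKE ? LIMIT 100",
      params = list(paste0("%", input$pattern, "%")))
  })
  onStop(function() dbDisconnect(con))
}

shinyApp(ui, server)
```

Because queries run against the file rather than an in-memory copy, only the matching rows are ever loaded.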

Does anyone have any alternate schemes or ideas for Shiny apps that fit the general pattern of accessing a large static database that wouldn't take up as much space? 

Cheers,

D

Phill Clarke

unread,
Jan 9, 2015, 5:49:58 PM
to shinyap...@googlegroups.com
Perhaps you could host the database remotely on another server?

The regular R ODBC connector packages support this; see, e.g., http://www.stat.berkeley.edu/~nolan/stat133/Fall05/lectures/SQL-R.pdf

And here's a StackOverflow page outlining how to achieve it with the dplyr package:

http://stackoverflow.com/questions/22461848/using-dplyr-to-connect-to-ssl-encrypted-remote-database

Even more details here:

http://cran.r-project.org/web/packages/dplyr/vignettes/databases.html
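A sketch of that remote-database approach, using DBI together with dplyr's SQL translation (the host, credentials, and table name are placeholders; this assumes a Postgres server and the dbplyr backend installed):

```r
library(DBI)
library(dplyr)

# Connection details are placeholders. Because the database lives on a
# separate server, only query results travel to the Shiny app, so the
# deployment bundle stays small.
con <- dbConnect(RPostgres::Postgres(),
                 host     = "db.example.com",
                 dbname   = "mydata",
                 user     = "shiny_reader",
                 password = Sys.getenv("DB_PASSWORD"))

# dplyr verbs are translated to SQL and executed remotely;
# collect() pulls back only the aggregated subset.
subset <- tbl(con, "records") %>%
  filter(year >= 2010) %>%
  count(category) %>%
  collect()
```

Keeping the password in an environment variable (rather than in server.R) avoids shipping credentials inside the deployment bundle.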

Vance Lopez

unread,
Apr 9, 2016, 3:39:05 PM
to ShinyApps Users
Hi Daniel,

What was your workaround for the application bundle limit? I have the exact same issue.

Thanks,
Vance

Tareef Kawaf

unread,
Apr 11, 2016, 6:38:47 AM
to Vance Lopez, ShinyApps Users
The 100 MB limit was raised a while ago to 1 GB on the free plan and to 3 GB on the Basic plan and above. How large is your dataset?
--
You received this message because you are subscribed to the Google Groups "ShinyApps Users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to shinyapps-use...@googlegroups.com.
To post to this group, send email to shinyap...@googlegroups.com.
To view this discussion on the web visit https://groups.google.com/d/msgid/shinyapps-users/867b24ef-7290-4f45-b901-be68d51c028b%40googlegroups.com.
For more options, visit https://groups.google.com/d/optout.

Vance Lopez

unread,
Apr 11, 2016, 11:38:57 AM
to ShinyApps Users, vance....@gmail.com
Thank you, Tareef. These limits work for my application; I was able to get it uploaded after determining that the errors were related to my internet connection. The SQLite solution works well in this case.

