Importing Large Vagrant Image


Yogesh Rananavare

Jul 20, 2015, 4:21:47 AM
to vagra...@googlegroups.com
Hi, 

We are creating a database Vagrant image that loads the database environment for developers.
Currently the VirtualBox image is 550 GB.
When we create a Vagrant box and import it, Vagrant copies this 550 GB image from the source.
Then, when we vagrant up, it explodes the box file and consumes another 550 GB.
The challenge is that each developer has a 1 TB HDD; importing such a big box file creates redundancy and consumes more than 1 TB of disk space.
Is there any alternative so that the box file is not stored on disk?

PS: We plan to share this box file via a shared folder on the network.
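
Roughly, the Vagrantfile side of what we have in mind looks like this (the box name and share path are placeholders, not our actual setup):

    Vagrant.configure("2") do |config|
      config.vm.box = "dbbox"
      # The box is fetched from the network share, but Vagrant still
      # copies it into ~/.vagrant.d/boxes (~550 GB) and then imports
      # the VM from it (another ~550 GB) -- the redundancy described above.
      config.vm.box_url = "file:///mnt/shares/boxes/dbbox.box"
    end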

Chris Bednarski

Jul 20, 2015, 3:07:41 PM
to vagra...@googlegroups.com
Hi Yogesh!

I've done this type of thing before. A 550 GB image is very slow and difficult to manage; ideally your VM is as small as possible. Here are some strategies I've used in the past:

- Use fixtures or test data so you don't copy the real database.
- Use a sample of the dataset so you don't copy the entire database.
- Host the database on a centrally managed server and connect to it from the VM. You can create a separate database for each developer, or share one among several. Linked clones can make this even easier (see the sketch after this list).
- As a last resort, host the database on the developer's host OS and connect from inside the VM. That way the dataset is not copied twice.
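
Here's a minimal Vagrantfile sketch combining the central-server and linked-clone ideas. The box name and the "dbserver.internal" hostname are placeholders, and linked clones need a Vagrant/VirtualBox combination that supports the linked_clone option:

    Vagrant.configure("2") do |config|
      config.vm.box = "ubuntu/trusty64"

      config.vm.provider "virtualbox" do |vb|
        # Import the box once, then create cheap differencing-disk
        # clones for each "vagrant up" instead of a full disk copy.
        vb.linked_clone = true
      end

      # Point the guest at the centrally managed database rather than
      # shipping the data inside the image. The env var name is just a
      # convention for this sketch, not something Vagrant defines.
      config.vm.provision "shell", inline: <<-SHELL
        echo 'export DB_HOST=dbserver.internal' > /etc/profile.d/db.sh
      SHELL
    end

(If you go the last-resort route instead, the guest can usually reach the host OS at 10.0.2.2 on VirtualBox's default NAT network.)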

In my experience it's almost always preferable to use a sample or fixture of the data rather than a full production dataset (see the sketch after the list below). There are at least a few reasons for this:

- Smaller datasets are much faster to copy and manage. Less time spent waiting means more productive developers.
- Full production datasets often contain personally identifiable or confidential information such as email addresses and IP addresses. If a developer's computer is lost or stolen, that data goes with it. Using a sample dataset protects against this kind of leakage.
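
With the fixture approach, the provisioning step can stay tiny. A sketch, assuming MySQL is already installed in the box, where fixtures.sql and the database name are placeholders for a small, scrubbed dump:

    Vagrant.configure("2") do |config|
      config.vm.box = "ubuntu/trusty64"
      # Load a small fixture dump from the default /vagrant synced
      # folder at provision time, instead of baking 550 GB of
      # production data into the box itself.
      config.vm.provision "shell", inline: <<-SHELL
        mysql devdb < /vagrant/fixtures.sql
      SHELL
    end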

Cheers!

Chris