Following the Stack Overflow sample, I split the ETL to extract to files
first, and I am now able to run the import on a faster computer.
Reading from the files into RavenDB via batch commands, the ETL client
runs at about 50 MB of memory, but the RavenDB server eventually reached
7 GB. I am using the config values below**; is there anything else I can
use to limit memory usage?
** From http://ravendb.net/faq/low-memory-footprint:
<add key="Raven/Esent/CacheSizeMax" value="256"/>
<add key="Raven/Esent/MaxVerPages" value="32"/>
Memory usage increased in a seesaw pattern: it occasionally dropped about
1 GB, then regained a bit more, drifting from 2 GB up to 7 GB. Eventually
it got stuck seesawing at 7 GB, with the seesaw steps flattening out so
that it stays at 7. This is slowing the ETL down now, but perhaps it will
still finish.
Unfortunately I don't know how to share a repro, since the data is
closed. Would it help if I captured some kind of memory profile or
trace? Any other suggestions?
> > > > I'm writing an ETL that loads documents from SQL into RavenDB.
> > > > There may be multiple patch operations on the same target object. The
> > > > patch operations append to an array, so I need to be sure they
> > > > happen in order. I'm using Rhino.ETL for the first time; I like it.
>
> > > > Can I assume that the patch operations within a single save call will
> > > > be applied in the order they were added to the session? I only really
> > > > care about the order of patches that apply to the same object.
>
> > > > Can I assume that the batches will be applied in the order they're
> > > > generated within the ETL process? (With the current batching, it's
> > > > possible that updates to the same object will be split across
> > > > batches...)
>
> > > > The ETL is loading from SQL into RavenDB. When I run it across the
> > > > full data set (~3 million rows) I noticed the ETL process went up to 2
> > > > GB of memory, then stayed about there. Is that normal/OK? Perhaps I'm
> > > > holding onto memory too long, or maybe it's just not aggressive about
> > > > reclaiming memory until it needs to.
>
> > > > I see the Stack Overflow ETL sample writes to file; in this case I'm