I wonder how I can clear the database before each spec.
At first I thought I could wrap each spec in a transaction, like so:
https://github.com/andreasronge/neo4j/blob/master/spec/spec_helper.rb#L64
and then roll it back.
But the problem is that models saved within that transaction are not
traversable, so this expectation fails:

  User.create
  User.count.should > 0
This is pretty unexpected to me.
So my question is: how do I clean up the database before each test?
Cheers,
/peter neubauer
GTalk: neubauer.peter
Skype peter.neubauer
Phone +46 704 106975
LinkedIn http://www.linkedin.com/in/neubauer
Twitter http://twitter.com/peterneubauer
brew install neo4j && neo4j start
heroku addons:add neo4j
c.after(:each) do
  finish_tx
  Neo4j::Rails::Model.close_lucene_connections
  Neo4j::Transaction.run do
    Neo4j::Index::IndexerRegistry.delete_all_indexes
  end
  Neo4j::Transaction.run do
    Neo4j.threadlocal_ref_node = Neo4j::Node.new :name => "ref_#{$name_counter}"
    $name_counter += 1
  end
end
from https://github.com/andreasronge/neo4j/blob/master/spec/spec_helper.rb
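The ref-node swap is the interesting part of that hook: old data is never deleted, it just becomes unreachable once traversals start from a fresh reference node. A toy plain-Ruby sketch of the idea (no Neo4j involved; `ToyGraph` and all its names are made up for illustration):

```ruby
# Toy model of the "fresh reference node per test" trick:
# data written under an old root is never deleted, just orphaned.
class ToyGraph
  def initialize
    @roots = {}      # root name => list of nodes under that root
    @current = nil
  end

  # What the after(:each) hook does: point at a brand-new root.
  def fresh_root!(name)
    @current = name
    @roots[name] = []
  end

  def create_node(value)
    @roots[@current] << value
  end

  # A "traversal" only sees nodes reachable from the current root.
  def visible_nodes
    @roots[@current]
  end
end

graph = ToyGraph.new
graph.fresh_root!("ref_1")
graph.create_node("alice")
graph.fresh_root!("ref_2")     # new reference node, as in the hook above
puts graph.visible_nodes.size  # => 0: old data orphaned, not deleted
```

This also explains the observation later in the thread that the data is still visible in Neoclipse: nothing is ever physically removed.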
Cheers,
/peter neubauer
> Guys, there is the new ImpermanentGraphDatabase, that is fully
> functional but operating on non-persistent RAM-backed FileChannels at
> the lowest level. We use it for testing in Java land, I think it would
> be very applicable here, too!
>
That is exactly the solution I'm after.
In RDBMS world I use in-memory SQLite for that.
So we can just drop the database and create a new one before the test and it should be cheap and fast.
Sounds awesome.
But I'm a little bit lost as to how I can make use of it (sorry, just starting with all this), especially given the issues outlined by Andreas.
Or maybe for now I need to clean it up as Vivek explained?
BTW, I would call the class Transient/Memory Database instead of Impermanent :-)
Cheers.
@Dmytrii - I'm thinking of moving the lighthouse issues to github,
since it looks like github now has support for crosslinking between
commits and issues, and for hash tags in commit messages to
open/close issues, etc.
> Created an issue for that.
>
> http://neo4j.lighthouseapp.com/projects/15548-neo4j/tickets/208-better-cleanup-after-rspecs-using-impermanentgraphdatabase
Thanks for that.
Hope it won't take too much time to implement.
Unfortunately I won't be able to contribute PRs; I really have to switch the current app over. And that's the last chance I'm giving neo4j :)
> @Dmytrii - I'm thinking of moving the lighthouse issues to github,
> since it looks like github now has support for crosslinking between
> commits and issues and support for hash tags in commit messages to
> open close issues etc..
Way to go! But I still can't see the "Issues" section on the repo.
Also, with that in mind, you could have created the "cleanup specs" issue on GitHub.
Cheers.
The only thing I'm missing now is commit logs referring to other repos' issues, and voting on issues. Otherwise, github is great.
/peter
Sent from my phone, please excuse typos and autocorrection.
Wow thanks,
That is a life saver!
/peter
Sent from my phone, please excuse typos and autocorrection.
> Yes, you're right. I actually forgot to mention that - we're using memory disks on both the linux and mac platforms on my project and it does significantly speed up the tests.
Well done!
I just can't get my hands on that. I also want to avoid any huge setup: create the memory disk before all specs run and then drop it at the end.
Do you mind sharing your setup?
mkdir -p /tmp/neo4j_testing
mount -t tmpfs -o size=500M tmpfs /tmp/neo4j_testing
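A hedged sketch of wrapping those two commands into a setup/teardown script for the whole spec run (the 500M size and /tmp/neo4j_testing path are just the values from the commands above; the tmpfs mount is Linux-specific and needs root):

```shell
#!/bin/sh
# Sketch: RAM-backed directory for the Neo4j test database store.
set -e

DB_DIR=/tmp/neo4j_testing

setup() {
  mkdir -p "$DB_DIR"
  # tmpfs keeps the store in RAM, so specs avoid disk I/O entirely
  sudo mount -t tmpfs -o size=500M tmpfs "$DB_DIR"
}

teardown() {
  sudo umount "$DB_DIR"
  rmdir "$DB_DIR"
}

case "$1" in
  setup)    setup ;;
  teardown) teardown ;;
  *) echo "usage: $0 {setup|teardown}" >&2; exit 1 ;;
esac
```

Point Neo4j::Config[:storage_path] at a directory under $DB_DIR and run `setup` once before the suite and `teardown` after it.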
Mac:

You may need to do a User.destroy_all after each test as a workaround. Otherwise, you'll have to delete all the nodes in your database as I'd suggested in an earlier thread. The 'all' call doesn't use the Lucene index AFAIK; it uses traversals.
Is your user model shared across tenants? I.e., does it have a

  ref_node { Neo4j.default_ref_node }

declaration anywhere?
> Hard to say what exactly is going on without examining what's happening at runtime. I've seen this kind of behaviour when top level transactions are inadvertently created in tests / migrations. Is any of your before / after stuff creating Neo4j transactions anywhere?
Yes, it's basically done the same way neo4j.rb does it.
The problem is that it works fine for RSpec, but not for Cucumber!
Anyway, here is my "features/support/ne4j.rb" file that is supposed to take care of the clean-up:
require 'fileutils'

# TODO: Remove dup: copy paste from the spec/support/neo4j.rb
def rm_db_storage!
  FileUtils.rm_rf Neo4j::Config[:storage_path]
  raise "Can't delete db" if File.exist?(Neo4j::Config[:storage_path])
end

rm_db_storage! unless Neo4j.running?

$spec_counter = Time.now.to_f

After do |s|
  Neo4j::Rails::Model.close_lucene_connections
  Neo4j::Transaction.run do
    Neo4j::Index::IndexerRegistry.delete_all_indexes
  end
  Neo4j::Transaction.run do
    Neo4j.threadlocal_ref_node = Neo4j::Node.new :name => "ref_#{$spec_counter}"
    $spec_counter += 1
  end
end
> In your test, you could try asserting that the Neo4j threadlocal ref node is the same as what was assigned in the before / after block?
The ref_node is the same before, during and after the tests; it only changes when the block above runs.
> Are your user objects created by the test, or as part of setup?
As part of the test. In Cucumber there is really no setup as such.
> Could you try looking at the database using Neoclipse? You should not see any user nodes attached to the home node.
That is not correct. The setup I showed you deletes the database at the beginning. At the end of each test the ref_node is replaced,
so the data is left in the DB (and I can see it with Neoclipse).
Another thing that is slow is loading the neo4j gem. Currently
everything is loaded (except some JAR files like for HA support).
Would be nice if we instead could use the Ruby autoload feature and
only load modules that are required.
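The autoload idea can be illustrated in plain Ruby (the `Heavy` module and the file it lives in are made up for the example): `autoload` registers a constant but defers the `require` until the constant is first referenced.

```ruby
require 'tmpdir'

Dir.mktmpdir do |dir|
  # Write a module file that records when it is actually loaded.
  path = File.join(dir, "heavy.rb")
  File.write(path, "$loaded = true\nmodule Heavy; VERSION = '1.0'; end\n")

  $loaded = false
  # autoload registers the constant without requiring the file yet.
  autoload :Heavy, path
  raise "loaded too early" if $loaded

  # The first reference to Heavy triggers the require.
  puts Heavy::VERSION  # => "1.0"
  raise "not loaded" unless $loaded
end
```

Applied to the gem, modules like the Rails or HA parts would only be paid for when a spec actually touches them.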
Using ImpermanentGraphDatabase sounds like a good idea. I'm just a bit
worried that the real database and the ImpermanentGraphDatabase
behaves differently and would cause us problems (e.g. bugs only found
using the real database). One possible approach is to let the build
server use the real database and the developer machines use the
ImpermanentGraphDatabase, maybe?
Cheers
Andreas
Guys,
the impermanent gdb is safe to use; the abstraction is way down at the file channel layer and it's used everywhere except the kernel tests.
Cheers
/peter
Sent from a device with crappy keyboard and autocorrect
How big is your datastore?
Better to use ImpermanentGraphDatabase for testing, which no longer uses disk filesystems as of 1.6.M02. Perhaps adding support for that in neo4j.rb would be great!
You can profile JRuby with any Java profiler (like VisualVM or YourKit); they can connect to the running Java process and record timing information for all method calls. (Then you can have a look at the "hotspot" methods that take the most time.)
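A couple of concrete ways to hook a profiler up (a sketch: `--profile.flat` assumes JRuby 1.6+, `jps` and `jvisualvm` ship with the JDK, and the `spec spec/` invocation mirrors the one mentioned elsewhere in this thread):

```shell
# Option 1: JRuby's built-in flat profiler, no GUI needed --
# prints per-method timings when the process exits.
jruby --profile.flat -S spec spec/

# Option 2: attach VisualVM to the already-running JVM.
jps -l        # find the JRuby process id
jvisualvm     # then attach to that pid from the GUI
```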
Regarding slow RSpecs for Neo4j.rb - I've been thinking of splitting
neo4j into several gems.
Since yesterday Neo4j.rb now consists of two gems neo4j and neo4j-jars.
Splitting neo4j.rb into another gem (neo4j-rails) would mean fewer
RSpecs have to be run. (But we can already do that now by just running
bundle exec spec spec/rails.) Also, I know there are some RSpecs that
are very similar and could be removed.
> Looking at your profiling, it seems that not the cleanup takes the time but rather the startup process of your specs?
Yeah, it looks like a lot of time is spent in Bundler loading things.
Doesn't look like neo4j is causing the slowdown at all.
I guess it's just the way JRuby works :(
But still, ~2 tests per second is just way too slow.
Don't know if that is relevant to your case: http://moretea.posterous.com/corey-haines-fast-rails-tests
Otherwise what about setting up jruby/neo4j just once before all tests run and then cleanup in between, so you don't get the startup penalty for each spec?
Cheers
Michael
> Perhaps you'd want to look into the stuff Corey Haines did about speeding up rspec tests?
>
> Don't know if that is relevant to your case: http://moretea.posterous.com/corey-haines-fast-rails-tests
It is possible, but too hard, as Rails has lots of dependencies.
I can consider it, but it would only speed up the model tests; I still need the full Rails stack for controller, view and helper specs.
> Otherwise what about setting up jruby/neo4j just once before all tests run and then cleanup in between, so you don't get the startup penalty for each spec?
That's exactly what I do: the database is set up before all tests start, then all nodes and indexes are removed in between.