Stack Overflow


Lukasz Wasylow

May 1, 2014, 7:16:50 AM
to db...@googlegroups.com
Hi,
I got an error when querying the database to compare large result sets. The query returns approx. 8k rows from the database dictionary.

The error I'm getting when running it is:

java.lang.StackOverflowError
	at sun.nio.cs.UTF_8.updatePositions(Unknown Source)
	at sun.nio.cs.UTF_8$Encoder.encodeArrayLoop(Unknown Source)
	at sun.nio.cs.UTF_8$Encoder.encodeLoop(Unknown Source)
	at java.nio.charset.CharsetEncoder.encode(Unknown Source)
	at sun.nio.cs.StreamEncoder.implWrite(Unknown Source)
	at sun.nio.cs.StreamEncoder.write(Unknown Source)
	at sun.nio.cs.StreamEncoder.write(Unknown Source)
	at java.io.OutputStreamWriter.write(Unknown Source)
	at java.io.PrintWriter.write(Unknown Source)
	at java.io.PrintWriter.write(Unknown Source)
	at java.io.PrintWriter.print(Unknown Source)
	at fit.Parse.print(Parse.java:203)


The same query runs fine with a ROWNUM < 4000 restriction, or against another schema owner.
Is there a limit on the number of records a query can return?

Yavor Nikolov

May 1, 2014, 7:50:35 AM
to db...@googlegroups.com
What exactly does your test look like? (Is it Query, CompareStoredQueries, etc.?) The error above is coming from Fit (https://github.com/unclebob/fitnesse/blob/master/src/fit/Parse.java#L202), but it's not clear how we reached it.

Why do you want to test with so many records?


Yavor



--
You received this message because you are subscribed to the Google Groups "dbfit" group.
To unsubscribe from this group and stop receiving emails from it, send an email to dbfit+un...@googlegroups.com.
To post to this group, send email to db...@googlegroups.com.
Visit this group at http://groups.google.com/group/dbfit.
For more options, visit https://groups.google.com/d/optout.

Lukasz Wasylow

May 1, 2014, 3:08:18 PM
to db...@googlegroups.com
Hi,
It's a query that pulls data from the DB dictionary of a given test environment and compares it with predefined data (a prod extract of the DB dictionary).
It so happens that for one schema there are over 8k records in dba_tab_columns.

Thanks,
I will try increasing the stack on my machine.
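For anyone following along, the thread stack size is a per-thread JVM setting (`-Xss`), separate from the heap (`-Xmx`). A sketch of a launch command with both raised; the jar name and port here are assumptions about a typical standalone FitNesse setup, not taken from this thread:

```shell
# Hypothetical FitNesse launch with a larger thread stack and heap.
# Default stacks are often 512k-1m; 4m leaves room for deep recursion
# such as Fit's recursive Parse.print calls over large tables.
HEAP="2g"
STACK="4m"
CMD="java -Xmx${HEAP} -Xss${STACK} -jar fitnesse-standalone.jar -p 8080"
echo "$CMD"
```

The same flags can be added to whatever script or wrapper actually starts the FitNesse/DbFit process.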

Yavor Nikolov

May 1, 2014, 3:42:05 PM
to db...@googlegroups.com
So your expected data is in a test page (and compared with Query)? Or do you get it directly from the database (and compare with CompareStoredQueries)?
Some people have reported using CompareStoredQueries on data sets larger than 8K.

I don't know all the details of your case, but in general DbFit tests are meant to work on small data sets. Maybe you can focus on just a few representative columns in a test page.
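Another way to keep the test page small is to compare one aggregate row per table rather than every dictionary row. A sketch using DbFit's query fixture against dba_tab_columns; the expected row (TABLEA, 42) is an illustrative placeholder, not real data:

```
|Set Parameter|schema_name|''SCHEMA_NAME''|
!|query|!-
select table_name, count(*) as column_count
from dba_tab_columns
where owner = :schema_name
group by table_name
order by table_name
-!|
|TABLE_NAME|?COLUMN_COUNT|
|TABLEA|42|
```

This reduces an 8k-row comparison to one row per table; a count mismatch then points at which table to inspect in detail.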

Yavor

Lukasz Wasylow

May 4, 2014, 11:22:51 AM
to db...@googlegroups.com
Hi,
My expected data is in the test page and compared with Query.
Generally I want to compare different environments with my build database.
When a release is being prepared and tested on the build machine, I use Ant and FreeMarker to generate a DbFit folder structure and test pages from previously defined tests.
So far there are only schema-validation tests, e.g. AllObjects lists, IndexList, ColumnList, all split by schema.

After a successful build, when the release is pushed to another environment, I want to run the DbFit tests against that environment and verify that what's in its database matches what's in the build database, and if not, where the differences are. Note that the build databases are shared across different component builds, so I cannot use them as a source, as they get nuked and rebuilt.

The issue is that some tests bring back around 8K records, and the query fails with a StackOverflowError.
On an older DbFit version I noticed that the page got a stack error even when just loading a few thousand records, before I even started running the test (but that might be down to the version; I will check at work on Monday).

I will try increasing the machine's memory settings and check whether that works.

Lukasz Wasylow

May 5, 2014, 4:47:32 AM
to db...@googlegroups.com
The test I used is:
!*> Assertion: Compares columns in given schema.

!path lib/*.jar
!|dbfit.OracleTest|

!|Connect|host:port|user|password|dbname|


|Set Parameter|schema_name|''SCHEMA_NAME''|
!|query|!-
select table_name,column_name,data_type,data_length,data_precision,data_scale,nullable from dba_tab_columns where  owner=:schema_name ORDER BY table_name, column_id
-!|
|TABLE_NAME|COLUMN_NAME|?DATA_TYPE|?DATA_LENGTH|?DATA_PRECISION|?DATA_SCALE|?NULLABLE|
|TABLEA|COLUMN1|NUMBER|22|10|0|N|

and there are around 8K rows of data on the page; the query above returns 8000 records.


Also, I used the latest version of DbFit, and the error is:
Exception in thread "main" java.lang.OutOfMemoryError: Java heap space
	at java.util.Arrays.copyOfRange(Unknown Source)
	at java.lang.String.<init>(Unknown Source)
	at java.lang.String.toLowerCase(Unknown Source)
	at java.lang.String.toLowerCase(Unknown Source)
	at fit.Parse.<init>(Parse.java:45)
	at fit.Parse.<init>(Parse.java:74)
 

What's interesting is that if I remove all data from the page except one row, the error is a bit different:
 Exception in thread "main" java.lang.StackOverflowError
	at sun.nio.cs.UTF_8.updatePositions(Unknown Source)
	at sun.nio.cs.UTF_8$Encoder.encodeArrayLoop(Unknown Source)
	at sun.nio.cs.UTF_8$Encoder.encodeLoop(Unknown Source)
	at java.nio.charset.CharsetEncoder.encode(Unknown Source)
	at sun.nio.cs.StreamEncoder.implWrite(Unknown Source)
	at sun.nio.cs.StreamEncoder.write(Unknown Source)
	at sun.nio.cs.StreamEncoder.write(Unknown Source)
	at java.io.OutputStreamWriter.write(Unknown Source)
	at java.io.PrintWriter.write(Unknown Source)
	at java.io.PrintWriter.write(Unknown Source)
	at java.io.PrintWriter.print(Unknown Source)
	at fit.Parse.print(Parse.java:203)
	at fit.Parse.print(Parse.java:212)
and so on.

I have the heap size at 2G now.

It seems to work better with CompareStoredQueries, but I don't really want to create objects in the other environment or set up a separate DB for DbFit, as that would defeat the purpose of portable tests :(

Yavor Nikolov

May 5, 2014, 8:59:56 AM
to db...@googlegroups.com
Hi,

The error is raised by the underlying framework (FitNesse/Fit): it was designed to work with small test tables and is not optimized for handling large ones. If you want to do that anyway, you simply need more memory. More likely, though, this indicates a problem with your test design, and you should work on making the tests smaller.



> I don't really want to create objects in the other environment or set up a separate DB for DbFit, as that would defeat the purpose of portable tests :(
You could implement seeding that test data into your database (it could go in a separate schema in the same database), e.g. in a similar way to how you deploy your application schema/data, or in some other way.
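As one possible shape for that seeding, the dictionary snapshot could be dumped from the build database into a plain table that ships with the release. A sketch; the dbfit_expected schema and table name are hypothetical:

```sql
-- Hypothetical seeding step: snapshot the build database's dictionary
-- into an ordinary table that is deployed alongside the release.
create table dbfit_expected.expected_columns as
select table_name, column_name, data_type, data_length,
       data_precision, data_scale, nullable
  from dba_tab_columns
 where owner = 'SCHEMA_NAME';
```

In the target environment, CompareStoredQueries could then match dba_tab_columns against dbfit_expected.expected_columns directly, with no rows stored in the test page at all.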

Regards,
Yavor

Lukasz Wasylow

May 5, 2014, 9:43:55 AM
to db...@googlegroups.com
Hi,

Thanks for the help, Yavor; I'm coming to the conclusion that you are right.
I've tried using CompareStoredQueries with Oracle and HSQLDB but hit the same issues at higher volumes.

I will think about direct DB queries then.

As for the tests, I will indeed rethink them.

Thanks again :)