
Best way to retrieve thousands of records


jaYPee

Dec 1, 2004, 1:33:47 AM
I am currently using a dataset to access my data from SQL Server 2000.
The dataset contains 3 tables related to each other in a
parent/child/grandchild relationship. My problem is that it's very slow
to retrieve records from SQL Server.

I'm using the SqlDataAdapter.Fill method to populate the dataset.
The parent table contains more than 3,000 records, the child table more
than 10,000 records, and the grandchild table more than 90,000 records
as of now.

I tried to limit the number of records returned by adding criteria to
the select command, but I don't know how to limit the number of records
returned for the 3rd table (grandchild).

I'm using textboxes for the parent table and a datagrid each for the 2nd
and 3rd tables. There is no problem querying the 1st and 2nd tables, but
I don't know how to query the 3rd table. The 3rd table's query first
needs the primary key value of the 2nd table's current row, which is
displayed in a datagrid, but I don't know how to get it.
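
A minimal sketch of reading that key from the bound grid, assuming a
.NET 1.x Windows Forms DataGrid bound through a DataRelation path
"Parent.ParentChild" on a DataSet named ds, with a key column ChildID
(all of these names are hypothetical):

' ds, "Parent.ParentChild" and "ChildID" are hypothetical names.
Dim cm As CurrencyManager = _
    CType(Me.BindingContext(ds, "Parent.ParentChild"), CurrencyManager)
Dim drv As DataRowView = CType(cm.Current, DataRowView)
Dim childId As Integer = CInt(drv("ChildID"))
' childId can now be passed as the parameter of the grandchild query.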

Does anyone know a better way to access this data?

I would love to read some articles about techniques for accessing tables
with relationships down to a 3rd level (grandchild).

thanks in advance.

Frans Bouma [C# MVP]

Dec 1, 2004, 3:48:49 AM
What on earth are you planning to do with 3,000 parent records? No human
on earth can handle that many rows in one go.

I'd redesign your GUI with a more user-friendly system. Your current
system requires a human to browse through 3,000 rows to find the row to
edit, which is pretty harsh / time consuming.

Frans.

--
Get LLBLGen Pro, productive O/R mapping for .NET: http://www.llblgen.com
My .NET Blog : http://weblogs.asp.net/FBouma
Microsoft MVP (C#)

Cor Ligthert

Dec 1, 2004, 5:55:13 AM
jaYPee,

I assume you want "one" SQL string to get the 3 tables:

where from table 1 you get, let's say, one datarow;
from table 2, the rows matching the related key from table 1;
and from table 3, the rows matching the related keys from table 2.

Because I don't like (hate) SQL, I think it is better that somebody else
gives you this answer.

Or you could do it with 3 separate SQL strings, where you then build your
own SQL string in a for loop, with a WHERE ... OR clause for the third
table based on the keys from the second.

http://msdn.microsoft.com/library/default.asp?url=/library/en-us/tsqlref/ts_ca-co_2y2h.asp

Just an idea, not great

Cor

"jaYPee" <hija...@yahoo.com>

Cor Ligthert

Dec 1, 2004, 6:10:03 AM
jayPee,

That new MSDN system no longer creates links that can be copied or set in
favorites.

So I searched on "contains" and then navigated to OR; maybe you can do
that yourself. (Searching on SQL "OR" gives a bit too many hits.)

Cor

"Cor Ligthert" <notmyfi...@planet.nl>

dotnetnewbie

Dec 1, 2004, 7:35:03 AM
If you are having performance problems, the first thing to check is
whether you have indexes on your tables. Also, is your query slow only in
VB.NET, or is it also slow when you run it in T-SQL?

How many records do you feasibly plan on showing in the front-end screen?
Are you planning on retrieving all 90,000 rows from the 3rd table?

I tend to use a DataReader when dealing with large data queries in VB.NET.
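
For completeness, a minimal SqlDataReader sketch; the table and column
names (Child, ParentID, ChildID, ChildName) are assumptions for
illustration. A reader streams rows forward-only instead of buffering
them all in a DataSet:

Imports System.Data
Imports System.Data.SqlClient

Module ReaderSketch
    Sub DumpChildren(ByVal parentId As Integer)
        ' Table/column names here are assumptions, not the real schema.
        Dim cn As New SqlConnection("<your connection string>")
        Dim cmd As New SqlCommand( _
            "SELECT ChildID, ChildName FROM Child WHERE ParentID = @pid", cn)
        cmd.Parameters.Add("@pid", SqlDbType.Int).Value = parentId

        cn.Open()
        Dim dr As SqlDataReader = _
            cmd.ExecuteReader(CommandBehavior.CloseConnection)
        While dr.Read()
            ' One row at a time; nothing is buffered in a DataSet.
            Console.WriteLine(dr.GetInt32(0))
        End While
        dr.Close()  ' CloseConnection also closes cn here
    End Sub
End Module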

Jeff Dillon

Dec 1, 2004, 1:48:54 PM
A 3 table join is easy...

Using the SQL Server Northwind database, this query selects the category
name for all products with a quantity greater than 3 in Order Details
(just a random query):

Select Distinct C.CategoryName
FROM [Order Details] OD
INNER JOIN Products P on P.ProductID = OD.ProductID
INNER JOIN Categories C on C.CategoryID = P.CategoryID
WHERE OD.Quantity > 3


"jaYPee" <hija...@yahoo.com> wrote in message
news:7joqq0lvgb49ave7a...@4ax.com...

Mark Jones

Dec 1, 2004, 3:37:49 PM
If it's not important that you have absolutely correct data,
try using the WITH (NOLOCK) hint in your SQL.
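
As a sketch, here is Jeff's Northwind query from earlier in the thread
with the hint applied to each table; note that NOLOCK permits dirty
reads, which is the "not absolutely correct data" trade-off:

' Jeff's earlier Northwind query with WITH (NOLOCK) on each table.
' NOLOCK reads uncommitted data, so results may be inconsistent.
Dim sql As String = _
    "SELECT DISTINCT C.CategoryName " & _
    "FROM [Order Details] OD WITH (NOLOCK) " & _
    "INNER JOIN Products P WITH (NOLOCK) ON P.ProductID = OD.ProductID " & _
    "INNER JOIN Categories C WITH (NOLOCK) ON C.CategoryID = P.CategoryID " & _
    "WHERE OD.Quantity > 3"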

Also, as someone else mentioned, use Profiler to record some activity and
then feed the profile data into the Query Analyzer Index Tuning Wizard.
These two steps should help to some extent. Of course, reducing the number
of rows you are working with through WHERE clauses is a great way to speed
things up. A general rule of thumb: if your query takes more than a
second, figure out a better way to run it.

GL


"jaYPee" <hija...@yahoo.com> wrote in message
news:7joqq0lvgb49ave7a...@4ax.com...

Sahil Malik

Dec 1, 2004, 10:36:12 PM
I want to second Frans Bouma's reply: don't retrieve that many records in
one go. I understand you should have a chunky, not chatty, interface and
retrieve as much as you can, but that doesn't mean retrieving the lion,
the mouse, their children, and their grandmothers with your data. Again,
there is no hard and fast rule about how much is too much, but 90,000
rows is too much beyond doubt. Even 1,000 is hitting the limits of too
much, IMHO - but anyway.

Have you considered what it will take to databind 90,000 rows to the UI?
I just answered another question on this newsgroup where a nice dude had
100,000 rows in a DataTable, and removing half of them took him
30+ SECONDS (yes, not milliseconds).

Also imagine the kind of memory you are using up. (LOTS).

The beauty of ADO.NET is that you can call SqlDataAdapter.Fill multiple
times on the same dataset. Why don't you fill the third table on demand,
and fill it with fewer rows? I.e., code up another stored proc that
accepts an ID from Table2 as a parameter and returns *only* the rows
matching that ID.
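
A minimal sketch of that on-demand fill, assuming a stored proc named
GetGrandchildByChild with a @ChildID parameter and a DataSet table named
Grandchild (all hypothetical names):

' GetGrandchildByChild, @ChildID and "Grandchild" are assumed names;
' cn is an open SqlConnection, ds the DataSet already holding the
' parent and child tables, childId the key read from the child grid.
Dim cmd As New SqlCommand("GetGrandchildByChild", cn)
cmd.CommandType = CommandType.StoredProcedure
cmd.Parameters.Add("@ChildID", SqlDbType.Int).Value = childId

Dim da As New SqlDataAdapter(cmd)
' Fill appends the matching rows to the existing Grandchild table.
da.Fill(ds, "Grandchild")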

- Sahil Malik
http://dotnetjunkies.com/weblog/sahilmalik


"jaYPee" <hija...@yahoo.com> wrote in message
news:7joqq0lvgb49ave7a...@4ax.com...

Cor Ligthert

Dec 2, 2004, 3:35:05 AM
Sahil,

That is often said too jaYPee in the language.vb newsgroup, I assume that is
the reason he is optimizing now his datasets.

However he is now telling in my opinion that he has 90.000 rows in his
database, any clue to optimize that, because that can be really interesting
when you are able to give that when you do not know what kind of database
table that is and not know if it is already completly optimized.

:-)

Cor


Frans Bouma [C# MVP]

Dec 2, 2004, 5:32:34 AM


As he uses a master - detail - superdetail setup, it depends on which
grid will be the main source of information to the user. I.e.: will he
browse the detail a lot (and thus change the superdetail grid a lot) or
not? These setups are typically well served by a load-on-demand scenario.
This means that when the user changes, for example, the selected index in
the detail grid, the logic tries to determine whether rows are already
loaded for the superdetail grid for that particular detail row. If not,
it fetches them from the db (so initially you start with an empty
superdetail table).

This also avoids having tens of thousands of rows in a DataTable, which
can't deal very well with more than 57,000 rows. It also avoids having to
bind thousands of rows to a grid, as the master -> detail grid can be
solved with the same trick.
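
A rough sketch of that check-then-fetch logic, hung off the detail grid's
CurrentCellChanged event; the control, adapter, relation, and column
names (dgDetail, daSuperDetail, "Parent.ParentChild", ChildID) are all
hypothetical:

' dgDetail, ds, daSuperDetail, "Grandchild" and "ChildID" are
' hypothetical names standing in for the real form members.
Private Sub dgDetail_CurrentCellChanged(ByVal sender As Object, _
        ByVal e As System.EventArgs) Handles dgDetail.CurrentCellChanged
    Dim cm As CurrencyManager = _
        CType(Me.BindingContext(ds, "Parent.ParentChild"), CurrencyManager)
    If cm.Position < 0 Then Return
    Dim childId As Integer = CInt(CType(cm.Current, DataRowView)("ChildID"))

    ' Hit the database only if this detail row's superdetail rows
    ' have not been loaded yet.
    If ds.Tables("Grandchild").Select("ChildID = " & childId).Length = 0 Then
        daSuperDetail.SelectCommand.Parameters("@ChildID").Value = childId
        daSuperDetail.Fill(ds, "Grandchild")
    End If
End Sub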

Cor Ligthert

Dec 2, 2004, 9:48:49 AM
Sahil,

jaYPee is, in my opinion, not talking about a dataset.

He is, in my opinion, talking about a database, from which he wants a
limited dataset that is as small as possible.

Where do you guys read a dataset of 90,000 datarows?

Cor


Sahil Malik

Dec 2, 2004, 9:13:32 AM
Cor,

I agree that the queries have to be optimized, but even if we assume the
best possible case - let's say the dataset actually lives prepared in a
memory-mapped file that you just have to load - there are still the
issues of binding it, browsing it, and the memory it consumes. It is
going to instantiate 103,000 datarows and set up the appropriate
relations. And I don't even want to think of what it takes to bind/browse
103,000 rows appropriately.

No matter how fast the queries run, in my opinion the best solution would
be to reduce the dataset size by changing the UI a little bit. Once you
have that squared away, along with good SQL queries, you'd have a good
practicable solution.

And if he is asking us to optimize a data pull from 90,000 rows, then we
need to see the table structures and SQL statements to comment accurately.

"Cor Ligthert" <notmyfi...@planet.nl> wrote in message
news:%23Hd3amE...@tk2msftngp13.phx.gbl...

jaYPee

Dec 5, 2004, 8:21:21 PM

Sorry for the delayed response; I attended a seminar last week.

Thanks for the reply. You are exactly right. I just want to optimize my
code to make it faster. My scenario is the same as the Microsoft sample
called "VB.NET - Data Access - Build a Master-Detail Windows Form"; I
have built the same concept as that sample. The problem is when there are
so many parent records in the database and many, many more records in the
child tables (1st and 2nd datagrid). So when the form loads, with, say,
3,000 parent records, 10,000 child records, and 90,000 grandchild
records, the load is very slow using the DataAdapter.Fill method. To
solve this I have to pull out only the data matching the criteria the
user needs. But my problem now is how I can filter the two child tables.
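
One sketch of that filtering: join each child table back to the filtered
parent set, so all three Fill calls share the same user criterion. The
table, column, and parameter names below (Parent, Child, Grandchild,
ParentID, ChildID, SomeColumn, @crit) are placeholders, not the actual
schema:

' cn is an open SqlConnection and ds the form's DataSet; all table and
' column names here are placeholders for the real schema.
Dim userValue As Integer = 2004  ' the criterion the user picked (example)
Dim filter As String = "WHERE p.SomeColumn = @crit "

Dim daParent As New SqlDataAdapter( _
    "SELECT p.* FROM Parent p " & filter, cn)
Dim daChild As New SqlDataAdapter( _
    "SELECT c.* FROM Child c " & _
    "INNER JOIN Parent p ON p.ParentID = c.ParentID " & filter, cn)
Dim daGrand As New SqlDataAdapter( _
    "SELECT g.* FROM Grandchild g " & _
    "INNER JOIN Child c ON c.ChildID = g.ChildID " & _
    "INNER JOIN Parent p ON p.ParentID = c.ParentID " & filter, cn)

daParent.SelectCommand.Parameters.Add("@crit", SqlDbType.Int).Value = userValue
daChild.SelectCommand.Parameters.Add("@crit", SqlDbType.Int).Value = userValue
daGrand.SelectCommand.Parameters.Add("@crit", SqlDbType.Int).Value = userValue

daParent.Fill(ds, "Parent")
daChild.Fill(ds, "Child")
daGrand.Fill(ds, "Grandchild")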

thanks again...
