
How can I show data in a DBGrid faster with ADOQuery?

1,747 views

ribbon

Feb 17, 2002, 7:31:12 AM
I have a c/s application with an ADOQuery receiving data from SQL Server and
displaying it in a DBGrid. But the speed is very bad the first time it
connects (about 300,000 records). I think it may be an ADO problem.
How can I improve the speed? And how can I do what Borland's "SQL
Explorer" does (it seems to return 20 records at a time when you click the
scroll bar, and it is so fast!!!)?

Thanks,

ribbon

rib...@x263.net


MS

Feb 17, 2002, 8:57:04 AM
Hi.
Use:
Component TADODataSet or TBetterADODataSet (http://web.orbitel.bg/vassil/)
Property CursorLocation - clUseServer
Property CommandType - cmdTableDirect
Property CursorType - ctStatic or ctKeyset
It's very fast.
Or:
Component TADODataSet or TBetterADODataSet (http://web.orbitel.bg/vassil/)
Property CursorLocation - clUseServer
Property CommandType - cmdText
Property CommandText - insert your SQL statement.
Property CursorType - ctStatic or ctKeyset
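
As a sketch, the first combination above applied at runtime might look like
this (the component and table names here are illustrative assumptions, not
from the post):

```pascal
// Server-side cursor with a direct table open (ADODB unit).
// AdoDataSet1 is assumed to be a TADODataSet already wired to a
// connected TADOConnection; 'Orders' is a hypothetical table name.
AdoDataSet1.CursorLocation := clUseServer;
AdoDataSet1.CommandType := cmdTableDirect;
AdoDataSet1.CommandText := 'Orders';
AdoDataSet1.CursorType := ctKeyset;  // or ctStatic
AdoDataSet1.Open;
```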

M.S


User ribbon <rib...@x263.net> wrote to the newsgroups in
message 3c6fa24f$1_1@dnews...

Vassiliev V.V.

Feb 17, 2002, 10:05:35 AM
For the fastest speed you could check OLEDB Direct at http://www.oledbdirect.com

For ADO, try using a server-side cursor.

Regards,
Vassiliev V.V.
http://www.oledbdirect.com


"ribbon" <rib...@x263.net> wrote in the news message:
news:3c6fa24f$1_1@dnews...

Brian Bushay TeamB

Feb 17, 2002, 11:11:28 PM
>I have a c/s application with an ADOQuery receiving data from SQL Server and
>displaying it in a DBGrid. But the speed is very bad the first time it
>connects (about 300,000 records). I think it may be an ADO problem.
>How can I improve my speed?

If you are using ADOExpress components with client-side cursors, all the records
in your select statement get loaded into memory when you open the TADOQuery.
300,000 records is way too many to do anything practical with. The strategy
you need to adopt is one that returns only the records you need to look at.
The exact way to do this depends on how you use the data. For applications I
create that have more records than you can reasonably scroll through, I have some
kind of an opening dialog that forces the user to enter some kind of selection
criteria before they view any records. Then I query for those records.
If you are using a sequential key you can also do something like open by
fetching the last 100 records entered.
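
A minimal sketch of that last idea, assuming a TADOQuery named qryRecent and
a sequential key column called ID (all names here are illustrative, not from
the post):

```pascal
// Fetch only the 100 most recently entered rows by sequential key.
// Table and column names are assumptions for illustration.
qryRecent.SQL.Text :=
  'SELECT TOP 100 * FROM Orders ORDER BY ID DESC';
qryRecent.Open;
```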


--
Brian Bushay (TeamB)
Bbu...@NMPLS.com

Wayne Brantley

Feb 18, 2002, 12:19:53 AM
Brian,
The only problem with that is you are letting your database
functionality dictate the design. We all do that to some extent, but I hate
to do it just because of some inefficiency of a tool. See my post above
called 'ADO SLOWER than BDE', where I show ADO to be 10x slower than the BDE,
and what this guy is wanting to do will be EXTREMELY fast if he uses the BDE
with SQL Links, or OLEDBDIRECT. It does not matter if the first query returns
1 million records or 10; it will be the same speed.

Wayne

"Brian Bushay TeamB" <BBu...@Nmpls.com> wrote in message
news:36t07uks25o1m5n2o...@4ax.com...

Brian Bushay TeamB

Feb 18, 2002, 11:11:55 PM

>The only problem with that, is you are letting your database
>functionality dictate the design.
You're being foolish if you don't let your tools dictate your application design.
And if you do open a table with a million records, what do you want to do with it?
Browse through it a few records at a time?

Wayne Brantley

Feb 19, 2002, 2:24:49 AM

The point is, when you let the user enter query criteria, you have NO idea
how many records that will return. If their criteria are too general, it
will return a million records. What needs to happen? Certainly not a wait
of 10 minutes to get them... show the first ones, let them browse through and
tighten up the criteria.

The odds of the above happening increase in proportion to the number of rows
in the table!

My entire point is that ADO emulates a cursor engine and brings down ALL
records you request from the SQL source. It does NOT need to do that when
hitting MSSQL, as MSSQL has its own cursor engine that does NOT do that and
will do most everything the ADO one will do. MS just generalized ADO so
that you can hit a text file with it and still use 'cursor-like'
functionality.

'select * from billionrecordtable'

If you execute that with ADO, go on vacation.
If you execute that with DBLIB (native to MSSQL server), it will come back
immediately and have the first record ready for your viewing.
If you execute that with OLEDB (native to MSSQL server), it will come back
immediately and have the first record ready for your viewing.
If you execute that with the BDE, it will come back immediately and have the
first record ready for your viewing.

That is my entire point in a nutshell!

(database functionality <> "using some really slow method to access my
really fast database, so I better design the app like this...")

Wayne

"Brian Bushay TeamB" <BBu...@Nmpls.com> wrote in message

news:nah37u4106ogrfa3v...@4ax.com...

Sunil Furtado

Feb 19, 2002, 8:55:50 AM

A couple of things you could try to make ADO faster:

1. CursorLocation - Server
2. CursorType - OpenForwardOnly
3. ExecuteOptions - ASyncFetch
4. CacheSize - some small number like 20.


If the data needs to be displayed in a grid then with a
ClientDataset,

1. PacketRecords - some small number like 20.

SQL Server also supports the selecting of top n rows, no
matter what the sql statement is or how many rows match the
condition.
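
As a sketch, the settings above combined with a TOP clause might look like
this (qryCustomers, the table, and the parameter are illustrative
assumptions, not from the post):

```pascal
// Forward-only server cursor, async fetch, small cache, and a
// TOP clause so only a handful of rows cross the wire.
qryCustomers.CursorLocation := clUseServer;
qryCustomers.CursorType := ctOpenForwardOnly;
qryCustomers.ExecuteOptions := [eoAsyncFetch];
qryCustomers.CacheSize := 20;
qryCustomers.SQL.Text :=
  'SELECT TOP 20 * FROM Customers WHERE Name LIKE :Name';
qryCustomers.Parameters.ParamByName('Name').Value := 'Smith%';
qryCustomers.Open;
```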


Server-side cursors have their own overheads and are not
among my favourites. I strongly believe in controlling the
number of rows that are fetched over the wire.


Sunil

DRS

Feb 19, 2002, 10:00:35 AM
"Wayne Brantley" <Wa...@aprompt.com> wrote in message
news:3c71fd49$1_2@dnews...

[...]

| My entire point is the ADO emulates a cursor engine and brings down ALL

There's no emulation about it. ADO uses the ADO Cursor Engine (ACE) to
implement client-side cursors.

| records you request from the SQL source. It does NOT need to do that when
| hitting MSSQL as MSSQL has its own cursor engine that does NOT do that and
| will do most everything the ADO one will do. MS just generalized ADO so
| that you can hit a text file with it and still use 'cursor like'
| functionality.

If you think it's preferable to use SQL Server's cursor engine, then I for
one will never hire you. Server-side cursors scale HORRIBLY, especially on
RDBMSs like SQL Server, and I get sick of saying it.

--

Quidquid latine dictum sit, altum viditur.
#319


ozbear

Feb 19, 2002, 5:16:38 PM
On Mon, 18 Feb 2002 22:11:55 -0600, Brian Bushay TeamB
<BBu...@Nmpls.com> wrote:

>
>>The only problem with that, is you are letting your database
>>functionality dictate the design.
>You're being foolish if you don't let your tools dictate your application design.
>And if you do open a table with a million records what do you want to do with it
>browse through it a few records at a time?
>

I am in somewhat the same boat as Wayne on this, using ADO against
MSSQL 7 with a TClientDataSet attached to a TADOQuery.

Exactly what -is- one to do when the user-specified query criteria
results in a million record result set? That in itself is no problem
as long as they don't get squirted down the wire to my client app
but are only fetched as required, either grid's-worth at a time
or a form's-worth at a time.

But they do get squirted down the line and performance is terrible
as a result.

I've tried the "usual" suggestions of using server-side cursors and
the results are worse than with client side ones. I am unsure why
this is so, or why people recommend it when it is worse.

While I am not doing a web browser I'd prefer it if my query results
acted like Google...even if you enter horrible search criteria like
"cat" you don't get a billion results delivered back to your browser.
How does one do the equivalent with ADO + MSSQL 7 + Delphi against
a database server on a LAN?

Regards, Oz

Brian Bushay TeamB

Feb 19, 2002, 10:48:05 PM

>Exactly what -is- one to do when the user-specified query criteria
>results in a million record result set?
You use the MaxRecords property to set the maximum number of records to fetch.
Then you check the record count, and if it is equal to MaxRecords you give
the user a message indicating they chose poor selection criteria and it exceeded
the maximum fetch allowed.
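
A minimal sketch of that check (the dataset name, cap, and message are
illustrative assumptions, not from the post):

```pascal
const
  MaxFetch = 500;  // assumed cap for illustration
begin
  // Limit the fetch, then warn if the cap was hit.
  qryResults.MaxRecords := MaxFetch;
  qryResults.Open;
  if qryResults.RecordCount >= MaxFetch then
    ShowMessage('Your criteria matched too many records; ' +
                'please narrow the search.');
end;
```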

Brian Bushay TeamB

Feb 19, 2002, 10:48:05 PM

>'select * from billionrecordtable'
Your response leaves out a few points.

>If you execute that with ADO, go on vacation.
There is a MaxRecords property that you can set to limit the records returned.

>If you execute that with DBLIB (native to MSSQL server), it will come back
>immediatley and have the first record ready for your viewing.

Unfortunately, Microsoft isn't revving DB-Library any more.

>If you execute that with OLEDB(native to MSSQL server), it will come back
>immediatley and have the first record ready for your viewing.

I believe you get this same response with ADO when you set AsyncFetch to true.

>If you execute that with BDE, it will come back immediatley and have the
>first record ready for your viewing.

The BDE uses DB-Library, so it isn't a good option.

Wayne Brantley

Feb 19, 2002, 10:53:50 PM
You can do it:
1) Use the BDE
2) Use OLEDB yourself
3) Use www.oledbdirect.com or a similar product.

I wanted badly to use ADO, but now my app is a mix of BDE and ADO. I have to
use the BDE when the result set might be large, so it will be fast. If a
report is running or other such nonsense, I can use ADO; wait now or wait
later....

good luck.

Wayne

"ozbear" <ozb...@bigpond.com> wrote in message
news:3c72c950...@newsgroups.borland.com...

Wayne Brantley

Feb 19, 2002, 10:51:02 PM
Emulation in this context means it provides a cursor whether one exists or not.
So, in SQL Server's case the cursor does exist, but ADO does not use it.

As far as hiring me goes, it would be the other way around....me not hiring
you! :-)
Really though, if you are interested I can back up my statements with proof
and show you they scale! Are you telling me that if I use OLEDB directly
the app will not scale? Of course it will, and that is what I am saying:
use it directly and it SMOKES. Try oledbdirect if you do not believe me.
Anyway, here, there, whatever; I am not much of one for ranting and raving on
newsgroups. I was just hoping in reality that someone had an
implementation/option on ADO that would let it bypass its ACE and therefore
speed up 10x.

Wayne

"DRS" <d...@removethis.ihug.com.au> wrote in message news:3c72685a_1@dnews...

DRS

Feb 19, 2002, 11:05:11 PM
"ozbear" <ozb...@bigpond.com> wrote in message
news:3c72c950...@newsgroups.borland.com...
| On Mon, 18 Feb 2002 22:11:55 -0600, Brian Bushay TeamB
| <BBu...@Nmpls.com> wrote:
|
| >
| >>The only problem with that, is you are letting your database
| >>functionality dictate the design.
| >You're being foolish if you don't let your tools dictate your application design.
| >And if you do open a table with a million records what do you want to do with it
| >browse through it a few records at a time?
| >
| I am in somewhat the same boat as Wayne on this, using ADO against
| MSSQL 7 with a TClientDataSet attached to a TADOQuery.
|
| Exactly what -is- one to do when the user-specified query criteria
| results in a million record result set? That in itself is no problem
| as long as they don't get squirted down the wire to my client app
| but are only fetched as required, either grid's-worth at a time
| or a form's-worth at a time.

It becomes a problem as more and more users generate giant
recordsets that SQL Server is expected to manage.

| But they do get squirted down the line and performance is terrible
| as a result.

Performance is worse with server-side cursors and multiple users. SQL
Server is optimized to answer queries and return the results, then move on to
the next query.

| I've tried the "usual" suggestions of using server-side cursors and
| the results are worse than with client side ones. I am unsure why
| this is so, or why people recommend it when it is worse.

SQL Server is optimized to store data and efficiently satisfy queries made
on it. It is not optimized to store and manipulate the results of those
queries, which is what it is doing with server-side cursors. It must also
maintain a live connection for each recordset, it incurs network round-trips
each time your code or your user decides to scroll through the recordset,
and it cannot use technologies like MTS with server-side cursors. There are
almost never sound reasons to use server-side cursors with SQL Server.

| While I am not doing a web browser I'd prefer it if my query results
| acted like Google...even if you enter horrible search criteria like
| "cat" you don't get a billion results delivered back to your browser.
| How does one do the equivalent with ADO + MSSQL 7 + Delphi against
| a database server on a LAN?

You can't. You have to use intelligent mechanisms to try to ensure
intelligent queries are made in the first place. More often than not,
million-record recordsets are the result of poor planning, which is not
ADO's fault.

DRS

Feb 19, 2002, 11:07:26 PM
"Wayne Brantley" <Wa...@aprompt.com> wrote in message
news:3c731cae_2@dnews...

| Emulation in this context means it provides a cursor if one exists or not.
| So, in SQL server's case the cursor does exist, but ADO does not use it.
|
| As far as hiring me goes, it would be the other way around....me not hiring
| you! :-)
| Really though, if you are interested I can backup my statements with proof
| and show you they scale!

If you'd stop top-posting you might notice that I explicitly said server-side
cursors do not scale well. They scale horribly. If you have empirical
proof otherwise, then there is a world of experienced data administrators
waiting to be surprised.

Kevin Frevert

Feb 21, 2002, 5:24:33 PM
Agreed. There are smarter mechanisms (views, stored procedures,
user-defined functions,... the WHERE clause? :) to search large tables. If
you're opening a million-record table and expecting the client db engine to do
the "right" thing, then good luck. If I wrote apps like that, I probably
wouldn't have a job if I tried..
"uh..it's not my fault 'Select * from BigTable' takes 20 minutes to open and
locks the table, it's a ...uh..um...Borland's fault, yea...that's it.
Borland should be smart enough to know all I really wanted was the first 10
records. Where is my lawyer!"

My apps are far from perfect, but (knocking on wood) I've never locked down
tables, purposely caused deadlocks, or had users waiting more than 5
seconds to open a query (tables with up to 3 million+ rows). If I notice
performance issues, I first look at the SQL and what the server is doing,
not the BDE or ADO.

krf


Brian Bushay TeamB <BBu...@Nmpls.com> wrote in message

news:nah37u4106ogrfa3v...@4ax.com...

Kevin Frevert

Feb 21, 2002, 5:54:11 PM

Just curious, what business are you in?

krf

Wayne Brantley

Feb 21, 2002, 11:03:04 PM
Insurance....

"Kevin Frevert" <kfre...@midwayusa.com> wrote in message
news:3c755eb8$1_2@dnews...

Kevin Frevert

Feb 22, 2002, 10:40:07 AM
Thanks. It's difficult to understand why developers go down a particular
road to solve a problem without knowing the problem.
For our company, time is money (http://www.midwayusa.com). I have a
financial obligation to make users productive, and scrolling a dbgrid to find
customer 123, for us, is not productive.
Our users are probably a lot different than yours. I would guess insurance
agents understand that if they turn off the power to their PC while updating
data, there is nothing your program can do to stop them. We once had a
request to update an application so it wouldn't let the user turn off their
PC (a user lost her changes and was very upset with MIS because the system
"let" her turn the PC off). Users are still users, but the chain went to
her supervisor (he OK'ed it), his manager, my boss, and to the VP of MIS (who
has an MCSE), who asked me if I could fix this "bug".
The conversation went something like this:
VP: "Kevin, how long do you think it would take you to fix this?"
Me: "uh..Sandy reaches down and turns the power off on her PC; what exactly
do you think my program can do to stop her?"
VP: "Oh, I guess you're right."

krf

Wayne Brantley <Wa...@aprompt.com> wrote in message

news:3c75c27f$1_2@dnews...

Wayne Brantley

Feb 22, 2002, 2:09:54 PM
That is freakin ridiculous and funny!!!

Well, one option I thought of, short of a redesign or short of
staying with the BDE, was to try to provide a MaxRecords limit. So I would present
the grid, but limit it to something like 20 records.....if they want to see other
records, they enter filter conditions.

I have not done an official speed test on that yet, but I know that when I set
the MaxRecords property before, it went much faster.

I wonder what the speed is where I have an InfoPower drop-down search-as-you-type
combo box, where the list they are searching from contains a few
thousand records.......I wonder if limiting MaxRecords on that control
would let it behave properly? Assuming they actually typed an item from the
list instead of hitting 'B' and then scrolling through the B's, I suppose
the MaxRecords would work? Any experience with that? (It works great with the BDE,
of course.)

Also, I wonder what Borland's implementation of dbExpress with MSSQL support
will be like. That could be a good alternative as well, assuming they develop
the driver.

Wayne

"Kevin Frevert" <kfre...@midwayusa.com> wrote in message

news:3c764a7e$1_1@dnews...
