
Why is ADO So Slow Inserting Records?


Preston Crawford

unread,
Dec 8, 1999, 3:00:00 AM12/8/99
to
I have a program I've written in VB using ADO. Basically I'm running a
series of queries against an Access table. I then loop through that
resulting recordset and populate a different database (two tables) with the
results I get. So basically I'm running an ADO insert for each record in the
database I'm drawing from. I've heard that Batch updates are faster. Is this
true? The amount of inserts I'm working with is 40,000 and it's taking about
an hour. Does that sound normal?

Preston

SillyRabbit

unread,
Dec 8, 1999, 3:00:00 AM12/8/99
to
There is nothing at all slow about ADO.....Access on the other hand is
another matter.

"Preston Crawford" <aspx...@micron.net> wrote in message
news:e0PYW8bQ$GA.261@cppssbbsa04...

Preston Crawford

unread,
Dec 8, 1999, 3:00:00 AM12/8/99
to
Agree totally. I don't prefer it, but it seems like there should be some way
to get this update to work faster.

Preston


"SillyRabbit" <car...@rabbits.com> wrote in message
news:euc5NEdQ$GA....@cppssbbsa02.microsoft.com...

Clint LaFever

unread,
Dec 8, 1999, 3:00:00 AM12/8/99
to
Well, even though ADO works with the Jet engine and is the most popular way
right now for data access, it is not recommended to run against Jet. If you
revert back to DAO you will see a major increase. The reason is that ADO is
really talking to Jet (in Access), then Jet does the work and hands it back
to ADO, whereas if you use DAO you are using Jet all the way, with no middle
man. Access 2000 does allow you to turn off Jet, but even so, I would never
use Access unless it was a stand-alone desktop application. I prefer MSDE or
SQL Server.

Preston Crawford wrote in message ...

Preston Crawford

unread,
Dec 8, 1999, 3:00:00 AM12/8/99
to
Agree totally again. I think Access is crap myself, but unfortunately here
is the scenario. We have a customer with an important database that we are
supposed to put on the web in Access. They produced the snapshots of data
they wanted us to publish in Directory Expert, which produces huge queries
that look like the following.....

----------------------------------------

SELECT DISTINCT Last_Name_Facility_Name_Grp_Name+IIF(First_and_Middle_Name
is not null, ', '+First_and_Middle_Name,'')+IIF(Suffix is not null, ',
'+Suffix,'')+IIF(Degree is not null, ', '+Degree,'') AS [Full_Name],
Std_City+', '+Std_State+' '+Zip_Code AS CSZ, SPECIALTY.[Specialty Text] AS
[Specialty Text], IIF(SPECIALTY IN('26','71','92','93','BI','NP'),'X') AS

----------------------------------------

That's just an example, and even then a very small portion of what you are
looking at. So basically it produces some absolute trash in the name of
making Access easy to use. What I've decided to do, rather than wade through
this logic, is build a VB app that takes these queries and uses them to spin
through their database and spit out an actual relational database structure
that can be used on the web. This sounds very sane and sensible and really
works very well; it's just a little slower than I'd like. I can't just batch
copy over the data and then create the relations, because many of the fields
are created by Directory Expert and are built in SQL using the AS statement
(i.e. SELECT A+B+C+D+E AS Name). So basically looping through each query and
producing this is what I'm stuck with, I believe. I'd just like to know if
there are some tricks beyond this for optimizing ADO. I would try DAO, but I
started learning DB programming on ADO and haven't been in the business long
enough to have experience with DAO, so I'm not sure if I can do the exact
same things I'm trying to do here.

Preston

"Clint LaFever" <lafe...@lycosemail.com> wrote in message
news:1172...@NEWS.SAIC.COM...

Gregj

unread,
Dec 8, 1999, 3:00:00 AM12/8/99
to
You can wrap your inserts inside transactions (100 per transaction or
something to that effect) to speed things up some.
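As a rough sketch of that transaction-batching idea: assume an open ADO
Connection named cn, a source recordset rs, and a hypothetical helper
BuildInsertSQL that builds one INSERT statement per source record (none of
these names are from the actual code under discussion):

```vb
Dim n As Long
cn.BeginTrans                      ' start the first transaction
Do While Not rs.EOF
    cn.Execute BuildInsertSQL(rs)  ' one INSERT per source record
    n = n + 1
    If n Mod 100 = 0 Then          ' commit every 100 inserts...
        cn.CommitTrans
        cn.BeginTrans              ' ...and start a fresh transaction
    End If
    rs.MoveNext
Loop
cn.CommitTrans                     ' commit the final partial batch
```

Jet then flushes to disk once per committed batch instead of once per
insert, which is where the speedup would come from.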

Or you might be able to devise a way to do a Select Into statement and
select directly into the Access Table Source....

UpdateBatch is only used with disconnected recordsets, so I am not sure that
you are going to benefit from it here.


Cheers

--
Gregory A Jackson MCSD, MCT
Senior Software Engineer
STEP Technology
PDX, OR


Preston Crawford

unread,
Dec 8, 1999, 3:00:00 AM12/8/99
to
The problem is that there is essentially no schema. It's a database created
by Directory Expert, which is used primarily for publishing printed
directories. It's basically totally flat. So what I'm attempting to do as
part of my operation is break it out into some extra tables. It's also
really whack because the queries Directory Expert produces (on which I am
forced to base my queries) are very ugly and have many places where fields
are haphazardly mashed together into new fields through AS statements (i.e.
SELECT table.a + table.b AS Name).

It's basically a mess I've managed to straighten out. Since many of the
fields exist only via the AS statements I can't do a bulk copy or anything
like that. I've seen a couple ideas here that have merit, though and might
help.

Preston

"Hugo de Vreugd" <Hugo.de...@Xachmea.nl> wrote in message
news:MPG.12b96a472e1c68d298968a@ZKResearch01...
> In article <e0PYW8bQ$GA.261@cppssbbsa04>, aspx...@micron.net says...


> > I have a program I've written in VB using ADO. Basically I'm running a
> > series of queries against an Access table. I then loop through that
> > resulting recordset and populate a different database (two tables) with the
> > results I get. So basically I'm running an ADO insert for each record in the
> > database I'm drawing from. I've heard that Batch updates are faster. Is this
> > true? The amount of inserts I'm working with is 40,000 and it's taking about
> > an hour. Does that sound normal?
> >
> > Preston

> I don't know the schema of the target table. But you need to consider
> indexes. Many inserts to tables with a lot of indexes can be a lot
> quicker when you remove the indexes and afterwards create them again.
>
> Hugo.
> --
>
> For e-mail remove X's.

Vincent Minden

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
A much more efficient way to do what you are trying to do is to rely on the
Jet engine. Instead of doing a select to get a recordset and then using ADO
to get and copy each individual record, use the Jet SQL to your advantage.

Use either the INSERT..INTO or SELECT..INTO commands.
INSERT..INTO if the table is already created or
SELECT..INTO if you need to create the table.

A simple example

INSERT INTO othertable(field1,field2,field3) IN myotherdb.mdb
SELECT field1,field2,field3 FROM mytable WHERE . . .

The complete details are in the docs - I looked them up under the DAO SDK, but the SQL is for the Jet engine, having nothing to do
with either DAO or ADO. Just create the SQL and use, say, the connection's Execute method. Just note that this is specific to the
Jet engine.

Vincent Minden


Preston Crawford <aspx...@micron.net> wrote in message news:e0PYW8bQ$GA.261@cppssbbsa04...

Hugo de Vreugd

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
In article <e0PYW8bQ$GA.261@cppssbbsa04>, aspx...@micron.net says...
> I have a program I've written in VB using ADO. Basically I'm running a
> series of queries against an Access table. I then loop through that
> resulting recordset and populate a different database (two tables) with the
> results I get. So basically I'm running an ADO insert for each record in the
> database I'm drawing from. I've heard that Batch updates are faster. Is this
> true? The amount of inserts I'm working with is 40,000 and it's taking about
> an hour. Does that sound normal?
>
> Preston
>
>
>

Clint LaFever

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
Access does not support transactions.

http://msdn.microsoft.com/library/backgrnd/html/msdeforvs.htm for more
information.


Gregj wrote in message ...

Gregj

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
Jet does support transactions.

It does NOT Support "Transaction Logging"

Michel Walsh

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
Hi,


You are absolutely right, Access is an application (like VC++, VB, etc).
Jet, on the other hand, does do transactions, and does it very well.


Vanderghast, Access MVP.

Clint LaFever <lafe...@lycosemail.com> wrote in message
news:1172...@NEWS.SAIC.COM...

Michel Walsh

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
Hi,

If I understand the problem right, you have, on one hand, a crap query in
Access (you probably mean Jet) and, on the other hand, a nice loop in VB,
and you wish to make the loop in VB faster... isn't it the case that the
crap query in Access IS faster than your nice VB loop?

Furthermore, you say the crap Jet query produces complete trash, while the
very sane and sensible approach (looping in VB) is "just a little bit
slower" than you'd like? And the result is what, the same? So it is still
complete trash. Logical, isn't it?

You may try to describe the problem better, rationally. Since you stay
general, I can only give general guidelines. A query is generally ten times
faster than a loop in VB... and the worst scenario of all is to use a
run-time-defined query inside a VB loop. Try using ONE query and removing
your loops. And if Jet is a piece of scrap, per your definition, calling
rst.Update 100 times just uses that piece of scrap 100 times! So be
consistent with yourself, and use it only once!


Thanks, at least, for reading the message, even if you will probably choose
to ignore it,
Vanderghast, Access MVP.


Preston Crawford <aspx...@micron.net> wrote in message

news:us37cweQ$GA.253@cppssbbsa04...

> "Clint LaFever" <lafe...@lycosemail.com> wrote in message
> news:1172...@NEWS.SAIC.COM...

> > >> "Preston Crawford" <aspx...@micron.net> wrote in message
> > >> news:e0PYW8bQ$GA.261@cppssbbsa04...

Preston Crawford

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
"Michel Walsh" <Vande...@email.msn.com> wrote in message

> If I understand the problem right, you have, on one hand, a crap query in
> Access (you probably mean Jet) and, on the other hand, a nice looping in VB

No, it's a query that Directory Expert produces to run against an Access
database. The logic is very hokey, as you could see in my example. So to
make this extract tool (which is all it is, used to extract data from their
database for use on the web) extensible, and to eliminate a human being
having to recreate in normal logic the queries Directory Expert produced, I
decided to write this app that simply uses these queries to get recordsets
from the data and drop the records in a new database.

> You may try to describe better the problem, rationally. Since you stay

Agreed.

Here's a better description. Basically Directory Expert produces 51 of these
long random queries that need to be run against the database to produce the
dataset that represents the information our client wants conveyed. These 51
queries are all similar; they just represent different single-field
properties for each person in the database.

So an example would be, let's say the database is a database of cars. Each
of the 51 queries essentially produces the same cars, but whether a car
shows up in a given query determines whether the car is sold at that
dealership. So query 4 could be Tom's Ford in Phoenix. The same cars could
show up there as in query 5, just for a different dealership. The problem,
though, is that I have no idea (without digging into the Directory Expert
queries one by one each time an update is done) how Directory Expert puts
together the fields. So producing (Cars.Name) in query 4 could involve a
different AS statement than producing (Cars.Name) in query 5. That's my
dilemma. To make it fairly transparent and easy to use from a BUSINESS
standpoint, I have to go with what Directory Expert produces and loop
through the queries it makes.

My options thus are to (A) talk to them and get them to give me the logic
behind the queries in English so I can write my own against their database,
or (B) accept what Directory Expert produces and work within the confines of
it. Unfortunately, said company doesn't remember why they did what they did
with Directory Expert, and there is no one there who really has a handle on
that end of the process. So rather than fight them to get the real-world
schema and then write my own queries (which I want to do), I chose (B),
which is to write a tool that uses their existing queries against the
database to produce a data set.

So I guess the first thing to explain is that this is more of a business
process problem on their end than it is a programming problem.

But now that I have these queries and have to work with them, this has been
my approach. When the VB app loads, I load these queries into an array of
strings holding the queries that Directory Expert created. Then I loop
through this array and for each query do the following.

I run the query against the old database, thus producing the data view they
wanted for that particular query. I then loop through this resulting
recordset and drop the record in the new table in the new database. So in my
car example I would loop through and check to see if the car exists. If the
car doesn't exist in the database I copy the record over then make the link
table connections to the table for the dealerships. If the car DOES exist I
simply make another link in the link table for THAT dealership and move to
the next one.

So that's my problem. I'm not sure how to optimize this. It seems to me that
the bottleneck is the fact that for each of the 51 queries I do a SELECT
that basically gets most of the records (cars).

> Thanks, at least, to read the message, even if you will probably chose to
> ignore it,
> Vanderghast, Access MVP.

Call me crazy, but you may have a bias. I'm not saying VB is golden, but
I've done similar things against SQL Server and I would swear it wasn't this
slow. No bias here. I'm honestly asking for help.

Preston

Preston Crawford

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
I'm looking into that. How do I create an object that is a direct connection
to the database through Jet? Where is that info found?

Preston

"Vincent Minden" <v-mi...@instant.co.jp.nospam> wrote in message
news:OIp$p0eQ$GA.272@cppssbbsa04...


> A much more efficient way to do what you are trying to do is to rely on
> the Jet engine. Instead of doing a select to get a recordset and then
> using ADO to get and copy each individual record, use the Jet SQL to your
> advantage.
>
> Use either the INSERT..INTO or SELECT..INTO commands.
> INSERT..INTO if the table is already created or
> SELECT..INTO if you need to create the table.
>
> A simple example
>
> INSERT INTO othertable(field1,field2,field3) IN myotherdb.mdb
> SELECT field1,field2,field3 FROM mytable WHERE . . .
>
> The complete details are in the docs - I looked them up under the DAO SDK,
> but the SQL is for the Jet engine, having nothing to do with either DAO or
> ADO. Just create the SQL and use, say, the connection's Execute method.
> Just note that this is specific to the Jet engine.
>
> Vincent Minden
>
>

Preston Crawford

unread,
Dec 9, 1999, 3:00:00 AM12/9/99
to
Oh. Okay. I know how to use the ADO Connection object, obviously, since what
I originally wrote was in ADO. :-)

So basically I'll say conn.Execute(strQuery) and that will do it?

I thought that was what I was looking at, but I wasn't sure, since I didn't
know if using an ADO Connection object actually circumvented ADO like you
talked about.

Thanks,

Preston


"Vincent Minden" <v-mi...@instant.co.jp.nospam> wrote in message

news:uuCe2KqQ$GA.272@cppssbbsa04...
> Since you are using ADO, use the ADO Connection object. It has an Execute
> method that you can pass in a SQL query that will be executed by the Jet
> engine. See the docs for the ADO Connection object.


>
> Vincent Minden
>
> Preston Crawford <aspx...@micron.net> wrote in message

news:OppGUFoQ$GA.265@cppssbbsa04...
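For what it's worth, a minimal sketch of the Execute approach being
discussed here. The connection string and all table/field names are
invented for illustration, and the IN clause follows the form of Vincent's
example:

```vb
Dim cn As New ADODB.Connection
cn.Open "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\data\source.mdb"

' One statement: Jet copies the rows itself, no VB loop required
cn.Execute "INSERT INTO Cars (CarName, Model) IN 'C:\data\target.mdb' " & _
           "SELECT A+B AS CarName, Model FROM SourceTable", , adExecuteNoRecords

cn.Close
```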

Vincent Minden

unread,
Dec 10, 1999, 3:00:00 AM12/10/99
to

Michel Walsh

unread,
Dec 10, 1999, 3:00:00 AM12/10/99
to
Hi

You are right that running 51 queries against Jet will cause part of the
data (at least the index) to transit over the network wire 51 times, while
it won't for SQL Server (where only the result comes back 51 times). I
assume, in this case, that importing the table into a local Jet database (on
the application PC) ONCE, and then running the 51 queries on the "local"
table, can be a profitable solution. See it as working on a snapshot
recordset or a disconnected recordset: you are penalized at first because
you have to fill the whole recordset, but each subsequent access to that
data, and you have 50 of them, is much faster.

In Jet, you can access another database with the simple syntax
database-and-name.table. Even though I have never tried that syntax from
ADO, I don't see why it wouldn't work. As Vincent suggested, use SELECT INTO
or INSERT INTO according to whether the (new) table already exists in the
local database.

If some of the queries are close but differ only by a constant value, you
can change the constant to a parameter or, better still, make a GROUP BY
query over all the possible values. That way you run FEWER queries, so you
save, at least, the overhead of calling the same query many times.

If some queries differ by simple constant "substitution", you can change
that too, by placing those constants into a table and doing a simple lookup
through a fast inner join. Not only do you win by reducing the number of
queries, you also win on the maintenance side (it is easier to
add/modify/delete a value in a table than to rewrite a query, or to have to
rewrite a query at run time).


Hoping that may help,
Vanderghast, Access MVP.

Preston Crawford <aspx...@micron.net> wrote in message

news:uZrdl1nQ$GA.292@cppssbbsa04...

David Williams

unread,
Dec 10, 1999, 3:00:00 AM12/10/99
to
One thing that I would do is NOT create the indexes or keys for the output
table(s) until you have fully populated the table(s). If there are
keys/indexes on the output table, the database engine must recalculate the
keys and indexes for every record.
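As a sketch, dropping and recreating an index can be done with Jet DDL
through the same ADO connection; the index, table, and field names here are
made up for illustration:

```vb
' Drop the index before the bulk insert, recreate it afterwards
cn.Execute "DROP INDEX idxCarName ON Cars"
' ... run all 40,000 inserts here ...
cn.Execute "CREATE INDEX idxCarName ON Cars (CarName)"
```

The engine then builds the index once at the end instead of maintaining it
on every insert.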

Preston Crawford wrote in message ...

Michael

unread,
Dec 11, 1999, 3:00:00 AM12/11/99
to
Preston,
ADO is known to be slow with Access 97. The problems are solved with Access 2000.

Kind Regards
Mike MCP
David Williams <dav...@clinitex.net> wrote in message
news:#gi1MTxQ$GA.259@cppssbbsa04...

Jerry M

unread,
Dec 11, 1999, 3:00:00 AM12/11/99
to
First of all, Preston, please accept my condolences. I bet you feel the
need to shower after working with that kind of garbage! <grin>

BTW, I would be extremely interested in seeing one complete query and the
field list of the table the query is on. It boggles my mind to see garbage
like that cranked out by a program that claims to be an "Expert". Always
steer clear of programs that claim to be "Expert" or "Master" or "Pro"!
<grin>

Have you actually timed the various parts of your algorithm to find out
which segment is taking up the most time? Just looking at the small snippet
of SQL that you posted makes me think that the query itself is eating a
large portion of the time. Let me explain: SQL calculations slow down the
return of the relation. The IIF() function is notoriously slow, so you're
adding additional time to a process that is already running slow. Multiply
that by the number of times that SQL has to perform calculations to build
just one tuple and you'll begin to understand why the query is so slow.
(Just looking at those five lines of SQL that you posted makes me a bit
queasy.)

You could speed things up tremendously by offloading the calculations to a
user-defined function. For example, instead of:

Last_Name_Facility_Name_Grp_Name + IIF(First_and_Middle_Name is not
null,', '+First_and_Middle_Name,'')...

Change it to:


gBuildFullName(Last_Name_Facility_Name_Grp_Name,First_and_Middle_Name,...)
as [Full_Name]

where gBuildFullName() is a function that builds the "Full_Name" and
returns it in a string. Then, as others have mentioned, create a new table
from the query and work off the new table.

> So basically it produces some absolute trash in the name of
> making Access easy to use.

I doubt that this is actually the case. From that small snippet of SQL that
you posted, I can almost guarantee you that the SQL is there to bandage a
poorly designed data-/code-set and that it has nothing to do with making
Access easy to use (which it is, just by itself). I am willing to bet that
you would see the same garbage regardless of which DBMS the author decided
to use (you can tell by the author's notation that he or she is an oldtimer
who has no idea what Fifth Normal Form is! <grin>)

I wish you the best of luck with your project and am glad that you're doing
it and not me. I've got my hands full just trying to integrate with MAS90
(another one of those garbage programs claiming to be a "Master").

Jerry...

Preston Crawford <aspx...@micron.net> wrote in message

Michel Walsh

unread,
Dec 11, 1999, 3:00:00 AM12/11/99
to
Hi,


Using a user-defined function will work only within Access + Jet; it won't
work from VB + Jet, for example. If the case is Access + Jet, then the iif
can, in this case, be replaced by + and &:

FirstName & (", " + MiddleName) & (", " + LastName)

replaces a sequence of complex iifs, and you can get, as a result:

    "FirstName", not "FirstName, , " with ugly extra commas, in the case
of Null for MiddleName and LastName
    "FirstName, LastName", not "FirstName, , LastName", if
MiddleName is Null

and so on. The rule is that the + operator propagates Null while the &
operator doesn't.


Otherwise, iif inside SQL is NOT slow. You refer to your experience of iif
inside VB/VBA, which IS slow, but even though they share the same name and
the same end functionality, they don't share the same code at all, and the
SQL iif, I repeat, is not slow. You can even try: Iif( 1=1, "Hey", 45.6/0.0 )

In VB you will get an error (45.6 / 0.0), but you won't at all in SQL
(since SQL is intelligent enough NOT to uselessly evaluate the false part).
(And no, it is not due to a "silent error"; those things happen sometimes in
SQL. Just make the false part a call to a function with a side effect, like
displaying a MsgBox, if you want to be sure.)

Also, the user-defined function will have to do the job anyhow, plus the
overhead of being a function (stack push-pop, etc.). I doubt it will be
faster in this case. Cleaner, but not faster.


Just some clarifications, I thought would be useful,
Vanderghast, Access MVP.

Jerry M <Jer...@home.com> wrote in message
news:OPOKKN6Q$GA.267@cppssbbsa05...

(...)

>
> You could speed things up tremendously by offloading the calculations to a
> user-defined function. For example, instead of:
>
> Last_Name_Facility_Name_Grp_Name + IIF(First_and_Middle_Name is not
> null,', '+First_and_Middle_Name,'')...
>
> Change it to:
>
>
> gBuildFullName(Last_Name_Facility_Name_Grp_Name,First_and_Middle_Name,...)
> as [Full_Name]
>
> Where "gBuildFullName() is a function that builds the "Full_Name" and
> returns it in a string.

(...)

Michel Walsh

unread,
Dec 11, 1999, 3:00:00 AM12/11/99
to
Hi,


A lack of precision: I should have spoken of Jet-SQL iif, not SQL iif. MS SQL
Server uses CASE and doesn't know iif; iif is not part of standard SQL.

Vanderghast.

Michel Walsh <Vande...@email.msn.com> wrote in message

news:#tLw6b#Q$GA....@cppssbbsa02.microsoft.com...

Jerry M

unread,
Dec 11, 1999, 3:00:00 AM12/11/99
to
Hello Michel,

Your point about user-defined functions is noted. Thanks for pointing it
out.

Your use of the concatenation operators is the way to go, not to mention
that it is a much more elegant solution.

Your point about Immediate IF is also noted. I had not realized that the
interpreter for SQL was different from that of VB.

Your point about the user-defined function not giving a speed increase is
not valid, though. Immediate IF is a function, so it is going to suffer the
same stack-operation penalties as a user-defined function. Plus, in the case
of multiple IIFs per attribute, it will speed things up (not as much as your
concatenation operators, however).

Thank you for the clarifications, Michel. They were very useful.

Jerry...

Eric S. MacZura

unread,
Dec 14, 1999, 3:00:00 AM12/14/99
to
Here is just a suggestion:
__________________________________________
Access to OTHER DB:

Use an ADO recordset to select your records into...

Create a disconnected recordset with the structure of the table in the dest
DB.

Iterate through the ADO resultset (source) and insert a record into the
disconnected recordset....

After the iteration is complete

connect the recordset.

Call update batch method on disconnected recordset..

I run it on a PII 300 and it takes less than 2-3 min. for about 20-30K recs.
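A rough sketch of those steps, assuming ADO 2.x, an open source recordset
rsSource, a destination connection cnDest, and invented table/field names:

```vb
Dim rsDest As New ADODB.Recordset
rsDest.CursorLocation = adUseClient

' Open an empty client-side recordset over the destination table
rsDest.Open "SELECT * FROM Cars WHERE 1=0", cnDest, _
            adOpenStatic, adLockBatchOptimistic
Set rsDest.ActiveConnection = Nothing   ' disconnect it

' Fill the disconnected recordset from the source
Do While Not rsSource.EOF
    rsDest.AddNew
    rsDest!CarName = rsSource!Full_Name
    rsSource.MoveNext
Loop

' Reconnect and send all the inserts in one batch
Set rsDest.ActiveConnection = cnDest
rsDest.UpdateBatch
```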

_________________________________________________
Access to Access / Excel to Access / ODBC to Access:

load the MS Access Object Library and use the TransferDatabase methods

very nice... hardly any code..

I did it with Fox Pro to Access

First generated a script in FoxPro to export the files to excel
spreadsheets...
Then used TransferDatabase method provided by Access

Ran like a charm...

Things to think about anyway...

Eric


Preston Crawford <aspx...@micron.net> wrote in message

news:e0PYW8bQ$GA.261@cppssbbsa04...

John Grandy

unread,
Dec 16, 1999, 3:00:00 AM12/16/99
to
<<
You can even try: Iif( 1=1, "Hey",
45.6/0.0 )

in VB, you will get an error (45.6 / 0.0 ), but you won't at all in SQL
(since SQL is intelligent enough to NOT evaluate uselessly the false part
>>

i believe the VB IIF function works that way by design. it is not always a
useless evaluation; it depends on your specific circumstances. anyway, all
languages (and the related standard class or function libraries built on top
of those languages) have what some may think of as idiosyncrasies/quirks,
which tend to be very useful in some applications and infuriating in others.

are you going to tell me that "default fall-through to the next CASE
statement" in c and java's SWITCH statement is good? and VB's "evaluate one
CASE statement only" is bad? i don't think there is a real answer to that
question.

also, when you refer to "SQL" in the generic sense, what exactly are you
referring to? a specific parsing engine? a data engine? the ANSI
standards? i can't figure out what you are referring to.

cheers.

John Grandy

unread,
Dec 16, 1999, 3:00:00 AM12/16/99
to
in this situation, is network latency really a problem of a large enough
relative magnitude to make it worth considering?

from my experience, it will make very little difference in speed whether the
MDB is local or remote, as long as it is on a continuous network connection.

of course, i only work on fast networks . . .

Darren Reynolds

unread,
Dec 16, 1999, 3:00:00 AM12/16/99
to
I have to disagree with John about relative performance of local vs.
network. As far as servers go, NetWare generally performs better than
NT with Access. However, no server can compensate for a heavily
trafficked network. Big companies tend to suffer more than small ones,
but I have found there's usually a big difference between a
server-based MDB and a workstation-based one. There will be an even
bigger difference if the MDB is the other side of a router.

Michel Walsh

unread,
Dec 16, 1999, 3:00:00 AM12/16/99
to
Hi,

By implementation, you pass values to a function, to its arguments, you
don't pass the unsolved formula.

If you have:
Public Sub Toto(arg As Long)
...
End Sub

then, when you call:
Call Toto( 44+66 )

you are obliged to evaluate the arguments, you don't pass 44+66, you pass
110 and inside Toto, you see only 110, not 44+66.

iif is a function, so you have to evaluate each of its arguments before
passing them to its "inside", even if only the true part or the false part
is required. Just compare:

If Test then
X=Lengthily_computation1()
Else
X=Lengthily_computation2()
End if

vs:

X=iif(test, Lengthily_computation1(), Lengthily_computation2())

In the first case, only one computation is done, in the second case, both
are done.

The difference is negligible if the computation is small, such as a pure
constant, but VB/VBA string concatenation is not an irrelevant operation, as
is the case here. Under good programming practice, where you SHOULD NOT
create side effects, iif needs to uselessly evaluate one of the two last
arguments, uselessly since you will capture only one. In my humble opinion,
that is not an idiosyncrasy; that's the procedural model itself. The problem
is that people would like to optimize the process and, manually, evaluate
only one of the arguments, but that is not the procedural way of doing it.

Iif in Jet-SQL is not a function, since SQL (the language) is not a
procedural language in nature; it is not based on functions, on HOW to do
the stuff, but on DESCRIBING the wanted result and telling it (SQL, the
language): do as you wish to get that result, I don't care, as long as I get
the result I want and the process is as performant as possible. Not being a
function, its arguments are not obliged to be "evaluated" before being
"sent" to another level of evaluation. "Iif" is not part of standard SQL,
but it is part of Jet-SQL; "Case" is generally used in standard SQL for that
purpose, but "case" is not supported by Jet-SQL.


Vanderghast, Access MVP.

John Grandy <JohnA...@csi.com> wrote in message
news:#wmUvN6R$GA.254@cppssbbsa04...

Dave G

unread,
Dec 16, 1999, 3:00:00 AM12/16/99
to
One warning. I had two Access databases that I had to query, so I created a third Access
DB with linked tables from the other two. It appeared to send the entire contents of both
databases across the network for a simple one-record update query.

Linked Tables are evil (or can be in certain situations).

Dave G
OTI

Darren Reynolds <darrenrey...@my-deja.com> wrote in message
news:3858c380...@msnews.microsoft.com...


> I have to disagree with John about relative performance of local vs.
> network. As far as servers go, NetWare generally performs better than
> NT with Access. However, no server can compensate for a heavily
> trafficked network. Big companies tend to suffer more than small ones,
> but I have found there's usually a big difference between a
> server-based MDB and a workstation-based one. There will be an even
> bigger difference if the MDB is the other side of a router.
>

rick_s...@my-deja.com

unread,
Dec 17, 1999, 3:00:00 AM12/17/99
to
In article <e0PYW8bQ$GA.261@cppssbbsa04>,

"Preston Crawford" <aspx...@micron.net> wrote:
> I have a program I've written in VB using ADO. Basically I'm running a
> series of queries against an Access table. I then loop through that
> resulting recordset and populate a different database (two tables) with the
> results I get. So basically I'm running an ADO insert for each record in the
> database I'm drawing from. I've heard that Batch updates are faster. Is this
> true? The amount of inserts I'm working with is 40,000 and it's taking about
> an hour. Does that sound normal?
>
> Preston
>
Preston: You started quite a thread. My experience is to do as much as you
can in the database query if you want performance. I would suggest using the
existing SELECT query and, in Access, building a new INSERT or CREATE TABLE
query based on it that creates the output dataset you need. Then run the new
query from the VB app (DAO or ADO), preferably running on the same PC where
the database is hosted. You can do a surprising amount of processing in SQL
itself, and it is almost always faster than reading tables into a recordset
and trying to apply the logic programmatically.

If you really do want to use a recordset, I would think that a disconnected
recordset using AddNew and UpdateBatch would give you the best performance.
The suggestions to make sure there are no indexes on the destination table
are also worthwhile.


Sent via Deja.com http://www.deja.com/
Before you buy.

Stephen Clapham

unread,
Dec 29, 1999, 3:00:00 AM12/29/99
to
I have found that this is very fast for adding/changing records in the
disconnected recordset, BUT when I do the UpdateBatch after reconnecting,
the slow stuff happens again - about 4 seconds for 300 records on a K6-2
350. I expected something better - am I expecting too much??

Stephen C

Eric S. MacZura <eric.m...@mkg.com> wrote in message
news:eMUc8LoR$GA.266@cppssbbsa05...
