I have some simple code where I create a datastore, load about 150,000
records, and then process them. The processing modifies about 10,000 rows. This can
be accomplished in about 2 minutes. After that I have to issue the update
which takes about an hour. The dataobject has about 20 columns which are
all updateable and 1 column which is the primary key. I am using PB 7 and
MSSQL 7.0. Any ideas on how to improve the performance? Thanks in advance,
Steve
"Bora Susmaz" <bsu...@hotmail.com> wrote in message
news:0Msk9dB...@forums.sybase.com...
> I am not sure if this works, but move (or copy) the rows you have changed
> to another datastore and update that second datastore. Maybe it's spending
> time checking the 140,000 rows which are not modified. Also, before
> issuing the update on the second datastore, destroy the first one so that
> you reclaim the memory, giving more to the actual process.
>
> HTH
>
> "Steve" <ste...@customcall.com> wrote in message
> news:j#CGDIBA...@forums.sybase.com...
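The copy-to-a-second-datastore idea might look roughly like this in PowerScript (a sketch only; `ds_main`, `ds_update`, and the status handling are my assumptions, not tested code):

```powerscript
// Sketch: copy only the modified rows into a second datastore and update that.
// Assumes ds_main is the loaded/processed datastore and SQLCA is connected.
datastore ds_update
ds_update = CREATE datastore
ds_update.DataObject = ds_main.DataObject
ds_update.SetTransObject(SQLCA)

long ll_row
ll_row = ds_main.GetNextModified(0, Primary!)
DO WHILE ll_row > 0
    ds_main.RowsCopy(ll_row, ll_row, Primary!, ds_update, &
        ds_update.RowCount() + 1, Primary!)
    ll_row = ds_main.GetNextModified(ll_row, Primary!)
LOOP

// Caveat: RowsCopy gives copied rows New/NewModified status, so Update()
// would generate INSERTs. Reset each row to DataModified! first (verify
// this transition against the PB status matrix for your version):
long ll_i
FOR ll_i = 1 TO ds_update.RowCount()
    ds_update.SetItemStatus(ll_i, 0, Primary!, DataModified!)
NEXT

DESTROY ds_main  // reclaim the memory before updating, as suggested
IF ds_update.Update() = 1 THEN COMMIT USING SQLCA;
```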
I'll try the copy into a second datastore and see if that improves the
performance. As to your other suggestion, how would I do the processing on
the server instead of the client? This is a simple client-server
application, but I've never known how to do a datawindow or datastore update
on anything other than on the client. Thanks for any additional help.
Steve
"Bora Susmaz" <bsu...@hotmail.com> wrote in message
news:FrDmPfB...@forums.sybase.com...
You should do this on the server side: create a stored procedure and call it
from the client. You will need to read
about stored procedures. Basically, stored procedures are just like
functions, having input and output arguments, but residing on the database
server.
This is the right way to go in your case.
Good Luck !!!
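For example, a server-side version of the processing could be one set-based UPDATE in a stored procedure (the table, column, and procedure names below are placeholders for illustration, not Steve's actual schema):

```sql
-- Hypothetical procedure: one set-based UPDATE on the server replaces
-- 10,000 row-by-row updates sent from the client.
CREATE PROCEDURE usp_process_records
    @cutoff datetime
AS
UPDATE records
SET    status = 'PROCESSED'
WHERE  record_date < @cutoff
GO
```

From PowerBuilder you would then call it with embedded SQL along the lines of `DECLARE proc_process PROCEDURE FOR usp_process_records @cutoff = :ld_cutoff USING SQLCA;` followed by `EXECUTE proc_process;`.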
"Steve" <ste...@customcall.com> wrote in message
news:nv9E4vB...@forums.sybase.com...
The datawindow is not a great tool for mass updates. It will take a long time
to update that many rows, because PB generates a separate SQL UPDATE statement for each row.
There is one thing you can do that would help a bit. Check the Update
properties for the DW. Ensure that the "Where Clause for Update" is set to
use just the key columns. This will generate the most efficient SQL for
updating a single row. Unfortunately you need to wait for 10,000 of these
statements to be sent to the server.
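To illustrate the difference (table and column names here are made up), the two settings generate SQL along these lines:

```sql
-- "Key Columns" where clause: the shortest possible statement per row
UPDATE mytable SET col1 = 'x', col2 = 'y' WHERE pk = 42

-- "Key and Updateable Columns": every original value is compared as well
UPDATE mytable SET col1 = 'x', col2 = 'y'
WHERE pk = 42 AND col1 = 'old1' AND col2 = 'old2' -- ...and so on for 20 columns
```

Note the trade-off: key-only updates give up the optimistic collision detection that the longer where clause provides, which is usually acceptable for a batch job like this.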
Another issue is the transaction. If you expect the entire update to take
place within the scope of one transaction, this will also slow things down,
because the database will hold the transaction open until it finishes
updating all 10,000 rows. You could try committing the transaction after each
row, or after every few rows, in much the same way that you can control the
number of rows per transaction when using the data pipeline feature.
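Since a DataWindow Update() runs as a single batch, committing every N rows means driving the updates yourself. A rough sketch (`mytable`, `col1`, and `pk` are placeholders for the real schema):

```powerscript
// Sketch: issue our own UPDATE per modified row, committing every 500 rows.
long ll_row, ll_count
string ls_col1
long ll_pk

ll_row = ds_main.GetNextModified(0, Primary!)
DO WHILE ll_row > 0
    ls_col1 = ds_main.GetItemString(ll_row, "col1")
    ll_pk   = ds_main.GetItemNumber(ll_row, "pk")
    UPDATE mytable SET col1 = :ls_col1 WHERE pk = :ll_pk USING SQLCA;
    ll_count++
    IF Mod(ll_count, 500) = 0 THEN COMMIT USING SQLCA;
    ll_row = ds_main.GetNextModified(ll_row, Primary!)
LOOP
COMMIT USING SQLCA;
```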
HTH
Sanjiv.
"Steve" <ste...@customcall.com> wrote in message
news:j#CGDIBA...@forums.sybase.com...
I am tinkering around with the great suggestions I have received from the
group to try to find the fastest solution. I was just wondering - if the
datawindow/datastore is not a great tool for mass updates, is there
something in PB that is? MSSQL 7.0 has a bulk insert routine that is quick
but I would need to save the datastore into a text file first (kinda messy),
and our ASP customers cannot have bulk insert rights granted to them (need
to be admin until version 8.0). Thanks for any additional suggestions!
Steve
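For what it's worth, the bulk-insert route would look roughly like this: save the modified rows with something like `ds_update.SaveAs("c:\modified.txt", Text!, false)`, then load them into a staging table on the server and apply one set-based UPDATE. All names below are made up, and as noted, BULK INSERT on SQL Server 7.0 needs admin rights:

```sql
-- Hypothetical staging-table approach
BULK INSERT staging_records
FROM 'c:\modified.txt'
WITH (FIELDTERMINATOR = '\t', ROWTERMINATOR = '\n')

UPDATE r
SET    r.col1 = s.col1
FROM   records r
JOIN   staging_records s ON s.pk = r.pk
```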
"Sanjiv Das [TeamSybase]" <No-Spa...@CompuServe.Com> wrote in message
news:4IGdLGG...@forums.sybase.com...