
SQL job never ending


sstia...@gmail.com

Apr 22, 2020, 12:29:22 PM
What I am trying to do is calculate the cost of goods per warehouse and per product using the FIFO method, and store the results in a table.
I have a table called ITEMTRANSACTIONS, which holds the transactions for all products and all warehouses, and another table called PRODUCTS with the transactions for some of the products.
The PRODUCTS table has the following columns:

DATE, DATETIME, ID, WAREHOUSE, ITEMCODE, TRANSACTIONTYPE, QTY, PRICE
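For reference, a minimal sketch of what the PRODUCTS table might look like; the post does not state the data types, so the ones below are assumptions:

```sql
-- Hypothetical DDL for the PRODUCTS table; all column types are assumptions.
-- DATE and DATETIME are reserved-ish names, so they are bracketed.
CREATE TABLE PRODUCTS (
    [DATE]          date          NOT NULL,
    [DATETIME]      datetime2     NOT NULL,
    ID              int           NOT NULL,
    WAREHOUSE       varchar(20)   NOT NULL,   -- e.g. 'MAIN' or 'SECOND'
    ITEMCODE        varchar(50)   NOT NULL,
    TRANSACTIONTYPE varchar(20)   NOT NULL,
    QTY             decimal(18,4) NOT NULL,
    PRICE           decimal(18,4) NOT NULL
);
```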

I have created a job called FIFO_MAIN, which has the following steps:
• Step 1 to 6: Updating some columns in the ITEMTRANSACTIONS table.
• Step 7: Update the data in the PRODUCTS table (using MERGE), using the ITEMTRANSACTIONS table as the source.
• Step 8: Delete the data in the RESULTS table.
• Step 9: Calculate values using data from the PRODUCTS table and enter results in the RESULTS table (using INSERT INTO).
• Step 10: Update the data type of some columns in the RESULTS table.
• Step 11: Update the FIFO price column in the PRODUCTS table (using UPDATE ... SET).
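A hedged sketch of what step 7's MERGE might look like; the join keys and the columns being updated are assumptions, since the post does not show the actual code:

```sql
-- Hypothetical sketch of step 7; the ON condition and column list are assumptions.
MERGE PRODUCTS AS tgt
USING ITEMTRANSACTIONS AS src
    ON tgt.ID = src.ID
WHEN MATCHED THEN
    UPDATE SET tgt.QTY   = src.QTY,
               tgt.PRICE = src.PRICE
WHEN NOT MATCHED BY TARGET THEN
    INSERT ([DATE], [DATETIME], ID, WAREHOUSE, ITEMCODE, TRANSACTIONTYPE, QTY, PRICE)
    VALUES (src.[DATE], src.[DATETIME], src.ID, src.WAREHOUSE,
            src.ITEMCODE, src.TRANSACTIONTYPE, src.QTY, src.PRICE);
```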
In step 9 I use a WITH clause in which I create these CTEs:
• sample_data, which pulls the data from the PRODUCTS table. I use the WHERE clause to take only the rows related to the MAIN warehouse.
• Table_1 (which pulls its data from sample_data), then Table_2 and Table_3, each of which uses data from the previous one to calculate some columns.
• The WITH clause closes into an INSERT INTO (columns of the RESULTS table) ... SELECT (columns of Table_3) plus 2 calculated columns, which are a simple sum and a multiplication.
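The shape of step 9, as described, would be roughly the following; the CTE bodies and column lists are left as placeholders because the post does not show them:

```sql
-- Hypothetical skeleton of step 9; the "..." parts are unknown from the post.
WITH sample_data AS (
    SELECT [DATE], [DATETIME], ID, ITEMCODE, TRANSACTIONTYPE, QTY, PRICE
    FROM   PRODUCTS
    WHERE  WAREHOUSE = 'MAIN'     -- FIFO_SECOND filters on 'SECOND' instead
),
Table_1 AS ( SELECT ... FROM sample_data ),
Table_2 AS ( SELECT ... FROM Table_1 ),
Table_3 AS ( SELECT ... FROM Table_2 )
INSERT INTO RESULTS (...)
SELECT ...,          -- columns of Table_3
       ... + ...,    -- the simple sum
       ... * ...     -- the multiplication
FROM   Table_3;
```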
I ran the FIFO_MAIN job at night on a daily schedule. Each night it took at most 2 hours and 36 minutes to complete, and the results were stored as expected in the RESULTS table.
The issue is:
I had to create another job (FIFO_SECOND) which also pulls data from the PRODUCTS table, but this time for the SECOND warehouse. All steps are the same, except for step 9, where the WHERE clause in sample_data is WAREHOUSE = SECOND instead of WAREHOUSE = MAIN, and the output goes to a RESULTS_SECOND table. The first time the FIFO_SECOND job ran, it got stuck when it reached step 9. It appeared "in progress" in the job history for many hours until I forced it to stop.
What I tried:
I tried clearing the cache and running the FIFO_SECOND job again, but it still got stuck at step 9. I also ran the two jobs separately, with the same result.
In the FIFO_MAIN job, which has no issues, I tried replacing the MAIN warehouse with the SECOND one in the WHERE clause, since this is the only difference, and it got stuck at step 9 as well. When I changed it back, the job ran normally again.
So it looks like the problem is tied to that change in step 9. However, changes to other steps do not trigger it: in step 2, for example, where I update a column of the ITEMTRANSACTIONS table, I changed the calculation method and the job never got stuck at step 2.
What could be the reason for this never-ending job?
Is there a chance that the different filter for sample_data in these two jobs is causing an issue? And/or the fact that I am using the same source table (PRODUCTS) in both?
Any help or hint will be very much appreciated. Thank you in advance!

Erland Sommarskog

Apr 22, 2020, 3:07:57 PM
(sstia...@gmail.com) writes:
> It looks like it has to do with the change in step 9. However, by making
> changes to any other step, eg in 2 where I update a column of the table
> ITEMTRANSACTIONS I changed the calculation method and yet the job never
> stuck to step 2.
> What could be the reason of this never-ending job?
>

All sorts of reasons:
1) Long-running query plan.
2) Infinite loop.
3) Blocking.
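For instance, blocking can be spotted with the standard SQL Server DMVs; a minimal sketch (not from the original post) that lists blocked requests and the statement they are running:

```sql
-- Shows requests that are currently blocked, who blocks them,
-- what they are waiting on, and the statement being executed.
SELECT r.session_id,
       r.blocking_session_id,
       r.wait_type,
       r.wait_time,
       t.text AS current_statement
FROM   sys.dm_exec_requests AS r
CROSS APPLY sys.dm_exec_sql_text(r.sql_handle) AS t
WHERE  r.blocking_session_id <> 0;
```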

There may be more, but those are the ones that come to mind immediately.

Since I don't know your tables or your code, I can't say what it might
be, but you could use my beta_lockinfo to investigate whether there is
any blocking. It will also show you the current statement. Running it repeatedly can also give some indication of the progress.
http://www.sommarskog.se/sqlutil/beta_lockinfo.html

If the job is running a loop of some sort, you may also find Lee Tudor's
sp_sqltrace helpful, which is also on my web site:
http://www.sommarskog.se/sqlutil/sqltrace.html