Any way to truncate the IQ log like ASE when doing a HUGE insert into?


sybase46

Oct 2, 2013, 11:41:53 AM10/2/13
to iq...@googlegroups.com
I have a 147 GB table and a 9 GB TEMP STORE, and I fill up the TEMP STORE trying to do a select distinct into from the table. I need to get rid of the dups, but first need to load them into a table with the distinct values. Having used ASE, I could write a loop with set rowcount, then "dump tran with truncate_only", and continue until all the records were loaded. Is there a way to do this without putting a -m switch in the IQ .cfg file, which truncates the log on every checkpoint? Or would that even help?

Any Help out there? 

Mark Mumy

Oct 2, 2013, 11:46:34 AM10/2/13
to iq...@googlegroups.com, iq...@googlegroups.com
Temp space and the tran log are totally different. What would you do if tempdb filled in ASE?

Your issue is that the data you want to sort and store is too big for temp. I would suggest that you add space to it: add another file or two so that you have enough space. This is simple; it can be filesystem or raw. Once your load is done you can then drop the dbspaces/dbfiles that you added.

You could try to get more elegant, like looping through subsets of data, stuffing the results in a table, then taking the results of the subsets and running the distinct on them again: a sort of looping distinct over smaller results. But that is complex and will take time to write and test. Better to just add some space.
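Adding and later dropping a temp dbfile might look roughly like this in IQ 15.x; the logical name, path, and size below are placeholders, and exact syntax varies by IQ version:

```sql
-- Temporarily grow the IQ temp store (name/path/size are examples).
ALTER DBSPACE IQ_SYSTEM_TEMP
    ADD FILE tmp_extra1 '/iqdata/tmp_extra1.iqtmp' SIZE 50 GB;

-- ... run the big SELECT DISTINCT ... INTO here ...

-- After the load, drop the extra file again (the file must be
-- drained/empty first; a server restart may be needed on some versions).
ALTER DBSPACE IQ_SYSTEM_TEMP DROP FILE tmp_extra1;
```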
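The looping-distinct idea above might be sketched like this; all table and column names are made up, and it assumes a column you can slice the table on:

```sql
-- Stage distinct rows from manageable slices of the big table,
-- so each DISTINCT fits in the smaller temp store.
create table stage_rows (col1 int, col2 varchar(50));

insert into stage_rows
select distinct col1, col2 from big_table
where col1 between 1 and 1000000;
commit;

insert into stage_rows
select distinct col1, col2 from big_table
where col1 between 1000001 and 2000000;
commit;

-- ... repeat for the remaining ranges ...

-- Final pass: distinct over the (much smaller) staged results.
select distinct col1, col2 into final_rows from stage_rows;
commit;
```

As Mark says, this takes time to write and test; adding temp space is the simpler fix.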

Mark
====================================
Mark Mumy
Big Data Architect, SAP
MOBILE: 347.820.2136
Twitter: @markdmumy
====================================


--
You received this message because you are subscribed to the Google Groups "iqug" group.
To unsubscribe from this group and stop receiving emails from it, send an email to iqug+uns...@googlegroups.com.
For more options, visit https://groups.google.com/groups/opt_out.

John Barton

Oct 3, 2013, 9:54:57 AM10/3/13
to iq...@googlegroups.com
Echo what Mark said about having enough IQ Temp Store configured; that will affect this solution.
 
The traditional approach is to isolate the duplicates. So if you can load the 147 GB table into an IQ table, then issue:
 
select col1, col2, col3, col4, count(*) as TOTAL_ROWS, max(rowid(my_table)) as ROW_ID
into #rows_to_remove
from my_table
group by col1, col2, col3, col4
having count(*) > 1
;
 
delete my_table from my_table, #rows_to_remove where rowid(my_table) = ROW_ID;
commit;
 
Now the first select statement needs enough IQ Temp to complete the group by and isolate the duplicates. The delete should then remove the duplicate rows from the IQ table. (Since only the max rowid per group is captured, a key that appears more than twice will still have extras; repeat both statements until the having clause returns no rows.)
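A quick way to confirm the duplicates are gone, using the same hypothetical column list as above:

```sql
-- Should return no rows once deduplication is complete.
select col1, col2, col3, col4, count(*)
from my_table
group by col1, col2, col3, col4
having count(*) > 1;
```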
 
John

Jeffery Yoder

Oct 4, 2013, 6:56:42 PM10/4/13
to iq...@googlegroups.com
I created a loop that committed after so many rows: set rowcount 1000000, and a counter to commit after that many rows were inserted. Same with dump tran with truncate_only on ASE.
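That classic ASE batching pattern might be sketched roughly as follows; the database, table, and key column names are placeholders, and this is an untested outline, not the exact loop described above:

```sql
-- Batch the inserts so the ASE tran log can be truncated between chunks.
set rowcount 1000000

declare @done int
select @done = 0
while @done = 0
begin
    -- Copy the next batch of not-yet-copied distinct rows.
    insert into clean_table
    select distinct * from dirty_table src
    where not exists (select 1 from clean_table c
                      where c.key_col = src.key_col)

    if @@rowcount = 0
        select @done = 1

    commit
    dump tran my_db with truncate_only
end

set rowcount 0
```

In IQ, though, the temp store (not the tran log) is what fills during a big distinct, so batching like this does not address the original problem the way it does in ASE.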





--
Jeffery Hugh Younce-Yoder
Senior Sybase DBA
5539 Columbia Pike, #805
Arlington, VA. 22204

Jeffery Yoder

Oct 4, 2013, 7:19:05 PM10/4/13
to iq...@googlegroups.com
Thanks John!