Query running out of memory during write


Wasim Ismail

Mar 22, 2023, 11:49:23 AM
to ClickHouse
Hi Everyone

I am creating a very wide table from an existing table in ClickHouse.
The source table in this case is very small, about 20 GB. The query I am running on top of it calculates some features over different time windows of the same data (using sumIf) and then applies some more complex calculations for standard deviation and linear regression.

The resulting table has about 600 columns. The thing is, the SELECT itself is able to execute,
but I get this error during the write part:

DB::Exception: Memory limit (for query) exceeded: would use 83.82 GiB (attempt to allocate chunk of 4219492 bytes), maximum: 83.82 GiB. (MEMORY_LIMIT_EXCEEDED) (version 22.5.4.19 (official build))
 
I know it is failing on the write because when I look at the logs they show some rows were already written.
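One way to confirm where the failure happens is to look at the failed query's entry in system.query_log, which records peak memory usage and the number of rows read and written. A minimal sketch (the WHERE filter is an assumption; adjust it to match your query):

```sql
-- Inspect recent failed queries: if written_rows > 0 but the query
-- ended with ExceptionWhileProcessing, the SELECT succeeded and the
-- failure happened during the write.
SELECT
    event_time,
    type,
    formatReadableSize(memory_usage) AS peak_memory,
    read_rows,
    written_rows
FROM system.query_log
WHERE type = 'ExceptionWhileProcessing'
  AND event_date = today()
ORDER BY event_time DESC
LIMIT 5;
```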


Pavel Kruglov

Mar 23, 2023, 10:10:49 AM
to ClickHouse

Hi! What kind of query are you using? Is it just INSERT INTO ... SELECT ... or something more complex (for example, with a materialized view)?
You can try to tune (decrease) the settings min_insert_block_size_rows/min_insert_block_size_bytes (and min_insert_block_size_rows_for_materialized_views/min_insert_block_size_bytes_for_materialized_views if you are using a materialized view).
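The suggestion above can be sketched as follows. The table names are placeholders; the settings are real ClickHouse settings whose defaults are roughly 1M rows / 256 MB per insert block, so with ~600 columns a single block can get very large. Lowering them trades throughput for memory:

```sql
-- Hedged sketch: buffer fewer rows per insert block so the wide
-- (600-column) blocks formed during the write stay small.
-- wide_features_table / source_table are hypothetical names.
INSERT INTO wide_features_table
SELECT
    ...  -- the original feature calculations go here
FROM source_table
SETTINGS
    min_insert_block_size_rows = 65536,        -- default ~1,048,576
    min_insert_block_size_bytes = 16777216;    -- default ~268 MB
```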
On Wednesday, March 22, 2023 at 16:49:23 UTC+1, wasim....@gmail.com wrote:

Wasim Ismail

Mar 23, 2023, 10:26:44 AM
to ClickHouse
It is an INSERT INTO ... SELECT ... query.
[Attachment: stack_trace.jpg]