I am creating a very wide table from an existing table in ClickHouse.
The source table is fairly small, about 20 GB. The query I am running on top of it calculates some features over different time windows of the same data (using sumIf) and then applies more complex calculations for standard deviation and linear regression.
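To give a sense of the shape, the query looks roughly like this (all table and column names here are illustrative, not my real schema; the real query has ~600 such feature columns):

```sql
-- Sketch of the feature query: sumIf over time windows,
-- plus stddev and linear regression aggregates.
INSERT INTO features_wide
SELECT
    entity_id,
    sumIf(amount, ts >= now() - INTERVAL 7 DAY)   AS amount_sum_7d,
    sumIf(amount, ts >= now() - INTERVAL 30 DAY)  AS amount_sum_30d,
    sumIf(amount, ts >= now() - INTERVAL 90 DAY)  AS amount_sum_90d,
    stddevPop(amount)                             AS amount_stddev,
    simpleLinearRegression(toUnixTimestamp(ts), amount) AS amount_trend
FROM source_table
GROUP BY entity_id;
```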
The resulting table has about 600 columns. The thing is, the query itself executes fine.
But then I get this error during the write part:
DB::Exception: Memory limit (for query) exceeded: would use 83.82 GiB (attempt to allocate chunk of 4219492 bytes), maximum: 83.82 GiB. (MEMORY_LIMIT_EXCEEDED) (version 18.104.22.168 (official build))
I know it's failing on the write because the logs show that some rows were already written.
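This is how I checked (a sketch against system.query_log, assuming query logging is enabled on the server):

```sql
-- Inspect the failed query's progress and memory usage.
-- written_rows > 0 suggests the failure happened during the write phase.
SELECT
    event_time,
    written_rows,
    read_rows,
    formatReadableSize(memory_usage) AS peak_memory,
    exception
FROM system.query_log
WHERE type = 'ExceptionWhileProcessing'
ORDER BY event_time DESC
LIMIT 1;
```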