We were running a Druid query server on a 2-core, 8 GB RAM machine (broker JVM: 4g, router JVM: 512m). With this configuration we could query about 5 GiB of data, roughly 50 lakh (5 million) records. When we tried to query a longer time period, the Druid UI returned the error "unparsable row in string return".
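
For context, "broker jvm: 4g" and "router jvm: 512m" refer to the heap sizes in each service's jvm.config; the heap lines look roughly like this (the remaining flags, such as -server, -Duser.timezone=UTC and -Dfile.encoding=UTF-8, are the defaults shipped with the distribution):

broker/jvm.config:
-Xms4g
-Xmx4g

router/jvm.config:
-Xms512m
-Xmx512m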

When we looked at the logs, we found the following error in broker.log:
2021-04-15T08:16:00,618 ERROR [qtp1821711066-139] org.apache.druid.sql.http.SqlResource - Unable to send SQL response [20da59b3-0640-48fd-a00e-49cc8b8902a3]
org.apache.druid.query.QueryInterruptedException: Unexpected end-of-input: expected close marker for Array (start marker at [Source: (SequenceInputStream); line: -1, column: -1])
at [Source: (SequenceInputStream); line: -1, column: 396074945]
at org.apache.druid.client.JsonParserIterator.interruptQuery(JsonParserIterator.java:197) ~[druid-server-0.20.0.jar:0.20.0]
at org.apache.druid.client.JsonParserIterator.next(JsonParserIterator.java:119) ~[druid-server-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.BaseSequence.makeYielder(BaseSequence.java:90) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.BaseSequence.access$000(BaseSequence.java:27) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.BaseSequence$1.next(BaseSequence.java:114) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.MergeSequence.makeYielder(MergeSequence.java:131) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.MergeSequence.access$000(MergeSequence.java:32) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.MergeSequence$2.next(MergeSequence.java:173) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.WrappingYielder$1.get(WrappingYielder.java:53) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.WrappingYielder$1.get(WrappingYielder.java:49) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.SequenceWrapper.wrap(SequenceWrapper.java:55) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.WrappingYielder.next(WrappingYielder.java:48) ~[druid-core-0.20.0.jar:0.20.0]
at org.apache.druid.java.util.common.guava.MergeSequence.makeYielder(MergeSequence.java:131) ~[druid-core-0.20.0.jar:0.20.0]
We then tried upgrading to a 4-core, 16 GB RAM server (broker JVM: 10g, router JVM: 1024m), but nothing changed: the query limit and the error remained the same as above.
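
In other words, the only change on the larger server was raising those same heap lines, roughly:

broker/jvm.config:
-Xms10g
-Xmx10g

router/jvm.config:
-Xms1024m
-Xmx1024m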

We are trying to query nearly 200L/2B records, or about 20 GB of data, in a single query. Can someone suggest the hardware or software configuration required to achieve this?