DateFormat: support for UTC T/Z format


D Campo

Jul 9, 2018, 11:27:01 AM
to ClickHouse
Hi,
I checked documentation and cannot see how to best work with timestamps in the typical ES/Solr format like: 

"2018-07-09T14:26:16.198Z"

What is the best way to deal with strings in this format and convert them to timestamp data?
I'm reading this data from Kafka through the Kafka engine and a materialized view, and placing the timestamp into a "final" table.

Thanks. 

Evan Hanks

Jul 9, 2018, 11:00:52 PM
to ClickHouse
Not sure if it's the best approach, but what I have done in the past is use regex. ClickHouse has functions like replaceRegexpOne or extractAll that you can use with a straightforward regex, e.g. (\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})\.(\d{3})Z to grab the various date parts. You can piece that together however you like, e.g. splitting into multiple columns or rewriting to a string that ClickHouse can natively parse into a DateTime type.
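As a sanity check outside ClickHouse, that rewrite can be sketched in Python (the function name here is hypothetical, just for illustration — the equivalent in ClickHouse would be a replaceRegexpOne expression in your SELECT):

```python
import re

# Regex matching the ES/Solr-style UTC timestamp; note the dot before
# the milliseconds is escaped.
ISO_RE = re.compile(
    r"(\d{4})-(\d{2})-(\d{2})T(\d{2}):(\d{2}):(\d{2})\.(\d{3})Z"
)

def to_clickhouse_datetime(s: str) -> str:
    """Rewrite '2018-07-09T14:26:16.198Z' into '2018-07-09 14:26:16',
    a string ClickHouse can natively parse into a DateTime."""
    m = ISO_RE.fullmatch(s)
    if m is None:
        raise ValueError(f"not a UTC T/Z timestamp: {s!r}")
    y, mo, d, h, mi, sec, _ms = m.groups()
    return f"{y}-{mo}-{d} {h}:{mi}:{sec}"

print(to_clickhouse_datetime("2018-07-09T14:26:16.198Z"))
# 2018-07-09 14:26:16
```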

Amos Bird

Jul 10, 2018, 3:48:11 AM
to ClickHouse

D Campo

Jul 10, 2018, 5:00:34 AM
to ClickHouse
I am having an issue because my JSON message in Kafka comes in double quotes.

Following your example:

f99e3721d9b3 :) INSERT INTO test VALUES ('2018-06-08T01:02:03.000Z');
INSERT INTO test VALUES
Ok.
1 rows in set. Elapsed: 0.002 sec. 

f99e3721d9b3 :) INSERT INTO test VALUES ("2018-06-08T01:02:03.000Z");
INSERT INTO test VALUES
Exception on client:
Code: 47. DB::Exception: Unknown identifier: 2018-06-08T01:02:03.000Z


Why is there this difference between single and double quotes?
I'm trying to capture the Kafka message's JSON string directly into the Kafka engine table's DateTime field, but my data comes in double quotes, which doesn't allow me to.

Amos Bird

Jul 10, 2018, 7:16:05 AM
to ClickHouse
That should be unrelated. In a SQL statement, double quotes usually denote an identifier, while single quotes denote a string literal. Why did you turn Kafka messages directly into SQL INSERT statements? What's your workflow?

D Campo

Jul 10, 2018, 8:42:57 AM
to ClickHouse
Hi Amos, 

I am not. I am just creating a table with the Kafka engine, like this:

CREATE TABLE visitorsKafka
(
    createdAt DateTime,
    -- ... other fields
)
ENGINE = Kafka('kafka:9092', 'visitors', 'groupCHVisitors', 'JSONEachRow');

JSON example:

{ ...   , "createdAt":"2018-07-10T12:30:18.749Z"  , ... }

The fields in the JSON are to be inserted into the visitorsKafka table. If I define the column as String, it picks up the string value, but it does not auto-convert the JSON data into DateTime.
Since I can use a materialized view to push data into a separate, regular, persistent table, and apply transformations along the way to put correct data into that table, I guess I can work it out there from a String; however, I'd expect this to work with the latest time-related additions.
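For reference, the per-row conversion such a materialized view would have to apply (String to DateTime, dropping the sub-second part) can be sketched in Python — an illustrative sketch only, not the matview itself; in ClickHouse the SELECT would do the equivalent with string/regex functions:

```python
from datetime import datetime, timezone

def parse_created_at(s: str) -> datetime:
    """Parse a Kafka JSON value like '2018-07-10T12:30:18.749Z' into an
    aware UTC datetime, truncating sub-second precision the way a
    ClickHouse DateTime column would."""
    dt = datetime.strptime(s, "%Y-%m-%dT%H:%M:%S.%fZ")
    return dt.replace(microsecond=0, tzinfo=timezone.utc)

print(parse_created_at("2018-07-10T12:30:18.749Z"))
# 2018-07-10 12:30:18+00:00
```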