Problem reading list<text> data type from Cassandra

Anna Pawlicka

Feb 7, 2014, 11:59:59 AM2/7/14

I have a table with the following column definition:

{:device_id       :uuid
 :description     :varchar
 :parent_id       :uuid
 :entity_id       :uuid
 :location        :varchar
 :metadata        :varchar
 :sensors         :list<varchar>
 :privacy         :varchar
 :meteringPointId :uuid
 :primary-key     [:device_id]}

This is the data that I insert:

{:metadata "",
 :privacy "private",
 :location "{\"longitude\":\"\",\"latitude\":\"\",\"name\":\"\"}",
 :parent_id #uuid "17e9924d-db19-4e5d-9668-b9bb188f3de0",
 :meteringPointId #uuid "953c423d-84b0-4c5e-8c6c-bae00cc81317",
 :entity_id #uuid "b354c80d-81a0-4680-bf65-01a928b8dae8",
 :description "",
 :device_id #uuid "297b7496-233d-435f-8210-5203594e580c"}

It gets inserted into Cassandra with no problems using cql/insert.
When I query this data using cqlsh, it looks like this (ignore the different values):

e8c10d58-e8dd-480d-9db0-1e863ebb0cfa | | a45fdbcb-9712-4226-ab00-ee4e9a939e41 |
{"longitude":"","latitude":"","name":""} | | e058d461-c23d-4586-80ed-d79dd86efc5e | 
189fc93a-f6a6-4d4f-95e7-62551ab11d24 | private |

I can obviously query it using cqlsh, but when I use Cassaforte (whether via raw CQL or the DSL), I get the following error:
InvalidTypeException Invalid serialized value for type list<text> (String didn't validate.)  com.datastax.driver.core.DataType.deserialize (

I'm using Cassandra 1.2.13, driver com.datastax.cassandra 1.0.5, and cassaforte 1.3.0-beta9. According to CQL3 specs:

- "Enclose ASCII text, timestamp, and inet values in single quotation marks. Enclose names of a keyspace, table, or column in double quotation marks." - this is how Cassaforte seems to be inserting the data.

- "Writing list data is accomplished with a JSON-style syntax. To write a record using INSERT, specify the entire list as a JSON array." - I pass in a vector of strings (stringified JSON objects), which seems fine.
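For comparison, a literal list<text> insert in cqlsh would look like the sketch below (table and column names match the schema above but are assumptions). One detail worth noting: inside CQL's single-quoted string literals, embedded double quotes need no escaping at all, while a literal single quote is escaped by doubling it:

```sql
INSERT INTO devices (device_id, sensors)
VALUES (297b7496-233d-435f-8210-5203594e580c,
        -- a list<varchar> literal: a JSON-style array of single-quoted strings
        ['{"longitude":"","latitude":""}',
         '{"name":"O''Brien"}']);   -- '' encodes a literal single quote
```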

Has anyone run into a similar problem and knows how it can be fixed?


Alex P

Feb 10, 2014, 4:54:46 AM2/10/14
Hi Anna,

I strongly recommend using prepared queries to avoid encoding problems.
We have tests for insert, update and select of Lists, and they all work seamlessly. 

Also, I'm not sure how you can be using the 1.0.5 DataStax driver with Cassaforte 1.3.0-beta9, since we use "2.0.0-rc2" of the driver underneath.

Try using prepared queries and vectors of strings (they implement Enumerable, so it works).

Even for non-list values I still suggest using prepared queries.
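A minimal sketch of what that could look like, based on the prepared-query form that appears later in this thread; the exact namespaces and arities may differ between Cassaforte versions, and the table/column names are assumptions:

```clojure
(require '[clojurewerkz.cassaforte.client :as client]
         '[clojurewerkz.cassaforte.cql :as cql])

;; Insert a vector of strings into the list<varchar> column via a
;; prepared statement, letting the driver serialize and escape the
;; values instead of splicing them into the CQL string by hand.
(client/prepared
 (cql/insert "devices"
             {:device_id #uuid "297b7496-233d-435f-8210-5203594e580c"
              :sensors   ["{\"type\":\"temperature\"}"
                          "{\"type\":\"humidity\"}"]}))
```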

Thank you

Anna Pawlicka

Feb 10, 2014, 6:48:02 AM2/10/14
When I use a prepared query (client/prepared (cql/insert "table" measurement)), I see this warning:

WARN  com.datastax.driver.core.Cluster - Re-preparing already prepared query INSERT INTO "measurements_test" (timestamp, error, month, type, value, device_id) VALUES (?, ?, ?, ?, ?, ?);. Please note that preparing the same query more than once is generally an anti-pattern and will likely affect performance. Consider preparing the statement only once.

Inserting also takes at least twice as long as it did without prepared queries. Is this to be expected?

I have simplified (for testing purposes) the strings that are inserted into the list field, and it works. The problem arises when the strings are stringified JSON objects containing double quotes, e.g. '{"resolution":"","period":"CUMULATIVE","correctedUnit":"M","correction":false,"accuracy":"\u0086","max":"8","correctionFactorBreakdown":"","type":"relativeHumidity","unit":"Z","min":"","correctionFactor":"\u00fb"}'. Do you know what the expected format of such strings is?
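In case it helps later readers: inside a CQL single-quoted string literal, the double quotes from stringified JSON need no escaping; only a literal single quote must be doubled. A hypothetical Clojure helper for building such literals by hand (prepared statements make this unnecessary, since the driver handles serialization itself):

```clojure
(require '[clojure.string :as str])

;; Hypothetical helper: wrap a string as a CQL single-quoted literal.
;; Only single quotes need doubling; embedded double quotes stay as-is.
(defn escape-cql-string [s]
  (str "'" (str/replace s "'" "''") "'"))

(escape-cql-string "{\"name\":\"O'Brien\"}")
;; => "'{\"name\":\"O''Brien\"}'"
```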


Alex Petrov

Feb 10, 2014, 8:11:14 AM2/10/14
to, Anna Pawlicka
This warning is fixed in beta 10. Could you check with beta 10?

Yes, I understand your problem with quoting. I suggest checking out com.datastax.driver.core.querybuilder.Utils to make sure you escape everything correctly.

You received this message because you are subscribed to the Google Groups "Cassaforte, a Clojure client for Apache Cassandra" group.