Hi All
I have a table with the following columns (nested UDT structure shown by indentation):
Table A
- id : String
- udt1 : UDT
  - start_dummy : bigint
  - subfield : UDT
    - start : bigint
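For clarity, the CQL definitions look roughly like this (the type names here are placeholders I made up; the field layout is as above):

CREATE TYPE keyspace.subfield_udt (
    start bigint
);

CREATE TYPE keyspace.udt1_type (
    start_dummy bigint,
    subfield frozen<subfield_udt>
);

CREATE TABLE keyspace.A (
    id text PRIMARY KEY,
    udt1 frozen<udt1_type>
);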
I am querying this UDT from Spark SQL with:
"select id,udt1.subfield.start from mycatalog.keyspace.A where id=.."
I want to select a specific field inside the nested UDT and analyse it in Spark. The same query works correctly in CQL and returns the appropriate field; however, in Spark SQL it fails with the error below:
20/11/17 19:30:06 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.IllegalArgumentException: The value (1605484774387) of the type (java.lang.Long) cannot be converted to struct<start:bigint>
I have attached the full log trace for reference.
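For context, this is roughly how I run the query (assuming the connector's catalog support from SCC 3.x; the catalog name matches the query above and the id value is a placeholder):

// in spark-shell: register the Cassandra catalog under the name used in the query
spark.conf.set("spark.sql.catalog.mycatalog",
  "com.datastax.spark.connector.datasource.CassandraCatalog")

val df = spark.sql(
  "select id, udt1.subfield.start from mycatalog.keyspace.A where id = 'some-id'")
df.show()   // the IllegalArgumentException above is thrown here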
However, if I query a field one level up, directly in the top-level UDT, the same kind of query works as expected:
"select id,udt1.start_dummy from mycatalog.keyspace.A where id=.."
Is there any workaround to query this in Spark the same way we can in CQL?
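The only workaround I can think of is to select the whole UDT column and extract the nested field on the Spark side, along the lines of the sketch below (untested, and I am not sure whether Spark's nested-field pruning pushes the extraction back down into the connector anyway):

import org.apache.spark.sql.functions.col

// fetch the whole UDT as a struct, then pull the nested field out in Spark
val raw = spark.sql("select id, udt1 from mycatalog.keyspace.A where id = 'some-id'")
val result = raw.select(col("id"), col("udt1.subfield.start").as("start"))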
Ideally, I think querying a nested UDT field should work the same way it does for a field in the top-level UDT, and be consistent with CQL.
Regards
Amol