cascading-jdbc: unable to update a Redshift table when a DateType column comes from the source file.


Kishor Baindoor

Jun 9, 2016, 11:07:17 AM
to cascading-user
Hi,

I am able to load data into a new Redshift table from a TextDelimited file. That worked fine, and I believe it worked in the following way.

The TextDelimited file I created has a DateType column. Cascading converts this column to a long value, and while loading, it converts it back to a Date value via tupleEntry.getCoercedTuple(). This method is called from JDBCScheme, which first checks whether the fields have types specified.
Below is the code from JDBCScheme.
if( fields.hasTypes() )
  result = tupleEntry.getCoercedTuple( fields.getTypes() );


Now I tried updating the Redshift table I created earlier. It failed with the error "Error converting data, invalid type for parameter: 1".

I am using the RedshiftScheme below while updating. This constructor has no Fields argument.
RedshiftScheme redshiftScheme =new RedshiftScheme( columnNames, null, updateByColumns );


My guess: TextDelimited converted the DateType column to a long value, and that long value is never coerced back into a Date while writing to the table. The getCoercedTuple() method mentioned above is called only when the fields have types, but the RedshiftScheme constructor available for updates has no Fields argument. As a result, the raw Long value is sent to Redshift, which in turn throws the error.
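To illustrate what I think goes wrong, here is a minimal, self-contained sketch (plain JDK, no Cascading classes; the helper name is hypothetical) of the coercion step that getCoercedTuple() performs for a typed Date field, and that is skipped when the scheme has no Fields:

```java
import java.sql.Date;

public class DateCoercionSketch
  {
  // Hypothetical helper mirroring the coercion getCoercedTuple() applies to a
  // DateType field: the long epoch value written by TextDelimited is turned
  // back into a java.sql.Date before being bound as a JDBC parameter. When
  // this step is skipped, the raw Long reaches the driver, which rejects it
  // for a DATE column ("invalid type for parameter").
  public static Date coerceToSqlDate( long epochMillis )
    {
    return new Date( epochMillis );
    }

  public static void main( String[] args )
    {
    long epochMillis = 1465430400000L; // sample epoch value from a tuple
    Date coerced = coerceToSqlDate( epochMillis );
    System.out.println( coerced.getTime() == epochMillis ); // prints true
    }
  }
```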

I'm not sure I have explained this clearly, so I have attached the code for creating a new Redshift table and for updating an existing table, along with the error log. Please check.

Code For new table.txt
Code for Updating existing Table.txt
Error Log.txt

Kishor Baindoor

Jun 13, 2016, 5:11:30 AM
to cascading-user
Adding one more constructor to RedshiftScheme that takes Fields in its arguments solved this issue.

Below is the constructor:
public RedshiftScheme( Class<? extends DBInputFormat> inputFormatClass, Class<? extends DBOutputFormat> outputFormatClass,
    Fields columnFields, String[] columns, String[] orderBy, Fields updateByFields, String[] updateBy )
  {
  super( inputFormatClass, outputFormatClass, columnFields, columns, orderBy, updateByFields, updateBy );
  }
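For reference, a call to this constructor might look like the sketch below. The field and column names are made up for illustration, and it assumes cascading-jdbc is on the classpath; the key point is that columnFields carries types, so getCoercedTuple() runs on the sink side:

```java
// Sketch only: illustrative names, not the attached code.
Fields columnFields = new Fields( "id", "event_date" )
  .applyTypes( long.class, new DateType( "yyyy-MM-dd" ) );

RedshiftScheme redshiftScheme = new RedshiftScheme(
  DBInputFormat.class, DBOutputFormat.class,
  columnFields,                          // typed Fields enables coercion
  new String[]{ "id", "event_date" },    // column names
  null,                                  // orderBy
  new Fields( "id" ),                    // updateByFields
  new String[]{ "id" } );                // updateBy columns
```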

Andre Kelpe

Jun 13, 2016, 6:16:13 AM
to cascading-user
Can you open a pull request with that? I can then integrate it.

- André



--
André Kelpe
an...@concurrentinc.com
http://concurrentinc.com

Kishor Baindoor

Jun 13, 2016, 7:31:45 AM
to cascading-user
Sure. I will do it.

Thanks
Kishor

Kishor Baindoor

Jun 13, 2016, 9:28:35 AM
to cascading-user
Hi Andre,

I am adding the required constructors to RedshiftScheme. Should I log a GitHub issue as well?

Thanks,
Kishor

Andre Kelpe

Jun 13, 2016, 9:32:31 AM
to cascading-user
No, just open a PR against the wip-3.0 branch.

- André