The table messagechunk appears to use a custom/non-standard datatype
(java.sql.Types.OTHER, constant value 1111). At this time we do not
have a provision for handling this. One workaround would be to populate
a new table, using a custom UDF to convert the column to a standard
datatype, and then extract that table with Sqoop.
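
A minimal sketch of that workaround, assuming the problem column is
named chunk and that a plain cast to TEXT is sufficient (the staging
table and column names here are illustrative, not from your schema):

  -- Stage the data so Sqoop only sees standard JDBC types.
  CREATE TABLE messagechunk_export AS
  SELECT id,
         chunk::text AS chunk  -- substitute a custom UDF if a cast is not enough
  FROM   messagechunk;

and then point the import at the staging table instead:

  sqoop import --connect "jdbc:postgresql://<host>:5432/lp_bugs" \
      --username sqoop -P --table messagechunk_export --hive-import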
Going forward, one of the areas of improvement we are looking at is
the ability to support native database facilities to the extent
possible. But that is still in the early design stages and may take
some time to land in the code.
Arvind
On 10/05/2010 01:03 PM, Torsten Spindler wrote:
> Hello,
>
> we're trying to import data from postgres via sqoop into hadoop/hive.
> Unfortunately the import fails with the error in the subject line. Any
> ideas how to tackle this problem? Here's the output from the sqoop
> command:
>
>
> hadoop@dl360-1:~$ sqoop import --direct --connect "jdbc:postgresql://<host>:5432/lp_bugs" --username sqoop --password <password> --table messagechunk --hive-import
> 10/10/05 15:50:25 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
> 10/10/05 15:50:25 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
> 10/10/05 15:50:25 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
>
> public static final int OTHER 1111
> public static final int REAL 7
> public static final int REF 2006
> public static final int ROWID -8
> public static final int SMALLINT 5
> public static final int SQLXML 2009
> public static final int STRUCT 2002
> public static final int TIME 92
> public static final int TIMESTAMP 93
> public static final int TINYINT -6
> public static final int VARBINARY -3
> public static final int VARCHAR 12