Cannot resolve SQL type 1111 error when importing postgresql table


Torsten Spindler

Oct 5, 2010, 4:03:58 PM
to Sqoop Users
Hello,

we're trying to import data from postgres via sqoop into hadoop/hive.
Unfortunately the import fails with the error in the subject line. Any
ideas how to tackle this problem? Here's the output from the sqoop
command:


hadoop@dl360-1:~$ sqoop import --direct --connect "jdbc:postgresql://<host>:5432/lp_bugs" --username sqoop --password <password> --table messagechunk --hive-import
10/10/05 15:50:25 WARN tool.BaseSqoopTool: Setting your password on the command-line is insecure. Consider using -P instead.
10/10/05 15:50:25 INFO tool.BaseSqoopTool: Using Hive-specific delimiters for output. You can override
10/10/05 15:50:25 INFO tool.BaseSqoopTool: delimiters with --fields-terminated-by, etc.
10/10/05 15:50:25 INFO tool.CodeGenTool: Beginning code generation
10/10/05 15:50:25 INFO manager.PostgresqlManager: Executing SQL statement: SELECT t.* FROM messagechunk AS t LIMIT 1
10/10/05 15:50:25 INFO manager.PostgresqlManager: Executing SQL statement: SELECT t.* FROM messagechunk AS t LIMIT 1
10/10/05 15:50:25 ERROR orm.ClassWriter: Cannot resolve SQL type 1111
10/10/05 15:50:25 ERROR orm.ClassWriter: No Java type for SQL type 1111
10/10/05 15:50:25 ERROR orm.ClassWriter: No Java type for SQL type 1111
10/10/05 15:50:25 ERROR orm.ClassWriter: No Java type for SQL type 1111
10/10/05 15:50:25 ERROR orm.ClassWriter: No Java type for SQL type 1111
10/10/05 15:50:25 ERROR orm.ClassWriter: No Java type for SQL type 1111
10/10/05 15:50:25 ERROR orm.ClassWriter: No Java type for SQL type 1111
10/10/05 15:50:25 ERROR sqoop.Sqoop: Got exception running Sqoop: java.lang.NullPointerException
java.lang.NullPointerException
    at com.cloudera.sqoop.orm.ClassWriter.parseColumn(ClassWriter.java:734)
    at com.cloudera.sqoop.orm.ClassWriter.generateParser(ClassWriter.java:806)
    at com.cloudera.sqoop.orm.ClassWriter.generateClassForColumns(ClassWriter.java:994)


The description for the messagechunk table is:

                              Table "public.messagechunk"
  Column  |   Type   |                        Modifiers
----------+----------+------------------------------------------------------------
 id       | integer  | not null default nextval('messagechunk_id_seq'::regclass)
 message  | integer  | not null
 sequence | integer  | not null
 content  | text     |
 blob     | integer  |
 fti      | tsvector |
Indexes:
    "messagechunk_pkey" PRIMARY KEY, btree (id)
    "messagechunk_message_idx" UNIQUE, btree (message, sequence)
    "messagechunk_blob_idx" btree (blob) WHERE blob IS NOT NULL
Check constraints:
    "text_or_content" CHECK (blob IS NULL AND content IS NULL OR (blob IS NULL) <> (content IS NULL))
Foreign-key constraints:
    "messagechunk_message_fk" FOREIGN KEY (message) REFERENCES message(id) ON DELETE CASCADE

Torsten

Arvind Prabhakar

Oct 5, 2010, 4:54:53 PM
to sqoop...@cloudera.org
Hello Torsten,

The table messagechunk seems to be using a custom/non-standard datatype
(java.sql.Types.OTHER, constant value 1111). At this time we do not have
a provision for handling this. One workaround would be to populate a new
table, using a custom UDF to convert the column to a standard datatype,
and then extract that table using Sqoop.
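
For example, something along these lines on the PostgreSQL side (a rough
sketch only: messagechunk_export is a placeholder name, it assumes the
fti tsvector column is the one being reported as type 1111, and a plain
cast to text may be enough here in place of a full UDF):

-- List column types as PostgreSQL reports them; the tsvector column
-- is the one the JDBC driver maps to java.sql.Types.OTHER (1111).
SELECT column_name, data_type, udt_name
  FROM information_schema.columns
 WHERE table_name = 'messagechunk';

-- Stage a copy of the table with the problematic column converted to
-- a standard type, then point Sqoop at the staging table instead.
CREATE TABLE messagechunk_export AS
SELECT id, message, sequence, content, blob,
       fti::text AS fti
  FROM messagechunk;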

Going forward, one of the areas of improvement we are looking at is the
ability to support native database facilities to the extent possible.
But that is still in the early design stages and may take some time to
land in the code.

Arvind

For reference, the relevant constants from java.sql.Types:

public static final int OTHER      1111
public static final int REAL       7
public static final int REF        2006
public static final int ROWID      -8
public static final int SMALLINT   5
public static final int SQLXML     2009
public static final int STRUCT     2002
public static final int TIME       92
public static final int TIMESTAMP  93
public static final int TINYINT    -6
public static final int VARBINARY  -3
public static final int VARCHAR    12

On 10/05/2010 01:03 PM, Torsten Spindler wrote:
> Hello,
>
> we're trying to import data from postgres via sqoop into hadoop/hive.
> Unfortunately the import fails with the error in the subject line. Any
> ideas how to tackle this problem? [...]

Torsten Spindler

Oct 5, 2010, 6:12:45 PM
to sqoop...@cloudera.org
Hello Arvind,

thanks for the quick reply! We worked around the problem by leaving the offending column out of the import. That worked, though the data in that column is of course lost.
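
For reference, excluding a column can be done with Sqoop's --columns
option; a sketch along these lines (column list taken from the table
description above, -P instead of --password per the earlier warning; I
have not checked how this interacts with --direct, so it is left out):

sqoop import --connect "jdbc:postgresql://<host>:5432/lp_bugs" \
  --username sqoop -P \
  --table messagechunk \
  --columns "id,message,sequence,content,blob" \
  --hive-import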

Torsten