CREATE TABLE and SELECT queries not working; only SHOW TABLES and DESCRIBE TABLE work.


tarun gulyani

Nov 10, 2013, 8:57:51 AM
to presto...@googlegroups.com
I have deployed Presto following the steps at "http://prestodb.io/docs/current/installation/deployment.html" and set up the command line interface using "http://prestodb.io/docs/current/installation/cli.html".

SHOW TABLES and DESCRIBE TABLE work fine for the tables I had created in Hive.


presto:default> show tables;
Table
----------------
pokes
productfind
productview
scenario
search
(5 rows)

Query 20131110_134848_00005_vwjwd, FINISHED, 1 node
Splits: 2 total, 1 done (50.00%)
0:00 [0 rows, 0B] [0 rows/s, 0B/s]


presto:default> describe pokes;
Column | Type | Null | Partition Key
--------+---------+------+---------------
foo | bigint | true | false
bar | varchar | true | false
(2 rows)

Query 20131110_134928_00007_vwjwd, FINISHED, 1 node
Splits: 2 total, 2 done (100.00%)
0:00 [2 rows, 131B] [5 rows/s, 368B/s]

But when I run a SELECT query or a CREATE TABLE query, I get an exception:

presto:default> select foo from pokes limit 5;

Query 20131110_135438_00008_vwjwd, FAILED, 1 node
Splits: 1 total, 0 done (0.00%)
0:00 [0 rows, 0B] [0 rows/s, 0B/s]

Query 20131110_135438_00008_vwjwd failed: java.io.IOException: Failed on local exception: java.io.IOException: Response is null.; Host Details : local host is: "tarun-ThinkPad-Edge-E430/127.0.1.1"; destination host is: "localhost":54310;

presto:default> CREATE TABLE pokesData (fooData INT, barData STRING);
Query 20131110_135528_00009_vwjwd failed: line 1:24: mismatched input '(' expecting AS
CREATE TABLE pokesData (fooData INT, barData STRING)


Please let me know what I am doing wrong here.

Dain Sundstrom

Nov 10, 2013, 12:40:15 PM
to presto...@googlegroups.com, tarun gulyani
We saw this problem earlier this week. It is caused when Presto tries to connect to an older version of CDH4. From what I understand, Cloudera changed the protocol layer, which causes the unhelpful "Response is null" message. We are using client version 2.0.0-cdh4.3.0. Can you upgrade to a newer version of CDH4?

-dain

tarun gulyani

Nov 12, 2013, 5:27:01 AM
to presto...@googlegroups.com, tarun gulyani
I am not using the Cloudera distribution. I am using hadoop-1.2.1 and hive-0.11.

great...@gmail.com

Nov 14, 2013, 2:56:45 AM
to presto...@googlegroups.com, tarun gulyani
Did you provide your Hadoop config files in the config or jvm config file?

I had this problem too, and it was solved by providing this information.

tarun gulyani

Nov 18, 2013, 2:05:48 PM
to presto...@googlegroups.com, tarun gulyani, great...@gmail.com
On Thursday, November 14, 2013 1:26:45 PM UTC+5:30, great...@gmail.com wrote:
> Did you provide your Hadoop config files in the config or jvm config file?
>
> I had this problem too, and it was solved by providing this information.

Hi,

Can you please tell me the syntax for adding the Hadoop config files (hdfs-site.xml, mapred-site.xml, core-site.xml) to Presto's jvm.config?

David Phillips

Nov 18, 2013, 2:45:31 PM
to presto...@googlegroups.com
On Mon, Nov 18, 2013 at 11:05 AM, <tarung...@gmail.com> wrote:
Can you please tell me the syntax for adding the Hadoop config files (hdfs-site.xml, mapred-site.xml, core-site.xml) to Presto's jvm.config?

Add the following (with the appropriate paths) to the config file for the Hive plugin (etc/catalog/hive.properties):

  hive.config.resources=/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
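Putting David's line in context, a full catalog file might look like the sketch below. The metastore URI and file paths are placeholders for illustration, not values from this thread; adjust them to your cluster.

```properties
# etc/catalog/hive.properties — example Hive catalog configuration.
# connector.name and the metastore URI below are assumptions; use the
# connector matching your Hadoop distribution and your own metastore host.
connector.name=hive-cdh4
hive.metastore.uri=thrift://localhost:9083
# Point Presto at your Hadoop config files (comma-separated, absolute paths):
hive.config.resources=/etc/hadoop/conf/core-site.xml,/etc/hadoop/conf/hdfs-site.xml
```

After editing the catalog file, the Presto server must be restarted for the change to take effect.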

tarun gulyani

Nov 18, 2013, 3:20:09 PM
to presto...@googlegroups.com, da...@acz.org
Thanks for the quick response. I am still getting the problem. I just executed a CREATE TABLE query:

presto:default> create table pokes (p STRING,q STRING);
Query 20131118_201139_00007_rwgzi failed: line 1:20: mismatched input '(' expecting AS
create table pokes (p STRING,q STRING)


Even when I write it the way the error message expects, it is not accepted.

java.m...@gmail.com

Nov 19, 2013, 12:58:39 AM
to presto...@googlegroups.com, da...@acz.org
Hi David,

I am getting the same problem as tarun, and I also tried adding the Hadoop configuration files to the Hive config file as you mentioned.

I am not using Cloudera's Hive release; I am using hive-0.10.0 and would like to keep my existing setup.

Please help me resolve this problem as soon as possible.

presto:default> show tables;
Table
----------------------------
bigtable
customer
default__test_test_index__
invout
test
(5 rows)

Query 20131119_054250_00002_8ti9x, FINISHED, 1 node
Splits: 2 total, 2 done (100.00%)
0:01 [5 rows, 137B] [9 rows/s, 249B/s]

presto:default> select *from bigtable limit 5;

Query 20131119_054308_00003_8ti9x, FAILED, 1 node
Splits: 1 total, 0 done (0.00%)
0:02 [0 rows, 0B] [0 rows/s, 0B/s]

Query 20131119_054308_00003_8ti9x failed: java.io.IOException: Failed on local exception: java.io.IOException: Response is null.; Host Details : local host is: "master1/192.168.2.30"; destination host is: "master1":9000;
presto:default>


Thanks,
Mansoor


David Phillips

Nov 20, 2013, 6:49:25 PM
to presto...@googlegroups.com
On Tue, Nov 12, 2013 at 2:27 AM, <tarung...@gmail.com> wrote:
I am not using cloudera distribution. I am using hadoop-1.2.1 and hive-0.11.

Apologies for the late reply. The current Presto release only supports CDH4, but we just landed support for Apache Hadoop 1.x, which will be available in the next release (which should be this week). For now, you can grab the latest version of the code and try it out:


If you do try it, please let us know how it works out!

David Phillips

Nov 20, 2013, 6:50:11 PM
to presto...@googlegroups.com
On Mon, Nov 18, 2013 at 9:58 PM, <java.m...@gmail.com> wrote:
I am not using Cloudera's Hive release; I am using hive-0.10.0 and would like to keep my existing setup.

What version of Hadoop are you running?

David Phillips

Nov 20, 2013, 6:54:28 PM
to presto...@googlegroups.com
On Mon, Nov 18, 2013 at 12:20 PM, <tarung...@gmail.com> wrote:
Thanks for the quick response. I am still getting the problem. I just executed a CREATE TABLE query:

presto:default> create table pokes (p STRING,q STRING);
Query 20131118_201139_00007_rwgzi failed: line 1:20: mismatched input '(' expecting AS
create table pokes (p STRING,q STRING)

Presto does not yet support CREATE TABLE, which is why you get a syntax error. You will be able to query the table in Presto after first creating it in Hive.
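The workflow David describes can be sketched as follows. The table and column names are just the ones from this thread used as examples; the DDL runs in Hive, and Presto then reads the table through the Hive connector.

```sql
-- In the Hive CLI: Presto (as of 0.52) has no CREATE TABLE support,
-- so the DDL must be issued through Hive.
CREATE TABLE pokesdata (foodata INT, bardata STRING);

-- Then, in the Presto CLI, the new table is visible and queryable:
--   presto:default> SHOW TABLES;
--   presto:default> SELECT foodata FROM pokesdata LIMIT 5;
```

Note that Hive types such as STRING are only valid in Hive's DDL, which is why Presto's parser rejects the statement with a syntax error rather than a feature-not-supported message.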

lazar...@gmail.com

Nov 21, 2013, 4:35:01 PM
to presto...@googlegroups.com, da...@acz.org

Hi David,

query: SHOW TABLES -- works fine
query: DESCRIBE books -- works fine

The following query is not working. Can you please guide me?

Hadoop - 1.2.1
Hive - 0.11.0
presto-server: 0.52

presto:default> select * from books;

Query 20131121_025845_00004_qqe25, FAILED, 1 node
Splits: 1 total, 0 done (0.00%)
0:00 [0 rows, 0B] [0 rows/s, 0B/s]

Query 20131121_025845_00004_qqe25 failed: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
presto:default>

exception:

java.lang.RuntimeException: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
at com.google.common.base.Throwables.propagate(Throwables.java:160) ~[guava-15.0.jar:na]
at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:433) ~[na:na]
at com.facebook.presto.hive.HiveSplitIterable$HiveSplitQueue.computeNext(HiveSplitIterable.java:392) ~[na:na]
at com.google.common.collect.AbstractIterator.tryToComputeNext(AbstractIterator.java:143) ~[guava-15.0.jar:na]
at com.google.common.collect.AbstractIterator.hasNext(AbstractIterator.java:138) ~[guava-15.0.jar:na]
at com.facebook.presto.execution.SqlStageExecution.startTasks(SqlStageExecution.java:463) [presto-main-0.52.jar:0.52]
at com.facebook.presto.execution.SqlStageExecution.access$300(SqlStageExecution.java:80) [presto-main-0.52.jar:0.52]
at com.facebook.presto.execution.SqlStageExecution$5.run(SqlStageExecution.java:435) [presto-main-0.52.jar:0.52]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45]
at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45]
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145) [na:1.7.0_45]
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615) [na:1.7.0_45]
at java.lang.Thread.run(Thread.java:744) [na:1.7.0_45]
Caused by: java.io.IOException: Failed on local exception: java.io.IOException: Broken pipe; Host Details : local host is: "ubuntu/192.168.56.101"; destination host is: "localhost":54310;
at org.apache.hadoop.net.NetUtils.wrapException(NetUtils.java:763) ~[na:na]
at org.apache.hadoop.ipc.Client.call(Client.java:1229) ~[na:na]
at org.apache.hadoop.ipc.ProtobufRpcEngine$Invoker.invoke(ProtobufRpcEngine.java:202) ~[na:na]
at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[na:1.7.0_45]
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57) ~[na:1.7.0_45]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) ~[na:1.7.0_45]
at java.lang.reflect.Method.invoke(Method.java:606) ~[na:1.7.0_45]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:164) ~[na:na]
at org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:83) ~[na:na]
at com.sun.proxy.$Proxy155.getListing(Unknown Source) ~[na:na]
at org.apache.hadoop.hdfs.protocolPB.ClientNamenodeProtocolTranslatorPB.getListing(ClientNamenodeProtocolTranslatorPB.java:441) ~[na:na]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1526) ~[na:na]
at org.apache.hadoop.hdfs.DFSClient.listPaths(DFSClient.java:1509) ~[na:na]
at org.apache.hadoop.hdfs.DistributedFileSystem.listStatus(DistributedFileSystem.java:406) ~[na:na]
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1462) ~[na:na]
at org.apache.hadoop.fs.FileSystem.listStatus(FileSystem.java:1502) ~[na:na]
at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na]
at com.facebook.presto.hive.ForwardingFileSystem.listStatus(ForwardingFileSystem.java:298) ~[na:na]
at com.facebook.presto.hive.FileSystemWrapper$3.listStatus(FileSystemWrapper.java:146) ~[na:na]
at org.apache.hadoop.fs.FileSystem$4.<init>(FileSystem.java:1778) ~[na:na]
at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1777) ~[na:na]
at org.apache.hadoop.fs.FileSystem.listLocatedStatus(FileSystem.java:1760) ~[na:na]
at com.facebook.presto.hive.util.AsyncRecursiveWalker$1.run(AsyncRecursiveWalker.java:58) ~[na:na]
at com.facebook.presto.hive.util.SuspendingExecutor$1.run(SuspendingExecutor.java:67) ~[na:na]
at com.facebook.presto.hive.util.BoundedExecutor.executeOrMerge(BoundedExecutor.java:82) ~[na:na]
at com.facebook.presto.hive.util.BoundedExecutor.access$000(BoundedExecutor.java:41) ~[na:na]
at com.facebook.presto.hive.util.BoundedExecutor$1.run(BoundedExecutor.java:53) ~[na:na]
... 3 common frames omitted
Caused by: java.io.IOException: Broken pipe
at sun.nio.ch.FileDispatcherImpl.write0(Native Method) ~[na:1.7.0_45]
at sun.nio.ch.SocketDispatcher.write(SocketDispatcher.java:47) ~[na:1.7.0_45]
at sun.nio.ch.IOUtil.writeFromNativeBuffer(IOUtil.java:93) ~[na:1.7.0_45]
at sun.nio.ch.IOUtil.write(IOUtil.java:65) ~[na:1.7.0_45]
at sun.nio.ch.SocketChannelImpl.write(SocketChannelImpl.java:487) ~[na:1.7.0_45]
at org.apache.hadoop.net.SocketOutputStream$Writer.performIO(SocketOutputStream.java:62) ~[na:na]
at org.apache.hadoop.net.SocketIOWithTimeout.doIO(SocketIOWithTimeout.java:143) ~[na:na]
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:153) ~[na:na]
at org.apache.hadoop.net.SocketOutputStream.write(SocketOutputStream.java:114) ~[na:na]
at java.io.BufferedOutputStream.flushBuffer(BufferedOutputStream.java:82) ~[na:1.7.0_45]
at java.io.BufferedOutputStream.flush(BufferedOutputStream.java:140) ~[na:1.7.0_45]
at java.io.DataOutputStream.flush(DataOutputStream.java:123) ~[na:1.7.0_45]
at org.apache.hadoop.ipc.Client$Connection$3.run(Client.java:897) ~[na:na]
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471) [na:1.7.0_45]
at java.util.concurrent.FutureTask.run(FutureTask.java:262) [na:1.7.0_45]
... 3 common frames omitted
2013-11-20T21:58:45.915-0500 DEBUG task-notification-1 com.facebook.presto.execution.TaskStateMachine Task 20131121_025845_00004_qqe25.0.0 is CANCELED

yug...@aliyun.com

Jan 6, 2014, 3:19:26 AM
to presto...@googlegroups.com, tarun gulyani, great...@gmail.com
How do I do that? I need to do this on our Hadoop cluster.
The Hadoop version is Apache 1.0.3 and the Hive version is 0.11.0.

Pravesh Jain

Jan 7, 2014, 5:06:00 AM
to presto...@googlegroups.com
Is Presto 0.56 compatible with hadoop-1.2.1? I am still getting the error
Query 20140107_095634_00006_4n8mx failed: java.io.IOException: Failed on local exception: java.io.IOException: Response is null.; Host Details : local host is: "IMPETUS-DSRV02.IMPETUS.CO.IN/192.168.145.183"; destination host is: "IMPETUS-DSRV02.IMPETUS.CO.IN":9000;

while trying to run a SELECT command. I am using hadoop-1.2.1 and hive-0.9.0.

Thanks in advance.

Pravesh Jain

Jan 7, 2014, 7:40:07 AM
to presto...@googlegroups.com
Error solved. Presto is now working with Hadoop 1.x.

The variable connector.name (in presto-server-0.56/etc/catalog/hive.properties) was set to hive-cdh4. I changed it to hive-hadoop1 and now it's working.
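Pravesh's fix amounts to a one-line change in the catalog file; a minimal sketch, assuming the default file location he names:

```properties
# presto-server-0.56/etc/catalog/hive.properties
# hive-cdh4 is only for CDH4 clusters; for Apache Hadoop 1.x (e.g. hadoop-1.2.1)
# use the hive-hadoop1 connector instead.
connector.name=hive-hadoop1
```

Restart the Presto server after changing the connector so the correct plugin is loaded.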

Lalit singh

Jan 23, 2014, 5:45:42 AM
to presto...@googlegroups.com
Hi ,

I am also getting the same issue. Please find the details below:

  • using CDH 4.4.0.
  • SHOW TABLES / DESCRIBE TABLE works fine for tables which I have already created in Hive.
  • Query log error:
    • presto:default> select * from hive_cdr_metadata1;
    • Query 20140123_142940_00010_ydsad failed: MetaException(message:org.apache.hadoop.hive.serde2.SerDeException SerDe org.apache.hadoop.hive.hbase.HBaseSerDe does not exist)
I have already added json-serde-1.1.7-jar-with-dependencies.jar to the ../presto-server-0.57/plugin/hive-cdh4/ directory, but I am still getting the same issue.
Please help!


Thanks,
Lalit

Pravesh Jain

Jan 23, 2014, 5:51:31 AM
to presto...@googlegroups.com
Hi Lalit,

Being a beginner myself, I don't really know what might be causing your issue, but please check once whether you have set the connector name in presto-server-0.56/etc/catalog/hive.properties correctly.

Regards
