About Hive problem

John Elder

Sep 17, 2013, 3:52:54 AM
to chenn...@googlegroups.com
I run Hadoop on Ubuntu and now I want to run Hive, but when I run the command show tables; it shows:

Failed with exception java.io.IOException:java.io.IOException: Cannot create an instance of InputFormat class org.apache.hadoop.mapred.TextInputFormat as specified in mapredWork!

I have searched online and tried a few things, but it still doesn't work.

Could anybody help me solve this problem?
Thanks

Ashwanth Kumar

Sep 17, 2013, 4:41:34 AM
to chenn...@googlegroups.com
What versions of Hive / Hadoop are you using?


--

Ashwanth Kumar / ashwanthkumar.in

swapnil joshi

Sep 17, 2013, 4:52:57 AM
to chenn...@googlegroups.com
Hi,

I think you forgot to include this in your core-site.xml:

<property>
  <name>io.compression.codecs</name>
  <value>org.apache.hadoop.io.compress.GzipCodec,org.apache.hadoop.io.compress.DefaultCodec,org.apache.hadoop.io.compress.BZip2Codec,com.hadoop.compression.lzo.LzoCodec,com.hadoop.compression.lzo.LzopCodec,org.apache.hadoop.io.compress.SnappyCodec</value>
</property>
<property>
  <name>io.compression.codec.lzo.class</name>
  <value>com.hadoop.compression.lzo.LzoCodec</value>
</property>

Regards,
Swapnil K. Joshi

johnsste...@gmail.com

Sep 17, 2013, 9:23:22 AM
to chenn...@googlegroups.com, johnsste...@gmail.com
Yeah, is this file supposed to be in the Hadoop conf dir?
I think I have it there.

--
@2013-09-17 13:23

johnsste...@gmail.com

Sep 17, 2013, 9:24:22 AM
to chenn...@googlegroups.com, johnsste...@gmail.com
I am using Hadoop 1.0.0
Hive is 0.9

--
@2013-09-17 13:24

John Elder

Sep 18, 2013, 12:57:38 AM
to chenn...@googlegroups.com
How do I include the core-site.xml?

Ashwanth Kumar

Sep 18, 2013, 1:08:06 AM
to chenn...@googlegroups.com
As long as you have the HADOOP_HOME env variable set, Hive should pick up the *-site.xml files automatically.

Quick question: is the file you are querying compressed with a specific codec? Make sure you have the configuration above, and the required JARs to access them in the lib/ of HADOOP_HOME or HIVE_HOME.
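A quick way to look for those jars and natives (just a sketch; `check_codecs` is a hypothetical helper, and the lib/ layout assumes a standard tarball install):

```shell
# Hypothetical helper: look for codec jars and native compression
# libraries under a Hadoop (or Hive) install directory.
check_codecs() {
  dir="$1"
  found=0
  # codec jars normally live under lib/ in a tarball install
  for jar in "$dir"/lib/*lzo*.jar "$dir"/lib/*snappy*.jar; do
    if [ -e "$jar" ]; then
      echo "jar: $jar"
      found=1
    fi
  done
  # native compression libs (e.g. libgplcompression.so for LZO)
  for so in "$dir"/lib/native/*/libgplcompression.so*; do
    if [ -e "$so" ]; then
      echo "native: $so"
      found=1
    fi
  done
  [ "$found" -eq 1 ] || echo "no codec jars/natives under $dir"
}

check_codecs "$HADOOP_HOME"
```

If this prints nothing but "no codec jars/natives", the codecs named in core-site.xml cannot be loaded, which matches the InputFormat instantiation error above.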

johnsste...@gmail.com

Sep 18, 2013, 1:18:54 AM
to chenn...@googlegroups.com
I think I have set the HADOOP_HOME env variable; I can access it by using $ cd $HADOOP_HOME.

My Hive and Hadoop directories have the same owner, so I guess Hive can access the lib directory.

I will check the configuration you mentioned.


--
@2013-09-18 05:18

John Elder

Sep 18, 2013, 4:08:10 AM
to chenn...@googlegroups.com
I still get this problem. Could you help me solve it?
Thanks

Ashwanth Kumar

Sep 18, 2013, 4:34:35 AM
to chenn...@googlegroups.com
So, is the input table compressed using a codec?
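One rough way to tell is to list the table's data files on HDFS (e.g. hadoop fs -ls on the table's warehouse directory) and look at the file extensions; a sketch of mapping extension to codec (the suffixes are the conventional ones each codec writes, an assumption):

```shell
# Sketch: guess the compression codec from a data file's extension.
# These are just the conventional suffixes; uncompressed text files
# usually have no extension at all (e.g. part-00000).
guess_codec() {
  case "$1" in
    *.gz)      echo "gzip"   ;;
    *.bz2)     echo "bzip2"  ;;
    *.lzo)     echo "lzo"    ;;
    *.snappy)  echo "snappy" ;;
    *.deflate) echo "deflate" ;;
    *)         echo "none/unknown" ;;
  esac
}

guess_codec part-00000.lzo
```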

John Elder

Sep 18, 2013, 4:37:04 AM
to chenn...@googlegroups.com
I don't know whether I do that. How do I check it?

=====================================
1. I start Hadoop using ./start-all.sh
2. hive
3. > show tables;
then it shows the errors

Ashwanth Kumar

Sep 18, 2013, 4:53:28 AM
to chenn...@googlegroups.com
Not sure if you have already tried these before. Just in case you haven't:

I am assuming you have not configured Hive to use an external RDBMS like MySQL for the metastore. If that's the case, go to the location from which you start $ hive. Check for a folder called metastore_db (it contains all the table definitions) and delete it (this deletes all the metadata for Hive, so you are left with no tables -- logically, not physically, on HDFS).

- Now try $ show tables; it should list nothing. If you still get the exception, something is wrong with your Hive / Hadoop installation.
- Try re-installing Hive. If that doesn't work either, then:

To check if your Hadoop installation is okay:
- Run one of the example programs on your Hadoop cluster, say something like TestDFS, Terasort, etc.
- If they come up with some weird ClassNotFoundException, try the steps below.
- You said earlier that you have those compression codecs specified in your core-site.xml. Check if you have the actual jars for them.
      -- For LZO, check your HADOOP_HOME/lib for a *lzo*.jar file.
      -- I haven't used Snappy, but I guess the process is the same.
      -- Also check the HADOOP_HOME/lib/native/Linux-<arch>/*.so* files for the respective compression natives. For LZO it is typically "libgplcompression.so".
- If you don't have any of the jars / *.so files, remove those entries from core-site.xml.
- If it still doesn't work, check for other changes you have made in your config, and revert to default settings if that's an option.
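The metastore reset described above can be sketched like this (assuming the default embedded Derby metastore; metastore_db is created in whichever directory you launched hive from, so the directory argument here is whatever that location is):

```shell
# Sketch: reset an embedded Derby metastore. This deletes Hive's
# table metadata only -- the data files on HDFS are untouched.
reset_metastore() {
  dir="$1"   # the directory you launched `hive` from
  if [ -d "$dir/metastore_db" ]; then
    rm -rf "$dir/metastore_db"
    echo "metastore_db removed"
  else
    echo "no metastore_db in $dir"
  fi
}

reset_metastore "$PWD"
```

After this, restarting hive recreates an empty metastore_db, so show tables; should list nothing.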

John Elder

Sep 18, 2013, 5:28:32 AM
to chenn...@googlegroups.com
I did check the configurations you told me about.

This is the result:
1. The Hadoop example runs successfully.
2. I don't find the *lzo*.jar.
3. For HADOOP_HOME/lib/native/Linux-<arch>/*.so*: I am using ARM, not i386 or amd64.

Ashwanth Kumar

Sep 18, 2013, 5:32:44 AM
to chenn...@googlegroups.com
So after deleting metastore_db and reverting the configuration (since you don't have any native libs for compression), does Hive show no tables when you do "show tables"?

John Elder

Sep 18, 2013, 5:37:39 AM
to chenn...@googlegroups.com
After deleting metastore_db and removing the compression entries from my core-site.xml, then restarting Hadoop,
I run > show tables; and it shows:

> show tables;
OK
records
records1
Time taken: 45.817 seconds

Maybe it works now. Thanks a lot.