
When I ran a query on the Hive console in debug mode, I got the error listed below. I'm using Hive 1.2.1 and Spark 1.5.1; I checked the hive-exec jar, and it does contain the class definition org/apache/hive/spark/client/Job.

Caused by: java.lang.NoClassDefFoundError: org/apache/hive/spark/client/Job
    at java.lang.ClassLoader.defineClass1(Native Method)
    at java.lang.ClassLoader.defineClass(ClassLoader.java:792)
    at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
    at java.net.URLClassLoader.defineClass(URLClassLoader.java:449)
    at java.net.URLClassLoader.access$100(URLClassLoader.java:71)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:361)
    at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
    at java.security.AccessController.doPrivileged(Native Method)
    at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
    at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:411)
    at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
    at java.lang.Class.forName0(Native Method)
    at java.lang.Class.forName(Class.java:270)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readName(DefaultClassResolver.java:136)
    at org.apache.hive.com.esotericsoftware.kryo.util.DefaultClassResolver.readClass(DefaultClassResolver.java:115)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClass(Kryo.java:656)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.ObjectField.read(ObjectField.java:99)
    at org.apache.hive.com.esotericsoftware.kryo.serializers.FieldSerializer.read(FieldSerializer.java:507)
    at org.apache.hive.com.esotericsoftware.kryo.Kryo.readClassAndObject(Kryo.java:776)
    at org.apache.hive.spark.client.rpc.KryoMessageCodec.decode(KryoMessageCodec.java:96)
    at io.netty.handler.codec.ByteToMessageCodec$1.decode(ByteToMessageCodec.java:42)
    at io.netty.handler.codec.ByteToMessageDecoder.callDecode(ByteToMessageDecoder.java:327)
    ... 15 more

And finally the query fails with:

ERROR spark.SparkTask: Failed to execute spark task, with exception 'java.lang.IllegalStateException(RPC channel is closed.)'

How can I resolve this issue?

Comments:

  • Do you have the HiveContext on the Spark cluster? Commented Oct 20, 2015 at 11:40
  • @eliasah, I have the HiveContext on Spark. The query works fine in spark-sql but not with Hive on Spark. Commented Oct 20, 2015 at 12:15
  • This issue was solved by moving to Spark 1.3.0 and rebuilding it without Hive. Commented Oct 27, 2015 at 12:37
  • Actually I still have the problem, and I have Spark 1.6.0 with Hive 1.2.1. Commented Mar 21, 2016 at 12:29

1 Answer

In hive-1.2.1's pom.xml, spark.version is set to 1.3.1.

So the easy fix is to download a spark-1.3.1-bin-hadoop build from spark.apache.org.

Then add its path to hive-site.xml like:

<property>
  <name>spark.home</name>
  <value>/path/spark-1.3.1-bin-hadoop2.4</value>
</property>
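As a quick sanity check (the path below is illustrative, taken from the property above), you can verify that the directory you point spark.home at really is the 1.3.1 build Hive expects, before restarting the Hive session:

```shell
# Illustrative paths/values — adjust to your installation.
SPARK_HOME=/path/spark-1.3.1-bin-hadoop2.4   # value of spark.home in hive-site.xml
EXPECTED=1.3.1                               # spark.version from hive-1.2.1's pom.xml

# Warn if the configured spark.home doesn't look like the expected Spark version.
case "$SPARK_HOME" in
  *"$EXPECTED"*) echo "spark.home looks compatible with Spark $EXPECTED" ;;
  *)             echo "WARNING: spark.home does not mention Spark $EXPECTED" ;;
esac
```

This only inspects the directory name, so it is a heuristic; the authoritative check is the version the Spark build itself reports.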
