
sbt package runs just fine, but after spark-submit I get the error:

Exception in thread "main" java.lang.NoSuchMethodError: org.apache.spark.SparkContext$.rddToPairRDDFunctions(Lorg/apache/spark/rdd/RDD;Lscala/reflect/ClassTag;Lscala/reflect/ClassTag;Lscala/math/Ordering;)Lorg/apache/spark/rdd/PairRDDFunctions;
    at SmokeStack$.main(SmokeStack.scala:46)
    at SmokeStack.main(SmokeStack.scala)
    at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.lang.reflect.Method.invoke(Method.java:498)
    at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:736)
    at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:185)
    at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:210)
    at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:124)
    at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)

Here is the offending line:

val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()

rowData is an RDD of Map[String, String], and the "Signature" key exists in every map.
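
For context, countByKey lives in PairRDDFunctions and is reached through the implicit conversion the stack trace names (rddToPairRDDFunctions). A minimal standalone sketch of the same pattern, with hypothetical sample data and setup:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.SparkContext._

object CountByKeySketch {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("CountByKeySketch"))

    // Hypothetical stand-in for rowData: an RDD of Map[String, String]
    val rowData = sc.parallelize(Seq(
      Map("Signature" -> "sigA", "Host" -> "h1"),
      Map("Signature" -> "sigB", "Host" -> "h2"),
      Map("Signature" -> "sigA", "Host" -> "h3")
    ))

    // countByKey is only available on an RDD[(K, V)] through the implicit
    // conversion rddToPairRDDFunctions, which the NoSuchMethodError names.
    val sigCounts = rowData.map(row => (row("Signature"), 1)).countByKey()
    sigCounts.foreach(println)

    sc.stop()
  }
}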

I suspect this may be a build issue. Below is my sbt file:

name := "Example1"
version := "0.1"
scalaVersion := "2.11.8"

libraryDependencies += "org.apache.spark" %% "spark-core" % "1.2.0"
scalacOptions ++= Seq("-feature")

I'm new to Scala, so maybe the imports are not correct? I have:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import scala.io.Source
  • Your compile and runtime versions of Spark do not match. Commented Nov 2, 2016 at 20:19
  • That's a fairly old Spark version; try a newer one. And as mentioned above, your runtime and compile-time versions of Spark do not match. Commented Nov 2, 2016 at 20:22
  • @maasg I changed compile and runtime Spark versions to match and everything works. Can you put this as the answer so I can accept it? Commented Nov 2, 2016 at 20:45

3 Answers


java.lang.NoSuchMethodError is often an indication that the version of a library the code was compiled against differs from (typically is newer than) the version loaded at runtime.

With Spark, that means that the Spark version used to compile is different from the one deployed (on the machine or cluster).

Aligning the versions between development and runtime should solve this issue.
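
For example, if the cluster runs Spark 2.2.0 (a hypothetical version here; substitute whatever spark-submit --version reports on your machine or cluster), the build.sbt from the question could be aligned like this:

name := "Example1"
version := "0.1"
scalaVersion := "2.11.8"

// Match the Spark version deployed where the job runs. Marking it
// "provided" lets spark-submit supply the Spark jars at runtime, so
// a second, possibly mismatched copy is never bundled with the app.
libraryDependencies += "org.apache.spark" %% "spark-core" % "2.2.0" % "provided"
scalacOptions ++= Seq("-feature")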



I was facing the same problem while reading a simple one-line JSON file into a DataFrame and showing it with the .show() method; I would get this error on the myDF.show() line of code.

For me it turned out to be the wrong version of the spark-sql library in the build, i.e. the External Libraries pulled in by SBT held a spark-sql version that did not match my Spark runtime.

Adding the following line to my build.sbt resolved the issue:

libraryDependencies += "org.apache.spark" %% "spark-sql" % "2.2.0"



If you update the version of one Spark dependency, it is safest to update all of them to the same version, for example as sketched below.
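
A common way to enforce that in build.sbt (versions here are hypothetical; use the ones matching your cluster) is a single shared version value:

// One shared version string keeps every Spark module in lockstep.
val sparkVersion = "2.2.0"

libraryDependencies ++= Seq(
  "org.apache.spark" %% "spark-core" % sparkVersion,
  "org.apache.spark" %% "spark-sql"  % sparkVersion
)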

