I am writing a Scala program that uses SQLContext, built with sbt. This is my build.sbt:
name := "sampleScalaProject"
version := "1.0"
scalaVersion := "2.11.7"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"
And this is the test program:
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext
object SqlContextSparkScala {
  def main(args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }
}
I am getting the error below:
Error:(8, 26) overloaded method constructor SQLContext with alternatives:
(sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext <and>
(sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
cannot be applied to (org.apache.spark.SparkContext.type)
val sqlcontexttest = new SQLContext(sc)
Can anybody please tell me what the issue is? I am very new to Scala and Spark programming.
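For reference, I suspect the problem is with the line val sc = SparkContext, since that seems to refer to the SparkContext companion object rather than an actual instance. A rough sketch of what I think the instantiation might need to look like is below (the app name and the local master URL are just placeholders I made up), but I am not sure this is right:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {
  def main(args: Array[String]) {
    // Build a SparkConf first; the app name and master URL are placeholders
    val conf = new SparkConf().setAppName("sampleScalaProject").setMaster("local[*]")
    // Create a real SparkContext instance from the configuration
    val sc = new SparkContext(conf)
    // SQLContext is then constructed from the SparkContext instance
    val sqlcontext = new SQLContext(sc)
  }
}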