
I am creating a Scala program that uses SQLContext, built with sbt. This is my build.sbt:

name := "sampleScalaProject"

version := "1.0"

scalaVersion := "2.11.7"
//libraryDependencies += "org.apache.spark" %% "spark-core" % "2.5.2"
libraryDependencies += "org.apache.spark" % "spark-core_2.11" % "1.5.2"
libraryDependencies += "org.apache.kafka" % "kafka_2.11" % "0.8.2.2"
libraryDependencies += "org.apache.spark" % "spark-streaming_2.11" % "1.5.2"
libraryDependencies += "org.apache.spark" % "spark-sql_2.11" % "1.5.2"
libraryDependencies += "org.apache.hadoop" % "hadoop-common" % "2.6.0"  

And this is the test program:

import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {

  def main (args: Array[String]) {
    val sc = SparkContext
    val sqlcontext = new SQLContext(sc)
  }
} 

I am getting the error below:

Error:(8, 26) overloaded method constructor SQLContext with alternatives:
  (sparkContext: org.apache.spark.api.java.JavaSparkContext)org.apache.spark.sql.SQLContext <and>
  (sparkContext: org.apache.spark.SparkContext)org.apache.spark.sql.SQLContext
 cannot be applied to (org.apache.spark.SparkContext.type)
    val sqlcontexttest = new SQLContext(sc)  

Can anybody please tell me what the issue is? I am very new to Scala and Spark programming.

5 Answers


For newer versions of Spark (2.0+), use SparkSession:

val spark = SparkSession.builder.getOrCreate()

SparkSession can do everything SQLContext can do, but if needed the SQLContext can still be accessed as follows:

val sqlContext = spark.sqlContext
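
For example, a minimal self-contained sketch for Spark 2.x (the appName and master values here are only illustrative):

import org.apache.spark.sql.SparkSession

object SqlContextSparkScala {
  def main(args: Array[String]): Unit = {
    // Build (or reuse) a SparkSession; replaces SQLContext in Spark 2.0+
    val spark = SparkSession.builder
      .appName("sampleScalaProject")
      .master("local[*]")
      .getOrCreate()

    // The legacy SQLContext is still reachable for older APIs
    val sqlContext = spark.sqlContext

    spark.stop()
  }
}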


You need to create your SparkContext with new: the error shows you are passing the SparkContext companion object (SparkContext.type) instead of an instance, and that should solve it.
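
A minimal sketch of the fix for Spark 1.5.x, assuming a local master and an illustrative app name:

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object SqlContextSparkScala {
  def main(args: Array[String]): Unit = {
    // Construct a SparkContext instance instead of referencing the companion object
    val conf = new SparkConf().setAppName("sampleScalaProject").setMaster("local[*]")
    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)
  }
}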



We can simply create a SQLContext in the Scala shell (spark-shell), where sc is already defined:

scala> val sqlContext = new org.apache.spark.sql.SQLContext(sc);    


import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

val conf = new SparkConf().setAppName("SparkJoins").setMaster("local")
val sc = new SparkContext(conf)
val sqlContext = new SQLContext(sc)



If you are using the Scala shell (spark-shell), then use the statement below:

val sqlContext = spark.sqlContext

And to read Parquet files, use the statement below:

val df = sqlContext.read.parquet("/path/to/folder/containing/parquet/files/")
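
Once loaded (the folder path above is just a placeholder), the resulting DataFrame can be inspected as usual, for example:

df.printSchema()  // print the schema read from the Parquet footer
df.show(10)       // preview the first 10 rows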

