How to create a Spark session in Scala Spark and PySpark

A SparkSession is the entry point to all Spark functionality. Here is how to create one in both Scala Spark and PySpark.



Scala Spark
import org.apache.spark.sql.SparkSession

// Build a new SparkSession, or reuse an existing one, running locally
val spark = SparkSession.builder
       .master("local")
       .appName("My Application")
       .getOrCreate()


PySpark
from pyspark.sql import SparkSession

# Build a new SparkSession, or reuse an existing one, running locally
spark = SparkSession.builder \
       .master("local") \
       .appName("My Application") \
       .getOrCreate()
We can now use the spark variable to work with DataFrames and perform all Spark actions.
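
As a minimal sketch of what that looks like in PySpark (the sample rows and column names below are made up for illustration):

from pyspark.sql import SparkSession

spark = SparkSession.builder \
       .master("local") \
       .appName("My Application") \
       .getOrCreate()

# Create a small DataFrame from in-memory data (hypothetical sample rows)
df = spark.createDataFrame(
    [("Alice", 30), ("Bob", 25)],
    ["name", "age"],
)

# A transformation (filter) followed by an action (show)
df.filter(df.age > 26).show()

# Release resources when done
spark.stop()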

