Deprecated API


Contents
Deprecated Methods
org.apache.spark.streaming.StreamingContext.awaitTermination(long)
          As of 1.3.0, replaced by awaitTerminationOrTimeout(Long). 
org.apache.spark.streaming.api.java.JavaStreamingContext.awaitTermination(long)
          As of 1.3.0, replaced by awaitTerminationOrTimeout(Long). 
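The migration for both entries above is mechanical. A minimal sketch, assuming an already-configured StreamingContext (timeout value is illustrative):

```scala
import org.apache.spark.streaming.StreamingContext

def waitFor(ssc: StreamingContext): Unit = {
  // Old (deprecated since 1.3.0):
  //   ssc.awaitTermination(10000L)

  // New: awaitTerminationOrTimeout returns true if the context stopped,
  // false if the timeout elapsed first.
  val stopped: Boolean = ssc.awaitTerminationOrTimeout(10000L)
  if (!stopped) {
    ssc.stop(stopSparkContext = true, stopGracefully = true)
  }
}
```

The new method's boolean return value lets callers distinguish a clean shutdown from a timeout, which the old void-returning overload could not.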
org.apache.spark.api.java.StorageLevels.create(boolean, boolean, boolean, int)
           
org.apache.spark.sql.DataFrame.createJDBCTable(String, String, boolean)
          As of 1.4.0, replaced by write().jdbc(). 

org.apache.spark.api.java.JavaSparkContext.defaultMinSplits()
          As of Spark 1.0.0, defaultMinSplits is deprecated, use JavaSparkContext.defaultMinPartitions() instead 
org.apache.spark.streaming.api.java.JavaDStreamLike.foreach(Function)
          As of release 0.9.0, replaced by foreachRDD 
org.apache.spark.streaming.dstream.DStream.foreach(Function1&lt;RDD&lt;T&gt;, BoxedUnit&gt;)
          As of 0.9.0, replaced by foreachRDD. 
org.apache.spark.streaming.dstream.DStream.foreach(Function2&lt;RDD&lt;T&gt;, Time, BoxedUnit&gt;)
          As of 0.9.0, replaced by foreachRDD. 
org.apache.spark.streaming.api.java.JavaDStreamLike.foreach(Function2)
          As of release 0.9.0, replaced by foreachRDD 
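The foreach variants above were renamed, not redesigned: foreachRDD takes the same function and has identical semantics. A minimal sketch, assuming a DStream[String] named lines from an existing StreamingContext:

```scala
import org.apache.spark.streaming.dstream.DStream

def printCounts(lines: DStream[String]): Unit = {
  // Old (deprecated since 0.9.0):
  //   lines.foreach(rdd => println(rdd.count()))

  // New: same argument, same behavior, under the non-deprecated name.
  lines.foreachRDD(rdd => println(rdd.count()))
}
```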
org.apache.spark.sql.types.DataType.fromCaseClassString(String)
          As of 1.2.0, replaced by DataType.fromJson() 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(String, Configuration, JavaStreamingContextFactory)
          As of 1.4.0, replaced by getOrCreate without JavaStreamingContextFactory. 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(String, Configuration, JavaStreamingContextFactory, boolean)
          As of 1.4.0, replaced by getOrCreate without JavaStreamingContextFactory. 
org.apache.spark.streaming.api.java.JavaStreamingContext.getOrCreate(String, JavaStreamingContextFactory)
          As of 1.4.0, replaced by getOrCreate without JavaStreamingContextFactory. 
org.apache.spark.sql.DataFrame.insertInto(String)
          As of 1.4.0, replaced by write().mode(SaveMode.Append).saveAsTable(tableName). 
org.apache.spark.sql.DataFrame.insertInto(String, boolean)
          As of 1.4.0, replaced by write().mode(SaveMode.Append|SaveMode.Overwrite).saveAsTable(tableName). 
org.apache.spark.sql.DataFrame.insertIntoJDBC(String, String, boolean)
          As of 1.4.0, replaced by write().jdbc(). 
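Both createJDBCTable and insertIntoJDBC fold into the 1.4.0 DataFrameWriter API, where the boolean overwrite/allowExisting flags become an explicit SaveMode. A sketch, assuming a DataFrame in scope; the JDBC URL and table name are illustrative:

```scala
import java.util.Properties

import org.apache.spark.sql.{DataFrame, SaveMode}

def saveToJdbc(df: DataFrame): Unit = {
  val props = new Properties()  // driver, user, password, etc. as needed

  // Old (deprecated since 1.4.0):
  //   df.createJDBCTable(url, "people", false)
  //   df.insertIntoJDBC(url, "people", true)

  // New: the boolean flags become a SaveMode on the writer.
  df.write
    .mode(SaveMode.Overwrite)
    .jdbc("jdbc:postgresql://host/db", "people", props)
}
```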
org.apache.spark.sql.SQLContext.jdbc(String, String)
          As of 1.4.0, replaced by read().jdbc(). 
org.apache.spark.sql.SQLContext.jdbc(String, String, String[])
          As of 1.4.0, replaced by read().jdbc(). 
org.apache.spark.sql.SQLContext.jdbc(String, String, String, long, long, int)
          As of 1.4.0, replaced by read().jdbc(). 
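The three jdbc overloads above move to the DataFrameReader. A sketch, assuming an SQLContext; the URL and table name are illustrative:

```scala
import java.util.Properties

import org.apache.spark.sql.{DataFrame, SQLContext}

def loadFromJdbc(sqlContext: SQLContext): DataFrame = {
  // Old (deprecated since 1.4.0):
  //   sqlContext.jdbc("jdbc:postgresql://host/db", "people")

  // New: read() returns a DataFrameReader; connection properties are explicit.
  sqlContext.read.jdbc("jdbc:postgresql://host/db", "people", new Properties())
}
```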
org.apache.spark.sql.SQLContext.jsonFile(String)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonFile(String, double)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonFile(String, StructType)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonRDD(JavaRDD)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonRDD(JavaRDD, double)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonRDD(JavaRDD, StructType)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonRDD(RDD)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonRDD(RDD, double)
          As of 1.4.0, replaced by read().json(). 
org.apache.spark.sql.SQLContext.jsonRDD(RDD, StructType)
          As of 1.4.0, replaced by read().json(). 
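All of the jsonFile and jsonRDD overloads above converge on a single read().json() entry point. A sketch, assuming an SQLContext and an RDD of JSON strings; the path is illustrative:

```scala
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{DataFrame, SQLContext}

def loadJson(sqlContext: SQLContext, jsonLines: RDD[String]): (DataFrame, DataFrame) = {
  // Old (deprecated since 1.4.0):
  //   sqlContext.jsonFile("events.json")
  //   sqlContext.jsonRDD(jsonLines)

  // New: read().json() accepts either a path or an RDD[String].
  val fromFile = sqlContext.read.json("events.json")
  val fromRdd  = sqlContext.read.json(jsonLines)
  (fromFile, fromRdd)
}
```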
org.apache.spark.sql.SQLContext.load(String)
          As of 1.4.0, replaced by read().load(path). 
org.apache.spark.sql.SQLContext.load(String, Map)
          As of 1.4.0, replaced by read().format(source).options(options).load(). 
org.apache.spark.sql.SQLContext.load(String, Map)
          As of 1.4.0, replaced by read().format(source).options(options).load(). 
org.apache.spark.sql.SQLContext.load(String, String)
          As of 1.4.0, replaced by read().format(source).load(path). 
org.apache.spark.sql.SQLContext.load(String, StructType, Map)
          As of 1.4.0, replaced by read().format(source).schema(schema).options(options).load(). 
org.apache.spark.sql.SQLContext.load(String, StructType, Map)
          As of 1.4.0, replaced by read().format(source).schema(schema).options(options).load(). 
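The load overloads above become a builder chain on the DataFrameReader, so source, schema, and options each get their own call. A sketch, assuming an SQLContext and a StructType; the source name, option, and path are illustrative:

```scala
import org.apache.spark.sql.{DataFrame, SQLContext}
import org.apache.spark.sql.types.StructType

def loadCsv(sqlContext: SQLContext, schema: StructType): DataFrame = {
  // Old (deprecated since 1.4.0):
  //   sqlContext.load("com.databricks.spark.csv", schema, options)

  // New: one builder step per former positional argument.
  sqlContext.read
    .format("com.databricks.spark.csv")
    .schema(schema)
    .option("header", "true")
    .load("people.csv")
}
```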
org.apache.spark.mllib.util.MLUtils.loadLabeledData(SparkContext, String)
          Should use RDD.saveAsTextFile(java.lang.String) for saving and MLUtils.loadLabeledPoints(org.apache.spark.SparkContext, java.lang.String, int) for loading. 
org.apache.spark.streaming.StreamingContext.networkStream(Receiver, ClassTag)
          As of 1.0.0, replaced by receiverStream. 
org.apache.spark.sql.SQLContext.parquetFile(String...)
          As of 1.4.0, replaced by read().parquet(). 
org.apache.spark.streaming.api.java.JavaDStreamLike.reduceByWindow(Function2, Duration, Duration)
          As of 1.3.0, because this API is not Java compatible; use the Java-compatible version of reduceByWindow instead. 
org.apache.spark.sql.DataFrame.save(String)
          As of 1.4.0, replaced by write().save(path). 
org.apache.spark.sql.DataFrame.save(String, SaveMode)
          As of 1.4.0, replaced by write().mode(mode).save(path). 
org.apache.spark.sql.DataFrame.save(String, SaveMode, Map)
          As of 1.4.0, replaced by write().format(source).mode(mode).options(options).save(path). 
org.apache.spark.sql.DataFrame.save(String, SaveMode, Map)
          As of 1.4.0, replaced by write().format(source).mode(mode).options(options).save(path). 
org.apache.spark.sql.DataFrame.save(String, String)
          As of 1.4.0, replaced by write().format(source).save(path). 
org.apache.spark.sql.DataFrame.save(String, String, SaveMode)
          As of 1.4.0, replaced by write().format(source).mode(mode).save(path). 
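As with load, the save overloads above map one-for-one onto DataFrameWriter builder calls. A sketch, assuming a DataFrame in scope; the output path is illustrative:

```scala
import org.apache.spark.sql.{DataFrame, SaveMode}

def saveDf(df: DataFrame): Unit = {
  // Old (deprecated since 1.4.0):
  //   df.save("out.parquet", SaveMode.Overwrite)

  // New: mode and path become separate builder steps.
  df.write.mode(SaveMode.Overwrite).save("out.parquet")
}
```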
org.apache.spark.sql.DataFrame.saveAsParquetFile(String)
          As of 1.4.0, replaced by write().parquet(). 
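saveAsParquetFile and its reader counterpart parquetFile (listed earlier) pair up as write().parquet() and read().parquet(). A sketch of the round trip, assuming an SQLContext and a DataFrame; the path is illustrative:

```scala
import org.apache.spark.sql.{DataFrame, SQLContext}

def parquetRoundTrip(sqlContext: SQLContext, df: DataFrame): DataFrame = {
  // Old (deprecated since 1.4.0):
  //   df.saveAsParquetFile("data.parquet")
  //   sqlContext.parquetFile("data.parquet")

  // New: symmetric reader/writer methods.
  df.write.parquet("data.parquet")
  sqlContext.read.parquet("data.parquet")
}
```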
org.apache.spark.sql.DataFrame.saveAsTable(String)
          As of 1.4.0, replaced by write().saveAsTable(tableName). 
org.apache.spark.sql.DataFrame.saveAsTable(String, SaveMode)
          As of 1.4.0, replaced by write().mode(mode).saveAsTable(tableName). 
org.apache.spark.sql.DataFrame.saveAsTable(String, String)
          As of 1.4.0, replaced by write().format(source).saveAsTable(tableName). 
org.apache.spark.sql.DataFrame.saveAsTable(String, String, SaveMode)
          As of 1.4.0, replaced by write().mode(mode).saveAsTable(tableName). 
org.apache.spark.sql.DataFrame.saveAsTable(String, String, SaveMode, Map)
          As of 1.4.0, replaced by write().format(source).mode(mode).options(options).saveAsTable(tableName). 
org.apache.spark.sql.DataFrame.saveAsTable(String, String, SaveMode, Map)
          As of 1.4.0, replaced by write().format(source).mode(mode).options(options).saveAsTable(tableName). 
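The insertInto and saveAsTable families above collapse onto the writer as well; as the insertInto notes indicate, appending maps to SaveMode.Append. A sketch, assuming a DataFrame; the table name is illustrative:

```scala
import org.apache.spark.sql.{DataFrame, SaveMode}

def appendToTable(df: DataFrame): Unit = {
  // Old (deprecated since 1.4.0):
  //   df.insertInto("events")
  //   df.saveAsTable("events", SaveMode.Append)

  // New: insertInto becomes Append mode on saveAsTable.
  df.write.mode(SaveMode.Append).saveAsTable("events")
}
```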
org.apache.spark.mllib.util.MLUtils.saveLabeledData(RDD, String)
          Should use RDD.saveAsTextFile(java.lang.String) for saving and MLUtils.loadLabeledPoints(org.apache.spark.SparkContext, java.lang.String, int) for loading. 
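Together with the loadLabeledData entry earlier, the replacement splits into a plain-text save plus loadLabeledPoints. A sketch, assuming a SparkContext and an RDD of LabeledPoints; the path is illustrative:

```scala
import org.apache.spark.SparkContext
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.mllib.util.MLUtils
import org.apache.spark.rdd.RDD

def roundTrip(sc: SparkContext, points: RDD[LabeledPoint]): RDD[LabeledPoint] = {
  // Old (deprecated):
  //   MLUtils.saveLabeledData(points, "labeled")
  //   MLUtils.loadLabeledData(sc, "labeled")

  // New: save via toString-based text files, load via loadLabeledPoints,
  // which parses that same text format back into LabeledPoints.
  points.saveAsTextFile("labeled")
  MLUtils.loadLabeledPoints(sc, "labeled")
}
```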
org.apache.spark.streaming.api.java.JavaStreamingContext.sc()
          As of 0.9.0, replaced by sparkContext 
org.apache.spark.mllib.optimization.LBFGS.setMaxNumIterations(int)
          use LBFGS.setNumIterations(int) instead 
org.apache.spark.api.java.JavaRDDLike.toArray()
          As of Spark 1.0.0, toArray() is deprecated, use JavaRDDLike.collect() instead 
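toArray() and collect() are interchangeable here: both materialize the whole RDD on the driver as a java.util.List. A minimal sketch, assuming a JavaRDD in scope:

```scala
import org.apache.spark.api.java.JavaRDD

def collectAll[T](rdd: JavaRDD[T]): java.util.List[T] = {
  // Old (deprecated since 1.0.0): rdd.toArray()
  // New: same result under the non-deprecated name.
  rdd.collect()
}
```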
org.apache.spark.streaming.StreamingContext.toPairDStreamFunctions(DStream&lt;Tuple2&lt;K, V&gt;&gt;, ClassTag, ClassTag, Ordering)
          As of 1.3.0, replaced by implicit functions in the DStream companion object. This is kept here only for backward compatibility. 
org.apache.spark.sql.DataFrame.toSchemaRDD()
          As of 1.3.0, replaced by toDF(). 
org.apache.spark.mllib.rdd.RDDFunctions.treeAggregate(U, Function2, Function2, int, ClassTag)
          Use RDD.treeAggregate(U, scala.Function2, scala.Function2, int, scala.reflect.ClassTag) instead. 
org.apache.spark.mllib.rdd.RDDFunctions.treeReduce(Function2, int)
          Use RDD.treeReduce(scala.Function2, int) instead.