Packages

  • package root
    Definition Classes
    root
  • package org
    Definition Classes
    root
  • package apache
    Definition Classes
    org
  • package spark

    Core Spark functionality. org.apache.spark.SparkContext serves as the main entry point to Spark, while org.apache.spark.rdd.RDD is the data type representing a distributed collection and provides most parallel operations.

    In addition, org.apache.spark.rdd.PairRDDFunctions contains operations available only on RDDs of key-value pairs, such as groupByKey and join; org.apache.spark.rdd.DoubleRDDFunctions contains operations available only on RDDs of Doubles; and org.apache.spark.rdd.SequenceFileRDDFunctions contains operations available on RDDs that can be saved as SequenceFiles. These operations are automatically available on any RDD of the right type (e.g. RDD[(Int, Int)]) through implicit conversions; see the sketches following this package list.

    Java programmers should reference the org.apache.spark.api.java package for Spark programming APIs in Java.

    Classes and methods marked with Experimental are user-facing features which have not been officially adopted by the Spark project. These are subject to change or removal in minor releases.

    Classes and methods marked with Developer API are intended for advanced users who want to extend Spark through lower-level interfaces. These are subject to change or removal in minor releases.

    Definition Classes
    apache
  • package sql

    Allows the execution of relational queries, including those expressed in SQL, using Spark; see the sketches following this package list.

    Definition Classes
    spark
  • package catalyst
    Definition Classes
    sql
  • package trees
    Definition Classes
    catalyst
  • package util
    Definition Classes
    catalyst
  • LogStringContext
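
  As a quick illustration of the core entry points described under package spark above, here is a minimal sketch, assuming a local master; the application name and sample data are illustrative:

      import org.apache.spark.{SparkConf, SparkContext}

      val sc = new SparkContext(new SparkConf().setMaster("local[*]").setAppName("rdd-example"))

      // An RDD of key-value pairs: operations from PairRDDFunctions, such as
      // groupByKey and join, become available through implicit conversions.
      val pairs = sc.parallelize(Seq((1, "a"), (2, "b"), (1, "c")))
      val grouped = pairs.groupByKey()   // RDD[(Int, Iterable[String])]
      grouped.collect().foreach(println)

      sc.stop()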
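
  Similarly, a minimal sketch of running a SQL query through the package sql entry point; the SparkSession settings and the temporary view name "people" are illustrative:

      import org.apache.spark.sql.SparkSession

      val spark = SparkSession.builder().master("local[*]").appName("sql-example").getOrCreate()
      spark.range(5).createOrReplaceTempView("people")
      spark.sql("SELECT id FROM people WHERE id > 2").show()
      spark.stop()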

package util

Linear Supertypes
Logging, AnyRef, Any

Type Members

  1. implicit class LogStringContext extends AnyRef
    Definition Classes
    Logging

Value Members

  1. val AUTO_GENERATED_ALIAS: String
  2. val INTERNAL_METADATA_KEYS: Seq[String]
  3. val METADATA_COL_ATTR_KEY: String
  4. val QUALIFIED_ACCESS_ONLY: String
  5. def escapeSingleQuotedString(str: String): String
  6. def fileToString(file: File, encoding: Charset): String
  7. def quietly[A](f: => A): A
  8. def quoteIdentifier(name: String): String
  9. def quoteIfNeeded(part: String): String
  10. def quoteNameParts(name: Seq[String]): String
  11. def removeInternalMetadata(schema: StructType): StructType
  12. def resourceToBytes(resource: String, classLoader: ClassLoader): Array[Byte]
  13. def resourceToString(resource: String, encoding: String, classLoader: ClassLoader): String
  14. def sideBySide(left: Seq[String], right: Seq[String]): Seq[String]
  15. def sideBySide(left: String, right: String): Seq[String]
  16. def stackTraceToString(t: Throwable): String
  17. def stringToFile(file: File, str: String): File
  18. def toPrettySQL(e: Expression): String
  19. def truncatedString[T](seq: Seq[T], sep: String, maxFields: Int): String
  20. def truncatedString[T](seq: Seq[T], start: String, sep: String, end: String, maxFields: Int): String
  21. def usePrettyExpression(e: Expression): Expression
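
  For orientation, a minimal sketch using a few of the utilities listed above; the inputs are illustrative, and the exact output formats (quoting style, truncation marker, padding) are not spelled out on this page:

      import org.apache.spark.sql.catalyst.util._

      // quoteIfNeeded: quote an identifier part only when quoting is required.
      println(quoteIfNeeded("my column"))

      // truncatedString: render at most maxFields elements of seq, separated by sep.
      println(truncatedString(Seq(1, 2, 3, 4, 5), ", ", 3))

      // sideBySide: align two multi-line strings line by line for comparison.
      sideBySide("line1\nline2", "line1\nlineX").foreach(println)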