write.jdbc {SparkR}    R Documentation

Saves the content of the SparkDataFrame to an external database table via JDBC

Description

Save the content of the SparkDataFrame to an external database table via JDBC. Additional JDBC database connection properties can be set (...)

Usage

write.jdbc(x, url, tableName, mode = "error", ...)

## S4 method for signature 'SparkDataFrame,character,character'
write.jdbc(x, url, tableName, mode = "error", ...)

Arguments

x

A SparkDataFrame

url

JDBC database url of the form 'jdbc:subprotocol:subname'

tableName

The name of the table in the external database

mode

One of 'append', 'overwrite', 'error', or 'ignore'; the default save mode is 'error'

...

Additional JDBC database connection properties, passed as named arguments (for example, user and password)

Details

mode specifies the behavior of the save operation when data already exists in the data source. There are four modes (see the sketch after this list):
append: Contents of this SparkDataFrame are appended to the existing data.
overwrite: Existing data is overwritten by the contents of this SparkDataFrame.
error: An exception is thrown if data already exists.
ignore: The save operation does not save the contents of the SparkDataFrame and does not change the existing data; the write is silently skipped.
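
For example, a minimal sketch of the append mode and the default error behavior; the table name "tablename" is hypothetical, and df, jdbcUrl, and the credentials are the same placeholders used in the Examples below:

## Not run: 
# The first write creates "tablename"; repeating it with the default
# mode = "error" would fail because the table then already exists.
write.jdbc(df, jdbcUrl, "tablename", user = "username", password = "password")
# mode = "append" adds the rows of df to the existing table instead.
write.jdbc(df, jdbcUrl, "tablename", mode = "append",
           user = "username", password = "password")
## End(Not run)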

See Also

Other SparkDataFrame functions: SparkDataFrame-class, [[, agg, arrange, as.data.frame, attach, cache, collect, colnames, coltypes, columns, count, dapply, describe, dim, distinct, dropDuplicates, dropna, drop, dtypes, except, explain, filter, first, group_by, head, histogram, insertInto, intersect, isLocal, join, limit, merge, mutate, ncol, persist, printSchema, registerTempTable, rename, repartition, sample, saveAsTable, selectExpr, select, showDF, show, str, take, unionAll, unpersist, withColumn, write.df, write.json, write.parquet, write.text

Examples

## Not run: 
sc <- sparkR.init()
sqlContext <- sparkRSQL.init(sc)
jdbcUrl <- "jdbc:mysql://localhost:3306/databasename"
write.jdbc(df, jdbcUrl, "table", user = "username", password = "password")
## End(Not run)
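
As a companion sketch, the table written above can be read back with read.jdbc, assuming the same placeholder URL and credentials (the optional partitioning arguments are omitted):

## Not run: 
df2 <- read.jdbc(jdbcUrl, "table", user = "username", password = "password")
head(df2)
## End(Not run)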

[Package SparkR version 2.0.0]