write.jdbc {SparkR} | R Documentation

Save the content of SparkDataFrame to an external database table via JDBC.

Description

Save the content of the SparkDataFrame to an external database table via JDBC. Additional JDBC database connection properties can be set (...)


Usage

write.jdbc(x, url, tableName, mode = "error", ...)

## S4 method for signature 'SparkDataFrame,character,character'
write.jdbc(x, url, tableName,
  mode = "error", ...)



Arguments

x
    a SparkDataFrame.

url
    JDBC database url of the form jdbc:subprotocol:subname.

tableName
    the name of the table in the external database.

mode
    one of 'append', 'overwrite', 'error', 'errorifexists', 'ignore' save mode (it is 'error' by default).

...
    additional JDBC database connection properties.


Details

Also, mode is used to specify the behavior of the save operation when data already exists in the data source. There are four modes:

'append': Contents of this SparkDataFrame are expected to be appended to existing data.

'overwrite': Existing data is expected to be overwritten by the contents of this SparkDataFrame.

'error' or 'errorifexists': An exception is expected to be thrown.

'ignore': The save operation is expected to not save the contents of the SparkDataFrame and to not change the existing data.
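As a sketch of how a save mode combines with extra connection properties, assuming a running Spark session, a PostgreSQL JDBC driver on the Spark classpath, and placeholder url, table name, and credentials (none of these values come from this page):

```r
# Assumed placeholders: database url, table name, and credentials.
jdbcUrl <- "jdbc:postgresql://localhost:5432/mydb"

# Append rows to an existing table; fails if the schemas are incompatible.
write.jdbc(df, jdbcUrl, "events", mode = "append",
           user = "username", password = "password")

# Replace the table's contents entirely; the driver class is passed as an
# additional connection property via '...'.
write.jdbc(df, jdbcUrl, "events", mode = "overwrite",
           user = "username", password = "password",
           driver = "org.postgresql.Driver")
```

With the default mode = "error", the same call would instead throw an exception if the table already exists.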


Note

write.jdbc since 2.0.0

See Also

Other SparkDataFrame functions: SparkDataFrame-class, agg, alias, arrange, as.data.frame, attach,SparkDataFrame-method, broadcast, cache, checkpoint, coalesce, collect, colnames, coltypes, createOrReplaceTempView, crossJoin, cube, dapplyCollect, dapply, describe, dim, distinct, dropDuplicates, dropna, drop, dtypes, except, explain, filter, first, gapplyCollect, gapply, getNumPartitions, group_by, head, hint, histogram, insertInto, intersect, isLocal, isStreaming, join, limit, localCheckpoint, merge, mutate, ncol, nrow, persist, printSchema, randomSplit, rbind, registerTempTable, rename, repartition, rollup, sample, saveAsTable, schema, selectExpr, select, showDF, show, storageLevel, str, subset, summary, take, toJSON, unionByName, union, unpersist, withColumn, withWatermark, with, write.df, write.json, write.orc, write.parquet, write.stream, write.text


Examples

## Not run: 
sparkR.session()
jdbcUrl <- "jdbc:mysql://localhost:3306/databasename"
write.jdbc(df, jdbcUrl, "table", user = "username", password = "password")
## End(Not run)

[Package SparkR version 2.3.1 Index]