explain {SparkR}	R Documentation

Explain

Description

Print the logical and physical Catalyst plans to the console for debugging.


Usage

explain(x, ...)

## S4 method for signature 'SparkDataFrame'
explain(x, extended = FALSE)

## S4 method for signature 'StreamingQuery'
explain(x, extended = FALSE)



Arguments

x

a SparkDataFrame or a StreamingQuery.

...

further arguments to be passed to or from other methods.

extended

Logical. If extended is FALSE, prints only the physical plan; if TRUE, prints both the logical and physical plans.


Note

explain since 1.4.0

explain(StreamingQuery) since 2.2.0

See Also

Other SparkDataFrame functions: SparkDataFrame-class, agg, alias, arrange, as.data.frame, attach,SparkDataFrame-method, broadcast, cache, checkpoint, coalesce, collect, colnames, coltypes, createOrReplaceTempView, crossJoin, cube, dapplyCollect, dapply, describe, dim, distinct, dropDuplicates, dropna, drop, dtypes, exceptAll, except, filter, first, gapplyCollect, gapply, getNumPartitions, group_by, head, hint, histogram, insertInto, intersectAll, intersect, isLocal, isStreaming, join, limit, localCheckpoint, merge, mutate, ncol, nrow, persist, printSchema, randomSplit, rbind, rename, repartitionByRange, repartition, rollup, sample, saveAsTable, schema, selectExpr, select, showDF, show, storageLevel, str, subset, summary, take, toJSON, unionByName, union, unpersist, withColumn, withWatermark, with, write.df, write.jdbc, write.json, write.orc, write.parquet, write.stream, write.text

Other StreamingQuery methods: awaitTermination, isActive, lastProgress, queryName, status, stopQuery


Examples

## Not run: 
sparkR.session()
path <- "path/to/file.json"
df <- read.json(path)
explain(df, TRUE)
## End(Not run)

## Not run: explain(sq)
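For the StreamingQuery method, sq must be an active streaming query, and explain reports the plans of the most recently executed micro-batch. A minimal sketch of obtaining such a query (the source directory path is hypothetical, and a console sink is used for illustration):

## Not run: 
sparkR.session()
# Hypothetical JSON source directory; schema must be supplied for file streams.
schema <- structType(structField("value", "string"))
streamDf <- read.stream("json", path = "path/to/streaming/source", schema = schema)
sq <- write.stream(streamDf, "console")
explain(sq, extended = TRUE)  # prints plans once a micro-batch has run
stopQuery(sq)
## End(Not run)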

[Package SparkR version 2.4.0 Index]