  1. Apache Spark™ - Unified Engine for large-scale data analytics

    Apache Spark is a multi-language engine for executing data engineering, data science, and machine learning on single-node machines or clusters.

  2. PySpark Overview — PySpark 4.0.1 documentation - Apache Spark

    Spark Connect is a client-server architecture within Apache Spark that enables remote connectivity to Spark clusters from any application. PySpark provides the client for the Spark …

  3. Spark Declarative Pipelines Programming Guide

    Spark Declarative Pipelines (SDP) is a declarative framework for building reliable, maintainable, and testable data pipelines on Spark. SDP simplifies ETL development by allowing you to …

  4. Spark Release 3.5.7 - Apache Spark

Sep 24, 2025 · Spark 3.5.7 is the seventh maintenance release containing security and correctness fixes. This release is based on the branch-3.5 maintenance branch of …

  5. Overview - Spark 3.5.7 Documentation

    If you’d like to build Spark from source, visit Building Spark. Spark runs on both Windows and UNIX-like systems (e.g. Linux, Mac OS), and it should run on any platform that runs a …

  6. XML Files - Spark 4.0.0 Documentation - Apache Spark

    Spark SQL provides spark.read().xml("file_1_path","file_2_path") to read a file or directory of files in XML format into a Spark DataFrame, and dataframe.write().xml("path") to write to an XML file.

  7. Getting Started — PySpark 4.0.1 documentation - Apache Spark

    There are more guides shared with other languages such as Quick Start in Programming Guides at the Spark documentation. There are live notebooks where you can try PySpark out without …

  8. Spark Streaming - Spark 4.0.1 Documentation

    Spark Streaming is an extension of the core Spark API that enables scalable, high-throughput, fault-tolerant stream processing of live data streams. Data can be ingested from many sources …

  9. DECLARE VARIABLE - Spark 4.0.1 Documentation

    The DECLARE VARIABLE statement is used to create a temporary variable in Spark. Temporary variables are scoped at a session level. You can …

  10. Spark Release 3.5.5 - Apache Spark

    Dependency changes: While being a maintenance release, we did still upgrade some dependencies in this release; they are: [SPARK-50886]: Upgrade Avro to 1.11.4. You can …