  1. Is there a way to use parameters in Databricks in SQL with parameter ...

    Sep 29, 2024 · EDIT: I got a message from a Databricks employee that currently (DBR 15.4 LTS) the parameter marker syntax is not supported in this scenario. It might work in future versions. …

  2. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer: since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret …
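
    A minimal sketch of that approach, assuming the databricks-sdk package is installed and using placeholder scope/key names:

    ```python
    # Hedged sketch: fetch a secret's raw value outside Databricks with the
    # Databricks Python SDK. The scope and key names below are placeholders.
    import base64

    from databricks.sdk import WorkspaceClient

    w = WorkspaceClient()  # reads host/token from env vars or ~/.databrickscfg

    # The API returns the value base64-encoded; decode it to get the raw bytes.
    resp = w.secrets.get_secret(scope="my-scope", key="my-key")
    secret_bytes = base64.b64decode(resp.value)
    print(secret_bytes.decode("utf-8"))
    ```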

  3. Databricks: managed tables vs. external tables - Stack Overflow

    Jun 21, 2024 · While Databricks manages the metadata for external tables, the actual data remains in the specified external location, providing flexibility and control over the data storage lifecycle. This …
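
    A short sketch contrasting the two from PySpark; the catalog, schema, and storage path are placeholders:

    ```python
    # Hedged sketch: managed vs. external table DDL issued through spark.sql.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # already defined in a Databricks notebook

    # Managed table: Databricks owns both the metadata and the underlying files.
    spark.sql("CREATE TABLE IF NOT EXISTS main.demo.managed_tbl (id INT, name STRING)")

    # External table: metadata lives in the metastore, but the data stays at the
    # LOCATION you supply; dropping the table leaves those files in place.
    spark.sql("""
        CREATE TABLE IF NOT EXISTS main.demo.external_tbl (id INT, name STRING)
        LOCATION 'abfss://container@account.dfs.core.windows.net/tables/external_tbl'
    """)
    ```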

  4. Databricks Permissions Required to Create a Cluster

    Nov 9, 2023 · In Azure Databricks, if you want to create a cluster, you need to have the "Can Manage" permission. This permission basically lets you handle everything related to clusters, like making new …

  5. Databricks shared access mode limitations - Stack Overflow

    Oct 2, 2023 · You're correct about the listed limitations. But when you're using Unity Catalog, especially with shared clusters, you need to think a bit differently than before. UC + shared clusters provide very …

  6. Databricks CREATE VIEW equivalent in PySpark - Stack Overflow

    Jun 24, 2023 · Can someone let me know what the equivalent of the following CREATE VIEW in Databricks SQL is in PySpark? CREATE OR REPLACE VIEW myview as select …
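
    A minimal sketch of the usual PySpark equivalents, with placeholder table and view names:

    ```python
    # Hedged sketch: two common counterparts of CREATE OR REPLACE VIEW in PySpark.
    from pyspark.sql import SparkSession

    spark = SparkSession.builder.getOrCreate()  # provided automatically in a notebook

    # Option 1: run the same DDL through spark.sql for a persistent view.
    spark.sql("CREATE OR REPLACE VIEW myview AS SELECT id, name FROM my_table")

    # Option 2: register a DataFrame as a session-scoped temporary view.
    df = spark.table("my_table").select("id", "name")
    df.createOrReplaceTempView("myview_tmp")
    ```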

  7. How to Pass Dynamic Parameters (e.g., Current Date) in Databricks ...

    Oct 17, 2024 · I'm setting up a job in the Databricks Workflow UI and I want to pass a parameter value dynamically, like the current date (run_date), each time the job runs. In Azure Data Factory, I can …
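
    One hedged way to handle this on the notebook side: read a run_date job parameter and fall back to today's date when it isn't supplied (dbutils is provided by the notebook runtime; the widget name is an assumption):

    ```python
    # Hedged sketch: a notebook task that accepts an optional run_date parameter.
    from datetime import date

    dbutils.widgets.text("run_date", "")  # declare the parameter the job can pass
    run_date = dbutils.widgets.get("run_date") or date.today().isoformat()

    print(f"Processing data for {run_date}")
    ```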

  8. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark …
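
    If the table is served through a SQL warehouse, a common pattern is to query it from the application with the databricks-sql-connector package rather than raw REST calls; the hostname, HTTP path, token, and table name below are placeholders:

    ```python
    # Hedged sketch: querying a Databricks table from an external application.
    from databricks import sql

    with sql.connect(
        server_hostname="adb-1234567890123456.7.azuredatabricks.net",
        http_path="/sql/1.0/warehouses/abc123",
        access_token="dapi...",
    ) as conn:
        with conn.cursor() as cursor:
            cursor.execute("SELECT * FROM gold.orders LIMIT 10")
            for row in cursor.fetchall():
                print(row)
    ```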

  9. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method 3: Using a third-party tool named DBFS Explorer. DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and …

  10. Databricks - handle spaces and - in column name - Stack Overflow

    Feb 14, 2024 · Asked 1 year, 10 months ago · Modified 1 year, 10 months ago · Viewed 1k times
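
    A minimal sketch of the usual workarounds in PySpark: quote such names with backticks, or rename them once up front (the column names are placeholders):

    ```python
    # Hedged sketch: working with column names that contain spaces or dashes.
    from pyspark.sql import SparkSession
    from pyspark.sql.functions import col

    spark = SparkSession.builder.getOrCreate()
    df = spark.createDataFrame([(1, "a")], ["order id", "customer-name"])

    # Backtick-quote the name wherever an expression string is expected.
    df.select(col("`order id`"), col("`customer-name`")).show()

    # Or rename once and use plain identifiers from then on.
    clean = (df.withColumnRenamed("order id", "order_id")
               .withColumnRenamed("customer-name", "customer_name"))
    clean.show()
    ```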