  1. Printing secret value in Databricks - Stack Overflow

Nov 11, 2021 · First, install the Databricks Python SDK and configure authentication per the docs: pip install databricks-sdk. Then you can use the approach below to print out secret …
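
    A minimal sketch of the SDK-based route the answer hints at, assuming the Secrets API returns the value base64-encoded; the scope and key names are placeholders:

        import base64
        from databricks.sdk import WorkspaceClient

        w = WorkspaceClient()  # picks up authentication from the environment or ~/.databrickscfg

        # Fetch the secret via the REST API (assumption: the value comes back base64-encoded)
        resp = w.secrets.get_secret(scope="my-scope", key="my-key")
        print(base64.b64decode(resp.value).decode("utf-8"))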

  2. Is there a way to use parameters in Databricks in SQL with …

Sep 29, 2024 · EDIT: I got a message from a Databricks employee that currently (DBR 15.4 LTS) the parameter marker syntax is not supported in this scenario. It might work in the future …
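
    For contexts where parameter markers are supported, a hedged example of binding named parameters from PySpark (Spark 3.4+ / recent DBR); the table and column names are invented:

        # Named parameter marker :start_date bound via the args dict of spark.sql
        df = spark.sql(
            "SELECT * FROM main.demo.orders WHERE order_date >= :start_date",
            args={"start_date": "2024-01-01"},
        )
        df.show()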

  3. Databricks: managed tables vs. external tables - Stack Overflow

Jun 21, 2024 · The decision to use a managed table or an external table depends on your use case and also the existing setup of your Delta Lake, framework code and workflows. Your …
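
    A short illustration of the practical difference, with placeholder catalog, schema, and storage path names:

        # Managed table: Databricks controls the storage location; DROP TABLE also removes the data files.
        spark.sql("CREATE TABLE main.demo.managed_tbl (id INT, name STRING)")

        # External table: data lives at a path you manage; DROP TABLE leaves the files in place.
        spark.sql("""
            CREATE TABLE main.demo.external_tbl (id INT, name STRING)
            LOCATION 'abfss://container@mystorageaccount.dfs.core.windows.net/demo/external_tbl'
        """)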

  4. how to get databricks job id at the run time - Stack Overflow

Jun 9, 2025 · I am trying to get the job ID and run ID of a Databricks job dynamically and store them in a table with the code below: run_id = self.spark.conf.get("spark.databricks.job.runId", "no_ru...
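
    A sketch of that approach, assuming the spark.databricks.job.* confs referenced in the question are set on job clusters; the audit table name is a placeholder:

        # Fall back to defaults when the notebook runs interactively rather than as a job
        job_id = spark.conf.get("spark.databricks.job.id", "no_job_id")
        run_id = spark.conf.get("spark.databricks.job.runId", "no_run_id")

        spark.createDataFrame([(job_id, run_id)], ["job_id", "run_id"]) \
            .write.mode("append").saveAsTable("main.demo.job_run_audit")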

  5. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath …
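
    The commonly cited Python-side equivalent of that %scala snippet, assuming dbutils is available in the notebook session:

        # Reach the notebook context through dbutils' entry point and read the path
        path = dbutils.notebook.entry_point.getDbutils().notebook().getContext().notebookPath().get()
        print(path)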

  6. Converting SQL stored procedure into a Databricks Notebook: …

Dec 5, 2023 · I'm trying to convert a SQL stored procedure into a Databricks notebook. One stored procedure has multiple IF statements combined with BEGIN/END statements. Based …
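
    A rough sketch of how an IF ... BEGIN ... END block can map onto Python control flow driving spark.sql; the tables and the condition are invented for illustration:

        # IF <condition> BEGIN ... END  ->  plain Python if/else around spark.sql calls
        row_count = spark.sql("SELECT COUNT(*) AS n FROM main.demo.staging").first()["n"]

        if row_count > 0:
            # the body of the BEGIN ... END block becomes ordinary indented statements
            spark.sql("INSERT INTO main.demo.target SELECT * FROM main.demo.staging")
        else:
            print("Nothing to load")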

  7. How to Define Reusable Job Cluster Configurations in Databricks …

Aug 13, 2025 · Problem: I’m trying to create reusable job cluster configurations in Databricks Asset Bundles (DAB) that can be referenced across multiple jobs defined in separate YAML files. I …

  8. Databricks CREATE VIEW equivalent in PySpark - Stack Overflow

    Jun 24, 2023 · Can someone let me know what the equivalent of the following CREATE VIEW in Databricks SQL is in PySpark? CREATE OR REPLACE VIEW myview as select …
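
    A hedged PySpark sketch; the source table, columns, and view names are placeholders:

        df = spark.table("main.demo.some_table").select("col_a", "col_b")

        # Session-scoped temporary view, the closest DataFrame-API analogue for ad-hoc use
        df.createOrReplaceTempView("myview")

        # Or keep the view persistent in the metastore by issuing the SQL from Python
        spark.sql("CREATE OR REPLACE VIEW main.demo.myview AS SELECT col_a, col_b FROM main.demo.some_table")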

  9. databricks - How to get the cluster's JDBC/ODBC parameters ...

Feb 11, 2021 · Databricks documentation shows how to get the cluster's hostname, port, HTTP path, and JDBC URL parameters from the JDBC/ODBC tab in the UI. See image: (source: …
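
    A sketch of assembling those parameters from Spark confs instead of the UI; the conf keys and the HTTP path format are assumptions drawn from common usage:

        host = spark.conf.get("spark.databricks.workspaceUrl")
        cluster_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterId")
        org_id = spark.conf.get("spark.databricks.clusterUsageTags.clusterOwnerOrgId")

        http_path = f"sql/protocolv1/o/{org_id}/{cluster_id}"
        print(host, http_path)  # the JDBC/ODBC port is typically 443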

  10. Where does databricks store the managed tables? - Stack Overflow

Nov 6, 2024 · Answering your two sub-questions individually below: Does this mean that Databricks is storing tables in the default Storage Account created during the creation of …
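
    One way to check where a given managed table actually lives, with a placeholder table name:

        # DESCRIBE EXTENDED exposes the storage location of the table
        spark.sql("DESCRIBE EXTENDED main.demo.managed_tbl") \
            .where("col_name = 'Location'") \
            .show(truncate=False)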