  1. Databricks: How do I get path of current notebook?

    Databricks is smart and all, but how do you identify the path of your current notebook? The guide on the website does not help. It suggests: %scala dbutils.notebook.getContext.notebookPath res1: ...
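
    A minimal sketch of the Python equivalent, assuming it runs inside a Databricks notebook where `dbutils` is already injected into the session (this goes through a commonly used internal entry point rather than a documented API):

    ```python
    # Hedged sketch: relies on the notebook-provided `dbutils` object.
    notebook_path = (
        dbutils.notebook.entry_point.getDbutils()
        .notebook()
        .getContext()
        .notebookPath()
        .get()
    )
    print(notebook_path)  # e.g. /Users/someone@example.com/my_notebook
    ```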

  2. Printing secret value in Databricks - Stack Overflow

    Nov 11, 2021 · Building on @camo's answer, since you're looking to use the secret value outside Databricks, you can use the Databricks Python SDK to fetch the bytes representation of the secret …
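
    A minimal sketch along those lines, assuming the `databricks-sdk` package is installed and with the scope and key names below as placeholders:

    ```python
    import base64
    from databricks.sdk import WorkspaceClient

    # Auth is picked up from DATABRICKS_HOST / DATABRICKS_TOKEN or ~/.databrickscfg.
    w = WorkspaceClient()

    # The secrets API returns the value base64-encoded; decode it to get the bytes.
    resp = w.secrets.get_secret(scope="my-scope", key="my-key")  # placeholder names
    secret_bytes = base64.b64decode(resp.value)
    print(secret_bytes.decode("utf-8"))
    ```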

  3. How to roll back delta table to previous version - Stack Overflow

    Vacuum table Table_name retain 0 hours. Retain 0 hours will remove all history snapshots. There is a spark config that you need to set before vacuum, as by default delta logs are maintained for 7 days. …
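
    A minimal sketch of that answer, assuming the unnamed config is the usual Delta retention-check flag (an assumption) and that `my_table` is a placeholder:

    ```python
    # Assumption: the "spark config" referred to above is the retention duration
    # check, which by default blocks VACUUM with a retention shorter than 7 days.
    spark.conf.set("spark.databricks.delta.retentionDurationCheck.enabled", "false")

    # Removes all history snapshots, so earlier versions can no longer be restored.
    spark.sql("VACUUM my_table RETAIN 0 HOURS")  # placeholder table name
    ```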

  4. Databricks - How to change a partition of an existing Delta table?

    I have a table in Databricks delta which is partitioned by transaction_date. I want to change the partition column to view_date. I tried to drop the table and then create it with a new partition co...
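
    One common approach (not necessarily the thread's accepted answer) is to rewrite the table with the new partition column; a minimal sketch, with `my_delta_table` as a placeholder name:

    ```python
    # Read the existing Delta table, then overwrite it partitioned by view_date.
    df = spark.table("my_delta_table")  # placeholder table name

    (df.write.format("delta")
       .mode("overwrite")
       .option("overwriteSchema", "true")   # allow replacing the partition spec
       .partitionBy("view_date")
       .saveAsTable("my_delta_table"))
    ```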

  5. List All Files in a Folder Sitting in a Data Lake - Stack Overflow

    Databricks File System (DBFS) is a distributed file system mounted into an Azure Databricks workspace and available on Azure Databricks clusters. If you are using local file API you have to reference the …
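
    A minimal sketch of both routes, assuming the lake is mounted at a placeholder path and the code runs inside a Databricks notebook:

    ```python
    # Distributed route: dbutils.fs works directly against DBFS / mounted storage.
    for f in dbutils.fs.ls("/mnt/my-datalake/some-folder"):   # placeholder mount
        print(f.path, f.size)

    # Local file API route: the same location is referenced with a /dbfs prefix.
    import os
    print(os.listdir("/dbfs/mnt/my-datalake/some-folder"))
    ```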

  6. Connecting C# Application to Azure Databricks - Stack Overflow

    The Datalake is hooked to Azure Databricks. The requirement asks that Azure Databricks be connected to a C# application so that queries can be run and results retrieved, all from the C# application. …

  7. Databricks - Download a dbfs:/FileStore file to my Local Machine

    Method 3: Using a third-party tool named DBFS Explorer. DBFS Explorer was created as a quick way to upload and download files to the Databricks filesystem (DBFS). This will work with both AWS and …
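
    An alternative to the GUI tool, sketched here under assumptions: pulling the file to the local machine through the DBFS REST API, with host, token, and file names as placeholders:

    ```python
    import base64
    import requests

    HOST = "https://<workspace-url>"        # placeholder
    TOKEN = "<personal-access-token>"       # placeholder
    SRC = "dbfs:/FileStore/my_file.csv"     # placeholder

    with open("my_file.csv", "wb") as out:
        offset = 0
        while True:
            # /api/2.0/dbfs/read returns chunks of base64-encoded data.
            r = requests.get(
                f"{HOST}/api/2.0/dbfs/read",
                headers={"Authorization": f"Bearer {TOKEN}"},
                params={"path": SRC, "offset": offset, "length": 1024 * 1024},
            )
            r.raise_for_status()
            body = r.json()
            if body["bytes_read"] == 0:
                break
            out.write(base64.b64decode(body["data"]))
            offset += body["bytes_read"]
    ```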

  8. REST API to query Databricks table - Stack Overflow

    Jul 24, 2022 · Is Databricks designed for such use cases, or is it a better approach to copy this table (gold layer) into an operational database such as Azure SQL DB after the transformations are done in PySpark …
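
    One way to serve such queries over REST is the SQL Statement Execution API; a minimal sketch under assumptions, with host, token, warehouse id, and table name as placeholders (this is one option, not the thread's conclusion):

    ```python
    import requests

    HOST = "https://<workspace-url>"       # placeholder
    TOKEN = "<personal-access-token>"      # placeholder

    resp = requests.post(
        f"{HOST}/api/2.0/sql/statements/",
        headers={"Authorization": f"Bearer {TOKEN}"},
        json={
            "warehouse_id": "<sql-warehouse-id>",              # placeholder
            "statement": "SELECT * FROM gold.my_table LIMIT 10",
            "wait_timeout": "30s",
        },
    )
    resp.raise_for_status()
    print(resp.json().get("result", {}).get("data_array"))
    ```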

  9. Change the datatype of a column in delta table - Stack Overflow

    May 31, 2021 · Is there a SQL command that I can easily use to change the datatype of an existing column in a Delta table? I need to change the column datatype from BIGINT to STRING. Below is the …
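
    Changing BIGINT to STRING generally isn't possible with a simple ALTER on a Delta table, so one common fix (an assumption, not necessarily the thread's exact answer) is to cast the column and overwrite the table; table and column names below are placeholders:

    ```python
    from pyspark.sql.functions import col

    # Cast the column to the new type, then overwrite the table with the new schema.
    df = spark.table("my_table").withColumn("my_col", col("my_col").cast("string"))

    (df.write.format("delta")
       .mode("overwrite")
       .option("overwriteSchema", "true")   # permit the type change in the table schema
       .saveAsTable("my_table"))
    ```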

  10. Databricks Permissions Required to Create a Cluster

    Nov 9, 2023 · In Azure Databricks, if you want to create a cluster, you need to have the " " permission. This permission basically lets you handle everything related to clusters, like making new ones and …