Azure Data Lake Storage Gen2 - Access Denied Errors When Using Azure Databricks with Service Principal
I'm sure I'm missing something obvious here, but I've gone through the documentation and I'm still stuck. I'm trying to read data from Azure Data Lake Storage Gen2 using Azure Databricks. I have configured a service principal for authentication, but I keep receiving `Access Denied` errors when executing the notebook.

Here is the setup I've done:

1. Registered a new application in Azure Active Directory and created a client secret.
2. Assigned the service principal the `Storage Blob Data Reader` role on the specific container in the Data Lake.
3. In Databricks, I'm using the following code to access the Data Lake:

```python
configs = {
    "fs.azure.account.auth.type": "OAuth",
    "fs.azure.account.oauth.provider.type": "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    "fs.azure.account.oauth2.client.id": "<your-client-id>",
    "fs.azure.account.oauth2.client.secret": "<your-client-secret>",
    "fs.azure.account.oauth2.client.endpoint": "https://login.microsoftonline.com/<your-tenant-id>/oauth2/token",
}

# Apply each setting to the Spark session (spark.conf has no setAll method)
for key, value in configs.items():
    spark.conf.set(key, value)

# Attempt to read a file from the Data Lake
file_path = "abfss://<your-container>@<your-storage-account>.dfs.core.windows.net/path/to/your/file.csv"
df = spark.read.csv(file_path)
```

However, when I run the notebook, I get the following error:

```
AnalysisException: Path Not Found: abfss://<your-container>@<your-storage-account>.dfs.core.windows.net/path/to/your/file.csv
```

I've double-checked the container name and file path, and they are correct. I also verified that the Databricks workspace and the Data Lake storage account are in the same Azure subscription. I've tried granting the service principal the `Storage Blob Data Contributor` role as well, but the problem persists.

Are there any configurations or permissions that I'm overlooking? The notebook is written in Python (PySpark) on Databricks, alongside a few other technologies. Any ideas how to fix this?
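For reference, the next thing I was considering is scoping the configuration keys to the storage account and listing the container root before reading, so that a permissions problem would surface clearly instead of as a confusing "Path Not Found". This is only a sketch with placeholder names, not code I've run yet:

```python
# Sketch: account-scoped OAuth configuration (all values are placeholders)
storage_account = "<your-storage-account>"
container = "<your-container>"

scoped_configs = {
    f"fs.azure.account.auth.type.{storage_account}.dfs.core.windows.net": "OAuth",
    f"fs.azure.account.oauth.provider.type.{storage_account}.dfs.core.windows.net":
        "org.apache.hadoop.fs.azurebfs.oauth2.ClientCredsTokenProvider",
    f"fs.azure.account.oauth2.client.id.{storage_account}.dfs.core.windows.net": "<your-client-id>",
    f"fs.azure.account.oauth2.client.secret.{storage_account}.dfs.core.windows.net": "<your-client-secret>",
    f"fs.azure.account.oauth2.client.endpoint.{storage_account}.dfs.core.windows.net":
        "https://login.microsoftonline.com/<your-tenant-id>/oauth2/token",
}

for key, value in scoped_configs.items():
    spark.conf.set(key, value)

# Sanity check: list the container root first; an authorization failure here
# should show up as a 403 rather than a misleading path error on the read
root = f"abfss://{container}@{storage_account}.dfs.core.windows.net/"
display(dbutils.fs.ls(root))

# Then attempt the actual read
df = spark.read.csv(f"{root}path/to/your/file.csv", header=True)
display(df)
```

In the sketch the client secret is hard-coded only for brevity; in the real notebook I would pull it from a Databricks secret scope with `dbutils.secrets.get`. Does this account-scoped approach change anything compared to the unscoped keys I'm using now?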