You can use the code below as a reference for the notebook created in the Azure Databricks cluster.
Remember, you need to change the storage account name, the container, and the account key to match your own environment.
Python
# Storage account details (replace with your own values)
blob_account_name = "appstore4000011"
account_key = "W2ucVHdKQn4RZwqBu+H9TKa0bnpYK00BINXOwnviJZXNzwHa9Df7R7TuJ09Vcmznpu9EYclJbEX6eO50pkpdYQ=="

# WASB URL of the container ("data") and the Spark config key for the account key
wasbs_path = "wasbs://data@appstore4000011.blob.core.windows.net"
config_key = "fs.azure.account.key." + blob_account_name + ".blob.core.windows.net"

# Mount the blob container under /mnt/data2
dbutils.fs.mount(
    source = wasbs_path,
    mount_point = "/mnt/data2",
    extra_configs = {config_key: account_key})

# Read a CSV file from the mount point and display it
ds = spark.read.csv("/mnt/data2/customer.csv")
display(ds)
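Note that `dbutils.fs.mount` raises an error if the mount point is already in use, so re-running the notebook cell fails. A minimal sketch of one way to guard against that is below; the helper `blob_config` is a hypothetical name introduced here for illustration, and the `dbutils` calls are shown as comments because `dbutils` only exists on a Databricks cluster:

```python
def blob_config(blob_account_name: str, account_key: str) -> dict:
    """Build the extra_configs mapping expected by dbutils.fs.mount
    for an Azure Blob Storage (WASB) account key."""
    key = "fs.azure.account.key." + blob_account_name + ".blob.core.windows.net"
    return {key: account_key}

# In a Databricks notebook you would use it roughly like this
# (commented out: dbutils is only available on a Databricks cluster):
#
# mounted = [m.mountPoint for m in dbutils.fs.mounts()]
# if "/mnt/data2" not in mounted:   # skip if already mounted
#     dbutils.fs.mount(
#         source = "wasbs://data@appstore4000011.blob.core.windows.net",
#         mount_point = "/mnt/data2",
#         extra_configs = blob_config("appstore4000011", "<your-account-key>"))

print(blob_config("appstore4000011", "<your-account-key>"))
```

Checking `dbutils.fs.mounts()` first keeps the cell idempotent, so the notebook can be re-run without first unmounting `/mnt/data2`.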