dbutils.fs.mount(
source = f"wasbs://{blob.storage_account_container}@{blob.storage_account_name}.blob.core.windows.net/",
mount_point = "/mnt/MLRExtract/",
extra_configs = {f"fs.azure.account.key.{blob.storage_account_name}.blob.core.windows.net":blob.storage_account_access_key})
When creating the mount point, I get the following error:
IllegalArgumentException: File must be dbfs or s3n: /
Posted on 2022-05-09 20:46:07
This IllegalArgumentException mostly comes down to a syntax problem in the mount call. You can follow the syntax below; I reproduced the same setup in my environment and it works fine. Note that the mount_point in the example does not end with a trailing slash.
Syntax:
spark.conf.set("fs.azure.account.key.<storage-account-name>.dfs.core.windows.net", dbutils.secrets.get(scope="<Scope-Name>",key="Key_Value"))
dbutils.fs.mount(
    source = "wasbs://<container-name>@<storage-account-name>.blob.core.windows.net/",
    mount_point = "/mnt/io24",
    extra_configs = {"fs.azure.account.key.<storage-account-name>.blob.core.windows.net": "<storage-account-access-key>"})
Reference:
https://stackoverflow.com/questions/72109539