Azure Data Lake Store is a highly secured store: access permissions are required to load data into its folders, even when using Azure Data Factory. I tried to access a folder in my Azure Data Lake Store from ADF without granting it permission and received this error:

AccessControlException","message":"LISTSTATUS failed with error 0x83090aa2

Azure Data Factory was trying to access the folder mentioned below. To overcome this error, we need to grant ADF access to this folder: click on Access and add the resource that requires the access. Please note that there are three permission options: Read, Write and Execute.
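The same permissions can also be granted outside the portal. As a minimal sketch using the Azure CLI for Data Lake Store Gen1 (the account name, folder path, and service principal object ID below are placeholder assumptions, not values from this post):

```
# Grant the data factory's service principal read, write and execute
# on a folder (placeholders: account name, path, object ID).
az dls fs access set-entry \
    --account mydatalakestore \
    --path /myfolder \
    --acl-spec "user:<adf-service-principal-object-id>:rwx"
```

This sets an ACL entry on the folder itself; child items and default ACLs for new items would need their own entries.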
Showing posts from 2018
When I first heard about Azure Data Factory (ADF), I thought it was a new technology for storing data, and I never thought Microsoft would come up with another data load tool/service other than SSIS. Though ADF is used for moving data between on-premises databases and the cloud, or between different cloud storage services, it is not different from SSIS in basic functionality. The main difference between ADF and SSIS is that ADF can handle unstructured data and can also run scripts in a variety of languages such as Python, Node.js, U-SQL, etc. ADF was created mainly to satisfy the cloud needs of the enterprise, whereas SSIS integrates data from traditional data sources. ADF can even execute SSIS packages. I started working with ADF v1 and found it very difficult to implement, as it requires JSON scripts to create datasets, pipelines and linked services. With the release of ADF v2, Microsoft eliminated this difficulty by including a graphical interface to create datasets, pipelines and linked services. ADF
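To give a sense of the JSON authoring that ADF v1 required before the v2 graphical interface, here is a minimal sketch of a linked service definition for an Azure Storage account; the name and the connection string values are placeholders, not from this post:

```json
{
  "name": "AzureStorageLinkedService",
  "properties": {
    "type": "AzureStorage",
    "typeProperties": {
      "connectionString": "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>"
    }
  }
}
```

Every dataset and pipeline needed a similar hand-written definition, which is the friction the v2 designer removes.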
PolyBase is one of the main features in the DWH that amazed me, due to the ease of exploring data from Blob storage. It helps in keeping the raw data as it is in the Blob/Data Lake folders and analyzing that data inside the DWH. The diagram below illustrates how data can be consumed from the sources through to the visualization tools. All the raw files can be stored in one folder, and PolyBase enables the user to query the data in all these files as a single unit.

The first step is to create the master key; create it only if it does not already exist. Keyword: CREATE MASTER KEY;

The next step is to create the credential to connect to Blob storage. This step accepts the storage key and abstracts it from the user. Keyword: CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential WITH IDENTITY = '<Name of the identity>', SECRET = '<storage key from storage account>';

The next step is to create the data source
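The steps above can be sketched as one T-SQL script. The existence check, data source name, container and storage account below are placeholder assumptions, not values from this post:

```sql
-- 1. Create the master key, only if it does not already exist.
IF NOT EXISTS (SELECT * FROM sys.symmetric_keys
               WHERE name = '##MS_DatabaseMasterKey##')
    CREATE MASTER KEY;

-- 2. Create the credential holding the storage account key,
--    so queries never have to expose it.
CREATE DATABASE SCOPED CREDENTIAL AzureStorageCredential
WITH IDENTITY = '<Name of the identity>',
     SECRET   = '<storage key from storage account>';

-- 3. Create the external data source pointing at the Blob container
--    (placeholder container and account names).
CREATE EXTERNAL DATA SOURCE AzureStorageSource
WITH (
    TYPE = HADOOP,
    LOCATION = 'wasbs://<container>@<account>.blob.core.windows.net',
    CREDENTIAL = AzureStorageCredential
);
```

With the data source in place, an external file format and an external table over the folder are what make all the files queryable as a single unit.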