Once you have created the ADF instance, you can log in to it in two ways.
Option 1
- Log in to the Azure portal and go to the ADF instance you created.
- Click the Author & Monitor option.

Option 2
- Log in to the ADF portal at https://adf.azure.com/datafactories
- Select the subscription, resource group, and ADF instance name in the drop-downs, then click the Continue button.

Either option redirects you to the ADF instance, where you can see the main options in ADF. (If you prefer scripting over the portal, the same factory can also be reached programmatically, as sketched below.)
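As a hedged sketch only: the azure-mgmt-datafactory Python SDK can connect to the same factory you would open in the portal. The subscription ID, resource group, and factory name below are placeholders you replace with your own.

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

# Placeholders - replace with your own subscription, resource group, and factory.
SUBSCRIPTION_ID = "00000000-0000-0000-0000-000000000000"
RESOURCE_GROUP = "my-resource-group"
FACTORY_NAME = "my-data-factory"

# DefaultAzureCredential picks up az-cli logins, environment variables, etc.
client = DataFactoryManagementClient(DefaultAzureCredential(), SUBSCRIPTION_ID)

# Fetch the factory to confirm the connection works.
factory = client.factories.get(RESOURCE_GROUP, FACTORY_NAME)
print(factory.location, factory.provisioning_state)
```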
Data Factory overview
- Home - landing page of the ADF
- Create pipeline - to create a pipeline
- Create data flow - to create a data flow
- Create pipeline from template - to create a pipeline using defined templates
- Copy data - to create a pipeline that copies data from a source to a destination (a sketch of the equivalent SDK call follows this list)
- Configure SSIS Integration - lets you configure an SSIS integration runtime so that you can execute SSIS packages inside ADF
- Set up code repository - lets you store your ADF code, which is a set of JSON files, in an Azure DevOps, Git, or TFS code repository
- Videos - sample videos about different ADF functionalities
- Switch Data Factory - switch between the ADF instances you have access to
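For reference, here is a minimal sketch of the kind of pipeline the Create pipeline and Copy data options produce, expressed with the same Python SDK (reusing the client and names from the earlier sketch). The dataset names are hypothetical and assumed to exist already, along with their linked services; exact model signatures vary slightly between SDK versions.

```python
from azure.mgmt.datafactory.models import (
    PipelineResource, CopyActivity, DatasetReference, BlobSource, BlobSink,
)

# Hypothetical datasets - they (and their linked services) must already exist.
copy_activity = CopyActivity(
    name="CopyBlobToBlob",
    inputs=[DatasetReference(reference_name="InputDataset")],
    outputs=[DatasetReference(reference_name="OutputDataset")],
    source=BlobSource(),
    sink=BlobSink(),
)

# A pipeline is just a named list of activities.
pipeline = PipelineResource(activities=[copy_activity])
client.pipelines.create_or_update(
    RESOURCE_GROUP, FACTORY_NAME, "CopyPipeline", pipeline)
```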
Author
This tab provides the playground for all ADF functions. It is the only place where you implement ADF pipelines and integration work.
- Data Factory - the toggle between live mode and code mode. If you have connected your ADF code to a code repository, you will see these two options here as a drop-down (a sketch of configuring the repository programmatically appears at the end of this section).
- Live Mode - the deployed mode, where you can run pipelines and triggers and see their status and history.
- Code Mode - where you develop your code and then publish to apply the changes to live mode. Once you save your work, it is automatically committed to the code repository.
- Publish all - publishes your work to live mode. If you have configured a repository, you can modify pipelines in live mode but cannot publish the changes there; Publish is disabled in live mode. If not, you can do everything in live mode and publish to save your work.
- Validate all - validates your pipeline configurations together with their dependent datasets and connections, and reports misconfigurations across all pipelines.
- Factory Resources - contains Pipelines, Datasets, and Data flows.
- Pipelines - stores the pipelines where you do the actual implementation of data extraction, transformation, or any other workflow. You can create multiple folders here to categorize them.
- Datasets - contains the datasets used by the above pipelines. As with pipelines, you can organize them into a folder structure.
- Data flows - the list of data flows you have created in ADF.
- Connections - the list of connections used in your pipelines. The Connections section has two parts.
- Linked services - the connections to different sources and destinations. ADF supports a large number of linked services.
- Integration runtimes - the connection bridge for cloud-to-cloud or cloud-to-on-premises connectivity. If you run SSIS in ADF, you need this bridge.
- Triggers - contains the list of triggers, which are the starters for pipelines. A trigger can be a schedule or be based on some other event (a sketch of creating one appears at the end of this section).
- Canvas - where you see pipeline, dataset, or data flow designs. The features shown here depend on the item selected in the left-side menu (the resource explorer).
- Two further buttons cover pipeline debugging and validation errors.
- Factory validation output - displays errors in the entire data factory or the selected pipeline. When you click the Validate all button, any pipeline errors are shown in this blade.
- Active debug runs and sessions - when you execute pipelines (debug in implementation mode), data flows, or several of them at once, their status is shown here.
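To round off this section: the Set up code repository option described earlier can also be scripted. A hedged sketch with the same SDK; the Azure DevOps organization, project, and repository names are placeholders, and configure_factory_repo takes the factory's Azure region as its first argument.

```python
from azure.mgmt.datafactory.models import FactoryRepoUpdate, FactoryVSTSConfiguration

# Placeholder Azure DevOps details - replace with your own organization/project/repo.
repo_config = FactoryVSTSConfiguration(
    account_name="my-devops-org",
    project_name="my-project",
    repository_name="adf-code",
    collaboration_branch="main",
    root_folder="/",
)

update = FactoryRepoUpdate(
    factory_resource_id=factory.id,  # the factory fetched in the first sketch
    repo_configuration=repo_config,
)

# The first argument is the factory's region, e.g. "eastus".
client.factories.configure_factory_repo(factory.location, update)
```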
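And a similar sketch for the Triggers entry above: creating and starting a simple hourly schedule trigger for the hypothetical CopyPipeline from earlier. Method names follow recent (track 2) SDK versions, where the long-running start call is begin_start; older versions use plain start.

```python
from datetime import datetime, timedelta, timezone
from azure.mgmt.datafactory.models import (
    TriggerResource, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, PipelineReference,
)

# Run once per hour, starting a few minutes from now.
recurrence = ScheduleTriggerRecurrence(
    frequency="Hour",
    interval=1,
    start_time=datetime.now(timezone.utc) + timedelta(minutes=5),
    time_zone="UTC",
)

trigger = TriggerResource(
    properties=ScheduleTrigger(
        recurrence=recurrence,
        pipelines=[TriggerPipelineReference(
            pipeline_reference=PipelineReference(reference_name="CopyPipeline"),
        )],
    )
)

client.triggers.create_or_update(RESOURCE_GROUP, FACTORY_NAME, "HourlyTrigger", trigger)
# Triggers are created in a stopped state; start this one explicitly.
client.triggers.begin_start(RESOURCE_GROUP, FACTORY_NAME, "HourlyTrigger").result()
```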
Monitor
This tab shows ADF execution history and the status of schedules and integration runtimes. You can also see a summary of each, with multiple filtering options (a sketch of querying run history through the SDK follows this list).
- Dashboard - displays a graphical summary of ADF execution history (pie and line charts)
- Pipeline runs - displays the list of pipeline executions with the status (in progress / succeeded / failed / canceled) and the duration of each run
- Trigger runs - the statuses of all trigger runs for a given period (the selected date-time range)
- Integration runtimes - the status of the configured integration runtimes, if any
- Alerts & metrics - alerts and metrics, if any are configured
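Everything the Monitor tab shows is also queryable through the SDK. A sketch that fires a one-off run of the hypothetical CopyPipeline and then pulls its status and the last day's run history, reusing the client from the earlier sketches:

```python
from datetime import datetime, timedelta, timezone
from azure.mgmt.datafactory.models import RunFilterParameters

# Fire a one-off (on-demand) run of the pipeline.
run = client.pipelines.create_run(RESOURCE_GROUP, FACTORY_NAME, "CopyPipeline")
print("Run id:", run.run_id)

# Look up that specific run - the same status the Monitor tab shows.
status = client.pipeline_runs.get(RESOURCE_GROUP, FACTORY_NAME, run.run_id)
print("Status:", status.status)

# Query all pipeline runs in the last 24 hours - the Monitor tab's date-range filter.
filters = RunFilterParameters(
    last_updated_after=datetime.now(timezone.utc) - timedelta(days=1),
    last_updated_before=datetime.now(timezone.utc),
)
history = client.pipeline_runs.query_by_factory(RESOURCE_GROUP, FACTORY_NAME, filters)
for r in history.value:
    print(r.pipeline_name, r.status, r.duration_in_ms)
```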
That is all for the ADF portal overview. Let's meet again in the next blog post.