
ADF Part 2: creating data factories using UI

2020-12-08 08:51:40 Happy time

Users can create ADF through the UI. When creating ADF in the UI, there is no need to download a separate IDE; a browser such as Microsoft Edge or Google Chrome is enough. Sign in to the Azure Portal, choose the "Data factories" service, and create the ADF instance from that service.

1. Create a Data Factory Instance

After opening Data factories, click "+ Add" to create your own data factory instance:

Step 1: Fill in the Basics information

On the "Create Data Factory" panel, start by filling in the "Basics" information: Subscription, Resource group, Region, Name, and Version; for the version, select V2.
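Behind the portal, the Basics fields map onto an ARM resource of type Microsoft.DataFactory/factories. A minimal sketch of that resource as a Python dict follows; the name and region values are illustrative placeholders, not taken from the article:

```python
# Minimal sketch of the ARM resource the "Basics" step creates.
# The name and location values are hypothetical placeholders.
factory = {
    "name": "adf-demo-factory",                 # Name
    "type": "Microsoft.DataFactory/factories",
    "apiVersion": "2018-06-01",                 # the V2 REST API version
    "location": "eastus",                       # Region
    "properties": {},                           # filled in by later steps
}

print(factory["type"])
```

The Subscription and Resource group fields determine where this resource is deployed; they are part of the resource's ARM scope rather than its body.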

Step 2: Configure Git

In the V2 version, when creating a data factory you can also set up "Git configuration" for version control. Alternatively, check "Configure Git later" and configure Git after the data factory instance has been created.
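When Git is configured at creation time, it is stored on the factory as a repoConfiguration property. A hedged sketch of that block, assuming a GitHub repository; the account, repository, and branch names are hypothetical:

```python
# Sketch of the factory's "Git configuration" block.
# Account, repository, and branch names are hypothetical placeholders.
repo_configuration = {
    "type": "FactoryGitHubConfiguration",  # FactoryVSTSConfiguration for Azure DevOps Git
    "accountName": "my-github-account",
    "repositoryName": "adf-demo",
    "collaborationBranch": "main",         # branch the UI authors against
    "rootFolder": "/",                     # where resource JSON files are stored
}

print(repo_configuration["type"])
```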

Step 3: Review and create

When Review + Create reports no errors, click the "Create" button to create the data factory instance. Once the instance is created, click "Go to resource" to navigate to the data factory page.

2. Author and Monitor

On the data factory's Overview page, click the "Author & Monitor" button; this navigates to the Azure Data Factory user interface (UI) page.

The ADF UI is shown in the figure below; it surfaces several commonly used functions, such as Create Pipeline and Create Data Flow.

Because this is our first Data Factory, before creating a pipeline we also need to create connections and datasets.

3. Create a Linked Service

Click the "Manage" tab on the left side of the UI to create a connection first. There are two types of connections: Linked services and Integration runtimes. This article creates Linked services; because a Linked service depends on an Integration runtime, we create the Integration runtime first.

1. Create an Integration Runtime (IR)

For how to create an Integration runtime, see "ADF Part 3: Integration Runtime and Linked Service".

2. Create a Linked Service

Under Connections, choose "Linked services" and click "+ New" to create a new linked service:

Different data sources require different linked services; choose the type that matches your actual data source. The figure below creates a linked service of type SQL Server: enter the Name, Connect via integration runtime, Server name, Database name, Authentication type, User name, and Password.

Note that Connect via integration runtime is the Integration runtime created in the previous section.

Azure Key Vault is a secret store: the user keeps the password in Azure Key Vault, and by entering the Key Vault reference the linked service can retrieve the information stored there.
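The fields entered in the UI end up in a JSON definition. A sketch of a SQL Server linked service whose password is resolved from Azure Key Vault at runtime; the server, database, IR, vault, and secret names are hypothetical placeholders:

```python
# Sketch of the JSON definition behind a SQL Server linked service whose
# password is pulled from Azure Key Vault. All names are hypothetical.
linked_service = {
    "name": "SqlServerLinkedService",
    "properties": {
        "type": "SqlServer",
        "connectVia": {  # "Connect via integration runtime" from the previous section
            "referenceName": "SelfHostedIR",
            "type": "IntegrationRuntimeReference",
        },
        "typeProperties": {
            "connectionString": "Server=myserver;Database=mydb;User ID=myuser;",
            "password": {  # a secret reference instead of a plain-text password
                "type": "AzureKeyVaultSecret",
                "store": {
                    "referenceName": "KeyVaultLinkedService",
                    "type": "LinkedServiceReference",
                },
                "secretName": "sql-password",
            },
        },
    },
}

print(linked_service["properties"]["type"])
```

Keeping the password as an AzureKeyVaultSecret reference means the secret never appears in the stored definition, only a pointer to the vault entry.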

4. Create a Dataset

A dataset represents the structure (schema) of a data store. It can represent a data source, from which data is read, or a data target, into which data is written.

Creating a dataset instance only stores metadata such as the structure of the data store; it does not store the actual data. The data actually lives in the underlying storage object the dataset points to. For example, if a dataset points to a table in a SQL Server instance, the data is stored in that table, while the dataset stores the table's structure and the Linked service that navigates to it. The same dataset can serve both as a data source and as a data target for storing data.

Click the "pencil" icon, which corresponds to the "Author" tab, to enter the Factory Resources interface. Click "+", choose Dataset, and go to the dataset creation interface.

Set the dataset's properties: set its Name, use Linked service to obtain the connection to the source data, use Table name to specify the table, and it is recommended to set Import schema to From connection/store.
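These properties also land in a JSON definition. A sketch of a SQL Server table dataset, assuming the linked service name from the earlier sketch; the schema and table names are hypothetical:

```python
# Sketch of the dataset definition the UI generates for a SQL Server table.
# Linked service, schema, and table names are hypothetical placeholders.
dataset = {
    "name": "SqlServerTableDataset",
    "properties": {
        "type": "SqlServerTable",
        "linkedServiceName": {  # the connection to the source data
            "referenceName": "SqlServerLinkedService",
            "type": "LinkedServiceReference",
        },
        "typeProperties": {"schema": "dbo", "table": "SalesOrders"},
        # "Import schema: From connection/store" populates this from the live table:
        "schema": [],
    },
}

print(dataset["properties"]["type"])
```

Note that the definition holds only the table's structure and a reference to the linked service, consistent with the point above that a dataset stores metadata, not data.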

5. Create a Pipeline

Create the pipeline. A pipeline is a container into which one or more activities can be dragged.

How do you place an activity? Users don't need to write any code: simply select the desired activity from the "Activities" list and drag it into the pipeline. Commonly used activities are usually located in the "General" subdirectory.

This article demonstrates the use of the Copy data activity. From the "Move & transform" subdirectory, choose Copy data:

The purpose of the Copy activity is to move data from one dataset to another.

1. Set the Copy Activity's Source Property

The Source property represents the data source; the Copy activity gets data from the Source dataset:

2. The Copy Activity's Sink Property

The Sink property sets the data target; the Sink dataset is used to store the data:

3. Other Properties of the Copy Activity

The Mapping properties tab sets the column mappings between the Source dataset and the Sink dataset, and can also set column-type conversions.
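The Source, Sink, and Mapping tabs correspond to the source, sink, and translator sections of the pipeline's JSON definition. A sketch of a pipeline containing one Copy activity; the dataset and column names are hypothetical placeholders:

```python
# Sketch of the pipeline JSON behind the Copy data activity. The Source,
# Sink, and Mapping tabs map to source, sink, and translator below.
# Dataset and column names are hypothetical placeholders.
pipeline = {
    "name": "CopyPipeline",
    "properties": {
        "activities": [
            {
                "name": "CopySqlToSql",
                "type": "Copy",
                "inputs": [{"referenceName": "SourceDataset", "type": "DatasetReference"}],
                "outputs": [{"referenceName": "SinkDataset", "type": "DatasetReference"}],
                "typeProperties": {
                    "source": {"type": "SqlServerSource"},  # Source tab
                    "sink": {"type": "SqlServerSink"},      # Sink tab
                    "translator": {                         # Mapping tab
                        "type": "TabularTranslator",
                        "mappings": [
                            {"source": {"name": "Id"}, "sink": {"name": "Id"}},
                        ],
                    },
                },
            }
        ]
    },
}

copy_activity = pipeline["properties"]["activities"][0]
print(copy_activity["type"])
```

The pipeline is literally a container here: the activities list can hold more than one activity, matching the drag-and-drop description above.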

4. Debug the Pipeline

Click "Debug" to debug the current pipeline.

At this point, a simple ADF pipeline has been created.

References:

Quickstart: Create a data factory by using the Azure Data Factory UI

Copyright notice
This article was written by [Happy time]; please include the original link when reposting. Thanks.
https://chowdera.com/2020/12/202012080851151631.html