Cloud Data Migration
API Reference
Issue 18
Date 2019-07-25
HUAWEI TECHNOLOGIES CO., LTD.
Copyright © Huawei Technologies Co., Ltd. 2019. All rights reserved.

No part of this document may be reproduced or transmitted in any form or by any means without prior written consent of Huawei Technologies Co., Ltd.

Trademarks and Permissions

Huawei and other Huawei trademarks are trademarks of Huawei Technologies Co., Ltd. All other trademarks and trade names mentioned in this document are the property of their respective holders.

Notice

The purchased products, services and features are stipulated by the contract made between Huawei and the customer. All or part of the products, services and features described in this document may not be within the purchase scope or the usage scope. Unless otherwise specified in the contract, all statements, information, and recommendations in this document are provided "AS IS" without warranties, guarantees or representations of any kind, either express or implied.

The information in this document is subject to change without notice. Every effort has been made in the preparation of this document to ensure accuracy of the contents, but all statements, information, and recommendations in this document do not constitute a warranty of any kind, express or implied.
Issue 18 (2019-07-25)    Copyright © Huawei Technologies Co., Ltd.
Contents

1 Before You Start
  1.1 Overview
  1.2 Endpoints
  1.3 Concepts
2 API Overview
3 Calling APIs
  3.1 Making an API Request
  3.2 Authentication
  3.3 Response
4 Getting Started
5 API
  5.1 Cluster Management
    5.1.1 Creating a Cluster
    5.1.2 Querying the Cluster List
    5.1.3 Querying Cluster Details
    5.1.4 Restarting a Cluster
    5.1.5 Deleting a Cluster
    5.1.6 Stopping a Cluster
    5.1.7 Starting a Cluster
  5.2 Link Management
    5.2.1 Creating a Link
    5.2.2 Querying a Link
    5.2.3 Modifying a Link
    5.2.4 Deleting a Link
  5.3 Job Management
    5.3.1 Creating a Job in a Specified Cluster
    5.3.2 Creating and Executing a Job in a Random Cluster
    5.3.3 Querying a Job
    5.3.4 Modifying a Job
    5.3.5 Starting a Job
    5.3.6 Stopping a Job
    5.3.7 Querying Job Status
    5.3.8 Querying Job Execution History
    5.3.9 Deleting a Job
6 Public Data Structures
  6.1 Link Parameter Description
    6.1.1 Link to a Relational Database
    6.1.2 Link to OBS
    6.1.3 Link to OSS on Alibaba Cloud
    6.1.4 Link to KODO/COS
    6.1.5 Link to HDFS
    6.1.6 Link to HBase
    6.1.7 Link to CloudTable
    6.1.8 Link to Hive
    6.1.9 Link to an FTP or SFTP Server
    6.1.10 Link to MongoDB
    6.1.11 Link to Redis/DCS
    6.1.12 Link to NAS/SFS
    6.1.13 Link to Kafka
    6.1.14 Link to DIS
    6.1.15 Link to Elasticsearch/Cloud Search Service
    6.1.16 Link to DLI
    6.1.17 Link to CloudTable OpenTSDB
    6.1.18 Link to Amazon S3
    6.1.19 Link to DMS Kafka
  6.2 Source Job Parameters
    6.2.1 From a Relational Database
    6.2.2 From Object Storage
    6.2.3 From HDFS
    6.2.4 From Hive
    6.2.5 From HBase/CloudTable
    6.2.6 From FTP/SFTP/NAS/SFS
    6.2.7 From HTTP/HTTPS
    6.2.8 From MongoDB/DDS
    6.2.9 From Redis/DCS
    6.2.10 From DIS
    6.2.11 From Kafka
    6.2.12 From Elasticsearch/Cloud Search Service
    6.2.13 From OpenTSDB
  6.3 Destination Job Parameters
    6.3.1 To a Relational Database
    6.3.2 To OBS
    6.3.3 To HDFS
    6.3.4 To Hive
    6.3.5 To HBase/CloudTable
    6.3.6 To FTP/SFTP/NAS/SFS
    6.3.7 To DDS
    6.3.8 To DCS
    6.3.9 To Elasticsearch/Cloud Search Service
    6.3.10 To DLI
    6.3.11 To DIS
    6.3.12 To OpenTSDB
  6.4 Job Parameter Description
7 Permissions Policies and Supported Actions
8 Appendix
  8.1 Status Code
  8.2 Error Code
  8.3 Obtaining the Project ID, Account Name, and AK/SK
A Change History
1 Before You Start
1.1 Overview

Welcome to the Cloud Data Migration API Reference. Cloud Data Migration (CDM) implements data mobility by enabling batch data migration among homogeneous and heterogeneous data sources. It supports on-premises and cloud file systems, relational databases, data warehouses, NoSQL, big data, and object storage.
You can use the APIs provided in this document to perform operations on CDM, such as creating a cluster and creating a migration job. For details about all supported operations, see API Overview.

If you plan to access CDM through an API, ensure that you are familiar with CDM concepts. For details, see Service Overview.
1.2 Endpoints

An endpoint is the request address for calling an API. Endpoints vary depending on services and regions. For the endpoints of all services, see Regions and Endpoints.
Table 1-1 lists the CDM endpoints. Select one based on your service requirements.

Table 1-1 CDM endpoints

Region Name           Region        Endpoint                             Protocol
CN North-Beijing1     cn-north-1    cdm.cn-north-1.myhuaweicloud.com     HTTPS
CN East-Shanghai2     cn-east-2     cdm.cn-east-2.myhuaweicloud.com      HTTPS
CN South-Guangzhou    cn-south-1    cdm.cn-south-1.myhuaweicloud.com     HTTPS
1.3 Concepts

- Account
  An account is created upon successful registration with the cloud platform. The account has full access permissions for all of its cloud services and resources. It can be used to reset user passwords and grant user permissions. The account is a payment entity and should not be used directly to perform routine management. For security purposes, create IAM users and grant them permissions for routine management.

- IAM user
  An IAM user is created using an account to use cloud services. Each IAM user has its own identity credentials (password and access keys). An IAM user can view the account ID and user ID on the My Credentials page of the console. The account name, username, and password will be required for API authentication.

- Region
  A region is a geographic area in which cloud resources are deployed. Availability zones (AZs) in the same region can communicate with each other over an intranet, while AZs in different regions are isolated from each other. Deploying cloud resources in different regions can better suit certain user requirements or comply with local laws or regulations.

- AZ
  AZs are physically isolated locations in a region, but are interconnected through an internal network for enhanced application availability.

- Project
  Projects group and isolate resources (including compute, storage, and network resources) across physical regions. A default project is provided for each region, and subprojects can be created under each default project. Users can be granted permissions to access all resources in a specific project. For more refined access control, create subprojects under a project and purchase resources in the subprojects. Users can then be assigned permissions to access only specific resources in the subprojects.
Figure 1-1 Project isolating model
2 API Overview
CDM provides self-developed APIs. You can perform the following operations with the CDM APIs.
Table 2-1 CDM APIs

Cluster Management APIs
  Creating a Cluster: Creates a cluster.
  Querying the Cluster List: Queries and displays the cluster list.
  Querying Cluster Details: Queries cluster details.
  Restarting a Cluster: Restarts a cluster.
  Deleting a Cluster: Deletes a cluster.
  Stopping a Cluster: Stops a specified cluster.
  Starting a Cluster: Starts a specified cluster.

Link Management APIs
  Creating a Link: Connects to a data source.
  Querying a Link: Queries the link list.
  Modifying a Link: Modifies the link configuration.
  Deleting a Link: Deletes a link.

Job Management APIs
  Creating a Job in a Specified Cluster: Creates a data migration job in a specified CDM cluster. The job does not start automatically.
  Creating and Executing a Job in a Random Cluster: Selects a random running cluster from a list of CDM clusters that you specify, then creates and executes a migration job in it.
  Querying a Job: Queries and displays the job list.
  Modifying a Job: Modifies the job configuration.
  Starting a Job: Starts a data migration job.
  Stopping a Job: Stops a running job.
  Querying Job Status: Queries and displays the running status of a job.
  Querying Job Execution History: Queries and displays the historical status of job execution.
  Deleting a Job: Deletes a job.
3 Calling APIs
3.1 Making an API Request

This section describes the structure of a REST API request, and uses the IAM API for obtaining a user token as an example to demonstrate how to call an API. The obtained token can then be used to authenticate the calling of other APIs.
Request URI

A request URI is in the following format:
{URI-scheme} :// {Endpoint} / {resource-path} ? {query-string}
Table 3-1 Request URI
Parameter Description
URI-scheme Protocol used to transmit requests. All APIs use HTTPS.
Endpoint        Domain name or IP address of the server bearing the REST service. The endpoint varies between services in different regions. It can be obtained from Endpoints. For example, the endpoint of IAM in the CN North-Beijing1 region is iam.cn-north-1.myhuaweicloud.com.

resource-path   Access path of an API for performing a specified operation. Obtain the value from the URI of an API. For example, the resource-path of the API used to obtain a user token is /v3/auth/tokens.

query-string    Query parameter, which is optional. Ensure that a question mark (?) is included before a query parameter in the format "Parameter name=Parameter value". For example, ?limit=10 indicates that a maximum of 10 data records will be displayed.

For example, to obtain an IAM token in the CN North-Beijing1 region, obtain the endpoint of IAM (iam.cn-north-1.myhuaweicloud.com) for this region and the resource-path (/v3/auth/tokens) in the URI of the API used to obtain a user token. Then, construct the URI as follows:

https://iam.cn-north-1.myhuaweicloud.com/v3/auth/tokens
Figure 3-1 Example URI
NOTE
To simplify the URI display in this document, each API is provided only with a resource-path and a request method. The URI-scheme of all APIs is HTTPS, and the endpoints of all APIs in the same region are identical.
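As a sketch of how the URI components fit together, the following hypothetical helper assembles a request URI from an endpoint, resource-path, and optional query string. The function name is our own, not part of any SDK.

```python
from urllib.parse import urlencode

def build_uri(endpoint, resource_path, query=None, scheme="https"):
    """Assemble {URI-scheme}://{Endpoint}{resource-path}?{query-string}.

    resource_path is expected to start with "/", e.g. "/v3/auth/tokens".
    query is an optional dict that becomes the query string.
    """
    uri = f"{scheme}://{endpoint}{resource_path}"
    if query:
        uri += "?" + urlencode(query)
    return uri
```

For example, `build_uri("iam.cn-north-1.myhuaweicloud.com", "/v3/auth/tokens")` yields the token API URI shown above.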
Request Methods
The HTTP protocol defines the following request methods that can be used to send a request to the server.
Table 3-2 HTTP methods
Method Description
GET Requests the server to return specified resources.
PUT Requests the server to update specified resources.
POST Requests the server to add resources or perform special operations.
DELETE Requests the server to delete specified resources, for example,an object.
HEAD Same as GET except that the server must return only the response header.

PATCH Requests the server to update partial content of a specified resource. If the resource does not exist, a new resource will be created.

For example, in the case of the API used to obtain a user token, the request method is POST. The request is as follows:

POST https://iam.cn-north-1.myhuaweicloud.com/v3/auth/tokens
Request Header

You can also add additional fields to a request, such as the fields required by a specified URI or HTTP method. For example, to request the authentication information, add Content-Type, which specifies the request body type.
Table 3-3 describes common request headers.
Table 3-3 Common request headers
Content-Type
  Description: Request body type or format. Its default value is application/json.
  Mandatory: Yes
  Example value: application/json

Content-Length
  Description: Length of the request body, in bytes.
  Mandatory: No
  Example value: 3495

X-Project-Id
  Description: Project ID, used to obtain the token for the project. This parameter is mandatory for a request from a DeC or multi-project user.
  Mandatory: No
  Example value: e9993fc787d94b6c886cbaa340f9c0f4

X-Auth-Token
  Description: User token, which is returned by the API used to obtain a user token. That API is the only one that does not require authentication.
  Mandatory: Yes
  Example value: -

X-Language
  Description: Request language.
  Mandatory: Yes
  Example value: en_us

x-sdk-date
  Description: Time when the request is sent, in YYYYMMDD'T'HHMMSS'Z' format. The value is the current Greenwich Mean Time (GMT) of the system.
  Mandatory: No
  Example value: 20190407T101459Z

Host
  Description: Information about the requested server, obtained from the URL of the service API in hostname[:port] format. If the port number is not specified, the default port is used. The default port number for HTTPS is 443.
  Mandatory: No
  Example value: code.test.com or code.test.com:443
The API used to obtain a user token does not require authentication. Therefore, only the Content-Type field needs to be added to requests for calling the API. An example of such requests is as follows:

POST https://iam.cn-north-1.myhuaweicloud.com/v3/auth/tokens
Content-Type: application/json
Request Body
The body of a request is often sent in a structured format as specified in the Content-Type header field. The request body transfers all content except the request header.

The request body varies between APIs. Some APIs, such as those requested using the GET and DELETE methods, do not require a request body.

In the case of the API used to obtain a user token, the request parameters and parameter description can be obtained from the API request. The following is an example request with a body included.

POST https://iam.cn-north-1.myhuaweicloud.com/v3/auth/tokens
Content-Type: application/json
{
    "auth": {
        "identity": {
            "methods": ["password"],
            "password": {
                "user": {
                    "name": "username",       //Replace it with the actual username.
                    "password": "**********", //Replace it with the actual password.
                    "domain": {
                        "name": "domainname"  //Replace it with the actual account name.
                    }
                }
            }
        },
        "scope": {
            "project": {
                "name": "cn-north-1"          //Replace the value with the actual project name to obtain the token of the specified project.
            }
        }
    }
}
NOTE
The scope parameter specifies where a token takes effect. Its value can be project or domain. In the example, the value is project, indicating that the obtained token takes effect only for the resources in a specified project. If the value is domain, the obtained token takes effect for all resources of the specified account.
For more information about this parameter, see Obtaining a User Token.
If all data required for the API request is available, you can send the request to call an API through curl, Postman, or coding. In the response to the API used to obtain a user token, x-subject-token is the desired user token. This token can then be used to authenticate the calling of other APIs.
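The token request above can also be assembled programmatically. The following Python sketch builds the URL, header, and JSON body for the token API; the function name is illustrative, and actually sending the request (for example with the requests library) is left to the caller.

```python
import json

def build_token_request(username, password, domain_name, project_name):
    """Build the URL, headers, and JSON body for the IAM token API
    (POST /v3/auth/tokens), mirroring the example request above."""
    url = "https://iam.cn-north-1.myhuaweicloud.com/v3/auth/tokens"
    headers = {"Content-Type": "application/json"}
    body = {
        "auth": {
            "identity": {
                "methods": ["password"],
                "password": {
                    "user": {
                        "name": username,
                        "password": password,
                        "domain": {"name": domain_name},
                    }
                },
            },
            # scope=project: the token takes effect only for that project.
            "scope": {"project": {"name": project_name}},
        }
    }
    return url, headers, json.dumps(body)
```

A caller would POST the returned body and then read the x-subject-token header from the response.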
3.2 Authentication

CDM supports token authentication for API calls. You must obtain the user token and add X-Auth-Token to the request header of the API when calling it.
Token-based Authentication

NOTE
The validity period of a token is 24 hours. When using a token for authentication, cache it to avoid frequent calls to the IAM API used to obtain a user token.
During API authentication using a token, the token is added to requests to get permissions for calling the API.

In Making an API Request, the process of calling the API used to obtain a user token is described. The following is a request example.
POST https://iam.cn-north-1.myhuaweicloud.com/v3/auth/tokens
Content-Type: application/json
{
    "auth": {
        "identity": {
            "methods": ["password"],
            "password": {
                "user": {
                    "name": "username",       //Replace it with the actual username.
                    "password": "**********", //Replace it with the actual password.
                    "domain": {
                        "name": "domainname"  //Replace it with the actual domain name.
                    }
                }
            }
        },
        "scope": {
            "project": {
                "name": "cn-north-1"          //Replace the value with the actual project name to obtain the token of the specified project.
            }
        }
    }
}
x-subject-token in the response header is the desired user token.
After a token is obtained, the X-Auth-Token header field must be added to requests to specify the token when calling other APIs. For example, if the token is ABCDEFJ...., X-Auth-Token: ABCDEFJ.... can be added to a request as follows:
GET https://iam.cn-north-1.myhuaweicloud.com/v3/auth/projects
Content-Type: application/json
X-Language: en_us
X-Auth-Token: ABCDEFJ....
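Because a token stays valid for 24 hours (see the NOTE above), callers typically cache it instead of requesting a new one per call. The following is an illustrative sketch only: the class name and the 5-minute refresh margin are our own choices, and `fetch` stands for any function that actually calls the token API and returns the x-subject-token value.

```python
import time

class TokenCache:
    """Cache a user token and refresh it only when it nears the
    24-hour validity limit mentioned in the NOTE above."""
    VALIDITY_SECONDS = 24 * 3600

    def __init__(self, fetch):
        self._fetch = fetch          # callable that obtains a fresh token
        self._token = None
        self._obtained_at = 0.0

    def token(self):
        # Refresh 5 minutes before the token would expire.
        expired = time.time() - self._obtained_at > self.VALIDITY_SECONDS - 300
        if self._token is None or expired:
            self._token = self._fetch()
            self._obtained_at = time.time()
        return self._token

    def headers(self):
        """Request headers for calling an authenticated API."""
        return {"Content-Type": "application/json",
                "X-Auth-Token": self.token()}
```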
3.3 Response

After sending a request, you will receive a response, including a status code, response header, and response body.
Status Code
A status code is a group of digits ranging from 1xx to 5xx that indicates the status of a request. For more information, see Status Code.

For example, if status code 201 is returned for calling the API used to obtain a user token, the request is successful.
Response Header
Similar to a request, a response also has a header, for example, Content-Type.
Table 3-4 Common response headers
Content-Type
  Description: Media type of the message body sent to a receiver.
  Type: string. Default value: application/json; charset=UTF-8
  Mandatory: Yes

X-request-id
  Description: Carries the request ID for task tracing.
  Type: string, in request_id-timestamp-hostname format (request_id is the UUID generated on the server, timestamp is the current timestamp, and hostname is the name of the server that processes the current API). Default value: none
  Mandatory: No

X-ratelimit
  Description: Carries the total number of flow control requests.
  Type: integer. Default value: none
  Mandatory: No

X-ratelimit-used
  Description: Carries the number of remaining requests.
  Type: integer. Default value: none
  Mandatory: No

X-ratelimit-window
  Description: Carries the flow control unit.
  Type: string. The unit is minute, hour, or day. Default value: hour
  Mandatory: No
Figure 3-2 shows the response header fields for the API used to obtain a user token.

The x-subject-token header field is the desired user token. This token can then be used to authenticate the calling of other APIs.
Figure 3-2 Header fields of the response to the request for obtaining a user token
Response Body

The body of a response is often returned in a structured format as specified in the Content-Type header field. The response body transfers all content except the response header.
The following is part of the response body for the API used to obtain a user token.
{ "token": { "expires_at": "2019-02-13T06:52:13.855000Z", "methods": [ "password" ], "catalog": [ { "endpoints": [ { "region_id": "cn-north-1",......
If an error occurs during API calling, an error code and a message will be displayed. The following shows an error response body.
{
    "error_msg": "The format of message is error",
    "error_code": "AS.0001"
}
In the response body, error_code is an error code, and error_msg provides information about the error. For details, see Error Code.
4 Getting Started
This section describes how to use cURL to call CDM APIs to migrate data from a local MySQL database to DWS in the cloud.
1. Obtaining Token
Call the API to obtain the user token, which will be put into the request header for authentication in subsequent requests.
2. Creating a CDM Cluster
– If you have created a CDM cluster, skip this step and directly use the ID of the created cluster.
– If you want to use a new cluster for migration, call the API in Creating a Cluster to create a CDM cluster.
3. Creating Links
Call the API in Creating a Link to create the MySQL and DWS links.
4. Creating a Migration Job
Call the API in Creating a Job in a Specified Cluster to create a job for migrating data from MySQL to DWS.
5. View Job Result
Call the API in Starting a Job to execute the job, and then call the API in Querying Job Status to view the execution result.
Preparing Data
Before calling an API, prepare the following data.
Table 4-1 Preparing data
Item Name Description Example
Account information
Project name Name of the project where CDM resides
Project Name
Project ID ID of the project where CDM resides
1551c7f6c808414d8e9f3c514a170f2e
Account name Name of an enterprise account to which a user belongs
Account Name
Username Username for using a cloud service. The user must have operation permissions on CDM. For details, see Permissions Policies and Supported Actions.
Username
Password User password password
VPC information
VPC ID The VPC where CDM resides must be the same as that of DWS.
6b47302a-bf79-4b20-bf7a-80987408e196
Subnet ID The subnet where CDM resides must be the same as that of DWS.
63bdc3cb-a4e7-486f-82ee-d9bf208c8f8c
Security group ID The security group where CDM resides must be the same as that of DWS.
005af77a-cce5-45ac-99c7-2ea50ea8addf
Endpoint IAM endpoint For details, see Regions and Endpoints.
iam_endpoint
CDM endpoint cdm_endpoint
MySQL database
IP address IP address of the local MySQL database, which allows CDM to access the MySQL database using a public IP address
1xx.120.85.24
Port MySQL database port 3306
Database name Name of the MySQL database from which data is to be exported
DB_name
Username Username for accessing the MySQL database. The user must have the read, write, and delete permissions on the MySQL database.
username
Password Password for accessing the MySQL database
DB_password
DWS database IP address IP address of the DWS database. CDM can access the IP address through the internal network.
10.120.85.24
Port DWS database port 3306
Database name Name of the DWS database to which data is written
DWS
Username Username for accessing the DWS database. The user must have the read, write, and delete permissions on the DWS database.
user_dws
Password Password for accessingthe DWS database
dws_password
Obtaining Token
1. Before calling other APIs, obtain the token and set it as an environment variable.
curl -H "Content-Type:application/json" https://{iam_endpoint}/v3/auth/tokens -X POST -d '{
    "auth": {
        "identity": {
            "methods": ["password"],
            "password": {
                "user": {
                    "name": "Username",
                    "password": "password",
                    "domain": {
                        "name": "Account Name"
                    }
                }
            }
        },
        "scope": {
            "project": {
                "id": "1551c7f6c808414d8e9f3c514a170f2e"
            }
        }
    }
}' -v -k
The value of X-Subject-Token in the response header is the token.
X-Subject-Token: MIIDkgYJKoZIhvcNAQcCoIIDgzCCA38CAQExDTALBglghkgBZQMEAgEwgXXXXX...
2. Run the following command to set the token as an environment variable for future use. (A shell variable name cannot contain hyphens and there must be no spaces around the equal sign; later commands reference the variable as $Token.)
export Token=MIIDkgYJKoZIhvcNAQcCoIIDgzCCA38CAQExDTALBglghkgBZQMEAgEwgXXXXX...
Creating a CDM Cluster
1. Call the API in Creating a Cluster to create a cluster. The following values are examples:
– Cluster name: cdm-ab82
– Cluster flavor: cdm.medium
– The VPC, subnet, and security group are the same as those of DWS, and the EIP is automatically bound.
If status code 200 is returned, the cluster is successfully created.
curl -X POST -H 'Content-Type:application/json;charset=utf-8' -H "X-Auth-Token:$Token" -d '{
    "cluster": {
        "name": "cdm-ab82",
        "vpcId": "6b47302a-bf79-4b20-bf7a-80987408e196",
        "instances": [{
            "flavorRef": "fb8fe666-6734-4b11-bc6c-43d11db3c745",
            "nics": [{
                "net-id": "63bdc3cb-a4e7-486f-82ee-d9bf208c8f8c",
                "securityGroupId": "005af77a-cce5-45ac-99c7-2ea50ea8addf"
            }],
            "availability_zone": "Project Name",
            "type": "cdm"
        }],
        "datastore": {
            "version": "1.8.5",
            "type": "cdm"
        },
        "isScheduleBootOff": false,
        "scheduleBootTime": "null",
        "scheduleOffTime": "null",
        "isAutoOff": false,
        "sys_tags": [{
            "key": "_sys_enterprise_project_id",
            "value": "1ce45885-4033-40d2-bdde-d4dbaceb387d"
        }]
    },
    "autoRemind": false,
    "phoneNum": "null",
    "email": "null"
}' https://{cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters -v -k
2. Call the API in Querying the Cluster List to query cluster information, obtain the cluster ID, and set the cluster ID to a global variable.
curl -X GET -H 'Content-Type:application/json;charset=utf-8' -H "X-Auth-Token:$Token" https://{cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters -k -v
The response is as follows:
{
    "clusters": [{
        "version": "x.x.x",
        "updated": "2018-09-05T08:38:25",
        "name": "cdm-ab82",
        "created": "2018-09-05T08:38:25",
        "id": "bae65496-643e-47ca-84af-948672de7eeb",
        "status": "200",
        "isFrozen": "0",
        "statusDetail": "Normal",
        "actionProgress": {},
        "config_status": "In-Sync"
    }]
}
If the value of status is 200, the cluster is successfully created. The cluster ID is bae65496-643e-47ca-84af-948672de7eeb.
3. Run the following command to set the cluster ID to a global variable for future use (no spaces around the equal sign):
export ID=bae65496-643e-47ca-84af-948672de7eeb
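Because cluster creation is asynchronous, a script typically repeats step 2 until status reaches 200 (Running). A minimal sketch, assuming the $Token variable and {cdm_endpoint} placeholder from the previous steps; the grep-based status check is deliberately naive, and a production script should parse the JSON properly:

```shell
# Poll the cluster list until some cluster reports "status": "200" (Running).
# Gives up after 30 attempts (roughly 15 minutes with a 30-second interval).
wait_for_cluster() {
  local tries=0
  until curl -s -H "X-Auth-Token:$Token" \
        "https://${cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters" |
        grep -q '"status": *"200"'; do
    tries=$((tries + 1))
    [ "$tries" -ge 30 ] && return 1
    sleep 30
  done
}
```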
Creating Links
1. Call the API in Creating a Link to create the MySQL link mysql_link. The following values are examples:
– IP address: 1xx.120.85.24
– Port number: 3306
– Database name: DB_name
– Login username: username
– Password: DB_password
If status code 200 is returned, the link is successfully created.
curl -X POST -H "Content-Type:application/json" -H "X-Auth-Token:$Token" -d '{
    "links": [{
        "enabled": true,
        "update-user": null,
        "name": "mysql_link",
        "link-config-values": {
            "configs": [{
                "name": "linkConfig",
                "inputs": [
                    { "name": "linkConfig.databaseType", "value": "MYSQL" },
                    { "name": "linkConfig.host", "value": "1xx.120.85.24" },
                    { "name": "linkConfig.port", "value": "3306" },
                    { "name": "linkConfig.database", "value": "DB_name" },
                    { "name": "linkConfig.username", "value": "username" },
                    { "name": "linkConfig.password", "value": "DB_password" },
                    { "name": "linkConfig.fetchSize", "value": "100000" },
                    { "name": "linkConfig.usingNative", "value": "true" }
                ]
            }]
        },
        "connector-name": "generic-jdbc-connector",
        "creation-date": 1536654788622,
        "update-date": 1536654788622,
        "creation-user": null
    }]
}' https://{cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/cdm/link -k -v
2. Call the API in Creating a Link to create the DWS link dws_link. The following values are examples:
– IP address of the database: 10.120.85.24
– Port number: 3306
– Database name: DWS
– Login username: user_dws
– Password: dws_password
curl -X POST -H "Content-Type:application/json" -H "X-Auth-Token:$Token" -d '{
    "links": [{
        "enabled": true,
        "update-user": null,
        "name": "dws_link",
        "link-config-values": {
            "configs": [{
                "name": "linkConfig",
                "inputs": [
                    { "name": "linkConfig.databaseType", "value": "DWS" },
                    { "name": "linkConfig.host", "value": "10.120.85.24" },
                    { "name": "linkConfig.port", "value": "3306" },
                    { "name": "linkConfig.database", "value": "DWS" },
                    { "name": "linkConfig.username", "value": "user_dws" },
                    { "name": "linkConfig.password", "value": "dws_password" },
                    { "name": "linkConfig.fetchSize", "value": "100000" },
                    { "name": "linkConfig.usingNative", "value": "true" }
                ]
            }]
        },
        "connector-name": "generic-jdbc-connector",
        "creation-date": 1536654788622,
        "update-date": 1536654788622,
        "creation-user": null
    }]
}' https://{cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/cdm/link -k -v
Creating a Migration Job
1. After the links are created, call the API in Creating a Job in a Specified Cluster to create a migration job. The following is a sample job:
– The job name is mysql2dws.
– The name of the MySQL database from which data is exported is default, and the name of the exported table is mysql_tbl. The job is split into multiple tasks by the id field, and the tasks are executed concurrently.
– The name of the DWS database to which the data is imported is public, and the table name is cdm_all_type. The data in the table is not cleared before the import.
– If the table does not exist in the DWS database, CDM automatically creates it on DWS.
– The field list loaded to DWS is id&gid&name.
– When the job extracts data, three extractors run concurrently.
If status code 200 is returned, the job is successfully created.
curl -X POST -H "Content-Type:application/json" -H "X-Cluster-ID:$ID" -H "X-Auth-Token:$Token" -d '{
    "jobs": [{
        "job_type": "NORMAL_JOB",
        "name": "mysql2dws",
        "from-link-name": "mysql_link",
        "from-connector-name": "generic-jdbc-connector",
        "to-link-name": "dws_link",
        "to-connector-name": "generic-jdbc-connector",
        "from-config-values": {
            "configs": [{
                "name": "fromJobConfig",
                "inputs": [
                    { "name": "fromJobConfig.schemaName", "value": "default" },
                    { "name": "fromJobConfig.tableName", "value": "mysql_tbl" },
                    { "name": "fromJobConfig.partitionColumn", "value": "id" }
                ]
            }]
        },
        "to-config-values": {
            "configs": [{
                "name": "toJobConfig",
                "inputs": [
                    { "name": "toJobConfig.schemaName", "value": "public" },
                    { "name": "toJobConfig.tablePreparation", "value": "CREATE_WHEN_NOT_EXIST" },
                    { "name": "toJobConfig.tableName", "value": "cdm_all_type" },
                    { "name": "toJobConfig.columnList", "value": "id&gid&name" },
                    { "name": "toJobConfig.shouldClearTable", "value": "false" }
                ]
            }]
        },
        "driver-config-values": {
            "configs": [{
                "name": "throttlingConfig",
                "inputs": [
                    { "name": "throttlingConfig.numExtractors", "value": "3" }
                ]
            }]
        }
    }]
}' https://{cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/cdm/job -k -v
2. Call the API in Starting a Job to execute the job.
curl -X GET -H 'Content-Type:application/json;charset=utf-8' -H "X-Cluster-ID:$ID" -H "X-Auth-Token:$Token" https://{cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/cdm/job/mysql2dws/start -k -v
The response is as follows:
{
    "submissions": [{
        "progress": 1,
        "job-name": "mysql2dws",
        "status": "BOOTING",
        "creation-date": 1536654788622,
        "creation-user": "cdm"
    }]
}
View Job Result
1. Call the API in Querying Job Status to query the job status.
curl -X GET -H 'Content-Type:application/json;charset=utf-8' -H "X-Cluster-ID:$ID" -H "X-Auth-Token:$Token" https://{cdm_endpoint}/v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/cdm/job/mysql2dws/status -k -v
2. View the job execution result. The response to successful job execution is as follows:
{
    "submissions": [{
        "progress": 0,
        "job-name": "mysql2dws",
        "status": "SUCCEEDED",
        "creation-date": 1536654788622,
        "creation-user": "cdm",
        "isStopingIncrement": "",
        "last-update-date": 1536654888622,
        "is-execute-auto": false,
        "last-udpate-user": "cdm",
        "isDeleteJob": false,
        "isIncrementing": false,
        "external-id": "job_local1127970451_0009",
        "counters": {
            "org.apache.sqoop.submission.counter.SqoopCounters": {
                "BYTES_WRITTEN": -1,
                "TOTAL_FILES": -1,
                "BYTES_READ": -1,
                "FILES_WRITTEN": -1,
                "TOTAL_SIZE": -1,
                "FILES_READ": -1,
                "ROWS_WRITTEN": 80,
                "ROWS_READ": 80
            }
        }
    }]
}
NOTE
l BYTES_WRITTEN: number of written bytes
l BYTES_READ: number of read bytes
l TOTAL_FILES: total number of files
l FILES_WRITTEN: number of written files
l FILES_READ: number of read files
l ROWS_WRITTEN: number of rows that are successfully written
l ROWS_READ: number of rows that are successfully read
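The row counters above allow a quick consistency check after migration. A sketch, assuming the job-status response was saved to a file (the file name passed in is arbitrary); only the row counters are compared, because the file counters are -1 for database-to-database jobs:

```shell
# Compare ROWS_READ and ROWS_WRITTEN in a saved job-status response file.
# Succeeds only when both counters are present and equal.
rows_match() {
  local r w
  # Pull the numeric value (possibly negative) that follows each counter name.
  r=$(grep -o '"ROWS_READ": *[-0-9]*' "$1" | grep -o -- '[-0-9]*$')
  w=$(grep -o '"ROWS_WRITTEN": *[-0-9]*' "$1" | grep -o -- '[-0-9]*$')
  [ -n "$r" ] && [ "$r" = "$w" ]
}
```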
5 API
5.1 Cluster Management
5.1.1 Creating a Cluster
Function
This API is used to create a CDM cluster. Creating a CDM cluster is an asynchronous operation. You can query the creation progress and result by periodically calling the API in Querying Cluster Details.
URI
l URI format
POST /v1.1/{project_id}/clusters
l Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
Request
l Sample request
POST /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters
{
    "cluster": {
        "instances": [{
            "flavorRef": "fb8fe666-6734-4b11-bc6c-43d11db3c745",
            "nics": [{
                "net-id": "2d120298-6130-44d4-a438-454912fff901",
                "securityGroupId": "c37852d2-2d12-41cb-af47-65c80e995c80"
            }],
            "availability_zone": "cn-north-1b",
            "type": "cdm"
        }],
        "name": "cdm-ab82",
        "datastore": {
            "version": "1.8.5",
            "type": "cdm"
        },
        "vpcId": "67c06084-2212-4242-bcd4-d2144c2385a9",
        "isScheduleBootOff": false,
        "scheduleBootTime": "null",
        "scheduleOffTime": "null",
        "isAutoOff": false,
        "sys_tags": [{
            "key": "_sys_enterprise_project_id",
            "value": "1ce45885-4033-40d2-bdde-d4dbaceb387d"
        }]
    },
    "autoRemind": false,
    "phoneNum": "null",
    "email": "null"
}
l Parameter description
Parameter Mandatory Type Description
cluster Yes List Cluster object. For details, see Description of the cluster parameter.
autoRemind No Boolean Whether to enable message notification. After the function is enabled, configure a maximum of five mobile numbers or email addresses. You will be notified of job failures (only table/file migration jobs) and EIP exceptions by SMS message or email.
phoneNum No String Mobile number for receiving notifications
email No String Email address for receiving notifications
l Description of the cluster parameter
Parameter Mandatory Type Description
instances Yes List Node list. For details, see Description of the instances parameter.
name Yes String Cluster name
datastore Yes List Cluster information
version Yes String Cluster version
type Yes String Cluster type. Currently, only cdm is available.
vpcId Yes String VPC ID, which is used for configuring the cluster network
isScheduleBootOff Yes Boolean Whether to enable the scheduled startup/shutdown function. The scheduled startup/shutdown and auto shutdown functions cannot be enabled at the same time.
scheduleBootTime No String Time for scheduled startup of a CDM cluster. The CDM cluster starts at this time every day.
scheduleOffTime No String Time for scheduled shutdown of a CDM cluster. During scheduled shutdown, the system does not wait for unfinished jobs to complete.
isAutoOff Yes Boolean Whether to enable auto shutdown. The auto shutdown and scheduled startup/shutdown functions cannot be enabled at the same time. After auto shutdown is enabled, if no job is running in the cluster and no scheduled job is created, the cluster will automatically shut down 15 minutes later to reduce costs.
sys_tags Yes List Enterprise project information
key No String The value is fixed at _sys_enterprise_project_id.
value No String Enterprise project ID
l Description of the instances parameter
Parameter Mandatory Type Description
flavorRef Yes String Instance flavor. The options are as follows:
l a79fd5ae-1833-448a-88e8-3ea2b913e1f6: cdm.small with 2 vCPUs and 4 GB memory, applicable to Proof of Concept (PoC) verification and development tests
l fb8fe666-6734-4b11-bc6c-43d11db3c745: cdm.medium with 4 vCPUs and 8 GB memory, applicable to migration of a single database table with fewer than 10 million records
l 5ddb1071-c5d7-40e0-a874-8a032e81a697: cdm.large with 8 vCPUs and 16 GB memory, applicable to migration of a single database table with 10 million records or more
l 6ddb1072-c5d7-40e0-a874-8a032e81a698: cdm.xlarge with 16 vCPUs and 32 GB memory, applicable to TB-level data migration requiring 10GE high-speed bandwidth
nics Yes List NIC list. A maximum of two NICs are supported.
net-id Yes String Subnet ID
securityGroupId Yes String Security group ID
availability_zone Yes String AZ where a cluster is located
type Yes String Node type. Currently, only cdm is available.
Response
l Sample response
{
    "id": "7d85f602-a948-4a30-afd4-e84f47471c15",
    "name": "cdm-ab82"
}
l Parameter description
Parameter Mandatory Type Description
id Yes String Cluster ID
name Yes String Cluster name
Return Value
l Normal
200
l Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.1.2 Querying the Cluster List
Function
This API is used to query the CDM cluster list.
URI
l URI format
GET /v1.1/{project_id}/clusters
l Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
Request
Sample request
GET /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters
Response
l Sample response
{
    "clusters": [{
        "version": "x.x.x",
        "updated": "2018-09-05T08:38:25",
        "name": "cdm-c018",
        "created": "2018-09-05T08:38:25",
        "id": "bae65496-643e-47ca-84af-948672de7eeb",
        "publicEndpoint": "49.xx.xx.10",
        "status": "200",
        "isFrozen": "0",
        "statusDetail": "Normal",
        "actionProgress": {},
        "config_status": "In-Sync"
    }]
}
l Parameter description
Parameter Mandatory Type Description
clusters Yes List Cluster list. For details, see Description of the clusters parameter.
l Description of the clusters parameter
Parameter Mandatory Type Description
version Yes String Cluster version
updated Yes String Time when a cluster is updated. The format is YYYY-MM-DDThh:mm:ssZ (ISO 8601).
name Yes String Cluster name
created Yes String Time when a cluster is created. The format is YYYY-MM-DDThh:mm:ssZ (ISO 8601).
id Yes String Cluster ID
publicEndpoint No String EIP bound to a cluster
status Yes String Cluster status. The options are as follows:
l 100: Creating
l 200: Running
l 300: Failed
l 303: Creation failed
l 400: Deleted
l 800: Frozen
l 900: Stopped
l 910: Stopping
l 920: Starting
isFrozen Yes String Whether a cluster is frozen. The options are as follows:
l 0: No
l 1: Yes
statusDetail No String Cluster status description
actionProgress No Dictionary Cluster operation progress. For details, see Description of the actionProgress parameter.
config_status No String Cluster configuration status. The options are as follows:
l In-Sync: Configuration synchronized
l Applying: Being configured
l Sync-Failure: Configuration failed
l Description of the actionProgress parameter
Parameter Mandatory Type Description
CREATING No String Cluster creation progress. For example, 29%.
Return Value
l Normal
200
l Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.1.3 Querying Cluster Details
Function
This API is used to query CDM cluster details.
URI
l URI format
GET /v1.1/{project_id}/clusters/{cluster_id}
l Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
Request
Sample request
GET /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb
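The sample request above can be issued with cURL in the same style as the Getting Started walkthrough. A sketch, assuming the {cdm_endpoint} placeholder and a $Token variable holding a valid token; the helper function name is illustrative:

```shell
# Query details of one cluster; prints the JSON response body.
get_cluster_details() {
  local project_id="$1" cluster_id="$2"
  curl -s -k -X GET -H 'Content-Type:application/json;charset=utf-8' \
       -H "X-Auth-Token:$Token" \
       "https://${cdm_endpoint}/v1.1/${project_id}/clusters/${cluster_id}"
}
# Example:
# get_cluster_details 1551c7f6c808414d8e9f3c514a170f2e bae65496-643e-47ca-84af-948672de7eeb
```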
Response
l Sample response
{
    "version": "x.x.x",
    "instances": [{
        "flavor": {
            "id": "fb8fe666-6734-4b11-bc6c-43d11db3c745"
        },
        "volume": {
            "type": "LOCAL_DISK",
            "size": 100
        },
        "status": "200",
        "actions": ["REBOOTING"],
        "type": "cdm",
        "id": "635dce67-3df8-4756-b4c7-90e45e687367",
        "name": "cdm-c018",
        "isFrozen": "0",
        "config_status": "In-Sync"
    }],
    "updated": "2018-09-05T08:38:25",
    "name": "cdm-c018",
    "created": "2018-09-05T08:38:25",
    "id": "bae65496-643e-47ca-84af-948672de7eeb",
    "publicEndpoint": "49.xx.xx.10",
    "status": "200",
    "actions": ["REBOOTING"],
    "isFrozen": "0",
    "statusDetail": "Normal",
    "actionProgress": {},
    "config_status": "In-Sync"
}
l Parameter description
Parameter Mandatory Type Description
version Yes String Cluster version
instances Yes List Cluster node information. For details, see Description of the instances parameter.
updated Yes String Time when a cluster is updated. The format is YYYY-MM-DDThh:mm:ssZ (ISO 8601).
name Yes String Cluster name
created Yes String Time when a cluster is created. The format is YYYY-MM-DDThh:mm:ssZ (ISO 8601).
id Yes String Cluster ID
publicEndpoint No String EIP bound to a cluster
status Yes String Cluster status. The options are as follows:
l 100: Creating
l 200: Running
l 300: Failed
l 303: Creation failed
l 400: Deleted
l 800: Frozen
l 900: Stopped
l 910: Stopping
l 920: Starting
actions No String Cluster operation status list. The options are as follows:
l REBOOTING: Restarting
l RESTORING: Restoring
l REBOOT_FAILURE: Restart failed
isFrozen Yes String Whether a cluster is frozen. The options are as follows:
l 0: No
l 1: Yes
statusDetail No String Cluster status description
actionProgress No List Cluster operation progress. For details, see Description of the actionProgress parameter.
config_status No String Cluster configuration status. The options are as follows:
l In-Sync: Configuration synchronized
l Applying: Being configured
l Sync-Failure: Configuration failed
l Description of the instances parameter
Parameter Mandatory Type Description
flavor Yes Dictionary VM flavor of a node. For details, see Description of the flavor parameter.
volume Yes Dictionary Disk information of a node. For details, see Description of the volume parameter.
status Yes String Node status. The options are as follows:
l 100: Creating
l 200: Normal
l 300: Failed
l 303: Creation failed
l 400: Deleted
l 800: Frozen
actions No String list Node operation status list. The options are as follows:
l REBOOTING: Restarting
l RESTORING: Restoring
l REBOOT_FAILURE: Restart failed
type Yes String Node type. Currently, only cdm is available.
id Yes String VM ID
name Yes String VM name
isFrozen Yes String Whether the node is frozen. The options are as follows:
l 0: No
l 1: Yes
config_status No String Node configuration status. The options are as follows:
l In-Sync: Configuration synchronized
l Applying: Being configured
l Sync-Failure: Configuration failed
l Description of the flavor parameter
Parameter Mandatory Type Description
id Yes String VM flavor ID
l Description of the volume parameter
Parameter Mandatory Type Description
type Yes String Type of disks on the node. Only local disks are supported.
size Yes Integer Size of the disk on the node (GB)
l Description of the actionProgress parameter
Parameter Mandatory Type Description
CREATING No String Cluster creation progress. For example, 29%.
Return Value
l Normal
200
l Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.1.4 Restarting a Cluster
Function
This API is used to restart a CDM cluster. Restarting a CDM cluster is an asynchronous operation. You can query the restart result by periodically calling the API in Querying Cluster Details.
URI
l URI format
POST /v1.1/{project_id}/clusters/{cluster_id}/action
l Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
Request
l Sample request
POST /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/action
{
    "restart": {
        "type": "cdm"
    }
}
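As a cURL command, the restart request can look like the sketch below (assuming {cdm_endpoint} and a $Token variable as in Getting Started; the optional restartMode, restartLevel, and restartDelayTime fields are omitted, so the defaults apply):

```shell
# Request an immediate restart of a CDM cluster; the response carries a jobId.
restart_cluster() {
  local project_id="$1" cluster_id="$2"
  curl -s -k -X POST -H 'Content-Type:application/json;charset=utf-8' \
       -H "X-Auth-Token:$Token" \
       -d '{"restart": {"type": "cdm"}}' \
       "https://${cdm_endpoint}/v1.1/${project_id}/clusters/${cluster_id}/action"
}
```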
l Parameter description
Parameter Mandatory Type Description
restart Yes Dictionary Cluster restart. For details about how to define cluster restart, see Description of the restart parameter.
l Description of the restart parameter
Parameter Mandatory Type Description
type Yes String Type of the cluster to be restarted. The parameter can be set to cdm only.
restartMode No Enumeration Restart mode. The options are as follows:
l IMMEDIATELY: Immediate restart
l GRACEFULL: Graceful restart
l FORCELY: Forcible restart
The default value is IMMEDIATELY.
restartLevel No Enumeration Restart level. The options are as follows:
l SERVICE: Service restart
l VM: VM restart
The default value is SERVICE.
restartDelayTime No Integer Restart delay, in seconds
Response
l Sample response
{
    "jobId": "ff8080815e59d92d015e5b27ccb0004d"
}
l Parameter description
Parameter Mandatory Type Description
jobId No String Job ID
Return Value
l Normal
200
l Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.1.5 Deleting a Cluster
Function
This API is used to delete a specified CDM cluster. Deleting a CDM cluster is an asynchronous operation. You can query the deletion result by periodically calling the API in Querying Cluster Details.
URI
l URI format
DELETE /v1.1/{project_id}/clusters/{cluster_id}
l Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
Request
Sample request
DELETE /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920
{ }
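A cURL form of the deletion request can look like the sketch below (assuming {cdm_endpoint} and a $Token variable as in Getting Started; the helper function name is illustrative):

```shell
# Delete a cluster; the response carries a jobId for tracking the
# asynchronous deletion via Querying Cluster Details.
delete_cluster() {
  local project_id="$1" cluster_id="$2"
  curl -s -k -X DELETE -H "X-Auth-Token:$Token" \
       "https://${cdm_endpoint}/v1.1/${project_id}/clusters/${cluster_id}"
}
```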
Response
l Sample response
{
    "jobId": "ff8080815e55125a015e552eddba001a"
}
l Parameter description
Parameter Mandatory Type Description
jobId No String Job ID
Return Value
l Normal
200
l Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.1.6 Stopping a Cluster
Function
This API is used to stop a specified CDM cluster. Stopping a CDM cluster is an asynchronous operation. You can query the stop result by periodically calling the API in Querying Cluster Details. A stopped cluster cannot execute migration jobs, and its Job Management page cannot be accessed. You are charged lower fees for a stopped cluster than for a running cluster, so when a CDM cluster is no longer in use, you can call this API to save costs.
URI
l URI format
POST /v1.1/{project_id}/clusters/{cluster_id}/action
l Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
Request
l Sample request
POST /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/action
{
    "stop": {}
}
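Stopping and starting a cluster use the same URI and differ only in the JSON key of the request body, so a single helper can serve both this API and Starting a Cluster. A sketch, assuming {cdm_endpoint} and a $Token variable as in Getting Started:

```shell
# Send a stop or start action to a cluster; pass "stop" or "start"
# as the third argument. The response carries a jobId.
cluster_action() {
  local project_id="$1" cluster_id="$2" action="$3"
  curl -s -k -X POST -H 'Content-Type:application/json;charset=utf-8' \
       -H "X-Auth-Token:$Token" \
       -d "{\"${action}\": {}}" \
       "https://${cdm_endpoint}/v1.1/${project_id}/clusters/${cluster_id}/action"
}
# Example:
# cluster_action 1551c7f6c808414d8e9f3c514a170f2e bae65496-643e-47ca-84af-948672de7eeb stop
```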
l Parameter description
Parameter Mandatory Type Description
stop Yes Dictionary Stopping a cluster
Response
l Sample response
{
    "jobId": "ff8080815e59d92d015e5b27ccb0004d"
}
l Parameter description
Parameter Mandatory Type Description
jobId No String Job ID
Return Value
l Normal
200
l Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.1.7 Starting a Cluster
Function
This API is used to start a specified CDM cluster. Starting a CDM cluster is an asynchronous operation. You can query the start result by periodically calling the API in Querying Cluster Details. When a cluster is stopped, you can call this API to start the cluster.
URI
l URI format
POST /v1.1/{project_id}/clusters/{cluster_id}/action
l Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
Request
l Sample request
POST /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/bae65496-643e-47ca-84af-948672de7eeb/action
{
    "start": {}
}
l Parameter description
Parameter Mandatory Type Description
start Yes Dictionary Starting a cluster
Response

- Sample response

{
  "jobId": "ff8080815e59d92d015e5b27ccb0004d"
}

- Parameter description
Parameter Mandatory Type Description
jobId No String Job ID
Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
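Because starting is asynchronous, a caller typically polls until the cluster reaches the desired state. A minimal, generic polling sketch (the status values and the fetch function are assumptions; a real caller would query cluster details as described above):

```python
import time

def wait_for_status(fetch_status, expected, attempts=30, interval=2.0, sleep=time.sleep):
    """Call fetch_status() up to `attempts` times until it returns `expected`."""
    for _ in range(attempts):
        if fetch_status() == expected:
            return True
        sleep(interval)
    return False

# Stubbed status source standing in for the Querying Cluster Details API
# (the status strings here are illustrative assumptions).
states = iter(["STARTING", "STARTING", "RUNNING"])
started = wait_for_status(lambda: next(states), "RUNNING", sleep=lambda s: None)
```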
5.2 Link Management
5.2.1 Creating a Link
Function

When you create a link for a connector, the required link parameters vary with the connector. For example, to create a link for the JDBC connector, you must enter the IP address, port number, username, and password of the database server.

URI

- URI format

POST /v1.1/{project_id}/clusters/{cluster_id}/cdm/link?validate={validate}

- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
validate No Boolean When this parameter is set to true, the API only validates whether the parameters are correctly configured, but does not create the link.

Request

- Sample request
POST /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/link
{
  "links": [{
    "enabled": true,
    "update-user": null,
    "name": "mysql_link",
    "link-config-values": {
      "configs": [
        {
          "name": "linkConfig",
          "inputs": [
            { "name": "linkConfig.databaseType", "value": "MYSQL" },
            { "name": "linkConfig.host", "value": "10.120.85.24" },
            { "name": "linkConfig.port", "value": "3306" },
            { "name": "linkConfig.database", "value": "DB_name" },
            { "name": "linkConfig.username", "value": "username" },
            { "name": "linkConfig.password", "value": "DB_password" },
            { "name": "linkConfig.fetchSize", "value": "100000" },
            { "name": "linkConfig.usingNative", "value": "true" }
          ]
        }
      ]
    },
    "connector-name": "generic-jdbc-connector",
    "creation-date": 1496654788622,
    "update-date": 1496654788622,
    "creation-user": null
  }]
}

- Parameter description
Parameter Mandatory Type Description
links Yes List Link list. For details, see Description of the links parameter.

- Description of the links parameter
Parameter Mandatory Type Description
enabled No Boolean Whether to activate a link. The default value is true.
update-user No String User who updated a link
name Yes String Link name
link-config-values Yes Dictionary Link configuration. For details, see Description of the link-config-values parameter.

connector-name Yes String Connector name. The mappings between the connectors and links are as follows:
- generic-jdbc-connector: Link to a Relational Database
- obs-connector: Link to OBS and Link to OSS on Alibaba Cloud
- hdfs-connector: Link to HDFS
- hbase-connector: Link to HBase and Link to CloudTable
- hive-connector: Link to Hive
- ftp-connector/sftp-connector: Link to an FTP or SFTP Server
- mongodb-connector: Link to MongoDB
- redis-connector: Link to Redis/DCS
- nas-connector: Link to NAS/SFS
- kafka-connector: Link to Kafka
- dis-connector: Link to DIS
- elasticsearch-connector: Link to Elasticsearch/Cloud Search Service
- dli-connector: Link to DLI
- opentsdb-connector: Link to CloudTable OpenTSDB
- http-connector: Link to HTTP/HTTPS. No link parameters are required.
- thirdparty-obs-connector: Link to KODO/COS and Link to Amazon S3
- dms-kafka-connector: Link to DMS Kafka
creation-date No Timestamp Time when a link is created
update-date No Timestamp Time when a link is updated
creation-user No String User who created a link
- Description of the link-config-values parameter
Parameter Mandatory Type Description
configs Yes List Data structure of link configuration items. For details, see Description of the configs parameter.

- Description of the configs parameter

Parameter Mandatory Type Description

name Yes String Configuration name. The link configuration name is fixed to linkConfig.

inputs Yes List Input parameter list. Each element in the list is in name,value format. For details, see Description of the inputs parameter. The parameter names vary with the link type; set them based on the connector. For details, see Link Parameters.

- Description of the inputs parameter
Parameter Mandatory Type Description
name Yes String Parameter name
value Yes String Parameter value
Response

- Successful sample response

{
  "name": "rdb_link",
  "validation-result": [{
  }]
}

- Failed sample response

{
  "validation-result": [{
    "linkConfig": [{
      "message": "Can't connect to the database with given credentials: The authentication type 12 is not supported. Check that you have configured the pg_hba.conf file to include the client's IP address or subnet, and that it is using an authentication scheme supported by the driver.",
      "status": "ERROR"
    }]
  }]
}

- Parameter description
Parameter Mandatory Type Description
name Yes String Link name
validation-result Yes List Validation structure.
- If a link fails to be created, the failure cause is returned. For details, see Description of the validation-result parameter.
- If a link is successfully created, a blank list is returned.

- Description of the validation-result parameter
Parameter Mandatory Type Description
linkConfig No List Validation result of link creation or update. For details, see Description of the linkConfig parameter.

- Description of the linkConfig parameter
Parameter Mandatory Type Description
message No String Error message
status No Enumeration Options:
- ERROR
- WARNING

Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
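The nested link-config-values structure is straightforward to generate from a flat dictionary. A sketch of such a builder (hypothetical helper; only the configs/inputs shape comes from this reference):

```python
def make_link_config(params):
    """Build link-config-values from a flat {parameter name: value} dict."""
    inputs = [{"name": name, "value": value} for name, value in params.items()]
    # The configuration name is fixed to linkConfig per this reference.
    return {"configs": [{"name": "linkConfig", "inputs": inputs}]}

cfg = make_link_config({
    "linkConfig.databaseType": "MYSQL",
    "linkConfig.host": "10.120.85.24",
    "linkConfig.port": "3306",
})
```

The result can be placed under the "link-config-values" key of a links entry before posting it.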
5.2.2 Querying a Link
Function

This API is used to query a specified link or all links.

URI

- URI format

GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/link/{link_name}

- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
link_name Yes String Link name. When this parameter is set to all, all links are queried.

Request

Sample request

GET /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/link/sftplink

Response

- Sample response

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "name": "linkConfig",
            "inputs": [
              { "size": 128, "name": "linkConfig.server", "type": "STRING", "mandatory": true, "value": "10.120.85.163" },
              { "defaultValue": "22", "name": "linkConfig.port", "type": "INTEGER", "mandatory": true, "value": "22" },
              { "size": 32, "name": "linkConfig.username", "type": "STRING", "mandatory": true, "value": "root" },
              { "size": 32, "name": "linkConfig.password", "sensitive": true, "type": "STRING", "mandatory": true }
            ]
          }
        ]
      },
      "creation-user": "cdm",
      "name": "sftp_link",
      "creation-date": 1516674482640,
      "connector-name": "sftp-connector",
      "update-date": 1516674476022,
      "enabled": true,
      "update-user": "cdm"
    }
  ]
}
- Parameter description

Parameter Mandatory Type Description

links Yes List Link list. For details, see Description of the links parameter.

- Description of the links parameter
Parameter Mandatory Type Description
link-config-values Yes Dictionary Link configuration. For details, see Description of the link-config-values parameter.

creation-user Yes String User who created a link
name Yes String Link name
creation-date Yes Timestamp Time when a link is created

connector-name Yes String Connector name

update-date Yes Timestamp Time when a link is updated

enabled No Boolean Whether to activate a link

update-user No String User who updated a link

- Description of the link-config-values parameter
Parameter Mandatory Type Description
configs Yes List Data structure of link configuration items. For details, see Description of the configs parameter.

- Description of the configs parameter

Parameter Mandatory Type Description

name Yes String Configuration name. The link configuration name is fixed to linkConfig.

inputs Yes List Parameter list. For details, see Description of the inputs parameter.

- Description of the inputs parameter
Parameter Mandatory Type Description
size No String Maximum length of the parameter
name Yes String Parameter name
type No String Parameter type
defaultValue No String Default value of the parameter

dependent No Boolean Whether the parameter is affected by other components

value No String Parameter value

mandatory No Boolean Whether the parameter is mandatory

editable No String Whether the parameter can be edited

sensitive No Boolean Whether the parameter is a password
Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
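Reading a single value back out of the nested response takes a small lookup. A sketch (hypothetical helper; the response shape is the one shown above — note that sensitive inputs such as passwords are returned without a value field):

```python
def get_link_param(link, name):
    """Return the value of a named input from a link's link-config-values, or None."""
    for config in link["link-config-values"]["configs"]:
        for item in config["inputs"]:
            if item["name"] == name:
                return item.get("value")
    return None

# Trimmed link entry, matching the sample response shape above.
link = {"link-config-values": {"configs": [{"name": "linkConfig", "inputs": [
    {"name": "linkConfig.server", "type": "STRING", "value": "10.120.85.163"},
    {"name": "linkConfig.password", "sensitive": True, "type": "STRING"},
]}]}}
server = get_link_param(link, "linkConfig.server")
secret = get_link_param(link, "linkConfig.password")  # None: no value is returned
```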
5.2.3 Modifying a Link
Function
This API is used to modify a link that you have created.
URI

- URI format

PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/link/{link_name}

- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
link_name Yes String Link name
- Parameter description of the message body

The parameters for modifying a link are the same as those for creating a link. For details, see Request in Creating a Link.
Request
Sample request
PUT /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/link/rdb_link

{
  "links": [{
    "enabled": true,
    "update-user": null,
    "name": "mysql_link",
    "link-config-values": {
      "configs": [
        {
          "name": "linkConfig",
          "inputs": [
            { "name": "linkConfig.databaseType", "value": "MYSQL" },
            { "name": "linkConfig.host", "value": "10.120.85.24" },
            { "name": "linkConfig.port", "value": "3306" },
            { "name": "linkConfig.database", "value": "DB_name" },
            { "name": "linkConfig.username", "value": "username" },
            { "name": "linkConfig.password", "value": "Add password here" },
            { "name": "linkConfig.fetchSize", "value": "100000" },
            { "name": "linkConfig.usingNative", "value": "true" }
          ]
        }
      ]
    },
    "connector-name": "generic-jdbc-connector",
    "creation-date": 1496654788622,
    "update-date": 1496654788622,
    "creation-user": null
  }]
}
Response

- Successful sample response

{
  "name": "mysql_link",
  "validation-result": [{
  }]
}

- Failed sample response

{
  "validation-result": [{
    "linkConfig": [{
      "message": "Can't connect to the database with given credentials: The authentication type 12 is not supported. Check that you have configured the pg_hba.conf file to include the client's IP address or subnet, and that it is using an authentication scheme supported by the driver.",
      "status": "ERROR"
    }]
  }]
}

- Parameter description
Parameter Mandatory Type Description
name Yes String Link name
validation-result Yes List Validation structure.
- If a link fails to be modified, the failure cause is returned. For details, see Description of the validation-result parameter.
- If a link is successfully modified, a blank list is returned.

- Description of the validation-result parameter
Parameter Mandatory Type Description
linkConfig No List Validation result of link creation or update. For details, see Description of the linkConfig parameter.

- Description of the linkConfig parameter
Parameter Mandatory Type Description
message No String Error message
status No Enumeration Options:
- ERROR
- WARNING

Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
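A common modify flow is: GET the link, change one input, and PUT the whole body back. A sketch of the in-place update step (hypothetical helper; the name/value pair shape comes from this reference):

```python
def set_link_param(inputs, name, value):
    """Update (or append) a name/value pair in an inputs list, in place."""
    for item in inputs:
        if item["name"] == name:
            item["value"] = value
            return inputs
    inputs.append({"name": name, "value": value})
    return inputs

inputs = [{"name": "linkConfig.host", "value": "10.120.85.24"},
          {"name": "linkConfig.port", "value": "3306"}]
set_link_param(inputs, "linkConfig.host", "10.120.85.25")   # overwrite existing
set_link_param(inputs, "linkConfig.fetchSize", "100000")    # append new
```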
5.2.4 Deleting a Link
Function
This API is used to delete a specified link. However, you cannot delete the links used by jobs.
URI

- URI format

DELETE /v1.1/{project_id}/clusters/{cluster_id}/cdm/link/{link_name}

- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
link_name Yes String Link name
Request

Sample request

DELETE /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/link/jdbclink

Response

- Parameter description of a successful response

  None

- If a failure occurs, the HTTP response code is 500. A sample response is as follows:

{
  "errCode": "Cdm.0021",
  "externalMessage": "Given link name is in use"
}

- Parameter description of a failed response
Parameter Mandatory Type Description
errCode Yes String Error code
externalMessage Yes String Error message
Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
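A caller can turn the failure body above into a readable message. A sketch (hypothetical helper; the errCode/externalMessage fields come from this reference):

```python
def explain_delete_failure(status_code, body):
    """Return None on success, otherwise a short 'errCode: message' summary."""
    if status_code == 200:
        return None
    return f"{body.get('errCode', 'unknown')}: {body.get('externalMessage', '')}"

msg = explain_delete_failure(500, {"errCode": "Cdm.0021",
                                   "externalMessage": "Given link name is in use"})
ok = explain_delete_failure(200, {})
```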
5.3 Job Management
5.3.1 Creating a Job in a Specified Cluster
Function

You can create a data migration job in a specified CDM cluster, but the job will not start automatically after it is created.

URI

- URI format

POST /v1.1/{project_id}/clusters/{cluster_id}/cdm/job

- URI parameter description

Parameter Mandatory Type Description

project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
Request

- Sample request

POST /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/job

{
  "jobs": [
    {
      "job_type": "NORMAL_JOB",
      "name": "mysql2dws",
      "from-link-name": "mysql_link",
      "from-connector-name": "generic-jdbc-connector",
      "to-link-name": "dws_link",
      "to-connector-name": "generic-jdbc-connector",
      "from-config-values": {
        "configs": [
          {
            "inputs": [
              { "name": "fromJobConfig.useSql", "value": "false" },
              { "name": "fromJobConfig.schemaName", "value": "rf_database" },
              { "name": "fromJobConfig.tableName", "value": "rf_from" },
              { "name": "fromJobConfig.columnList", "value": "AA&BB" },
              { "name": "fromJobConfig.incrMigration", "value": "false" },
              { "name": "fromJobConfig.createOutTable", "value": "false" }
            ],
            "name": "fromJobConfig"
          }
        ]
      },
      "to-config-values": {
        "configs": [
          {
            "inputs": [
              { "name": "toJobConfig.schemaName", "value": "rf_to_database" },
              { "name": "toJobConfig.tablePreparation", "value": "DROP_AND_CREATE" },
              { "name": "toJobConfig.tableName", "value": "rf_to" },
              { "name": "toJobConfig.columnList", "value": "AA&BB" },
              { "name": "toJobConfig.isCompress", "value": "false" },
              { "name": "toJobConfig.orientation", "value": "ROW" },
              { "name": "toJobConfig.useStageTable", "value": "false" },
              { "name": "toJobConfig.shouldClearTable", "value": "false" },
              { "name": "toJobConfig.extendCharLength", "value": "false" }
            ],
            "name": "toJobConfig"
          }
        ]
      },
      "driver-config-values": {
        "configs": [
          {
            "inputs": [
              { "name": "throttlingConfig.numExtractors", "value": "1" },
              { "name": "throttlingConfig.submitToCluster", "value": "false" },
              { "name": "throttlingConfig.numLoaders", "value": "1" },
              { "name": "throttlingConfig.recordDirtyData", "value": "false" }
            ],
            "name": "throttlingConfig"
          },
          { "inputs": [], "name": "jarConfig" },
          {
            "inputs": [
              { "name": "schedulerConfig.isSchedulerJob", "value": "false" },
              { "name": "schedulerConfig.disposableType", "value": "NONE" }
            ],
            "name": "schedulerConfig"
          },
          { "inputs": [], "name": "transformConfig" },
          {
            "inputs": [
              { "name": "smnConfig.isNeedNotification", "value": "false" }
            ],
            "name": "smnConfig"
          },
          {
            "inputs": [
              { "name": "retryJobConfig.retryJobType", "value": "NONE" }
            ],
            "name": "retryJobConfig"
          }
        ]
      }
    }
  ]
}

- Parameter description
Parameter Mandatory Type Description
jobs Yes List Job list. For details, see Description of the jobs parameter.

- Description of the jobs parameter

Parameter Mandatory Type Description

job_type Yes String Job type. Options:
- NORMAL_JOB: table/file migration
- BATCH_JOB: entire DB migration
name Yes String Job name
from-link-name Yes String Source link name
from-connector-name Yes String Name of the connector of the source link
to-link-name Yes String Destination link name
to-connector-name Yes String Name of the connector of the destination link

from-config-values Yes Dictionary Source link configuration. For details, see Description of from-config-values.

to-config-values Yes Dictionary Destination link configuration. For details, see Description of to-config-values.

driver-config-values Yes Dictionary Job parameter configuration. For details, see Description of driver-config-values.

- Description of from-config-values, to-config-values, and driver-config-values

Parameter Mandatory Type Description

configs Yes List The data structures of source link parameters, destination link parameters, and job parameters are the same. However, the inputs parameter varies. For details, see Description of the configs parameter.

- Description of the configs parameter
Parameter Mandatory Type Description

name Yes String Configuration name.
- fromJobConfig: source job configuration name
- toJobConfig: destination job configuration name

inputs Yes List Input parameter list. Each element in the list is in name,value format. For details, see Description of the inputs parameter.
- In from-config-values, different types of source links have different inputs parameter lists. For details, see the description of source link parameters.
- In to-config-values, different types of destination links have different inputs parameter lists. For details, see the description of destination link parameters.
- For details about the inputs parameter in driver-config-values, see Job Parameter Description.

- Description of the inputs parameter
Parameter Mandatory Type Description
name Yes String Parameter name
value Yes String Parameter value
Response

- Sample response

{
  "name": "mysql2dws"
}

- Parameter description
Parameter Mandatory Type Description
name Yes String Job name
Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
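The jobs payload can likewise be assembled from flat dictionaries. A reduced sketch (hypothetical helper; it covers only the mandatory fields shown above and omits driver-config-values for brevity):

```python
def make_job(name, from_link, from_connector, to_link, to_connector,
             from_params, to_params, job_type="NORMAL_JOB"):
    """Assemble one entry of the `jobs` list from flat {name: value} dicts."""
    def section(section_name, params):
        return {"configs": [{
            "name": section_name,
            "inputs": [{"name": k, "value": v} for k, v in params.items()],
        }]}
    return {
        "job_type": job_type,
        "name": name,
        "from-link-name": from_link,
        "from-connector-name": from_connector,
        "to-link-name": to_link,
        "to-connector-name": to_connector,
        "from-config-values": section("fromJobConfig", from_params),
        "to-config-values": section("toJobConfig", to_params),
    }

job = make_job("mysql2dws", "mysql_link", "generic-jdbc-connector",
               "dws_link", "generic-jdbc-connector",
               {"fromJobConfig.tableName": "rf_from"},
               {"toJobConfig.tableName": "rf_to"})
```

The result is one element of the "jobs" list in the request body.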
5.3.2 Creating and Executing a Job in a Random Cluster
Function
You can specify a list of CDM clusters. The system selects a random cluster in the running state from the clusters you specified and creates and executes a migration job in it.

URI

- URI format

POST /v1.1/{project_id}/clusters/job

- URI parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.

Request

- Sample request

POST /v1.1/ad359ede62234598a962ccb952e3e332/clusters/job

{
  "jobs": [
    {
      "job_type": "NORMAL_JOB",
      "name": "es_css",
      "from-link-name": "css",
      "from-connector-name": "elasticsearch-connector",
      "to-link-name": "dis",
      "to-connector-name": "dis-connector",
      "from-config-values": {
        "configs": [
          {
            "inputs": [
              { "name": "fromJobConfig.index", "value": "52est" },
              { "name": "fromJobConfig.type", "value": "est_array" },
              { "name": "fromJobConfig.columnList", "value": "array_f1_int:long&array_f2_text:string&array_f3_object:nested" },
              { "name": "fromJobConfig.splitNestedField", "value": "false" }
            ],
            "name": "fromJobConfig"
          }
        ]
      },
      "to-config-values": {
        "configs": [
          {
            "inputs": [
              { "name": "toJobConfig.streamName", "value": "dis-lkGm" },
              { "name": "toJobConfig.separator", "value": "|" },
              { "name": "toJobConfig.columnList", "value": "1&2&3" }
            ],
            "name": "toJobConfig"
          }
        ]
      },
      "driver-config-values": {
        "configs": [
          {
            "inputs": [
              { "name": "throttlingConfig.numExtractors", "value": "1" },
              { "name": "throttlingConfig.submitToCluster", "value": "false" },
              { "name": "throttlingConfig.numLoaders", "value": "1" },
              { "name": "throttlingConfig.recordDirtyData", "value": "false" }
            ],
            "name": "throttlingConfig"
          },
          { "inputs": [], "name": "jarConfig" },
          {
            "inputs": [
              { "name": "schedulerConfig.isSchedulerJob", "value": "false" },
              { "name": "schedulerConfig.disposableType", "value": "NONE" }
            ],
            "name": "schedulerConfig"
          },
          { "inputs": [], "name": "transformConfig" },
          {
            "inputs": [
              { "name": "retryJobConfig.retryJobType", "value": "NONE" }
            ],
            "name": "retryJobConfig"
          }
        ]
      }
    }
  ],
  "clusters": [
    "b0791496-e111-4e75-b7ca-9277aeab9297",
    "c2db1191-eb6c-464a-a0d3-b434e6c6df26",
    "9917ab56-7ac9-4a0b-8422-a806cf7b96e9"
  ]
}

- Parameter description
Parameter Mandatory Type Description
jobs Yes List Job list. For details, see Description of the jobs parameter.

clusters Yes List List of CDM cluster IDs. The system selects a random cluster in the running state from the clusters you specified and creates and executes the migration job in it.
Cloud Data MigrationAPI Reference 5 API
Issue 18 (2019-07-25) Copyright © Huawei Technologies Co., Ltd. 59
Response

- Sample response

{
  "name": "es_css"
}

- Parameter description
Parameter Mandatory Type Description
name Yes String Job name
Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
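The only difference from the per-cluster API is the extra clusters list in the request body. A sketch (hypothetical helper):

```python
def make_random_cluster_request(jobs, cluster_ids):
    """Body for POST /v1.1/{project_id}/clusters/job: jobs plus candidate cluster IDs."""
    return {"jobs": list(jobs), "clusters": list(cluster_ids)}

body = make_random_cluster_request(
    [{"name": "es_css"}],  # job definitions as in Creating a Job in a Specified Cluster
    ["b0791496-e111-4e75-b7ca-9277aeab9297",
     "c2db1191-eb6c-464a-a0d3-b434e6c6df26"],
)
```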
5.3.3 Querying a Job
Function
This API is used to query all jobs or a specified job.

URI

- URI format

GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{job_name}?filter={filter}&page_no={page_no}&page_size={page_size}&jobType={jobType}

- Parameter description

Parameter Mandatory Type Description

project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
job_name Yes String Job name. When this parameter is set to all, all jobs are returned.

filter No String Fuzzy filter for job names. This parameter takes effect only when job_name is set to all.

page_no No Integer Page number. Jobs on the specified page will be returned.

page_size No Integer Number of jobs on each page

jobType No String Job type to be queried. Options:
- jobType=NORMAL_JOB: table/file migration job
- jobType=BATCH_JOB: entire DB migration job
If this parameter is not specified, only table/file migration jobs are queried by default.
Request

Sample request

GET /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/job/all?jobType=NORMAL_JOB

Response

- Sample response
{
  "total": 1,
  "jobs": [{
    "name": "oracle2hive",
    "type": "NORMAL_JOB",
    "creation-user": "cdm",
    "creation-date": 1514880878504,
    "update-date": 1514880878083,
    "update-user": "cdm",
    "status": "NEW",
    "from-link-name": "oraclelink",
    "from-connector-name": "generic-jdbc-connector",
    "to-link-name": "hivelink",
    "to-connector-name": "hive-connector",
    "from-config-values": {
      "configs": [
        {
          "inputs": [
            { "name": "fromJobConfig.useSql", "value": "false" },
            { "name": "fromJobConfig.schemaName", "value": "rf_database" },
            { "name": "fromJobConfig.tableName", "value": "rf_from" },
            { "name": "fromJobConfig.columnList", "value": "AA&BB" },
            { "name": "fromJobConfig.incrMigration", "value": "false" },
            { "name": "fromJobConfig.createOutTable", "value": "false" }
          ],
          "name": "fromJobConfig"
        }
      ]
    },
    "to-config-values": {
      "configs": [
        {
          "inputs": [
            { "name": "toJobConfig.hive", "value": "hive" },
            { "name": "toJobConfig.database", "value": "rf_database" },
            { "name": "toJobConfig.table", "value": "rf_to" },
            { "name": "toJobConfig.tablePreparation", "value": "DO_NOTHING" },
            { "name": "toJobConfig.columnList", "value": "aa&bb&cc&dd" },
            { "name": "toJobConfig.shouldClearTable", "value": "true" }
          ],
          "name": "toJobConfig"
        }
      ]
    },
    "driver-config-values": {
      "configs": [
        {
          "inputs": [
            { "name": "throttlingConfig.numExtractors", "value": "1" },
            { "name": "throttlingConfig.numLoaders", "value": "1" },
            { "name": "throttlingConfig.recordDirtyData", "value": "false" }
          ],
          "name": "throttlingConfig"
        },
        { "inputs": [], "name": "jarConfig" },
        {
          "inputs": [
            { "name": "schedulerConfig.isSchedulerJob", "value": "false" },
            { "name": "schedulerConfig.disposableType", "value": "NONE" }
          ],
          "name": "schedulerConfig"
        },
        { "inputs": [], "name": "transformConfig" },
        {
          "inputs": [
            { "name": "retryJobConfig.retryJobType", "value": "NONE" }
          ],
          "name": "retryJobConfig"
        }
      ]
    }
  }],
  "page_no": 1,
  "page_size": 20
}

- Response parameter description
Parameter Mandatory Type Description
total No Integer Number of jobs
jobs Yes List Job list. For details, see Description of the jobs parameter.

page_no No Integer Page number. Jobs on the specified page will be returned.
page_size No Integer Number of jobs on each page
Cloud Data MigrationAPI Reference 5 API
Issue 18 (2019-07-25) Copyright © Huawei Technologies Co., Ltd. 63
Parameter Mandatory Type Description
simple No Boolean When this parameter is set to true, compact information will be returned; only parameter names and values are included. Attributes such as size, type, and id will not be returned.

- Description of the jobs parameter
Parameter Mandatory Type Description
name Yes String Job name
type Yes String Job type. Options:
- NORMAL_JOB: table/file migration
- BATCH_JOB: entire DB migration
creation-user Yes String User who created a job
creation-date Yes Timestamp Time when a job is created
update-date Yes Timestamp Time when a job is updated
update-user No String User who updated a job
status No String Job status. Options:
- BOOTING: The job is starting.
- RUNNING: The job is running.
- SUCCEEDED: The job is executed successfully.
- FAILED: The job failed.
- NEW: The job has not been executed.

from-link-name Yes String Source link name

from-connector-name Yes String Name of the connector of the source link
to-link-name Yes String Destination link name
to-connector-name Yes String Name of the connector of the destination link

from-config-values Yes Dictionary Source link configuration. For details, see Description of from-config-values.

to-config-values Yes Dictionary Destination link configuration. For details, see Description of to-config-values.

driver-config-values Yes Dictionary Job parameter configuration. For details, see Description of driver-config-values.
Return Value

- Normal

  200

- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
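With page_no and page_size, listing every job is a simple pagination loop. A sketch with a stubbed page fetcher (a real fetch_page would issue GET .../cdm/job/all with the query parameters above):

```python
def list_all_jobs(fetch_page, page_size=20):
    """Collect jobs across pages; fetch_page(page_no, page_size) returns a response dict."""
    jobs, page_no = [], 1
    while True:
        batch = fetch_page(page_no, page_size).get("jobs", [])
        jobs.extend(batch)
        if len(batch) < page_size:  # a short page means we reached the end
            return jobs
        page_no += 1

# Stubbed fetcher standing in for the HTTP call.
pages = {1: {"jobs": [{"name": "job_a"}, {"name": "job_b"}]},
         2: {"jobs": [{"name": "job_c"}]}}
all_jobs = list_all_jobs(lambda no, size: pages.get(no, {"jobs": []}), page_size=2)
```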
5.3.4 Modifying a Job
Function
This API is used to modify a job.
URI

- URI format

PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{job_id}

- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
job_id Yes String Job ID. The job name is used as the unique job ID.

- Parameter description of the message body

The message body parameters for modifying a job are the same as those for creating a job. For details, see Description of the jobs parameter in Creating a Job in a Specified Cluster.

Request

Sample request

PUT /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/job/mysql2hive
{
  "jobs": [{
    "name": "mysql2hive",
    "from-link-name": "mysqllink",
    "from-connector-name": "generic-jdbc-connector",
    "to-link-name": "hivelink",
    "to-connector-name": "hive-connector",
    "from-config-values": {
      "configs": [
        {
          "inputs": [
            { "name": "fromJobConfig.useSql", "value": "false" },
            { "name": "fromJobConfig.schemaName", "value": "rf_database" },
            { "name": "fromJobConfig.tableName", "value": "rf_from" },
            { "name": "fromJobConfig.columnList", "value": "AA&BB" },
            { "name": "fromJobConfig.incrMigration", "value": "false" },
            { "name": "fromJobConfig.createOutTable", "value": "false" }
          ],
          "name": "fromJobConfig"
        }
      ]
    },
    "to-config-values": {
      "configs": [
        {
          "inputs": [
            { "name": "toJobConfig.hive", "value": "hive" },
            { "name": "toJobConfig.database", "value": "rf_database" },
            { "name": "toJobConfig.table", "value": "rf_to" },
            { "name": "toJobConfig.tablePreparation", "value": "DO_NOTHING" },
            { "name": "toJobConfig.columnList", "value": "aa&bb&cc&dd" },
            { "name": "toJobConfig.shouldClearTable", "value": "true" }
          ],
          "name": "toJobConfig"
        }
      ]
    },
    "driver-config-values": {
      "configs": [
        {
          "inputs": [
            { "name": "throttlingConfig.numExtractors", "value": "1" },
            { "name": "throttlingConfig.numLoaders", "value": "1" },
            { "name": "throttlingConfig.recordDirtyData", "value": "false" }
          ],
          "name": "throttlingConfig"
        },
        { "inputs": [], "name": "jarConfig" },
        {
          "inputs": [
            { "name": "schedulerConfig.isSchedulerJob", "value": "false" },
            { "name": "schedulerConfig.disposableType", "value": "NONE" }
          ],
          "name": "schedulerConfig"
        },
        { "inputs": [], "name": "transformConfig" },
        {
          "inputs": [
            { "name": "retryJobConfig.retryJobType", "value": "NONE" }
          ],
          "name": "retryJobConfig"
        }
      ]
    }
  }]
}
Response
- Sample response
{
  "name": "mysql2hive",
  "validation-result": [{}, {}, {}]
}
- Parameter description
Parameter Mandatory Type Description
name Yes String Job name
validation-result Yes List Validation result.
- If the job fails to be modified, the failure cause is returned.
- If the job is successfully modified, a blank list is returned.
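The success/failure convention above can be checked with a small sketch. The helper name is hypothetical, and treating empty `{}` entries as success follows the sample response, which returns `[{}, {}, {}]` for a successful modification:

```python
# Interpret the modify-job response: non-empty validation-result entries
# carry failure causes; empty entries mean the corresponding check passed.
# The helper name is hypothetical.
def modify_job_succeeded(response_body):
    """Return (ok, failure_entries) for a modify-job response body."""
    results = response_body.get("validation-result", [])
    failures = [entry for entry in results if entry]  # empty dicts mean OK
    return (len(failures) == 0, failures)
```

A caller would treat a non-empty failure list as the reason the modification was rejected.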
Return Value
- Normal
200
- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.3.5 Starting a Job
Function
This API is used to start a job.
URI
- URI format
PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{job_name}/start
- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
job_name Yes String Job name
Request
Sample request
PUT /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/job/jdbc2hive/start
Response
- Sample response
{
  "submissions": [{
    "progress": 1,
    "job-name": "jdbc2hive",
    "status": "BOOTING",
    "creation-date": 1536905778725,
    "creation-user": "cdm"
  }]
}
- Parameter description
Parameter Mandatory Type Description
submissions Yes List Job running information. For details, see Description of the submissions parameter.
- Description of the submissions parameter
Parameter Mandatory Type Description
progress Yes Integer Job progress. If a job fails, the value is -1. Otherwise, the value ranges from 0 to 100.
job-name Yes String Job name
status Yes String Job status. The options are as follows:
- BOOTING: The job is starting.
- FAILURE_ON_SUBMIT: The job fails to be submitted.
- RUNNING: The job is running.
- SUCCEEDED: The job is executed successfully.
- FAILED: The job failed.
- UNKNOWN: The job status is unknown.
- NEVER_EXECUTED: The job has not been executed.
creation-date Yes Timestamp Time when a job is created
creation-user Yes String User who created a job
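Putting the URI format and the status values above together, the request path can be built and a polling loop terminated as in the following sketch. The helper names are hypothetical, and the terminal-status set is an inference from the status table (SUCCEEDED, FAILED, and FAILURE_ON_SUBMIT do not change again):

```python
# Build the start-job request path from the URI format above and decide
# when a caller can stop polling the submission status. Helper names are
# hypothetical; the terminal-status set is inferred from the status table.
def start_job_path(project_id, cluster_id, job_name):
    return f"/v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{job_name}/start"

TERMINAL_STATUSES = {"SUCCEEDED", "FAILED", "FAILURE_ON_SUBMIT"}

def is_terminal(status):
    """True once a submission has finished, successfully or not."""
    return status in TERMINAL_STATUSES
```

A client would PUT to the built path, then poll the status API until `is_terminal` returns True.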
Return Value
- Normal
200
- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.3.6 Stopping a Job
Function
This API is used to stop a running job.
URI
- URI format
PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{job_name}/stop
- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
job_name Yes String Job name
Request
Sample request
PUT /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/job/jdbc2hive/stop
Response
- Sample response
{
  "submissions": [{
    "progress": -1,
    "job-name": "jdbc2hive",
    "status": "FAILED",
    "creation-date": 1536905778725,
    "creation-user": "cdm"
  }]
}
- Parameter description
Parameter Mandatory Type Description
submissions Yes List Job running information. For details, see Description of the submissions parameter.
- Description of the submissions parameter
Parameter Mandatory Type Description
progress Yes Integer Job progress. If a job fails, the value is -1. Otherwise, the value ranges from 0 to 100.
job-name Yes String Job name
status Yes String Job status. The options are as follows:
- BOOTING: The job is starting.
- FAILURE_ON_SUBMIT: The job fails to be submitted.
- RUNNING: The job is running.
- SUCCEEDED: The job is executed successfully.
- FAILED: The job failed.
- UNKNOWN: The job status is unknown.
- NEVER_EXECUTED: The job has not been executed.
creation-date Yes Timestamp Time when a job is created
creation-user Yes String User who created a job
Return Value
- Normal
200
- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.3.7 Querying Job Status
Function
This API is used to query the job running status.
URI
- URI format
GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{job_name}/status
- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
job_name Yes String Job name
Request
Sample request
GET /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/job/jdbc2hive/status
Response
- Sample job running response
{
  "submissions": [{
    "progress": 1,
    "job-name": "jdbc2hive",
    "status": "BOOTING",
    "creation-date": 1536905778725,
    "creation-user": "cdm"
  }]
}
- Sample response of a successful job
{
  "submissions": [{
    "progress": 0,
    "job-name": "jdbc2hive",
    "status": "SUCCEEDED",
    "creation-date": 1536906582997,
    "creation-user": "cdm",
    "isStopingIncrement": "",
    "last-update-date": 1536916580005,
    "is-execute-auto": false,
    "last-udpate-user": "cdm",
    "isDeleteJob": false,
    "isIncrementing": false,
    "external-id": "job_local1127970451_0009",
    "counters": {
      "org.apache.sqoop.submission.counter.SqoopCounters": {
        "BYTES_WRITTEN": -1,
        "TOTAL_FILES": -1,
        "BYTES_READ": -1,
        "FILES_WRITTEN": -1,
        "TOTAL_SIZE": -1,
        "FILES_READ": -1,
        "ROWS_WRITTEN": 80,
        "ROWS_READ": 80
      }
    }
  }]
}
- Parameter description
Parameter Mandatory Type Description
submissions Yes List Job running information. For details, see Description of the submissions parameter.
- Description of the submissions parameter
Parameter Mandatory Type Description
job-name Yes String Job name
status Yes String Job status. The options are as follows:
- NEW: The job has not been executed.
- PENDING: The job is waiting for system scheduling.
- BOOTING: The data to be migrated is being analyzed.
- RUNNING: The job is running.
- FAILED: The job failed.
- SUCCEEDED: The job is executed successfully.
creation-date Yes Timestamp Time when the job was executed
creation-user Yes String User who created the job
isStopingIncrement Yes Boolean Whether to stop incremental migration
last-update-date Yes Timestamp Time when the job was updated
is-execute-auto Yes Boolean Whether to execute the job at the scheduled time
last-udpate-user Yes String Last user who updated the job status
isDeleteJob Yes Boolean Whether to delete the job after it is executed
isIncrementing Yes Boolean Whether the job is an incremental migration job
external-id Yes String Job ID
counters No Dictionary Job running result statistics. This parameter is available only when status is SUCCEEDED. For details, see Description of the counters parameter.
- Description of the counters parameter
Parameter Mandatory Type Description
org.apache.sqoop.submission.counter.SqoopCounters Yes List Job running result statistics. For details, see Description of org.apache.sqoop.submission.counter.SqoopCounters.
- Description of org.apache.sqoop.submission.counter.SqoopCounters
Parameter Mandatory Type Description
ROWS_WRITTEN No Integer Number of rows that are written
ROWS_READ No Integer Number of rows that are read
BYTES_WRITTEN No Integer Number of bytes that are written
BYTES_READ No Integer Number of bytes that are read
TOTAL_SIZE No Integer Total number of bytes
FILES_WRITTEN No Integer Number of files that are written
FILES_READ No Integer Number of files that are read
FILES_SKIPPED No Integer Number of files that are skipped
TOTAL_FILES No Integer Total number of files
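The counters block above can be consumed as in the following sketch. The helper name is hypothetical, and dropping -1 values is an assumption drawn from the sample response, where a table migration reports -1 for counters that do not apply (the file and byte counters):

```python
# Extract the Sqoop counters from one submission of a successful status
# response, dropping -1 placeholder values. The helper name is hypothetical.
SQOOP_COUNTERS = "org.apache.sqoop.submission.counter.SqoopCounters"

def job_counters(submission):
    """Return the counters dict for a submission, without -1 placeholders."""
    counters = submission.get("counters", {}).get(SQOOP_COUNTERS, {})
    return {name: value for name, value in counters.items() if value != -1}
```

Applied to the sample response above, this keeps only ROWS_WRITTEN and ROWS_READ.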
Return Value
- Normal
200
- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.3.8 Querying Job Execution History
Function
This API is used to query the historical job execution status.
URI
- URI format
GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/submissions?jname={job_name}
- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
job_name Yes String Job name
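The query above can be composed as in this sketch. The helper name is hypothetical, and URL-encoding the job name is a precaution rather than something the document states:

```python
# Build the execution-history query path from the URI format above.
# The helper name is hypothetical; quote() guards against job names
# containing characters that are not URL-safe.
from urllib.parse import quote

def submissions_path(project_id, cluster_id, job_name):
    """Path for querying the execution history of one job."""
    return (f"/v1.1/{project_id}/clusters/{cluster_id}/cdm/submissions"
            f"?jname={quote(job_name)}")
```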
Request
Sample request
GET /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/submissions?jname=jdbc2hive
Response
- Sample job running response
{
  "submissions": [{
    "progress": 1,
    "job-name": "jdbc2hive",
    "status": "BOOTING",
    "creation-date": 1536905778725,
    "creation-user": "cdm"
  }]
}
- Sample response of a successful job
{
  "submissions": [{
    "progress": 0,
    "job-name": "jdbc2hive",
    "status": "SUCCEEDED",
    "creation-date": 1536906582997,
    "creation-user": "cdm",
    "isStopingIncrement": "",
    "last-update-date": 1536916580005,
    "is-execute-auto": false,
    "last-udpate-user": "cdm",
    "isDeleteJob": false,
    "isIncrementing": false,
    "external-id": "job_local1127970451_0009",
    "counters": {
      "org.apache.sqoop.submission.counter.SqoopCounters": {
        "BYTES_WRITTEN": -1,
        "TOTAL_FILES": -1,
        "BYTES_READ": -1,
        "FILES_WRITTEN": -1,
        "TOTAL_SIZE": -1,
        "FILES_READ": -1,
        "ROWS_WRITTEN": 80,
        "ROWS_READ": 80
      }
    }
  }]
}
- Parameter description
Parameter Mandatory Type Description
submissions Yes List Job running information. For details, see Description of the submissions parameter.
Return Value
- Normal
200
- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
5.3.9 Deleting a Job
Function
This API is used to delete a specified job.
URI
- URI format
DELETE /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{job_name}
- Parameter description
Parameter Mandatory Type Description
project_id Yes String Project ID. For details about how to obtain the project ID, see Obtaining the Project ID, Account Name, and AK/SK.
cluster_id Yes String Cluster ID
job_name Yes String Job name
Request
Sample request
DELETE /v1.1/1551c7f6c808414d8e9f3c514a170f2e/clusters/6ec9a0a4-76be-4262-8697-e7af1fac7920/cdm/job/jdbc2hive
Response
- Successful sample response
{}
- Failed sample response
{
  "errCode": "Cdm.0100",
  "externalMessage": "Job[jdbc2hive] doesn't exist."
}
- Parameter description of a failed response
Parameter Mandatory Type Description
errCode Yes String Error code
externalMessage Yes String Error message
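The two response shapes, an empty body on success and errCode/externalMessage on failure, can be distinguished as in this sketch; the helper name is hypothetical:

```python
# Interpret the delete-job response: an empty body means the job was
# deleted; otherwise errCode and externalMessage describe the failure.
# The helper name is hypothetical.
def delete_failed(response_body):
    """Return a formatted error message if deletion failed, else None."""
    if response_body.get("errCode"):
        return f"{response_body['errCode']}: {response_body.get('externalMessage', '')}"
    return None
```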
Return Value
- Normal
200
- Abnormal
Return Value Description
400 Bad Request Request error.
401 Unauthorized Authentication failed.
403 Forbidden No operation permission.
404 Not Found The requested resource is not found.
500 Internal Server Error Internal error. For details about the returned error code, see Error Code.
503 Service Unavailable Service unavailable.
6 Public Data Structures
6.1 Link Parameter Description
6.1.1 Link to a Relational Database
Description
By creating a JDBC link, you can extract data from or load data to the following relational databases:
- Data Warehouse Service
- RDS for MySQL
- RDS for PostgreSQL
- RDS for SQL Server
- DDM
- MySQL
- PostgreSQL
- Microsoft SQL Server
- Oracle
- IBM Db2
- FusionInsight LibrA
- Derecho (GaussDB)
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.databaseType", "value": "MYSQL" },
          { "name": "linkConfig.host", "value": "10.120.205.30" },
          { "name": "linkConfig.port", "value": "3306" },
          { "name": "linkConfig.database", "value": "DB_name" },
          { "name": "linkConfig.username", "value": "username" },
          { "name": "linkConfig.password", "value": "Add password here" },
          { "name": "linkConfig.fetchSize", "value": "100000" },
          { "name": "linkConfig.usingNative", "value": "false" },
          { "name": "linkConfig.useSSL", "value": "false" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "mysql_link",
    "connector-name": "generic-jdbc-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.databaseType Yes Enumeration Database type. The options are as follows:
- ORACLE
- MYSQL
- SQLSERVER
- DB2
- POSTGRESQL
- GAUSSDB
- DWS
linkConfig.host No String IP address of the database server
linkConfig.port No String Port number of the database server
linkConfig.databaseconfig No Enumeration Oracle database link type. This parameter is available only when an Oracle link is created. The options are as follows:
- SERVICENAME: Use SERVICE_NAME to connect to the Oracle database.
- SID: Use the SID to connect to the Oracle database.
linkConfig.sidname No String Oracle instance ID, which is used to differentiate databases by instance. This parameter is available only when an Oracle link is created and linkConfig.databaseconfig is set to SID.
linkConfig.database No String Database name
linkConfig.username Yes String Username
linkConfig.password Yes String Password
linkConfig.fetchSize No String Number of data rows obtained each time
linkConfig.usingNative No Boolean Whether to use the local API acceleration function of the database. When creating a MySQL link, you can use the LOAD DATA function of MySQL to accelerate data import and improve the performance of importing data to the MySQL database.
linkConfig.useSSL No Boolean Whether to use encrypted transmission. You can enable SSL encrypted transmission for Relational Database Service (RDS).
linkConfig.jdbcProperties No Map Link attributes, which specify the JDBC connector attributes of the data source. For details about how to configure the link attributes, see the JDBC connector description of the corresponding database.
linkConfig.version No Enumeration Oracle database version. This parameter is available only when you create an Oracle link. The options are as follows:
- HIGH_VERSION: Select this value if the Oracle database version is later than 12.1.
- MED_VERSION: Select this value if the Oracle database version is 12.1.
- LOW_VERSION: Select this value if the Oracle database version is earlier than 12.1.
If the error message "java.sql.SQLException: Protocol violation" is displayed, select another option.
dialect.identifierEnclose No String Reference identifier, which is the delimiter between referenced table names or column names. For details, see the product documentation of the corresponding database.
linkConfig.importMode No Enumeration Data import mode. This parameter is available only when the database type is DWS. The options are as follows:
- GDS: In GDS mode, port 25000 is enabled on CDM, and multiple DWS DataNodes extract data from CDM.
- COPY: In COPY mode, CDM copies data to DWS by using the JDBC API of DWS.
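As a sketch, the minimal JDBC link payload shown in the Sample Link can be assembled from a few values. The helper name and the reduced input set are assumptions (optional inputs such as linkConfig.fetchSize and linkConfig.useSSL are omitted):

```python
# Assemble a minimal JDBC link payload matching the Sample Link structure.
# The helper name is hypothetical; field names come from the sample above.
def jdbc_link(name, db_type, host, port, database, username, password):
    inputs = [
        {"name": "linkConfig.databaseType", "value": db_type},
        {"name": "linkConfig.host", "value": host},
        {"name": "linkConfig.port", "value": port},
        {"name": "linkConfig.database", "value": database},
        {"name": "linkConfig.username", "value": username},
        {"name": "linkConfig.password", "value": password},
    ]
    return {
        "links": [{
            "link-config-values": {
                "configs": [{"inputs": inputs, "name": "linkConfig"}]
            },
            "name": name,
            "connector-name": "generic-jdbc-connector",
        }]
    }
```

The returned dict can be serialized with `json.dumps` as the request body for the link-creation API.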
6.1.2 Link to OBS
Description
By creating an OBS link, you can extract files from or load files to OBS. Files in CSV, JSON, and binary formats are supported.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.storageType", "value": "OBS" },
          { "name": "linkConfig.server", "value": "10.121.16.183" },
          { "name": "linkConfig.port", "value": "443" },
          { "name": "linkConfig.accessKey", "value": "RSO6TTEZMJ6TTFBBAACE" },
          { "name": "linkConfig.securityKey", "value": "Add password here" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "obs_link",
    "connector-name": "obs-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.storageType Yes String Storage class of an object
linkConfig.server Yes String Endpoint of the OBS server. You can enter a bucket-level domain name. For example, if you set the parameter to test.obs.myhuaweicloud.com, only the test bucket can be queried.
linkConfig.port Yes String Data transmission port. The HTTPS port number is 443 and the HTTP port number is 80.
linkConfig.accessKey Yes String AK
linkConfig.securityKey Yes String SK
6.1.3 Link to OSS on Alibaba Cloud
Description
By creating an OSS link, you can extract files from Object Storage Service (OSS) on Alibaba Cloud. Files in CSV, JSON, and binary formats are supported.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.storageType", "value": "OSS" },
          { "name": "linkConfig.ossEndpoint", "value": "oss-cn-qingdao.aliyuncs.com" },
          { "name": "linkConfig.ossAuthType", "value": "ACCESS_KEY" },
          { "name": "linkConfig.accessKey", "value": "LTAI1mR805IE0cVr" },
          { "name": "linkConfig.securityKey", "value": "Add password here" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "op_oss_link",
    "connector-name": "obs-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.storageType Yes String Storage class of an object
linkConfig.ossEndpoint Yes String Endpoint of OSS
linkConfig.ossAuthType Yes Enumeration Identity authentication method. The options are as follows:
- ACCESS_KEY: Use the access key to access OSS.
- STS: Use the temporary key and security token to access OSS.
linkConfig.accessKey Yes String AK
linkConfig.securityKey Yes String SK
linkConfig.securityToken No String Temporary security token required when a temporary access key is used to log in to OSS
6.1.4 Link to KODO/COS
Description
By creating a KODO/COS link, you can extract files from Qiniu Cloud's object storage (KODO) or Tencent Cloud's Cloud Object Storage (COS). Files in CSV, JSON, and binary formats are supported.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.storageType", "value": "KODO_COS" },
          { "name": "linkConfig.zone", "value": "z1" },
          { "name": "linkConfig.accessKey", "value": "xib1R_oeN4KBYTA-yNoV0Kd3AbUwi1yuvzbPF4LE" },
          { "name": "linkConfig.securityKey", "value": "Add password here" },
          { "name": "linkConfig.useCustomDomain", "value": "false" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "kodo_link",
    "connector-name": "thirdparty-obs-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.storageType Yes String Storage class of an object
linkConfig.zone Yes String Storage area of KODO/COS
linkConfig.accessKey Yes String AK
linkConfig.securityKey Yes String SK
linkConfig.useCustomDomain No Boolean This parameter is available only for KODO links. You can choose whether to use a custom domain name to download objects.
6.1.5 Link to HDFS
Description
By creating an HDFS link, you can extract files from or load files to MRS, FusionInsight HD, or Apache Hadoop. Files in CSV, Parquet, and binary formats are supported.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.hadoopType", "value": "FusionInsight HD" },
          { "name": "linkConfig.host", "value": "10.120.205.143" },
          { "name": "linkConfig.casPort", "value": "20009" },
          { "name": "linkConfig.port", "value": "28443" },
          { "name": "linkConfig.authType", "value": "KERBEROS" },
          { "name": "linkConfig.user", "value": "admin" },
          { "name": "linkConfig.password", "value": "Add password here" },
          { "name": "linkConfig.runMode", "value": "STANDALONE" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "hdfslink",
    "connector-name": "hdfs-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.hadoopType Yes Enumeration Hadoop type. The options are as follows:
- MRS: link to HDFS of MRS
- FusionInsight HD: link to HDFS of FusionInsight HD
- Apache Hadoop: link to HDFS of Apache Hadoop
linkConfig.uri No String NameNode URI required for the link to Apache Hadoop. The format is ip:port.
linkConfig.host No String IP address of Manager, required for the link to MRS or FusionInsight HD
linkConfig.port No String Port number of Manager, required for the link to FusionInsight HD
linkConfig.casPort No String Port number of the CAS Server that connects to FusionInsight HD, required for the link to FusionInsight HD
linkConfig.user No String Username for accessing Manager. If SIMPLE authentication is used for connecting to MRS, you do not need to enter the username and password for accessing Manager.
linkConfig.password No String Password for accessing Manager. If SIMPLE authentication is used for connecting to MRS, you do not need to enter the username and password for accessing Manager.
linkConfig.authType No Enumeration Authentication method. The options are as follows:
- Simple: for non-security mode
- Kerberos: for security mode
linkConfig.principal No String Account principal required for Kerberos authentication. You can contact the administrator to obtain the account.
linkConfig.keytab No FileContent Local absolute path of the keytab file required for Kerberos authentication. You can contact the administrator to obtain the file.
linkConfig.runMode No Enumeration Running mode of the HDFS link. The options are as follows:
- EMBEDDED: The link instance runs with CDM. This mode delivers better performance.
- STANDALONE: The link instance runs in an independent process. If CDM needs to connect to multiple Hadoop data sources (MRS, Hadoop, or CloudTable) with both Kerberos and Simple authentication methods, STANDALONE prevails.
If STANDALONE is selected, CDM can migrate data between the HDFS services of multiple MRS clusters.
6.1.6 Link to HBase
Description
By creating an HBase link, you can extract data from or load data to HBase of MRS, FusionInsight HD, or Apache Hadoop.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.hbaseType", "value": "MRS" },
          { "name": "linkConfig.host", "value": "192.168.2.23" },
          { "name": "linkConfig.authType", "value": "SIMPLE" },
          { "name": "linkConfig.runMode", "value": "STANDALONE" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "mrshbase",
    "connector-name": "hbase-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.hbaseType Yes Enumeration HBase type. The options are as follows:
- CloudTable: link to CloudTable Service (CloudTable)
- MRS: link to HBase of MRS
- FusionInsight HD: link to HBase of FusionInsight HD
- Apache Hadoop: link to HBase of Apache Hadoop
linkConfig.uri No String NameNode URI required for the link to Apache Hadoop. The format is ip:port.
linkConfig.host No String IP address of Manager, required for the link to MRS or FusionInsight HD
linkConfig.port No String Port number of Manager, required for the link to FusionInsight HD
linkConfig.casPort No String Port number of the CAS Server that connects to FusionInsight HD, required for the link to FusionInsight HD
linkConfig.user No String Username for accessing Manager. If SIMPLE authentication is used for connecting to MRS, you do not need to enter the username and password for accessing Manager.
linkConfig.password No String Password for accessing Manager. If SIMPLE authentication is used for connecting to MRS, you do not need to enter the username and password for accessing Manager.
linkConfig.authType No Enumeration Authentication method. The options are as follows:
- Simple: for non-security mode
- Kerberos: for security mode
linkConfig.principal No String Account principal required for Kerberos authentication. You can contact the administrator to obtain the account.
linkConfig.keytab No FileContent Local absolute path of the keytab file required for Kerberos authentication. You can contact the administrator to obtain the file.
linkConfig.serviceType No String Service type
linkConfig.runMode No Enumeration Running mode of the HBase link. The options are as follows:
- EMBEDDED: The link instance runs with CDM. This mode delivers better performance.
- STANDALONE: The link instance runs in an independent process. If CDM needs to connect to multiple Hadoop data sources (MRS, Hadoop, or CloudTable) with both Kerberos and Simple authentication methods, STANDALONE prevails.
6.1.7 Link to CloudTable
Description
By creating a CloudTable link, you can extract data from or load data to CloudTable.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.hbaseType", "value": "CloudTable" },
          { "name": "linkConfig.zookeeperQuorum", "value": "cloudtable-pass-zk2-bae54VGN.cloudtable.com:2181,cloudtable-pass-zk1-Fu828so2.cloudtable.com:2181" },
          { "name": "linkConfig.iamAuth", "value": "true" },
          { "name": "linkConfig.cloudtableUser", "value": "zane" },
          { "name": "linkConfig.accessKey", "value": "GRC2WR0IxxxxxxxYLWU2" },
          { "name": "linkConfig.securityKey", "value": "Add password here" },
          { "name": "linkConfig.runMode", "value": "EMBEDDED" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "cloudtablelink",
    "connector-name": "hbase-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.hbaseType Yes Enumeration HBase type. The options are as follows:
- CloudTable: link to CloudTable
- MRS: link to MRS
- FusionInsight HD: link to FusionInsight HD
- Apache Hadoop: link to Apache Hadoop
linkConfig.zookeeperQuorum Yes String ZooKeeper link of CloudTable. This parameter is mandatory for the CloudTable link.
linkConfig.iamAuth Yes Boolean If you choose IAM for identity authentication, enter the username, AK, and SK.
linkConfig.runMode Yes Enumeration Running mode of the HBase link. The options are as follows:
- EMBEDDED: The link instance runs with CDM. This mode delivers better performance.
- STANDALONE: The link instance runs in an independent process. If CDM needs to connect to multiple Hadoop data sources (MRS, Hadoop, or CloudTable) with both Kerberos and Simple authentication methods, STANDALONE prevails.
linkConfig.cloudtableUser Yes String Username for accessing the CloudTable cluster
linkConfig.accessKey Yes String AK for accessing the CloudTable cluster
linkConfig.securityKey Yes String SK for accessing the CloudTable cluster
6.1.8 Link to Hive
Description
By creating a Hive link, you can extract data from or load data to Hive of MRS.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.host", "value": "10.120.205.230" },
          { "name": "linkConfig.authType", "value": "KERBEROS" },
          { "name": "linkConfig.user", "value": "cdm" },
          { "name": "linkConfig.password", "value": "Add password here" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "hive_link",
    "connector-name": "hive-connector"
  }]
}
Link Parameters
Parameter Mandatory Type Description
linkConfig.host No String IP address of MRS Manager
linkConfig.authType No Enumeration Authentication method of MRS. The options are as follows:
- SIMPLE: for non-security mode
- KERBEROS: for security mode
linkConfig.principal No String Account principal required for Kerberos authentication. You can contact the administrator to obtain the account.
linkConfig.keytab No FileContent Local absolute path of the keytab file required for Kerberos authentication. You can contact the administrator to obtain the file.
6.1.9 Link to an FTP or SFTP Server
Description
By creating an FTP or SFTP link, you can extract files from or load files to an FTP or SFTP server. Files in CSV, JSON, and binary formats are supported.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.server", "value": "10.120.85.167" },
          { "name": "linkConfig.port", "value": "22" },
          { "name": "linkConfig.username", "value": "username" },
          { "name": "linkConfig.password", "value": "Add password here" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "sftp_link",
    "connector-name": "sftp-connector"
  }]
}
Link Parameters
Parameters for creating an FTP link are the same as those for creating an SFTP link.
Parameter Mandatory Type Description
linkConfig.server Yes String IP address of the FTP or SFTP server
linkConfig.port Yes String Port number of the FTP or SFTP server
linkConfig.username Yes String Username for logging in to the FTP or SFTP server
linkConfig.password Yes String Password for logging in to the FTP or SFTP server
6.1.10 Link to MongoDB
Description
By creating a MongoDB link, you can extract data from or load data to MongoDB.
Sample Link
{
  "links": [{
    "link-config-values": {
      "configs": [{
        "inputs": [
          { "name": "linkConfig.serverList", "value": "10.120.84.149:27017" },
          { "name": "linkConfig.database", "value": "DB_name" },
          { "name": "linkConfig.userName", "value": "username" },
          { "name": "linkConfig.password", "value": "Add password here" }
        ],
        "name": "linkConfig"
      }]
    },
    "name": "mongo_link",
    "connector-name": "mongodb-connector"
  }]
}
Link Parameters

- linkConfig.serverList (Yes, String): Server IP address list, in host1:port1;host2:port2 format.
- linkConfig.database (Yes, String): Name of the MongoDB database.
- linkConfig.userName (Yes, String): Username for logging in to the MongoDB server.
- linkConfig.password (Yes, String): Password for logging in to the MongoDB server.
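linkConfig.serverList follows the host1:port1;host2:port2 convention (the Redis link's serverlist uses the same format). The following sketch validates such a value on the client side before the link is created; the helper is an illustration, not part of the CDM API.

```python
def parse_server_list(server_list):
    """Split a host1:port1;host2:port2 style value into (host, port) pairs,
    rejecting malformed entries early rather than at job run time."""
    servers = []
    for entry in server_list.split(";"):
        host, _, port = entry.strip().rpartition(":")
        if not host or not port.isdigit():
            raise ValueError("malformed server entry: %r" % entry)
        servers.append((host, int(port)))
    return servers
```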
6.1.11 Link to Redis/DCS
Description

By creating a Redis link, you can extract data from or load data to the Redis server. By creating a DCS link, you can load data to Data Cache Service (DCS), but you cannot extract data from DCS. The data can be stored in string or hash format.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.deploymentMode", "value": "Cluster"},
              {"name": "linkConfig.serverlist", "value": "10.120.84.149:7300"},
              {"name": "linkConfig.password", "value": "Add password here"},
              {"name": "linkConfig.dbIndex", "value": "0"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "redis_link",
      "connector-name": "redis-connector"
    }
  ]
}
Link Parameters

- linkConfig.deploymentMode (Yes, Enumeration): Redis deployment mode. The options are as follows:
  - Single: single-node deployment
  - Cluster: cluster deployment
- linkConfig.serverlist (Yes, String): Server IP address list, in host1:port1;host2:port2 format.
- linkConfig.password (Yes, String): Password for logging in to the Redis server.
- linkConfig.dbIndex (Yes, String): Redis database index.
6.1.12 Link to NAS/SFS
Description

By creating a NAS link, you can extract files from or load files to the NAS server. Currently, the link supports the SMB, CIFS, and NFS protocols, as well as cloud services that provide file systems supporting these protocols, such as Scalable File Service (SFS). Files in CSV, JSON, and binary formats are supported.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.protocol", "value": "CIFS"},
              {"name": "linkConfig.server", "value": "\\\\10.121.16.20\\data"},
              {"name": "linkConfig.username", "value": "username"},
              {"name": "linkConfig.password", "value": "Add password here"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "nas_link",
      "connector-name": "nas-connector"
    }
  ]
}
Link Parameters

- linkConfig.protocol (Yes, String): NAS file protocol. Currently, only the SMB, CIFS, and NFS protocols are supported.
- linkConfig.server (Yes, String): Shared path of the NAS server, in which \\\\ is converted into \\.
- linkConfig.username (Yes, String): Username for accessing the NAS server.
- linkConfig.password (Yes, String): Password for accessing the NAS server.
6.1.13 Link to Kafka
Description

By creating a Kafka link, you can access open-source Kafka and migrate data from Kafka to other data sources as required. Currently, only data export from Kafka is supported.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.brokerList", "value": "127.0.0.1:9092"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "kafka_link",
      "connector-name": "kafka-connector"
    }
  ]
}
Link Parameters

- linkConfig.brokerList (Yes, String): Kafka broker list, in host1:port1,host2:port2 format.
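Note that the broker list is comma-separated, unlike the semicolon-separated server lists of the MongoDB and Redis links. A sketch of a client-side validity check (illustrative only, not a CDM API call):

```python
def validate_broker_list(broker_list):
    """Return True if every entry in a host1:port1,host2:port2 broker list
    has a non-empty host and a numeric port."""
    entries = [e.strip() for e in broker_list.split(",")]
    return all(
        ":" in e and e.rsplit(":", 1)[0] and e.rsplit(":", 1)[1].isdigit()
        for e in entries
    )
```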
6.1.14 Link to DIS
Description

By creating a DIS link, you can access DIS and migrate data from DIS to other data sources as required.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.region", "value": "Region"},
              {"name": "linkConfig.endpoint", "value": "https://dis.cn-north-1.myhuaweiclouds.com"},
              {"name": "linkConfig.ak", "value": "RSO6TTEZMJ6TTFBBAACE"},
              {"name": "linkConfig.sk", "value": "Add password here"},
              {"name": "linkConfig.projectId", "value": "11d4d5af17c84660bc90b6631327d7c7"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "dis_link",
      "connector-name": "dis-connector"
    }
  ]
}
Link Parameters

- linkConfig.region (Yes, String): Region where DIS resides.
- linkConfig.endpoint (Yes, String): URL of DIS, in the format https://Endpoint.
- linkConfig.ak (Yes, String): AK for accessing the DIS server.
- linkConfig.sk (Yes, String): SK for accessing the DIS server.
- linkConfig.projectId (Yes, String): Project ID.
6.1.15 Link to Elasticsearch/Cloud Search Service
Description
By creating an Elasticsearch link, you can extract data from or load data to the Elasticsearch server or Cloud Search Service.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.host", "value": "192.168.0.1:9200;192.168.0.2:9200"},
              {"name": "linkConfig.user", "value": "cdm"},
              {"name": "linkConfig.password", "value": "Add password here"},
              {"name": "linkConfig.linkType", "value": "ElasticSearch"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "es_link",
      "connector-name": "elasticsearch-connector"
    }
  ]
}
Link Parameters

- linkConfig.host (Yes, String): List of one or more Elasticsearch servers, including the port number, in ip:port format. Use semicolons (;) to separate multiple addresses, for example, 192.168.0.1:9200;192.168.0.2:9200.
- linkConfig.user (No, String): Username. For Elasticsearch servers that support username and password authentication, configure the username and password when creating the link.
- linkConfig.password (No, String): Password for accessing the Elasticsearch server.
- linkConfig.linkType (Yes, String): Link type, which is used to distinguish the Elasticsearch link from the Cloud Search Service link.
6.1.16 Link to DLI
Description

By creating a DLI link, you can import data to DLI. Currently, you cannot export data from DLI using CDM.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.ak", "value": "GRC2WR0IDC6NGROYLWU2"},
              {"name": "linkConfig.sk", "value": "Add password here"},
              {"name": "linkConfig.region", "value": "cn-north-1"},
              {"name": "linkConfig.projectId", "value": "c48475ce8e174a7a9f775706a3d5ebe2"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "dli",
      "connector-name": "dli-connector"
    }
  ]
}
Link Parameters

- linkConfig.ak (Yes, String): AK for accessing the DLI database.
- linkConfig.sk (Yes, String): SK for accessing the DLI database.
- linkConfig.region (Yes, String): Region where DLI resides.
- linkConfig.projectId (Yes, String): Project ID of the DLI service.
6.1.17 Link to CloudTable OpenTSDB
Description
By creating an OpenTSDB link, you can extract data from and load data to CloudTable OpenTSDB.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.openTSDBQuorum", "value": "opentsdb-sp8afz7bgbps5ur.cloudtable.com:4242"},
              {"name": "linkConfig.securityMode", "value": "UNSAFE"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "opentsdb",
      "connector-name": "opentsdb-connector"
    }
  ]
}
Link Parameters

- linkConfig.openTSDBQuorum (Yes, String): ZooKeeper link of OpenTSDB.
- linkConfig.securityMode (Yes, String): Security or non-security mode. If you select the security mode, enter the project ID, username, and AK/SK.
- linkConfig.user (No, String): Username for accessing CloudTable.
- linkConfig.ak (No, String): AK for accessing CloudTable.
- linkConfig.sk (No, String): SK for accessing CloudTable.
- linkConfig.projectId (No, String): Project ID of CloudTable.
6.1.18 Link to Amazon S3
Description
By creating an Amazon S3 link, you can extract files from Amazon S3. Files in CSV, JSON, and binary formats are supported.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.storageType", "value": "S3"},
              {"name": "linkConfig.accessKey", "value": "AKIAIPRxxxxxHYWEGDWQ"},
              {"name": "linkConfig.securityKey", "value": "Add password here"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "awslink",
      "connector-name": "thirdparty-obs-connector"
    }
  ]
}
Link Parameters

- linkConfig.storageType (Yes, String): Storage class of the object.
- linkConfig.accessKey (Yes, String): AK for accessing the Amazon S3 bucket.
- linkConfig.securityKey (Yes, String): SK for accessing the Amazon S3 bucket.
6.1.19 Link to DMS Kafka
Description

By creating a DMS Kafka link, you can connect to Kafka Basic or Kafka Platinum on DMS. Currently, you can only export data from DMS Kafka to Cloud Search Service.
Sample Link

{
  "links": [
    {
      "link-config-values": {
        "configs": [
          {
            "inputs": [
              {"name": "linkConfig.kafkaType", "value": "Basic"},
              {"name": "linkConfig.endpoint", "value": "dms-kafka.cn-north-1.myhuaweicloud.com:37000"},
              {"name": "linkConfig.ak", "value": "GRC2WR0IDC6NGROYLWU2"},
              {"name": "linkConfig.sk", "value": "Add password here"},
              {"name": "linkConfig.projectId", "value": "c48475ce8e174a7a9f775706a3d5ebe2"},
              {"name": "linkConfig.targetProjectId", "value": "be1a07648c074b948700a03fa3c0990e"}
            ],
            "name": "linkConfig"
          }
        ]
      },
      "name": "dms",
      "connector-name": "dms-kafka-connector"
    }
  ]
}
Link Parameters

- linkConfig.kafkaType (Yes, Enumeration): DMS Kafka version. The options are as follows:
  - Basic: Kafka Basic
  - Platinum: Kafka Platinum
- linkConfig.endpoint (Yes, String): DMS endpoint, in host:port format.
- linkConfig.ak (Yes, String): AK for accessing DMS Kafka.
- linkConfig.sk (Yes, String): SK for accessing DMS Kafka.
- linkConfig.projectId (Yes, String): Project ID in the region where DMS resides.
- linkConfig.targetProjectId (No, String): To access queues authorized by other tenants, enter the grantor's project ID.
6.2 Source Job Parameters
6.2.1 From a Relational Database
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        {"name": "fromJobConfig.useSql", "value": "false"},
        {"name": "fromJobConfig.schemaName", "value": "rf_database"},
        {"name": "fromJobConfig.tableName", "value": "rf_from"},
        {"name": "fromJobConfig.columnList", "value": "AA&BB"},
        {"name": "fromJobConfig.incrMigration", "value": "false"},
        {"name": "fromJobConfig.createOutTable", "value": "false"}
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

- fromJobConfig.useSql (Yes, Boolean): Whether to use a customized SQL statement to export data from the relational database.
- fromJobConfig.sql (No, String): Customized SQL statement. CDM executes the SQL statement to export data.
- fromJobConfig.schemaName (Yes, String): Database schema or tablespace, for example, public.
  NOTE: The value can contain the wildcard character (*), which is used to export all databases whose names start with, end with, or contain a certain string. Examples:
  - SCHEMA* exports all databases whose names start with SCHEMA.
  - *SCHEMA exports all databases whose names end with SCHEMA.
  - *SCHEMA* exports all databases whose names contain SCHEMA.
- fromJobConfig.tableName (Yes, String): Table name, for example, TBL_EXAMPLE.
  NOTE: The table name can contain the wildcard character (*), which is used to export all tables whose names start with, end with, or contain a certain string. The number and types of fields in the matched tables must be the same. Examples:
  - table* exports all tables whose names start with table.
  - *table exports all tables whose names end with table.
  - *table* exports all tables whose names contain table.
- fromJobConfig.whereClause (No, String): WHERE clause used to specify the data to be extracted. If no WHERE clause is configured, the entire table is extracted. For example, age > 18 and age <= 60.
- fromJobConfig.columnList (No, String): List of fields to be extracted. Use & to separate field names, for example, id&gid&name.
- fromJobConfig.partitionColumn (No, String): Partition field to be extracted, by which a job is split into multiple sub-jobs executed concurrently. For example, id.
- fromJobConfig.regainSymbol (No, String): After this parameter is set to a specified field, CDM queries the table imported into the destination database every time the job starts. If the destination table does not contain the specified field, CDM performs a full migration. If the destination table contains the specified field and the field has a value, CDM performs an incremental migration, migrating only the data whose value is greater than the current value of this field.
- fromJobConfig.incrMigration (No, Boolean): Whether to migrate incremental data in MySQL binlog mode. Currently, this mode can be used only in table/file migration from a MySQL database to a DWS database. After this function is enabled, data in the source table and destination table can be synchronized in real time. One MySQL link supports only one incremental migration job, and one source table supports only one incremental migration job.
- fromJobConfig.usePartition (No, Boolean): When data is exported from an Oracle database, data can be extracted from each partition of a partitioned table. When this function is enabled, you can use the fromJobConfig.partitionList parameter to specify the partitions of the Oracle table. This function does not support non-partitioned tables.
- fromJobConfig.partitionList (No, String): Oracle table partitions to be migrated. Separate multiple partitions with ampersands (&). If you do not set this parameter, all partitions are migrated.
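The wildcard semantics of fromJobConfig.tableName (and fromJobConfig.schemaName) can be reproduced locally to preview which tables a pattern would select. This sketch uses Python's fnmatch under the assumption that * is the only wildcard in play; it illustrates the matching rule, not CDM's actual matcher.

```python
from fnmatch import fnmatchcase

def select_tables(pattern, table_names):
    """Return the table names a tableName wildcard such as 'table*', '*table',
    or '*table*' would export (case-sensitive, '*' only)."""
    return [t for t in table_names if fnmatchcase(t, pattern)]

tables = ["table_a", "x_table", "my_table_1"]
```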
6.2.2 From Object Storage
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        {"name": "fromJobConfig.bucketName", "value": "cdm-est"},
        {"name": "fromJobConfig.inputDirectory", "value": "/obsfrom/varchar.txt"},
        {"name": "fromJobConfig.inputFormat", "value": "CSV_FILE"},
        {"name": "fromJobConfig.columnList", "value": "1&2&3"},
        {"name": "fromJobConfig.fieldSeparator", "value": ","},
        {"name": "fromJobConfig.quoteChar", "value": "false"},
        {"name": "fromJobConfig.regexSeparator", "value": "false"},
        {"name": "fromJobConfig.firstRowAsHeader", "value": "false"},
        {"name": "fromJobConfig.encodeType", "value": "UTF-8"},
        {"name": "fromJobConfig.fromCompression", "value": "NONE"},
        {"name": "fromJobConfig.splitType", "value": "FILE"}
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

- fromJobConfig.bucketName (Yes, String): Bucket name.
- fromJobConfig.inputDirectory (Yes, String): Path for storing the files to be extracted. You can enter a maximum of 50 file paths, separated by vertical bars (|). You can also customize the separator, for example, FROM/example.csv|FROM/b.txt.
- fromJobConfig.inputFormat (Yes, Enumeration): File format required for data transmission. Currently, the following file formats are supported:
  - CSV_FILE: CSV format, used to migrate files to data tables.
  - JSON_FILE: JSON format, used to migrate files to data tables.
  - BINARY_FILE: Files (even those not in binary format) are transferred directly without being parsed. It is applicable to file copy.
  - CARBONDATA: CarbonData format, used to migrate files to data tables. This option is available only when the migration source is OBS.
  If you select BINARY_FILE, the migration destination must also be a file system.
- fromJobConfig.lineSeparator (No, String): Line feed character in the file, for example, \n. This parameter is valid only when the file format is CSV_FILE. By default, \n, \r, or \r\n is automatically identified.
- fromJobConfig.columnList (No, String): Numbers of the columns to be extracted. Use & to separate column numbers in ascending order, for example, 1&3&5.
- fromJobConfig.regexSeparator (No, Boolean): Whether to use a regular expression to separate fields. This parameter is valid only when the file format is CSV_FILE.
- fromJobConfig.regex (No, String): Regular expression. This parameter is valid only when a regular expression is used to separate fields.
- fromJobConfig.fieldSeparator (No, String): Field delimiter. This parameter is valid only when the file format is CSV_FILE. The default value is a comma (,).
- fromJobConfig.quoteChar (No, Boolean): Whether to use the encircling symbol. If this parameter is set to true, field delimiters inside the encircling symbol are regarded as part of the string value. Currently, the default encircling symbol of CDM is the double quotation mark (").
- fromJobConfig.firstRowAsHeader (No, Boolean): Whether to regard the first line as the heading line. This parameter is valid only when the file format is CSV_FILE. When you migrate a CSV file to a table, CDM writes all data to the table by default. If this parameter is set to true, CDM uses the first line of the CSV file as the heading line and does not write that line to the destination table.
- fromJobConfig.fromCompression (No, Enumeration): Compression format. This parameter is valid only when the file format is CSV_FILE or JSON. The options are as follows:
  - NONE: Files in all formats are transferred.
  - GZIP: Files in gzip format are transferred.
  - ZIP: Files in ZIP format are transferred.
- fromJobConfig.jsonReferenceNode (No, String): Reference node. This parameter is valid when the file format is JSON_FILE. Data is resolved on the specified JSON node. If the data corresponding to the node is a JSON array, the system extracts data from the array in the same mode. Nested JSON nodes are separated by periods (.), for example, data.list.
- fromJobConfig.encodeType (No, String): Encoding type, for example, UTF_8 or GBK.
- fromJobConfig.fromFileOpType (No, Enumeration): Source file processing mode. After a job is completed, the source file can be renamed or deleted.
- fromJobConfig.useMarkerFile (No, Boolean): Whether to start a job by a marker file. A job starts only when a marker file for starting the job exists in the source path. Otherwise, the job is suspended for the period specified by fromJobConfig.waitTime.
- fromJobConfig.markerFile (No, String): Name of the marker file for starting a job, for example, ok.txt. After a marker file is specified, the task is executed only when the file exists in the source path. If no marker file is specified, this function is disabled by default.
- fromJobConfig.waitTime (No, String): Period of waiting for the marker file, in seconds. If you set Start Job by Marker File to Yes but no marker file exists in the source path, the job fails upon suspension timeout. If you set this parameter to 0 and no marker file exists in the source path, the job fails immediately.
- fromJobConfig.filterType (No, Enumeration): Filter type. The options are as follows:
  - WILDCARD: Enter a wildcard character to filter paths or files. CDM migrates only the paths or files that meet the filter condition.
  - TIME: Specify a time filter. CDM migrates only the files modified after the specified time point.
- fromJobConfig.pathFilter (No, String): Path filter, configured when the filter type is WILDCARD. It is used to filter file directories, for example, *input.
- fromJobConfig.fileFilter (No, String): File filter, configured when the filter type is WILDCARD. It is used to filter files in the specified directory. Use commas (,) to separate multiple filters, for example, *.csv,*.txt.
- fromJobConfig.startTime (No, String): If you set Filter Type to Time Filter, specify a time point to filter files; only the files modified after that time point are transferred. The time format must be yyyy-MM-dd HH:mm:ss. This parameter can be set to a date/time macro variable. For example, ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss,-90,DAY))} indicates that only files generated within the latest 90 days are migrated.
- fromJobConfig.endTime (No, String): If you set Filter Type to Time Filter, specify a time point to filter files; only the files modified before that time point are transferred. The time format must be yyyy-MM-dd HH:mm:ss. This parameter can be set to a date/time macro variable. For example, ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss))} indicates that only files whose modification time is earlier than the current time are migrated.
- fromJobConfig.fileSeparator (No, String): File path separator. If you enter multiple file paths in fromJobConfig.inputDirectory, CDM uses this separator to separate the paths. The default value is |.
- fromJobConfig.decryption (No, Enumeration): Whether to decrypt an encrypted file before export, and the decryption method. The options are as follows:
  - NONE: Do not decrypt; export the file directly.
  - AES-256-GCM: Use the AES-256-GCM (NoPadding) algorithm to decrypt the file and then export it.
- fromJobConfig.dek (No, String): Data decryption key. The key is a string of 64 hexadecimal characters and must be the same as the data encryption key toJobConfig.dek configured during encryption. If the encryption and decryption keys are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
- fromJobConfig.iv (No, String): Initialization vector required for decryption. The initialization vector is a string of 32 hexadecimal characters and must be the same as the initialization vector toJobConfig.iv configured during encryption. If the initialization vectors are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
- fromJobConfig.md5FileSuffix (No, String): Used to check whether the files extracted by CDM are consistent with those in the migration source.
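The date/time macro shown for fromJobConfig.startTime is expanded by CDM at job run time into a literal timestamp. The following Python sketch illustrates the value that dateformat(yyyy-MM-dd HH:mm:ss,-90,DAY) would resolve to; the macro itself is evaluated by CDM, and this helper is only an illustration.

```python
from datetime import datetime, timedelta

def dateformat_days_ago(days, now=None):
    """Illustrate dateformat(yyyy-MM-dd HH:mm:ss,-days,DAY): the current time
    shifted back by `days` days, rendered in yyyy-MM-dd HH:mm:ss form."""
    now = now or datetime.now()
    return (now - timedelta(days=days)).strftime("%Y-%m-%d %H:%M:%S")
```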
Cloud Data MigrationAPI Reference 6 Public Data Structures
Issue 18 (2019-07-25) Copyright © Huawei Technologies Co., Ltd. 116
6.2.3 From HDFS
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        {"name": "fromJobConfig.inputDirectory", "value": "/hdfsfrom/from_hdfs_est.csv"},
        {"name": "fromJobConfig.inputFormat", "value": "CSV_FILE"},
        {"name": "fromJobConfig.columnList", "value": "1"},
        {"name": "fromJobConfig.jsonType", "value": "JSON_OBJECT"},
        {"name": "fromJobConfig.fieldSeparator", "value": ","},
        {"name": "fromJobConfig.quoteChar", "value": "false"},
        {"name": "fromJobConfig.regexSeparator", "value": "false"},
        {"name": "fromJobConfig.firstRowAsHeader", "value": "false"},
        {"name": "fromJobConfig.encodeType", "value": "UTF-8"},
        {"name": "fromJobConfig.fromCompression", "value": "NONE"},
        {"name": "fromJobConfig.compressedFileSuffix", "value": "*"},
        {"name": "fromJobConfig.splitType", "value": "FILE"},
        {"name": "fromJobConfig.fromFileOpType", "value": "DO_NOTHING"},
        {"name": "fromJobConfig.useMarkerFile", "value": "false"},
        {"name": "fromJobConfig.fileSeparator", "value": "|"},
        {"name": "fromJobConfig.filterType", "value": "NONE"}
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

- HDFS job parameters:
  - fromJobConfig.inputDirectory (Yes, String): Path for storing the data to be extracted, for example, /data_dir.
  - fromJobConfig.inputFormat (Yes, Enumeration): File format required for data transmission. Currently, the following file formats are supported:
    - CSV_FILE: CSV format
    - PARQUET_FILE: Parquet format
    - BINARY_FILE: binary format
    If you select BINARY_FILE, the migration destination must also be a file system.
  - fromJobConfig.columnList (No, String): Numbers of the columns to be extracted. Use & to separate column numbers in ascending order, for example, 1&3&5.
  - fromJobConfig.lineSeparator (No, String): Line feed character. This parameter is valid only when the file format is CSV_FILE. The default value is \r\n.
  - fromJobConfig.fieldSeparator (No, String): Field delimiter. This parameter is valid only when the file format is CSV_FILE. The default value is a comma (,).
  - fromJobConfig.encodeType (No, String): Encoding type, for example, UTF_8 or GBK.
  - fromJobConfig.firstRowAsHeader (No, Boolean): Whether to regard the first line as the heading line. This parameter is valid only when the file format is CSV_FILE. When you migrate a CSV file to a table, CDM writes all data to the table by default. If this parameter is set to true, CDM uses the first line of the CSV file as the heading line and does not write that line to the destination table.
  - fromJobConfig.fromCompression (No, Enumeration): Compression format. Only source files in the specified compression format are transferred. NONE indicates that files in all formats are transferred.
  - fromJobConfig.splitType (No, Enumeration): Whether to split the input by file quantity or by size. If HDFS files are split, each shard is processed as a unit.
    - FILE: Split by file quantity. If there are 10 files and throttlingConfig.numExtractors is set to 5, each shard consists of two files.
    - SIZE: Split by file size. Files themselves are not split for balance. Suppose there are 10 files, nine of 10 MB and one of 200 MB. If throttlingConfig.numExtractors is set to 2, two shards are created: one processes the nine 10 MB files and the other processes the 200 MB file.
  - fromJobConfig.fromFileOpType (No, Enumeration): Source file processing mode. After a job is completed, the source file can be renamed or deleted.
  - fromJobConfig.markerFile (No, String): Name of the marker file for starting a job, for example, ok.txt. After a marker file is specified, the task is executed only when the file exists in the source path. If no marker file is specified, this function is disabled by default.
  - fromJobConfig.filterType (No, Enumeration): Filter type. The options are as follows:
    - WILDCARD: Enter a wildcard character to filter paths or files. CDM migrates only the paths or files that meet the filter condition.
    - TIME: Specify a time filter. CDM migrates only the files modified after the specified time point.
  - fromJobConfig.pathFilter (No, String): Path filter, configured when the filter type is WILDCARD. It is used to filter file directories, for example, *input.
  - fromJobConfig.fileFilter (No, String): File filter, configured when the filter type is WILDCARD. It is used to filter files in the specified directory. Use commas (,) to separate multiple filters, for example, *.csv,*.txt.
  - fromJobConfig.startTime (No, String): If you set Filter Type to Time Filter, specify a time point to filter files; only the files modified after that time point are transferred. The time format must be yyyy-MM-dd HH:mm:ss. This parameter can be set to a date/time macro variable. For example, ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss,-90,DAY))} indicates that only files generated within the latest 90 days are migrated.
  - fromJobConfig.endTime (No, String): If you set Filter Type to Time Filter, specify a time point to filter files; only the files modified before that time point are transferred. The time format must be yyyy-MM-dd HH:mm:ss. This parameter can be set to a date/time macro variable. For example, ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss))} indicates that only files whose modification time is earlier than the current time are migrated.
  - fromJobConfig.createSnapshot (No, Boolean): If this parameter is set to true, CDM creates a snapshot of the source directory to be migrated (a snapshot cannot be created for a single file) before reading files from HDFS, and then migrates the data in the snapshot. Only the HDFS administrator can create a snapshot. After the CDM job is completed, the snapshot is deleted.
  - fromJobConfig.formats (No, Data structure): Time format. This parameter is mandatory only when fromJobConfig.inputFormat is set to CSV_FILE and a time field exists in the file. For details, see the description of the fromJobConfig.formats parameter below.
  - fromJobConfig.decryption (No, Enumeration): This parameter is available only when fromJobConfig.inputFormat is set to BINARY_FILE. It specifies whether to decrypt an encrypted file before export, and the decryption method. The options are as follows:
    - NONE: Do not decrypt; export the file directly.
    - AES-256-GCM: Use the AES-256-GCM (NoPadding) algorithm to decrypt the file and then export it.
  - fromJobConfig.dek (No, String): Data decryption key. The key is a string of 64 hexadecimal characters and must be the same as the data encryption key toJobConfig.dek configured during encryption. If the encryption and decryption keys are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
  - fromJobConfig.iv (No, String): Initialization vector required for decryption. The initialization vector is a string of 32 hexadecimal characters and must be the same as the initialization vector toJobConfig.iv configured during encryption. If the initialization vectors are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
- Description of the fromJobConfig.formats parameter:
  - name (Yes, String): Column number, for example, 1.
  - value (Yes, String): Time format, for example, yyyy-MM-dd.
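fromJobConfig.formats is thus a list of {name, value} pairs, one per time column. A short sketch of building that structure (the helper name is illustrative, not part of the CDM API):

```python
def build_formats(column_formats):
    """Build the fromJobConfig.formats data structure: `name` is the column
    number as a string and `value` is the time format for that column."""
    return [{"name": str(col), "value": fmt} for col, fmt in column_formats]

# Column 1 holds dates, column 7 holds timestamps.
formats = build_formats([(1, "yyyy-MM-dd"), (7, "yyyy-MM-dd HH:mm:ss")])
```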
6.2.4 From Hive
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        {"name": "fromJobConfig.hive", "value": "hive"},
        {"name": "fromJobConfig.database", "value": "rf_database"},
        {"name": "fromJobConfig.table", "value": "rf_from"},
        {"name": "fromJobConfig.columnList", "value": "tiny&small&int&integer&bigint&float&double&timestamp&char&varchar&text"}
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
fromJobConfig.hive No String Data source to be extracted. If the data source is Hive, set this parameter to hive.
fromJobConfig.database No String Database from which data is extracted. For example, default.
fromJobConfig.table Yes String Name of the table from which data is extracted. For example, cdm.
fromJobConfig.columnList No String Numbers of columns to be extracted. Use & to separate column numbers in ascending order. For example, 1&3&5.
6.2.5 From HBase/CloudTable
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.table", "value": "rf_from" },
        { "name": "fromJobConfig.columnFamily", "value": "rowkey&f" },
        { "name": "fromJobConfig.columns", "value": "rowkey:rowkey&f:_small" },
        { "name": "fromJobConfig.formats", "value": { "f:_date": "yyyy-MM-dd", "f:_timestamp": "yyyy-MM-dd HH:mm:ss" } }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

- HBase/CloudTable job parameter description
Parameter Mandatory Type Description
fromJobConfig.table Yes String Name of the table from which data is extracted. For example, cdm.
fromJobConfig.columnFamilies No String Column family to which the data to be extracted belongs.
fromJobConfig.columns No String Columns to be extracted. Use & to separate columns and : to separate column families and columns. For example, cf1:c1&cf2:c2.
fromJobConfig.isSplit No Boolean Whether to split the rowkey. For example, true.
fromJobConfig.delimiter No String Delimiter used for splitting rowkeys. If this parameter is not set, rowkeys will not be split. For example, the vertical bar (|).
fromJobConfig.startTime No String Minimum timestamp of the time range (the time point is included). The format is yyyy-MM-dd hh:mm:ss. Only the data created at this point in time and later is extracted.
fromJobConfig.endTime No String Maximum timestamp of the time range (the time point is not included). The format is yyyy-MM-dd hh:mm:ss. Only the data created before this point in time is extracted.
fromJobConfig.formats No Data structure Time format. For details, see Description of the fromJobConfig.formats parameter.
- Description of the fromJobConfig.formats parameter
Parameter Mandatory Type Description
name Yes String Column number. For example, 1.
value Yes String Time format. For example, yyyy-MM-dd.
6.2.6 From FTP/SFTP/NAS/SFS
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.inputDirectory", "value": "/sftpfrom/from_sftp.csv" },
        { "name": "fromJobConfig.inputFormat", "value": "CSV_FILE" },
        { "name": "fromJobConfig.columnList", "value": "1&2&3&4&5&6&7&8&9&10&11&12" },
        { "name": "fromJobConfig.fieldSeparator", "value": "," },
        { "name": "fromJobConfig.regexSeparator", "value": "false" },
        { "name": "fromJobConfig.firstRowAsHeader", "value": "false" },
        { "name": "fromJobConfig.encodeType", "value": "UTF-8" },
        { "name": "fromJobConfig.fromCompression", "value": "NONE" },
        { "name": "fromJobConfig.splitType", "value": "FILE" }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

Source link job parameters of FTP, SFTP, NAS, and SFS are the same. Table 6-1 describes the parameters.
Table 6-1 Source link job parameters of file systems
Parameter Mandatory Type Description
fromJobConfig.inputDirectory Yes String Path for storing files to be extracted. You can enter a maximum of 50 file paths, separated by vertical bars (|). You can also customize the separator. For example, FROM/example.csv|FROM/b.txt.
fromJobConfig.inputFormat Yes Enumeration File format required for data transmission. Currently, the following file formats are supported:
- CSV_FILE: CSV format, used to migrate files to data tables
- JSON_FILE: JSON format, used to migrate files to data tables
- BINARY_FILE: Files (even those not in binary format) are transferred directly without being parsed. Applicable to file copy.
If you select BINARY_FILE, the migration destination must also be a file system.
fromJobConfig.lineSeparator No String Line feed character in the file. For example, \n. By default, \n, \r, or \r\n is automatically identified.
fromJobConfig.columnList No String Numbers of columns to be extracted. Use & to separate column numbers in ascending order. For example, 1&3&5.
fromJobConfig.lineSeparator No String Line feed character. This parameter is valid only when the file format is CSV_FILE. The default value is \r\n.
fromJobConfig.fieldSeparator No String Field delimiter. This parameter is valid only when the file format is CSV_FILE. The default value is ,.
fromJobConfig.quoteChar No Boolean Whether to use the encircling symbol. If this parameter is set to true, field delimiters inside the encircling symbol are regarded as part of the string value. Currently, the default encircling symbol of CDM is the double quotation mark (").
fromJobConfig.regexSeparator No Boolean Whether to use a regular expression to separate fields. This parameter is valid only when the file format is CSV_FILE.
fromJobConfig.regex No String Regular expression. This parameter is valid only when a regular expression is used to separate fields.
fromJobConfig.firstRowAsHeader No Boolean Whether to regard the first line as the heading line. This parameter is valid only when the file format is CSV_FILE. When you migrate a CSV file to a table, CDM writes all data to the table by default. If this parameter is set to true, CDM uses the first line of the CSV file as the heading line and does not write that line to the destination table.
fromJobConfig.fromCompression No Enumeration Compression format. This parameter is valid only when the file format is CSV_FILE or JSON. The options are as follows:
- NONE: Files in all formats are transferred.
- GZIP: Files in gzip format are transferred.
- ZIP: Files in ZIP format are transferred.
fromJobConfig.jsonReferenceNode No String Reference node. This parameter is valid when the file format is JSON_FILE. Data is resolved at the specified JSON node. If the data corresponding to the node is a JSON array, the system extracts data from the array in the same mode. Nested JSON nodes are separated by periods (.). For example, data.list.
fromJobConfig.encodeType No String Encoding type. For example, UTF-8 or GBK.
fromJobConfig.fromFileOpType No Enumeration Source file processing mode. After a job is completed, the source file can be renamed or deleted.
fromJobConfig.useMarkerFile No Boolean Whether to start a job by a marker file. A job is started only when a marker file for starting the job exists in the source path. Otherwise, the job is suspended for the period specified by fromJobConfig.waitTime.
fromJobConfig.markerFile No String Name of the marker file for starting a job. After a marker file is specified, the task is executed only when the file exists in the source path. If no marker file is specified, this function is disabled by default. For example, ok.txt.
fromJobConfig.waitTime No String Period of waiting for a marker file, in seconds. If you set Start Job by Marker File to Yes but no marker file exists in the source path, the job fails upon suspension timeout. If you set this parameter to 0 and no marker file exists in the source path, the job fails immediately.
fromJobConfig.filterType No Enumeration Filter type. Possible values are as follows:
- WILDCARD: Enter a wildcard character to filter paths or files. CDM migrates the paths or files that meet the filter condition.
- TIME: Specify a time filter. CDM migrates the files modified after the specified time point.
fromJobConfig.pathFilter No String Path filter, configured when the filter type is WILDCARD. It is used to filter file directories. For example, *input.
fromJobConfig.fileFilter No String File filter, configured when the filter type is WILDCARD. It is used to filter files in the specified directory. Use commas (,) to separate multiple filters. For example, *.csv,*.txt.
fromJobConfig.startTime No String If you set Filter Type to Time Filter, specify a time point to filter files. Only the files modified after the specified time point are transferred. The time format must be yyyy-MM-dd HH:mm:ss. This parameter can be set to a macro variable of date and time. For example, ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss,-90,DAY))} indicates that only files generated within the latest 90 days are migrated.
fromJobConfig.endTime No String If you set Filter Type to Time Filter, specify a time point to filter files. Only the files modified before the specified time point are transferred. The time format must be yyyy-MM-dd HH:mm:ss. This parameter can be set to a macro variable of date and time. For example, ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss))} indicates that only the files whose modification time is earlier than the current time are migrated.
fromJobConfig.fileSeparator No String File separator. If you enter multiple file paths in fromJobConfig.inputDirectory, CDM uses the file separator to separate them. The default value is |.
fromJobConfig.decryption No Enumeration Whether to decrypt the encrypted file before export, and the decryption method. The options are as follows:
- NONE: Do not decrypt the file; export it directly.
- AES-256-GCM: Decrypt the file with the AES-256-GCM (NoPadding) algorithm and then export it.
fromJobConfig.dek No String Data decryption key. The key is a string of 64 hexadecimal characters and must be the same as the data encryption key toJobConfig.dek configured during encryption. If the encryption and decryption keys are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
fromJobConfig.iv No String Initialization vector required for decryption. The initialization vector is a string of 32 hexadecimal characters and must be the same as the initialization vector toJobConfig.iv configured during encryption. If the initialization vectors are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
fromJobConfig.md5FileSuffix No String Suffix of the MD5 files used to check whether the files extracted by CDM are consistent with those in the migration source.
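The fromJobConfig.startTime macro above can be reasoned about concretely. This Python sketch only illustrates what ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss,-90,DAY))} resolves to; it is our approximation, not CDM code:

```python
from datetime import datetime, timedelta

def resolve_start_time(days_back=90, now=None):
    """Approximate the macro ${timestamp(dateformat(yyyy-MM-dd HH:mm:ss,-90,DAY))}:
    a timestamp string days_back days before now, in the required format."""
    now = now or datetime.now()
    return (now - timedelta(days=days_back)).strftime("%Y-%m-%d %H:%M:%S")

print(resolve_start_time(90, datetime(2019, 7, 25, 12, 0, 0)))  # 2019-04-26 12:00:00
```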
6.2.7 From HTTP/HTTPS
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.inputDirectory", "value": "http://10.114.196.186:8080/httpfrom/symbol.txt" },
        { "name": "fromJobConfig.inputFormat", "value": "BINARY_FILE" },
        { "name": "fromJobConfig.fromCompression", "value": "TARGZ" },
        { "name": "fromJobConfig.compressedFileSuffix", "value": "*" },
        { "name": "fromJobConfig.fileSeparator", "value": "|" }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
fromJobConfig.inputDirectory Yes String URL of the file to be extracted. These connectors read files with an HTTP/HTTPS URL, such as public files on a third-party object storage system or a web disk.
fromJobConfig.inputFormat Yes Enumeration File format required for data transmission. Currently, only the binary format is supported.
fromJobConfig.fromCompression No Enumeration Compression format of the source files. The options are as follows:
- NONE: Files in all formats are transferred.
- GZIP: Files in gzip format are transferred.
- ZIP: Files in ZIP format are transferred.
- TAR.GZ: Files in TAR.GZ format are transferred.
fromJobConfig.compressedFileSuffix No String Extension of the files to be decompressed. In a batch of files, only those whose names end with this extension are decompressed; other files are transferred in their original format. If you enter * or leave this parameter blank, all files are decompressed.
fromJobConfig.fileSeparator No String File separator. When multiple files are transferred, CDM uses the file separator to separate them. The default value is |.
fromJobConfig.useQuery No Boolean
- If this parameter is set to true, the name of the objects uploaded to OBS does not carry the query parameter.
- If this parameter is set to false, the name of the objects uploaded to OBS carries the query parameter.
fromJobConfig.decryption No Enumeration Whether to decrypt the encrypted file before export, and the decryption method. The options are as follows:
- NONE: Do not decrypt the file; export it directly.
- AES-256-GCM: Decrypt the file with the AES-256-GCM (NoPadding) algorithm and then export it.
fromJobConfig.dek No String Data decryption key. The key is a string of 64 hexadecimal characters and must be the same as the data encryption key toJobConfig.dek configured during encryption. If the encryption and decryption keys are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
fromJobConfig.iv No String Initialization vector required for decryption. The initialization vector is a string of 32 hexadecimal characters and must be the same as the initialization vector toJobConfig.iv configured during encryption. If the initialization vectors are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
fromJobConfig.md5FileSuffix No String Suffix of the MD5 files used to check whether the files extracted by CDM are consistent with those in the migration source.
6.2.8 From MongoDB/DDS
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.database", "value": "cdm" },
        { "name": "fromJobConfig.collectionName", "value": "rf_from" },
        { "name": "fromJobConfig.columnList", "value": "TINYTEST&SMALLTEST&INTTEST&INTEGERTEST&BIGINTTEST&FLOATTEST" },
        { "name": "fromJobConfig.isBatchMigration", "value": "false" },
        { "name": "fromJobConfig.filters", "value": "{'last_name': 'Smith'}" }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
fromJobConfig.database Yes String Name of the MongoDB/DDS database.
fromJobConfig.collectionName Yes String Name of the MongoDB/DDS collection.
fromJobConfig.columnList No String List of fields to be extracted. Use & to separate field names. For example, id&gid&name.
fromJobConfig.isBatchMigration No Boolean Whether to migrate all data in the database.
fromJobConfig.filters No String Filter condition. CDM migrates only the documents that meet the filter condition. Examples:
1. Filter by expression: {'last_name': 'Smith'} queries all documents whose last_name value is Smith.
2. Filter by parameter: { x : "john" }, { z : 1 } queries the z fields of all documents whose x value is john.
3. Filter by condition: { "field" : { $gt: 5 } } queries documents whose field value is greater than 5.
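A filter condition such as the third example serializes naturally into the fromJobConfig.filters string. A hedged Python sketch (the filter value is illustrative, matching the table above):

```python
import json

# Serialize a MongoDB-style filter into the fromJobConfig.filters string.
filter_condition = {"field": {"$gt": 5}}
filters_value = json.dumps(filter_condition)

inputs = [
    {"name": "fromJobConfig.filters", "value": filters_value},
]
print(filters_value)  # {"field": {"$gt": 5}}
```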
6.2.9 From Redis/DCS
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.isBatchMigration", "value": "false" },
        { "name": "fromJobConfig.keyPrefix", "value": "rf_string_from" },
        { "name": "fromJobConfig.keySeparator", "value": ":" },
        { "name": "fromJobConfig.valueStoreType", "value": "STRING" },
        { "name": "fromJobConfig.valueSeparator", "value": "," },
        { "name": "fromJobConfig.columnList", "value": "1&2&3&4&5&6&7&8&9&10&11&12" }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

- Redis job parameter description

Parameter Mandatory Type Description
fromJobConfig.isBatchMigration No Boolean Whether to migrate all data in the database.
fromJobConfig.keyPrefix Yes String Key prefix, which is the name of the corresponding association table. Mapping between Redis and the association table: the association table name plus the delimiter forms the Redis key, and a row of data in the association table is the Redis value.
fromJobConfig.keySeparator Yes String Key delimiter, which separates the association table name and the primary key.
fromJobConfig.valueStoreType Yes String Storage mode of rows of data in the association table on Redis. The options are string and hash.
- STRING: A row of data is stored as a character string, with delimiters separating columns. This mode uses less storage space.
- HASH: A row of data is stored in the hash table in column name:column value format.
fromJobConfig.valueSeparator No String Value delimiter. The default value is \tab. This parameter is valid when valueStoreType is set to STRING.
fromJobConfig.columnList No String List of fields to be extracted. Use & to separate field names. For example, id&gid&name.
fromJobConfig.formats No Data structure Time format. For details, see Description of the fromJobConfig.formats parameter.
- Description of the fromJobConfig.formats parameter
Parameter Mandatory Type Description
name Yes String Column number. For example, 1.
value Yes String Time format. For example, yyyy-MM-dd.
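The key mapping described above (Redis key = key prefix + key separator + primary-key value) can be sketched directly. The primary-key value here is hypothetical, chosen only to illustrate the composition:

```python
# Compose a Redis key the way the keyPrefix/keySeparator parameters describe.
key_prefix = "rf_string_from"   # association-table name, as in the sample
key_separator = ":"
primary_key = "1001"            # hypothetical primary-key value

redis_key = key_prefix + key_separator + primary_key
print(redis_key)  # rf_string_from:1001
```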
6.2.10 From DIS
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.streamName", "value": "cdm" },
        { "name": "fromJobConfig.disConsumerStrategy", "value": "FROM_LAST_STOP" },
        { "name": "fromJobConfig.isPermanency", "value": "true" },
        { "name": "fromJobConfig.maxPollRecords", "value": "100" },
        { "name": "fromJobConfig.shardId", "value": "0" },
        { "name": "fromJobConfig.separator", "value": "," }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description
Parameter Mandatory Type Description
fromJobConfig.streamName Yes String DIS stream name.
fromJobConfig.disConsumerStrategy Yes Enumeration Initial offset used when data is pulled from DIS. The options are as follows:
- LATEST: maximum offset, that is, the latest data
- FROM_LAST_STOP: data after the last stop
- EARLIEST: minimum offset, that is, the earliest data
fromJobConfig.isPermanency Yes Boolean Whether to run permanently.
fromJobConfig.maxPollRecords No String Maximum number of records that can be pulled from DIS each time.
fromJobConfig.shardId Yes String ID of the DIS partition. You can enter multiple partition IDs, separated by commas (,).
fromJobConfig.dataFormat Yes Enumeration Format used for parsing data. The options are as follows:
- BINARY: Data is transferred without being parsed, which is applicable to file migration.
- CSV: Source data is migrated after being parsed in CSV format.
fromJobConfig.separator No String Field delimiter.
fromJobConfig.appName No String Unique identifier of the consumer application.
6.2.11 From Kafka
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.topicsList", "value": "est1,est2" },
        { "name": "fromJobConfig.kafkaConsumerStrategy", "value": "EARLIEST" },
        { "name": "fromJobConfig.isPermanency", "value": "true" }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
fromJobConfig.topicsList Yes String List of Kafka topics. Separate multiple topics with commas (,).
fromJobConfig.kafkaConsumerStrategy Yes Enumeration Initial offset used when data is pulled from Kafka. The options are as follows:
- LATEST: maximum offset, that is, the latest data
- EARLIEST: minimum offset, that is, the earliest data
fromJobConfig.isPermanency Yes Boolean Whether to run permanently.
fromJobConfig.groupId No String Consumer group ID. If you export data from DMS Kafka, enter any value for Kafka Platinum but a valid consumer group ID for Kafka Basic.
fromJobConfig.dataFormat Yes Enumeration Format used for parsing data. The options are as follows:
- BINARY: Data is transferred without being parsed, which is applicable to file migration.
- CSV: Source data is migrated after being parsed in CSV format.
fromJobConfig.maxPollRecords No String Maximum number of records that can be pulled from Kafka each time.
fromJobConfig.maxPollInterval No String Maximum interval between polls.
fromJobConfig.separator No String Field delimiter.
6.2.12 From Elasticsearch/Cloud Search Service
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.index", "value": "cdm" },
        { "name": "fromJobConfig.type", "value": "es" },
        { "name": "fromJobConfig.columnList", "value": "a1:numeric&s1:string" },
        { "name": "fromJobConfig.splitNestedField", "value": "true" },
        { "name": "fromJobConfig.queryString", "value": "last_name:Smith" }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description
Parameter Mandatory Type Description
fromJobConfig.index Yes String Index of the extracted data, similar to a database name in a relational database.
fromJobConfig.type Yes String Type of the extracted data, similar to a table name in a relational database.
fromJobConfig.columnList No String List of fields to be extracted. Use & to separate field names. For example, id&gid&name.
fromJobConfig.splitNestedField No Boolean Whether to split the JSON content of nested fields. For example, a:{ b:{ c:1, d:{ e:2, f:3 } } } can be split into a.b.c, a.b.d.e, and a.b.d.f.
fromJobConfig.queryString No String Elasticsearch query string used to filter the source data. CDM migrates only the data that meets the filter criteria.
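The splitNestedField behavior above can be illustrated with a small sketch: nested JSON keys are joined with periods. This flatten() helper is our illustration, not CDM code:

```python
def flatten(obj, prefix=""):
    """Flatten nested dicts, joining key paths with periods, mirroring the
    splitNestedField example in the table above."""
    out = {}
    for key, value in obj.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            out.update(flatten(value, path))
        else:
            out[path] = value
    return out

doc = {"a": {"b": {"c": 1, "d": {"e": 2, "f": 3}}}}
print(flatten(doc))  # {'a.b.c': 1, 'a.b.d.e': 2, 'a.b.d.f': 3}
```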
6.2.13 From OpenTSDB
Sample JSON File

"from-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "fromJobConfig.start", "value": "0" },
        { "name": "fromJobConfig.metric", "value": "city.temp" },
        { "name": "fromJobConfig.aggregator", "value": "sum" },
        { "name": "fromJobConfig.columnList", "value": "ps.timestample&metric&aggregator&dps.value" }
      ],
      "name": "fromJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
fromJobConfig.start Yes String Start time of the query. The value is a character string or timestamp in the format of yyyyMMddHHmmss.
fromJobConfig.end No String End time of the query. The value is a character string or timestamp in the format of yyyyMMddHHmmss.
fromJobConfig.metric Yes String Metric of the data to be migrated.
fromJobConfig.aggregator Yes String Aggregate function.
fromJobConfig.tags No String Data tag. If you specify this parameter, only the tagged data is migrated.
fromJobConfig.columnList No String Field list.
6.3 Destination Job Parameters
6.3.1 To a Relational Database
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.schemaName", "value": "cdm" },
        { "name": "toJobConfig.tablePreparation", "value": "DROP_AND_CREATE" },
        { "name": "toJobConfig.tableName", "value": "rf_to" },
        { "name": "toJobConfig.columnList", "value": "id&gid&name" },
        { "name": "toJobConfig.isCompress", "value": "false" },
        { "name": "toJobConfig.orientation", "value": "ROW" },
        { "name": "toJobConfig.useStageTable", "value": "false" },
        { "name": "toJobConfig.shouldClearTable", "value": "false" },
        { "name": "toJobConfig.extendCharLength", "value": "false" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.schemaName No String Database schema or tablespace.
toJobConfig.tablePreparation Yes Enumeration This parameter is available only when both the source and destination databases are relational databases. The options for writing data to tables are as follows:
- DO_NOTHING: Do not create the table automatically.
- CREATE_WHEN_NOT_EXIST: If the destination database does not contain the table specified by tableName, CDM automatically creates the table.
- DROP_AND_CREATE: Delete the table specified by tableName, and then create it again.
toJobConfig.tableName Yes String Name of the table to which data is written.
toJobConfig.columnList No String List of fields to be loaded. Use & to separate field names. For example, id&gid&name.
toJobConfig.beforeImportType No Enumeration Whether to clear the data in the target table before data import. The options are as follows:
- none: Do not clear the data in the target table before data import; append the imported data to the table.
- shouldClearTable: Clear the data in the target table before data import.
- whereClause: Clear data in the target table based on the WHERE clause set in the toJobConfig.whereClause parameter. CDM deletes the matching data from the target table.
toJobConfig.whereClause No String WHERE clause used to delete data from the target table before data import.
toJobConfig.orientation No Enumeration Storage mode. This parameter is enabled only for the DWS database. When the DWS database table needs to be automatically created, the optional storage modes of the table are as follows:
- ROW: Data in the table is stored in rows.
- COLUMN: Data in the table is stored in columns.
toJobConfig.isCompress No Boolean Whether to perform compression. This parameter is enabled only for the DWS database. When the DWS database table needs to be automatically created, you can specify whether to store the data in the table after compression.
toJobConfig.useStageTable No Boolean Whether to import data to a stage table first. If this parameter is set to true, data is imported to the stage table before being imported to the destination table. After the data is successfully imported to the stage table, it is then imported from the stage table to the destination table. In this way, if the data import fails, the data already in the destination table remains intact.
toJobConfig.extendCharLength No Boolean Whether to extend the length of character string fields. If this parameter is set to true and the destination table needs to be automatically created, the length of each character string field in the destination table is three times the length of the corresponding field in the source table.
toJobConfig.useNullable No Boolean If you choose to create the target table automatically and specify the NOT NULL constraint, keep the NOT NULL constraints of the source and target tables consistent.
6.3.2 To OBS
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.bucketName", "value": "cdm" },
        { "name": "toJobConfig.outputDirectory", "value": "/obsfrom/advance/" },
        { "name": "toJobConfig.outputFormat", "value": "CSV_FILE" },
        { "name": "toJobConfig.fieldSeparator", "value": "," },
        { "name": "toJobConfig.writeToTempFile", "value": "false" },
        { "name": "toJobConfig.validateMD5", "value": "false" },
        { "name": "toJobConfig.recordMD5Result", "value": "false" },
        { "name": "toJobConfig.encodeType", "value": "UTF-8" },
        { "name": "toJobConfig.markerFile", "value": "finish.txt" },
        { "name": "toJobConfig.duplicateFileOpType", "value": "REPLACE" },
        { "name": "toJobConfig.shouldClearTable", "value": "false" },
        { "name": "toJobConfig.columnList", "value": "1&2" },
        { "name": "toJobConfig.quoteChar", "value": "false" },
        { "name": "toJobConfig.encryption", "value": "NONE" },
        { "name": "toJobConfig.copyContentType", "value": "false" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.bucketName Yes String OBS bucket name. For example, cdm.
toJobConfig.outputDirectory Yes String Path to which data is written. For example, data_dir.
toJobConfig.outputFormat Yes Enumeration File format required for data writes (except the binary format). Currently, the following file formats are supported:
- CSV_FILE: Write data in CSV format.
- BINARY_FILE: Files are transferred directly without the content being parsed; CDM writes the file without changing the file format.
- CARBONDATA: Write data in CarbonData format.
If you select BINARY_FILE, the migration source must also be a file system.
toJobConfig.fieldSeparator No String Column delimiter. This parameter is valid only when toJobConfig.outputFormat is CSV_FILE. The default value is ,.
toJobConfig.lineSeparator No String Line feed character. This parameter is valid only when toJobConfig.outputFormat is CSV_FILE. The default value is \r\n.
toJobConfig.writeFileSize No String Whether to split the output into multiple files by size so that the files are exported in a proper size, in MB. This parameter is valid when the migration source is a database.
toJobConfig.duplicateFileOpType No Enumeration Method for processing duplicate files. If the name and size of a file are the same as those of another file, the file is regarded as a duplicate. Duplicate files can be processed in the following ways:
- REPLACE: Replace duplicate files.
- SKIP: Skip duplicate files.
- ABANDON: Stop the job when any duplicate file is found.
toJobConfig.encryption
No Enumeration Whether to encrypt the uploaded data and the encryption method. The options are as follows:
- NONE: Directly write data without encryption.
- KMS: Use KMS in Data Encryption Workshop (DEW) for encryption. If KMS encryption is enabled, MD5 verification for data cannot be performed.
- AES-256-GCM: Use the AES 256-bit encryption algorithm to encrypt data. Currently, only the AES-256-GCM (NoPadding) encryption algorithm is supported.
toJobConfig.dek No String Data encryption key. This parameter is available only when toJobConfig.encryption is set to AES-256-GCM. The key is a string of 64 hexadecimal characters (a 256-bit key). Remember the key configured here because the decryption key must be the same as that configured here. If the encryption and decryption keys are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
toJobConfig.iv No String Initialization vector. This parameter is available when toJobConfig.encryption is set to AES-256-GCM. The initialization vector is a string of 32 hexadecimal characters (128 bits). Remember the initialization vector configured here because the initialization vector used for decryption must be the same as that configured here. If the initialization vectors are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
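The expected shapes of toJobConfig.dek and toJobConfig.iv can be generated programmatically. A minimal sketch, assuming the key is 64 hexadecimal characters (256 bits) and the IV is 32 hexadecimal characters (128 bits); the helper name is illustrative and not part of the CDM API:

```python
import secrets

def generate_aes256_gcm_params():
    """Generate a random data encryption key (dek) and initialization
    vector (iv) in the lowercase-hex form the job config expects."""
    dek = secrets.token_hex(32)  # 32 random bytes -> 64 hex characters (AES-256 key)
    iv = secrets.token_hex(16)   # 16 random bytes -> 32 hex characters (GCM IV)
    return dek, iv

dek, iv = generate_aes256_gcm_params()
```

Store both values safely: as noted above, a mismatched key or IV at decryption time silently yields garbage rather than an error.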
toJobConfig.kmsID
No String Key used for encryption during data upload. You must create a key in KMS before data upload.
toJobConfig.projectID
No String ID of the project to which the KMS key belongs.
toJobConfig.validateMD5
No Boolean Whether to verify the MD5 value. MD5 verification cannot be used together with KMS encryption. MD5 values can be verified only when files are transferred in binary format. Calculate the MD5 value of the source file and verify it with the MD5 value returned by OBS. If an MD5 file exists on the source end, directly read the MD5 file and verify it with the MD5 value returned by OBS.
toJobConfig.recordMD5Result
No Boolean Whether to record the verification result when the MD5 value is verified.
toJobConfig.recordMD5Link
No String OBS link where the bucket to which the MD5 verification result is written resides.
toJobConfig.recordMD5Bucket
No String OBS bucket to which the MD5 verification result is written.
toJobConfig.recordMD5Directory
No String Directory to which the MD5 verification result is written.
toJobConfig.encodeType
No String Encoding type. For example, UTF_8 or GBK.
toJobConfig.markerFile
No String Whether to generate a marker file with a custom name in the destination directory after a job is executed successfully. If you do not specify a file name, this function is disabled by default.
toJobConfig.copyContentType
No Boolean This parameter is displayed only when toJobConfig.outputFormat is Binary and both the migration source and destination are object storage. If you set this parameter to Yes, the Content-Type attribute of the source file is copied during object file migration. This function is mainly used for static website migration. The Content-Type attribute cannot be written to Archive buckets. Therefore, if you set this parameter to Yes, the migration destination must be a non-Archive bucket.
toJobConfig.quoteChar
No Boolean This parameter is available only when toJobConfig.outputFormat is CSV. It is used when database tables are migrated to file systems. If you set this parameter to Yes and a field in the source data table contains a field delimiter or line separator, CDM uses double quotation marks (") as the quote character to quote the field content as a whole, which prevents a field delimiter from dividing a field into two fields, or a line separator from dividing a field into different lines. For example, if the hello,world field in the database is quoted, it will be exported to the CSV file as a whole.
toJobConfig.firstRowAsHeader
No Boolean This parameter is available only when toJobConfig.outputFormat is set to CSV. When a table is migrated to a CSV file, CDM does not migrate the heading line of the table by default. If this parameter is set to Yes, CDM writes the heading line of the table to the file.
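Every destination in this chapter uses the same to-config-values envelope: a configs list holding one toJobConfig entry whose inputs are name/value pairs, with all values passed as strings. A small helper can build that envelope from a flat dict. This is a sketch under that assumption; the helper name is ours, not part of the API:

```python
import json

def build_to_config(params):
    """Wrap a flat {parameter: value} dict into the CDM
    to-config-values structure (all values are sent as strings)."""
    inputs = [{"name": k, "value": str(v)} for k, v in params.items()]
    return {"configs": [{"inputs": inputs, "name": "toJobConfig"}]}

cfg = build_to_config({
    "toJobConfig.outputFormat": "CSV_FILE",
    "toJobConfig.fieldSeparator": ",",
})
print(json.dumps(cfg, indent=2))
```

The same helper works for each "To ..." section below; only the parameter names and values differ.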
6.3.3 To HDFS
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.outputDirectory", "value": "/hdfsto" },
        { "name": "toJobConfig.outputFormat", "value": "BINARY_FILE" },
        { "name": "toJobConfig.writeToTempFile", "value": "false" },
        { "name": "toJobConfig.duplicateFileOpType", "value": "REPLACE" },
        { "name": "toJobConfig.compression", "value": "NONE" },
        { "name": "toJobConfig.appendMode", "value": "true" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.outputDirectory
Yes String Path to which data is written. For example, /data_dir.
toJobConfig.outputFormat
Yes Enumeration File format required for data writes (except the binary format). Currently, the following file formats are supported:
- CSV_FILE: Write data in CSV format.
- BINARY_FILE: Files are directly transferred without resolving the content. CDM writes the file without changing the file format.
If you select BINARY_FILE, the migration source must also be a file system.
toJobConfig.lineSeparator
No String Line feed character. This parameter is valid only when toJobConfig.outputFormat is CSV_FILE. The default value is \r\n.
toJobConfig.fieldSeparator
No String Column delimiter. This parameter is valid only when toJobConfig.outputFormat is CSV_FILE. The default value is ,.
toJobConfig.writeToTempFile
No Boolean Whether to write the binary file to a .tmp file first. After the migration is successful, run the rename or move command at the migration destination to restore the file.
toJobConfig.duplicateFileOpType
No Enumeration Method for processing duplicate files. If the name and size of a file are the same as those of another file, the file is regarded as a duplicate file. Duplicate files can be processed in the following ways:
- REPLACE: Replace duplicate files.
- SKIP: Skip duplicate files.
- ABANDON: Stop the job when any duplicate file is found.
toJobConfig.compression
No Enumeration Compression format applied after the file is written. The following compression formats are supported:
- NONE: Do not compress the file.
- DEFLATE: Compress the file in DEFLATE format.
- GZIP: Compress the file in gzip format.
- BZIP2: Compress the file in bzip2 format.
- LZ4: Compress the file in LZ4 format.
- SNAPPY: Compress the file in Snappy format.
toJobConfig.appendMode
Yes Boolean Whether to write data when one or more files exist in the loading path. The default value is false.
toJobConfig.encryption
No Enumeration This parameter is available only when toJobConfig.outputFormat is set to BINARY_FILE. It specifies whether to encrypt the uploaded data, and the encryption method. The options are as follows:
- NONE: Directly write data without encryption.
- AES-256-GCM: Use the AES 256-bit encryption algorithm to encrypt data. Currently, only the AES-256-GCM (NoPadding) encryption algorithm is supported.
toJobConfig.dek No String Data encryption key. This parameter is available only when toJobConfig.encryption is set to AES-256-GCM. The key is a string of 64 hexadecimal characters (a 256-bit key). Remember the key configured here because the decryption key must be the same as that configured here. If the encryption and decryption keys are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
toJobConfig.iv No String Initialization vector. This parameter is available when toJobConfig.encryption is set to AES-256-GCM. The initialization vector is a string of 32 hexadecimal characters (128 bits). Remember the initialization vector configured here because the initialization vector used for decryption must be the same as that configured here. If the initialization vectors are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
6.3.4 To Hive
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.hive", "value": "hive" },
        { "name": "toJobConfig.database", "value": "rf_database" },
        { "name": "toJobConfig.table", "value": "rf_to" },
        { "name": "toJobConfig.tablePreparation", "value": "DO_NOTHING" },
        { "name": "toJobConfig.columnList", "value": "aa&bb&cc&dd" },
        { "name": "toJobConfig.shouldClearTable", "value": "true" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.hive
No String Data source to which data is written.
toJobConfig.database
No String Name of the database to which data is written. For example, default.
toJobConfig.table
Yes String Name of the table to which data is written.
toJobConfig.tablePreparation
Yes Enumeration The options for data write to tables are as follows:
- DO_NOTHING: Do not create the table automatically.
- CREATE_WHEN_NOT_EXIST: If the destination database does not contain the table specified by tableName, CDM automatically creates the table.
- DROP_AND_CREATE: Delete the table specified by tableName, and then create the table again.
toJobConfig.columnList
No String List of fields to be loaded. Use & to separate field names. For example, id&gid&name.
toJobConfig.shouldClearTable
No Boolean Whether to clear the data in the target table before data import. If this parameter is set to true, the data in the target table is cleared before the job is started.
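The three tablePreparation modes roughly correspond to the following destination-side DDL behavior. The sketch below is only an illustration of the semantics (CDM performs the equivalent internally; the statement bodies are placeholders):

```python
def table_prep_statements(mode, table):
    """Return the DDL a given tablePreparation mode implies
    (illustrative only; the column list is elided as '(...)')."""
    if mode == "DO_NOTHING":
        return []  # assume the table already exists
    if mode == "CREATE_WHEN_NOT_EXIST":
        return [f"CREATE TABLE IF NOT EXISTS {table} (...)"]
    if mode == "DROP_AND_CREATE":
        return [f"DROP TABLE IF EXISTS {table}", f"CREATE TABLE {table} (...)"]
    raise ValueError(f"unknown tablePreparation mode: {mode}")

stmts = table_prep_statements("DROP_AND_CREATE", "rf_to")
```

Note that DROP_AND_CREATE discards any existing data in the destination table, while shouldClearTable only truncates it.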
6.3.5 To HBase/CloudTable
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.table", "value": "rf_to" },
        { "name": "toJobConfig.storageType", "value": "PUTLIST" },
        { "name": "toJobConfig.columns", "value": "AA:AA&BB:BB&CC:CC&DD:DD" },
        { "name": "toJobConfig.rowKeyColumn", "value": "AA:AA" },
        { "name": "toJobConfig.isOverride", "value": "false" },
        { "name": "toJobConfig.isRowkeyRedundancy", "value": "false" },
        { "name": "toJobConfig.algorithm", "value": "NONE" },
        { "name": "toJobConfig.writeToWAL", "value": "true" },
        { "name": "toJobConfig.transType", "value": "false" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.table Yes String Name of the table to which data is written. For example, TBL_EXAMPLE.
toJobConfig.storageType
Yes Enumeration Mode for writing data to an HBase table. The options are as follows:
- BULKLOAD: The BULKLOAD mode is recommended to improve the loading performance.
- PUTLIST: The PUTLIST mode is recommended only when the data volume is small.
toJobConfig.columns
No String Columns to be extracted. Use & to separate column numbers and : to separate column families and columns. For example, cf1:c1&cf2:c2.
toJobConfig.rowKeyColumn
Yes String Columns that serve as rowkeys. Use & to separate column numbers and : to separate column families and columns. For example, cf1:c1&cf2:c2.
toJobConfig.isOverride
No Boolean Whether to clear data when data is imported in BULKLOAD mode. For example, true.
toJobConfig.delimiter
No String Delimiter used for separating columns when multiple columns are used as rowkeys. For example, vertical bars (|).
toJobConfig.isRowkeyRedundancy
No Boolean Whether to write rowkey data to the HBase column at the same time.
toJobConfig.algorithm
No Enumeration Compression algorithm used when a new HBase table is created. The Snappy and GZ algorithms are supported. The default value is None.
toJobConfig.writeToWAL
No Boolean Whether to enable Write Ahead Log (WAL) of HBase. The options are as follows:
- Yes: If the HBase server breaks down after this function is enabled, you can replay the operations that have not been performed in the WAL.
- No: The write performance is improved. However, if the HBase server breaks down, data may be lost.
toJobConfig.transType
No Boolean The options are as follows:
- true: Data of the Short, Int, Long, Float, Double, and Decimal columns in the source database is converted into Byte[] arrays (binary) and written into HBase. Other types of data are written as character strings. If several types of data mentioned above are combined as rowkeys, they will be written as character strings. This function saves storage space, and in specific scenarios the rowkey distribution is more even.
- false: All types of data in the source database are written into HBase as character strings.
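The difference between the two transType settings can be illustrated with Python's struct module: a numeric column is written either as its binary encoding or as its string form. This is a sketch of the idea only; CDM's exact byte layout for each type is not specified here and may differ:

```python
import struct

def encode_int_column(value, trans_type):
    """Encode an Int column value the way the transType option implies:
    binary bytes when trans_type is True, a character string otherwise."""
    if trans_type:
        return struct.pack(">i", value)   # 4-byte big-endian binary form
    return str(value).encode("utf-8")     # character-string form

binary_form = encode_int_column(2019, True)
string_form = encode_int_column(2019, False)
```

For large numeric values the binary form is fixed-width (4 bytes for Int), which is where the storage saving comes from.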
6.3.6 To FTP/SFTP/NAS/SFS
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.outputDirectory", "value": "/opt/data" },
        { "name": "toJobConfig.outputFormat", "value": "CSV_FILE" },
        { "name": "toJobConfig.fieldSeparator", "value": "," },
        { "name": "toJobConfig.duplicateFileOpType", "value": "REPLACE" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description
Parameter Mandatory Type Description
toJobConfig.outputDirectory
Yes String Path to which data is written. For example, /data_dir.
toJobConfig.outputFormat
Yes Enumeration File format required for data writes (except the binary format). Currently, the following file formats are supported:
- CSV_FILE: Write data in CSV format.
- BINARY_FILE: Files are directly transferred without resolving the content. CDM writes the file without changing the file format.
If you select BINARY_FILE, the migration source must also be a file system.
toJobConfig.duplicateFileOpType
No Enumeration Method for processing duplicate files. If the name and size of a file are the same as those of another file, the file is regarded as a duplicate file. Duplicate files can be processed in the following ways:
- REPLACE: Replace duplicate files.
- SKIP: Skip duplicate files.
- ABANDON: Stop the job when any duplicate file is found.
toJobConfig.lineSeparator
No String Line feed character. This parameter is valid only when toJobConfig.outputFormat is CSV_FILE. The default value is \r\n.
toJobConfig.fieldSeparator
No String Column delimiter. This parameter is valid only when toJobConfig.outputFormat is CSV_FILE. The default value is ,.
toJobConfig.encodeType
No String Encoding type. For example, UTF_8 or GBK.
toJobConfig.writeToTempFile
No Boolean Whether to write the binary file to a .tmp file first. After the migration is successful, run the rename or move command at the migration destination to restore the file.
toJobConfig.recordMD5Result
No Boolean This parameter is invalid when File Format is set to Binary. An MD5 hash value is generated for each transferred file, and the value is recorded in a new .md5 file. You can specify the directory where the MD5 value is generated.
toJobConfig.recordMD5Directory
No String Directory for storing MD5 values
toJobConfig.markerFile
No String Whether to generate a marker file with a custom name in the destination directory after a job is executed successfully. If you do not specify a file name, this function is disabled by default.
toJobConfig.firstRowAsHeader
No Boolean This parameter is available only when toJobConfig.outputFormat is set to CSV. When a table is migrated to a CSV file, CDM does not migrate the heading line of the table by default. If this parameter is set to Yes, CDM writes the heading line of the table to the file.
toJobConfig.encryption
No Enumeration Whether to encrypt the uploaded data and the encryption method. The options are as follows:
- NONE: Directly write data without encryption.
- AES-256-GCM: Use the AES 256-bit encryption algorithm to encrypt data. Currently, only the AES-256-GCM (NoPadding) encryption algorithm is supported.
toJobConfig.dek No String Data encryption key. This parameter is available only when toJobConfig.encryption is set to AES-256-GCM. The key is a string of 64 hexadecimal characters (a 256-bit key). Remember the key configured here because the decryption key must be the same as that configured here. If the encryption and decryption keys are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
toJobConfig.iv No String Initialization vector. This parameter is available when toJobConfig.encryption is set to AES-256-GCM. The initialization vector is a string of 32 hexadecimal characters (128 bits). Remember the initialization vector configured here because the initialization vector used for decryption must be the same as that configured here. If the initialization vectors are inconsistent, the system does not report an exception, but the decrypted data is incorrect.
6.3.7 To DDS
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.database", "value": "demo" },
        { "name": "toJobConfig.collectionName", "value": "cdmbase" },
        { "name": "toJobConfig.columnList", "value": "_char&_varchar" },
        { "name": "toJobConfig.isBatchMigration", "value": "false" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.database
Yes String Name of the MongoDB/DDS database.
toJobConfig.collectionName
Yes String Name of the MongoDB/DDS collection.
toJobConfig.columnList
No String List of fields to be extracted. Use & to separate field names. For example, id&gid&name.
toJobConfig.isBatchMigration
No Boolean Whether to migrate all data in the database.
6.3.8 To DCS
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.isBatchMigration", "value": "false" },
        { "name": "toJobConfig.shouldClearDatabase", "value": "false" },
        { "name": "toJobConfig.keyPrefix", "value": "cdm_string" },
        { "name": "toJobConfig.keySeparator", "value": ":" },
        { "name": "toJobConfig.primaryKeyList", "value": "1" },
        { "name": "toJobConfig.valueStoreType", "value": "STRING" },
        { "name": "toJobConfig.valueSeparator", "value": "," },
        { "name": "toJobConfig.columnList", "value": "1&2&3&4&5&6&7&8&9&10&11&12" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description
- Parameter description
Parameter Mandatory Type Description
toJobConfig.isBatchMigration
No Boolean Whether to migrate all data in the database.
toJobConfig.shouldClearDatabase
No Boolean Whether to clear data before import.
toJobConfig.keyPrefix
Yes String Key prefix, which is similar to the table name of a relational database. Mapping between Redis and the association table: the name of the association table plus the delimiter forms a Redis key, and a row of data in the association table is a Redis value.
toJobConfig.keySeparator
Yes String Key delimiter, which separates the association table name and the primary key.
toJobConfig.primaryKeyList
Yes String List of primary keys. Use & to separate field names. For example, id&gid.
toJobConfig.valueStoreType
Yes Enumeration Storage mode of rows of data in the association table on Redis. The options are STRING and HASH.
- STRING: indicates that a row of data is stored as a character string, in which columns in the row are separated by valueSeparator.
- HASH: indicates that a row of data is stored in column name:column value format in the hash table.
toJobConfig.valueSeparator
No String Value delimiter. The default value is \tab. This parameter is valid when valueStoreType is set to STRING.
toJobConfig.columnList
No String List of fields to be written. Use & to separate field names. For example, id&gid&name.
toJobConfig.formats
No Data structure Time format. For details, see the description of the toJobConfig.formats parameter.
- Description of the toJobConfig.formats parameter
Parameter Mandatory Type Description
name Yes String Column number. For example, 1.
value Yes String Time format. For example, yyyy-MM-dd.
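Putting keyPrefix, keySeparator, primaryKeyList, and valueStoreType together, a table row maps to a Redis key and value roughly as follows. This is a sketch of the mapping described above, not CDM's actual code:

```python
def row_to_redis(row, key_prefix, key_separator, primary_keys,
                 value_store_type, value_separator=","):
    """Map one table row (a dict) to the Redis key/value the DCS
    destination parameters describe."""
    # key = prefix + separator + primary-key value(s)
    key = key_prefix + key_separator + key_separator.join(
        str(row[pk]) for pk in primary_keys)
    if value_store_type == "STRING":
        # a row becomes one string, columns joined by valueSeparator
        value = value_separator.join(str(v) for v in row.values())
    else:  # HASH: column name -> column value
        value = {k: str(v) for k, v in row.items()}
    return key, value

key, value = row_to_redis({"id": 1, "name": "alice"},
                          "cdm_string", ":", ["id"], "STRING")
```

With the sample values above, the row {"id": 1, "name": "alice"} becomes key cdm_string:1 and value 1,alice.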
6.3.9 To Elasticsearch/Cloud Search Service
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.index", "value": "cdm" },
        { "name": "toJobConfig.type", "value": "type1" },
        { "name": "toJobConfig.shouldClearType", "value": "false" },
        { "name": "toJobConfig.pipeLine", "value": "es_03" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.index Yes String Index of the written data, which is similar to the database name in a relational database.
toJobConfig.type Yes String Type of the written data, which is similar to the table name in a relational database.
toJobConfig.shouldClearType
No Boolean Whether to clear data before import
toJobConfig.primaryKey
No String Primary key or unique index
toJobConfig.columnList
No String List of fields to be written. Use & to separate field names. For example, id&gid&name.
toJobConfig.pipeLine
No String This parameter is available only after a pipeline ID is created in Kibana. It is used to convert the data format using the data transformation pipeline of Cloud Search Service or Elasticsearch after data is transferred to Cloud Search Service or Elasticsearch.
toJobConfig.createIndexStrategy
No Enumeration For streaming jobs that continuously write data to Elasticsearch, CDM periodically creates indexes and writes data to them, which helps you delete expired data. The indexes can be created based on the following periods:
- EveryHour: CDM creates indexes on the hour. The new indexes are named in the format of index name+year+month+day+hour, for example, index2018121709.
- EveryDay: CDM creates indexes at 00:00 every day. The new indexes are named in the format of index name+year+month+day, for example, index20181217.
- EveryWeek: CDM creates indexes at 00:00 every Monday. The new indexes are named in the format of index name+year+week, for example, index201842.
- EveryMonth: CDM creates indexes at 00:00 on the first day of each month. The new indexes are named in the format of index name+year+month, for example, index201812.
When extracting data from a file, you must configure a single extractor, which means setting Concurrent Extractors to 1. Otherwise, this parameter is invalid.
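The hourly, daily, and monthly index names can be reproduced with strftime. This is a sketch; the EveryWeek week-numbering convention is not specified precisely above, so it is deliberately omitted:

```python
from datetime import datetime

# strftime patterns matching the naming examples in the table above
STRATEGY_FORMATS = {
    "EveryHour": "%Y%m%d%H",   # index2018121709
    "EveryDay": "%Y%m%d",      # index20181217
    "EveryMonth": "%Y%m",      # index201812
}

def periodic_index_name(base, strategy, when):
    """Build the index name CDM would write to for a given period."""
    return base + when.strftime(STRATEGY_FORMATS[strategy])

name = periodic_index_name("index", "EveryHour", datetime(2018, 12, 17, 9))
```

Knowing the naming pattern lets you target expired indexes for deletion (for example, dropping every index older than N days by parsing the date suffix).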
6.3.10 To DLI
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.queue", "value": "cdm" },
        { "name": "toJobConfig.database", "value": "sqoop" },
        { "name": "toJobConfig.table", "value": "est1" },
        { "name": "toJobConfig.columnList", "value": "string_&int_&date_&double_&boolean_&short_&timestamp_&long_&smallint_&bigint_" },
        { "name": "toJobConfig.shouldClearTable", "value": "false" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.queue
Yes String Resource queue to which data is written.
toJobConfig.database
Yes String DLI database to which data is written.
toJobConfig.table Yes String Name of the table to which data is written.
toJobConfig.columnList
No String List of fields to be loaded. Use & to separate field names. For example, id&gid&name.
toJobConfig.shouldClearTable
No Boolean Whether to clear data in the resource queue before data import.
6.3.11 To DIS
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.streamName", "value": "cdm" },
        { "name": "toJobConfig.separator", "value": "," },
        { "name": "toJobConfig.identifierEnclose", "value": "'" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.streamName
Yes String DIS stream name
toJobConfig.separator
No String Field separator. The default value is a space.
toJobConfig.identifierEnclose
No String Delimiter used to separate referenced table names or column names. By default, this parameter is left blank.
6.3.12 To OpenTSDB
Sample JSON File

"to-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "toJobConfig.metric", "value": "city.temp" },
        { "name": "toJobConfig.timeStamp", "value": "0" },
        { "name": "toJobConfig.tags", "value": "tagk1:tagv1" },
        { "name": "toJobConfig.columnList", "value": "metric:1&tag:2&timestamp&value" }
      ],
      "name": "toJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
toJobConfig.metric
No String Data metric
toJobConfig.timeStamp
No String Data point. The value is a character string or timestamp in the format of yyyyMMddHHmmdd.
toJobConfig.tags No String Data tag
toJobConfig.columnList
No String Field list
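The sample columnList value ("metric:1&tag:2&timestamp&value") mixes role:column pairs with bare role names. A small parser makes that mapping explicit; this is our reading of the sample, not a documented format:

```python
def parse_opentsdb_column_list(column_list):
    """Split an OpenTSDB columnList into (role, column) pairs.
    Entries without ':' map the role to None (position-implied)."""
    mapping = []
    for item in column_list.split("&"):
        role, _, column = item.partition(":")
        mapping.append((role, column or None))
    return mapping

mapping = parse_opentsdb_column_list("metric:1&tag:2&timestamp&value")
```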
6.4 Job Parameter Description

When Creating a Job in a Specified Cluster or Creating and Executing a Job in a Random Cluster, the driver-config-values parameter specifies the job configuration, which includes the following functions:
- Retry upon Failure: If a job fails to be executed, you can choose whether to automatically restart the job.
- Job Group: CDM allows you to group jobs. You can filter, delete, start, or export jobs by group.
- Schedule Execution: Specify whether to execute scheduled jobs.
- Concurrent Extractors: Enter the number of concurrent extractors.
- Write Dirty Data: Specify this parameter if data that fails to be processed or filtered out during job execution needs to be written to OBS for future viewing. Before writing dirty data, create an OBS link.
- Delete Job After Completion: Specify whether to delete a job after the job is executed.
Sample JSON File

"driver-config-values": {
  "configs": [
    {
      "inputs": [
        { "name": "throttlingConfig.numExtractors", "value": "1" },
        { "name": "throttlingConfig.numLoaders", "value": "1" },
        { "name": "throttlingConfig.recordDirtyData", "value": "false" }
      ],
      "name": "throttlingConfig"
    },
    { "inputs": [], "name": "jarConfig" },
    {
      "inputs": [
        { "name": "schedulerConfig.isSchedulerJob", "value": "false" },
        { "name": "schedulerConfig.disposableType", "value": "NONE" }
      ],
      "name": "schedulerConfig"
    },
    { "inputs": [], "name": "transformConfig" },
    {
      "inputs": [
        { "name": "retryJobConfig.retryJobType", "value": "NONE" }
      ],
      "name": "retryJobConfig"
    }
  ]
}
Parameter Description

Parameter Mandatory Type Description
throttlingConfig.numExtractors
No Integer Maximum number of concurrent extraction jobs. For example, 20.
groupJobConfig.groupName
No Enumeration Group to which a job belongs. The default group is DEFAULT.
throttlingConfig.numLoaders
No Integer This parameter is available only when HBase or Hive serves as the destination data source. Maximum number of loading jobs. For example, 5.
throttlingConfig.recordDirtyData
No Boolean Whether to write dirty data. For example, true.
throttlingConfig.writeToLink
No String Link to which dirty data is written. Currently, dirty data can be written only to OBS or HDFS. For example, obslink.
throttlingConfig.obsBucket
No String Name of the OBS bucket to which dirty data is written. This parameter is valid only when dirty data is written to OBS. For example, dirtyData.
throttlingConfig.dirtyDataDirectory
No String Directory to which dirty data is written.
- To write dirty data to HDFS, set this parameter to the specified HDFS directory.
- To write dirty data to OBS, set this parameter to the directory in the OBS bucket. For example, /data/dirtydata/.
throttlingConfig.maxErrorRecords
No String Maximum number of error records in a single shard. When the number of error records of a map exceeds the upper limit, the task automatically ends. The imported data will not be rolled back.
schedulerConfig.isSchedulerJob
No Boolean Whether to enable a scheduled task. For example, true.
schedulerConfig.cycleType
No String Cycle type of a scheduled task. The options are minute, hour, day, week, and month.
schedulerConfig.cycle
No Integer Cycle of a scheduled task. If cycleType is set to minute and cycle is set to 10, the scheduled task is executed every 10 minutes.
schedulerConfig.runAt
No String Time when a scheduled task is triggered in a cycle. This parameter is valid only when cycleType is set to hour, week, or month.
- If cycleType is set to month, cycle is set to 1, and runAt is set to 15, the scheduled task is executed on the 15th day of each month. You can set runAt to multiple values and separate the values with commas (,). For example, if runAt is set to 1,2,3,4,5, the scheduled task is executed on the first, second, third, fourth, and fifth day of each month.
- If cycleType is set to week and runAt is set to mon,tue,wed,thu,fri, the scheduled task is executed from Monday to Friday.
- If cycleType is set to hour and runAt is set to 27,57, the scheduled task is executed at the 27th and 57th minute of each cycle.
schedulerConfig.startDate
No String Start time of a scheduled task. For example, 2018-01-24 19:56:19.
schedulerConfig.stopDate
No String End time of a scheduled task. For example, 2018-01-27 23:59:00. If you do not set the end time, the scheduled task is always executed and will never stop.
schedulerConfig.disposableType
No Enumeration Whether to delete a job after it is executed. The options are as follows:
- NONE: A job will not be deleted after it is executed.
- DELETE_AFTER_SUCCEED: A job will be deleted only after it is successfully executed. It is applicable to massive one-time jobs.
- DELETE: A job will be deleted after it is executed, regardless of the execution result.
retryJobConfig.retryJobType
No Enumeration Whether to automatically retry if a job fails to be executed. The options are as follows:
- NONE: Do not retry.
- RETRY_TRIPLE: Retry three times.
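For cycleType set to hour, the runAt minutes can be expanded into concrete trigger times. A sketch of the semantics described above (the helper name is ours, not part of the API):

```python
from datetime import datetime, timedelta

def next_hourly_trigger(run_at, now):
    """Given runAt like '27,57' for cycleType=hour, return the next
    trigger time at or after `now`."""
    minutes = sorted(int(m) for m in run_at.split(","))
    base = now.replace(second=0, microsecond=0)
    for m in minutes:
        candidate = base.replace(minute=m)
        if candidate >= now:
            return candidate
    # all trigger minutes this hour have passed: first minute of the next hour
    return (base + timedelta(hours=1)).replace(minute=minutes[0])

nxt = next_hourly_trigger("27,57", datetime(2019, 7, 25, 10, 30))
```

So at 10:30, runAt=27,57 fires next at 10:57; at 10:58 it fires next at 11:27.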
7 Permissions Policies and Supported Actions

This chapter describes fine-grained permissions management for CDM. If your account does not need individual IAM users, you can skip this chapter.

A policy is a set of permissions defined in JSON format. By default, new IAM users do not have any permissions assigned. You need to add a user to one or more groups, and assign permissions policies to these groups. The user then inherits permissions from the groups it is a member of. This process is called authorization. After authorization, the user can perform specified operations on CDM based on the permissions.
There are fine-grained policies and role-based access control (RBAC) policies.
- An RBAC policy consists of permissions for an entire service. Users in a group with such a policy assigned are granted all of the permissions required for that service.
- A fine-grained policy consists of API-based permissions for operations on specific resource types. Fine-grained policies, as the name suggests, allow for more fine-grained control than RBAC policies.
NOTE
- Fine-grained policies are currently available for open beta testing. You can apply to use the fine-grained access control function free of charge.
- If you want to allow or deny access to an API, fine-grained authorization is a good choice.

An account has all of the permissions required to call all APIs, but IAM users must have the required permissions specifically assigned.

The permissions required for calling an API are determined by the actions supported by the API. Only users who have been granted permissions allowing those actions can call the API successfully. For example, if an IAM user wants to create a cluster using an API, the user must have been granted permissions that allow the cdm:cluster:create action.
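For example, a custom policy that allows only cluster creation might look like the following sketch. The Version/Statement layout here is assumed from the common fine-grained policy shape; verify the exact schema against the IAM documentation:

```python
import json

# A minimal custom policy granting only the cdm:cluster:create action.
# The "Version"/"Statement" structure is an assumption in this sketch.
policy = {
    "Version": "1.1",
    "Statement": [
        {
            "Effect": "Allow",                   # grant rather than deny
            "Action": ["cdm:cluster:create"],    # the only action permitted
        }
    ],
}

print(json.dumps(policy, indent=2))
```

A user group holding only this policy could call the cluster-creation API but none of the link or job APIs.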
Supported Actions
Operations supported by a fine-grained policy are specific to APIs. The following describes the columns of the action table provided in this chapter:

- Permissions: Defined by actions in a custom policy.
- Actions: Added to a custom policy to control permissions for specific operations.
- Authorization Scope: A custom policy can be applied to IAM projects, enterprise projects, or both. Policies that contain actions supporting both IAM and enterprise projects can be assigned to user groups and take effect in both IAM and Enterprise Management. Policies that contain only actions supporting IAM projects can be assigned to user groups and take effect only in IAM; such policies will not take effect if they are assigned to user groups in Enterprise Management.
- APIs: REST APIs that can be called in a custom policy.
CDM supports the actions listed in Table 7-1, which can be defined in custom policies. All actions listed in the table support both Project and Enterprise Project.
Table 7-1 Actions
| Permission | API | Action | CDM Admin / CDM Operator | CDM User | CDM Viewer |
|---|---|---|---|---|---|
| Creating a cluster | POST /v1.1/{project_id}/clusters | cdm:cluster:create | √ | × | × |
| Querying the cluster list | GET /v1.1/{project_id}/clusters | cdm:cluster:list | √ | √ | √ |
| Querying cluster details | GET /v1.1/{project_id}/clusters/{cluster_id} | cdm:cluster:get | √ | √ | √ |
| Restarting a cluster | POST /v1.1/{project_id}/clusters/{cluster_id}/action | cdm:cluster:operate | √ | × | × |
| Modifying cluster configuration | POST /v1.1/{project_id}/cluster/modify/{cluster_id} | cdm:cluster:modify | √ | × | × |
| Deleting a cluster | DELETE /v1.1/{project_id}/clusters/{cluster_id} | cdm:cluster:delete | √ | × | × |
| Creating a link | POST /v1.1/{project_id}/clusters/{cluster_id}/cdm/link | cdm:link:operate | √ | √ | × |
| Querying a link | GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/link/{linkName} | cdm:cluster:get | √ | √ | √ |
| Modifying a link | PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/link/{link_name} | cdm:link:operate | √ | √ | × |
| Deleting a link | DELETE /v1.1/{project_id}/clusters/{cluster_id}/cdm/link/{linkName} | cdm:link:operate | √ | √ | × |
| Creating a job in a specified cluster | POST /v1.1/{project_id}/clusters/{cluster_id}/cdm/job | cdm:job:operate | √ | √ | × |
| Querying a job | GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{jobName} | cdm:cluster:get | √ | √ | √ |
| Modifying a job | PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{jobName} | cdm:job:operate | √ | √ | × |
| Starting a job | PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{jobName}/start | cdm:job:operate | √ | √ | × |
| Stopping a job | PUT /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{jobName}/stop | cdm:job:operate | √ | √ | × |
| Querying job status | GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{jobName}/status | cdm:cluster:get | √ | √ | √ |
| Querying job execution history | GET /v1.1/{project_id}/clusters/{cluster_id}/cdm/submissions | cdm:cluster:get | √ | √ | √ |
| Deleting a job | DELETE /v1.1/{project_id}/clusters/{cluster_id}/cdm/job/{jobName} | cdm:job:operate | √ | √ | × |
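As an illustration, the role-to-action matrix in Table 7-1 can be condensed into a small lookup. This is a sketch for reading the table, not an authoritative access-control check:

```python
# Table 7-1 condensed: which system roles are granted each action.
# Illustrative only; the table above is the authoritative mapping.
ROLE_ACTIONS = {
    "CDM Admin":  {"cdm:cluster:create", "cdm:cluster:list", "cdm:cluster:get",
                   "cdm:cluster:operate", "cdm:cluster:modify", "cdm:cluster:delete",
                   "cdm:link:operate", "cdm:job:operate"},
    "CDM User":   {"cdm:cluster:list", "cdm:cluster:get",
                   "cdm:link:operate", "cdm:job:operate"},
    "CDM Viewer": {"cdm:cluster:list", "cdm:cluster:get"},
}

def can_call(role: str, action: str) -> bool:
    """Return True if the given role is granted the given action."""
    return action in ROLE_ACTIONS.get(role, set())

print(can_call("CDM User", "cdm:cluster:create"))  # a CDM User cannot create clusters
print(can_call("CDM Viewer", "cdm:cluster:get"))   # but every role may query details
```

Note that the query APIs for links and jobs require cdm:cluster:get rather than a link- or job-specific action, which is why CDM Viewer can read them.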
8 Appendix
8.1 Status Code

Table 8-1 describes the status codes.
Table 8-1 Status Code
| Status Code | Code | Description |
|---|---|---|
| 100 | Continue | The client should continue sending the request. This interim response informs the client that the initial part of the request has been received and has not yet been rejected by the server. |
| 101 | Switching Protocols | The server is switching protocols. The target protocol must be more advanced than the source protocol, for example, switching to a later version of HTTP. |
| 201 | Created | The request for creating a resource has been fulfilled. |
| 202 | Accepted | The request has been accepted, but the processing has not been completed. |
| 203 | Non-Authoritative Information | The server successfully processed the request, but is returning information that may be from another source. |
| 204 | No Content | The server has successfully processed the request, but has not returned any content. This status code is typically returned in response to an HTTP OPTIONS request. |
| 205 | Reset Content | The server has fulfilled the request, but the requester is required to reset the content. |
| 206 | Partial Content | The server has successfully processed a part of the GET request. |
| 300 | Multiple Choices | There are multiple options for the location of the requested resource. The response contains a list of resource characteristics and addresses from which the user or user agent (such as a browser) can choose the most appropriate one. |
| 301 | Moved Permanently | The requested resource has been assigned a new permanent URI, and the new URI is contained in the response. |
| 302 | Found | The requested resource resides temporarily under a different URI. |
| 303 | See Other | The response to the request can be found under a different URI and should be retrieved using a GET method. |
| 304 | Not Modified | The requested resource has not been modified. When the server returns this status code, it does not return any resources. |
| 305 | Use Proxy | The requested resource must be accessed through a proxy. |
| 306 | Unused | This HTTP status code is no longer used. |
| 400 | Bad Request | Invalid request. The client should not repeat the request without modifications. |
| 401 | Unauthorized | This status code is returned after the client provides authentication information, indicating that the authentication information is incorrect or invalid. |
| 402 | Payment Required | This status code is reserved for future use. |
| 403 | Forbidden | The server understood the request, but is refusing to fulfill it. The client should not repeat the request without modifications. |
| 404 | Not Found | The requested resource cannot be found. The client should not repeat the request without modifications. |
| 405 | Method Not Allowed | The method specified in the request is not supported for the requested resource. The client should not repeat the request without modifications. |
| 406 | Not Acceptable | The server cannot fulfill the request according to the content characteristics of the request. |
| 407 | Proxy Authentication Required | This status code is similar to 401, but indicates that the client must first authenticate itself with the proxy. |
| 408 | Request Timeout | The request timed out. The client may repeat the request without modifications at any later time. |
| 409 | Conflict | The request could not be completed due to a conflict with the current state of the resource. This status code indicates that the resource the client attempts to create already exists, or that the request could not be processed because of a conflicting update. |
| 410 | Gone | The requested resource is no longer available. This status code indicates that the requested resource has been deleted permanently. |
| 411 | Length Required | The server refuses to process the request without a defined Content-Length. |
| 412 | Precondition Failed | The server does not meet one of the preconditions that the requester put on the request. |
| 413 | Request Entity Too Large | The request is larger than the server is able to process. The server may close the connection to prevent the client from continuing the request. If the server cannot process the request temporarily, the response will contain a Retry-After header field. |
| 414 | Request-URI Too Large | The URI provided was too long for the server to process. |
| 415 | Unsupported Media Type | The server is unable to process the media format in the request. |
| 416 | Requested Range Not Satisfiable | The requested range is invalid. |
| 417 | Expectation Failed | The server fails to meet the requirements of the Expect request-header field. |
| 422 | Unprocessable Entity | The request is well-formed but cannot be processed due to semantic errors. |
| 429 | Too Many Requests | The client has sent more requests than its rate limit allows within a given amount of time, or the server has received more requests than it can process within a given amount of time. In this case, the client should re-initiate requests after the time specified in the Retry-After header of the response expires. |
| 500 | Internal Server Error | The server encountered an internal error and was unable to complete the request. |
| 501 | Not Implemented | The server does not support the requested function. |
| 502 | Bad Gateway | The server, while acting as a gateway or proxy, received an invalid response from the upstream server. |
| 503 | Service Unavailable | The service is currently unavailable, for example, because the server is overloaded or down for maintenance. The client may repeat the request after a delay. |
| 504 | Server Timeout | The request cannot be fulfilled within a given time. This status code is returned to the client only when the Timeout parameter is specified in the request. |
| 505 | HTTP Version Not Supported | The server does not support the HTTP protocol version used in the request. |
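The retry guidance embedded in the descriptions above (408 and 429 may be retried, while most other 4xx codes should not be repeated without modification) can be sketched as a small client-side helper. The set of retryable codes is this sketch's assumption, not a CDM rule:

```python
from typing import Optional

# Codes worth retrying, per the table above: 408 explicitly allows repeating
# the request, 429 advises waiting for Retry-After, and 5xx gateway/availability
# errors are often transient. This selection is an assumption of the sketch.
RETRYABLE = {408, 429, 502, 503, 504}

def backoff_delay(status: int, retry_after: Optional[str], attempt: int) -> Optional[float]:
    """Seconds to wait before retrying, or None if the request should not be
    repeated without modification (e.g. 400, 403, 404)."""
    if status not in RETRYABLE:
        return None
    if retry_after:                    # honor Retry-After when the server sends it
        return float(retry_after)
    return min(2.0 ** attempt, 60.0)   # otherwise exponential backoff, capped at 60 s

print(backoff_delay(429, "30", 0))  # server-provided wait
print(backoff_delay(404, None, 0))  # not retryable
```

For 429 in particular, the table above says to wait for the time in the Retry-After response header, which is why the header value takes precedence over the computed backoff.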
8.2 Error Code

No data is returned if an API fails to be called. You can locate the cause of an error according to the error code of each API. When an API call fails, HTTP status code 4xx or 5xx is returned, and the returned message body contains the specific error code and error information. If you fail to locate the cause of an error, contact customer service and provide the error code so that we can help you solve the problem as soon as possible.
- Sample exception response

  ```json
  {
      "errCode": "Cdm.0100",
      "externalMessage": "Job[jdbc2hive] doesn't exist."
  }
  ```
- Parameter description

  | Parameter | Mandatory | Type | Description |
  |---|---|---|---|
  | errCode | No | String | Error code |
  | externalMessage | No | String | Error message |
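A client can surface these two fields as an exception. This is a minimal sketch; the CdmApiError class and raise_for_error helper are illustrative, not part of any SDK:

```python
import json

# Minimal handling of a CDM error body like the sample above. The field names
# (errCode, externalMessage) come from the parameter table; the exception
# class is this sketch's own choice.
class CdmApiError(Exception):
    def __init__(self, code: str, message: str):
        super().__init__(f"{code}: {message}")
        self.code = code
        self.message = message

def raise_for_error(body: str) -> None:
    """Raise CdmApiError if the response body carries a CDM error code."""
    payload = json.loads(body)
    if "errCode" in payload:
        raise CdmApiError(payload["errCode"], payload.get("externalMessage", ""))

body = '{"errCode": "Cdm.0100", "externalMessage": "Job[jdbc2hive] doesn\'t exist."}'
try:
    raise_for_error(body)
except CdmApiError as e:
    print(e.code)  # Cdm.0100
```

Capturing errCode this way makes it easy to pass the code to customer service, as recommended above.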
- Error code
In the following error messages, %s is a variable. In an actual message, it is replaced with the actual parameter name, table name, job name, or link name.
| Error Code | Status Code | Error Message |
|---|---|---|
| Cdm.0000 | 500 | System error. |
| Cdm.0009 | 500 | %s is not an integer or is beyond the value range [0, 2147483647]. |
| Cdm.0010 | 500 | The integer must be within the range of [%s]. |
| Cdm.0011 | 500 | Validator has malformed argument. |
| Cdm.0012 | 500 | JDBC driver class is not found. |
| Cdm.0014 | 500 | Invalid parameter. |
| Cdm.0015 | 500 | An error occurred during file parsing. |
| Cdm.0016 | 500 | The file to be uploaded cannot be empty. |
| Cdm.0017 | 500 | MRS Kerberos authentication failed. |
| Cdm.0018 | 500 | The content of jobs or links is invalid. |
| Cdm.0019 | 500 | Log %s does not exist. |
| Cdm.0020 | 500 | The string must contain the following substring: %s |
| Cdm.0021 | 500 | Failed to connect to the server: %s |
| Cdm.0024 | 500 | [%s] must be within the range of [%s]. |
| Cdm.0051 | 500 | Invalid submission engine: %s |
| Cdm.0052 | 500 | Job %s is running. |
| Cdm.0053 | 500 | Job %s is not running. |
| Cdm.0054 | 500 | Job %s does not exist. |
| Cdm.0055 | 500 | Unsupported job type. |
| Cdm.0056 | 500 | Failed to submit the job. Cause: %s |
| Cdm.0057 | 500 | Invalid job execution engine: %s |
| Cdm.0058 | 500 | Invalid combination of submission and execution engines. |
| Cdm.0059 | 500 | Job %s has been disabled. Failed to submit the job. |
| Cdm.0060 | 500 | Link %s for this job has been disabled. Failed to submit the job. |
| Cdm.0061 | 500 | Connector %s does not support the specified direction. Failed to submit the job. |
| Cdm.0062 | 500 | The binary file is applicable only to the SFTP, FTP, HDFS, or OBS connector. |
| Cdm.0063 | 500 | An error occurred during table creation. Cause: %s |
| Cdm.0064 | 500 | The data format is incorrect. |
| Cdm.0065 | 500 | Failed to start the scheduler. Cause: %s |
| Cdm.0066 | 500 | Failed to obtain the sample value. Cause: %s |
| Cdm.0067 | 500 | Failed to obtain the schema. Cause: %s |
| Cdm.0100 | 500 | Job [%s] does not exist. |
| Cdm.0101 | 500 | Link [%s] does not exist. |
| Cdm.0102 | 500 | Connector [%s] does not exist. |
| Cdm.0104 | 500 | The job name already exists. |
| Cdm.0201 | 500 | Failed to obtain the instance. |
| Cdm.0202 | 500 | The job status is unknown. |
| Cdm.0204 | 500 | No MRS link is created. |
| Cdm.0230 | 500 | Failed to load the specified class: %s |
| Cdm.0231 | 500 | Failed to initialize the specified class: %s |
| Cdm.0232 | 500 | Failed to write data. Cause: %s |
| Cdm.0233 | 500 | An exception occurred during data extraction. Cause: %s |
| Cdm.0234 | 500 | An exception occurred during data loading. Cause: %s |
| Cdm.0235 | 500 | All data has been consumed. Cause: %s |
| Cdm.0236 | 500 | Invalid partitions have been retrieved from Partitioner. |
| Cdm.0237 | 500 | Failed to find the JAR file of the connector. |
| Cdm.0238 | 500 | %s cannot be empty. |
| Cdm.0239 | 500 | Failed to obtain HDFS. Cause: %s |
| Cdm.0240 | 500 | Failed to obtain the status of the %s file. |
| Cdm.0241 | 500 | Failed to obtain the type of the %s file. |
| Cdm.0242 | 500 | An exception occurred during file check: %s |
| Cdm.0243 | 500 | Failed to rename %s to %s. |
| Cdm.0244 | 500 | Failed to create the %s file. |
| Cdm.0245 | 500 | Failed to delete the %s file. |
| Cdm.0246 | 500 | Failed to create the %s directory. |
| Cdm.0247 | 500 | HBase operation failure. Cause: %s |
| Cdm.0248 | 500 | Failed to clear data in %s. Cause: %s |
| Cdm.0249 | 500 | The file name %s is invalid. |
| Cdm.0250 | 500 | Failed to perform operations in the path: %s |
| Cdm.0251 | 500 | Failed to load data to HBase. Cause: %s |
| Cdm.0307 | 500 | Failed to release the link. Cause: %s |
| Cdm.0315 | 500 | The link name %s already exists. |
| Cdm.0316 | 500 | The link that does not exist cannot be updated. |
| Cdm.0317 | 500 | Link %s is invalid. |
| Cdm.0318 | 500 | The job already exists and cannot be created repeatedly. |
| Cdm.0319 | 500 | The job that does not exist cannot be updated. |
| Cdm.0320 | 500 | Job %s is invalid. |
| Cdm.0321 | 500 | Link %s has been used. |
| Cdm.0322 | 500 | Job %s has been used. |
| Cdm.0323 | 500 | The submission already exists and cannot be created repeatedly. |
| Cdm.0327 | 500 | Invalid link or job: %s |
| Cdm.0411 | 500 | An error occurred when connecting to the file server. |
| Cdm.0412 | 500 | An error occurred when disconnecting from the file server. |
| Cdm.0413 | 500 | An error occurred in data transfer to the file server. |
| Cdm.0415 | 500 | An error occurred when downloading files from the file server. |
| Cdm.0416 | 500 | An error occurred during data extraction. |
| Cdm.0420 | 500 | The source file or source directory does not exist. |
| Cdm.0423 | 500 | Duplicate files exist in the destination path. |
| Cdm.0500 | 500 | The source directory or the [%s] file does not exist. |
| Cdm.0501 | 500 | Invalid URI [%s]. |
| Cdm.0518 | 500 | Failed to connect to HDFS. Cause: %s |
| Cdm.0600 | 500 | Failed to connect to the FTP server. |
| Cdm.0700 | 500 | Failed to connect to the SFTP server. |
| Cdm.0800 | 500 | Failed to connect to the OBS server. |
| Cdm.0801 | 500 | OBS bucket [%s] does not exist. |
| Cdm.0900 | 500 | Table [%s] does not exist. |
| Cdm.0901 | 500 | Failed to connect to the database server. Cause: %s |
| Cdm.0902 | 500 | Failed to execute the SQL statement. Cause: %s |
| Cdm.0903 | 500 | Failed to obtain metadata. Cause: %s |
| Cdm.0904 | 500 | An error occurred while retrieving data from the result. Cause: %s |
| Cdm.0905 | 500 | No partition column is found. |
| Cdm.0906 | 500 | No boundary is found in the partition column. |
| Cdm.0911 | 500 | The table name or SQL must be specified. |
| Cdm.0912 | 500 | The table name and SQL cannot be specified at the same time. |
| Cdm.0913 | 500 | Schema and SQL cannot be specified at the same time. |
| Cdm.0914 | 500 | Partition column is mandatory for query-based import. |
| Cdm.0915 | 500 | The SQL-based import mode and columnList cannot be used at the same time. |
| Cdm.0916 | 500 | Last value is mandatory for incremental read. |
| Cdm.0917 | 500 | Last value cannot be obtained without field check. |
| Cdm.0918 | 500 | If no transfer table is specified, shouldClearStageTable cannot be specified. |
| Cdm.0921 | 500 | Type %s is not supported. |
| Cdm.0925 | 500 | The partition column contains unsupported values. |
| Cdm.0926 | 500 | Failed to obtain the schema. Cause: %s |
| Cdm.0927 | 500 | The transfer table is not empty. |
| Cdm.0928 | 500 | An error occurred when data is migrated from the transfer table to the destination table. |
| Cdm.0931 | 500 | Schema column size [%s] does not match the result set column size [%s]. |
| Cdm.0932 | 500 | Failed to obtain the maximum value of the field. |
| Cdm.0934 | 500 | Multiple tables of the same name exist in different schemas or catalogs. |
| Cdm.0935 | 500 | No primary key. Specify the partition column. |
| Cdm.0936 | 500 | The number of dirty data records reaches the upper limit. |
| Cdm.0940 | 500 | Failed to match the exact table name. |
| Cdm.0941 | 500 | Failed to connect to the server. Cause: %s |
| Cdm.0950 | 500 | Failed to connect to the database with the existing authentication information. |
| Cdm.0960 | 500 | serverlist must be specified. |
| Cdm.0961 | 500 | Invalid serverlist format. |
| Cdm.0962 | 500 | The host IP address must be specified. |
| Cdm.0963 | 500 | The host port must be specified. |
| Cdm.0964 | 500 | The database must be specified. |
| Cdm.1000 | 500 | Hive table [%s] does not exist. |
| Cdm.1010 | 500 | Invalid URI [%s]. The URI must be either null or a valid URI. |
| Cdm.1011 | 500 | Failed to connect to Hive. Cause: %s |
| Cdm.1100 | 500 | Table [%s] does not exist. |
| Cdm.1101 | 500 | Failed to obtain the link. Cause: %s |
| Cdm.1102 | 500 | Failed to create the table. Cause: %s |
| Cdm.1103 | 500 | No rowkey is found. |
| Cdm.1104 | 500 | Failed to open the table. Cause: %s |
| Cdm.1105 | 500 | Failed to initialize the job. Cause: %s |
| Cdm.1111 | 500 | The table name is mandatory. |
| Cdm.1112 | 500 | The import mode is mandatory. |
| Cdm.1113 | 500 | Whether to clear data before import has not been specified. |
| Cdm.1114 | 500 | The rowkey is empty. Set it in field mapping. |
| Cdm.1115 | 500 | Columns is empty. Set it in field mapping. |
| Cdm.1116 | 500 | Duplicate column names. Set it in field mapping. |
| Cdm.1117 | 500 | An error occurred when checking whether the table exists. Cause: %s |
| Cdm.1118 | 500 | Table %s does not contain the %s column family. |
| Cdm.1119 | 500 | The number of column families is %s and the number of columns is %s. |
| Cdm.1120 | 500 | The table contains data. Clear the table data or set the configuration item to specify whether to clear the table data before the import. |
| Cdm.1121 | 500 | Failed to close the link. Cause: %s |
| Cdm.1201 | 500 | Failed to connect to the Redis server. Cause: %s |
| Cdm.1202 | 500 | Failed to connect to the Redis cluster in single-node mode. |
| Cdm.1203 | 500 | Failed to extract data from the Redis server. Cause: %s |
| Cdm.1205 | 500 | The prefix of the Redis value cannot be blank. |
| Cdm.1206 | 500 | The storage type of the Redis value must be string or hash. |
| Cdm.1207 | 500 | When the value of the storage type is string, valueSeparator must be specified. |
| Cdm.1208 | 500 | columnList of Redis must be specified. |
| Cdm.1209 | 500 | keySeparator of the Redis key cannot be empty. |
| Cdm.1210 | 500 | primaryKeyList of Redis must be specified. |
| Cdm.1211 | 500 | primaryKeyList of Redis must exist in columnList. |
| Cdm.1212 | 500 | databaseType of Redis must be Original or DCS. |
| Cdm.1213 | 500 | serverlist of Redis must be specified. |
| Cdm.1301 | 500 | Failed to connect to the MongoDB server. Cause: %s |
| Cdm.1302 | 500 | Failed to extract data from the MongoDB server. Cause: %s |
| Cdm.1304 | 500 | The collection of MongoDB servers must be specified. |
| Cdm.1305 | 500 | serverlist of MongoDB must be specified. |
| Cdm.1306 | 500 | The database name of the MongoDB service must be specified. |
| Cdm.1307 | 500 | serverlist of MongoDB must be specified. |
| Cdm.1400 | 500 | Failed to connect to the NAS server. |
| Cdm.1401 | 500 | No permission to access the NAS server. |
| Cdm.1501 | 500 | Failed to connect to the Elasticsearch server. Cause: %s |
| Cdm.1502 | 500 | Failed to write data to the Elasticsearch server. Cause: %s |
| Cdm.1503 | 500 | Failed to close the Elasticsearch link. Cause: %s |
| Cdm.1504 | 500 | An error occurred when obtaining the Elasticsearch index. Cause: %s |
| Cdm.1505 | 500 | An error occurred when obtaining the Elasticsearch type. Cause: %s |
| Cdm.1506 | 500 | An error occurred when obtaining the Elasticsearch field. Cause: %s |
| Cdm.1507 | 500 | An error occurred when obtaining the Elasticsearch sample data. Cause: %s |
| Cdm.1508 | 500 | The host name or IP address of the Elasticsearch server must be specified. |
| Cdm.1509 | 500 | The port of the Elasticsearch server must be specified. |
| Cdm.1510 | 500 | The Elasticsearch index must be specified. |
| Cdm.1511 | 500 | The Elasticsearch type must be specified. |
| Cdm.1512 | 500 | columnList of Elasticsearch must be specified. |
| Cdm.1513 | 500 | columnList must contain the field type definition. |
| Cdm.1514 | 500 | columnList must contain primaryKey. |
| Cdm.1515 | 500 | An error occurred when resolving the JSON character string. Cause: %s |
| Cdm.1516 | 500 | The column name %s is invalid. |
| Cdm.1517 | 500 | An error occurred when obtaining the number of documents. |
| Cdm.1518 | 500 | The partition fails to be created. |
| Cdm.1519 | 500 | An error occurred during data extraction. |
| Cdm.1520 | 500 | Failed to obtain the type. Cause: %s |
| Cdm.1601 | 500 | Failed to connect to the server. |
| Cdm.1603 | 500 | Failed to obtain the sample value of the %s topic. |
| Cdm.1604 | 500 | No data exists in topic %s. |
| Cdm.1605 | 500 | Invalid brokerList. |
8.3 Obtaining the Project ID, Account Name, and AK/SK

- In Authentication, you need to enter the account name, username, or Access Key ID (AK)/Secret Access Key (SK).
- When calling APIs, you need to specify a project ID (project_id) in certain URLs.

To obtain a project ID, perform the following operations:

How to Obtain
1. Log in to the management console.
2. Hover the cursor over the username in the upper right corner and select My Credential from the drop-down list.
3. On the displayed My Credential page, view the username, account name, or project ID.

   Figure 8-1 Viewing the project ID

4. Click Access Keys to view the added AK.
   - If no AK/SK pair is available, click Add Access Key to add one.
   - If you have generated the AK/SK, find the AK/SK file you downloaded. Generally, the file name is credentials.csv.
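Once obtained, the project ID plugs into the {project_id} placeholder of the request URIs used throughout this reference. A minimal sketch, in which both the endpoint host and the project ID are placeholders rather than real values:

```python
# Where the project ID fits into a CDM request URL. The endpoint host below is
# hypothetical; substitute the CDM endpoint for your region.
ENDPOINT = "https://cdm.example-region.example.com"  # placeholder endpoint
project_id = "0605767a9980d5762fbcc00b3537e757"      # example value from My Credential

url = f"{ENDPOINT}/v1.1/{project_id}/clusters"       # e.g. the cluster-list API
print(url)
```

The same substitution applies to every URI in Table 7-1, with {cluster_id}, {jobName}, and the other path parameters filled in the same way.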
A Change History
2019-07-23
This is the twentieth official release.
- Modified the JSON format. For details, see Querying a Job and Link to DIS.

2019-07-02
This is the nineteenth official release.
- Supported the export from Tencent Cloud Object Storage (COS) to OBS and updated sections Creating a Link and Link to KODO/COS.
- Supported the migration of historical files based on the creation time and updated sections From Object Storage, From HDFS, and From FTP/SFTP/NAS/SFS.

2019-06-06
This is the eighteenth official release.
- Optimized the document outline.
- Modified From a Relational Database because concurrent migration of Oracle partition tables is supported.
- Modified From a Relational Database: When exporting data from a MySQL database, you can migrate all tables starting with a prefix (the number and types of fields in the tables must be the same) to a Hive table.
- Modified To OBS: When data is imported to OBS, CDM allows you to use KMS keys of other projects to encrypt data.
- CDM can connect to the MRS 2.0 cluster.

2019-04-03
This is the seventeenth official release.
- Added Permissions Policies and Supported Actions because the IAM fine-grained authorization policy is supported.
- Modified Job Parameter Description because the job group is supported.
- Modified the following sections because the AES-256-GCM (NoPadding) algorithm can be used to encrypt and decrypt files in file migrations:
  - From Object Storage
  - From HDFS
  - From FTP/SFTP/NAS/SFS
  - From HTTP/HTTPS
  - To OBS
  - To HDFS
  - To FTP/SFTP/NAS/SFS

2019-01-30
This is the sixteenth official release.
- Modified From DIS and From Kafka because the binary format is supported in data migration of Kafka, DIS, and DMS.
- Modified From HTTP/HTTPS because you can choose whether to delete the query parameter from the name of the objects uploaded to OBS.

2019-01-07
This is the fifteenth official release.
- Added Creating and Executing a Job in a Random Cluster.
- Modified the parameters in From HDFS because when CDM extracts files from HDFS, a snapshot can be created for the HDFS source directory.
- Modified the parameters in To Elasticsearch/Cloud Search Service because for streaming jobs that continuously write data to Elasticsearch, CDM periodically creates indexes and writes data to the indexes, which helps you delete expired data.

2018-11-30
This is the fourteenth official release.
- Added the following sections because CDM supports data migration of DMS Kafka and Amazon S3:
  - Link to Amazon S3
  - Link to DMS Kafka
- Modified the following sections because the file separator can be customized and the heading line of the table can be written to the target file:
  - From Object Storage
  - From FTP/SFTP/NAS/SFS
  - To OBS
  - To FTP/SFTP/NAS/SFS
- Modified To a Relational Database because during relational database migration, data in the destination table can be deleted based on the WHERE clause.

2018-10-30
This is the thirteenth official release.
- Modified To OBS because after data is successfully imported to OBS, a job success marker file is written.
- Added To DIS.

2018-09-30
This is the twelfth official release.
- Modified the following sections because CDM supports data migration of SFS:
  - Link to NAS/SFS
  - From FTP/SFTP/NAS/SFS
  - To FTP/SFTP/NAS/SFS
- Changed the domain name of OBS to a bucket-level domain name in Link to OBS.
- Added the custom SQL statement used for data export to From a Relational Database.
- Added the following sections:
  - Link to CloudTable OpenTSDB
  - From OpenTSDB
  - To OpenTSDB

2018-08-31
This is the eleventh official release.
- Added To DIS.
- Modified the following sections because the time filter function is added:
  - From Object Storage
  - From HDFS
  - From FTP/SFTP/NAS/SFS

2018-08-03
This is the tenth official release.
- Added the cdm.xlarge flavor to Creating a Cluster.
- Updated the Start Job by Marker File and Use Quote Character functions for file migration jobs in the following sections:
  - From Object Storage
  - From FTP/SFTP/NAS/SFS
  - To OBS
  - To FTP/SFTP/NAS/SFS
- Added incremental migration in MySQL Binlog mode to From a Relational Database.
- Added the data type match function of HBase to To HBase/CloudTable.

2018-07-05
This is the ninth official release.
- Added Link to KODO/COS.
- Added the linkConfig.runMode parameter to the HDFS and HBase links. The following sections are updated:
  - Link to HDFS
  - Link to HBase
- Modified From Elasticsearch/Cloud Search Service because when Elasticsearch or Cloud Search Service is the migration source, data can be filtered using filter criteria.
- Updated To OBS. When exported to OBS, the data is fragmented by size.

2018-06-02
This is the eighth official release.
- Added From HTTP/HTTPS.
- Added the CarbonData format of OBS to the following sections:
  - From Object Storage
  - To OBS
- Added the retry function after configuration failure to Job Parameter Description.
- Added the pipeline and authentication configurations of the Elasticsearch data source to the following sections:
  - Link to Elasticsearch/Cloud Search Service
  - To Elasticsearch/Cloud Search Service

2018-05-04
This is the seventh official release.
- Added the following sections:
  - Stopping a Cluster
  - Starting a Cluster
  - Link to CloudTable
  - To DDS
- Modified the following sections:
  - Creating a Link
  - Link to a Relational Database
  - Link to OBS
  - Link to OSS on Alibaba Cloud
  - Link to HDFS
  - Link to HBase
- Changed Elasticsearch Service of HUAWEI CLOUD to Cloud Search Service.
- Changed Unlimited Query Service (UQuery) of HUAWEI CLOUD to Data Lake Insight (DLI).

2018-04-09
This is the sixth official release.
- Added the following sections:
  - Link to OSS on Alibaba Cloud
  - Link to DLI
  - To DLI
- Modified the following sections:
  - From Object Storage
  - From HBase/CloudTable
  - To OBS

2018-01-31
This is the fifth official release.
- Deleted the type parameter from the configs field.
- Modified the description of several link parameters and job parameters.
- Added a sample JSON file to each connector.
- Added a sample JSON file to each type of data source.
- Added the GaussDB connector to Link to a Relational Database.
- Added the following sections:
  - Before You Start
  - API Overview
  - Getting Started
- Deleted the following sections:
  - Common Message Headers
  - AK/SK Authentication
  - Common Return Values
  - Common Exception Response

2018-01-11
This is the fourth official release.
- Added the following sections:
  - Link to HDFS
  - Link to CloudTable
  - Link to Kafka
- Deleted the description of the datastore parameter and modified the description of the instances parameter in Creating a Cluster.
- Deleted the description of the datastore parameter and added the description of the version parameter in Querying the Cluster List.
- Deleted the description of the datastore parameter and added the description of the version parameter in Querying Cluster Details.
- Modified the value of the fromJobConfig.disConsumerStrategy parameter in From DIS.
- Modified Creating a Job in a Specified Cluster.
- Deleted the following sections:
  - Querying Versions
  - Querying Flavors
  - Create MRS Data Source
  - Querying MRS Data Source
  - Deleting MRS Data Source
  - Public Configuration Data Structures

2017-11-30
This is the third official release.
- Added the NAS, DIS, and Elasticsearch connectors.
- Added the NAS and DIS data sources to the job parameter description of source links.
- Added the NAS and Elasticsearch data sources to the job parameter description of destination links.

2017-10-31
This is the second official release.
- Replaced projectId with project_id in this document.
- Replaced datastoreId with datastore_id in this document.
- Replaced clusterId with cluster_id in this document.
- Added and modified the URI format and parameters in several sections.
- Added section "MongoDB Link".
- Added section "Common Exception Response".

2017-09-30
This is the first official release.