
Import or Migrate from Amazon S3 or GCS to TiDB Cloud

This document describes how to use Amazon Simple Storage Service (Amazon S3) or Google Cloud Storage (GCS) as a staging area for importing or migrating data into TiDB Cloud.

Import or migrate from Amazon S3 to TiDB Cloud

If your organization is using TiDB Cloud as a service on AWS, you can use Amazon S3 as a staging area for importing or migrating data into TiDB Cloud.

Prerequisites

Before migrating data from Amazon S3 to TiDB Cloud, ensure you have administrator access to your corporate-owned AWS account.

Step 1. Create an Amazon S3 bucket and prepare source data files

  1. Create an Amazon S3 bucket in your corporate-owned AWS account.

    For more information, see Creating a bucket in the AWS User Guide.

  2. If you are migrating data from an upstream database, you need to export the source data first.

    For more information, see Migrate Data from MySQL-Compatible Databases. For a rough Dumpling export sketch, see the example after this list.

  3. If your source data is in local files, you can upload the files to the Amazon S3 bucket using either the Amazon S3 Console or the AWS CLI.

    • To upload files using the Amazon S3 Console, see Uploading objects in the AWS User Guide.

    • To upload files using the AWS CLI, use the following command:

      aws s3 sync <Local path> <Amazon S3 bucket URL>
      

      For example:

      aws s3 sync ./tidbcloud-samples-us-west-2/ s3://tidb-cloud-source-data
      
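If the upstream database is MySQL-compatible, a common way to produce the source files mentioned in step 2 is Dumpling (run through TiUP). The following is only a rough sketch rather than the exact procedure from the migration guide; the user, password, host, and output directory are placeholders:

  # Export the upstream data as SQL files into a local directory (all values are placeholders).
  tiup dumpling -u <user> -p '<password>' -h <upstream-host> -P 3306 \
    --filetype sql -o ./export-data -r 200000 -F 256MiB

  # Then stage the exported files in your bucket, as in the example above.
  aws s3 sync ./export-data s3://tidb-cloud-source-data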

Step 2. Configure Amazon S3 access

To allow TiDB Cloud to access the source data in your Amazon S3 bucket, you need to configure the bucket access for TiDB Cloud and get the Role-ARN. Once the configuration is done for one TiDB cluster in a project, all TiDB clusters in that project can use the same Role-ARN to access your Amazon S3 bucket.

For detailed steps, see Configure Amazon S3 access.
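
If you prefer the AWS CLI to the console, the outline below is only a sketch of what the configuration amounts to: an IAM role that TiDB Cloud can assume, with read-only access to the bucket. The account ID, external ID, role name, policy name, and bucket name are placeholders; take the exact principal, external ID, and required permissions from the Configure Amazon S3 access guide.

  # trust-policy.json: allow the TiDB Cloud AWS account shown in the console to assume
  # the role, scoped by the external ID also shown in the console (placeholders below).
  {
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Principal": { "AWS": "arn:aws:iam::<tidb-cloud-account-id>:root" },
      "Action": "sts:AssumeRole",
      "Condition": { "StringEquals": { "sts:ExternalId": "<external-id>" } }
    }]
  }

  # s3-access-policy.json: read-only access to the source data bucket.
  {
    "Version": "2012-10-17",
    "Statement": [{
      "Effect": "Allow",
      "Action": ["s3:GetObject", "s3:ListBucket", "s3:GetBucketLocation"],
      "Resource": [
        "arn:aws:s3:::tidb-cloud-source-data",
        "arn:aws:s3:::tidb-cloud-source-data/*"
      ]
    }]
  }

  # Create the role, attach the inline policy, and note the role ARN from the output.
  aws iam create-role --role-name tidb-cloud-import \
    --assume-role-policy-document file://trust-policy.json
  aws iam put-role-policy --role-name tidb-cloud-import \
    --policy-name tidb-cloud-import-s3 \
    --policy-document file://s3-access-policy.json

  # The Role-ARN to paste into the console has the form:
  # arn:aws:iam::<your-aws-account-id>:role/tidb-cloud-import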

Step 3. Import data into TiDB Cloud

  1. Log in to the TiDB Cloud console, and navigate to the Clusters page.

  2. Locate your target cluster, click ... in the upper-right corner of the cluster area, and select Import Data. The Data Import page is displayed.

  3. On the Data Import page, fill in the following information:

    • Data Format: choose the format of your data.
    • Location: AWS
    • Bucket URL: fill in the bucket URL of your source data.
    • Role-ARN: enter the Role-ARN you obtained in Step 2.
    • Target Cluster: shows the cluster name and the region name.

    If the region of the bucket is different from that of your cluster, confirm that the cross-region data transfer meets your compliance requirements. Click Next.

    TiDB Cloud starts validating whether it can access your data in the specified bucket URL. After validation, TiDB Cloud tries to scan all the files in the data source using the default file naming pattern, and returns a scan summary result on the left side of the next page. If you get the AccessDenied error, see Troubleshoot Access Denied Errors during Data Import from S3.

  4. Modify the file patterns and add the table filter rules if needed. For a typical file layout that the default pattern is intended to match, see the example after this list.

  5. Click Next.

  6. On the Preview page, confirm the data to be imported and then click Start Import.
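
For reference when adjusting the file patterns in step 4: data exported with Dumpling typically lands in the bucket with names like the following (the database and table names are placeholders), which the default file naming pattern is intended to match:

  s3://tidb-cloud-source-data/
    metadata
    mydb-schema-create.sql
    mydb.mytable-schema.sql
    mydb.mytable.000000000.sql
    mydb.mytable.000000001.sql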

After the data is imported, if you want to remove the Amazon S3 access of TiDB Cloud, simply delete the policy that you added in Step 2. Configure Amazon S3 access.
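
With the AWS CLI, and assuming the placeholder role and inline policy names used in the sketch under Step 2 above, the cleanup could look like this:

  # Remove the bucket-access policy from the role.
  aws iam delete-role-policy --role-name tidb-cloud-import --policy-name tidb-cloud-import-s3
  # Optionally delete the role itself once no further imports are planned.
  aws iam delete-role --role-name tidb-cloud-import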

Import or migrate from GCS to TiDB Cloud

If your organization is using TiDB Cloud as a service on Google Cloud Platform (GCP), you can use Google Cloud Storage (GCS) as a staging area for importing or migrating data into TiDB Cloud.

Prerequisites

Before migrating data from GCS to TiDB Cloud, ensure the following:

  • You have administrator access to your corporate-owned GCP account.
  • You have administrator access to the TiDB Cloud Management Portal.

Step 1. Create a GCS bucket and prepare source data files

  1. Create a GCS bucket in your corporate-owned GCP account.

    For more information, see Creating storage buckets in the Google Cloud Storage documentation. To create the bucket with gsutil instead, see the sketch after this list.

  2. If you are migrating data from an upstream database, you need to export the source data first.

    For more information, see Install TiUP and Export data from MySQL-compatible databases.
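
For item 1 above, the bucket can also be created from the command line with gsutil; the project ID, location, and bucket name below are placeholders:

  gsutil mb -p <your-gcp-project> -l us-west1 gs://<your-source-data-bucket>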

Step 2. Configure GCS access

To allow TiDB Cloud to access the source data in your GCS bucket, you need to configure GCS access for each pair of TiDB Cloud project (hosted on GCP) and GCS bucket. Once the configuration is done for one cluster in a project, all database clusters in that project can access the GCS bucket.

For detailed steps, see Configure GCS access.
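
As a rough illustration only (the Configure GCS access guide defines the exact service account and role that TiDB Cloud needs), granting a service account read access to a bucket with gsutil looks like this; the service account, project, and bucket name are placeholders:

  gsutil iam ch \
    serviceAccount:<tidb-cloud-service-account>@<gcp-project>.iam.gserviceaccount.com:roles/storage.objectViewer \
    gs://<your-source-data-bucket>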

Step 3. Copy source data files to GCS and import data into TiDB Cloud

  1. Copy your source data files to your GCS bucket using either the Google Cloud Console or gsutil.

    • To upload data using the Google Cloud Console, see Uploading objects in the Google Cloud Storage documentation.

    • To upload data using gsutil, use the following command:

      gsutil rsync -r <Local path> <GCS URL>
      

      For example:

      gsutil rsync -r ./tidbcloud-samples-us-west-2/ gs://target-url-in-gcs
      
  2. From the TiDB Cloud console, navigate to the Clusters page, and then click the name of your target cluster to go to its overview page. In the Import area, click Import Data, and then fill in the import-related information on the Data Import page.
