alicloud_nas_data_flow

Provides a Network Attached Storage (NAS) Data Flow resource.

For information about Network Attached Storage (NAS) Data Flow and how to use it, see What is Data Flow.

-> NOTE: Available since v1.153.0.

Example Usage

Basic Usage


data "alicloud_nas_zones" "example" {
  file_system_type = "cpfs"
}

resource "alicloud_vpc" "example" {
  vpc_name   = "terraform-example"
  cidr_block = "172.17.3.0/24"
}

resource "alicloud_vswitch" "example" {
  vswitch_name = "terraform-example"
  cidr_block   = "172.17.3.0/24"
  vpc_id       = alicloud_vpc.example.id
  zone_id      = data.alicloud_nas_zones.example.zones[1].zone_id
}

resource "alicloud_nas_file_system" "example" {
  protocol_type    = "cpfs"
  storage_type     = "advance_200"
  file_system_type = "cpfs"
  capacity         = 3600
  description      = "terraform-example"
  zone_id          = data.alicloud_nas_zones.example.zones[1].zone_id
  vpc_id           = alicloud_vpc.example.id
  vswitch_id       = alicloud_vswitch.example.id
}

resource "alicloud_nas_mount_target" "example" {
  file_system_id = alicloud_nas_file_system.example.id
  vswitch_id     = alicloud_vswitch.example.id
}

resource "random_integer" "example" {
  max = 99999
  min = 10000
}

resource "alicloud_oss_bucket" "example" {
  bucket = "example-value-${random_integer.example.result}"
  acl    = "private"
  tags = {
    cpfs-dataflow = "true"
  }
}

resource "alicloud_nas_fileset" "example" {
  file_system_id   = alicloud_nas_mount_target.example.file_system_id
  description      = "terraform-example"
  file_system_path = "/example_path/"
}

resource "alicloud_nas_data_flow" "example" {
  fset_id              = alicloud_nas_fileset.example.fileset_id
  description          = "terraform-example"
  file_system_id       = alicloud_nas_file_system.example.id
  source_security_type = "SSL"
  source_storage       = join("", ["oss://", alicloud_oss_bucket.example.bucket])
  throughput           = 600
}

Argument Reference

The following arguments are supported:

  • description - (Optional) The description of the data flow. Restrictions:
    • Must be 2 to 128 characters in length.
    • Must start with a letter or a Chinese character, and cannot start with http:// or https://.
    • Can contain digits, colons (:), underscores (_), and hyphens (-).
  • dry_run - (Optional) Specifies whether to perform a dry run, which checks the validity of the request without creating the resource.
  • file_system_id - (Required, ForceNew) The ID of the file system.
  • fset_id - (Required, ForceNew) The ID of the Fileset.
  • source_security_type - (Optional, ForceNew) The security protection type of the source storage. Specify this if the source storage must be accessed through security protection. Valid values:
    • NONE (default): The source storage does not need to be accessed through security protection.
    • SSL: Access is protected through SSL certificates.
  • source_storage - (Required, ForceNew) The access path of the source storage. Format: <storage type>://<path>, where:
    • storage type: currently only OSS is supported.
    • path: the name of the OSS bucket.
      • Can contain only lowercase letters, digits, and hyphens (-), and must start and end with a lowercase letter or a digit.
      • Must be 8 to 128 characters in length.
      • Must use UTF-8 encoding.
      • Cannot start with http:// or https://.
  • status - (Optional) The status of the Data flow. Valid values: Running, Stopped.
  • throughput - (Required) The maximum transmission bandwidth of the data flow, in MB/s. Valid values: 600, 1200, 1500. NOTE: The transmission bandwidth of the data flow must be less than the I/O bandwidth of the file system.
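
As a sketch of how the optional status argument combines with the required ones, the following keeps a data flow in the Running state. The fileset, file system, and bucket references are assumed to be defined as in the Example Usage section above; the resource name running_example is illustrative.

```terraform
resource "alicloud_nas_data_flow" "running_example" {
  # References assume the resources from Example Usage exist.
  fset_id              = alicloud_nas_fileset.example.fileset_id
  file_system_id       = alicloud_nas_file_system.example.id
  source_security_type = "NONE"
  source_storage       = "oss://${alicloud_oss_bucket.example.bucket}"
  throughput           = 600

  # Optional: desired state of the data flow (Running or Stopped).
  status = "Running"
}
```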

Attributes Reference

The following attributes are exported:

  • id - The resource ID of the Data Flow. It is formatted as <file_system_id>:<data_flow_id>.
  • data_flow_id - The ID of the Data flow.
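
The exported data_flow_id can be referenced elsewhere in a configuration, for example through an output. This sketch assumes the alicloud_nas_data_flow.example resource from the Example Usage section:

```terraform
# Expose the server-generated data flow ID as an output value.
output "nas_data_flow_id" {
  value = alicloud_nas_data_flow.example.data_flow_id
}
```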

Timeouts

The timeouts block allows you to specify timeouts for certain actions:

  • create - (Defaults to 10 mins) Used when creating the Data Flow.
  • update - (Defaults to 10 mins) Used when updating the Data Flow.
  • delete - (Defaults to 10 mins) Used when deleting the Data Flow.
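
The timeouts block is declared inside the resource itself. A minimal sketch, assuming the resources from Example Usage; the 15m values are illustrative, not recommendations:

```terraform
resource "alicloud_nas_data_flow" "example_with_timeouts" {
  fset_id        = alicloud_nas_fileset.example.fileset_id
  file_system_id = alicloud_nas_file_system.example.id
  source_storage = "oss://${alicloud_oss_bucket.example.bucket}"
  throughput     = 600

  # Override the default 10-minute create/delete timeouts.
  timeouts {
    create = "15m"
    delete = "15m"
  }
}
```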

Import

Network Attached Storage (NAS) Data Flow can be imported using the id, e.g.

$ terraform import alicloud_nas_data_flow.example <file_system_id>:<data_flow_id>