
CreateEnvironmentInput

import type { CreateEnvironmentInput } from "https://aws-api.deno.dev/v0.3/services/mwaa.ts?docs=full";

This section contains the Amazon Managed Workflows for Apache Airflow (MWAA) API reference documentation for creating an environment. For more information, see Get started with Amazon Managed Workflows for Apache Airflow.

interface CreateEnvironmentInput {
  AirflowConfigurationOptions?: {
    [key: string]: string | null | undefined;
  } | null;
  AirflowVersion?: string | null;
  DagS3Path: string;
  EnvironmentClass?: string | null;
  ExecutionRoleArn: string;
  KmsKey?: string | null;
  LoggingConfiguration?: LoggingConfigurationInput | null;
  MaxWorkers?: number | null;
  MinWorkers?: number | null;
  Name: string;
  NetworkConfiguration: NetworkConfiguration;
  PluginsS3ObjectVersion?: string | null;
  PluginsS3Path?: string | null;
  RequirementsS3ObjectVersion?: string | null;
  RequirementsS3Path?: string | null;
  Schedulers?: number | null;
  SourceBucketArn: string;
  Tags?: {
    [key: string]: string | null | undefined;
  } | null;
  WebserverAccessMode?: WebserverAccessMode | null;
  WeeklyMaintenanceWindowStart?: string | null;
}
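
The required members are DagS3Path, ExecutionRoleArn, Name, NetworkConfiguration, and SourceBucketArn; everything else is optional. A minimal sketch of assembling this input as a plain object literal (the ARNs, subnet IDs, and security group ID below are placeholders, and the SubnetIds/SecurityGroupIds member names of NetworkConfiguration are assumed from the MWAA API shape):

```typescript
// Shape follows CreateEnvironmentInput from
// https://aws-api.deno.dev/v0.3/services/mwaa.ts?docs=full
// All identifiers below are placeholder values, not real resources.
const input = {
  // Required members
  Name: "MyMWAAEnvironment",
  DagS3Path: "dags",
  ExecutionRoleArn: "arn:aws:iam::123456789:role/my-execution-role",
  SourceBucketArn: "arn:aws:s3:::my-airflow-bucket-unique-name",
  NetworkConfiguration: {
    // Member names assumed from the MWAA NetworkConfiguration type
    SubnetIds: ["subnet-0000000000example1", "subnet-0000000000example2"],
    SecurityGroupIds: ["sg-0000000000example0"],
  },
  // Optional tuning
  EnvironmentClass: "mw1.small",
  MinWorkers: 1,
  MaxWorkers: 10,
  Schedulers: 2,
};
```

The object would then be passed to the service client's create-environment call; the exact client API is outside the scope of this type reference.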

§Properties

§
AirflowConfigurationOptions?: {
  [key: string]: string | null | undefined;
} | null
[src]

A list of key-value pairs containing the Apache Airflow configuration options you want to attach to your environment. To learn more, see Apache Airflow configuration options.
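
A sketch of what such a map might look like; the keys follow Apache Airflow's "<section>.<option>" naming, and the two options below are illustrative examples, not recommendations:

```typescript
// Airflow configuration options are passed as a string-to-string map.
// Both keys and values must be strings, even for numeric settings.
const airflowConfigurationOptions: { [key: string]: string } = {
  "core.default_task_retries": "3",          // illustrative option
  "webserver.default_ui_timezone": "UTC",    // illustrative option
};
```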

§
AirflowVersion?: string | null
[src]

The Apache Airflow version for your environment. If no value is specified, it defaults to the latest version. Valid values: 1.10.12, 2.0.2. To learn more, see Apache Airflow versions on Amazon Managed Workflows for Apache Airflow (MWAA).

§
DagS3Path: string
[src]

The relative path to the DAGs folder on your Amazon S3 bucket. For example, dags. To learn more, see Adding or updating DAGs.

§
EnvironmentClass?: string | null
[src]

The environment class type. Valid values: mw1.small, mw1.medium, mw1.large. To learn more, see Amazon MWAA environment class.

§
ExecutionRoleArn: string
[src]

The Amazon Resource Name (ARN) of the execution role for your environment. An execution role is an Amazon Web Services Identity and Access Management (IAM) role that grants MWAA permission to access Amazon Web Services services and resources used by your environment. For example, arn:aws:iam::123456789:role/my-execution-role. To learn more, see Amazon MWAA Execution role.

§
KmsKey?: string | null
[src]

The Amazon Web Services Key Management Service (KMS) key to encrypt the data in your environment. You can use an Amazon Web Services owned CMK, or a Customer managed CMK (advanced). To learn more, see Create an Amazon MWAA environment.

§
LoggingConfiguration?: LoggingConfigurationInput | null
[src]

Defines the Apache Airflow logs to send to CloudWatch Logs.

§
MaxWorkers?: number | null
[src]

The maximum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. For example, 20. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving either the one worker that is included with your environment or the number you specify in MinWorkers.

§
MinWorkers?: number | null
[src]

The minimum number of workers that you want to run in your environment. MWAA scales the number of Apache Airflow workers up to the number you specify in the MaxWorkers field. When there are no more tasks running, and no more in the queue, MWAA disposes of the extra workers, leaving the worker count you specify in the MinWorkers field. For example, 2.

§
Name: string
[src]

The name of the Amazon MWAA environment. For example, MyMWAAEnvironment.

§
NetworkConfiguration: NetworkConfiguration
[src]

The VPC networking components used to secure and enable network traffic between the Amazon Web Services resources for your environment. To learn more, see About networking on Amazon MWAA.

§
PluginsS3ObjectVersion?: string | null
[src]

The version of the plugins.zip file on your Amazon S3 bucket. A version must be specified each time a plugins.zip file is updated. To learn more, see How S3 Versioning works.

§
PluginsS3Path?: string | null
[src]

The relative path to the plugins.zip file on your Amazon S3 bucket. For example, plugins.zip. If specified, the file version (PluginsS3ObjectVersion) is also required. To learn more, see Installing custom plugins.

§
RequirementsS3ObjectVersion?: string | null
[src]

The version of the requirements.txt file on your Amazon S3 bucket. A version must be specified each time a requirements.txt file is updated. To learn more, see How S3 Versioning works.

§
RequirementsS3Path?: string | null
[src]

The relative path to the requirements.txt file on your Amazon S3 bucket. For example, requirements.txt. If specified, the file version (RequirementsS3ObjectVersion) is also required. To learn more, see Installing Python dependencies.

§
Schedulers?: number | null
[src]

The number of Apache Airflow schedulers to run in your environment. Valid values:

  • v2.0.2 - Accepts values between 2 and 5. Defaults to 2.
  • v1.10.12 - Accepts 1.
§
SourceBucketArn: string
[src]

The Amazon Resource Name (ARN) of the Amazon S3 bucket where your DAG code and supporting files are stored. For example, arn:aws:s3:::my-airflow-bucket-unique-name. To learn more, see Create an Amazon S3 bucket for Amazon MWAA.

§
Tags?: {
  [key: string]: string | null | undefined;
} | null
[src]

The key-value tag pairs you want to associate with your environment. For example, "Environment": "Staging". To learn more, see Tagging Amazon Web Services resources.

§
WebserverAccessMode?: WebserverAccessMode | null
[src]

The Apache Airflow Web server access mode. To learn more, see Apache Airflow access modes.

§
WeeklyMaintenanceWindowStart?: string | null
[src]

The day and time of the week in Coordinated Universal Time (UTC) 24-hour standard time to start weekly maintenance updates of your environment in the following format: DAY:HH:MM. For example: TUE:03:30. You can specify a start time in 30-minute increments only.
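
A sketch of a client-side check for the DAY:HH:MM format described above (30-minute increments only). The service performs its own validation; this helper is purely illustrative:

```typescript
// Three-letter day abbreviations assumed to be uppercase English day names.
const DAYS = ["MON", "TUE", "WED", "THU", "FRI", "SAT", "SUN"];

/** Returns true when `value` matches DAY:HH:MM on a 30-minute boundary. */
function isValidMaintenanceWindowStart(value: string): boolean {
  const match = /^([A-Z]{3}):([0-2]\d):([0-5]\d)$/.exec(value);
  if (!match) return false;
  const [, day, hh, mm] = match;
  // Hour must be 00-23; minutes must fall on a 30-minute increment.
  return DAYS.includes(day) && Number(hh) < 24 && (mm === "00" || mm === "30");
}
```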