TransferConfig

import type { TransferConfig } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

Represents a data transfer configuration. A transfer configuration contains all metadata needed to perform a data transfer. For example, destination_dataset_id specifies where data should be stored. When a new transfer configuration is created, the specified destination_dataset_id is created when needed and shared with the appropriate data source service account.

interface TransferConfig {
  dataRefreshWindowDays?: number;
  readonly datasetRegion?: string;
  dataSourceId?: string;
  destinationDatasetId?: string;
  disabled?: boolean;
  displayName?: string;
  emailPreferences?: EmailPreferences;
  encryptionConfiguration?: EncryptionConfiguration;
  name?: string;
  readonly nextRunTime?: Date;
  notificationPubsubTopic?: string;
  readonly ownerInfo?: UserInfo;
  params?: {
    [key: string]: any;
  };
  schedule?: string;
  scheduleOptions?: ScheduleOptions;
  readonly state?:
    | "TRANSFER_STATE_UNSPECIFIED"
    | "PENDING"
    | "RUNNING"
    | "SUCCEEDED"
    | "FAILED"
    | "CANCELLED";
  readonly updateTime?: Date;
  userId?: bigint;
}
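
For illustration, here is a minimal sketch of a configuration for a Cloud Storage transfer. The data source ID and params keys follow the Cloud Storage transfer documentation linked under params below; treat the exact names and values as assumptions to verify for your setup.

import type { TransferConfig } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

// A minimal sketch of a Cloud Storage transfer configuration.
// The params keys are specific to the google_cloud_storage data
// source; consult that data source's documentation for the full list.
const config: TransferConfig = {
  displayName: "Nightly GCS load",
  dataSourceId: "google_cloud_storage",
  destinationDatasetId: "my_dataset",
  schedule: "every 24 hours",
  params: {
    data_path_template: "gs://my-bucket/exports/*.csv",
    destination_table_name_template: "daily_events",
    file_format: "CSV",
  },
};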

§Properties

§ dataRefreshWindowDays?: number

The number of days to look back to automatically refresh the data. For example, if data_refresh_window_days = 10, then every day BigQuery reingests data for [today-10, today-1], rather than ingesting data for just [today-1]. Only valid if the data source supports the feature. Set the value to 0 to use the default value.
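
As a sketch of the window semantics, this hypothetical helper (not part of the API) computes the date range a daily run would re-ingest for a given dataRefreshWindowDays value:

// Hypothetical helper: the [today - days, today - 1] window that a
// daily run re-ingests when dataRefreshWindowDays is set to `days`.
function refreshWindow(days: number, today: Date = new Date()): [Date, Date] {
  const dayMs = 24 * 60 * 60 * 1000;
  return [
    new Date(today.getTime() - days * dayMs),
    new Date(today.getTime() - dayMs),
  ];
}

// With dataRefreshWindowDays = 10, a run today covers [today-10, today-1]:
console.log(refreshWindow(10));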

§ readonly datasetRegion?: string

Output only. Region in which the BigQuery dataset is located.

§ dataSourceId?: string

Data source ID. This cannot be changed once the data transfer is created. The full list of available data source IDs can be retrieved through an API call: https://cloud.google.com/bigquery-transfer/docs/reference/datatransfer/rest/v1/projects.locations.dataSources/list

§ destinationDatasetId?: string

The BigQuery target dataset ID.

§ disabled?: boolean

Whether this config is disabled. When set to true, no runs will be scheduled for this transfer config.

§ displayName?: string

User-specified display name for the data transfer.

§ emailPreferences?: EmailPreferences

Email notifications will be sent according to these preferences to the email address of the user who owns this transfer config.

§ encryptionConfiguration?: EncryptionConfiguration

The encryption configuration. Currently it is only used to specify an optional KMS key name. The BigQuery service account of your project must be granted permission to use the key. Read methods return the key name currently in effect; write methods apply the key if it is present, or otherwise fall back to the project default keys.
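
As a sketch of supplying a customer-managed key, assuming EncryptionConfiguration exposes a single kmsKeyName field as in the REST reference (the key resource name below is illustrative):

import type { TransferConfig } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

const withCmek: TransferConfig = {
  encryptionConfiguration: {
    // Illustrative KMS key resource name; the BigQuery service account
    // of your project must be allowed to use this key.
    kmsKeyName:
      "projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key",
  },
};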

§ name?: string

Identifier. The resource name of the transfer config. Transfer config names take one of two forms: projects/{project_id}/locations/{region}/transferConfigs/{config_id} or projects/{project_id}/transferConfigs/{config_id}, where config_id is usually a UUID, though this is neither guaranteed nor required. The name is ignored when creating a transfer config.
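
For illustration, both forms with made-up project and config IDs:

// Location-qualified form:
const regional =
  "projects/my-project/locations/us/transferConfigs/6b9f0a3e-1c2d-4e5f-8a7b-9c0d1e2f3a4b";
// Project-level form:
const projectLevel =
  "projects/my-project/transferConfigs/6b9f0a3e-1c2d-4e5f-8a7b-9c0d1e2f3a4b";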

§ readonly nextRunTime?: Date

Output only. The next time the data transfer will run.

§ notificationPubsubTopic?: string

Pub/Sub topic where notifications will be sent after transfer runs associated with this transfer config finish. The format for specifying a Pub/Sub topic is: projects/{project_id}/topics/{topic_id}
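
An illustrative topic name in that format (project and topic IDs are made up):

const topic = "projects/my-project/topics/transfer-run-notifications";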

§ readonly ownerInfo?: UserInfo

Output only. Information about the user whose credentials are used to transfer data. Populated only for transferConfigs.get requests. If the user information is not available, this field will not be populated.

§ params?: { [key: string]: any }

Parameters specific to each data source. For more information, see the bq tab in the 'Setting up a data transfer' section for each data source. For example, the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
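
As another sketch, params for a scheduled query might look like the following. The data source ID and field names are assumptions based on the scheduled-queries documentation; verify them against the per-source docs.

import type { TransferConfig } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

const scheduledQuery: TransferConfig = {
  dataSourceId: "scheduled_query",
  params: {
    // @run_date and {run_date} are substituted per run.
    query:
      "SELECT * FROM `my-project.my_dataset.events` WHERE event_date = @run_date",
    destination_table_name_template: "events_{run_date}",
    write_disposition: "WRITE_TRUNCATE",
  },
};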

§ schedule?: string

Data transfer schedule. If the data source does not support a custom schedule, this should be empty. If it is empty, the default value for the data source will be used. The specified times are in UTC. Examples of valid formats: 1st,3rd monday of month 15:30, every wed,fri of jan,jun 13:15, and first sunday of quarter 00:00. See more explanation about the format here: https://cloud.google.com/appengine/docs/flexible/python/scheduling-jobs-with-cron-yaml#the_schedule_format NOTE: The minimum interval between recurring transfers depends on the data source; refer to the documentation for your data source.
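
For reference, the page's own examples as schedule strings, plus a plain interval form that the linked cron format also accepts:

const schedules: string[] = [
  "1st,3rd monday of month 15:30",
  "every wed,fri of jan,jun 13:15",
  "first sunday of quarter 00:00",
  "every 24 hours", // simple interval form
];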

§ scheduleOptions?: ScheduleOptions

Options customizing the data transfer schedule.

§ readonly state?: "TRANSFER_STATE_UNSPECIFIED" | "PENDING" | "RUNNING" | "SUCCEEDED" | "FAILED" | "CANCELLED"

Output only. State of the most recently updated transfer run.

§ readonly updateTime?: Date

Output only. Data transfer modification time. Ignored by server on input.

§ userId?: bigint

Deprecated. Unique ID of the user on whose behalf the transfer is done.