
TransferRun

import type { TransferRun } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

Represents a data transfer run.

interface TransferRun {
  readonly dataSourceId?: string;
  readonly destinationDatasetId?: string;
  readonly emailPreferences?: EmailPreferences;
  readonly endTime?: Date;
  errorStatus?: Status;
  name?: string;
  readonly notificationPubsubTopic?: string;
  readonly params?: {
    [key: string]: any;
  };
  runTime?: Date;
  readonly schedule?: string;
  scheduleTime?: Date;
  readonly startTime?: Date;
  state?:
    | "TRANSFER_STATE_UNSPECIFIED"
    | "PENDING"
    | "RUNNING"
    | "SUCCEEDED"
    | "FAILED"
    | "CANCELLED";
  readonly updateTime?: Date;
  userId?: bigint;
}
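
Every readonly field above is populated by the server, so the usual way to obtain a TransferRun is to read one back through the generated client. A minimal sketch, assuming the same module exports a BigQueryDataTransfer client whose method names follow the generated REST surface (projectsLocationsTransferConfigsRunsGet here is illustrative; check the module's exports for the exact names and auth setup):

import {
  BigQueryDataTransfer,
  TransferRun,
} from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

// Assumed: the generated client can be constructed without arguments for
// illustration; real calls need credentials.
const client = new BigQueryDataTransfer();

// Fetch a single run by its resource name. The readonly fields are
// output only and filled in by the server.
const run: TransferRun = await client
  .projectsLocationsTransferConfigsRunsGet(
    "projects/my-project/locations/us/transferConfigs/my-config/runs/my-run",
  );

console.log(`state=${run.state} started=${run.startTime?.toISOString()}`);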

Properties

readonly dataSourceId?: string

Output only. Data source id.

readonly destinationDatasetId?: string

Output only. The BigQuery target dataset id.

readonly emailPreferences?: EmailPreferences

Output only. Email notifications will be sent according to these preferences to the email address of the user who owns the transfer config this run was derived from.

readonly endTime?: Date

Output only. Time when transfer run ended. Parameter ignored by server for input requests.

errorStatus?: Status

Status of the transfer run.

name?: string

Identifier. The resource name of the transfer run. Transfer run names have the form projects/{project_id}/locations/{location}/transferConfigs/{config_id}/runs/{run_id}. The name is ignored when creating a transfer run.
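
Because the name follows that fixed pattern, the individual ids can be recovered with a regular expression. A small illustrative helper (not part of the module):

import type { TransferRun } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

const RUN_NAME =
  /^projects\/([^/]+)\/locations\/([^/]+)\/transferConfigs\/([^/]+)\/runs\/([^/]+)$/;

// Split a TransferRun.name into its component ids; throws on anything else.
function parseRunName(name: string) {
  const m = name.match(RUN_NAME);
  if (!m) throw new Error(`not a transfer run resource name: ${name}`);
  const [, projectId, location, configId, runId] = m;
  return { projectId, location, configId, runId };
}

function runId(run: TransferRun): string | undefined {
  return run.name ? parseRunName(run.name).runId : undefined;
}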

readonly notificationPubsubTopic?: string

Output only. Pub/Sub topic where a notification will be sent after this transfer run finishes. The format for specifying a Pub/Sub topic is: projects/{project_id}/topics/{topic_id}

§
readonly params?: {
[key: string]: any;
}
[src]

Output only. Parameters specific to each data source. For more information see the bq tab in the 'Setting up a data transfer' section for each data source. For example the parameters for Cloud Storage transfers are listed here: https://cloud.google.com/bigquery-transfer/docs/cloud-storage-transfer#bq
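
Since params is an untyped map, reads should be defensive. A sketch for a Cloud Storage run; the key names used here (data_path_template, destination_table_name_template, file_format) are taken from the Cloud Storage transfer parameters documented at the link above, and other data sources use different keys:

import type { TransferRun } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

// Summarize a Cloud Storage transfer run from its untyped params map.
function describeGcsRun(run: TransferRun): string {
  const params = run.params ?? {};
  const source = params["data_path_template"] ?? "<unknown path>";
  const table = params["destination_table_name_template"] ?? "<unknown table>";
  const format = params["file_format"] ?? "<unknown format>";
  return `${source} -> ${table} (${format})`;
}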

runTime?: Date

For batch transfer runs, specifies the date and time when the data should be ingested.

readonly schedule?: string

Output only. Describes the schedule of this transfer run if it was created as part of a regular schedule. For batch transfer runs that are scheduled manually, this is empty. NOTE: the system might choose to delay the schedule depending on the current load, so schedule_time doesn't always match this.

scheduleTime?: Date

Minimum time after which a transfer run can be started.

readonly startTime?: Date

Output only. Time when transfer run was started. Parameter ignored by server for input requests.

state?: "TRANSFER_STATE_UNSPECIFIED" | "PENDING" | "RUNNING" | "SUCCEEDED" | "FAILED" | "CANCELLED"

Data transfer run state. Ignored for input requests.
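
Callers that poll a run can stop once it reaches a final state. A minimal sketch; treating SUCCEEDED, FAILED, and CANCELLED as terminal is an assumption based on the enum names:

import type { TransferRun } from "https://googleapis.deno.dev/v1/bigquerydatatransfer:v1.ts";

// True once a run has reached a state it will not leave.
function isTerminal(run: TransferRun): boolean {
  switch (run.state) {
    case "SUCCEEDED":
    case "FAILED":
    case "CANCELLED":
      return true;
    default:
      // PENDING, RUNNING, TRANSFER_STATE_UNSPECIFIED, or unset.
      return false;
  }
}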

readonly updateTime?: Date

Output only. Last time the data transfer run state was updated.

userId?: bigint

Deprecated. Unique ID of the user on whose behalf the transfer is done.