
ImportContext

import type { ImportContext } from "https://googleapis.deno.dev/v1/sqladmin:v1.ts";

Database instance import context.

interface ImportContext {
  bakImportOptions?: {
    bakType?:
      | "BAK_TYPE_UNSPECIFIED"
      | "FULL"
      | "DIFF"
      | "TLOG";
    encryptionOptions?: {
      certPath?: string;
      keepEncrypted?: boolean;
      pvkPassword?: string;
      pvkPath?: string;
    };
    noRecovery?: boolean;
    recoveryOnly?: boolean;
    stopAt?: Date;
    stopAtMark?: string;
    striped?: boolean;
  };
  csvImportOptions?: {
    columns?: string[];
    escapeCharacter?: string;
    fieldsTerminatedBy?: string;
    linesTerminatedBy?: string;
    quoteCharacter?: string;
    table?: string;
  };
  database?: string;
  fileType?:
    | "SQL_FILE_TYPE_UNSPECIFIED"
    | "SQL"
    | "CSV"
    | "BAK"
    | "TDE";
  importUser?: string;
  kind?: string;
  sqlImportOptions?: {
    parallel?: boolean;
    postgresImportOptions?: {
      clean?: boolean;
      ifExists?: boolean;
    };
    threads?: number;
  };
  tdeImportOptions?: {
    certificatePath?: string;
    name?: string;
    privateKeyPassword?: string;
    privateKeyPath?: string;
  };
  uri?: string;
}
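
As a minimal sketch (the bucket, database, table, and column names are invented for illustration, not taken from this reference), an ImportContext describing a CSV import could be built like this:

import type { ImportContext } from "https://googleapis.deno.dev/v1/sqladmin:v1.ts";

// Placeholder values throughout; only the field names and literal types
// come from the interface above.
const csvImport: ImportContext = {
  kind: "sql#importContext",
  fileType: "CSV",
  uri: "gs://example-bucket/users.csv",
  database: "appdb",
  csvImportOptions: {
    table: "users",
    columns: ["id", "email", "created_at"],
  },
};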

§Properties

bakImportOptions?: {
  bakType?:
    | "BAK_TYPE_UNSPECIFIED"
    | "FULL"
    | "DIFF"
    | "TLOG";
  encryptionOptions?: {
    certPath?: string;
    keepEncrypted?: boolean;
    pvkPassword?: string;
    pvkPath?: string;
  };
  noRecovery?: boolean;
  recoveryOnly?: boolean;
  stopAt?: Date;
  stopAtMark?: string;
  striped?: boolean;
}

Import parameters specific to SQL Server .BAK files.
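
As an illustrative sketch only (the Cloud Storage paths and password are placeholders), a differential .BAK restore that is left without recovery might be described as:

import type { ImportContext } from "https://googleapis.deno.dev/v1/sqladmin:v1.ts";

// Placeholder certificate paths and password; "DIFF" selects a differential
// backup, and noRecovery (in SQL Server terms, NORECOVERY) leaves the
// database in a restoring state so further backups can be applied.
const bakOptions: NonNullable<ImportContext["bakImportOptions"]> = {
  bakType: "DIFF",
  noRecovery: true,
  striped: false,
  encryptionOptions: {
    certPath: "gs://example-bucket/certs/server.cer",
    pvkPath: "gs://example-bucket/certs/server.pvk",
    pvkPassword: "example-password",
  },
};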

csvImportOptions?: {
  columns?: string[];
  escapeCharacter?: string;
  fieldsTerminatedBy?: string;
  linesTerminatedBy?: string;
  quoteCharacter?: string;
  table?: string;
}

Options for importing data as CSV.
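
A hedged example of the CSV options on their own; the table and column names are invented, and whether the character fields take literal characters or hex ASCII codes should be confirmed against the upstream Cloud SQL Admin API reference:

import type { ImportContext } from "https://googleapis.deno.dev/v1/sqladmin:v1.ts";

// Hypothetical table and columns; the character fields are shown as literal
// characters here, which may instead need to be hex ASCII codes per the
// upstream API documentation.
const csvOptions: NonNullable<ImportContext["csvImportOptions"]> = {
  table: "orders",
  columns: ["order_id", "customer_id", "total"],
  fieldsTerminatedBy: ",",
  linesTerminatedBy: "\n",
  quoteCharacter: "\"",
  escapeCharacter: "\\",
};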

database?: string

The target database for the import. If fileType is SQL, this field is required only if the import file does not specify a database, and is overridden by any database specification in the import file. For entire instance parallel import operations, the database is overridden by the database name stored in the subdirectory name. If fileType is CSV, one database must be specified.

fileType?: "SQL_FILE_TYPE_UNSPECIFIED" | "SQL" | "CSV" | "BAK" | "TDE"

The file type for the specified uri. SQL: The file contains SQL statements. CSV: The file contains CSV data.

importUser?: string

The PostgreSQL user for this import operation. PostgreSQL instances only.

kind?: string

This is always sql#importContext.

sqlImportOptions?: {
  parallel?: boolean;
  postgresImportOptions?: {
    clean?: boolean;
    ifExists?: boolean;
  };
  threads?: number;
}

Optional. Options for importing data from SQL statements.
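
For illustration (the flag combination and thread count below are assumptions, not a documented recipe), a parallel PostgreSQL import might set:

import type { ImportContext } from "https://googleapis.deno.dev/v1/sqladmin:v1.ts";

// Sketch of a parallel import; the thread count is arbitrary, and clean /
// ifExists correspond to pg_restore-style --clean and --if-exists behavior.
const sqlOptions: NonNullable<ImportContext["sqlImportOptions"]> = {
  parallel: true,
  threads: 4,
  postgresImportOptions: {
    clean: true,
    ifExists: true,
  },
};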

tdeImportOptions?: {
  certificatePath?: string;
  name?: string;
  privateKeyPassword?: string;
  privateKeyPath?: string;
}

Optional. Import parameters specific to SQL Server TDE certificates.
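
A hedged sketch of the TDE certificate parameters; every value below is a placeholder:

import type { ImportContext } from "https://googleapis.deno.dev/v1/sqladmin:v1.ts";

// Placeholder certificate name, Cloud Storage paths, and password for a
// TDE-protected SQL Server import.
const tdeOptions: NonNullable<ImportContext["tdeImportOptions"]> = {
  name: "example-tde-cert",
  certificatePath: "gs://example-bucket/tde/example.cer",
  privateKeyPath: "gs://example-bucket/tde/example.pvk",
  privateKeyPassword: "example-password",
};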

uri?: string

Path to the import file in Cloud Storage, in the form gs://bucketName/fileName. Compressed gzip files (.gz) are supported when fileType is SQL. The instance must have write permissions to the bucket and read access to the file.
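
Tying the pieces together, a sketch of a gzip-compressed SQL import; the bucket, object, database, and user names are placeholders:

import type { ImportContext } from "https://googleapis.deno.dev/v1/sqladmin:v1.ts";

// Gzip-compressed SQL dumps are accepted when fileType is "SQL"; all names
// below are invented for illustration.
const sqlImport: ImportContext = {
  kind: "sql#importContext",
  fileType: "SQL",
  uri: "gs://example-bucket/dumps/appdb.sql.gz",
  database: "appdb",
  importUser: "postgres",
};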