
PySparkJob

import type { PySparkJob } from "https://googleapis.deno.dev/v1/dataproc:v1.ts";

A Dataproc job for running Apache PySpark (https://spark.apache.org/docs/0.9.0/python-programming-guide.html) applications on YARN.

interface PySparkJob {
  archiveUris?: string[];
  args?: string[];
  fileUris?: string[];
  jarFileUris?: string[];
  loggingConfig?: LoggingConfig;
  mainPythonFileUri?: string;
  properties?: {
    [key: string]: string;
  };
  pythonFileUris?: string[];
}
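
A minimal sketch of how a PySparkJob value might be filled in, based on the interface above. All gs:// URIs, file names, and property values below are hypothetical placeholders; only mainPythonFileUri is required.

import type { PySparkJob } from "https://googleapis.deno.dev/v1/dataproc:v1.ts";

// Hypothetical job definition: the bucket, file names, and property values
// are illustrative placeholders, not values taken from this reference.
const job: PySparkJob = {
  mainPythonFileUri: "gs://example-bucket/jobs/word_count.py", // must be a .py file
  args: ["gs://example-bucket/input/", "gs://example-bucket/output/"],
  pythonFileUris: ["gs://example-bucket/libs/helpers.py"],
  jarFileUris: ["gs://example-bucket/libs/connector.jar"],
  archiveUris: ["gs://example-bucket/deps/venv.tar.gz"],
  properties: {
    "spark.executor.memory": "4g",
  },
};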

Properties

archiveUris?: string[]

Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.

args?: string[]

Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.

fileUris?: string[]

Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.

jarFileUris?: string[]

Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Python driver and tasks.

loggingConfig?: LoggingConfig

Optional. The runtime log config for job execution.

mainPythonFileUri?: string

Required. The HCFS URI of the main Python file to use as the driver. Must be a .py file.

properties?: {
  [key: string]: string;
}

Optional. A mapping of property names to values, used to configure PySpark. Properties that conflict with values set by the Dataproc API might be overwritten. Can include properties set in /etc/spark/conf/spark-defaults.conf and classes in user code.
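
As an illustration, the map below sets a few standard Spark settings; the keys are well-known Spark configuration properties, but the values are placeholders chosen for the example, not Dataproc defaults.

// Illustrative configuration only; values are placeholders, not defaults.
const properties: { [key: string]: string } = {
  "spark.executor.memory": "4g",
  "spark.executor.cores": "2",
  "spark.driver.memory": "2g",
};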

pythonFileUris?: string[]

Optional. HCFS file URIs of Python files to pass to the PySpark framework. Supported file types: .py, .egg, and .zip.