PySparkJob
import type { PySparkJob } from "https://googleapis.deno.dev/v1/dataproc:v1.ts";
A Dataproc job for running Apache PySpark (https://spark.apache.org/docs/0.9.0/python-programming-guide.html) applications on YARN.
§Properties
archiveUris?: string[]
Optional. HCFS URIs of archives to be extracted into the working directory of each executor. Supported file types: .jar, .tar, .tar.gz, .tgz, and .zip.
args?: string[]
Optional. The arguments to pass to the driver. Do not include arguments, such as --conf, that can be set as job properties, since a collision may occur that causes an incorrect job submission.
fileUris?: string[]
Optional. HCFS URIs of files to be placed in the working directory of each executor. Useful for naively parallel tasks.
jarFileUris?: string[]
Optional. HCFS URIs of jar files to add to the CLASSPATHs of the Python driver and tasks.
loggingConfig?: LoggingConfig
Optional. The runtime log config for job execution.
mainPythonFileUri?: string
Required. The HCFS URI of the main Python file to use as the driver. Must be a .py file.
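The properties above can be combined into a job object like the following sketch. The interface here is a local, simplified stand-in for the imported `PySparkJob` type (the real definition lives at the import URL above), and all bucket names and file paths are illustrative only. Note that Spark configuration such as `spark.executor.memory` goes in `properties`, not in `args`, per the warning on `args`.

```typescript
// Simplified local sketch of the PySparkJob shape, reconstructed from the
// field descriptions above; the canonical type is the one imported from
// https://googleapis.deno.dev/v1/dataproc:v1.ts.
interface PySparkJob {
  archiveUris?: string[];
  args?: string[];
  fileUris?: string[];
  jarFileUris?: string[];
  loggingConfig?: { driverLogLevels?: Record<string, string> };
  mainPythonFileUri?: string; // Required by the API; must be a .py file.
  properties?: Record<string, string>;
}

// Example job: bucket and object names are hypothetical.
const job: PySparkJob = {
  mainPythonFileUri: "gs://example-bucket/jobs/word_count.py",
  args: ["gs://example-bucket/input/", "gs://example-bucket/output/"],
  jarFileUris: ["gs://example-bucket/libs/gcs-connector.jar"],
  // Set --conf-style options here rather than in args, to avoid collisions.
  properties: { "spark.executor.memory": "4g" },
};

console.log(job.mainPythonFileUri);
```

This object would be placed in the `pysparkJob` field of a Dataproc job submission request; only `mainPythonFileUri` is required, so the remaining fields can be omitted for a minimal job.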