Module dataflow
@pulumi/gcp > dataflow
class Job
extends CustomResource
Creates a job on Dataflow, Google's managed service for running Apache Beam pipelines on Google Compute Engine. For more information see the official documentation for Beam and Dataflow.
Example Usage
import * as pulumi from "@pulumi/pulumi";
import * as gcp from "@pulumi/gcp";
const bigDataJob = new gcp.dataflow.Job("big_data_job", {
    parameters: {
        baz: "qux",
        foo: "bar",
    },
    tempGcsLocation: "gs://my-bucket/tmp_dir",
    templateGcsPath: "gs://my-bucket/templates/template_file",
});
Note on “destroy” / “apply”
There are many types of Dataflow jobs. Some Dataflow jobs run constantly, getting new data from (e.g.) a GCS bucket, and outputting data continuously. Some jobs process a set amount of data then terminate. All jobs can fail while running due to programming errors or other issues. In this way, Dataflow jobs are different from most other Pulumi / Google resources.
The Dataflow resource is considered ‘existing’ while it is in a nonterminal state. If it reaches a terminal state (e.g. ‘FAILED’, ‘COMPLETE’, ‘CANCELLED’), it will be recreated on the next ‘apply’. This is as expected for jobs which run continuously, but may surprise users who use this resource for other kinds of Dataflow jobs.
A Dataflow job which is ‘destroyed’ may be “cancelled” or “drained”. If “cancelled”, the job terminates and any data written remains where it is, but no new data will be processed. If “drained”, no new data will enter the pipeline, but any data currently in the pipeline will finish being processed. The default is “cancelled”, but if a user sets onDelete to “drain” in the configuration, you may experience a long wait for your pulumi destroy to complete.
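For example, a minimal sketch (the job name and bucket paths below are placeholders) of a job configured to drain rather than cancel on deletion:
import * as gcp from "@pulumi/gcp";
const streamingJob = new gcp.dataflow.Job("streaming_job", {
    templateGcsPath: "gs://my-bucket/templates/template_file",
    tempGcsLocation: "gs://my-bucket/tmp_dir",
    // Drain in-flight data instead of cancelling when the job is destroyed.
    onDelete: "drain",
});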
constructor
new Job(name: string, args: JobArgs, opts?: pulumi.CustomResourceOptions)
Create a Job resource with the given unique name, arguments, and options.
name
The unique name of the resource.
args
The arguments to use to populate this resource's properties.
opts
A bag of options that control this resource's behavior.
method get
public static get(name: string, id: pulumi.Input<pulumi.ID>, state?: JobState, opts?: pulumi.CustomResourceOptions): Job
Get an existing Job resource’s state with the given name, ID, and optional extra properties used to qualify the lookup.
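A sketch of the lookup, assuming the job's provider-assigned ID is already known (the ID below is a placeholder):
import * as gcp from "@pulumi/gcp";
// Adopt an existing Dataflow job into the program by name and ID.
const existingJob = gcp.dataflow.Job.get("existing-job", "2019-01-01_00_00_00-1234567890123456789");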
method getProvider
method isInstance
static isInstance(obj: any): boolean
Returns true if the given object is an instance of CustomResource. This is designed to work even when multiple copies of the Pulumi SDK have been loaded into the same process.
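A sketch of the runtime check, reusing the bigDataJob resource from the example above:
if (gcp.dataflow.Job.isInstance(bigDataJob)) {
    // bigDataJob was produced by the Pulumi SDK's resource classes.
    console.log("bigDataJob is a managed Dataflow Job resource");
}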
property id
id: Output<ID>;
id is the provider-assigned unique ID for this managed resource. It is set during deployments and may be missing (undefined) during planning phases.
property maxWorkers
public maxWorkers: pulumi.Output<number | undefined>;
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
property name
public name: pulumi.Output<string>;
A unique name for the resource, required by Dataflow.
property onDelete
public onDelete: pulumi.Output<string | undefined>;
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
property parameters
public parameters: pulumi.Output<{[key: string]: any} | undefined>;
Key/Value pairs to be passed to the Dataflow job (as used in the template).
property project
public project: pulumi.Output<string | undefined>;
The project in which the resource belongs. If it is not provided, the provider project is used.
property region
public region: pulumi.Output<string | undefined>;
The region in which the created job should run.
property state
public state: pulumi.Output<string>;
The current state of the resource, selected from the JobState enum.
property tempGcsLocation
public tempGcsLocation: pulumi.Output<string>;
A writeable location on GCS for the Dataflow job to dump its temporary data.
property templateGcsPath
public templateGcsPath: pulumi.Output<string>;
The GCS path to the Dataflow job template.
property urn
urn: Output<URN>;
urn is the stable logical URN used to distinctly address a resource, both before and after deployments.
property zone
public zone: pulumi.Output<string | undefined>;
The zone in which the created job should run. If it is not provided, the provider zone is used.
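These output properties can be surfaced as stack outputs; a sketch reusing the bigDataJob resource from the example above:
// Export the provider-assigned job ID and its current state.
export const jobId = bigDataJob.id;
export const jobState = bigDataJob.state;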
interface JobArgs
The set of arguments for constructing a Job resource.
property maxWorkers
maxWorkers?: pulumi.Input<number>;
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
property name
name?: pulumi.Input<string>;
A unique name for the resource, required by Dataflow.
property onDelete
onDelete?: pulumi.Input<string>;
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
property parameters
parameters?: pulumi.Input<{[key: string]: any}>;
Key/Value pairs to be passed to the Dataflow job (as used in the template).
property project
project?: pulumi.Input<string>;
The project in which the resource belongs. If it is not provided, the provider project is used.
property region
region?: pulumi.Input<string>;
The region in which the created job should run.
property tempGcsLocation
tempGcsLocation: pulumi.Input<string>;
A writeable location on GCS for the Dataflow job to dump its temporary data.
property templateGcsPath
templateGcsPath: pulumi.Input<string>;
The GCS path to the Dataflow job template.
property zone
zone?: pulumi.Input<string>;
The zone in which the created job should run. If it is not provided, the provider zone is used.
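A sketch of a Job that sets the optional arguments alongside the required ones (the bucket paths, project, region, and zone values are placeholders):
const tunedJob = new gcp.dataflow.Job("tuned_job", {
    // Required arguments.
    templateGcsPath: "gs://my-bucket/templates/template_file",
    tempGcsLocation: "gs://my-bucket/tmp_dir",
    // Optional arguments.
    parameters: { inputFile: "gs://my-bucket/input.txt" },
    maxWorkers: 5,
    onDelete: "drain",
    project: "my-project",
    region: "us-central1",
    zone: "us-central1-a",
});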
interface JobState
Input properties used for looking up and filtering Job resources.
property maxWorkers
maxWorkers?: pulumi.Input<number>;
The number of workers permitted to work on the job. More workers may improve processing speed at additional cost.
property name
name?: pulumi.Input<string>;
A unique name for the resource, required by Dataflow.
property onDelete
onDelete?: pulumi.Input<string>;
One of “drain” or “cancel”. Specifies behavior of deletion during pulumi destroy. See above note.
property parameters
parameters?: pulumi.Input<{[key: string]: any}>;
Key/Value pairs to be passed to the Dataflow job (as used in the template).
property project
project?: pulumi.Input<string>;
The project in which the resource belongs. If it is not provided, the provider project is used.
property region
region?: pulumi.Input<string>;
The region in which the created job should run.
property state
state?: pulumi.Input<string>;
The current state of the resource, selected from the JobState enum.
property tempGcsLocation
tempGcsLocation?: pulumi.Input<string>;
A writeable location on GCS for the Dataflow job to dump its temporary data.
property templateGcsPath
templateGcsPath?: pulumi.Input<string>;
The GCS path to the Dataflow job template.
property zone
zone?: pulumi.Input<string>;
The zone in which the created job should run. If it is not provided, the provider zone is used.
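A sketch of passing known state properties to Job.get when looking up an existing job (the ID, project, and region values are placeholders):
const adoptedJob = gcp.dataflow.Job.get("adopted-job", "existing-job-id", {
    project: "my-project",
    region: "us-central1",
});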