<html><body>
<style>
body, h1, h2, h3, div, span, p, pre, a {
margin: 0;
padding: 0;
border: 0;
font-weight: inherit;
font-style: inherit;
font-size: 100%;
font-family: inherit;
vertical-align: baseline;
}
body {
font-size: 13px;
padding: 1em;
}
h1 {
font-size: 26px;
margin-bottom: 1em;
}
h2 {
font-size: 24px;
margin-bottom: 1em;
}
h3 {
font-size: 20px;
margin-bottom: 1em;
margin-top: 1em;
}
pre, code {
line-height: 1.5;
font-family: Monaco, 'DejaVu Sans Mono', 'Bitstream Vera Sans Mono', 'Lucida Console', monospace;
}
pre {
margin-top: 0.5em;
}
h1, h2, h3, p {
font-family: Arial, sans-serif;
}
h1, h2, h3 {
border-bottom: solid #CCC 1px;
}
.toc_element {
margin-top: 0.5em;
}
.firstline {
margin-left: 2em;
}
.method {
margin-top: 1em;
border: solid 1px #CCC;
padding: 1em;
background: #EEE;
}
.details {
font-weight: bold;
font-size: 14px;
}
</style>
<h1><a href="storagetransfer_v1.html">Storage Transfer API</a> . <a href="storagetransfer_v1.transferJobs.html">transferJobs</a></h1>
<h2>Instance Methods</h2>
<p class="toc_element">
<code><a href="#create">create(body=None, x__xgafv=None)</a></code></p>
<p class="firstline">Creates a transfer job that runs periodically.</p>
<p class="toc_element">
<code><a href="#get">get(jobName, projectId=None, x__xgafv=None)</a></code></p>
<p class="firstline">Gets a transfer job.</p>
<p class="toc_element">
<code><a href="#list">list(filter=None, pageToken=None, pageSize=None, x__xgafv=None)</a></code></p>
<p class="firstline">Lists transfer jobs.</p>
<p class="toc_element">
<code><a href="#list_next">list_next(previous_request, previous_response)</a></code></p>
<p class="firstline">Retrieves the next page of results.</p>
<p class="toc_element">
<code><a href="#patch">patch(jobName, body=None, x__xgafv=None)</a></code></p>
<p class="firstline">Updates a transfer job. Updating a job&#x27;s transfer spec does not affect transfer operations that are running already.</p>
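<p>The request body accepted by <code>create</code> (shown under Method Details below) is a plain Python dictionary. As a minimal sketch, assuming hypothetical placeholder values (the project ID, bucket names, and start date are illustrative, not taken from this page), a daily GCS-to-GCS job body can be assembled like this:</p>

```python
def make_transfer_job_body(project_id, src_bucket, dst_bucket, start_date):
    """Assemble a TransferJob request body for a recurring GCS-to-GCS copy.

    start_date is a (year, month, day) tuple; omitting scheduleEndDate
    makes the job recur daily until it is disabled or deleted.
    """
    year, month, day = start_date
    return {
        "projectId": project_id,
        "status": "ENABLED",  # status MUST be set in a CreateTransferJobRequest
        "transferSpec": {
            "gcsDataSource": {"bucketName": src_bucket},
            "gcsDataSink": {"bucketName": dst_bucket},
            # deleteObjectsUniqueInSink and
            # deleteObjectsFromSourceAfterTransfer are mutually exclusive,
            # so enable at most one of them.
            "transferOptions": {"overwriteObjectsAlreadyExistingInSink": True},
        },
        "schedule": {
            "scheduleStartDate": {"year": year, "month": month, "day": day},
            # 02:00 UTC; transfers may start later than this time.
            "startTimeOfDay": {"hours": 2, "minutes": 0, "seconds": 0, "nanos": 0},
        },
    }

body = make_transfer_job_body("my-project", "src-bucket", "dst-bucket", (2020, 6, 1))

# Submitting the body requires google-api-python-client and Application
# Default Credentials; uncomment to run against the live API:
# from googleapiclient.discovery import build
# service = build("storagetransfer", "v1")
# job = service.transferJobs().create(body=body).execute()
```

<p>For paging through <code>list</code> results, pass each request/response pair to <code>list_next(previous_request, previous_response)</code>; it returns <code>None</code> once the final page has been consumed.</p>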
<h3>Method Details</h3>
<div class="method">
<code class="details" id="create">create(body=None, x__xgafv=None)</code>
<pre>Creates a transfer job that runs periodically.
Args:
body: object, The request body.
The object takes the form of:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
&quot;projectId&quot;: &quot;A String&quot;, # The ID of the Google Cloud Platform Project that owns the job.
&quot;status&quot;: &quot;A String&quot;, # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# **Note:** The effect of the new job status takes place during a subsequent
# job run. For example, if you change the job status from
# ENABLED to DISABLED, and an operation
# spawned by the transfer is running, the status change would not affect the
# current operation.
&quot;creationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was created.
&quot;lastModificationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was last modified.
&quot;description&quot;: &quot;A String&quot;, # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
&quot;transferSpec&quot;: { # Configuration for running a transfer. # Transfer specification.
&quot;awsS3DataSource&quot;: { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data resource, an object&#x27;s name is the S3 object&#x27;s key name.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. S3 Bucket name (see
# [Creating a
# bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
&quot;awsAccessKey&quot;: { # AWS access key (see
# [AWS Security
# Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
#
# Required. AWS access key used to sign the API requests to the AWS S3
# bucket. Permissions on the bucket must be granted to the access ID of the
# AWS access key.
&quot;accessKeyId&quot;: &quot;A String&quot;, # Required. AWS access key ID.
&quot;secretAccessKey&quot;: &quot;A String&quot;, # Required. AWS secret access key. This field is not returned in RPC
# responses.
},
},
&quot;objectConditions&quot;: { # Conditions that determine which objects will be transferred. Applies only
# to S3 and Cloud Storage objects.
#
# The &quot;last modification time&quot; refers to the time of the
# last change to the object&#x27;s content or metadata — specifically, this is
# the `updated` property of Cloud Storage objects and the `LastModified`
# field of S3 objects.
#
# Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects&#x27; &quot;last modification time&quot; do not exclude objects in a data sink.
&quot;excludePrefixes&quot;: [ # `exclude_prefixes` must follow the requirements described for
# include_prefixes.
#
# The max size of `exclude_prefixes` is 1000.
&quot;A String&quot;,
],
&quot;maxTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# `NOW` - `max_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob,
# `NOW` refers to the start_time of the
# `TransferOperation`.
&quot;lastModifiedSince&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# this timestamp and objects that don&#x27;t have a &quot;last modification time&quot; are
# transferred.
#
# The `last_modified_since` and `last_modified_before` fields can be used
# together for chunked data processing. For example, consider a script that
# processes each day&#x27;s worth of data at a time. For that you&#x27;d set each
# of the fields as follows:
#
# * `last_modified_since` to the start of the day
#
# * `last_modified_before` to the end of the day
&quot;lastModifiedBefore&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before this
# timestamp and objects that don&#x27;t have a &quot;last modification time&quot; will be
# transferred.
&quot;minTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before
# `NOW` - `min_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob, `NOW`
# refers to the start_time of the
# `TransferOperation`.
&quot;includePrefixes&quot;: [ # If `include_prefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `include_prefixes`
# and that do not start with any of the exclude_prefixes. If
# `include_prefixes` is not specified, all objects except those that have
# names starting with one of the `exclude_prefixes` must satisfy the object
# conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
# and must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace. No include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace. No exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `include_prefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `include_prefixes`.
#
# The max size of `include_prefixes` is 1000.
&quot;A String&quot;,
],
},
&quot;gcsDataSink&quot;: { # In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s # A Cloud Storage data sink.
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
&quot;azureBlobStorageDataSource&quot;: { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
# An AzureBlobStorageData resource represents one Azure container. The storage
# account determines the [Azure
# endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
# In an AzureBlobStorageData resource, a blob&#x27;s name is the [Azure Blob
# Storage blob&#x27;s key
# name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
&quot;azureCredentials&quot;: { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
&quot;sasToken&quot;: &quot;A String&quot;, # Required. Azure shared access signature. (see
# [Grant limited access to Azure Storage resources using shared access
# signatures
# (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
},
&quot;storageAccount&quot;: &quot;A String&quot;, # Required. The name of the Azure Storage account.
&quot;container&quot;: &quot;A String&quot;, # Required. The container to transfer from the Azure Storage account.
},
&quot;httpDataSource&quot;: { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
# over HTTP. The information of the objects to be transferred is contained in
# a file referenced by a URL. The first line in the file must be
# `&quot;TsvHttpData-1.0&quot;`, which specifies the format of the file. Subsequent
# lines specify the information of the list of objects, one object per list
# entry. Each entry has the following tab-delimited fields:
#
# * **HTTP URL** — The location of the object.
#
# * **Length** — The size of the object in bytes.
#
# * **MD5** — The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from
# URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/&lt;URL-path&gt;` is
# transferred to a data sink, the name of the object at the data sink is
# `&lt;hostname&gt;/&lt;URL-path&gt;`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5
# hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Cloud Storage you can
# [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * ObjectConditions have no effect when filtering objects to transfer.
&quot;listUrl&quot;: &quot;A String&quot;, # Required. The URL that points to the file that stores the object list
# entries. This file must allow public access. Currently, only URLs with
# HTTP and HTTPS schemes are supported.
},
&quot;transferOptions&quot;: { # TransferOptions uses three boolean parameters to define the actions
# to be performed on objects in a transfer.
#
# If the option delete_objects_unique_in_sink is `true`, object conditions
# based on objects&#x27; &quot;last modification time&quot; are ignored and do not
# exclude objects in a data source or a data sink.
&quot;deleteObjectsFromSourceAfterTransfer&quot;: True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
#
# **Note:** This option and delete_objects_unique_in_sink are mutually
# exclusive.
&quot;overwriteObjectsAlreadyExistingInSink&quot;: True or False, # Whether overwriting objects that already exist in the sink is allowed.
&quot;deleteObjectsUniqueInSink&quot;: True or False, # Whether objects that exist only in the sink should be deleted.
#
# **Note:** This option and delete_objects_from_source_after_transfer are
# mutually exclusive.
},
&quot;gcsDataSource&quot;: { # In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s # A Cloud Storage data source.
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
},
&quot;name&quot;: &quot;A String&quot;, # A unique name (within the transfer project) assigned when the job is
# created. If this field is empty in a CreateTransferJobRequest, Storage
# Transfer Service will assign a unique name. Otherwise, the specified name
# is used as the unique name for this job.
#
# If the specified name is in use by a job, the creation request fails with
# an ALREADY_EXISTS error.
#
# This name must start with the `&quot;transferJobs/&quot;` prefix and end with a letter or
# a number, and should be no more than 128 characters.
# Example: `&quot;transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$&quot;`
#
# Invalid job names will fail with an
# INVALID_ARGUMENT error.
&quot;deletionTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was deleted.
&quot;notificationConfig&quot;: { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
# Notifications will be published to the customer-provided topic using the
# following `PubsubMessage.attributes`:
#
# * `&quot;eventType&quot;`: one of the EventType values
# * `&quot;payloadFormat&quot;`: one of the PayloadFormat values
# * `&quot;projectId&quot;`: the project_id of the
# `TransferOperation`
# * `&quot;transferJobName&quot;`: the
# transfer_job_name of the
# `TransferOperation`
# * `&quot;transferOperationName&quot;`: the name of the
# `TransferOperation`
#
# The `PubsubMessage.data` will contain a TransferOperation resource
# formatted according to the specified `PayloadFormat`.
&quot;eventTypes&quot;: [ # Event types for which a notification is desired. If empty, send
# notifications for all event types.
&quot;A String&quot;,
],
&quot;pubsubTopic&quot;: &quot;A String&quot;, # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
# notifications. Must be of the format: `projects/{project}/topics/{topic}`.
# Not matching this format will result in an
# INVALID_ARGUMENT error.
&quot;payloadFormat&quot;: &quot;A String&quot;, # Required. The desired format of the notification message payloads.
},
&quot;schedule&quot;: { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
&quot;scheduleEndDate&quot;: { # Represents a whole or partial calendar date, e.g. a birthday. The time of day
# and time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
#
# The last day a transfer runs. Date boundaries are determined relative to
# UTC time. A job will run once per 24 hours within the following guidelines:
#
# * If `schedule_end_date` and schedule_start_date are the same and in
# the future relative to UTC, the transfer is executed only one time.
# * If `schedule_end_date` is later than `schedule_start_date` and
# `schedule_end_date` is in the future relative to UTC, the job will
# run each day at start_time_of_day through `schedule_end_date`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
&quot;startTimeOfDay&quot;: { # Represents a time of day. The date and time zone are either not significant
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
#
# The time in UTC that a transfer job is scheduled to run. Transfers may
# start later than this time.
#
# If `start_time_of_day` is not specified:
#
# * One-time transfers run immediately.
# * Recurring transfers run immediately, and each day at midnight UTC,
# through schedule_end_date.
#
# If `start_time_of_day` is specified:
#
# * One-time transfers run at the specified time.
# * Recurring transfers run at the specified time each day, through
# `schedule_end_date`.
&quot;nanos&quot;: 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
&quot;seconds&quot;: 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
&quot;hours&quot;: 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value &quot;24:00:00&quot; for scenarios like business closing time.
&quot;minutes&quot;: 42, # Minutes of hour of day. Must be from 0 to 59.
},
&quot;scheduleStartDate&quot;: { # Represents a whole or partial calendar date, e.g. a birthday. The time of day
# and time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
#
# Required. The start date of a transfer. Date boundaries are determined
# relative to UTC time. If `schedule_start_date` and start_time_of_day
# are in the past relative to the job&#x27;s creation time, the transfer starts
# the day after you schedule the transfer request.
#
# **Note:** When starting jobs at or near midnight UTC it is possible that
# a job will start later than expected. For example, if you send an outbound
# request on June 1 one millisecond prior to midnight UTC and the Storage
# Transfer Service server receives the request on June 2, then it will create
# a TransferJob with `schedule_start_date` set to June 2 and a
# `start_time_of_day` set to midnight UTC. The first scheduled
# TransferOperation will take place on June 3 at midnight UTC.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
},
}
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
&quot;projectId&quot;: &quot;A String&quot;, # The ID of the Google Cloud Platform Project that owns the job.
&quot;status&quot;: &quot;A String&quot;, # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# **Note:** The effect of the new job status takes place during a subsequent
# job run. For example, if you change the job status from
# ENABLED to DISABLED, and an operation
# spawned by the transfer is running, the status change would not affect the
# current operation.
&quot;creationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was created.
&quot;lastModificationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was last modified.
&quot;description&quot;: &quot;A String&quot;, # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
&quot;transferSpec&quot;: { # Configuration for running a transfer. # Transfer specification.
&quot;awsS3DataSource&quot;: { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data resource, an object&#x27;s name is the S3 object&#x27;s key name.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. S3 Bucket name (see
# [Creating a
# bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
&quot;awsAccessKey&quot;: { # AWS access key (see
# [AWS Security
# Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
#
# Required. AWS access key used to sign the API requests to the AWS S3
# bucket. Permissions on the bucket must be granted to the access ID of the
# AWS access key.
&quot;accessKeyId&quot;: &quot;A String&quot;, # Required. AWS access key ID.
&quot;secretAccessKey&quot;: &quot;A String&quot;, # Required. AWS secret access key. This field is not returned in RPC
# responses.
},
},
&quot;objectConditions&quot;: { # Conditions that determine which objects will be transferred. Applies only
# to S3 and Cloud Storage objects.
#
# The &quot;last modification time&quot; refers to the time of the
# last change to the object&#x27;s content or metadata — specifically, this is
# the `updated` property of Cloud Storage objects and the `LastModified`
# field of S3 objects.
#
# Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects&#x27; &quot;last modification time&quot; do not exclude objects in a data sink.
&quot;excludePrefixes&quot;: [ # `exclude_prefixes` must follow the requirements described for
# include_prefixes.
#
# The max size of `exclude_prefixes` is 1000.
&quot;A String&quot;,
],
&quot;maxTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# `NOW` - `max_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob,
# `NOW` refers to the start_time of the
# `TransferOperation`.
&quot;lastModifiedSince&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# this timestamp and objects that don&#x27;t have a &quot;last modification time&quot; are
# transferred.
#
# The `last_modified_since` and `last_modified_before` fields can be used
# together for chunked data processing. For example, consider a script that
# processes each day&#x27;s worth of data at a time. For that you&#x27;d set each
# of the fields as follows:
#
# * `last_modified_since` to the start of the day
#
# * `last_modified_before` to the end of the day
&quot;lastModifiedBefore&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before this
# timestamp and objects that don&#x27;t have a &quot;last modification time&quot; will be
# transferred.
&quot;minTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before
# `NOW` - `min_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob, `NOW`
# refers to the start_time of the
# `TransferOperation`.
&quot;includePrefixes&quot;: [ # If `include_prefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `include_prefixes`
# and that do not start with any of the exclude_prefixes. If
# `include_prefixes` is not specified, all objects except those that have
# names starting with one of the `exclude_prefixes` must satisfy the object
# conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
# and must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace. No include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace. No exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `include_prefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `include_prefixes`.
#
# The max size of `include_prefixes` is 1000.
&quot;A String&quot;,
],
},
&quot;gcsDataSink&quot;: { # In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s # A Cloud Storage data sink.
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
&quot;azureBlobStorageDataSource&quot;: { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
# An AzureBlobStorageData resource represents one Azure container. The storage
# account determines the [Azure
# endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
# In an AzureBlobStorageData resource, a blob&#x27;s name is the [Azure Blob
# Storage blob&#x27;s key
# name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
&quot;azureCredentials&quot;: { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
&quot;sasToken&quot;: &quot;A String&quot;, # Required. Azure shared access signature. (see
# [Grant limited access to Azure Storage resources using shared access
# signatures
# (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
},
&quot;storageAccount&quot;: &quot;A String&quot;, # Required. The name of the Azure Storage account.
&quot;container&quot;: &quot;A String&quot;, # Required. The container to transfer from the Azure Storage account.
},
&quot;httpDataSource&quot;: { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
# over HTTP. The information of the objects to be transferred is contained in
# a file referenced by a URL. The first line in the file must be
# `&quot;TsvHttpData-1.0&quot;`, which specifies the format of the file. Subsequent
# lines specify the information of the list of objects, one object per list
# entry. Each entry has the following tab-delimited fields:
#
# * **HTTP URL** — The location of the object.
#
# * **Length** — The size of the object in bytes.
#
# * **MD5** — The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from
# URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/&lt;URL-path&gt;` is
# transferred to a data sink, the name of the object at the data sink is
# `&lt;hostname&gt;/&lt;URL-path&gt;`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5
# hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Cloud Storage you can
# [share an object publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * ObjectConditions have no effect when filtering objects to transfer.
&quot;listUrl&quot;: &quot;A String&quot;, # Required. The URL that points to the file that stores the object list
# entries. This file must allow public access. Currently, only URLs with
# HTTP and HTTPS schemes are supported.
},
&quot;transferOptions&quot;: { # TransferOptions uses three boolean parameters to define the actions
# to be performed on objects in a transfer.
#
# If the option delete_objects_unique_in_sink is `true`, object conditions
# based on objects&#x27; &quot;last modification time&quot; are ignored and do not
# exclude objects in a data source or a data sink.
&quot;deleteObjectsFromSourceAfterTransfer&quot;: True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
#
# **Note:** This option and delete_objects_unique_in_sink are mutually
# exclusive.
&quot;overwriteObjectsAlreadyExistingInSink&quot;: True or False, # Whether overwriting objects that already exist in the sink is allowed.
&quot;deleteObjectsUniqueInSink&quot;: True or False, # Whether objects that exist only in the sink should be deleted.
#
# **Note:** This option and delete_objects_from_source_after_transfer are
# mutually exclusive.
},
&quot;gcsDataSource&quot;: { # In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s # A Cloud Storage data source.
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
},
&quot;name&quot;: &quot;A String&quot;, # A unique name (within the transfer project) assigned when the job is
# created. If this field is empty in a CreateTransferJobRequest, Storage
# Transfer Service will assign a unique name. Otherwise, the specified name
# is used as the unique name for this job.
#
# If the specified name is in use by a job, the creation request fails with
# an ALREADY_EXISTS error.
#
# This name must start with `&quot;transferJobs/&quot;` prefix and end with a letter or
# a number, and should be no more than 128 characters.
# Example: `&quot;transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$&quot;`
#
# Invalid job names will fail with an
# INVALID_ARGUMENT error.
&quot;deletionTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was deleted.
&quot;notificationConfig&quot;: { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
# Notifications will be published to the customer-provided topic using the
# following `PubsubMessage.attributes`:
#
# * `&quot;eventType&quot;`: one of the EventType values
# * `&quot;payloadFormat&quot;`: one of the PayloadFormat values
# * `&quot;projectId&quot;`: the project_id of the
# `TransferOperation`
# * `&quot;transferJobName&quot;`: the
# transfer_job_name of the
# `TransferOperation`
# * `&quot;transferOperationName&quot;`: the name of the
# `TransferOperation`
#
# The `PubsubMessage.data` will contain a TransferOperation resource
# formatted according to the specified `PayloadFormat`.
&quot;eventTypes&quot;: [ # Event types for which a notification is desired. If empty, send
# notifications for all event types.
&quot;A String&quot;,
],
&quot;pubsubTopic&quot;: &quot;A String&quot;, # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
# notifications. Must be of the format: `projects/{project}/topics/{topic}`.
# Not matching this format will result in an
# INVALID_ARGUMENT error.
&quot;payloadFormat&quot;: &quot;A String&quot;, # Required. The desired format of the notification message payloads.
},
&quot;schedule&quot;: { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
&quot;scheduleEndDate&quot;: { # The last day a transfer runs. Date boundaries are determined relative to
# UTC time. A job will run once per 24 hours within the following guidelines:
#
# * If `schedule_end_date` and schedule_start_date are the same and in
# the future relative to UTC, the transfer is executed only one time.
# * If `schedule_end_date` is later than `schedule_start_date` and
# `schedule_end_date` is in the future relative to UTC, the job will
# run each day at start_time_of_day through `schedule_end_date`.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar.
# This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
&quot;startTimeOfDay&quot;: { # The time in UTC that a transfer job is scheduled to run. Transfers may
# start later than this time.
#
# If `start_time_of_day` is not specified:
#
# * One-time transfers run immediately.
# * Recurring transfers run immediately, and each day at midnight UTC,
# through schedule_end_date.
#
# If `start_time_of_day` is specified:
#
# * One-time transfers run at the specified time.
# * Recurring transfers run at the specified time each day, through
# `schedule_end_date`.
#
# Represents a time of day. The date and time zone are either not
# significant or are specified elsewhere. An API may choose to allow leap
# seconds. Related types are google.type.Date and `google.protobuf.Timestamp`.
&quot;nanos&quot;: 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
&quot;seconds&quot;: 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
&quot;hours&quot;: 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value &quot;24:00:00&quot; for scenarios like business closing time.
&quot;minutes&quot;: 42, # Minutes of hour of day. Must be from 0 to 59.
},
&quot;scheduleStartDate&quot;: { # Required. The start date of a transfer. Date boundaries are determined
# relative to UTC time. If `schedule_start_date` and start_time_of_day
# are in the past relative to the job&#x27;s creation time, the transfer starts
# the day after you schedule the transfer request.
#
# **Note:** When starting jobs at or near midnight UTC it is possible that
# a job will start later than expected. For example, if you send an outbound
# request on June 1 one millisecond prior to midnight UTC and the Storage
# Transfer Service server receives the request on June 2, then it will create
# a TransferJob with `schedule_start_date` set to June 2 and a
# `start_time_of_day` set to midnight UTC. The first scheduled
# TransferOperation will take place on June 3 at midnight UTC.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar.
# This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
},
}</pre>
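As a sketch only (the project and bucket names below are placeholders, and the client object is assumed to come from the google-api-python-client `build()` helper with application-default credentials), a one-time Cloud Storage to Cloud Storage job body for `create()` might look like:

```python
# Minimal TransferJob body for transferJobs().create(); all names are
# placeholders. "status" MUST be specified in CreateTransferJobRequests.
transfer_job = {
    "projectId": "my-project-id",
    "status": "ENABLED",
    "transferSpec": {
        "gcsDataSource": {"bucketName": "source-bucket"},
        "gcsDataSink": {"bucketName": "sink-bucket"},
    },
    "schedule": {
        # Equal start and end dates in the future => the job runs once.
        "scheduleStartDate": {"year": 2021, "month": 1, "day": 15},
        "scheduleEndDate": {"year": 2021, "month": 1, "day": 15},
    },
}

# from googleapiclient.discovery import build
# service = build("storagetransfer", "v1")
# job = service.transferJobs().create(body=transfer_job).execute()
# job["name"] is server-assigned when the request omits "name".
```

Leaving `name` out of the body lets Storage Transfer Service assign a unique `transferJobs/...` name, which avoids the ALREADY_EXISTS failure described above.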
</div>
<div class="method">
<code class="details" id="get">get(jobName, projectId=None, x__xgafv=None)</code>
<pre>Gets a transfer job.
Args:
jobName: string, Required. The job to get. (required)
projectId: string, Required. The ID of the Google Cloud Platform Console project that owns the
job.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
&quot;projectId&quot;: &quot;A String&quot;, # The ID of the Google Cloud Platform Project that owns the job.
&quot;status&quot;: &quot;A String&quot;, # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# **Note:** The effect of the new job status takes place during a subsequent
# job run. For example, if you change the job status from
# ENABLED to DISABLED, and an operation
# spawned by the transfer is running, the status change would not affect the
# current operation.
&quot;creationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was created.
&quot;lastModificationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was last modified.
&quot;description&quot;: &quot;A String&quot;, # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
&quot;transferSpec&quot;: { # Configuration for running a transfer. # Transfer specification.
&quot;awsS3DataSource&quot;: { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data resource, an object&#x27;s name is the S3 object&#x27;s key name.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. S3 Bucket name (see
# [Creating a
# bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
&quot;awsAccessKey&quot;: { # Required. AWS access key used to sign the API requests to the AWS S3
# bucket. Permissions on the bucket must be granted to the access ID of the
# AWS access key (see
# [AWS Security
# Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
&quot;accessKeyId&quot;: &quot;A String&quot;, # Required. AWS access key ID.
&quot;secretAccessKey&quot;: &quot;A String&quot;, # Required. AWS secret access key. This field is not returned in RPC
# responses.
},
},
&quot;objectConditions&quot;: { # Conditions that determine which objects will be transferred. Applies only
# to S3 and Cloud Storage objects. Only objects that satisfy these object
# conditions are included in the set of data source and data sink objects.
# Object conditions based on objects&#x27; &quot;last modification time&quot; do not
# exclude objects in a data sink.
#
# The &quot;last modification time&quot; refers to the time of the
# last change to the object&#x27;s content or metadata — specifically, this is
# the `updated` property of Cloud Storage objects and the `LastModified`
# field of S3 objects.
&quot;excludePrefixes&quot;: [ # `exclude_prefixes` must follow the requirements described for
# include_prefixes.
#
# The max size of `exclude_prefixes` is 1000.
&quot;A String&quot;,
],
&quot;maxTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# `NOW` - `max_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob,
# `NOW` refers to the start_time of the
# `TransferOperation`.
&quot;lastModifiedSince&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# this timestamp and objects that don&#x27;t have a &quot;last modification time&quot; are
# transferred.
#
# The `last_modified_since` and `last_modified_before` fields can be used
# together for chunked data processing. For example, consider a script that
# processes each day&#x27;s worth of data at a time. For that you&#x27;d set each
# of the fields as follows:
#
# * `last_modified_since` to the start of the day
#
# * `last_modified_before` to the end of the day
&quot;lastModifiedBefore&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before this
# timestamp and objects that don&#x27;t have a &quot;last modification time&quot; will be
# transferred.
&quot;minTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before
# `NOW` - `min_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob, `NOW`
# refers to the start_time of the
# `TransferOperation`.
&quot;includePrefixes&quot;: [ # If `include_prefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `include_prefixes`
# and that do not start with any of the exclude_prefixes. If
# `include_prefixes` is not specified, all objects except those that have
# names starting with one of the `exclude_prefixes` must satisfy the object
# conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
# and must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace. No include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace. No exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `include_prefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `include_prefixes`.
#
# The max size of `include_prefixes` is 1000.
&quot;A String&quot;,
],
},
&quot;gcsDataSink&quot;: { # A Cloud Storage data sink. In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
&quot;azureBlobStorageDataSource&quot;: { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
# An AzureBlobStorageData resource represents one Azure container. The storage
# account determines the [Azure
# endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
# In an AzureBlobStorageData resource, a blob&#x27;s name is the [Azure Blob
# Storage blob&#x27;s key
# name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
&quot;azureCredentials&quot;: { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
&quot;sasToken&quot;: &quot;A String&quot;, # Required. Azure shared access signature. (see
# [Grant limited access to Azure Storage resources using shared access
# signatures
# (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
},
&quot;storageAccount&quot;: &quot;A String&quot;, # Required. The name of the Azure Storage account.
&quot;container&quot;: &quot;A String&quot;, # Required. The container to transfer from the Azure Storage account.
},
&quot;httpDataSource&quot;: { # An HTTP URL data source. An HttpData resource specifies a list of objects
# on the web to be transferred over HTTP. The information of the objects to
# be transferred is contained in a file referenced by a URL. The first line in the file must be
# `&quot;TsvHttpData-1.0&quot;`, which specifies the format of the file. Subsequent
# lines specify the information of the list of objects, one object per list
# entry. Each entry has the following tab-delimited fields:
#
# * **HTTP URL** — The location of the object.
#
# * **Length** — The size of the object in bytes.
#
# * **MD5** — The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from
# URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/&lt;URL-path&gt;` is
# transferred to a data sink, the name of the object at the data sink is
# `&lt;hostname&gt;/&lt;URL-path&gt;`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5
# hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Cloud Storage you can
# [share an object publicly]
# (https://cloud.google.com/storage/docs/cloud-console#_sharingdata) and get
# a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * ObjectConditions have no effect when filtering objects to transfer.
&quot;listUrl&quot;: &quot;A String&quot;, # Required. The URL that points to the file that stores the object list
# entries. This file must allow public access. Currently, only URLs with
# HTTP and HTTPS schemes are supported.
},
&quot;transferOptions&quot;: { # TransferOptions uses three boolean parameters to define the actions to be
# performed on objects in a transfer. If the option
# delete_objects_unique_in_sink is `true`, object conditions based on
# objects&#x27; &quot;last modification time&quot; are ignored and do not exclude
# objects in a data source or a data sink.
&quot;deleteObjectsFromSourceAfterTransfer&quot;: True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
#
# **Note:** This option and delete_objects_unique_in_sink are mutually
# exclusive.
&quot;overwriteObjectsAlreadyExistingInSink&quot;: True or False, # Whether overwriting objects that already exist in the sink is allowed.
&quot;deleteObjectsUniqueInSink&quot;: True or False, # Whether objects that exist only in the sink should be deleted.
#
# **Note:** This option and delete_objects_from_source_after_transfer are
# mutually exclusive.
},
&quot;gcsDataSource&quot;: { # A Cloud Storage data source. In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
},
&quot;name&quot;: &quot;A String&quot;, # A unique name (within the transfer project) assigned when the job is
# created. If this field is empty in a CreateTransferJobRequest, Storage
# Transfer Service will assign a unique name. Otherwise, the specified name
# is used as the unique name for this job.
#
# If the specified name is in use by a job, the creation request fails with
# an ALREADY_EXISTS error.
#
# This name must start with `&quot;transferJobs/&quot;` prefix and end with a letter or
# a number, and should be no more than 128 characters.
# Example: `&quot;transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$&quot;`
#
# Invalid job names will fail with an
# INVALID_ARGUMENT error.
&quot;deletionTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was deleted.
&quot;notificationConfig&quot;: { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
# Notifications will be published to the customer-provided topic using the
# following `PubsubMessage.attributes`:
#
# * `&quot;eventType&quot;`: one of the EventType values
# * `&quot;payloadFormat&quot;`: one of the PayloadFormat values
# * `&quot;projectId&quot;`: the project_id of the
# `TransferOperation`
# * `&quot;transferJobName&quot;`: the
# transfer_job_name of the
# `TransferOperation`
# * `&quot;transferOperationName&quot;`: the name of the
# `TransferOperation`
#
# The `PubsubMessage.data` will contain a TransferOperation resource
# formatted according to the specified `PayloadFormat`.
&quot;eventTypes&quot;: [ # Event types for which a notification is desired. If empty, send
# notifications for all event types.
&quot;A String&quot;,
],
&quot;pubsubTopic&quot;: &quot;A String&quot;, # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
# notifications. Must be of the format: `projects/{project}/topics/{topic}`.
# Not matching this format will result in an
# INVALID_ARGUMENT error.
&quot;payloadFormat&quot;: &quot;A String&quot;, # Required. The desired format of the notification message payloads.
},
&quot;schedule&quot;: { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
&quot;scheduleEndDate&quot;: { # The last day a transfer runs. Date boundaries are determined relative to
# UTC time. A job will run once per 24 hours within the following guidelines:
#
# * If `schedule_end_date` and schedule_start_date are the same and in
# the future relative to UTC, the transfer is executed only one time.
# * If `schedule_end_date` is later than `schedule_start_date` and
# `schedule_end_date` is in the future relative to UTC, the job will
# run each day at start_time_of_day through `schedule_end_date`.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar.
# This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
&quot;startTimeOfDay&quot;: { # The time in UTC that a transfer job is scheduled to run. Transfers may
# start later than this time.
#
# If `start_time_of_day` is not specified:
#
# * One-time transfers run immediately.
# * Recurring transfers run immediately, and each day at midnight UTC,
# through schedule_end_date.
#
# If `start_time_of_day` is specified:
#
# * One-time transfers run at the specified time.
# * Recurring transfers run at the specified time each day, through
# `schedule_end_date`.
#
# Represents a time of day. The date and time zone are either not
# significant or are specified elsewhere. An API may choose to allow leap
# seconds. Related types are google.type.Date and `google.protobuf.Timestamp`.
&quot;nanos&quot;: 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
&quot;seconds&quot;: 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
&quot;hours&quot;: 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value &quot;24:00:00&quot; for scenarios like business closing time.
&quot;minutes&quot;: 42, # Minutes of hour of day. Must be from 0 to 59.
},
&quot;scheduleStartDate&quot;: { # Required. The start date of a transfer. Date boundaries are determined
# relative to UTC time. If `schedule_start_date` and start_time_of_day
# are in the past relative to the job&#x27;s creation time, the transfer starts
# the day after you schedule the transfer request.
#
# **Note:** When starting jobs at or near midnight UTC it is possible that
# a job will start later than expected. For example, if you send an outbound
# request on June 1 one millisecond prior to midnight UTC and the Storage
# Transfer Service server receives the request on June 2, then it will create
# a TransferJob with `schedule_start_date` set to June 2 and a
# `start_time_of_day` set to midnight UTC. The first scheduled
# TransferOperation will take place on June 3 at midnight UTC.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar.
# This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
},
}</pre>
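The schedule dates in a returned job use the google.type.Date convention above, where a zero year, month, or day marks a partial date. A small helper (the function name is our own, not part of the client library) can map a full date dict to a `datetime.date` and a partial one to `None`:

```python
import datetime

def schedule_date(d):
    """Convert a Date dict from a TransferJob schedule to datetime.date.

    Returns None for partial dates (any component equal to 0), since
    datetime.date cannot represent them. Helper name is illustrative only.
    """
    if 0 in (d.get("year", 0), d.get("month", 0), d.get("day", 0)):
        return None
    return datetime.date(d["year"], d["month"], d["day"])

full = schedule_date({"year": 2021, "month": 1, "day": 15})   # a real date
partial = schedule_date({"year": 0, "month": 1, "day": 15})   # None
```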
</div>
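The `list()` method below takes its `filter` argument as JSON text, not as a dict, so it is safest to build it with `json.dumps`. A sketch (the project ID and statuses are placeholders, and `service` is assumed to be a built storagetransfer client as in the earlier methods):

```python
import json

# Build the required JSON-text filter for transferJobs().list().
# "project_id" is required; "job_names" and "job_statuses" are optional
# and, being multi-valued, must use array notation.
list_filter = json.dumps({
    "project_id": "my_project_id",
    "job_statuses": ["ENABLED", "DISABLED"],
})

# result = service.transferJobs().list(filter=list_filter).execute()
# for job in result.get("transferJobs", []):
#     print(job["name"])
```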
<div class="method">
<code class="details" id="list">list(filter=None, pageToken=None, pageSize=None, x__xgafv=None)</code>
<pre>Lists transfer jobs.
Args:
filter: string, Required. A list of query parameters specified as JSON text in the form of:
{&quot;project_id&quot;:&quot;my_project_id&quot;,
&quot;job_names&quot;:[&quot;jobid1&quot;,&quot;jobid2&quot;,...],
&quot;job_statuses&quot;:[&quot;status1&quot;,&quot;status2&quot;,...]}.
Since `job_names` and `job_statuses` support multiple values, their values
must be specified with array notation. `project_id` is
required. `job_names` and `job_statuses` are optional. The valid values
for `job_statuses` are case-insensitive:
ENABLED,
DISABLED, and
DELETED.
pageToken: string, The list page token.
pageSize: integer, The list page size. The max allowed value is 256.
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # Response from ListTransferJobs.
&quot;transferJobs&quot;: [ # A list of transfer jobs.
{ # This resource represents the configuration of a transfer job that runs
# periodically.
&quot;projectId&quot;: &quot;A String&quot;, # The ID of the Google Cloud Platform Project that owns the job.
&quot;status&quot;: &quot;A String&quot;, # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# **Note:** The effect of the new job status takes place during a subsequent
# job run. For example, if you change the job status from
# ENABLED to DISABLED, and an operation
# spawned by the transfer is running, the status change would not affect the
# current operation.
&quot;creationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was created.
&quot;lastModificationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was last modified.
&quot;description&quot;: &quot;A String&quot;, # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
&quot;transferSpec&quot;: { # Configuration for running a transfer. # Transfer specification.
&quot;awsS3DataSource&quot;: { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data resource, an object&#x27;s name is the S3 object&#x27;s key name.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. S3 Bucket name (see
# [Creating a
# bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
&quot;awsAccessKey&quot;: { # Required. AWS access key used to sign the API requests to the AWS S3
# bucket. Permissions on the bucket must be granted to the access ID of the
# AWS access key (see
# [AWS Security
# Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
&quot;accessKeyId&quot;: &quot;A String&quot;, # Required. AWS access key ID.
&quot;secretAccessKey&quot;: &quot;A String&quot;, # Required. AWS secret access key. This field is not returned in RPC
# responses.
},
},
&quot;objectConditions&quot;: { # Conditions that determine which objects will be transferred. Applies only
# to S3 and Cloud Storage objects. Only objects that satisfy these object
# conditions are included in the set of data source and data sink objects.
# Object conditions based on objects&#x27; &quot;last modification time&quot; do not
# exclude objects in a data sink.
#
# The &quot;last modification time&quot; refers to the time of the
# last change to the object&#x27;s content or metadata — specifically, this is
# the `updated` property of Cloud Storage objects and the `LastModified`
# field of S3 objects.
&quot;excludePrefixes&quot;: [ # `exclude_prefixes` must follow the requirements described for
# include_prefixes.
#
# The max size of `exclude_prefixes` is 1000.
&quot;A String&quot;,
],
&quot;maxTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# `NOW` - `max_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob,
# `NOW` refers to the start_time of the
# `TransferOperation`.
&quot;lastModifiedSince&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# this timestamp and objects that don&#x27;t have a &quot;last modification time&quot; are
# transferred.
#
# The `last_modified_since` and `last_modified_before` fields can be used
# together for chunked data processing. For example, consider a script that
# processes each day&#x27;s worth of data at a time. For that you&#x27;d set each
# of the fields as follows:
#
# * `last_modified_since` to the start of the day
#
# * `last_modified_before` to the end of the day
&quot;lastModifiedBefore&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before this
# timestamp and objects that don&#x27;t have a &quot;last modification time&quot; will be
# transferred.
&quot;minTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before
# `NOW` - `min_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob, `NOW`
# refers to the start_time of the
# `TransferOperation`.
&quot;includePrefixes&quot;: [ # If `include_prefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `include_prefixes`
# and that do not start with any of the exclude_prefixes. If
# `include_prefixes` is not specified, all objects except those that have
# names starting with one of the `exclude_prefixes` must satisfy the object
# conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
# and must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace. No include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace. No exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `include_prefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `include_prefixes`.
#
# The max size of `include_prefixes` is 1000.
&quot;A String&quot;,
],
},
&quot;gcsDataSink&quot;: { # A Cloud Storage data sink.
#
# In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
&quot;azureBlobStorageDataSource&quot;: { # An Azure Blob Storage data source.
#
# An AzureBlobStorageData resource can be a data source, but not a data sink.
# An AzureBlobStorageData resource represents one Azure container. The storage
# account determines the [Azure
# endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
# In an AzureBlobStorageData resource, a blob&#x27;s name is the [Azure Blob
# Storage blob&#x27;s key
# name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
&quot;azureCredentials&quot;: { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
&quot;sasToken&quot;: &quot;A String&quot;, # Required. Azure shared access signature (see
# [Grant limited access to Azure Storage resources using shared access
# signatures
# (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
},
&quot;storageAccount&quot;: &quot;A String&quot;, # Required. The name of the Azure Storage account.
&quot;container&quot;: &quot;A String&quot;, # Required. The container to transfer from the Azure Storage account.
},
&quot;httpDataSource&quot;: { # An HTTP URL data source.
#
# An HttpData resource specifies a list of objects on the web to be
# transferred over HTTP. The information of the objects to be transferred is
# contained in a file referenced by a URL. The first line in the file must be
# `&quot;TsvHttpData-1.0&quot;`, which specifies the format of the file. Subsequent
# lines specify the information of the list of objects, one object per list
# entry. Each entry has the following tab-delimited fields:
#
# * **HTTP URL** — The location of the object.
#
# * **Length** — The size of the object in bytes.
#
# * **MD5** — The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from
# URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/&lt;URL-path&gt;` is
# transferred to a data sink, the name of the object at the data sink is
# `&lt;hostname&gt;/&lt;URL-path&gt;`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5
# hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5).
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Cloud Storage you can
# [share an object publicly]
# (https://cloud.google.com/storage/docs/cloud-console#_sharingdata) and get
# a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * ObjectConditions have no effect when filtering objects to transfer.
&quot;listUrl&quot;: &quot;A String&quot;, # Required. The URL that points to the file that stores the object list
# entries. This file must allow public access. Currently, only URLs with
# HTTP and HTTPS schemes are supported.
},
&quot;transferOptions&quot;: { # TransferOptions uses three boolean parameters to define the actions
# to be performed on objects in a transfer.
#
# If the option delete_objects_unique_in_sink is `true`, object conditions
# based on objects&#x27; &quot;last modification time&quot; are ignored and do not
# exclude objects in a data source or a data sink.
&quot;deleteObjectsFromSourceAfterTransfer&quot;: True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
#
# **Note:** This option and delete_objects_unique_in_sink are mutually
# exclusive.
&quot;overwriteObjectsAlreadyExistingInSink&quot;: True or False, # Whether overwriting objects that already exist in the sink is allowed.
&quot;deleteObjectsUniqueInSink&quot;: True or False, # Whether objects that exist only in the sink should be deleted.
#
# **Note:** This option and delete_objects_from_source_after_transfer are
# mutually exclusive.
},
&quot;gcsDataSource&quot;: { # A Cloud Storage data source.
#
# In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
},
&quot;name&quot;: &quot;A String&quot;, # A unique name (within the transfer project) assigned when the job is
# created. If this field is empty in a CreateTransferJobRequest, Storage
# Transfer Service will assign a unique name. Otherwise, the specified name
# is used as the unique name for this job.
#
# If the specified name is in use by a job, the creation request fails with
# an ALREADY_EXISTS error.
#
# This name must start with the `&quot;transferJobs/&quot;` prefix and end with a letter or
# a number, and should be no more than 128 characters.
# Example: `&quot;transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$&quot;`
#
# Invalid job names will fail with an
# INVALID_ARGUMENT error.
&quot;deletionTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was deleted.
&quot;notificationConfig&quot;: { # Notification configuration.
#
# Specification to configure notifications published to Cloud Pub/Sub.
# Notifications will be published to the customer-provided topic using the
# following `PubsubMessage.attributes`:
#
# * `&quot;eventType&quot;`: one of the EventType values
# * `&quot;payloadFormat&quot;`: one of the PayloadFormat values
# * `&quot;projectId&quot;`: the project_id of the
# `TransferOperation`
# * `&quot;transferJobName&quot;`: the
# transfer_job_name of the
# `TransferOperation`
# * `&quot;transferOperationName&quot;`: the name of the
# `TransferOperation`
#
# The `PubsubMessage.data` will contain a TransferOperation resource
# formatted according to the specified `PayloadFormat`.
&quot;eventTypes&quot;: [ # Event types for which a notification is desired. If empty, send
# notifications for all event types.
&quot;A String&quot;,
],
&quot;pubsubTopic&quot;: &quot;A String&quot;, # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
# notifications. Must be of the format: `projects/{project}/topics/{topic}`.
# Not matching this format will result in an
# INVALID_ARGUMENT error.
&quot;payloadFormat&quot;: &quot;A String&quot;, # Required. The desired format of the notification message payloads.
},
&quot;schedule&quot;: { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
&quot;scheduleEndDate&quot;: { # The last day a transfer runs. Date boundaries are determined relative to
# UTC time. A job will run once per 24 hours within the following guidelines:
#
# * If `schedule_end_date` and schedule_start_date are the same and in
# the future relative to UTC, the transfer is executed only one time.
# * If `schedule_end_date` is later than `schedule_start_date` and
# `schedule_end_date` is in the future relative to UTC, the job will
# run each day at start_time_of_day through `schedule_end_date`.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
&quot;startTimeOfDay&quot;: { # The time in UTC that a transfer job is scheduled to run. Transfers may
# start later than this time.
#
# If `start_time_of_day` is not specified:
#
# * One-time transfers run immediately.
# * Recurring transfers run immediately, and each day at midnight UTC,
# through schedule_end_date.
#
# If `start_time_of_day` is specified:
#
# * One-time transfers run at the specified time.
# * Recurring transfers run at the specified time each day, through
# `schedule_end_date`.
#
# Represents a time of day. The date and time zone are either not significant
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
&quot;nanos&quot;: 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
&quot;seconds&quot;: 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
&quot;hours&quot;: 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value &quot;24:00:00&quot; for scenarios like business closing time.
&quot;minutes&quot;: 42, # Minutes of hour of day. Must be from 0 to 59.
},
&quot;scheduleStartDate&quot;: { # Required. The start date of a transfer. Date boundaries are determined
# relative to UTC time. If `schedule_start_date` and start_time_of_day
# are in the past relative to the job&#x27;s creation time, the transfer starts
# the day after you schedule the transfer request.
#
# **Note:** When starting jobs at or near midnight UTC, it is possible that
# a job will start later than expected. For example, if you send an outbound
# request on June 1 one millisecond prior to midnight UTC and the Storage
# Transfer Service server receives the request on June 2, then it will create
# a TransferJob with `schedule_start_date` set to June 2 and a
# `start_time_of_day` set to midnight UTC. The first scheduled
# TransferOperation will take place on June 3 at midnight UTC.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
},
},
],
&quot;nextPageToken&quot;: &quot;A String&quot;, # The token for retrieving the next page of the list.
}</pre>
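A minimal Python sketch of calling this `list` method with the google-api-python-client library. The project ID `my-project` and the `ENABLED` status filter are placeholders, and building the service requires credentials to be configured; treat this as an illustrative sketch, not generated reference output.

```python
import json

# The `filter` argument must be a JSON-encoded string, not a dict.
list_filter = json.dumps({
    "project_id": "my-project",   # placeholder project ID
    "job_statuses": ["ENABLED"],  # optional status filter
})

def list_transfer_jobs(service, page_size=100):
    """Return the first page of transfer jobs matching `list_filter`."""
    request = service.transferJobs().list(filter=list_filter, pageSize=page_size)
    return request.execute().get("transferJobs", [])

# Usage (requires google-api-python-client and application-default credentials):
#   from googleapiclient.discovery import build
#   service = build("storagetransfer", "v1")
#   for job in list_transfer_jobs(service):
#       print(job["name"], job.get("status"))
```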
</div>
<div class="method">
<code class="details" id="list_next">list_next(previous_request, previous_response)</code>
<pre>Retrieves the next page of results.
Args:
previous_request: The request for the previous page. (required)
previous_response: The response from the request for the previous page. (required)
Returns:
A request object that you can call &#x27;execute()&#x27; on to request the next
page. Returns None if there are no more items in the collection.
</pre>
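The pagination contract above can be sketched as a generator that keeps calling `list_next` until it returns `None`. Service construction and credentials are assumed to be set up as for the `list` method; only the `list`, `list_next`, and `execute` calls described in this document are used.

```python
def all_transfer_jobs(service, list_filter):
    """Yield every transfer job across all result pages."""
    collection = service.transferJobs()
    request = collection.list(filter=list_filter)
    while request is not None:
        response = request.execute()
        for job in response.get("transferJobs", []):
            yield job
        # list_next returns None once there are no more pages.
        request = collection.list_next(request, response)
```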
</div>
<div class="method">
<code class="details" id="patch">patch(jobName, body=None, x__xgafv=None)</code>
<pre>Updates a transfer job. Updating a job&#x27;s transfer spec does not affect
transfer operations that are running already. Updating a job&#x27;s schedule
is not allowed.
**Note:** The job&#x27;s status field can be modified
using this RPC (for example, to set a job&#x27;s status to
DELETED,
DISABLED, or
ENABLED).
Args:
jobName: string, Required. The name of job to update. (required)
body: object, The request body.
The object takes the form of:
{ # Request passed to UpdateTransferJob.
&quot;projectId&quot;: &quot;A String&quot;, # Required. The ID of the Google Cloud Platform Console project that owns the
# job.
&quot;transferJob&quot;: { # Required. The job to update. `transferJob` is expected to specify only
# four fields: description, transfer_spec,
# notification_config, and status. An `UpdateTransferJobRequest` that
# specifies other fields will be rejected with the error
# INVALID_ARGUMENT.
#
# This resource represents the configuration of a transfer job that runs
# periodically.
&quot;projectId&quot;: &quot;A String&quot;, # The ID of the Google Cloud Platform Project that owns the job.
&quot;status&quot;: &quot;A String&quot;, # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# **Note:** The effect of the new job status takes place during a subsequent
# job run. For example, if you change the job status from
# ENABLED to DISABLED, and an operation
# spawned by the transfer is running, the status change would not affect the
# current operation.
&quot;creationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was created.
&quot;lastModificationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was last modified.
&quot;description&quot;: &quot;A String&quot;, # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
&quot;transferSpec&quot;: { # Configuration for running a transfer. # Transfer specification.
&quot;awsS3DataSource&quot;: { # An AWS S3 data source.
#
# An AwsS3Data resource can be a data source, but not a data sink.
# In an AwsS3Data resource, an object&#x27;s name is the S3 object&#x27;s key name.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. S3 Bucket name (see
# [Creating a
# bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
&quot;awsAccessKey&quot;: { # Required. AWS access key used to sign the API requests to the AWS S3
# bucket. Permissions on the bucket must be granted to the access ID of the
# AWS access key.
#
# AWS access key (see [AWS Security
# Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
&quot;accessKeyId&quot;: &quot;A String&quot;, # Required. AWS access key ID.
&quot;secretAccessKey&quot;: &quot;A String&quot;, # Required. AWS secret access key. This field is not returned in RPC
# responses.
},
},
&quot;objectConditions&quot;: { # Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects&#x27; &quot;last modification time&quot; do not exclude objects in a data sink.
#
# Conditions that determine which objects will be transferred. Applies only
# to S3 and Cloud Storage objects.
#
# The &quot;last modification time&quot; refers to the time of the
# last change to the object&#x27;s content or metadata — specifically, this is
# the `updated` property of Cloud Storage objects and the `LastModified`
# field of S3 objects.
&quot;excludePrefixes&quot;: [ # `exclude_prefixes` must follow the requirements described for
# include_prefixes.
#
# The max size of `exclude_prefixes` is 1000.
&quot;A String&quot;,
],
&quot;maxTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# `NOW` - `max_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob,
# `NOW` refers to the start_time of the
# `TransferOperation`.
&quot;lastModifiedSince&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# this timestamp and objects that don&#x27;t have a &quot;last modification time&quot; are
# transferred.
#
# The `last_modified_since` and `last_modified_before` fields can be used
# together for chunked data processing. For example, consider a script that
# processes each day&#x27;s worth of data at a time. For that you&#x27;d set each
# of the fields as follows:
#
# * `last_modified_since` to the start of the day
#
# * `last_modified_before` to the end of the day
&quot;lastModifiedBefore&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before this
# timestamp and objects that don&#x27;t have a &quot;last modification time&quot; will be
# transferred.
&quot;minTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before
# `NOW` - `min_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob, `NOW`
# refers to the start_time of the
# `TransferOperation`.
&quot;includePrefixes&quot;: [ # If `include_prefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `include_prefixes`
# and that do not start with any of the exclude_prefixes. If
# `include_prefixes` is not specified, all objects except those that have
# names starting with one of the `exclude_prefixes` must satisfy the object
# conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, to a max length of 1024 bytes when UTF-8 encoded,
# and must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace. No include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace. No exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `include_prefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `include_prefixes`.
#
# The max size of `include_prefixes` is 1000.
&quot;A String&quot;,
],
},
&quot;gcsDataSink&quot;: { # A Cloud Storage data sink.
#
# In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
&quot;azureBlobStorageDataSource&quot;: { # An Azure Blob Storage data source.
#
# An AzureBlobStorageData resource can be a data source, but not a data sink.
# An AzureBlobStorageData resource represents one Azure container. The storage
# account determines the [Azure
# endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
# In an AzureBlobStorageData resource, a blob&#x27;s name is the [Azure Blob
# Storage blob&#x27;s key
# name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
&quot;azureCredentials&quot;: { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
&quot;sasToken&quot;: &quot;A String&quot;, # Required. Azure shared access signature (see
# [Grant limited access to Azure Storage resources using shared access
# signatures
# (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
},
&quot;storageAccount&quot;: &quot;A String&quot;, # Required. The name of the Azure Storage account.
&quot;container&quot;: &quot;A String&quot;, # Required. The container to transfer from the Azure Storage account.
},
&quot;httpDataSource&quot;: { # An HTTP URL data source.
#
# An HttpData resource specifies a list of objects on the web to be
# transferred over HTTP. The information of the objects to be transferred is
# contained in a file referenced by a URL. The first line in the file must be
# `&quot;TsvHttpData-1.0&quot;`, which specifies the format of the file. Subsequent
# lines specify the information of the list of objects, one object per list
# entry. Each entry has the following tab-delimited fields:
#
# * **HTTP URL** — The location of the object.
#
# * **Length** — The size of the object in bytes.
#
# * **MD5** — The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from
# URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/&lt;URL-path&gt;` is
# transferred to a data sink, the name of the object at the data sink is
# `&lt;hostname&gt;/&lt;URL-path&gt;`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5
# hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5).
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Cloud Storage you can
# [share an object publicly]
# (https://cloud.google.com/storage/docs/cloud-console#_sharingdata) and get
# a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * ObjectConditions have no effect when filtering objects to transfer.
&quot;listUrl&quot;: &quot;A String&quot;, # Required. The URL that points to the file that stores the object list
# entries. This file must allow public access. Currently, only URLs with
# HTTP and HTTPS schemes are supported.
},
&quot;transferOptions&quot;: { # TransferOptions uses three boolean parameters to define the actions
# to be performed on objects in a transfer.
#
# If the option delete_objects_unique_in_sink is `true`, object conditions
# based on objects&#x27; &quot;last modification time&quot; are ignored and do not
# exclude objects in a data source or a data sink.
&quot;deleteObjectsFromSourceAfterTransfer&quot;: True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
#
# **Note:** This option and delete_objects_unique_in_sink are mutually
# exclusive.
&quot;overwriteObjectsAlreadyExistingInSink&quot;: True or False, # Whether overwriting objects that already exist in the sink is allowed.
&quot;deleteObjectsUniqueInSink&quot;: True or False, # Whether objects that exist only in the sink should be deleted.
#
# **Note:** This option and delete_objects_from_source_after_transfer are
# mutually exclusive.
},
&quot;gcsDataSource&quot;: { # A Cloud Storage data source.
#
# In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
},
&quot;name&quot;: &quot;A String&quot;, # A unique name (within the transfer project) assigned when the job is
# created. If this field is empty in a CreateTransferJobRequest, Storage
# Transfer Service will assign a unique name. Otherwise, the specified name
# is used as the unique name for this job.
#
# If the specified name is in use by a job, the creation request fails with
# an ALREADY_EXISTS error.
#
# This name must start with the `&quot;transferJobs/&quot;` prefix and end with a letter or
# a number, and should be no more than 128 characters.
# Example: `&quot;transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$&quot;`
#
# Invalid job names will fail with an
# INVALID_ARGUMENT error.
&quot;deletionTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was deleted.
&quot;notificationConfig&quot;: { # Notification configuration.
#
# Specification to configure notifications published to Cloud Pub/Sub.
# Notifications will be published to the customer-provided topic using the
# following `PubsubMessage.attributes`:
#
# * `&quot;eventType&quot;`: one of the EventType values
# * `&quot;payloadFormat&quot;`: one of the PayloadFormat values
# * `&quot;projectId&quot;`: the project_id of the
# `TransferOperation`
# * `&quot;transferJobName&quot;`: the
# transfer_job_name of the
# `TransferOperation`
# * `&quot;transferOperationName&quot;`: the name of the
# `TransferOperation`
#
# The `PubsubMessage.data` will contain a TransferOperation resource
# formatted according to the specified `PayloadFormat`.
&quot;eventTypes&quot;: [ # Event types for which a notification is desired. If empty, send
# notifications for all event types.
&quot;A String&quot;,
],
&quot;pubsubTopic&quot;: &quot;A String&quot;, # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
# notifications. Must be of the format: `projects/{project}/topics/{topic}`.
# Not matching this format will result in an
# INVALID_ARGUMENT error.
&quot;payloadFormat&quot;: &quot;A String&quot;, # Required. The desired format of the notification message payloads.
},
&quot;schedule&quot;: { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
&quot;scheduleEndDate&quot;: { # The last day a transfer runs. Date boundaries are determined relative to
# UTC time. A job will run once per 24 hours within the following guidelines:
#
# * If `schedule_end_date` and schedule_start_date are the same and in
# the future relative to UTC, the transfer is executed only one time.
# * If `schedule_end_date` is later than `schedule_start_date` and
# `schedule_end_date` is in the future relative to UTC, the job will
# run each day at start_time_of_day through `schedule_end_date`.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
&quot;startTimeOfDay&quot;: { # The time in UTC that a transfer job is scheduled to run. Transfers may
# start later than this time.
#
# If `start_time_of_day` is not specified:
#
# * One-time transfers run immediately.
# * Recurring transfers run immediately, and each day at midnight UTC,
# through schedule_end_date.
#
# If `start_time_of_day` is specified:
#
# * One-time transfers run at the specified time.
# * Recurring transfers run at the specified time each day, through
# `schedule_end_date`.
#
# Represents a time of day. The date and time zone are either not significant
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
&quot;nanos&quot;: 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
&quot;seconds&quot;: 42, # Seconds of minutes of the time. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap-seconds.
&quot;hours&quot;: 42, # Hours of day in 24 hour format. Should be from 0 to 23. An API may choose
# to allow the value &quot;24:00:00&quot; for scenarios like business closing time.
&quot;minutes&quot;: 42, # Minutes of hour of day. Must be from 0 to 59.
},
&quot;scheduleStartDate&quot;: { # Required. The start date of a transfer. Date boundaries are determined
# relative to UTC time. If `schedule_start_date` and start_time_of_day
# are in the past relative to the job&#x27;s creation time, the transfer starts
# the day after you schedule the transfer request.
#
# **Note:** When starting jobs at or near midnight UTC, it is possible that
# a job will start later than expected. For example, if you send an outbound
# request on June 1 one millisecond prior to midnight UTC and the Storage
# Transfer Service server receives the request on June 2, then it will create
# a TransferJob with `schedule_start_date` set to June 2 and a
# `start_time_of_day` set to midnight UTC. The first scheduled
# TransferOperation will take place on June 3 at midnight UTC.
#
# Represents a whole or partial calendar date, e.g. a birthday. The time of
# day and time zone are either specified elsewhere or are not significant.
# The date is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
},
},
&quot;updateTransferJobFieldMask&quot;: &quot;A String&quot;, # The field mask of the fields in `transferJob` that are to be updated in
# this request. Fields in `transferJob` that can be updated are:
# description,
# transfer_spec,
# notification_config, and
# status. To update the `transfer_spec` of the job, a
# complete transfer specification must be provided. An incomplete
# specification missing any required fields will be rejected with the error
# INVALID_ARGUMENT.
}
x__xgafv: string, V1 error format.
Allowed values
1 - v1 error format
2 - v2 error format
Returns:
An object of the form:
{ # This resource represents the configuration of a transfer job that runs
# periodically.
&quot;projectId&quot;: &quot;A String&quot;, # The ID of the Google Cloud Platform Project that owns the job.
&quot;status&quot;: &quot;A String&quot;, # Status of the job. This value MUST be specified for
# `CreateTransferJobRequests`.
#
# **Note:** The effect of the new job status takes place during a subsequent
# job run. For example, if you change the job status from
# ENABLED to DISABLED, and an operation
# spawned by the transfer is running, the status change would not affect the
# current operation.
&quot;creationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was created.
&quot;lastModificationTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was last modified.
&quot;description&quot;: &quot;A String&quot;, # A description provided by the user for the job. Its max length is 1024
# bytes when Unicode-encoded.
&quot;transferSpec&quot;: { # Configuration for running a transfer. # Transfer specification.
&quot;awsS3DataSource&quot;: { # An AwsS3Data resource can be a data source, but not a data sink. # An AWS S3 data source.
# In an AwsS3Data resource, an object&#x27;s name is the S3 object&#x27;s key name.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. S3 Bucket name (see
# [Creating a
# bucket](https://docs.aws.amazon.com/AmazonS3/latest/dev/create-bucket-get-location-example.html)).
&quot;awsAccessKey&quot;: { # AWS access key (see # Required. AWS access key used to sign the API requests to the AWS S3
# bucket. Permissions on the bucket must be granted to the access ID of the
# AWS access key.
# [AWS Security
# Credentials](https://docs.aws.amazon.com/general/latest/gr/aws-security-credentials.html)).
&quot;accessKeyId&quot;: &quot;A String&quot;, # Required. AWS access key ID.
&quot;secretAccessKey&quot;: &quot;A String&quot;, # Required. AWS secret access key. This field is not returned in RPC
# responses.
},
},
&quot;objectConditions&quot;: { # Conditions that determine which objects will be transferred. Applies only
# to S3 and Cloud Storage objects.
#
# The &quot;last modification time&quot; refers to the time of the
# last change to the object&#x27;s content or metadata — specifically, this is
# the `updated` property of Cloud Storage objects and the `LastModified`
# field of S3 objects.
#
# Only objects that satisfy these object conditions are included in the set
# of data source and data sink objects. Object conditions based on
# objects&#x27; &quot;last modification time&quot; do not exclude objects in a data sink.
&quot;excludePrefixes&quot;: [ # `exclude_prefixes` must follow the requirements described for
# include_prefixes.
#
# The max size of `exclude_prefixes` is 1000.
&quot;A String&quot;,
],
&quot;maxTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# `NOW` - `max_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob,
# `NOW` refers to the start_time of the
# `TransferOperation`.
&quot;lastModifiedSince&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; on or after
# this timestamp and objects that don&#x27;t have a &quot;last modification time&quot; are
# transferred.
#
# The `last_modified_since` and `last_modified_before` fields can be used
# together for chunked data processing. For example, consider a script that
# processes each day&#x27;s worth of data at a time. For that you&#x27;d set each
# of the fields as follows:
#
# * `last_modified_since` to the start of the day
#
# * `last_modified_before` to the end of the day
&quot;lastModifiedBefore&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before this
# timestamp and objects that don&#x27;t have a &quot;last modification time&quot; will be
# transferred.
&quot;minTimeElapsedSinceLastModification&quot;: &quot;A String&quot;, # If specified, only objects with a &quot;last modification time&quot; before
# `NOW` - `min_time_elapsed_since_last_modification` and objects that don&#x27;t
# have a &quot;last modification time&quot; are transferred.
#
# For each TransferOperation started by this TransferJob, `NOW`
# refers to the start_time of the
# `TransferOperation`.
&quot;includePrefixes&quot;: [ # If `include_prefixes` is specified, objects that satisfy the object
# conditions must have names that start with one of the `include_prefixes`
# and that do not start with any of the exclude_prefixes. If
# `include_prefixes` is not specified, all objects except those that have
# names starting with one of the `exclude_prefixes` must satisfy the object
# conditions.
#
# Requirements:
#
# * Each include-prefix and exclude-prefix can contain any sequence of
# Unicode characters, to a max length of 1024 bytes when UTF8-encoded,
# and must not contain Carriage Return or Line Feed characters. Wildcard
# matching and regular expression matching are not supported.
#
# * Each include-prefix and exclude-prefix must omit the leading slash.
# For example, to include the `requests.gz` object in a transfer from
# `s3://my-aws-bucket/logs/y=2015/requests.gz`, specify the include
# prefix as `logs/y=2015/requests.gz`.
#
# * None of the include-prefix or the exclude-prefix values can be empty,
# if specified.
#
# * Each include-prefix must include a distinct portion of the object
# namespace. No include-prefix may be a prefix of another
# include-prefix.
#
# * Each exclude-prefix must exclude a distinct portion of the object
# namespace. No exclude-prefix may be a prefix of another
# exclude-prefix.
#
# * If `include_prefixes` is specified, then each exclude-prefix must start
# with the value of a path explicitly included by `include_prefixes`.
#
# The max size of `include_prefixes` is 1000.
&quot;A String&quot;,
],
},
&quot;gcsDataSink&quot;: { # In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s # A Cloud Storage data sink.
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
&quot;azureBlobStorageDataSource&quot;: { # An AzureBlobStorageData resource can be a data source, but not a data sink. # An Azure Blob Storage data source.
# An AzureBlobStorageData resource represents one Azure container. The storage
# account determines the [Azure
# endpoint](https://docs.microsoft.com/en-us/azure/storage/common/storage-create-storage-account#storage-account-endpoints).
# In an AzureBlobStorageData resource, a blob&#x27;s name is the [Azure Blob
# Storage blob&#x27;s key
# name](https://docs.microsoft.com/en-us/rest/api/storageservices/naming-and-referencing-containers--blobs--and-metadata#blob-names).
&quot;azureCredentials&quot;: { # Azure credentials # Required. Credentials used to authenticate API requests to Azure.
&quot;sasToken&quot;: &quot;A String&quot;, # Required. Azure shared access signature. (see
# [Grant limited access to Azure Storage resources using shared access
# signatures
# (SAS)](https://docs.microsoft.com/en-us/azure/storage/common/storage-sas-overview)).
},
&quot;storageAccount&quot;: &quot;A String&quot;, # Required. The name of the Azure Storage account.
&quot;container&quot;: &quot;A String&quot;, # Required. The container to transfer from the Azure Storage account.
},
&quot;httpDataSource&quot;: { # An HttpData resource specifies a list of objects on the web to be transferred # An HTTP URL data source.
# over HTTP. The information of the objects to be transferred is contained in
# a file referenced by a URL. The first line in the file must be
# `&quot;TsvHttpData-1.0&quot;`, which specifies the format of the file. Subsequent
# lines specify the information of the list of objects, one object per list
# entry. Each entry has the following tab-delimited fields:
#
# * **HTTP URL** — The location of the object.
#
# * **Length** — The size of the object in bytes.
#
# * **MD5** — The base64-encoded MD5 hash of the object.
#
# For an example of a valid TSV file, see
# [Transferring data from
# URLs](https://cloud.google.com/storage-transfer/docs/create-url-list).
#
# When transferring data based on a URL list, keep the following in mind:
#
# * When an object located at `http(s)://hostname:port/&lt;URL-path&gt;` is
# transferred to a data sink, the name of the object at the data sink is
# `&lt;hostname&gt;/&lt;URL-path&gt;`.
#
# * If the specified size of an object does not match the actual size of the
# object fetched, the object will not be transferred.
#
# * If the specified MD5 does not match the MD5 computed from the transferred
# bytes, the object transfer will fail. For more information, see
# [Generating MD5
# hashes](https://cloud.google.com/storage-transfer/docs/create-url-list#md5)
#
# * Ensure that each URL you specify is publicly accessible. For
# example, in Cloud Storage you can
# [share an object
# publicly](https://cloud.google.com/storage/docs/cloud-console#_sharingdata)
# and get a link to it.
#
# * Storage Transfer Service obeys `robots.txt` rules and requires the source
# HTTP server to support `Range` requests and to return a `Content-Length`
# header in each response.
#
# * ObjectConditions have no effect when filtering objects to transfer.
&quot;listUrl&quot;: &quot;A String&quot;, # Required. The URL that points to the file that stores the object list
# entries. This file must allow public access. Currently, only URLs with
# HTTP and HTTPS schemes are supported.
},
&quot;transferOptions&quot;: { # TransferOptions uses three boolean parameters to define the actions
# to be performed on objects in a transfer.
#
# If the option
# delete_objects_unique_in_sink
# is `true`, object conditions based on objects&#x27; &quot;last modification time&quot; are
# ignored and do not exclude objects in a data source or a data sink.
&quot;deleteObjectsFromSourceAfterTransfer&quot;: True or False, # Whether objects should be deleted from the source after they are
# transferred to the sink.
#
# **Note:** This option and delete_objects_unique_in_sink are mutually
# exclusive.
&quot;overwriteObjectsAlreadyExistingInSink&quot;: True or False, # Whether overwriting objects that already exist in the sink is allowed.
&quot;deleteObjectsUniqueInSink&quot;: True or False, # Whether objects that exist only in the sink should be deleted.
#
# **Note:** This option and delete_objects_from_source_after_transfer are
# mutually exclusive.
},
&quot;gcsDataSource&quot;: { # In a GcsData resource, an object&#x27;s name is the Cloud Storage object&#x27;s # A Cloud Storage data source.
# name and its &quot;last modification time&quot; refers to the object&#x27;s `updated`
# property of Cloud Storage objects, which changes when the content or the
# metadata of the object is updated.
&quot;bucketName&quot;: &quot;A String&quot;, # Required. Cloud Storage bucket name (see
# [Bucket Name
# Requirements](https://cloud.google.com/storage/docs/naming#requirements)).
},
},
&quot;name&quot;: &quot;A String&quot;, # A unique name (within the transfer project) assigned when the job is
# created. If this field is empty in a CreateTransferJobRequest, Storage
# Transfer Service will assign a unique name. Otherwise, the specified name
# is used as the unique name for this job.
#
# If the specified name is in use by a job, the creation request fails with
# an ALREADY_EXISTS error.
#
# This name must start with `&quot;transferJobs/&quot;` prefix and end with a letter or
# a number, and should be no more than 128 characters.
# Example: `&quot;transferJobs/[A-Za-z0-9-._~]*[A-Za-z0-9]$&quot;`
#
# Invalid job names will fail with an
# INVALID_ARGUMENT error.
&quot;deletionTime&quot;: &quot;A String&quot;, # Output only. The time that the transfer job was deleted.
&quot;notificationConfig&quot;: { # Specification to configure notifications published to Cloud Pub/Sub. # Notification configuration.
# Notifications will be published to the customer-provided topic using the
# following `PubsubMessage.attributes`:
#
# * `&quot;eventType&quot;`: one of the EventType values
# * `&quot;payloadFormat&quot;`: one of the PayloadFormat values
# * `&quot;projectId&quot;`: the project_id of the
# `TransferOperation`
# * `&quot;transferJobName&quot;`: the
# transfer_job_name of the
# `TransferOperation`
# * `&quot;transferOperationName&quot;`: the name of the
# `TransferOperation`
#
# The `PubsubMessage.data` will contain a TransferOperation resource
# formatted according to the specified `PayloadFormat`.
&quot;eventTypes&quot;: [ # Event types for which a notification is desired. If empty, send
# notifications for all event types.
&quot;A String&quot;,
],
&quot;pubsubTopic&quot;: &quot;A String&quot;, # Required. The `Topic.name` of the Cloud Pub/Sub topic to which to publish
# notifications. Must be of the format: `projects/{project}/topics/{topic}`.
# Not matching this format will result in an
# INVALID_ARGUMENT error.
&quot;payloadFormat&quot;: &quot;A String&quot;, # Required. The desired format of the notification message payloads.
},
&quot;schedule&quot;: { # Transfers can be scheduled to recur or to run just once. # Schedule specification.
&quot;scheduleEndDate&quot;: { # Represents a whole or partial calendar date, e.g. a birthday. The time of day
# and time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
#
# The last day a transfer runs. Date boundaries are determined relative to
# UTC time. A job will run once per 24 hours within the following guidelines:
#
# * If `schedule_end_date` and schedule_start_date are the same and in
# the future relative to UTC, the transfer is executed only one time.
# * If `schedule_end_date` is later than `schedule_start_date` and
# `schedule_end_date` is in the future relative to UTC, the job will
# run each day at start_time_of_day through `schedule_end_date`.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
&quot;startTimeOfDay&quot;: { # Represents a time of day. The date and time zone are either not significant
# or are specified elsewhere. An API may choose to allow leap seconds. Related
# types are google.type.Date and `google.protobuf.Timestamp`.
#
# The time in UTC that a transfer job is scheduled to run. Transfers may
# start later than this time.
#
# If `start_time_of_day` is not specified:
#
# * One-time transfers run immediately.
# * Recurring transfers run immediately, and each day at midnight UTC,
# through schedule_end_date.
#
# If `start_time_of_day` is specified:
#
# * One-time transfers run at the specified time.
# * Recurring transfers run at the specified time each day, through
# `schedule_end_date`.
&quot;nanos&quot;: 42, # Fractions of seconds in nanoseconds. Must be from 0 to 999,999,999.
&quot;seconds&quot;: 42, # Seconds of the minute. Must normally be from 0 to 59. An API may
# allow the value 60 if it allows leap seconds.
&quot;hours&quot;: 42, # Hours of the day in 24-hour format. Should be from 0 to 23. An API may
# choose to allow the value 24 (&quot;24:00:00&quot;) for scenarios like business
# closing time.
&quot;minutes&quot;: 42, # Minutes of the hour. Must be from 0 to 59.
},
&quot;scheduleStartDate&quot;: { # Represents a whole or partial calendar date, e.g. a birthday. The time of day
# and time zone are either specified elsewhere or are not significant. The date
# is relative to the Proleptic Gregorian Calendar. This can represent:
#
# * A full date, with non-zero year, month and day values
# * A month and day value, with a zero year, e.g. an anniversary
# * A year on its own, with zero month and day values
# * A year and month value, with a zero day, e.g. a credit card expiration date
#
# Related types are google.type.TimeOfDay and `google.protobuf.Timestamp`.
#
# Required. The start date of a transfer. Date boundaries are determined
# relative to UTC time. If `schedule_start_date` and start_time_of_day
# are in the past relative to the job&#x27;s creation time, the transfer starts
# the day after you schedule the transfer request.
#
# **Note:** When starting jobs at or near midnight UTC it is possible that
# a job will start later than expected. For example, if you send an outbound
# request on June 1 one millisecond prior to midnight UTC and the Storage
# Transfer Service server receives the request on June 2, then it will create
# a TransferJob with `schedule_start_date` set to June 2 and a
# `start_time_of_day` set to midnight UTC. The first scheduled
# TransferOperation will take place on June 3 at midnight UTC.
&quot;year&quot;: 42, # Year of date. Must be from 1 to 9999, or 0 if specifying a date without
# a year.
&quot;day&quot;: 42, # Day of month. Must be from 1 to 31 and valid for the year and month, or 0
# if specifying a year by itself or a year and month where the day is not
# significant.
&quot;month&quot;: 42, # Month of year. Must be from 1 to 12, or 0 if specifying a year without a
# month and day.
},
},
}</pre>
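The `Date` and `TimeOfDay` messages in the schema above map naturally onto small Python helpers. The following is a minimal sketch of assembling a `create` request body for a daily S3-to-Cloud Storage transfer; the project ID, bucket names, and AWS credentials are placeholders, not values from this reference.

```python
import datetime


def to_api_date(d: datetime.date) -> dict:
    """Convert a Python date to the API's Date message (year/month/day)."""
    return {"year": d.year, "month": d.month, "day": d.day}


def to_api_time(t: datetime.time) -> dict:
    """Convert a Python time to the API's TimeOfDay message (interpreted as UTC)."""
    return {"hours": t.hour, "minutes": t.minute, "seconds": t.second, "nanos": 0}


def make_transfer_job_body(project_id, src_bucket, dst_bucket,
                           access_key_id, secret_access_key,
                           start, end, time_of_day):
    """Build a minimal S3 -> Cloud Storage TransferJob body for create()."""
    return {
        "projectId": project_id,
        "description": "Daily S3 to GCS sync",
        # status must be specified for CreateTransferJobRequests.
        "status": "ENABLED",
        "schedule": {
            "scheduleStartDate": to_api_date(start),
            "scheduleEndDate": to_api_date(end),
            "startTimeOfDay": to_api_time(time_of_day),
        },
        "transferSpec": {
            "awsS3DataSource": {
                "bucketName": src_bucket,
                "awsAccessKey": {
                    "accessKeyId": access_key_id,
                    "secretAccessKey": secret_access_key,
                },
            },
            "gcsDataSink": {"bucketName": dst_bucket},
        },
    }


# Placeholder values for illustration only.
body = make_transfer_job_body(
    "my-project", "my-aws-bucket", "my-gcs-bucket",
    "AKIA-EXAMPLE", "example-secret",
    start=datetime.date(2020, 6, 1), end=datetime.date(2020, 6, 30),
    time_of_day=datetime.time(2, 0, 0))

# With the client library (requires credentials and network access):
#   service = googleapiclient.discovery.build("storagetransfer", "v1")
#   service.transferJobs().create(body=body).execute()
```

Because `startTimeOfDay` is set, the job runs each day at 02:00 UTC through `scheduleEndDate`; omitting it would make the first run start immediately and recur at midnight UTC.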
</div>
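The `include_prefixes` requirements listed above lend themselves to a client-side pre-flight check. This sketch encodes those documented rules (no leading slash, no CR/LF, at most 1024 UTF-8 bytes per prefix, at most 1000 prefixes, no prefix covering another); the service performs its own authoritative validation, so this is illustrative only.

```python
def validate_prefixes(prefixes):
    """Pre-flight check of the documented include/exclude prefix rules.

    Raises ValueError on the first violated requirement; returns the
    list unchanged if all checks pass.
    """
    if len(prefixes) > 1000:
        raise ValueError("at most 1000 prefixes are allowed")
    for p in prefixes:
        if not p:
            raise ValueError("prefixes must not be empty")
        if p.startswith("/"):
            raise ValueError("prefixes must omit the leading slash: %r" % p)
        if "\r" in p or "\n" in p:
            raise ValueError("prefixes must not contain CR or LF: %r" % p)
        if len(p.encode("utf-8")) > 1024:
            raise ValueError("prefix exceeds 1024 bytes when UTF-8 encoded: %r" % p)
    # No prefix may be a prefix of another (each must cover a distinct
    # portion of the object namespace).
    for a in prefixes:
        for b in prefixes:
            if a != b and b.startswith(a):
                raise ValueError(
                    "no prefix may be a prefix of another: %r covers %r" % (a, b))
    return prefixes
```

For example, `validate_prefixes(["logs/y=2015/requests.gz", "images/"])` passes, while `["logs/", "logs/y=2015/"]` raises because the first entry covers the second.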
</body></html>