This repository has been archived by the owner on Oct 31, 2023. It is now read-only.

docs: Minor formatting (#195)
* docs: Minor formatting
chore: Update gapic-generator-python to v1.11.5
build: Update rules_python to 0.24.0

PiperOrigin-RevId: 563436317

Source-Link: googleapis/googleapis@42fd37b

Source-Link: https://github.com/googleapis/googleapis-gen/commit/280264ca02fb9316b4237a96d0af1a2343a81a56
Copy-Tag: eyJwIjoiLmdpdGh1Yi8uT3dsQm90LnlhbWwiLCJoIjoiMjgwMjY0Y2EwMmZiOTMxNmI0MjM3YTk2ZDBhZjFhMjM0M2E4MWE1NiJ9

* 🦉 Updates from OwlBot post-processor

See https://github.com/googleapis/repo-automation-bots/blob/main/packages/owl-bot/README.md

---------

Co-authored-by: Owl Bot <gcf-owl-bot[bot]@users.noreply.github.com>
gcf-owl-bot[bot] authored Sep 13, 2023
1 parent a0b9a11 commit 94b4f73
Showing 4 changed files with 34 additions and 32 deletions.
11 changes: 6 additions & 5 deletions google/cloud/dataflow_v1beta3/types/environment.py
@@ -56,12 +56,12 @@ class JobType(proto.Enum):
             unknown.
         JOB_TYPE_BATCH (1):
             A batch job with a well-defined end point:
-            data is read, data is
-            processed, data is written, and the job is done.
+            data is read, data is processed, data is
+            written, and the job is done.
         JOB_TYPE_STREAMING (2):
             A continuously streaming job with no end:
-            data is read,
-            processed, and written continuously.
+            data is read, processed, and written
+            continuously.
     """
     JOB_TYPE_UNKNOWN = 0
     JOB_TYPE_BATCH = 1
@@ -385,7 +385,6 @@ class Package(proto.Message):
         location (str):
             The resource to read the package from. The
             supported resource type is:
-
             Google Cloud Storage:

                storage.googleapis.com/{bucket}
@@ -428,6 +427,7 @@ class Disk(proto.Message):
             documentation for more information about
             determining the set of available disk types for
             a particular project and zone.
+
             Google Compute Engine Disk types are local to a
             particular project in a particular zone, and so
             the resource name will typically look something
@@ -458,6 +458,7 @@ class WorkerSettings(proto.Message):
     Attributes:
         base_url (str):
             The base URL for accessing Google Cloud APIs.
+
             When workers access Google Cloud APIs, they
             logically do so via relative URLs. If this
             field is specified, it supplies the base URL to
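The rewrapped docstrings above describe real message types exported by google.cloud.dataflow_v1beta3. As a minimal sketch (not part of this commit), here is how the Package message and JobType enum might be used; the package name and bucket path are hypothetical placeholders:

```python
from google.cloud import dataflow_v1beta3

# Sketch only: a Package staged in Cloud Storage, following the
# location format documented above. Bucket and file names are
# hypothetical placeholders.
package = dataflow_v1beta3.Package(
    name="pipeline-deps.tar.gz",
    location="storage.googleapis.com/my-bucket/pipeline-deps.tar.gz",
)

# JobType is the enum whose description is rewrapped in the first hunk.
print(dataflow_v1beta3.JobType.JOB_TYPE_STREAMING)
```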
1 change: 1 addition & 0 deletions google/cloud/dataflow_v1beta3/types/jobs.py
@@ -223,6 +223,7 @@ class Job(proto.Message):
     Attributes:
         id (str):
             The unique ID of this job.
+
             This field is set by the Cloud Dataflow service
             when the Job is created, and is immutable for
             the life of the job.
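Since Job.id is assigned by the service and immutable, a client can only read it back. A minimal sketch with the generated client, using hypothetical project, region, and job identifiers:

```python
from google.cloud import dataflow_v1beta3

client = dataflow_v1beta3.JobsV1Beta3Client()
# Hypothetical identifiers; replace with a real project, region, and job.
job = client.get_job(
    request=dataflow_v1beta3.GetJobRequest(
        project_id="my-project",
        location="us-central1",
        job_id="2023-09-13_01_23_45-1234567890123456789",
    )
)
# id is set by the Dataflow service at creation and is immutable thereafter.
print(job.id)
```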
50 changes: 23 additions & 27 deletions google/cloud/dataflow_v1beta3/types/messages.py
@@ -43,43 +43,39 @@ class JobMessageImportance(proto.Enum):
             unknown.
         JOB_MESSAGE_DEBUG (1):
             The message is at the 'debug' level:
-            typically only useful for
-            software engineers working on the code the job
-            is running. Typically, Dataflow pipeline runners
-            do not display log messages at this level by
-            default.
+            typically only useful for software engineers
+            working on the code the job is running.
+            Typically, Dataflow pipeline runners do not
+            display log messages at this level by default.
         JOB_MESSAGE_DETAILED (2):
             The message is at the 'detailed' level:
-            somewhat verbose, but
-            potentially useful to users. Typically,
-            Dataflow pipeline runners do not display log
-            messages at this level by default. These
-            messages are displayed by default in the
-            Dataflow monitoring UI.
+            somewhat verbose, but potentially useful to
+            users. Typically, Dataflow pipeline runners do
+            not display log messages at this level by
+            default. These messages are displayed by default
+            in the Dataflow monitoring UI.
         JOB_MESSAGE_BASIC (5):
             The message is at the 'basic' level: useful
-            for keeping
-            track of the execution of a Dataflow pipeline.
-            Typically, Dataflow pipeline runners display log
-            messages at this level by default, and these
-            messages are displayed by default in the
-            Dataflow monitoring UI.
+            for keeping track of the execution of a Dataflow
+            pipeline. Typically, Dataflow pipeline runners
+            display log messages at this level by default,
+            and these messages are displayed by default in
+            the Dataflow monitoring UI.
         JOB_MESSAGE_WARNING (3):
             The message is at the 'warning' level:
-            indicating a condition
-            pertaining to a job which may require human
-            intervention. Typically, Dataflow pipeline
-            runners display log messages at this level by
-            default, and these messages are displayed by
-            default in the Dataflow monitoring UI.
-        JOB_MESSAGE_ERROR (4):
-            The message is at the 'error' level:
-            indicating a condition
-            preventing a job from succeeding. Typically,
+            indicating a condition pertaining to a job which
+            may require human intervention. Typically,
             Dataflow pipeline runners display log messages
             at this level by default, and these messages are
             displayed by default in the Dataflow monitoring
             UI.
+        JOB_MESSAGE_ERROR (4):
+            The message is at the 'error' level:
+            indicating a condition preventing a job from
+            succeeding. Typically, Dataflow pipeline
+            runners display log messages at this level by
+            default, and these messages are displayed by
+            default in the Dataflow monitoring UI.
     """
     JOB_MESSAGE_IMPORTANCE_UNKNOWN = 0
     JOB_MESSAGE_DEBUG = 1
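JobMessageImportance is typically used as a filter when listing job messages. A minimal sketch, assuming hypothetical identifiers, that keeps only messages at the 'warning' level or above:

```python
from google.cloud import dataflow_v1beta3

client = dataflow_v1beta3.MessagesV1Beta3Client()
# Hypothetical identifiers. minimum_importance drops messages below the
# requested level, per the enum semantics documented above.
pager = client.list_job_messages(
    request=dataflow_v1beta3.ListJobMessagesRequest(
        project_id="my-project",
        location="us-central1",
        job_id="my-job-id",
        minimum_importance=dataflow_v1beta3.JobMessageImportance.JOB_MESSAGE_WARNING,
    )
)
for message in pager:
    print(message.message_importance, message.message_text)
```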
4 changes: 4 additions & 0 deletions google/cloud/dataflow_v1beta3/types/metrics.py
@@ -116,6 +116,7 @@ class MetricUpdate(proto.Message):
             "Mean", "Set", "And", "Or", and "Distribution".
             The specified aggregation kind is
             case-insensitive.
+
             If omitted, this is not an aggregated value but
             instead a single metric sample value.
         cumulative (bool):
@@ -342,6 +343,7 @@ class ProgressTimeseries(proto.Message):
             The current progress of the component, in the range [0,1].
         data_points (MutableSequence[google.cloud.dataflow_v1beta3.types.ProgressTimeseries.Point]):
             History of progress for the component.
+
             Points are sorted by time.
     """

@@ -388,6 +390,7 @@ class StageSummary(proto.Message):
             Start time of this stage.
         end_time (google.protobuf.timestamp_pb2.Timestamp):
             End time of this stage.
+
             If the work item is completed, this is the
             actual end time of the stage. Otherwise, it is
             the predicted end time.
@@ -537,6 +540,7 @@ class WorkItemDetails(proto.Message):
             Start time of this work item attempt.
         end_time (google.protobuf.timestamp_pb2.Timestamp):
             End time of this work item attempt.
+
             If the work item is completed, this is the
             actual end time of the work item. Otherwise, it
             is the predicted end time.
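The MetricUpdate fields documented above are returned by the GetJobMetrics call. A minimal sketch, again with hypothetical identifiers:

```python
from google.cloud import dataflow_v1beta3

client = dataflow_v1beta3.MetricsV1Beta3Client()
# Hypothetical identifiers; the response's `metrics` field is a sequence
# of the MetricUpdate messages documented above.
response = client.get_job_metrics(
    request=dataflow_v1beta3.GetJobMetricsRequest(
        project_id="my-project",
        location="us-central1",
        job_id="my-job-id",
    )
)
for update in response.metrics:
    # kind is empty for single samples; otherwise e.g. "Sum" or "Mean".
    print(update.name.name, update.kind)
```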
