DEPS: cloud_pubsub, recipe_engine/buildbucket, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def can_publish_event(self, request, response):
Return whether 'request' and 'response' can be published.
Based on whether both types are part of AnalysisServiceEvent. For example, can_publish_event(InstallPackagesRequest(), InstallPackagesResponse()) is true because AnalysisServiceEvent contains those fields, while can_publish_event(NewRequest(), NewResponse()) is not, because those fields have not been added to AnalysisServiceEvent.
Args:
    request (proto in AnalysisServiceEvent 'request' oneof): The request to log.
    response (proto in AnalysisServiceEvent 'response' oneof): The response to log.
Returns: bool
— def publish_event(self, request, response, request_time, response_time, step_data, step_output=None):
Publish request and response on Cloud Pub/Sub.
Wraps request and response in an AnalysisServiceEvent. 'can_publish_event' must be called first (and return true).
Does not check that request and response are corresponding types; e.g., it is possible to send an InstallPackagesRequest with a SysrootCreateResponse. It is up to the caller to avoid this.
Args:
    request (proto in AnalysisServiceEvent 'request' oneof): The request to log.
    response (proto in AnalysisServiceEvent 'response' oneof): The response to log.
    request_time (google.protobuf.timestamp_pb2.Timestamp): The time the request was sent by the caller.
    response_time (google.protobuf.timestamp_pb2.Timestamp): The time the response was received by the caller.
    step_data (recipe_engine.StepData): Data from the step that sent the request.
    step_output (str): Output for the step.
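The 'can_publish_event' check amounts to type membership against the event's oneof fields. A minimal Python sketch, assuming illustrative type sets (the real check inspects the AnalysisServiceEvent proto definition):

```python
# Hypothetical stand-ins for the types registered in the
# AnalysisServiceEvent 'request' and 'response' oneofs.
REQUEST_ONEOF_TYPES = {"InstallPackagesRequest", "SysrootCreateRequest"}
RESPONSE_ONEOF_TYPES = {"InstallPackagesResponse", "SysrootCreateResponse"}


class InstallPackagesRequest: pass
class InstallPackagesResponse: pass
class NewRequest: pass
class NewResponse: pass


def can_publish_event(request, response):
    """Return True only if both message types are part of the event proto."""
    return (type(request).__name__ in REQUEST_ONEOF_TYPES
            and type(response).__name__ in RESPONSE_ONEOF_TYPES)


print(can_publish_event(InstallPackagesRequest(), InstallPackagesResponse()))  # True
print(can_publish_event(NewRequest(), NewResponse()))  # False
```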
DEPS: cros_build_api, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def get_latest_build(self, android_package):
Retrieves the latest Android version for the given Android package.
Args: android_package (str): The Android package.
Returns: str: The latest Android version (build ID).
— def uprev(self, chroot, sysroot, android_package, android_version):
Uprev the given Android package to the given version.
Args:
    chroot (chromiumos.Chroot): Information on the chroot for the build.
    sysroot (Sysroot): The Sysroot being used.
    android_package (str): The Android package to uprev (e.g. android-vm-rvc).
    android_version (str): The Android version to uprev to (e.g. 7123456).
Returns: bool: Whether the Android package was uprevved.
— def uprev_if_unstable_ebuild_changed(self, chroot, sysroot, patch_sets):
Uprev Android if changes are found in the unstable ebuild.
Args:
    chroot (chromiumos.Chroot): Information on the chroot for the build.
    sysroot (Sysroot): The Sysroot being used.
    patch_sets (list[gerrit.PatchSet]): List of patch sets (with FileInfo).
— def write_lkgb(self, android_package, android_version):
Sets LKGB of given Android package to given version.
Args: android_package (str): The Android package to set LKGB for. android_version (str): The LKGB Android version.
Returns: List[str]: list of modified files.
DEPS: recipe_engine/buildbucket, recipe_engine/led, recipe_engine/properties, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to calculate the cost of running bots.
@property
— def bot_size(self):
@contextlib.contextmanager
— def build_cost_context(self):
Set build cost after running.
Returns: A context that sets build_cost on exit.
@contextlib.contextmanager
— def cq_run_cost_context(self):
Set cq cost after running.
Returns: A context that sets cq_run_cost on exit.
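Both context managers follow the same pattern: note a start time on entry and write the computed cost on exit. A hedged sketch of that pattern (the _CostTracker class, the dollars_per_hour rate, and the cost formula are illustrative, not the module's real accounting):

```python
import contextlib
import time


class _CostTracker:
    """Illustrative stand-in for the bot_cost module's state."""

    def __init__(self, dollars_per_hour=1.0):
        self.dollars_per_hour = dollars_per_hour
        self.build_cost = None

    @contextlib.contextmanager
    def build_cost_context(self):
        # On exit, compute elapsed wall time and set the cost property,
        # mirroring 'set_build_cost' being invoked automatically.
        start = time.monotonic()
        try:
            yield
        finally:
            elapsed_hours = (time.monotonic() - start) / 3600.0
            self.build_cost = elapsed_hours * self.dollars_per_hour


tracker = _CostTracker(dollars_per_hour=0.5)
with tracker.build_cost_context():
    pass  # ... run the build ...
print(tracker.build_cost is not None)  # True
```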
— def initialize(self):
— def set_build_cost(self):
Wrapper function to calculate and set the cost of creating the build.
Calculate the cost of creating the build and set it as a build output property.
— def set_cq_run_cost(self, child_builds=None):
Wrapper function to calculate and set the cost of the cq run.
Calculate the cost of the cq run and set it as a build output property.
Args: child_builds (list[build_pb2.Build]): The child builds for this cq run.
DEPS: cros_history, easy, gce_provider, swarming_cli, recipe_engine/buildbucket, recipe_engine/futures, recipe_engine/json, recipe_engine/random, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module that determines how to scale bot groups.
— def drop_cpu_cores(self, min_cpus_left=4, max_drop_ratio=0.75, test_rand=None):
Gather data on a build's per-core scaling efficiency.
Gathers data on per-build CPU efficiency by dropping cores on the instances. Sets a property 'enabled_cpu_cores' with the final count.
Warning: this embeds an assumption that we reboot between tasks, so these changes are effective for a single run only.
Args:
    min_cpus_left (int): Do not drop below this number of cores.
    max_drop_ratio (float): The maximum ratio of CPUs to drop relative to the overall count.
    test_rand (float): Use this instead of the uniform random distribution (for testing).
Returns: The number of CPUs dropped.
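The interplay of min_cpus_left and max_drop_ratio can be sketched as follows; cores_to_drop is a hypothetical helper (the real method also disables the cores and sets the 'enabled_cpu_cores' property):

```python
import random


def cores_to_drop(total_cpus, min_cpus_left=4, max_drop_ratio=0.75, test_rand=None):
    """Pick a number of cores to drop, honoring both limits."""
    # test_rand substitutes for the uniform random draw (for testing).
    ratio = test_rand if test_rand is not None else random.uniform(0, max_drop_ratio)
    ratio = min(ratio, max_drop_ratio)
    drop = int(total_cpus * ratio)
    # Never leave fewer than min_cpus_left cores enabled.
    drop = min(drop, max(total_cpus - min_cpus_left, 0))
    return drop


print(cores_to_drop(32, test_rand=0.75))  # 24
print(cores_to_drop(4, test_rand=0.75))   # 0 (already at min_cpus_left)
```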
— def get_bot_request(self, demand, scaling_restriction):
Core function that scales bots based on demand.
Args: demand (int): Current demand for bots. scaling_restriction (ScalingRestriction): Scaling restriction defined by the bot policy.
Returns: int, number of bots to request.
— def get_current_gce_config(self, bot_policy_config):
Retrieves the current configuration from GCE Provider service.
Args: bot_policy_config (BotPolicyCfg): Config-defined policy for the RoboCrop.
Returns: list(Config), GCE Provider config definitions.
— def get_gce_bots_configured(self, region_restrictions, config_map):
Sums the total number of configured bots per bot policy.
Args: region_restrictions (list[RegionRestriction]): Regional preferences from config. config_map (dict|Config): Map of GCE Config to prefix.
Returns: int, sum of the total number of bots in GCE Provider.
— def get_regional_actions(self, bots_requested, region_restrictions):
Determines regional distribution of bot requests.
This function uses a running total and residual to ensure we're accurate in computing totals to equal bots_requested.
Args: bots_requested(int): Total number of bots requested. region_restrictions(list[RegionRestriction]): Regional preferences from config.
Returns: list[RegionalAction], region wise distribution of bots requested.
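The running-total-plus-residual idea can be sketched with plain ratios; the (region, ratio) pair shape here is an assumption, as the real code reads RegionRestriction messages:

```python
def regional_actions(bots_requested, region_ratios):
    """Split bots_requested across regions proportionally.

    Carries a running residual so the integer allocations sum exactly to
    bots_requested. region_ratios is an ordered list of (region, ratio)
    pairs whose ratios sum to 1.0 (assumed shape).
    """
    actions, allocated, residual = [], 0, 0.0
    for region, ratio in region_ratios:
        exact = bots_requested * ratio + residual
        count = int(exact)           # integer part becomes this region's share
        residual = exact - count     # fractional part rolls into the next region
        actions.append((region, count))
        allocated += count
    # Any leftover unit from float rounding goes to the last region.
    if allocated < bots_requested:
        region, count = actions[-1]
        actions[-1] = (region, count + bots_requested - allocated)
    return actions


print(regional_actions(10, [("us-central1", 0.5), ("us-east1", 0.3), ("us-west1", 0.2)]))
```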
— def get_robocrop_action(self, bot_policy_config, configs, swarming_stats):
Function to compute all the actions of this RoboCrop.
Args:
    bot_policy_config (BotPolicyCfg): Config-defined policy for the RoboCrop.
    configs (Configs): List of GCE Config objects.
    swarming_stats (SwarmingStats): Named tuple containing current Swarming bot and task counts.
Returns: ScalingAction, comprehensive action to be taken by RoboCrop.
— def get_scaling_action(self, demand, bot_policy, configs):
The function that creates a ScalingAction for a bot group.
Args:
    demand (int): Current demand for bots.
    bot_policy (BotPolicy): Config-defined policy for a bot group.
    configs (Configs): List of GCE Config objects.
Returns: ScalingAction, comprehensive action to be taken by RoboCrop.
— def get_swarming_demand(self, swarming_stats, bot_group):
Return the demand for bots in a bot group.
Args: swarming_stats (SwarmingStats): Named tuple containing bot and task Swarming stats. bot_group (str): Name of the bot group.
Returns: int, the current demand for bots in the group.
— def get_swarming_stats(self, bot_policy_config):
Determines the current Swarming stats per bot group.
Args: bot_policy_config (BotPolicyCfg): Config-defined policy for the RoboCrop.
Returns: SwarmingStats: bot and task stats named tuple.
— def reduce_bot_policy_config_for_table(self, bot_policy_config):
Reduces bot_policy_config fields prior to sending to bb tables.
Args: bot_policy_config (BotPolicyCfg): Config-defined policy for the RoboCrop.
Returns: str, scaled-down config that only includes the data needed for plx.
— def unpack_policy_dimensions(self, dimensions):
Method to iterate through dimensions and return possible combinations.
Args: dimensions (list[dict]): BotPolicy swarming dimensions.
Returns: list, product of all swarming dimensions for querying.
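Iterating through the dimensions and producing every combination is a cartesian product. A sketch assuming dimensions arrive as a list of {key: [values...]} dicts (the exact shape of the BotPolicy swarming dimensions is an assumption):

```python
import itertools


def unpack_policy_dimensions(dimensions):
    """Expand list-valued dimensions into every concrete combination.

    Returns one flat {key: value} dict per combination, suitable for
    building individual Swarming queries.
    """
    keys, value_lists = [], []
    for dim in dimensions:
        for key, values in sorted(dim.items()):
            keys.append(key)
            # Tolerate scalar values by wrapping them in a list.
            value_lists.append(values if isinstance(values, list) else [values])
    return [dict(zip(keys, combo)) for combo in itertools.product(*value_lists)]


print(unpack_policy_dimensions([{"pool": ["chromeos"]}, {"label": ["a", "b"]}]))
# [{'pool': 'chromeos', 'label': 'a'}, {'pool': 'chromeos', 'label': 'b'}]
```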
— def update_bot_policy_limits(self, bot_policy_config, configs):
Sums the min and max bot numbers per bot policy.
Args: bot_policy_config (BotPolicyCfg): Config-defined policy for the RoboCrop. configs (Configs): List of GCE Config objects.
Returns: BotPolicy, updated to reflect ScalingRestriction values.
— def update_gce_configs(self, robocrop_actions, configs):
Updates each GCE Provider config that is actionable.
Args: robocrop_actions(list[ScalingAction]): Repeatable ScalingAction configs to update. configs(Configs): List of GCE Config objects.
Returns: list(Config), GCE Provider config definitions.
DEPS: easy, urls, depot_tools/gsutil, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def initialize(self):
— def symbolicate_dump(self, image_archive_path, test_results):
Converts minidumps generated by tests into text files.
For each entry in test_results, walks the directory, converts all found minidump files to text, and copies the text files back to the test_results location.
Args:
    image_archive_path (str): A Google Storage path to the image archive.
    test_results (list[DownloadedTestResult]): A list of DownloadedTestResult, which has both the GS path and the local path of the test result to process.
Returns: A list[Path] of symbolicated files written.
DEPS: bot_cost, cros_artifacts, cros_bisect, cros_build_api, cros_infra_config, cros_paygen, cros_prebuilts, cros_relevance, cros_sdk, cros_source, cros_tags, cros_version, easy, failures, metadata, metadata_json, src_state, sysroot_util, test_util, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API providing a menu for build steps
A module with steps used by image builders.
Image builders do not call other recipe modules directly: they always get there via this module, and are a simple sequence of steps.
— def add_child_build_ids_to_output_property(self):
Add child build ids to output property of current build.
@property
— def artifact_build(self):
— def bootstrap_sysroot(self, config=None):
Bootstrap the sysroot by installing the toolchain.
Args: config (BuilderConfig): The Builder Config for the build. If none, will attempt to get the BuilderConfig whose id.name matches the specified Buildbucket builder from HEAD.
— def build_and_test_images(self, config=None, include_version=False):
Build the image and run ebuild tests.
This behavior is adjusted by the run_spec values in config.
Args:
    config (BuilderConfig): The Builder Config for the build, or None.
    include_version (bool): Whether or not to pass the workspace version to sysroot_util.build.
Returns: (bool): Whether to continue with the build.
@property
— def build_target(self):
@property
— def chroot(self):
@property
— def config(self):
@property
— def config_or_default(self):
@contextlib.contextmanager
— def configure_builder(self, is_staging=None, missing_ok=False, disable_sdk=False, commit=None, targets=()):
Initial setup steps for the builder.
This context manager returns with all of the contexts that an image builder needs to have when it runs, for cleanup to happen properly.
Args:
    is_staging (bool): Whether this is a staging builder. Use this to override auto-detection. By default, anything in the 'staging' bucket is considered a staging builder.
    missing_ok (bool): Whether it is OK if no config is found.
    disable_sdk (bool): This builder will not be using the SDK at all. Only for branches with broken or no Build API.
    commit (GitilesCommit): The GitilesCommit for the build, or None.
    targets (list[build_target]): List of build_targets for metadata_json to use instead of our build_target.
Returns: BuilderConfig or None, with an active context.
@property
— def container_version(self):
Return the version string for containers.
Runs through the format string and replaces any allowed fields with their runtime values. If any unknown fields are encountered, a RuntimeError is raised.
— def create_containers(self, builder_config=None):
Call the BuildTestServiceContainers endpoint to build test containers.
The build API itself handles uploading generated container images to the container registry, but we handle collecting the metadata and uploading it with the build artifacts.
Args: builder_config (BuilderConfig): The BuilderConfig for this build, or None
Returns: None
@property
— def dep_graph(self):
@property
— def gerrit_changes(self):
— def get_cl_affected_sysroot_packages(self, packages=None, include_rev_deps=False):
Gets the list of sysroot packages affected by the input CLs.
Calculates the list of packages which were changed by the CLs and their reverse dependencies. The list is cached to avoid recalculating the list during subsequent calls.
Args:
    packages (list[PackageInfo]): The list of packages for which to get dependencies. If none are specified, the standard list of packages is used.
    include_rev_deps (bool): Whether to also calculate reverse dependencies.
Returns: (List[PackageInfo]): A list of packages affected by the CLs.
— def get_dep_graph(self, packages):
Fetch the dependency graph, and validate the SDK for reuse.
Args: packages (list[PackageInfo]): list of packages. Default is the list for this build_target.
Returns: The dependency graph from cros_relevance.get_dependency_graph.
@property
— def gitiles_commit(self):
— def initialize(self):
— def install_packages(self, config=None, packages=None, timeout_sec='DEFAULT', name=None, force_all_deps=False, include_rev_deps=False, dryrun=False):
Install packages as appropriate.
The config determines whether to call install packages. If installing packages, fetch Chrome source when needed.
Args:
    config (BuilderConfig): The Builder Config for the build.
    packages (list[PackageInfo]): List of packages to install. Default: all packages for the build_target.
    timeout_sec (int): Step timeout, in seconds, or None for default.
    name (string): Step name for install packages, or None for default.
    force_all_deps (bool): Whether to force building of all dependencies.
    include_rev_deps (bool): Whether to also install reverse dependencies. Ignored if config specifies ALL_DEPENDENCIES or force_all_deps is True.
    dryrun (bool): Dryrun the install packages step.
Returns: (bool): Whether to continue with the build.
— def is_cq_build_relevant(self, packages=None, include_rev_deps=False):
Determine whether the CQ build is relevant.
CQ builds are relevant when the changes affect any of the packages in the depgraph for the build target. They can also be forced relevant via an input property or CL footer.
Args:
    packages (list[PackageInfo]): The list of packages for which to get dependencies. If none are specified, the standard list of packages is used.
    include_rev_deps (bool): Whether to also calculate reverse dependencies.
@property
— def is_staging(self):
— def run_unittests(self, config=None):
Run ebuild tests as specified by config.
Args: config (BuilderConfig): The Builder Config for the build, or None.
— def setup_chroot(self, no_chroot_timeout=False, sdk_version=None):
Setup the chroot for the builder.
Args: no_chroot_timeout (bool): whether to allow unlimited time to create the chroot. sdk_version (string): Optional. Specific SDK version to include in the sdk CreateRequest, e.g. 2022.01.20.073008.
Returns: (bool): Whether the build is relevant.
— def setup_sysroot_and_determine_relevance(self, with_sysroot=True, packages=None):
Setup the sysroot for the builder and determine build relevance.
Args: with_sysroot (bool): Whether to create a sysroot. Default: True. (Some builders do not require a sysroot.) packages (list[PackageInfo]): Used to override the list of packages.
Returns: An object containing:
    pointless (bool): Whether the build is pointless.
    packages (list[PackageInfo]): The packages for this build, or an empty list.
@contextlib.contextmanager
— def setup_workspace(self, cherry_pick_changes=True):
Setup the workspace for the builder.
Args: cherry_pick_changes (bool): whether to apply gerrit changes on top of the checkout using cherry-pick. If set to False, will directly checkout the changes using the gerrit fetch refs.
@contextlib.contextmanager
— def setup_workspace_and_chroot(self, no_chroot_timeout=False, cherry_pick_changes=True):
Setup the workspace and chroot for the builder.
This context manager sets up the workspace path.
Args: no_chroot_timeout (bool): whether to allow unlimited time to create the chroot. cherry_pick_changes (bool): whether to apply gerrit changes on top of the checkout using cherry-pick. If set to False, will directly checkout the changes using the gerrit fetch refs.
Returns: (bool): Whether the build is relevant.
@property
— def sysroot(self):
@property
— def target_versions(self):
Get the current GetTargetVersionsResponse.
Only set after setup_sysroot_and_determine_relevance().
Returns: (GetTargetVersionsResponse): The GetTargetVersionsResponse, or None.
— def upload_artifacts(self, config=None, private_bundle_func=None, sysroot=None):
Upload artifacts from the build.
Args:
    config (BuilderConfig): The Builder Config for the build, or None.
    private_bundle_func (func): If a private bundling method is needed (such as when there is no Build API on the branch), this will be called instead of the internal bundling method.
    sysroot (Sysroot): Use this sysroot. Defaults to the primary Sysroot for the build.
Returns: (UploadedArtifacts) information about uploaded artifacts.
— def upload_prebuilts(self, config=None):
Upload prebuilts from the build.
Upload prebuilts if the configuration has uploadable prebuilts.
Args: config (BuilderConfig): The Builder Config for the build, or None.
DEPS: cq_looks, cros_history, cros_infra_config, cros_relevance, cros_tags, easy, git_footers, test_util, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to plan the builds to be launched.
— def get_build_plan(self, child_specs, enable_history, gerrit_changes, internal_snapshot, external_snapshot):
Return a three-tuple of builds: completed, existing, and needed.
This will be split into specialized functions for cq, release, others.
Args:
    child_specs (list[ChildSpec]): List of child specs of the child builders.
    enable_history (bool): Enables history lookup in the orchestrator.
    gerrit_changes (list[GerritChange]): List of patches in the order that they can be cherry-picked.
    internal_snapshot (GitilesCommit): gitiles_commit of the internal manifest to be supplied to child builds syncing to the internal manifest.
    external_snapshot (GitilesCommit): gitiles_commit of the public manifest to be supplied to child builds syncing to the external manifest.
Returns: A tuple of three lists: A list of Build objects of successful builds with refreshed criticality. A list of -snapshot builds we don't need to schedule and can join. A list of ScheduleBuildRequests that have to be scheduled.
— def get_completed_builds(self, child_specs, forced_rebuilds):
Get the list of previously passed child builds with criticality refreshed.
Args:
    child_specs (list[ChildSpec]): List of child specs of cq-orchestrator.
    forced_rebuilds (list[str]): List of builder names that cannot be reused.
Returns: A list of build_pb2.Build objects corresponding to the latest successful child builds with the same patches as the current cq orchestrator with refreshed critical values.
— def get_forced_rebuilds(self, gerrit_changes):
Gets a list of builders whose builds should not be reused.
Compiles a list of all builders whose builds should not be reused, as indicated by the Gerrit changes' commit messages. For multiple changes, the union of these lists is returned.
Args: gerrit_changes ([common_pb2.GerritChange]): Gerrit changes applied to this run.
Returns: forced_rebuilds (set[str]): A set of builder names, or 'all' if no builds can be reused.
— def get_slim_builder_name(self, builder_name):
Returns the name of the slim variant of the builder.
Args: builder_name (str): The name of the builder for which to get the slim builder variant name.
Returns: A string of the slim builder name.
— def prioritize_builds(self, builds):
Takes a list of builds and dedups, choosing a best build, dropping others.
See build_orderer for the sort order. This is most useful if you have multiple identical builds and want to choose a single one from each builder type to carry forward.
Args: builds ([build_pb2.Build]): Builds to dedupe and sort.
Returns: A list of build_pb2.Build objects, deduped and prioritized.
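The dedupe-and-keep-best behavior amounts to a keyed reduction. A sketch under assumed shapes, where (builder, build) pairs and a best-first sort key stand in for build_pb2.Build messages and the build_orderer ordering:

```python
def prioritize_builds(builds, sort_key):
    """Keep one 'best' build per builder, dropping the rest.

    builds is a list of (builder_name, build) pairs; sort_key orders
    candidates best-first (lower key wins). Both shapes are illustrative.
    """
    best = {}
    for builder, build in builds:
        # First build for a builder wins until a better-ranked one appears.
        if builder not in best or sort_key(build) < sort_key(best[builder]):
            best[builder] = build
    return [best[b] for b in sorted(best)]


print(prioritize_builds([("x", 5), ("x", 2), ("y", 9)], sort_key=lambda b: b))  # [2, 9]
```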
DEPS: build_menu, cloud_pubsub, cros_signing, cros_tags, recipe_engine/buildbucket, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Contains functions for building and sending build status to a pub/sub topic.
The messages for build reporting are defined in: infra/proto/src/chromiumos/build_report.proto
And are specifically designed to be aggregated as a build progresses to create the current status. This means we can focus on sending out just the status pieces that we need without worrying about maintaining the state of the entire message.
The pub/sub topic to send status to is configurable through the pubsub_project and pubsub_topic properties for the module. If not set, these default to chromeos-build-reporting and chromeos-builds-all, which is intended to be the unfiltered top-level topic for all builds.
API implementation for build reporting.
@staticmethod
— def add_version_msg(build_config, kind, value):
@property
— def build_type(self):
— def create_build_report(self):
Create BuildReport instance that can be .published().
Return: _MessageDelegate wrapping BuildReport instance
— def create_step_info(self, step_name, start_time=None, end_time=None, status=BuildReport.StepDetails.STATUS_RUNNING, raise_on_failed_publish=False):
Create a StepDetails instance to publish information for a step.
Args:
    step_name (StepDetails.StepName): The predefined step name.
    start_time (Datetime): UTC datetime indicating the step start time.
    end_time (Datetime): UTC datetime indicating the step end time.
    status (StepDetails.Status): Step status (default: STATUS_RUNNING).
    raise_on_failed_publish (bool): Should this publish fail, fail the whole build.
Return: _MessageDelegate wrapping StepDetails instance
@property
— def merged_build_report(self):
— def publish(self, build_report, raise_on_failed_publish=False):
Send a BuildReport to the pubsub topic.
Also aggregates the published BuildReport which is then available through the merged_build_report property.
Args: build_report (BuildReport): Instance to send to pub/sub. raise_on_failed_publish (bool): Should this publish fail, fail the whole build.
Return: Reference to BuildReport input message.
— def publish_build_artifact(self, artifact_type, gs_uri, sha256, created=None):
Publish and merge information about a created artifact.
Args:
    artifact_type (BuildArtifact.Type): Type of the artifact.
    gs_uri (str): GS bucket URI for the artifact (e.g. gs://foo/bar/baz.tgz).
    sha256 (str): SHA256 hash of the artifact.
    created (Datetime): Optional creation time (default: now).
Raises: ValueError if gs_uri isn't properly formatted with gs:// prefix.
Return: None
— def publish_build_target_and_model_metadata(self, branch, builder_metadata):
Publish and merge info about the build target and models of a build.
Args: branch (str): The branch name (e.g. release-R97-14324.B). builder_metadata (GetBuilderMetadataResponse): Builder metadata from the build-api.
— def publish_signed_build_metadata(self, signed_build_metadata_list):
Publish metadata about the signed build image(s).
Args: signed_build_metadata_list (list[dict]): List of signed build metadata.
— def publish_status(self, status):
Publish and merge build status.
— def publish_versions(self, gtv_response):
Publish and merge versions, sourced from a GetTargetVersionsResponse.
Args: gtv_response (GetTargetVersionsResponse): Response to a build api request.
Return: Nothing
@property
— def pubsub_project(self):
@property
— def pubsub_topic(self):
— def set_build_type(self, build_type):
Set the type for the build; it must be set exactly once.
@staticmethod
— def step_as_str(step_name):
Convert a BuildReport.StepDetails.StepName to a canonical string.
@contextlib.contextmanager
— def step_reporting(self, step_name, raise_on_failed_publish=False):
Create a context manager to automatically send out step status.
When created, initial step status is published with the current time and a status of STATUS_RUNNING.
When the context is exited, the step endtime is set and status is set to STATUS_SUCCESS by default.
A handle is returned from the context manager which can be used to set the return status to STATUS_FAILURE or STATUS_INFRA_FAILURE via the fail() and infra_fail() methods respectively.
If a StepFailure occurs, status is set to STATUS_FAILURE automatically, and similarly, InfraFailure sets status to STATUS_INFRA_FAILURE.
Args:
    step_name (StepDetails.StepName): The predefined step name.
    raise_on_failed_publish (bool): Should this publish fail, fail the whole build.
Returns: Handle which is used to set the step status.
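The lifecycle described above (STATUS_RUNNING on entry, a final status on exit, failure on exception) can be sketched with contextlib; the publish callback and status strings here are illustrative stand-ins for the module's pub/sub publishing:

```python
import contextlib


class _StepHandle:
    """Handle yielded by the context; lets the body override the final status."""

    def __init__(self):
        self.status = "STATUS_SUCCESS"

    def fail(self):
        self.status = "STATUS_FAILURE"

    def infra_fail(self):
        self.status = "STATUS_INFRA_FAILURE"


@contextlib.contextmanager
def step_reporting(step_name, publish):
    publish(step_name, "STATUS_RUNNING")   # initial status on entry
    handle = _StepHandle()
    try:
        yield handle
    except Exception:
        # A StepFailure-style exception marks the step failed automatically.
        handle.status = "STATUS_FAILURE"
        publish(step_name, handle.status)
        raise
    publish(step_name, handle.status)      # STATUS_SUCCESS unless overridden


events = []
with step_reporting("BUILD_IMAGES", lambda name, s: events.append((name, s))):
    pass
print(events)
```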
DEPS: recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to get statistics from buildbucket.
— def get_bot_demand(self, status_map):
Return the demand for bots in a bot group.
Args: status_map (str->int): Map of Buildbucket status to count.
Returns: int, the current demand for bots in the group.
— def get_bucket_status(self, bucket):
Return the number of builds in the bucket and their statuses.
Args: bucket (str): Buildbucket bucket.
Returns: Map (str->int) of status to number of builds with that status in the bucket.
— def get_build_count(self, bucket, status):
Return the number of builds in the bucket with a specific status.
Args: bucket (str): Buildbucket Bucket to search on. status (common_pb2.Status): The status of builds to search for.
Returns: The number of builds (int) in the given bucket with given status.
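Demand can be read off the status map returned by get_bucket_status. Which statuses count as demand is an assumption in this sketch; the real module reads common_pb2.Status values from Buildbucket:

```python
def get_bot_demand(status_map):
    """Treat builds that are waiting or running as demand for bots."""
    # SCHEDULED builds need a bot; STARTED builds are occupying one.
    return status_map.get("SCHEDULED", 0) + status_map.get("STARTED", 0)


print(get_bot_demand({"SCHEDULED": 4, "STARTED": 2, "SUCCESS": 10}))  # 6
```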
DEPS: build_menu, cros_build_api, cros_sdk, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to get builder metadata.
— def get_models(self):
Finds all model names associated with the active build_target.
Returns: List[str]: The names of all models used by this build target.
— def look_up_builder_metadata(self):
Looks up builder metadata for the provided build_target.
Builder metadata does not change within the lifecycle of a build, so builder metadata is looked up once and cached.
Returns: builder_metadata proto describing build and model for the current target.
DEPS: cros_build_api, cros_infra_config, cros_sdk, easy, portage, workspace_util, depot_tools/depot_tools, depot_tools/gclient, recipe_engine/cas, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def cache_sync(self, cache_path):
Sync Chrome cache using existing cached repositories.
Args: cache_path (Path): Path to mount of cache.
— def diffed_files_requires_rebuild(self, patch_sets=None):
Returns whether patch_sets includes files that require rebuilding.
The patch_sets object supplied must have been constructed with the file information populated.
Args: patch_sets (list[gerrit.PatchSet]): List of patch sets (with FileInfo).
Returns: A bool that indicates a rebuild should be triggered.
— def follower_lacks_prebuilt(self, build_target, chroot, packages):
Returns whether we need the Chrome source to be synced.
Returns whether or not this run needs the Chrome source synced locally. This is independent of whether we need to actually build Chrome, as 'follower' packages are allowed to be built out of Chrome's source.
Args:
    build_target (chromiumos.BuildTarget): Build target of the build.
    chroot (chromiumos.Chroot): Information on the chroot for the build.
    packages (list[chromiumos.PackageInfo]): Packages that the builder needs to build.
Returns: bool: Whether or not this run needs Chrome.
— def has_chrome_prebuilt(self, build_target, chroot, internal=False, ignore_prebuilts=False):
— def initialize(self):
Initialization that follows all module loading.
— def maybe_uprev_local_chrome(self, build_target, chroot, patch_sets):
Checks the patch_sets for chrome 9999 ebuild changes and uprevs if so.
Args:
    build_target (chromiumos.BuildTarget): Build target of the build.
    chroot (chromiumos.Chroot): Information on the chroot for the build.
    patch_sets (list[gerrit.PatchSet]): A list of patch sets to examine.
Returns: bool: Whether we uprevved the local Chrome.
— def needs_chrome(self, build_target, chroot, packages=None):
Returns whether or not this run needs chrome.
Returns whether or not this run needs chrome, that is, will require a prebuilt, or will need to build it from source.
Args:
    build_target (chromiumos.BuildTarget): Build target of the build.
    chroot (chromiumos.Chroot): Information on the chroot for the build.
    packages (list[chromiumos.PackageInfo]): Packages that the builder needs to build, or empty / None for default packages.
Returns: bool: Whether or not this run needs chrome.
— def needs_chrome_source(self, request, dep_graph, presentation, patch_sets=None):
Checks whether chrome source is needed.
Args:
    request (InstallPackagesRequest): InstallPackagesRequest for the build.
    dep_graph (DepGraph): From cros_relevance.get_dependency_graph.
    presentation (StepPresentation): Step to update.
    patch_sets (list[gerrit.PatchSet]): Applied patch sets. Default: the list from workspace_util.
Returns: bool: Whether Chrome source is needed.
— def sync(self, chrome_root, chroot, build_target, internal):
Sync Chrome source code.
Must be run with cwd inside a chromiumos source root.
Args:
    chrome_root (Path): Directory to sync the Chrome source code to.
    chroot (chromiumos.Chroot): Information on the chroot for the build.
    build_target (chromiumos.BuildTarget): Build target of the build.
    internal (bool): True for internal checkout.
DEPS: cros_infra_config, gcloud, gitiles, goma, repo, depot_tools/bot_update, depot_tools/gclient, depot_tools/git, depot_tools/gitiles, depot_tools/tryserver, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/legacy_annotation, recipe_engine/path, recipe_engine/properties, recipe_engine/runtime, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def build_packages(self, board, args=None, **kwargs):
Run the build_packages script inside the chroot.
Used by the internal goma recipe.
— def cbuildbot(self, name, config, args=None, **kwargs):
Runs the cbuildbot command defined by the arguments.
Args:
    name (str): The name of the command step.
    config (str): The name of the 'cbuildbot' configuration to invoke.
    args (list): If not None, additional arguments to pass to 'cbuildbot'.
Returns: (Step) The step that was run.
— def check_repository(self, repo_type_key, value):
Scans through registered repositories for a specified value.
Args:
    repo_type_key (str): The key in the 'repositories' config to scan through.
    value (str): The value to scan for.
Returns: (bool): True if the value was found.
— def checkout(self, manifest_url=None, repo_url=None, branch=None):
— def checkout_chromite(self):
Checks out the configured Chromite branch.
@property
— def chromite_branch(self):
@property
— def chromite_path(self):
— def configure(self, **KWARGS):
Loads configuration from build properties into this recipe config.
Args: KWARGS: Additional keyword arguments to forward to the configuration.
— def cros_sdk(self, name, cmd, args=None, environ=None, chroot_cmd=None, **kwargs):
Return a step to run a command inside the cros_sdk.
Used by the internal goma recipe.
@property
— def depot_tools_path(self):
@property
— def depot_tools_pin(self):
— def gclient_config(self):
Generate a 'gclient' configuration to check out Chromite.
Returns: (config) A 'gclient' recipe module configuration.
— def get_config_defaults(self):
— def run(self, goma_dir=None):
Runs the configured ‘cbuildbot’ build.
This workflow uses the registered configuration dictionary to make group- and builder-specific changes to the standard workflow.
The specific workflow paths that are taken are also influenced by several build properties.
TODO(dnj): When CrOS migrates away from BuildBot, replace property inferences with command-line parameters.
This workflow:
Args: goma_dir: Goma client path used for simplechrome. The Goma client for the ChromeOS chroot should be located in a sibling directory so that cbuildbot can find it automatically. Returns: (Step) the ‘cbuildbot’ execution step.
— def setup_board(self, board, args=None, **kwargs):
Run the setup_board script inside the chroot.
Used by the internal goma recipe.
— def with_system_python(self):
Prepare a directory with the system python binary available.
This is designed to make it possible to mask “bundled python” out of the standard path without hiding any other binaries.
Returns: (context manager) A context manager that inserts system python into the front of PATH.
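The PATH masking described above can be sketched as a small context manager. This is illustrative only: the real module locates the system python binary itself, so the directory argument here is a hypothetical stand-in.

```python
import contextlib
import os

@contextlib.contextmanager
def with_system_python(system_python_dir):
    # Temporarily put the directory holding the system python at the
    # front of PATH, then restore the original PATH on exit.
    # 'system_python_dir' is a hypothetical stand-in; the real module
    # finds the system python binary on its own.
    old_path = os.environ.get('PATH', '')
    os.environ['PATH'] = os.pathsep.join([system_python_dir, old_path])
    try:
        yield
    finally:
        os.environ['PATH'] = old_path
```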
DEPS: support, util, recipe_engine/context, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for using Cloud Pub/Sub
A module for Cloud Pub/Sub
@exponential_retry(retries=3, delay=datetime.timedelta(minutes=2))
— def publish_message(self, project_id, topic_id, data, ordering_key=None, endpoint=None, raise_on_failed_publish=True):
Publish a message to Cloud Pub/Sub
When specifying an ordering key to ensure message ordering, an explicit endpoint needs to be specified, and only messages going through the same endpoint are guaranteed to be ordered.
Args:
Raises: InfraFailure: If the publish fails and raise_on_failed_publish.
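The ‘exponential_retry’ decorator shown above can be sketched as follows. This is an illustrative reimplementation, not the recipe-engine helper itself: the real helper sleeps between attempts via the recipe runtime, which this sketch only notes in a comment.

```python
import datetime
import functools

def exponential_retry(retries, delay, condition=lambda e: True):
    # Retry the wrapped function up to 'retries' extra times, doubling
    # the delay after each failed attempt.  'condition' decides whether
    # a given exception is retryable.
    def decorator(fn):
        @functools.wraps(fn)
        def wrapper(*args, **kwargs):
            wait = delay
            for attempt in range(retries + 1):
                try:
                    return fn(*args, **kwargs)
                except Exception as e:
                    if attempt == retries or not condition(e):
                        raise
                    # The real helper sleeps for 'wait' here.
                    wait *= 2
        return wrapper
    return decorator
```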
DEPS: cros_infra_config, cros_source, gerrit, gitiles, depot_tools/gsutil, recipe_engine/archive, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/cq, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
This module contains apis to generate code coverage data.
@property
— def metadata_dir(self):
A temporary directory for the metadata.
Temp dir is created on first access to this property.
— def process_coverage_data(self, tarfile, coverage_type, step_name='upload code coverage data', incremental_settings=None, absolute_cs_settings=None, absolute_chromium_settings=None):
Uploads code coverage data to the requested external sources.
Args: tarfile (Path): path to tarfile. coverage_type (str): type of coverage being uploaded (LCOV, or LLVM). step_name (str): name for the step. incremental_settings (CoverageFileSettings): settings for uploading coverage to gerrit. absolute_cs_settings (CoverageFileSettings): settings for uploading coverage to code search. absolute_chromium_settings (CoverageFileSettings): settings for uploading coverage to chromium.
— def upload_code_coverage_llvm_json(self, tarfile, step_name='upload code coverage data (code coverage llvm json)'):
Uploads code coverage llvm json.
Args: tarfile (Path): path to tarfile. step_name (str): name for the step.
— def upload_firmware_lcov(self, tarfile, step_name='upload code coverage data (firmware lcov)'):
Uploads firmware lcov code coverage.
Args: tarfile (Path): path to tarfile. step_name (str): name for the step.
DEPS: recipe_engine/buildbucket, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to look for green CQ snapshots.
— def get_unfinished_or_failed_snapshot_ids(self, snapshot_ids):
Returns a set of unfinished or failed snapshot ids.
Args: snapshot_ids (set): The set of snapshot ids to be used in build plan.
Returns: A tuple of two sets: A set of snapshot ids referring to builds that are unfinished. A set of snapshot ids referring to builds that are failed.
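The partition described above can be sketched as below, assuming build statuses have already been fetched into a dict. In reality the module queries Buildbucket for the snapshot builds, and exactly which statuses count as unfinished versus failed is an assumption here.

```python
def partition_snapshot_ids(snapshot_ids, status_by_id):
    # 'status_by_id' maps snapshot id -> buildbucket-style status
    # string.  The status groupings below are assumptions for the
    # sketch, not the module's exact logic.
    unfinished = {s for s in snapshot_ids
                  if status_by_id.get(s) in ('SCHEDULED', 'STARTED')}
    failed = {s for s in snapshot_ids
              if status_by_id.get(s) in ('FAILURE', 'INFRA_FAILURE',
                                         'CANCELED')}
    return unfinished, failed
```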
DEPS: code_coverage, cros_build_api, cros_infra_config, cros_source, cros_version, disk_usage, easy, metadata, depot_tools/gsutil, recipe_engine/bcid_reporter, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/file, recipe_engine/futures, recipe_engine/led, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for uploading CrOS build artifacts to Google Storage.
A module for bundling and uploading build artifacts.
— def artifacts_gs_path(self, builder_name, target, kind=BuilderConfig.Id.TYPE_UNSPECIFIED, template=None):
Returns the GS path for artifacts of the given kind for the given target.
The resulting path will NOT include the GS bucket.
Args: builder_name (str): The builder name, e.g. octopus-cq. target (BuildTarget): The target whose artifacts will be uploaded. kind (BuilderConfig.Id.Type): The kind of artifacts being uploaded, e.g. POSTSUBMIT. May be used as a descriptor in formatting paths. Required if ‘{label}’ or ‘{kind}’ are present in |template|. template (str): The string to format, or None. If set to None, the default ‘{gs_path}’ will be used.
Returns: The formatted template. Default: The GS path at which artifacts should be uploaded.
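As a rough sketch of the template expansion: the ‘{gs_path}’, ‘{kind}’, and ‘{label}’ placeholders are filled in and the result returned. The concrete path layout behind ‘{gs_path}’ is derived from the builder config in reality; the layout below is purely an assumption for illustration.

```python
def artifacts_gs_path(builder_name, target_name, kind_name,
                      build_id, template=None):
    # Assumed layout for '{gs_path}'; the real module derives this
    # from the BuilderConfig and build identifiers.
    gs_path = '%s/%s/%s' % (builder_name, build_id, target_name)
    template = template or '{gs_path}'  # default per the docstring
    return template.format(gs_path=gs_path, kind=kind_name,
                           label=kind_name.lower())
```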
— def download_artifact(self, build_payload, artifact, name=None):
Download the given artifact from the given build payload.
Args: build_payload (BuildPayload): Describes where the artifact is on GS. artifact (ArtifactType): The artifact to download. name (string): step name. Defaults to ‘download |artifact_name|’.
Returns: list[Path]: Paths to the files downloaded from GS.
Raises: ValueError: If the artifact is not found in the build payload.
— def download_artifacts(self, build_payload, artifact_types, name=None):
Download the given artifacts from the given build payload.
Args: build_payload (BuildPayload): Describes where build artifacts are on GS. artifact_types (list[ArtifactTypes]): The artifact types to download. name (str): The step name. Defaults to ‘download artifacts’.
Returns: dict: Maps ArtifactType to list[Path] representing downloaded files.
Raises: ValueError: If any artifact is not found in the build payload.
@property
— def gs_upload_path(self):
Return the gs upload path, if one was set in properties.
— def has_output_artifacts(self, artifacts_info):
Return whether there are output artifacts.
Args: artifacts_info (ArtifactsByService): The artifacts config to check.
Returns: (bool) whether there are any output artifacts.
— def initialize(self):
— def merge_artifacts_properties(self, properties):
Combine uploaded artifacts to produce a final value.
Args: properties (list[UploadedArtifacts]): the values to merge.
— def prepare_for_build(self, chroot, sysroot, artifacts_info, forced_build_relevance=False, test_data=None, name=None):
Prepare the build for the given artifacts.
This function calls the Build API to have it prepare to build artifacts of the given types.
Args: chroot (Chroot): The chroot to use, or None if not yet created. sysroot (Sysroot): The sysroot to use, or None if not yet created. artifacts_info (ArtifactsByService): artifact information. forced_build_relevance (bool): Whether the builder will be ignoring the response. test_data (str): JSON data to use for ArtifactsService call. name (str): The step name. Defaults to ‘prepare artifacts’.
Returns: PrepareForToolchainBuildResponse.BuildRelevance indicating that the build is NEEDED (regardless of the pointless build check), UNKNOWN (pointless build check applies), or POINTLESS (just exit now.)
— def push_image(self, chroot, gs_image_dir, sysroot, dryrun=False, profile=None, sign_types=None, dest_bucket=None, channels=None):
Call the PushImage build API endpoint.
Args: chroot (Chroot): The chroot to use, or None if not yet created. gs_image_dir (string): The source directory (a gs path) to push from. sysroot (Sysroot): The sysroot (build target) to use. profile (Profile): The profile to use, or None. sign_types (list(ImageType)): The sign types to use, or None. dest_bucket (string): The destination bucket to use, or None. channels (list(Channel)): The channels to use, or empty list.
For more context on these parameters, see chromite/scripts/pushimage.py.
Returns: PushImageResponse
— def upload_artifacts(self, builder_name, kind, gs_bucket, _kwonly=(), artifacts_info=None, chroot=None, sysroot=None, name='upload artifacts', test_data=None, private_bundle_func=None, report_to_spike=False):
Bundle and upload the given artifacts for the given build target.
This function sets the “artifacts” output property to include the GS bucket, the path within that bucket, and a dict mapping artifact to a list of artifact paths (relative to the GS path) for each artifact type that was uploaded.
Args: builder_name (str): The builder name, e.g. octopus-cq. kind (BuilderConfig.Id.Type): The kind of artifacts being uploaded, e.g. POSTSUBMIT. This affects where the artifacts are placed in Google Storage. gs_bucket (str): Google storage bucket to upload artifacts to. artifacts_info (ArtifactsByService): Information about artifacts. chroot (Chroot): chroot to use sysroot (Sysroot): sysroot to use (this contains the build target.) name (str): The step name. Defaults to ‘upload artifacts’. test_data (str): Some data for this step to return when running under simulation. The string “@@DIR@@” is replaced with the output_dir path throughout. private_bundle_func (func): If a private bundling method is needed (such as when there is no Build API on the branch), this will be called instead of the internal bundling method. report_to_spike(bool): If True, will call bcid_reporter to report artifact information and trigger Spike to upload the provenance as [artifact-name].attestation if kind is RELEASE. Right now, this only reports on the base image.tar.xz.
Returns: (UploadedArtifacts) information about uploaded artifacts.
— def upload_metadata(self, name, builder_name, target, gs_bucket, filename, message):
Materialize a protobuffer message as a jsonpb artifact in GCS.
Convert the message to a jsonpb file and upload it to the appropriate location in GCS with the other build artifacts.
Args: name (str): Human readable metadata name for step name. builder_name (str): The builder name, e.g. octopus-cq. target (str): Build target, e.g. octopus-kernelnext. gs_bucket (str): Google storage bucket to upload artifacts to. filename (str): Filename for the metadata. message (Message): Protobuffer message to serialize and upload.
Returns: GS path inside bucket to uploaded file
DEPS: easy, failures, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for interacting with FindIt.
A module for interacting with FindIt.
— def get_packages(self):
Returns packages to build as specified by FindIt or empty list.
Returns the packages to build as specified by a FindIt invocation or an empty list if this run was not invoked as a bisection build.
Returns: list[PackageInfo]: list of packages to build as specified by FindIt
— def get_test_child_builders(self):
Returns the child builders as specified by FindIt or empty list.
Returns the child builders that need to run as specified by a FindIt invocation or an empty list if this run was not invoked as a bisection build.
Returns: list[str]: sorted list of child builders to run.
— def get_test_plan(self, builds):
Returns the test plan as specified by FindIt or None.
Returns the test plan that needs to run as specified by a FindIt invocation or None if this run was not invoked as a bisection build.
Args: builds (list[build_pb2.Build]): list of completed builds. While FindIt has returned the test plan to run, the TestUnitCommon.BuildPayloads in that plan need to be updated for this bisection invocation. The information for that update is retrieved from these builds.
Returns: GenerateTestPlanResponse or None.
— def set_bisect_builder(self, build_target_name):
Sets the BISECT_BUILDER output property for the build target recipe.
Sets the BISECT_BUILDER output property to the name of the builder FindIt should invoke if the build fails and bisection is required.
Args: build_target_name (str): build target name to set the bisect builder for.
— def set_orchestrator_bisect_builder(self):
Sets the BISECT_BUILDER output property for the orchestrator.
Sets the BISECT_BUILDER output property to the name of the builder FindIt should invoke if the postsubmit-orchestrator encounters hardware test failures.
DEPS: cros_infra_config, cros_version, depot_tools/depot_tools, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
API wrapping the cros branch tool.
A module for calling cros branch.
— def __call__(self, cmd, step_name=None, force=False, push=False, **kwargs):
Call cros branch with the given args.
Args: cmd: Command to be run with cros branch step_name (str): Message to use for step. Optional. force (bool): If True, cros branch will be run with --force. push (bool): If True, cros branch will be run with --push. kwargs: Keyword arguments for recipe_engine/step.
Returns: branch_name (string): The name of the created branch, or None.
— def create_from_buildspec(self, source_version, branch, **kwargs):
Call ‘cros branch create’, branching from the appropriate buildspec manifest.
Args: source_version (str): Version to branch from. Must have a valid manifest in manifest-versions/buildspecs or branch_util will fail. branch (chromiumos.Branch): Branch to be created. kwargs: Keyword arguments for recipe_engine/step. Accepts the same keyword arguments as call.
Returns: branch_name (string): The name of the created branch, or None.
— def create_from_file(self, manifest_file, branch, **kwargs):
Call ‘cros branch create’, branching from the file specified in manifest_file.
Args: manifest_file (recipe_engine.config_types.Path): Path to manifest file. This recipe assumes that it is at the top level of a ChromeOS checkout. branch (chromiumos.Branch): Branch to be created. kwargs: Keyword arguments for recipe_engine/step. Accepts the same keyword arguments as call.
Returns: branch_name (string): The name of the created branch, or None.
— def delete(self, branch, **kwargs):
Call ‘cros branch delete’ with the appropriate arguments.
Args: branch (chromiumos.Branch): Branch to be deleted. kwargs: Keyword arguments for cros branch/recipe_engine/step. Accepts the same keyword arguments as call.
— def initialize(self):
Initializes the module.
— def rename(self, branch, new_branch_name, **kwargs):
Call ‘cros branch rename’ with the appropriate arguments.
Args: branch (chromiumos.Branch): Branch to be renamed. new_branch_name (str): New branch name. kwargs: Keyword arguments for cros branch/recipe_engine/step. Accepts the same keyword arguments as call.
DEPS: analysis_service, src_state, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with the protobuf-based Build API.
This recipe module exposes client stubs for all build API services.
To add a service endpoint, create a class INSIDE THIS MODULE extending Stub. Make sure the class name is the same as the service name.
To call a service endpoint, call the corresponding method on the stub. It will “magically” know what to do and fail gracefully if it does not. Example:
# Inside recipes/my_recipe.py...
my_request_proto = BundleRequest()
# Set up your request proto, and then...
api.cros_build_api.ArtifactsService.BundleFirmware(my_request_proto)
The stub will perform some validation and then call the build API command.
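The “magic” dispatch can be sketched with ‘__getattr__’: unknown attribute lookups on a stub become endpoint calls routed through a generic call function. This is illustrative only; ‘call_fn’ stands in for the module's ‘__call__’, and the naming follows the ‘chromite.api.Service/Method’ convention shown above.

```python
class Stub(object):
    # Sketch of a dynamic service stub: any method name accessed on the
    # stub is turned into an endpoint string and forwarded to 'call_fn'.
    def __init__(self, service_name, call_fn):
        self._service = service_name
        self._call = call_fn

    def __getattr__(self, method):
        # Only reached for attributes not set in __init__, i.e. the
        # service's RPC method names.
        def invoke(input_proto, **kwargs):
            endpoint = '%s/%s' % (self._service, method)
            return self._call(endpoint, input_proto, **kwargs)
        return invoke
```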
— def GetVersion(self, test_data=None):
Get the Build API version.
The version is always queried, and the result cached.
Returns: CrosBuildApi.Version, the version of the Build API.
— def __call__(self, endpoint, input_proto, output_type, test_output_data=None, test_teelog_data=None, name=None, infra_step=False, timeout=None, response_lambda=None, pkg_logs_lambda=None, step_text=None):
Call the build API with the given input proto.
This function tries to be as dumb as possible. It does not validate that the endpoint exists, nor that the input_proto has the correct type. While clients may call this function directly, they should ALMOST ALWAYS call the build API through the appropriate stub.
Args: endpoint (str): The full endpoint to call, e.g. chromite.api.MyService/MyMethod input_proto (google.protobuf): The input proto object. output_type (google.protobuf.descriptor): The output proto type. test_output_data (str): JSON to use as a response during testing. test_teelog_data (str): Text to use as tee-log contents during testing. name (str): Name for the step. Generated automatically if not specified. infra_step (bool): Whether this build API call should be treated as an infrastructure step. timeout (int): timeout in seconds to be supplied to the BuildAPI call. response_lambda (fn(output_proto)->str): A function that appends a string to the build api response step. Used to make failure step names unique across differing root causes. pkg_logs_lambda (fn(failed_package_data, fn, chroot_path)->(str, str)): a function which takes information about a failed package and its log and produces the {cp} name of the package and the log's contents. step_text (str): text to put on the step for the call.
Returns: google.protobuf: The parsed response proto.
@staticmethod
— def failed_pkg_data_names(output_proto):
Function to append a list of failed packages to the failure step.
To use this, pass response_lambda=api.cros_build_api.failed_pkg_data_names to the build api call.
Args: output_proto (a BuildAPI response): A Response that has a ‘failed_package_data’ attribute.
Returns: A string to append to the response step name.
@staticmethod
— def failed_pkg_logs(input_proto, output_proto, read_raw_fn):
Function to cat log file and retrieve package name.
To use this, pass pkg_logs_lambda=api.cros_build_api.failed_pkg_logs to the build api call.
Args: input_proto (a BuildAPI request): A Request that contains a chromiumos.Chroot attribute called ‘chroot’. output_proto (a BuildAPI response): A Response that has a ‘failed_package_data’ attribute. read_raw_fn (the LUCI file module's read_raw function): Utility function for reading log file inside the chroot.
Returns: A list of tuples containing the package name and corresponding build log.
@staticmethod
— def failed_pkg_names(output_proto):
Function to append a list of failed packages to the failure step.
To use this, pass response_lambda=api.cros_build_api.failed_pkg_names to the build api call.
Args: output_proto (a BuildAPI response): A Response that has a ‘failed_packages’ attribute.
Returns: A string to append to the response step name.
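A minimal sketch of such a response_lambda is below, using plain dicts to stand in for the response proto; the ‘category’/‘package_name’ field names and the suffix format are assumptions for illustration.

```python
def failed_pkg_names(output_proto):
    # Build a step-name suffix from 'failed_packages'.  Plain dicts
    # stand in for the proto here; field names are assumed.
    names = sorted('%s/%s' % (p['category'], p['package_name'])
                   for p in output_proto.get('failed_packages', []))
    return ': %s' % ', '.join(names) if names else ''
```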
— def has_endpoint(self, stub, method):
Verifies that the given endpoint can be called.
Args: stub (Stub): stub instance to check if ‘method’ can be called on it. method (str): name of method to check for.
Returns: bool: Whether ‘method’ can be called on ‘stub’.
— def initialize(self):
Expose all client stubs defined in this module.
— def is_at_least_version(self, major=1, minor=0, bug=0):
Is the Build API at least |major|.|minor|.|bug|.
Args: major (int): the major version. minor (int): the minor level. bug (int): the bug level.
Returns: bool, whether the version is at least the required value.
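The check reduces to a lexicographic tuple comparison; a sketch, assuming the cached version has already been unpacked into a (major, minor, bug) tuple:

```python
def is_at_least_version(version, major=1, minor=0, bug=0):
    # 'version' is a (major, minor, bug) tuple; the real module reads
    # it from the cached GetVersion response.  Python compares tuples
    # element by element, which matches semantic version ordering here.
    return tuple(version) >= (major, minor, bug)
```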
@property
— def log_level(self):
Log level used when calling Build API
— def response_step_name(self, output_proto, response_lambda):
@property
— def version(self):
DEPS: easy, depot_tools/gsutil, recipe_engine/file, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with CrOS cache.
A module for CrOS-specific cache steps.
— def create_cache_dir(self, directory):
Creates a working directory outside of recipe structure.
Args: directory (Path): Full path to directory to create.
— def write_and_upload_version(self, gs_bucket, version_file, version):
Write local version file and uploads to Google Storage.
Args: gs_bucket (str): Target Google Storage bucket. version_file (str): Version file name. version (str): Version to write to tracking file.
DEPS: cros_source, easy, gerrit, git, repo, src_state, support, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for interacting with Cq-Depends.
A module for checking that Cq-Depend has been fulfilled.
— def ensure_manifest_cq_depends_fulfilled(self, manifest_diffs):
Checks that Cq-Depend deps between manifests are met.
Checks that all Cq-Depend in all CLs in the given manifest diffs are met.
Args: manifest_diffs (List[ManifestDiff]): An array of ManifestDiff namedtuples.
— def get_cq_depend(self, gerrit_changes, chunk_size=4):
Get Cq-Depend string for the given list of Gerrit changes.
Args: gerrit_changes (list[GerritChange]): The changes on which to depend. chunk_size (int): The number of CLs per ‘Cq-Depend:’ line.
Return: str: The full Cq-Depend string.
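The chunking can be sketched as follows, assuming the per-change references (e.g. ‘chromium:12345’) have already been formatted:

```python
def get_cq_depend(references, chunk_size=4):
    # Emit one 'Cq-Depend:' line per 'chunk_size' references, joined
    # with newlines, as described in the docstring above.
    lines = []
    for i in range(0, len(references), chunk_size):
        lines.append('Cq-Depend: %s' % ', '.join(references[i:i + chunk_size]))
    return '\n'.join(lines)
```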
— def get_cq_depend_reference(self, gerrit_change):
Return the Cq-Depend reference string for the given change.
Args: gerrit_change (GerritChange): The change of interest.
Returns: str: The reference string for the change, e.g. chromium:12345
— def get_mutual_cq_depend(self, gerrit_changes):
Mutually Cq-Depend all given Gerrit changes.
Args: gerrit_changes (list[GerritChange]): Changes to mutually CQ-depend.
Return: list[str]: Cq-Depend strings in same order as changes.
DEPS: easy, depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for DupIt script. See the design of this recipe in go/cros-dupit.
A module for the DupIt script.
— def configure(self, rsync_mirror_address, rsync_mirror_rate_limit, gs_distfiles_uri, ignore_missing_args=False, filter_missing_links=False):
Configure the DupIt script module.
Args:
@property
— def gs_distfiles_uri(self):
@property
— def rsync_mirror_address(self):
@property
— def rsync_mirror_rate_limit(self):
— def run(self):
@property
— def tmp_distfiles_path(self):
DEPS: cros_tags, easy, naming, recipe_engine/buildbucket, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to use build history to avoid redundant builds.
— def get_annealing_from_snapshot(self, snapshot_id):
Find the annealing build that created the snapshot with the given ID.
Args: snapshot_id (str): Manifest snapshot commit ID.
Returns: build_pb2.Build of the annealing build or None.
— def get_matching_builds(self, build, statuses=None, start_build_id=None, limit=None):
Get builds with the matching builder and gerrit_changes.
Args: build (build_pb2.Build): build to match for. statuses ([common_pb2.Status]): query for builds with these statuses. start_build_id (int): exclude builds older than this ID. limit (int): number of results to return. Latest first.
Returns: list[Build] which meet the conditions ordered from latest to oldest.
— def get_passed_builds(self, tags=None):
Retrieve passed builds with the same patches as current build.
Args: tags (list[common_pb2.StringPair]): Get builds with these tags.
Returns: list([build_pb2.Build]): Passed builds with the most recent build per builder.
— def get_passed_tests(self):
Find all tests that have passed with the given patches.
Returns: set[str]: Names of passed tests, if any.
— def get_snapshot_builds(self, snapshot, builder_list=None, statuses=None, patches=None):
Get builds run at the given snapshot, with additional optional filtering.
Args: snapshot (GitilesCommit): Snapshot to search on. builder_list (set[str]): List of builder names to filter by. If falsy, no name filtering is performed. statuses ([common_pb2.Status]): The statuses of snapshots to return. If falsy, no status filtering is performed. patches ([GerritChange]): Patches applied to snapshot to search on. If falsy, no patch filtering is performed.
Returns: list[Build] builds with the same snapshot and additional filtering.
— def get_test_failure_builders(self):
Get builders with the given patches that failed tests in the last run.
Returns: set[str]: Names of builders with HW or VM testing failures, if any.
— def get_upreved_pkgs(self, annealing_build):
Retrieve the packages upreved by the annealing build.
Args: annealing_build (build_pb2.Build): Annealing Build.
Returns: list(PackageCPV) of upreved packages.
— def is_retry(self):
Determine if this build is being retried.
Returns: Boolean indicating if it is a retry.
— def set_passed_tests(self, tests):
Record the tests that passed in the current run.
This exposes the tests to history, so future runs may know which tests have passed and which have not.
Args: tests (sequence[str]): (Unique) names of the tests that passed.
@property
— def start_time_in_seconds(self):
Generate start time in seconds.
DEPS: easy, gitiles, src_state, util, depot_tools/gitiles, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/properties, recipe_engine/step, recipe_engine/url
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module for accessing data in the chromeos/infra/config repo
go/robocrop-chrome-browser-proposal: This module is temporarily used to access the Chrome Browser infradata/config repo
— def build_target_dict(self, builds):
Take a list of builds and return a map of build_target names to builds.
This function will omit any builds that don't define input build targets.
Args: builds (list[Build]): builds to extract build_target.name set from.
Returns: a dict(str, Build) of build_target names.
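A sketch of the mapping, using nested dicts to stand in for Build protos; the input-property path to the build target name is taken from the docstring, but the dict shape is an assumption:

```python
def build_target_dict(builds):
    # Map build_target name -> build, skipping builds whose input
    # properties define no build_target (as the docstring notes).
    result = {}
    for build in builds:
        name = build.get('input', {}).get('properties', {}).get(
            'build_target', {}).get('name')
        if name:
            result[name] = build
    return result
```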
@property
— def config(self):
Return the config for this builder.
This convenience property wraps cros_infra_config.get_builder_config, which caches the data.
Returns: BuilderConfig for this builder.
@property
— def config_or_default(self):
Config or default config.
The default config is empty, except for:
— def configure_builder(self, commit=None, changes=None, is_staging=None, name=‘configure builder’, choose_branch=True, config_ref=None):
Configure the builder.
Fetch the builder config. Determine the actual commit and changes to use. Set the bisect_builder and use_flags.
Args: commit (GitilesCommit): The gitiles commit to use. Default: GitilesCommit(.... ref=‘refs/heads/snapshot’). changes (list[GerritChange]): The gerrit changes to apply. Default: the gerrit_changes from buildbucket. is_staging (bool): Whether the builder is staging, or None to have configure_builder determine, based on buildbucket bucket and/or config.general.environment. name (string): Step name. Default: “configure builder”. config_ref (string): Override properties.config_ref (for config CLs).
Returns: BuilderConfig or None
@property
— def current_builder_group(self):
Get the builder group for the currently running builder.
— def determine_if_staging(self, is_staging=None, config=None):
Configure the builder’s knowledge of whether it’s running in staging.
@exponential_retry(retries=3, condition=(lambda e: getattr(e, 'had_timeout', False)))
— def download_binproto(self, filename, step_test_data, timeout=None, application=‘ChromeOS’, message=None):
Helper method to fetch a file from gitiles.
@property
— def experiments(self):
Return the list of experiments active for this build.
@property
— def experiments_for_child_build(self):
Return value for bb schedule_request experiments arg.
— def force_reload(self):
Force a reload of the config map from ToT.
@property
— def fresh_config(self):
Return a freshly loaded config for this builder.
Returns: BuilderConfig for this builder, freshly reloaded.
@property
— def gerrit_changes(self):
— def get_bot_policy_config(self, application=‘ChromeOS’):
Get BotPolicies as defined in infra/config. If application is Chrome, BotPolicies will be fetched from infradata/config.
Returns: BotPolicyCfg as defined in the config repo.
— def get_build_target(self, build=None):
Return the build target from input properties.
Args: build (Build): A buildbucket build, which is expected to have a ‘build_target’ input property, or None for the current build.
Returns: (BuildTarget) The build target, or None.
— def get_build_target_name(self, build=None):
Return the build target name from input properties.
Args: build (Build): A buildbucket build, which is expected to have a ‘build_target’ input property, or None for the current build.
Returns: (str) The name of the build target, or None.
— def get_builder_config(self, builder_name, missing_ok=False):
Gets the BuilderConfig for the specified builder from HEAD.
Finds the BuilderConfig whose id.name matches the specified Buildbucket builder.
This function loads the checked in proto and forms a map from id.name to BuilderConfig on the first call. Subsequent calls just look up in the map, so will be much faster than the first call. This is meant for the case when many lookups are needed, e.g. a parent builder looks up all child configs.
Args:
Returns: A BuilderConfig proto.
Raises: A LookupError if a BuilderConfig is not found for the specified builder.
— def get_dut_tracking_config(self):
Get TrackingPolicyCfg as defined in infra/config.
Returns: TrackingPolicyCfg as defined in the config repo.
— def get_vm_retry_config(self):
Get SuiteRetryCfg as defined in infra/config for tast vm.
Returns: SuiteRetryCfg as defined in the config repo.
@property
— def gitiles_commit(self):
— def initialize(self):
@property
— def is_configured(self):
@property
— def is_staging(self):
@property
— def package_git_revision(self):
@property
— def parent_builder_group(self):
Get the builder group for the parent builder.
@property
— def props_for_child_build(self):
Return properties dict meant to be passed to child builds.
Preserve $chromeos/cros_infra_config when launching a child build.
— def safe_get_builder_configs(self, builder_names):
Gets the BuilderConfigs for the specified builder names from HEAD.
The returned dict will not contain key/values for builder names that could not be found in config.
Args:
Returns: dict(str, BuilderConfig) of found BuilderConfigs.
— def should_exit(self, run_spec):
— def should_run(self, run_spec):
@property
— def target_builder_group(self):
Get the builder group for the target builder.
This is used by findit, which has a single builder that performs bisection using the configuration of another builder.
DEPS: depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for LvfsMirror script.
A module for the LvfsMirror script.
— def configure(self, mirror_address, gs_uri):
Configure the LvfsMirror script module.
Args:
@property
— def gs_uri(self):
@property
— def local_cache(self):
@property
— def mirror_address(self):
— def run(self):
DEPS: cros_infra_config, cros_release_util, cros_sdk, cros_source, cros_storage, cros_test_plan, cros_version, metadata, naming, skylab, util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with Paygen and its config.
A module for CrOS-specific paygen steps.
— def create_au_test_configs(self, gen_req, configured_payloads, delta_test_override=PaygenOrchestratorProperties.RESPECT_CONFIG, full_test_override=PaygenOrchestratorProperties.RESPECT_CONFIG):
Determine which hardware tests need to be run for the given payload.
Args: gen_req (GenerationRequest): Proto defining the payload-to-be for which to find tests. configured_payloads (list[dict]): Configs for the payloads we are generating and testing. delta_test_override (PayloadTestsOverride): Option to override the configured delta payload testing policy. full_test_override (PayloadTestsOverride): Option to override the configured full payload testing policy.
Returns: list[AutoupdateTestConfig]: Test configs that should be run for the requested payload.
— def create_paygen_build_report(self, paygen_build_results):
Prepare payload information for the release pubsub.
Args: paygen_build_results (list[build_pb2.Build]): The results of the child paygen builders, as returned by api.buildbucket.run.
Returns: A list[BuildReport.Payload] containing payload information for the pubsub.
— def create_paygen_test_config(self, tgt_payload, delta_type, src_version=None, src_channel=None, applicable_models=None):
Create a PaygenTestConfig for a test FullPayload or DeltaPayload.
Args: tgt_payload (Payload): The payload to be tested. delta_type (DeltaType): The type of update we are doing with this payload. src_version (str): The version of the image to test updating from (e.g. '13373.0.0'). Required if the payload is a FullPayload, required to be None if it's a DeltaPayload. src_channel (str): The channel of the image to test updating from (e.g. 'canary-channel'). Required if the payload is a FullPayload, required to be None if it's a DeltaPayload. applicable_models (list(str)): A list of models that a paygen test should run against.
Returns: A PaygenTestConfig or None if no source payload exists or unsupported Payload provided.
@property
— def default_delta_types(self):
— def get_builder_configs(self, builder_name, **kwargs):
Return the configs matching the query or [].
Note that all comparisons are made in lower case!
Args: builder_name (str): The name of the builder to return configurations for. **kwargs: Keywords to match against top-level dictionary contents. For example, passing delta_payload_tests=true will match only configurations in which delta_payload_tests is true.
Returns: A list of dictionaries of the matching configurations. For example:
[
  {
    "board": {
      "public_codename": "cyan",
      "is_active": true,
      "builder_name": "cyan"
    },
    "delta_type": "MILESTONE",
    "channel": "stable",
    "chrome_os_version": "13020.87.0",
    "chrome_version": "83.0.4103.119",
    "generate_delta": true,
    "delta_payload_tests": true,
    "full_payload_tests": false
  },
  {...},
  {...}
]
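The case-insensitive matching described above can be sketched as a plain filter over config dictionaries. This is a hypothetical standalone helper with simplified inputs, not the module's actual implementation:

```python
def filter_builder_configs(configs, builder_name, **kwargs):
    """Return configs whose builder_name and top-level keys match.

    All string comparisons are made in lower case, mirroring the
    documented behavior of get_builder_configs.
    """
    def norm(value):
        # Lower-case strings; leave bools/ints/None untouched.
        return value.lower() if isinstance(value, str) else value

    results = []
    for config in configs:
        name = config.get("board", {}).get("builder_name")
        if norm(name) != builder_name.lower():
            continue
        # Every supplied keyword must match a top-level entry.
        if all(norm(config.get(key)) == norm(want)
               for key, want in kwargs.items()):
            results.append(config)
    return results
```

For example, `filter_builder_configs(configs, "cyan", delta_payload_tests=True)` would return only the cyan configs with delta payload tests enabled.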
— def get_delta_requests(self, payload_def, src_artifacts, tgt_artifacts, bucket, verify, dryrun):
Examine payload_def, source, and target artifacts and return a list[GenerationRequest].
If there isn't a matching source and target available, then return [].
bucket, verify, and dryrun are all used to fill out the GenerationRequest().
Args: payload_def (dict): A singular configuration from pulled config. src_artifacts (list[cros_storage.Image]): Available src images. tgt_artifacts (list[cros_storage.Image]): Available tgt images. bucket (str): The bucket containing the requests (and destination). verify (bool): Should we run payload verification. dryrun (bool): Should we not upload resulting artifacts.
Returns: A completed list[GenerationRequest] or [].
— def get_full_requests(self, tgt_artifacts, bucket, verify, dryrun):
Get the configured full requests for a set of artifacts.
Args: tgt_artifacts (list[cros_storage.Image]): Available tgt images. bucket (str): The bucket containing the requests (and destination). verify (bool): Should we run payload verification. dryrun (bool): Should we not upload resulting artifacts.
Returns: A completed list[GenerationRequest] or [].
— def get_n2n_requests(self, tgt_artifacts, bucket, verify, dryrun):
Generate N2N testing payloads.
We examine all the artifacts in tgt_artifacts for unsigned test images and generate N2N requests (requests that update to and from the same version).
Args: tgt_artifacts (list[cros_storage.Image]): Available tgt images. bucket (str): The bucket containing the requests (and destination). verify (bool): Should we run payload verification. dryrun (bool): Should we not upload resulting artifacts.
Returns: A list[GenerationRequest] or [].
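The N2N selection logic above can be sketched with artifacts modeled as plain dicts. The field names (image_type, version) and the "unsigned_test" marker are illustrative assumptions; the real module works with cros_storage.Image objects and GenerationRequest protos:

```python
def get_n2n_requests(tgt_artifacts, bucket, verify, dryrun):
    """Build N2N requests: same-version updates from unsigned test images.

    A simplified sketch of the documented behavior, not the module's
    actual implementation.
    """
    requests = []
    for artifact in tgt_artifacts:
        # Only unsigned test images are eligible for N2N testing.
        if artifact.get("image_type") != "unsigned_test":
            continue
        requests.append({
            # N2N: update to and from the same version.
            "src_version": artifact["version"],
            "tgt_version": artifact["version"],
            "bucket": bucket,
            "verify": verify,
            "dryrun": dryrun,
        })
    return requests
```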
@property
— def paygen_children_timeout_sec(self):
Get the currently configured paygen timeout in seconds.
@property
— def paygen_orchestrator_timeout_sec(self):
Get the currently configured paygen orchestrator timeout in seconds.
This contains the duration expected for paygen children.
Returns: The max number of seconds (int) the paygen orchestrator should take.
— def run_paygen_builders(self, paygen_reqs):
Launch paygen builders to generate payloads and run configured tests.
Args: paygen_reqs (list[PaygenRequest]): Protos containing the payloads to generate and the corresponding tests to launch.
Returns: A list of completed builds.
— def schedule_au_tests(self, paygen_test_configs):
Schedule Paygen autoupdate (AU) tests.
Create a cros_test_platform build request to launch AU tests.
Args: paygen_test_configs (list[PaygenTestConfig]): A list of PaygenTestConfigs for which to schedule au_tests.
Returns: The scheduled buildbucket build.
DEPS: cros_build_api, cros_infra_config, cros_source, cros_version, git, git_footers, git_txn, repo, src_state, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/runtime, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for uploading CrOS prebuilts to Google Storage.
A module for uploading package prebuilts.
— def get_package_index_info(self, gs_bucket, snapshot=None, build_target=None, profile=None, count=None, test_data_dict=None, name=None):
Return the PackageIndexInfo for this build.
Args: gs_bucket (str): Google storage bucket where the prebuilts live. snapshot (GitilesCommit): The snapshot for this build, or None. build_target (BuildTarget): BuildTarget for the build, or None. profile (chromiumos.Profile): Profile for the build, or None. count (int): Number of snapshots to check, or None. test_data_dict (dict): Dictionary of test data: test_data_dict[snapshot][target_name][file_name] = PackageIndexInfo name (str): Name for the step, or None.
Returns: (list[PackageIndexInfo]) The metadata for CreateSysrootService.
— def upload_target_prebuilts(self, target, profile, kind, gs_bucket, private=True):
Upload binary prebuilts for the build target to Google Storage.
Determines what to upload, uploads it, and points Portage to the upload URI. This step works entirely within the workspace checkout.
Args: target (BuildTarget): The build target to upload prebuilts for. profile (chromiumos.Profile): The Profile, or None. kind (BuilderConfig.Id.Type): Kind of prebuilts to upload. gs_bucket (str): Google storage bucket to upload prebuilts to. private (bool): Whether or not the target prebuilts are private.
DEPS: infra/cloudkms, infra/provenance, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for adding provenance to generated artifacts.
APIs for generating a signed provenance for created artifacts.
— def generate_provenance(self, file_paths, recipe):
Generate BCID provenances for a list of artifacts.
Args: file_paths (List[str]): the location of artifacts to generate an attestation for. recipe (str): the name of the recipe that this build is running.
Returns: (List[str]): the location of the attestations on disk.
DEPS: build_menu, build_reporting, builder_metadata, cros_artifacts, cros_paygen, cros_release_util, cros_test_plan, cros_version, gerrit, git, repo, src_state, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
An API for providing release related operations (e.g. paygen, signing).
— def create_releasespec(self, specs_dir='buildspecs', branch='release', step_name='create releasespec', dry_run=False, gs_location=None):
Create a pinned manifest and upload to manifest-versions/releasespecs.
Args: specs_dir (str): Relative path in manifest-versions in which to place the pinned manifest. branch (str): The branch of manifest-versions that will be used, or None to use the default branch. step_name (str): The step name to use. dry_run (bool): Whether the git push is --dry-run. gs_location (string): If set, will also upload the pinned manifest to GS.
Returns: Full URL path to newly-uploaded manifest.
— def get_au_testing_models(self, fsi=False):
Determine which models are configured to run autoupdate tests.
TODO(b/223252953): Filter down to models that are available in the lab.
Args: fsi (bool): If True, then return all models which should run autoupdate tests for FSI images, which require broader testing than non-FSI.
Returns: List[str]: The names of each model that should run paygen tests.
@property
— def manifest_versions_url(self):
Returns the git repo URL for manifest versions.
— def push_and_sign_images(self, config, sysroot):
Call the Push Image Build API endpoint for the build.
This pushes the image files to the appropriate bucket and prepares them for signing. The actual execution of these procedures is handled in the underlying script, chromite/scripts/push_image.py. Must be used in the context of a build.
Args: config (BuilderConfig): The Builder Config for the build. sysroot (Sysroot): sysroot to use.
Return: Tuple of (gs_image_dir, instructions_uris): gs_image_dir is the GS directory the image was pushed from. instructions_uris is a list of URIs to instructions files for the pushed images.
@property
— def releasespec(self):
Return the releasespec as created by this module, or None.
— def schedule_payload_generation(self):
Schedule the generation of release payloads using the context of a build.
This is nonblocking; it launches the paygen orchestrator and returns its id. It assumes it is being run after a local build has been made.
Args: build_target_name (str): The builder target name. target_chromeos_version (str): The target chromeos version (e.g. ‘13337.0.1’). milestone (int): The milestone number.
Returns: The int build id for the launched orchestrator.
— def validate_sign_types(self, sign_types):
Takes an array of IMAGE_TYPE enums and validates them or raises StepFailure.
DEPS: build_menu, cros_artifacts, cros_paygen, cros_source, cros_version, gerrit, git, repo, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
An API for managing release config.
— def update_config(self, release_branch):
Creates CLs updating config file to include new release branch.
While Rubik is being turned up, this endpoint modifies both the legacy config in chromite and the Rubik Starlark config in infra/config.
Args: release_branch (str): Release branch, e.g. "release-R89-13729.B".
DEPS: cros_infra_config, cros_source
PYTHON_VERSION_COMPATIBILITY: PY2+3
An API for providing release related utility functions.
@staticmethod
— def channel_long_string_to_enum(str_channel):
Convert long channel name strings (e.g. 'beta-channel') to enum values.
@staticmethod
— def channel_short_string_to_enum(str_channel):
Convert short channel name strings (e.g. 'beta') to enum values.
@staticmethod
— def channel_strip_prefix(channel):
Takes a common_pb2.Channel and returns an unprefixed str (e.g. beta).
@staticmethod
— def channel_to_long_string(channel):
Takes a common_pb2.Channel and returns a suffixed str (e.g. dev-channel).
@staticmethod
— def match_channels(channel1, channel2):
Determine if two channels are equal, even if represented differently.
Args: channel1 (str|Channel): A representation of a channel, either as a Channel enum (e.g. Channel.CHANNEL_BETA), a short string (e.g. "beta"), or a long string (e.g. "beta-channel"). channel2 (str|Channel): As above.
Returns: bool: Whether the two args describe the same channel.
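The normalization behind match_channels can be sketched as below. This models enum values by their names (e.g. 'CHANNEL_BETA') rather than real common_pb2.Channel objects, and is a simplified standalone helper rather than the module's implementation:

```python
def normalize_channel(channel):
    """Reduce any channel representation to its short form, e.g. 'beta'.

    Accepts an enum-style name ('CHANNEL_BETA'), a long string
    ('beta-channel'), or a short string ('beta').
    """
    name = str(channel).lower()
    if name.startswith("channel_"):
        name = name[len("channel_"):]
    if name.endswith("-channel"):
        name = name[:-len("-channel")]
    return name


def match_channels(channel1, channel2):
    """Determine if two channels are equal, however they are represented."""
    return normalize_channel(channel1) == normalize_channel(channel2)
```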
— def release_builder_name(self, build_target, branch=None, staging=False):
Determine the Rubik child builder name for the given build_target.
Args: build_target (string): name of the build target, e.g. zork or kevin-kernelnext. branch (string): optional, branch we're on. staging (bool): optional, whether or not we're in staging.
Return: The Rubik child builder, e.g. zork-release-main.
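One plausible construction of the child builder name, working back from the 'zork-release-main' example above. The handling of the branch argument and the 'staging-' prefix are assumptions, not taken from the module:

```python
def release_builder_name(build_target, branch=None, staging=False):
    """Construct a Rubik child builder name, e.g. 'zork-release-main'.

    A hypothetical sketch: assumes the default branch maps to 'main'
    and that staging builders get a 'staging-' prefix.
    """
    name = "%s-release-%s" % (build_target, branch or "main")
    return ("staging-" + name) if staging else name
```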
DEPS: cros_build_api, cros_history, cros_infra_config, cros_source, easy, git_footers, repo, src_state, recipe_engine/cipd, recipe_engine/context, recipe_engine/cq, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module for determining if a build is unnecessary.
— def check_for_toolchain_change(self, gerrit_changes, gitiles_commit, chroot, test_value=None, name=None):
Check for toolchain changes.
Args: gerrit_changes (list[GerritChange]): The gerrit changes for the build. gitiles_commit (GitilesCommit): The gitiles commit for the build. chroot (Chroot): The SDK for the build. test_value (bool): The answer to use for testing, or None. name (str): The step name to display, or None for default.
Returns: (bool): Whether there are toolchain_cls applied.
— def check_force_relevance_footer(self, gerrit_changes, configs):
Check the incoming gerrit changes to determine if we force relevance.
Args: gerrit_changes (list[GerritChange]): The gerrit changes. configs (list[BuilderConfig]): The Builder Configs for the build.
Returns: A list of target names, derived from configs, to be forced relevant.
— def get_dependency_graph(self, sysroot, chroot, packages=None):
Calculates the dependency graph for the build target and SDK.
Args: sysroot (Sysroot): The Sysroot being used. chroot (chromiumos.Chroot): The chroot it is being run in. packages (list[chromiumos.PackageInfo]): The packages for which to generate the dependency graph.
Returns: (chromite.api.DepGraph, chromite.api.DepGraph): A tuple of opaque dependency graph objects, with the first element being the dependency graph for the target and the second element the graph for the SDK/chroot.
— def get_necessary_builders(self, builder_configs, gerrit_changes, gitiles_commit, name=None, test_builder_ids=None):
Determines which builders must be run (and which can be skipped).
This filters on preconfigured RunWhen rules, as well as on rules allowing skipping of image builders. Image builders are those that run the build_target recipe, producing an IMAGE_ZIP Chrome OS artifact.
Args: builder_configs (list[chromiumos.BuilderConfig]): builder configs to consider for skipping. gerrit_changes (bbcommon_pb2.GerritChange): The Gerrit Changes to be applied for the build, if any. gitiles_commit (bbcommon_pb2.GitilesCommit): The manifest-internal snapshot Gitiles commit. name (str): The step name. test_builder_ids (list[BuilderConfig.Id]): test override
Returns: list[str]: the names of the child builders that must be run.
— def get_package_dependencies(self, sysroot, chroot, patch_sets=None, packages=None, include_rev_deps=False):
Calculates the dependencies for the build target.
Args: sysroot (Sysroot): The Sysroot being used. chroot (chromiumos.Chroot): The chroot it is being run in. patch_sets (List[gerrit.PatchSet]): The changes applied to the build. Used to determine the affected paths. If empty / None returns package dependencies for all paths. packages (list[chromiumos.PackageInfo]): The list of packages for which to get dependencies. If none are specified the standard list of packages is used.
Returns: (List[str]): A list of package dependencies for the build target.
— def initialize(self):
Initializes the module.
— def is_build_pointless(self, gerrit_changes, gitiles_commit, dep_graph, config, force_relevant=False, test_value=None):
Determines if build(s) can be terminated early.
If build_target is set, then the chromiumos workspace must have been checked out prior to calling this method. This is a requirement for BuildDependencyGraph checks.
Args: gerrit_changes (bbcommon_pb2.GerritChange): The Gerrit Changes to be applied for the build, if any. gitiles_commit (bbcommon_pb2.GitilesCommit): The manifest-internal snapshot Gitiles commit. dep_graph (chromite.api.DepGraph): The dependency graph to compare the Gerrit changes against to test for build relevancy. config (chromiumos.BuilderConfig): config for the builder. force_relevant (bool): Whether to always declare the build relevant. test_value (bool): The answer to use for testing. Default: build is not pointless.
Returns: bool: Whether the build can be terminated early.
— def is_depgraph_affected(self, gerrit_changes, gitiles_commit, dep_graph, test_value=None, name=None):
Determines if a Gerrit Change affects a given dependency graph.
Args: gerrit_changes (bbcommon_pb2.GerritChange): The Gerrit Changes to be applied for the build, if any. gitiles_commit (bbcommon_pb2.GitilesCommit): The manifest-internal snapshot Gitiles commit. dep_graph (chromite.api.DepGraph): The dependency graph to compare the Gerrit changes against to test for build relevancy. test_value (bool): The answer to use for testing, or None. name (str): The step name to display, or None for default.
Returns: bool: Whether the given Gerrit Change affects the given dependency graph.
— def postsubmit_relevance_check(self, gitiles_commit, dep_graph):
Determines if postsubmit builder is relevant for given snapshot.
Args: gitiles_commit (bbcommon_pb2.GitilesCommit): The manifest-internal snapshot Gitiles commit. dep_graph (chromite.api.DepGraph): The dependency graph to compare the Gerrit changes against to test for build relevancy.
Returns: bool: Whether any packages that target depends on have been uprevved in the latest snapshot or the build was forced relevant.
@property
— def toolchain_cls_applied(self):
Whether there are toolchain CLs applied to the source tree.
DEPS: cros_infra_config, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for chromium tests on skylab to upload result to Result DB.
— def apply_exonerations(self, invocation_ids, default_behavior=Request.Params.TestExecutionBehavior.BEHAVIOR_UNSPECIFIED, behavior_overrides_map=None, variant_filter=None):
Exonerate unexpected test failures for the given invocations.
Currently only supports exonerating tests based on criticality. First attempt to exonerate based on test run's default behavior. If the default behavior is not exonerable, try to apply a test case behavior override.
Args: invocation_ids (list(str)): The ids of the invocation whose results we should try to exonerate. default_behavior (TestExecutionBehavior): The default behavior for all tests in the test_runner build. behavior_overrides_map (dict{str: TestExecutionBehavior}): Test-specific behavior overrides that supersede the default behavior. variant_filter (dict): Attributes which must all be present in the test result variant definition in order to exonerate.
@property
— def current_invocation_id(self):
Return the current invocation's id.
— def export_invocation_to_bigquery(self, bigquery_exports=None):
Modifies the current invocation to be exported to BigQuery (along with its children) once it is finalized.
This should only be called on top-level invocations; if it is called on both a parent and a child, all test results in the child will be exported twice.
Note that this should normally be configured on the builder definition in infra/config rather than in the recipe. Only use this when a builder cannot be determined to always export to BigQuery at configuration time, but needs to determine it at recipe runtime.
Args: bigquery_exports (list(resultdb.BigQueryExport)): The BigQuery export configurations of tables and predicates of what to export.
— def extract_chromium_resultdb_settings(self, test_args):
Extract resultdb settings from test_args for chromium test results.
Extracts resultdb settings from test_args. Also converts base_tags from a list of strings ['key:value'] into a list of string tuples [(key, value)] as is expected by resultdb.wrap().
Args: test_args (string): Extra autotest arguments, e.g. "key1=val1 key2=val2". Chromium tests use test_args to pass runtime parameters to our autotest wrapper. We reuse it to pipe ResultDB arguments, because it is easy to access in the test runner recipe. test_args must contain resultdb_settings, which is a base64-compressed JSON string wrapping all resultdb parameters.
Returns: A dictionary wrapping all ResultDB upload parameters.
Raises: ValueError: If resultdb settings are not found in the test_args.
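The extraction above can be sketched as follows. This simplified version assumes resultdb_settings is plain base64-encoded JSON (the real module may also decompress it) and shows the documented base_tags conversion; it is not the module's actual implementation:

```python
import base64
import json


def extract_resultdb_settings(test_args):
    """Pull resultdb settings out of an autotest-style test_args string.

    test_args looks like "key1=val1 key2=val2" and must contain a
    resultdb_settings entry holding base64-encoded JSON.
    """
    args = dict(pair.split("=", 1) for pair in test_args.split())
    if "resultdb_settings" not in args:
        raise ValueError("resultdb settings not found in test_args")
    settings = json.loads(base64.b64decode(args["resultdb_settings"]))
    # resultdb.wrap() expects base_tags as (key, value) tuples,
    # not "key:value" strings.
    settings["base_tags"] = [
        tuple(tag.split(":", 1)) for tag in settings.get("base_tags", [])
    ]
    return settings
```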
— def get_drone_artifact_directory(self, base_dir, result_format=None, artifact_directory=''):
Get the path to the test results artifact directory on the drone.
Currently only supports Tast and Gtest.
Args: base_dir (Path): The path of the base test results on the drone server. For example, Chromium gtest results can be found at base_dir/autoserv_test/chromium/results. result_format (str): The format of the test results. artifact_directory (Path): Path relative to the autotest result folder, ONLY for gtest, e.g. chromium/debug. For Tast tests, we rely on it to pass the runtime result path to the adapter, so we do not accept a user-defined artifact directory fed to this module.
Returns: Path to the test results artifact directory on the drone server.
— def get_drone_result_file(self, base_dir, result_format):
Get the path to the test results file on the drone.
The paths are hardcoded for Tast and gtest in this module.
Args: base_dir (Path): The path of the base test results on the drone server. For example, Chromium gtest result can be found at base_dir/autoserv_test/chromium/results. result_format (str): The format of the test results.
Returns: Path to the test results file on the drone server.
— def report_missing_test_cases(self, test_names, base_variant):
Upload test results for missing test cases to ResultDB.
Args: test_names (str): The names of the tests that should have run but did not. base_variant (dict): Variant key-value pairs to attach to the test results.
— def upload(self, config, stainless_url=None, step_name='upload test results to rdb'):
Wrapper for uploading test results to resultDB.
Args: config (dict) A dict wrapping all resultdb parameters. stainless_url (string): Link to the Stainless logs for the test run. step_name (str): The name of the step or None for default.
DEPS: easy, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with CrOS's Schedule.
A module for reading, committing, and manipulating the release schedule.
— def fetch_chromiumdash_schedule(self, start_mstone=None, fetch_n=10):
Return the json schedule from chromiumdash.
Args: start_mstone (int): start with this milestone. Default: last branched milestone. fetch_n (int): Number of milestones to return. Default: 10.
Returns: (str): JSON string representing the results of the query, or None.
— def get_last_branched_mstone(self):
Gets the last branched milestone.
Returns: A chromiumos.chromiumdash.FetchMilestoneScheduleResponse.
Raises: StepFailure if not able to find mstone.
— def get_last_branched_mstone_n(self):
Gets the last branched milestone number as an int.
— def json_to_proto(self, sched_str_json):
Returns a FetchMilestoneScheduleResponse from JSON repr.
DEPS: cros_build_api, cros_infra_config, cros_relevance, cros_source, easy, git, goma, overlayfs, remoteexec, src_state, workspace_util, depot_tools/depot_tools, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for interacting with cros_sdk, the interface to the CrOS SDK.
A module for interacting with cros_sdk.
— def __call__(self, name, args, **kwargs):
Executes 'cros_sdk' with the supplied arguments.
Args:
Returns: See 'step.call'.
— def build_chmod_chroot(self):
Chroot permissions need to be tightened to 755 for the build process.
@property
— def chrome_root(self):
@property
— def chroot(self):
Return a chromiumos.common.Chroot.
@contextlib.contextmanager
— def cleanup_context(self, checkout_path=None):
Returns a context that cleans the SDK chroot named cache.
This may be called before cros_source.ensure_synced_cache, since it yields immediately, and only accesses checkout_path during cleanup.
Args: checkout_path (Path): Path to source checkout. Default: cros_source.workspace_path.
— def cleanup_sysroot(self):
— def configure(self, chroot_parent_path):
Configure CrosSdkApi.
Args: chroot_parent_path (Path): Parent for chroot directory.
— def configure_goma(self):
Configure goma for Chrome.
This is a helper function to do the various bits of cros_sdk configuration needed for Chrome to be built with goma.
Must be run with cwd inside a chromiumos source root.
— def configure_remoteexec(self):
Configure remoteexec for Chrome.
— def create_chroot(self, version=None, use_image=True, bootstrap=False, sdk_version=None, timeout_sec='DEFAULT', test_data=None, test_toolchain_cls=None, name=None):
Initialize the chroot and link it into the workspace.
Create a chroot if one does not already exist in the chroot path. If one already exists, but is not reusable by this build (see _ensure_cache_state) delete the existing chroot and create a new one.
Args: version (int): Required SDK cache version, if any. Some recipes do not care what version the SDK is; they just need any SDK. use_image (boolean): Mount the SDK file as an image. Default: True. bootstrap (boolean): Whether to bootstrap the chroot. Default: False. sdk_version (string): Optional. Specific SDK version to include in the CreateSdkRequest, e.g. 2022.01.20.073008. timeout_sec (int): Step timeout (in seconds). Default: None if bootstrap is True, otherwise 3 hours. test_data (str): test response (JSON) from the SdkService.Create call, or None to use the default in cros_build_api/test_api.py. test_toolchain_cls (bool): Test answer for detect_toolchain_cls. name (str): Step name. Default: 'init sdk'.
Returns: chromiumos_pb2.Chroot protobuf for the chroot.
@property
— def cros_sdk_path(self):
Returns a Path to the cros_sdk script.
@property
— def force_off_toolchain_changed(self):
Return whether we are forcing toolchain_cls off for testing.
— def goma_config(self):
— def has_goma_config(self):
— def has_remoteexec_config(self):
— def initialize(self):
Cache the chroot path.
— def link_chroot(self, checkout_path, chroot_path=None):
Link the chroot to a chromiumos checkout.
Args: checkout_path (Path): Path to the checkout root. chroot_path (Path): Path to the chroot, or None for the default.
@long_timeouts.setter
— def long_timeouts(self, value):
Set long_timeouts.
This boolean is sticky.
— def mark_sdk_as_dirty(self):
@property
— def remoteexec_config(self):
— def run(self, name, cmd, env=None, **kwargs):
Runs a command in a cros_sdk chroot.
It is assumed the current working directory is within a chromiumos checkout.
Args:
Returns: See 'step.call'.
@property
— def sdk_cache_state(self):
Returns default values if not set and cache state file does not exist.
@property
— def sdk_is_dirty(self):
Return whether the SDK is dirty.
— def set_chrome_root(self, chrome_root):
Set chrome root with synced sources.
This is a helper function to set up a chrome root.
Args: chrome_root (Path): Directory with the Chrome source.
— def set_goma_config(self, goma_dir, goma_client_json, goma_approach, log_dir, stats_file, counterz_file):
Set the goma config.
Args: goma_dir (Path): Path to the goma install location. goma_client_json (Path): Path to the goma client credentials file. goma_approach (chromiumos.GomaConfig.GomaApproach): Goma Approach. log_dir (Path): Path to the log directory. stats_file (str): Name of the goma stats file, relative to log_dir. counterz_file (str): Name of the goma counterz file, relative to log_dir.
— def set_remoteexec_config(self, reclient_dir, reproxy_cfg_file):
Set the remoteexec config.
— def set_use_flags(self, use_flags):
@contextlib.contextmanager
— def snapshot(self, create_test_data=None, restore_test_data=None):
Returns a context that snapshots and restores the SDK chroot state.
When this context manager is entered, a snapshot is made of the chroot state and a token corresponding to that snapshot is stored. When the context is exited, regardless of the reason, the context manager will attempt to restore the chroot back to that initial snapshot. If the chroot was initially created with 'nouse-image', it will be replaced so that it supports the ability to make snapshots.
Args: create_test_data (str): test response (JSON) from the SdkService.CreateSnapshot call, or None to use the default in cros_build_api/test_api.py. restore_test_data (str): test response (JSON) from the SdkService.RestoreSnapshot call, or None to use the default in cros_build_api/test_api.py.
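The snapshot/restore contract described above is the standard contextmanager pattern. A minimal sketch, with create_snapshot and restore_snapshot standing in for the SdkService.CreateSnapshot / RestoreSnapshot calls:

```python
import contextlib


@contextlib.contextmanager
def chroot_snapshot(create_snapshot, restore_snapshot):
    """Snapshot the chroot on entry and restore it on exit.

    The restore runs in a finally block, so it happens regardless
    of why the context is exited (normal completion or exception).
    """
    token = create_snapshot()  # token identifying the snapshot
    try:
        yield token
    finally:
        restore_snapshot(token)
```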
— def swarming_chmod_chroot(self):
Chroot is deployed as root, therefore change permissions to allow for Swarming cache uninstall/install.
— def unlink_chroot(self, checkout_path):
Unlink the chroot from the chromiumos checkout.
Args: checkout_path (Path): Path to the checkout root.
— def unmount_chroot(self, chroot=None):
— def update_chroot(self, commit, changes, build_source=False, toolchain_targets=None, timeout_sec='DEFAULT', test_data=None, test_toolchain_cls=None, name=None):
Update the chroot.
Args: commit (GitilesCommit): Active gitiles_commit, or None. changes (list[GerritChange]): Active gerrit changes, or None. build_source (boolean): Whether to compile from source. Default: False. toolchain_targets (list[BuildTarget]): List of toolchain targets needed, or None. timeout_sec (int): Step timeout (in seconds), or None for no step timeout. Default: 24 hours if building from source or a toolchain change is detected, otherwise 3 hours. test_data (str): test response (JSON) from the SdkService.Update call, or None to use the default in cros_build_api/test_api.py. test_toolchain_cls (bool): Test answer for detect_toolchain_cls. name (string): Step name. Default: "update sdk".
— def uprev_packages(self, build_targets=None, timeout_sec=(10 * 60), name='uprev packages'):
Uprev packages.
Args: build_targets (list[BuildTarget]): List of build_targets whose packages should be uprevved, or None for all build_targets. timeout_sec (int): Step timeout (in seconds). Default: 10 minutes. name (string): Name for step.
Returns: UprevPackagesResponse
DEPS: cros_build_api, depot_tools/gsutil, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to encapsulate communication with the signing fleet.
— def get_signed_build_metadata(self, instructions_metadata):
Get the metadata of the signed build.
Note - this requires that wait_for_signing has been called and is complete.
Args: instructions_metadata (dict): The metadata dict returned from wait_for_signing.
Returns: List of signed build metadata dicts (one per signed build image).
@staticmethod
— def get_status_from_instructions(instructions):
Given an instructions file, pull out the status of the signing operation.
Args: instructions (dict): An instructions metadata file.
Returns: The status of the signing, or None if not available.
@staticmethod
— def signing_succeeded(metadata):
Whether the provided metadata contains a successful signing operation.
Args: metadata (dict): Metadata from the instructions file.
Returns: True/False whether the signing succeeded.
— def verify_signing_success(self, instructions_metadata):
Verifies that the signing operation succeeded.
— def wait_for_signing(self, instructions_list):
Wait for signing to complete for a set of instructions files.
This method polls each instructions file for metadata, and waits for that metadata to become present, then checks to see if a terminal passing or failed state has been achieved. This method returns when either a) signing is complete for all of the provided instructions files, or b) the configured timeout has elapsed.
Args: instructions_list (array): List of GS locations for instructions files.
Returns: A dict of instruction file location -> instruction metadata for all complete signing operations.
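The polling behavior described above can be sketched as follows. fetch_metadata and is_terminal are hypothetical stand-ins for the module's GS polling and status-checking helpers; the clock and sleep parameters exist only to make the sketch testable:

```python
import time


def wait_for_signing(instructions_list, fetch_metadata, is_terminal,
                     timeout_sec=300, poll_sec=1,
                     clock=time.monotonic, sleep=time.sleep):
    """Poll each instructions file until all are terminal or timeout elapses.

    fetch_metadata(uri) returns the instruction metadata dict, or None
    if the metadata is not yet present. is_terminal(metadata) reports
    whether a terminal passing/failed state has been reached.
    """
    done = {}
    deadline = clock() + timeout_sec
    pending = list(instructions_list)
    while pending and clock() < deadline:
        still_pending = []
        for uri in pending:
            metadata = fetch_metadata(uri)
            if metadata is not None and is_terminal(metadata):
                done[uri] = metadata  # signing complete for this file
            else:
                still_pending.append(uri)
        pending = still_pending
        if pending:
            sleep(poll_sec)
    # Only completed operations are returned; timed-out files are omitted.
    return done
```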
DEPS: support, recipe_engine/service_account, recipe_engine/step, recipe_engine/time, recipe_engine/url
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module for interacting with the ChromeOS Sheriff-o-Matic.
— def get_annotation(self, step_name):
Return a SomAnnotation for step_name, or None if there is no annotation for the step.
— def get_silence_reason(self, annotation):
Return the reason an annotation is silenced, None if there is no silence.
Note that if an annotation is in a group that is silenced, it will also be considered silenced.
Args: annotation (SomAnnotation): The annotation to analyze.
Returns: A str giving the silence reason, or None if there is no silence.
DEPS: bot_cost, cros_build_api, cros_infra_config, easy, gcloud, gerrit, git, git_footers, gitiles, overlayfs, repo, src_state, test_util, util, depot_tools/gitiles, depot_tools/gsutil, recipe_engine/archive, recipe_engine/buildbucket, recipe_engine/cas, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with CrOS source.
A module for CrOS-specific source steps.
— def apply_gerrit_changes(self, gerrit_changes, include_files=False, include_commit_info=False, ignore_missing_projects=False, test_output_data=None):
Apply GerritChanges to the workspace.
Args: gerrit_changes (list[GerritChange]): list of gerrit changes to apply. include_files (bool): whether to include information about changed files. include_commit_info (bool): whether to include info about the commit. ignore_missing_projects (bool): Whether to ignore projects that are not in the source tree. (For example, the builder uses the external manifest, but the CQ run includes private changes.) test_output_data (dict): Test output for gerrit-fetch-changes.
Returns: List[ProjectCommit]: A list of commits from cherry-picked patch sets.
— def apply_patch_set(self, patch, project_path, is_abs_path=False):
Apply a PatchSet to the git repo in ${CWD}.
Args: patch (PatchSet): The PatchSet to apply. project_path (str): The path in which to apply the change. is_abs_path (bool): Whether the project path is an absolute path. The default is False meaning the project_path is relative to the workspace.
Returns: (ProjectCommit) commit for the applied patch.
@property
— def branch_manifest_file(self):
Returns the Path to the manifest_file for this build.
@property
— def cache_path(self):
The cached checkout path.
This is the cached version of source (the internal manifest checkout), usually updated once at the beginning of a build and then mounted into the workspace path.
— def checkout_branch(self, manifest_url, manifest_branch, projects=None, init_opts=None, sync_opts=None, step_name=None):
Check out a branch of the current manifest.
Note: If there are changes applied when this is called, repo will try to rebase them to the new branch.
Args:
— def checkout_gerrit_change(self, change):
Check out a gerrit change using the gerrit refs/changes/... workflow.
Differs from apply_changes in that the change is directly checked out, not cherry picked (so the patchset parent will be accurate). Used for things like tricium where line number matters.
Args: change (GerritChange): Change to check out. name (string): Step name. Default: "checkout gerrit change".
— def checkout_manifests(self, commit=None, is_staging=False, checkout_external=False, test_footers=None):
Check out the manifest projects.
Syncs the manifest projects into the workspace, at the appropriate revision. This is intended for builders that only need the manifest projects, not for builders that have other projects checked out as well.
If |commit| is on an unpinned branch, there is no reasonable way to discern which revision of the external manifest is correct. The branch's copy of the external manifest is unbranched. As such, the return will have an empty commit id, and the external manifest source tree may be dirty (mirrored manifest files will be copied from the internal manifest, but not committed.)
Args: commit (GitilesCommit): The commit to use, or None for the default (from cros_infra_config.configure_builder). is_staging (bool): Whether this is staging. checkout_external (bool): Whether to checkout the external manifest. test_footers (str): test Cr-External-Snapshot footer data (values separated by newlines), or None.
Returns: (GitilesCommit) The GitilesCommit to use for the external manifest.
@contextlib.contextmanager
— def checkout_overlays_context(self, mount_cache=True, snapshot_mount=False, disk_type='pd-ssd'):
Returns a context where overlays can be mounted.
Args: mount_cache (bool): Whether to mount the chromiumos cache. Default: True. snapshot_mount (bool): Whether to utilize the snapshot mount location, rather than the image preload directory. Default: False. disk_type (str): GCE disk type to use. Default: pd-ssd.
— def checkout_tip_of_tree(self):
Check out the tip-of-tree in the workspace.
— def configure_builder(self, commit=None, changes=None, is_staging=None, default_main=False, name='configure builder'):
Configure the builder.
Fetch the builder config. Determine the actual commit and changes to use. Set the bisect_builder and use_flags.
Args: commit (GitilesCommit): The gitiles commit to use. Default: GitilesCommit(.... ref='refs/heads/snapshot'). changes (list[GerritChange]): The gerrit changes to apply. Default: the gerrit_changes from buildbucket. is_staging (bool): Whether the builder is staging, or None to have configure_builder determine, based on buildbucket bucket and/or config.general.environment. default_main (bool): Whether the default branch should be 'main'. Default: use the appropriate snapshot branch. name (string): Step name. Default: "configure builder".
Returns: BuilderConfig or None
— def ensure_synced_cache(self, manifest_url=None, init_opts=None, sync_opts=None, cache_path_override=None, is_staging=False, projects=None, gitiles_commit=None, manifest_branch_override=None):
Ensure the configured repo cache exists and is synced.
Args:
— def fetch_snapshot_shas(self, count=((7 * 24) * 2)):
Return snapshot SHAs for the manifest.
Return SHAs for the most recent |count| commits in the manifest. The default is to fetch 7 days worth of snapshots, based on (an assumed) 2 snapshots per hour.
Args:
Returns: (list[str]) The list of snapshot SHAs.
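The default `count` in the signature encodes the docstring's arithmetic: 7 days of snapshots at an assumed 2 snapshots per hour. A minimal sketch of that computation:

```python
SNAPSHOTS_PER_HOUR = 2  # assumed cadence, per the docstring


def default_snapshot_count(days=7):
    """days * 24 hours/day * snapshots/hour, matching count=((7 * 24) * 2)."""
    return days * 24 * SNAPSHOTS_PER_HOUR
```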
— def find_project_paths(self, project, branch, empty_ok=False):
Find the source paths for a given project in the workspace.
Will only include multiple results if the same (project, branch) pair is mapped more than once in the manifest.
Args: project (str): The project name to find a source path for. branch (str): The branch name to find a source path for. empty_ok (bool): If no paths are found, return an empty list rather than raising StepFailure
Returns: list(str), The path values for the found project.
— def initialize(self):
Initialization that follows all module loading.
@property
— def is_source_dirty(self):
Returns whether the source is dirty.
The source is dirty if it was checked out to a custom snapshot from isolate, has had patches applied, or has been moved to a branch.
@property
— def manifest_branch(self):
Returns any non-default manifest branch that is checked out.
@property
— def manifest_push(self):
Returns the manifest branch to push changes to.
@property
— def mirrored_manifest_files(self):
Returns the names of files that are mirrored into the public manifest.
The files returned are owned by chromeos/manifest-internal, and are copied into chromiumos/manifest when they are changed.
Annealing does this as part of creating the snapshot, and the various builders do it when applying manifest changes.
Returns: (list[MirroredManifestFile]) with files we mirror.
@property
— def pinned_manifest(self):
Return the pinned manifest for this build.
@property
— def preload_path(self):
The cached image checkout path.
This is the cached version of source that is included in the base image of the bot, used as an initial reference path.
— def push_uprev(self, uprev_response, dry_run, commit_only=False, is_staging=False):
Commit and push any uprevved packages to their remotes.
Args: uprev_response (list[PushUprevRequest]): Named tuple containing the modified ebuild and associated message subject. dry_run (bool): Dry run git push or not. commit_only (bool): Whether to skip the push step.
Return: all_uprevs_passed (bool): True if all uprevs succeeded, False if ANY failed.
@property
— def snapshot_cas_digest(self):
Returns the snapshot digest in use or None.
— def sync_checkout(self, commit=None, manifest_url=None, **kwargs):
Sync a checkout to the appropriate manifest.
If the module properties contain the sync_to_manifest field, that will be used. Otherwise the given commit/manifest_url will be used.
Args: commit (GitilesCommit): The gitiles_commit to sync to. Default: commit saved in cros_infra_config.configure_builder(). manifest_url: URL of manifest repo. Default: internal manifest
@exponential_retry(retries=3, condition=retry_timeouts)
— def sync_to_gitiles_commit(self, gitiles_commit, manifest_url=None, **kwargs):
Sync a checkout to the specified gitiles commit.
Will first attempt to sync to snapshot.xml, then default.xml.
Args: gitiles_commit (GitilesCommit): commit to sync to manifest_url: URL of manifest repo. Default: internal manifest kwargs (dict): additional args for repo.sync_manifest.
@exponential_retry(retries=3, condition=retry_timeouts)
— def sync_to_pinned_manifest(self, manifest_url='', manifest_branch='', manifest_path='', manifest_gs_path='', **kwargs):
Sync a checkout to the specified [pinned] manifest.
The manifest will be downloaded directly from the source using gitiles.
Args: manifest_url (string): URL of the project the manifest is in, e.g. https://chrome-internal.googlesource.com/chromeos/manifest-versions manifest_branch (string): Branch of repository to get manifest from, e.g. 'main' or 'releasespecs'. manifest_path (string): Path (relative to repository root) of manifest file, e.g. releasespecs/91/13818.0.0.xml. manifest_gs_path (string): GS Path of manifest, e.g. gs://chromeos-manifest-versions/release/91/13818.0.0.xml. Takes precedence over manifest_url/branch/path.
— def uprev_packages(self, workspace_path, build_targets=None, timeout_sec=(10 * 60), name='uprev ebuilds'):
Uprev packages.
Args: workspace_path (Path): Path to the workspace checkout. build_targets (list[BuildTarget]): List of build_targets whose packages should be uprevved, or None for all build_targets. timeout_sec (int): Step timeout (in seconds). Default: 10 minutes. name (string): Name for step.
Returns: UprevPackagesResponse
@property
— def workspace_path(self):
The “workspace” checkout path.
This is where the build is processed. It will contain the target base checkout and any modifications made by the build.
DEPS: depot_tools/gsutil, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API featuring shared helpers for locating and naming stored artifacts.
Much of the inspiration for this module came from: chromite/lib/paygen/gspaths.py
As long as there are two versions of the path construction, any changes to one of these need to be reflected in the other.
Apis for dealing with stored images, payloads, and artifacts.
— def discover_gs_artifacts(self, prefix_uri, parse_types=None):
Discover and return all the GS artifacts found in a given ArtifactRoot.
We assume that each uri will match at most a single ParserOption and we greedily take the first one. GS exceptions are represented as an empty return list.
Args: prefix_uri (str): The gs path prefix recursively crawled. parse_types list(parse_uri()): A list of uri parser fn()s to consider.
Returns: list[artifact_type]: list of artifacts found in the prefix.
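The greedy first-match behavior described above can be sketched like this, with hypothetical parser functions that each return a parsed artifact or None:

```python
def match_artifacts(uris, parse_types):
    """Greedily match each URI against a list of parser functions.

    The first parser that matches a URI wins (greedy), and URIs that
    match no parser are dropped from the result.
    """
    found = []
    for uri in uris:
        for parse in parse_types:
            parsed = parse(uri)
            if parsed is not None:
                found.append(parsed)
                break  # greedy: at most one ParserOption per URI
    return found
```

Note that a URI matching several parsers is attributed to whichever one appears first in `parse_types`, so parser ordering matters to callers.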
DEPS: recipe_engine/buildbucket, recipe_engine/cq
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for generating tags.
A module for generating tags.
— def add_tags_to_current_build(self, **tags):
Adds arbitrary tags during the runtime of a build.
Args: tags (dict): Dict mapping keys to values. If the value is a list, multiple tags for the same key will be created.
@property
— def cq_cl_group_key(self):
Return the cq_cl_group_key, if any.
Returns: (str) cq_cl_group_key, or None
— def cq_cl_tag_value(self, cl_tag_key, tags):
Returns the value for the given cq_cl_tag, if it is found.
@property
— def cq_equivalent_cl_group_key(self):
Return the cq_equivalent_cl_group_key, if any.
Returns: (str) cq_equivalent_cl_group_key, or None
— def get_single_value(self, key, tags=None, default=None):
Return a single value from a list of tags.
If the key has more than one value, only the first value will be returned.
Args: key (str): The key to look up values for. tags ([StringPair]): A list of tags in which to look up values. (defaults to tags for current build) default (str): A default value to return if no values found.
Returns: str|None, the first value found for the key among the tags.
— def get_values(self, key, tags=None, default=None):
Return a value from a list of tags.
Since tags are able to have multiple values for the same key, the return value is always a list, even for a single item.
Args: key (str): The key to lookup values for tags ([StringPair]): A list of tags in which to look up values. (defaults to tags for current build) default (str): A default value to return if no values found
Returns: List of tag values, or [default] if none found.
— def has_entry(self, key, value, tags):
Returns whether tags contains a tag with key and value.
— def make_schedule_tags(self, snapshot, inherit_buildsets=True):
Returns the tags typically added to scheduled child builders.
Args: snapshot (GitilesCommit): snapshot the build was synced on inherit_buildsets (bool): whether to include non-gitiles_commit buildsets.
Returns: list[StringPair] to pass as buildbucket tags
— def tags(self, **tags):
Helper for generating a list of StringPair messages.
Args: tags (dict): Dict mapping keys to values. If the value is a list, multiple tags for the same key will be created.
Returns: (list[StringPair]) tags.
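A rough sketch of the list-expansion behavior, again with (key, value) tuples standing in for StringPair messages (the sorted iteration order is an illustrative choice, not necessarily what the module does):

```python
def make_tags(**tags):
    """Expand keyword args into a list of (key, value) pairs.

    A list value yields one pair per element, mirroring the documented
    multiple-tags-per-key behavior; scalars yield a single pair.
    """
    pairs = []
    for key, value in sorted(tags.items()):
        if isinstance(value, list):
            pairs.extend((key, str(v)) for v in value)
        else:
            pairs.append((key, str(value)))
    return pairs
```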
DEPS: cros_infra_config, cros_source, easy, git, gitiles, repo, src_state, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module for generating and parsing test plans.
— def generate(self, builds, gerrit_changes, manifest_commit, name=None):
Generate test plan.
Args:
Returns: GenerateTestPlanResponse of test plan.
— def generate_target_test_requirements_config(self, builders=None, paygen=False):
Generate target test requirements config in config-internal using ./board_config/generate_test_config. Assumes config-internal is checked out at src_state.workspace_path/CONFIG_INTERNAL_CHECKOUT.
Args: builders (list[str]): optional list of builder names to generate config for, e.g. coral-release-main or staging-kevin-release-main. If not specified, either the invoking builder or its children (if the invoking builder name contains 'orchestrator') will be used. paygen (bool): If true, generate paygen testing requirements instead of standard per-build-target test requirements.
Returns: JSON structure of target test requirements or None.
— def get_target_test_requirements(self, builders=None):
Fetch target test requirements config.
Args: builders (list[str]): optional list of builder names to generate config for, e.g. coral-release-main or staging-kevin-release-main. If not specified, either the invoking builder or its children (if the invoking builder name contains 'orchestrator') will be used.
Returns: JSON structure of target test requirements.
— def get_test_plan_summary(self, test_plan):
Return a mapping of display name to criticality.
Args: test_plan (GenerateTestPlanResponse): The test plan to summarize.
Returns: test_to_crit_map (dict{string: bool}): Map of test display name to criticality.
— def initialize(self):
DEPS: cros_build_api, cros_infra_config, cros_test_plan, easy, gerrit, gitiles, infra/docker, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module for generating and parsing test plans for CTP v2.
— def enabled_on_changes(self, gerrit_changes):
Returns true if test planning v2 is enabled on gerrit_changes.
Config controlling what changes are enabled is in the ProjectMigrationConfig of this module's properties.
@property
— def generate_ctpv1_format(self):
— def generate_hw_test_plans(self, starlark_packages, generate_test_plan_request=None):
Runs the testplan Docker image to get HWTestPlans.
Args:
Returns: A list of generated HWTestPlans or GenerateTestPlanResponse if generate_ctpv1_format is true.
— def initialize(self):
— def relevant_plans(self, gerrit_changes):
Call test_plan relevant-plans.
Args:
Returns: A list of relevant SourceTestPlans
DEPS: easy, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing cros_test_platform commands
— def cipd_package_version(self):
Return the CTP CIPD package version (e.g. prod/staging/latest).
— def enumerate(self, request):
Enumerate test cases via the enumerate subcommand.
Args: request: an EnumerationRequest.
Returns: EnumerationResponse.
— def execute_luciexe(self, request):
Execute work via the luciexe binary for cros_test_platform.
crbug.com/1112514: This is an alternative binary target for cros_test_platform which will eventually replace all the subcommands of the cros_test_platform binary.
Args: request: an ExecuteRequests.
Returns: ExecuteResponses.
— def skylab_execute(self, request):
Execute work via the skylab-execute subcommand.
Args: request: an ExecuteRequest.
Returns: ExecuteResponse.
PYTHON_VERSION_COMPATIBILITY: PY2+3
Data structures used by the cros_test_postprocess recipe.
— def downloaded_test_result(self, gs_path, local_path):
Create an object of DownloadedTestResult.
The object created is passed to each post process api to consume.
Args: gs_path: A string of GS path of test result to download. local_path: A Path object to save downloaded test result locally.
Returns: A named tuple of (gs_path, local_path).
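Since the return value is documented as a named tuple of (gs_path, local_path), a minimal stand-in for the factory looks like this (the field names come from the docstring; the real module may define the type differently):

```python
import collections

# Named-tuple type matching the documented (gs_path, local_path) shape.
DownloadedTestResult = collections.namedtuple(
    'DownloadedTestResult', ['gs_path', 'local_path'])


def downloaded_test_result(gs_path, local_path):
    """Bundle a GS source path with the local path it was downloaded to."""
    return DownloadedTestResult(gs_path=gs_path, local_path=local_path)
```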
DEPS: cros_bisect, cros_history, cros_infra_config, cros_tags, cros_test_plan, cros_test_plan_v2, easy, exonerate, failures, gerrit, git, git_footers, gitiles, greenness, naming, skylab, src_state, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/file, recipe_engine/path, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def get_test_failures(self, test_results):
Logs all test failures to the UI and raises on failed tests.
Args: test_results: MetaTestTuple of the tests on the changes. Returns: list[Failure]: All failures discovered in the given run.
— def run_proctor(self, need_tests_builds, snapshot, gerrit_changes, enable_history, run_async=False, container_metadata=None, require_stable_devices=False, use_test_plan_v2=False):
Runs the test platform for a given bunch of builds.
This is the entry point into the Chrome OS infra test platform via recipes.
Args: need_tests_builds (list[build]): builds that are eligible for testing, i.e. ones that didn't suffer build failures. snapshot (common_pb2.GitilesCommit): the manifest snapshot at the time the included builds were created. gerrit_changes (list[common_pb2.GerritChange]): the changes that resulted in the provided builds, or None. enable_history (bool): whether to prune test history for previously successful tests on images with the same build inputs. run_async (bool): whether to stop and collect, if set we return no failures (an empty list). container_metadata (ContainerMetadata): Information on container images used for test execution. require_stable_devices (bool): whether to only run on devices with label-device-stable: True. use_test_plan_v2 (bool): whether to use the v2 testplan tool in cros test platform v1 compatibility mode. The v2 testplan tool will return GenerateTestPlanResponse protos, so it is interchangeable with the v1 testplan tool.
Returns: list[failures.Failure]: failures encountered running tests.
— def run_proctor_v2(self, gerrit_changes):
Runs the test platform v2 for a set of GerritChanges.
Args: gerrit_changes (list[common_pb2.GerritChange]): changes to test.
— def schedule_tests(self, test_plan, passed_tests, timeout, test_to_build_map=None, snapshot=None, is_retry=False, run_async=False, container_metadata=None, require_stable_devices=False):
Schedule all tests from the test_plan.
Args: test_plan (GenerateTestPlanResponse): A plan for all tests to be scheduled. passed_tests (list[string]): A list of names for the tests that have passed before. timeout (Duration): Timeout in duration_pb2.Duration. test_to_build_map (dict{string->string}): Map of test names to build_targets to be populated. snapshot (common_pb2.GitilesCommit): the manifest snapshot at the time the included builds were created. is_retry (bool): Whether this is a CQ retry. run_async (bool): whether to stop and collect, if set we return no failures (an empty list). container_metadata (ContainerMetadata): Information on container images used for test execution. require_stable_devices (bool): whether to only run on devices with label-device-stable: True
Returns: MetaTestTuple of lists of the tests scheduled.
@test_summary.setter
— def test_summary(self, test_summary):
Set the test_summary for this build.
Args: test_summary (list[map{string: string}]): The test_summary for this build.
DEPS: recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing cros_test_runner commands
— def cipd_package_label(self):
Return the cros_test_runner CIPD package label (e.g. prod/staging/latest).
— def ensure_cros_test_runner(self):
Ensure the cros_test_runner CLI is installed.
— def execute_luciexe(self):
Execute work via cros_test_runner luciexe binary.
Returns: None
— def is_enabled(self):
Checks if cros_test_runner is enabled for use.
Returns: bool
DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing CrosToolRunner commands
— def create_file_with_container_metadata(self, container_metadata):
Create a temp file with provided container metadata.
Args: container_metadata: (ContainerMetadata) container metadata.
— def ensure_cros_tool_runner(self):
Ensure the CrosToolRunner CLI is installed.
— def find_tests(self, request):
Find tests via the test-finder subcommand.
Args: request: a CrosToolRunnerTestFinderRequest.
— def provision(self, request):
Run provision via the provision subcommand.
Args: request: a CrosToolRunnerProvisionRequest.
— def read_dut_hostname(self):
Return the DUT hostname.
— def test(self, request):
Run test(s) via the test subcommand.
Args: request: a CrosToolRunnerTestRequest.
— def upload_to_tko(self, autotest_dir, results_dir):
Upload test results to TKO via tko-parse. This command does not call into CTR; it directly invokes tko-parse in autotest. For parity with the phosphorus package, the implementation lives here so that any recipe consuming this module can benefit from it.
Args: autotest_dir (str): path to autotest package. results_dir (str): path to test results to upload.
DEPS: cros_infra_config, cros_source, easy, gerrit, git, git_footers, src_state, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/cq, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with CrOS version numbers.
A module for steps that manipulate Chrome OS versions.
— def bump_version(self, dry_run=True):
Bumps the chromeos version (as represented in chromeos_version.sh) and pushes the change to the chromiumos-overlay repo.
Which component is bumped depends on the branch the invoking recipe is running for (main/tot --> build, release-* --> branch).
Args: dry_run (bool): Whether the git push is --dry-run.
— def initialize(self):
Initializes the module.
— def read_workspace_version(self, name='read chromeos version'):
Read the Chrome OS version from the workspace.
Args: name (str): The name to use for the step.
Returns: a Version read from the workspace.
Raises: ValueError: if the version file had unexpected formatting.
@property
— def version(self):
The Version of the workspace checkout.
DEPS: depot_tools/gsutil, recipe_engine/json, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API to archive test results to CTS specific buckets
— def archive(self, d_dir):
Archive CTS result files to CTS specific GS buckets.
This module determines if any CTS results files should be uploaded to the CTS GS buckets and archives them if required.
Args: d_dir: The results directory to process.
DEPS: cros_infra_config, easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for working with debug symbols.
— def ensure_cipd_package(self, cipd_package_location, cipd_ref, package_name):
Use the recipe_engine CIPD api to fetch and store the package locally.
Args: cipd_package_location (str): CIPD location where the package is stored. E.g. chromiumos/infra/upload_debug_symbols/${platform} cipd_ref (String): Instance of package to use. Typically, prod or staging. package_name (String): Name of package minus extra location information. E.g. upload_debug_symbols, manifest_doctor, branch_util.
Returns: Path: Path to the locally stored package.
— def upload_debug_symbols(self, gs_path=None):
Upload debug symbols to the crash service.
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to process tast-results/ directory.
— def track(self, step_name=None, depth=0, timeout=(10 * 60), d=None):
Print out the disk usage under the current directory.
Args: depth (int): The depth to traverse within the subdirs. timeout (int): timeout in seconds. d (str): absolute dir path to start from. If empty, use cwd.
@contextlib.contextmanager
— def tracking_context(self):
A context wrapper for track().
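The context wrapper presumably reports disk usage on entry and again on exit; a minimal sketch with an injectable `track` callable (a hypothetical stand-in for the module's own track() step):

```python
import contextlib


@contextlib.contextmanager
def tracking_context(track):
    """Call |track| on entry and again on exit, even if the body raises."""
    track('before')
    try:
        yield
    finally:
        track('after')
```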
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def create(self, api, properties):
Factory constructor for interfaces.
Args:
Returns: DUTInterface
DEPS: cros_tags, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for easy steps.
A module for easy steps.
— def log_parent_step(self, log_if_no_parent=True):
Creates a short step to log the current builder's parent build ID.
Args: log_if_no_parent: If True and there is no parent build, create an empty step stating that there's no parent build. If False and there is no parent build, do nothing.
— def set_properties_step(self, step_name=None, **kwargs):
An empty step to set properties in output.properties.
Args: step_name (str): The name of the step. kwargs: Keyword arguments to set as properties, key is property name and value is property value. Key must be a string, value may be str, int, float, list, or dict. If value is bytes, it will be cast to string.
Returns: See 'step.call'.
— def stdout_json_step(self, name, cmd, step_test_data=None, test_stdout=None, ignore_exceptions=False, **kwargs):
Runs an easy.step and returns stdout data deserialized from JSON.
Args:
Returns: dict|list: JSON-deserialized stdout data.
— def stdout_jsonpb_step(self, name, cmd, message_type, test_output=None, parse_before_str='', **kwargs):
Runs an easy.step and returns stdout jsonpb-deserialized proto data.
Returns: message_type: JSON-pb deserialized proto message.
— def stdout_step(self, name, cmd, step_test_data=None, test_stdout=None, **kwargs):
Runs an easy.step and returns stdout data.
Args:
Returns: str: Raw stdout data.
— def step(self, name, cmd, stdin=None, stdin_data=None, stdin_json=None, **kwargs):
Convenience features on top of the normal 'step' call.
At most one of |stdin|, |stdin_data|, or |stdin_json| may be specified.
Args:
Returns: See 'step.call'.
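The at-most-one-of constraint on |stdin|, |stdin_data|, and |stdin_json| can be sketched as follows; `prepare_stdin` is a hypothetical helper, and the real module passes the resolved data on to the underlying step call:

```python
import json


def prepare_stdin(stdin=None, stdin_data=None, stdin_json=None):
    """Resolve the at-most-one-of stdin arguments into raw string data.

    Specifying more than one of |stdin|, |stdin_data|, |stdin_json| is
    an error, mirroring the documented constraint.
    """
    supplied = [x for x in (stdin, stdin_data, stdin_json) if x is not None]
    if len(supplied) > 1:
        raise ValueError('at most one of stdin, stdin_data, stdin_json')
    if stdin_json is not None:
        return json.dumps(stdin_json)
    if stdin_data is not None:
        return stdin_data
    return stdin  # may be None: the step gets no stdin
```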
DEPS: cros_infra_config, naming, skylab, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def exonerate_hwtests(self, hw_test_results):
Exonerate the list of HW Test failures based on configs.
Args: hw_test_results([Skylab_Result]): list of failures from the proctor.
Returns: [Skylab_Result] with exonerated tests modified and [str] names of tests that should be treated as success.
— def exonerate_vm_testcase(self, test_case, build_target):
Exonerates a single test case based on configs.
Args: test_case(TestCaseResult in dict form): VM test_case to be conditionally exonerated. build_target(str): build_target on which the test was executed.
Returns: test case dictionary changed based on the decision.
— def exonerate_vm_testcases(self, all_test_cases, build_target):
Exonerates VM test cases based on configs.
Args: all_test_cases([Dict with predefined keys]): VM test_cases to be conditionally exonerated. build_target(str): build_target on which the test was executed.
Returns: list of test cases modified based on configs and the new overall status(common_pb2.status).
— def exonerate_vmtests(self, vm_builds):
Exonerate the list of VM Test failures based on configs.
Args: vm_builds([build_pb2.Build]): list of vm results from the proctor.
Returns: [Build] with exonerated tests modified and [str] names of tests that should be treated as success.
DEPS: cros_infra_config, cros_som, naming, skylab, urls, recipe_engine/buildbucket, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for raising failures and presenting them in cute ways.
A module for presenting errors and raising StepFailures.
— def aggregate_failures(self, failures):
Returns a recipe result based on the given failures.
Only fatal failures cause the whole recipe to fail.
Args: failures (list[Failure]): All failures encountered during execution.
Returns: RawResult: The recipe result, including a human-readable failure summary.
— def format_step_failures(self, step_failures):
Helper function to format the collected failures for presentation.
Args: step_failures (list[Failure]): Collected error messages from exceptions. Returns: formatted markdown string for UI presentation.
— def get_build_failures(self, builds, refresh_configs=False):
Verify all builds completed successfully.
Args: builds (list[build_pb2.Build]): List of completed builds. refresh_configs (bool): Whether to update configs and adjust is_critical.
Returns: list[Failure]: All failures discovered in the given runs.
— def get_build_status(self, build):
Retrieve the status of the build.
Args: build (Build): The buildbucket Build in question.
Returns: status (common_pb2.Status) of the build.
— def get_hw_test_failures(self, hw_tests):
Logs hardware test status to UI, and raises on failed tests.
Args: hw_tests (list[SkylabResult]): List of Skylab suite results.
Returns: list[Failure]: All failures discovered in the given runs filtered by baseline failures.
— def get_hwtest_status(self, hw_test):
Get the status of the hw_test.
Args: hw_test (SkylabResult): The hardware test result in question.
Returns: status (common_pb2.STATUS) of the test.
— def get_vm_test_failures(self, vm_tests):
Logs VM test status to UI, and raises on failed tests.
Args: vm_tests (list[Build]): List of VM test buildbucket results.
Returns: list[Failure]: All failures discovered in the given runs filtered by baseline failures.
@contextlib.contextmanager
— def ignore_exceptions(self):
Catches exceptions and logs them instead.
Should only be used temporarily to prevent new features from crashing the entire recipe. Remove once new feature is stable.
— def is_critical_build_failure(self, build):
Determine if the build failed and was critical.
Args: build (Build): The buildbucket build in question.
Returns: bool: True if the build failed and was critical.
— def is_critical_hw_test_failure(self, hw_test):
Determine if the hardware test failed and was critical.
Args: hw_test (SkylabResult): The hardware test result in question.
Returns: bool: True if the test failed and was critical.
— def is_critical_test_failure(self, test):
Determine if the test is critical and has failed.
Args: test (Build|SkylabResult): The test in question.
Returns: bool: True if the test is critical and has failed.
— def is_hw_test_critical(self, hw_test):
Determine if the hardware test was critical.
Args: hw_test (SkylabResult): The hardware test result in question.
Returns: bool: True if the test was critical.
— def raise_failed_image_tests(self, failed_images):
Display failed image tests and raise a failure.
Displays the images that failed tests and raises a failure if there are failed image tests. If there are no failed image tests, a success message is output.
Args: failed_images (list[chromite.image.Image]): The images that failed tests.
Raises: StepFailure: If failed_images is not empty.
— def set_failed_packages(self, enclosing_step, packages):
If any failed packages, set presentation and raise failure.
Args: enclosing_step (step): The enclosing step to mutate. packages (list[tuple[chromiumos.common.PackageInfo, str]]): The failed packages.
Raises: StepFailure: If failed_packages is not empty.
— def update_non_critical_build_failures(self, failures, fresh_builder_configs, presentation=None):
If builders are now non-critical or removed, failures are non-fatal.
Args: failures (list[Failure]): All failures encountered during execution. fresh_builder_configs (dict(str, BuilderConfig)): name to builder config for all BuilderConfigs that should have criticality checked. presentation (StepPresentation): Parent step presentation. If None, a StepPresentation will be created.
Returns: updated_failures (list[Failure]): The list of Failures with ‘fatal’ statuses possibly updated.
— def update_non_critical_test_failures(self, failures, test_plan_summary, presentation=None):
If tests are now non-critical or removed, failures are non-fatal.
Args: failures (list[Failure]): All failures encountered during execution. test_plan_summary (dict{string: bool}): Map of test display name to criticality against which to check test failures. presentation (StepPresentation): Parent step presentation. If None, a StepPresentation will be created.
Returns: updated_failures (list[Failure]): The list of Failures with ‘fatal’ statuses possibly updated.
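The criticality update described above can be sketched in plain Python, with dicts standing in for the Failure protos (field names here are illustrative, not the proto's):

```python
def update_non_critical_test_failures(failures, test_plan_summary):
    # failures: list of {'name': str, 'fatal': bool} stand-ins for Failure.
    # test_plan_summary: map of test display name -> criticality (bool).
    updated = []
    for f in failures:
        critical = test_plan_summary.get(f['name'], False)
        # A failure stays fatal only if the test still exists in the plan
        # and is still marked critical; otherwise it becomes non-fatal.
        updated.append({'name': f['name'], 'fatal': f['fatal'] and critical})
    return updated

failures = [
    {'name': 'suite-a', 'fatal': True},   # still critical
    {'name': 'suite-b', 'fatal': True},   # now non-critical
    {'name': 'suite-c', 'fatal': True},   # removed from the plan
]
summary = {'suite-a': True, 'suite-b': False}
updated = update_non_critical_test_failures(failures, summary)
```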
DEPS: easy, recipe_engine/futures
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module that interacts with the GCE Provider config service.
Depends on ‘prpc’ binary available in $PATH: https://godoc.org/go.chromium.org/luci/grpc/cmd/prpc
— def get_current_config(self, ids):
Function to retrieve the current config from GCE Provider.
Args: ids (list): A list of all the config prefixes to retrieve.
Returns: Configs, list of GCE Provider Config objects.
Raises: NoneConfigFailure: If the GCE Provider Get call returns None.
— def update_gce_config(self, bid, config):
Function to update the config in GCE Provider.
Args: bid (str): Bot group prefix to update. config (Config): GCE Provider config object.
Returns: Config, GCE Provider Config definition with updated values.
DEPS: cros_infra_config, easy, overlayfs, depot_tools/gsutil, recipe_engine/archive, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to interact with Google Cloud.
— def __init__(self, *args, **kwargs):
Initialize GcloudApi.
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def attach_disk(self, name, instance, disk, zone):
Attach a disk to a GCE instance.
As a disk is attached, the disk is then added to the stack that is used by the context manager to detach as the task ends.
Args: name (str): An alphanumeric name for the mount, used for display. instance(str): GCE instance on which disk will be attached. disk(str): Google Cloud disk name. zone(str): GCE zone to create instance (e.g. us-central1-b).
— def auth_list(self, step_name=None):
Print out the auth creds currently on the bot.
Args: step_name(str): Name of the step.
@property
— def branch(self):
— def check_for_disk_mount(self, mount_path):
Check whether there is a disk mounted on given path.
Args: mount_path (str): System path on which the disk is mounted.
Returns: Bool indicating whether there is a disk mounted on the path.
@contextlib.contextmanager
— def cleanup_gce_disks(self):
Wrap disk cleanup in a context handler to ensure they are handled.
Upon exiting the context manager, each attached disk is then iterated through to unmount, detach, and delete the disk.
@contextlib.contextmanager
— def cleanup_mounted_disks(self):
Wrap disk cleanup in a context handler to ensure they are unmounted.
Upon exiting the context manager, each mounted disk is then iterated through and unmounted.
— def create_disk_from_image(self, disk, zone, image, disk_type=None):
Create a GCE disk from the supplied image.
Args: disk(str): Google Cloud disk name. zone(str): GCE zone to create disk (e.g. us-central1-b). image(str): Image version use to create the disk. disk_type(str): Type of GCE disk to create.
Returns: The stdout of the gcloud command.
— def create_image(self, image_name, source_uri=None, licenses=None):
Creates an image in the GCE project.
Args: image_name(str): Name of the image. source_uri(str, optional): Sets the --source-uri flag if the image source is a tarball. See gcloud docs for detail. licenses(list[str], optional): List of image licenses to apply.
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def create_image_from_disk(self, disk, image_name, zone):
Create an image from specified disk.
Args: disk(str): Google Cloud disk name. image_name(str): The name to give the image. zone(str): GCE zone to create instance (e.g. us-central1-b).
— def create_instance(self, image, project, machine, zone, network=None, subnet=None):
Create an instance in the GCE project.
Args: image(str): GCE image to use for the instance. project(str): Google Cloud project name. machine(str): GCE machine type zone(str): GCE zone to create instance (e.g. us-central1-b). network(str): Network name to use. subnet(str): Network subnet on which to create instance.
Returns: Tuple[str, str]: (name, ip_addr) of the instance.
— def delete_disk(self, disk, zone):
Delete a GCE disk.
Permanently delete a GCE disk from the project.
Args: disk(str): Google Cloud disk name. zone(str): GCE zone to create instance (e.g. us-central1-b).
— def delete_image(self, image_name):
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def delete_images(self, images):
Delete the list of provided images from GCE.
Args: images(list|str): A list of image names.
— def delete_instance(self, instance, project, zone):
Delete a GCE instance.
Args: instance(str): GCE instance to be deleted. project(str): Google Cloud project name. zone(str): GCE zone to create instance (e.g. us-central1-b).
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def detach_disk(self, instance, disk, zone):
Detach a disk from a GCE instance.
As a disk is detached, it is removed from the stack used by the context manager to detach disks as the task ends.
Args: instance(str): GCE instance disk is attached. disk(str): Google Cloud disk name. zone(str): GCE zone to create instance (e.g. us-central1-b).
— def determine_disks_to_delete(self, disks, instances):
Determines the list of orphaned disks to delete.
Args: disks(dict): Dict of all GCE disks and their zones. instances(list|str): List of all GCE instances.
Returns: Dictionary containing disk name and zone to delete.
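The orphaned-disk computation can be sketched as a set difference (a minimal illustration with plain dicts and sets; the real method parses gcloud output):

```python
def determine_disks_to_delete(disks, attached):
    # disks: {disk_name: zone} for every disk in the project.
    # attached: set of disk names currently referenced by some instance.
    # A disk is orphaned (safe to delete) when no instance references it.
    return {name: zone for name, zone in disks.items() if name not in attached}

disks = {'cache-1': 'us-central1-b', 'cache-2': 'us-central1-c'}
attached = {'cache-1'}
orphans = determine_disks_to_delete(disks, attached)
# orphans == {'cache-2': 'us-central1-c'}
```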
— def disk_attached(self, disk_name):
Check whether a disk is attached to an instance.
Args: disk_name(str): Disk name to match for device id.
Returns: Bool of whether the disk is attached or not.
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def disk_exists(self, disk, zone):
Check whether a disk exists.
Args: disk(str): Name of the disk to check. zone(str): GCE zone in which the disk exists.
Returns: Bool of whether the disk exists or not.
@property
— def disk_short_name(self):
@property
— def gce_disk(self):
@property
— def gce_disk_blkid(self):
@property
— def gce_name_limit(self):
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def get_expired_images(self, retention_days, prefixes, protected_images=None):
Calculate the list of images that have expired.
Args: retention_days(int): Number of days to retain. prefixes(list|str): List of prefixes to filter. protected_images(list|str): List of images to preserve.
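The expiry calculation can be sketched as a filter over image creation times (assumptions: image names and creation timestamps are already fetched; the real method queries GCE):

```python
import datetime

def get_expired_images(images, retention_days, prefixes,
                       protected_images=None, now=None):
    # images: {image_name: creation datetime}. An image has expired when it
    # matches one of the prefixes, is older than the retention window, and
    # is not in the protected list.
    now = now or datetime.datetime.utcnow()
    protected = set(protected_images or ())
    cutoff = now - datetime.timedelta(days=retention_days)
    return sorted(
        name for name, created in images.items()
        if name.startswith(tuple(prefixes))
        and created < cutoff
        and name not in protected
    )

now = datetime.datetime(2024, 1, 31)
images = {
    'snapshot-old': datetime.datetime(2024, 1, 1),
    'snapshot-new': datetime.datetime(2024, 1, 30),
    'snapshot-keep': datetime.datetime(2024, 1, 1),
    'other-old': datetime.datetime(2024, 1, 1),
}
expired = get_expired_images(images, 14, ['snapshot-'],
                             protected_images=['snapshot-keep'], now=now)
# expired == ['snapshot-old']
```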
— def get_instance_serial_output(self, instance, project, zone):
@property
— def host_zone(self):
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def image_exists(self, image):
Check whether an image exists.
Args: image(str): Name of the image to check.
Returns: Bool of whether the image exists or not.
@property
— def infra_host(self):
— def initialize(self):
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def list_all_disks(self):
Pulls a list of all disks that exist.
Returns: A dictionary containing disk name and zone.
— def list_all_instances(self):
Pulls a list of all instances that exist.
— def lookup_device_id(self, disk_name):
Look up the device id in /dev/disk/by-id by name.
Args: disk_name (str): The name associated with the attached device.
Returns: The device id of the provided disk name, defaulting to None if no disk can be found.
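The lookup can be sketched as a path check (assumption: GCE exposes attached persistent disks under /dev/disk/by-id as `google-<disk name>` links; the demo below substitutes a temporary directory for that path):

```python
import os
import tempfile

def lookup_device_id(disk_name, by_id_dir='/dev/disk/by-id'):
    # Attached GCE disks conventionally appear as 'google-<disk name>'
    # entries under /dev/disk/by-id (assumption); return the matching
    # device path, or None if no such entry exists.
    candidate = os.path.join(by_id_dir, 'google-' + disk_name)
    return candidate if os.path.exists(candidate) else None

# Demonstrate against a temporary stand-in for /dev/disk/by-id.
fake_by_id = tempfile.mkdtemp()
open(os.path.join(fake_by_id, 'google-cache-disk'), 'w').close()
found = lookup_device_id('cache-disk', by_id_dir=fake_by_id)
missing = lookup_device_id('no-such-disk', by_id_dir=fake_by_id)
```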
— def mount_disk(self, name, mount_path, recipe_mount=False):
Mount an attached disk to host.
As a disk is mounted, the disk is then added to the stack that is used by the context manager to unmount as the task ends.
Args: name (str): An alphanumeric name for the mount, used for display. mount_path(str): Directory to mount the disk. recipe_mount(bool): Whether mount needs to be in the path to use within a recipe.
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def resize_disk(self, disk, zone, size):
Resize the GCE disk above the default of 200GB.
Args: disk(str): Google Cloud disk name. zone(str): GCE zone which the disk is located. size(str): New size of the disk in GB.
@exponential_retry(retries=3, delay=datetime.timedelta(seconds=30))
— def set_disk_autodelete(self, instance, disk, zone):
Set a disk to autodelete when a GCE instance is deleted.
GCE disks are not deleted by default when their instance is deleted, so we flip the metadata to ensure the disks are cleaned up when the instance is removed.
Args: instance(str): GCE instance on which disk is attached. disk(str): Google Cloud disk name. zone(str): GCE zone to create instance (e.g. us-central1-b).
— def set_gce_project(self, project):
Set the default project for gcloud commands.
Args: project(str): Google Cloud project name.
— def setup_cache_disk(self, cache_name, branch=‘main’, disk_type=‘pd-standard’, disk_size=None, recipe_mount=False, disallow_previously_mounted=False, mount_existing=False):
Create disk from snapshot, reuse if still attached.
Check if disk is attached, otherwise grab the matching snapshot, create, attach, and mount the source disk.
Args: cache_name(str): Name of the cache file to use. branch(str): Git branch. disk_type(str): Type of GCE disk to create, defaults to standard persistent disk. disk_size(str): Size of the disk to create in GB, defaults to image size. recipe_mount(bool): Whether mount needs to be in the path to use within a recipe. disallow_previously_mounted(bool): If set, this step will fail if the cache is already mounted. mount_existing(bool): If we find an existing cache, mount it immediately instead of relying on subsequent step.
@property
— def snapshot_builder_mount_path(self):
Returns a Path to the base mount directory for cache builder.
@property
— def snapshot_mount_path(self):
The path to mount the snapshot disks.
This is the path that the disks created from image will be mounted.
@property
— def snapshot_suffix(self):
@property
— def snapshot_version_file(self):
@property
— def snapshot_version_path(self):
The path to the local version file.
This is the path to the local version file that contains the image version that was used to create the local named cache.
— def sync_disk_cache(self, name):
Force a local disk cache sync before snapshotting.
Args: name (str): Disk name to use to lookup the mount location.
— def unmount_disk(self, name, mount_path):
Unmount an attached disk from the host.
As a disk is unmounted, the disk is then removed from the stack that is used by the context manager to unmount as the task ends.
Args: name (str): An alphanumeric name for the mount, used for display. mount_path(str): Directory where the disk is mounted.
— def update_fstab(self, mount_path, name):
Update fstab for a mounted disk.
Args: mount_path(str): Directory where the disk is mounted. name (str): An alphanumeric name for the mount, used for display.
DEPS: easy, git, git_cl, gitiles, repo, src_state, support, depot_tools/gerrit, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for managing Gerrit changes.
A module for Gerrit helpers.
— def __init__(self, *args, **kwargs):
Initialize GerritApi.
— def abandon_change(self, gerrit_change, message=None):
Abandon the given change.
Args: gerrit_change (GerritChange): The change to abandon. message (str): Optional message to post to change.
— def add_change_comment(self, gerrit_change, comment, project_path=None):
Add a comment to the given Gerrit change.
Args: gerrit_change (GerritChange): The change to post to. comment (str): The comment to post. project_path (Path): If set, will use this as the project path rather than any value inferred from the gerrit_change.
Returns: str: The new message ref (primarily for testing).
— def assert_changes_submittable(self, gerrit_changes, test_output_data=None):
Checks if the provided changes can be merged onto their Git branches.
Args: gerrit_changes (list(common_pb2.GerritChange)): the changes to check
Raises: StepFailure if the changes cannot be merged.
— def create_change(self, project, reviewers=None, ccs=None, topic=None, ref=None, hashtags=None, project_path=None):
Create a Gerrit change for the most recent commits in the given project.
Assumes one or more local commits exists in the project. The commit message is always used as the CL description.
Args: project (str|Path): Any path within the project of interest, or the project name. reviewers (list[str]): List of reviewer emails. If specified, gerrit will email the reviewers. ccs (list[str]): List of cc emails. If specified, gerrit will cc the individuals. topic (str): Topic to set for the CL. ref: --target-branch argument to be passed to git cl upload. Should be a full git ref, e.g. refs/heads/main (NOT just ‘main’). hashtags (list[str]): List of hashtags to set for the CL. project_path (Path): If set, will use this as the project path rather than any value inferred from the gerrit_change.
Returns: GerritChange: The newly created change.
— def fetch_patch_set_from_change(self, change, include_files=False, test_output_data=None):
Fetch and return PatchSet associated with the given GerritChange.
Assumes that change.patchset is set (which is not always the case). The step fails if the specific patch set is not found.
Args: change (GerritChange): Buildbucket GerritChange to fetch. include_files (bool): If True, include information about changed files. test_output_data (dict): Test output for gerrit-fetch-changes.
Returns: PatchSet: The corresponding PatchSet.
— def fetch_patch_sets(self, gerrit_changes, include_files=False, include_commit_info=False, include_messages=False, test_output_data=None):
Fetch and return PatchSets from Gerrit.
The step fails if any patch set is not found.
Args: gerrit_changes (List[GerritChange]): Buildbucket GerritChanges to fetch. include_files (bool): If True, include information about changed files. include_commit_info (bool): If True, include information about the commit. include_messages (bool): If True, include messages attached to the commit. test_output_data (dict): Test output for gerrit-fetch-changes.
Returns: List[PatchSet]: List of PatchSets in requested order.
@property
— def gerrit_patch_sets(self):
The gerrit patches last fetched.
These may or may not include files, but always include commit info.
— def get_change_description(self, gerrit_change, memoize=False):
Get the description of the given Gerrit change.
Args: gerrit_change (GerritChange): The change of interest. memoize (bool): Should we consult a local cache for the change id instead of fetching from gerrit.
Returns: str: The change description.
— def parse_gerrit_change(self, gerrit_change_url):
Parse GerritChange proto from a gerrit change URL.
This function expects the URL to be formatted as:
https://{host}-review.googlesource.com/c/{project}/+/{change number}
Args: gerrit_change_url (str): The change URL.
Returns: GerritChange: The parsed proto.
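A minimal parsing sketch with a regex, assuming the conventional `https://<host>/c/<project>/+/<change number>` URL shape (the dict below stands in for the GerritChange proto):

```python
import re

# Assumed Gerrit change-URL shape; the real module may accept more forms.
_CHANGE_URL_RE = re.compile(
    r'https://(?P<host>[^/]+)/c/(?P<project>.+)/\+/(?P<change>\d+)')

def parse_gerrit_change(url):
    m = _CHANGE_URL_RE.match(url)
    if not m:
        raise ValueError('not a Gerrit change URL: %s' % url)
    # Stand-in dict for the GerritChange proto fields.
    return {
        'host': m.group('host'),
        'project': m.group('project'),
        'change': int(m.group('change')),
    }

change = parse_gerrit_change(
    'https://chromium-review.googlesource.com/c/chromiumos/overlays/+/123456')
```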
— def parse_gerrit_change_url(self, gerrit_change):
Transform a GerritChange proto into a Gerrit change URL.
Args: gerrit_change (GerritChange): The change in question.
Returns: str: The Gerrit URL.
— def parse_qualified_gerrit_host(self, gerrit_change):
Transform a GerritChange proto into a fully qualified host.
Args: gerrit_change (GerritChange): The change in question.
Returns: str: The fully qualified Gerrit host.
— def query_changes(self, host, query_params):
Query gerrit for the given changes.
Args: host (str): The Gerrit host to query. query_params (list[(str, str)]): Query parameters as list of (key, value) tuples to form a query as documented here: https://gerrit-review.googlesource.com/Documentation/user-search.html#search-operators
Returns: list[GerritChange]: Changes that match the query.
— def set_change_description(self, gerrit_change, description, amend_local=False, project_path=None):
Set the description of the given Gerrit change.
Args: gerrit_change (GerritChange): The change of interest. description (str): The new description, in full. Be sure this still includes the Change-Id and other essential metadata. amend_local (bool): Should you amend the description of the HEAD local change as well. project_path (Path): If set, will use this as the project path rather than any value inferred from the gerrit_change.
— def set_change_labels(self, gerrit_change, labels, branch=None, ref=None):
(Deprecated) Set the given labels for the given Gerrit change.
This function is deprecated. Use set_change_labels_remote where possible.
Args: gerrit_change (GerritChange): The change of interest. labels (dict): Mapping from label (Label) to value (int). branch (str): The remote branch to update. ref (str): The remote ref to update.
Returns: str: The ref used to push the labels.
— def set_change_labels_remote(self, gerrit_change, labels):
Set the given labels for the given Gerrit change. set_change_labels only works when the change exists in the local checkout. This function should be used in other cases.
Args: gerrit_change (GerritChange): The change of interest. labels (dict): Mapping from label (Label) to value (int).
Returns: str: The applied labels (primarily for testing).
— def submit_change(self, gerrit_change, retries=0, project_path=None):
Submits the given change.
Args: gerrit_change (GerritChange): The change to submit. retries (int): How many times to retry ‘git cl land’ should it fail. project_path (Path): If set, will use this as the project path rather than any value inferred from the gerrit_change.
DEPS: src_state, util, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with git.
A module for interacting with git.
— def add(self, paths):
Add/stage paths.
Stages paths for commit. Note that this will fail if a file is tracked and not modified, which you can use diff_check to check for.
Args: paths (list[str|Path]): The file paths to stage.
— def amend_head_message(self, message, **kwargs):
Runs ‘git commit --amend’ with the given description.
Args: message (str): The commit message. kwargs (dict): Passed to recipe_engine/step.
— def author_email(self, commit_id):
Returns the email of the author of the given commit.
Args: commit_id (str): The commit in question.
Returns: (str): commit author email.
— def branch_exists(self, branch):
Check if a branch exists.
Args: branch (str): The branch name to check.
Returns: (bool) Whether or not the branch exists.
— def checkout(self, commit, force=False, branch=None):
Runs ‘git checkout’.
Args: commit (str): The commit (technically “tree-like”) to checkout. force (bool): If True, throw away local changes (--force).
— def cherry_pick(self, commit, **kwargs):
Runs ‘git cherry-pick’.
Args: commit (str): The commit to cherry pick. kwargs (dict): Passed to recipe_engine/step.
— def clone(self, repo_url, target_path=None, reference=None, dissociate=False, branch=None, single_branch=False, depth=None, timeout_sec=None, verbose=False, progress=False):
Clones a Git repo into the current directory.
Args: repo_url (str): The URL of the repo to clone. target_path (Path): Path in which to clone the repo, or None to specify current directory. reference (Path): Path to the reference repo. dissociate (bool): Whether to dissociate from reference. branch (string): If set, performs a single branch clone of that branch. single_branch (bool): If set, performs a single branch clone of the default branch. depth (int): If set, creates a shallow clone at the specified depth. timeout_sec (int): Timeout in seconds. verbose (bool): If set, run git clone as verbose. progress (bool): If set, print progress to stdout.
— def commit(self, message, files=None, author=None, **kwargs):
Runs ‘git commit’ with the given files.
Args: message (str): The commit message. files (list[str|Path]): A list of file paths to commit. author (str): The author to use in the commit. Ordinarily not used, added to test permission oddities by forcing forged commit failure. kwargs (dict): Passed to recipe_engine/step.
— def create_branch(self, branch, remote_branch=None):
Create a branch.
Args: branch (str): The name of the branch to create. remote_branch (str): The remote branch to track, if set.
— def create_bundle(self, output_path, from_commit, to_ref):
Creates a git bundle file.
Creates a git bundle (see ‘man git-bundle’) containing the commits from |from_commit| (exclusive) to |to_ref| (inclusive).
Args: output_path (Path): Path to create bundle file at. from_commit (str): Parent commit (exclusive) for bundle. to_ref (str): Reference to put in bundle.
— def current_branch(self):
Returns the currently checked out branch name.
Returns: (str): The branch name pointed to by HEAD. None: If HEAD is detached.
— def diff_check(self, path):
Check if the given file changed from HEAD.
Args: path (str|Path): The file path to check for changes.
Returns: (bool): True if the file changed from HEAD (or doesn't exist), False otherwise.
— def extract_branch(self, refspec, default=None):
Splits the branch from the refspec.
Splits the branch from a refs/heads refspec and returns it. Returns default if the refspec is not of the required format.
Args: refspec (str): refspec to split the branch from. default (str): value to return if refspec not of required format.
Returns: (str): The extracted branch name.
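The refspec split can be sketched as a simple prefix check (a minimal illustration, not the module's exact implementation):

```python
def extract_branch(refspec, default=None):
    # Return the branch name from a refs/heads refspec, or `default`
    # when the refspec does not have that form.
    prefix = 'refs/heads/'
    if refspec and refspec.startswith(prefix):
        return refspec[len(prefix):]
    return default

release = extract_branch('refs/heads/release-R120')   # 'release-R120'
fallback = extract_branch('refs/tags/v1.0', default='main')  # 'main'
```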
— def fetch(self, remote, refs=None, timeout_sec=None, retries=3):
Runs ‘git fetch’.
Args: remote (str): The remote repository to fetch from. refs (list[str]): The refs to fetch. timeout_sec (int): Timeout in seconds. retries (int): Number of times to retry.
— def fetch_ref(self, remote, ref, timeout_sec=None):
Fetch a ref, and return the commit ID (SHA).
Args: remote (str): The remote repository to fetch from. ref (str): The ref to fetch. timeout_sec (int): Timeout in seconds.
Returns: (str): The commit ID (SHA) of the fetched ref.
— def fetch_refs(self, remote, ref, timeout_sec=None, count=1, test_ids=None):
Fetch a list of remote refs.
Args: remote (str): The remote repository to fetch from. ref (str): The ref to fetch. timeout_sec (int): Timeout in seconds. count (int): The number of commit IDs to return. test_ids (list[str]): List of test commit IDs, or None.
Returns: (list[str]): The commit IDs, starting with the fetched ref.
— def get_branch_ref(self, branch):
Creates the full ref for a branch.
Returns a ref of the form refs/heads/{branch}.
Args: branch (str): The branch name.
Returns: (str): The ref for the branch.
— def get_diff_files(self, from_rev=None, to_rev=None, test_stdout=None):
Runs ‘git diff’ to find files changed between two revs.
Revs are passed directly to ‘git diff’, which has the following effect: 0 revs - Changes between working directory and index. 1 rev - Changes between working directory and the given commit. 2 revs - Changes between the two commits.
Args: from_rev (str): First revision (see ‘man 7 gitrevisions’) to_rev (str): Second revision
Returns: (list[str]): changed files.
— def get_parents(self, commit_id, test_contents=None):
Runs ‘git log’ to determine the parents of a git commit.
Args: commit_id (str): The commit hash.
Returns: (list[str]): parent commit hash(es).
— def get_working_dir_diff_files(self):
Finds all changed files (including untracked).
— def gitiles_commit(self, test_remote=‘cros-internal’, test_url=None):
Return a GitilesCommit for HEAD.
Args: test_remote (str): The name of the remote, for tests. test_url (str): Test data: url for the remote, for tests.
Returns: (GitilesCommit): The GitilesCommit corresponding to HEAD.
— def head_commit(self):
Returns the HEAD commit ID.
@contextlib.contextmanager
— def head_context(self):
Returns a context that will revert HEAD when it exits.
— def is_merge_commit(self, commit_id):
Determines if the commit_id is a merge commit.
Args: commit_id (str): The commit sha.
Returns: (bool): whether the commit has more than 1 parent.
— def is_reachable(self, revision, head=‘HEAD’):
Check if the given revision is reachable from HEAD.
Args: revision (str): A git revision to search for. head (str): The starting revision. Default: HEAD.
Returns: (bool): Whether the revision is reachable from (is an ancestor of) |head|.
— def log(self, from_rev, to_rev, limit=None, paths=None):
Returns all the Commits between from_rev and to_rev.
Args: from_rev (str): From revision to_rev (str): To revision limit (int): Maximum number of commits to log. paths (list[str]): pathspecs to use.
Returns: (list[Commit]): A list of commit metas.
— def ls_remote(self, refs, repo_url=None):
Return ls-remote output for a repository.
Args: refs (list[str]): The refs to list. repo_url (str): The url of the remote, or None to use CWD.
Returns: (list[Reference]): A list of Refs.
— def merge(self, ref, message, *args, **kwargs):
Runs ‘git merge’.
Args: ref (str): The ref to merge. message (str): The merge commit message. args (tuple): Additional arguments to git merge. kwargs (dict): Passed to recipe_engine/step.
— def merge_abort(self):
Runs ‘git merge --abort’.
— def merge_base(self, *args, **kwargs):
Return the output from ‘git merge-base’.
Args: args (tuple): Additional arguments to git merge-base. kwargs (dict): Passed to recipe_engine/step.
Returns: (str) stdout of the command, or None for errors.
— def merge_silent_fail(self, ref, message, **kwargs):
Runs ‘git merge’ and returns whether the merge succeeded.
This won't generally throw a StepError.
Args: ref (str): The ref to merge. message (str): The merge commit message. kwargs (dict): Passed to recipe_engine/step.
Returns: (bool): whether the merge succeeded
— def push(self, remote, refspec, dry_run=False, capture_stdout=False, capture_stderr=False, retry=True, force=False, **kwargs):
Runs ‘git push’.
Args: remote (str): The remote repository to push to. refspec (str): The refspec to push. dry_run (bool): If true, set --dry-run on git command. capture_stdout (bool): If True, return stdout in step data. capture_stderr (bool): If True, return stderr in step data. retry (bool): Whether to retry. Default: True force (bool): add force flag for git push kwargs (dict): Passed to api.step.
Returns: (StepData): See ‘step.call’.
— def rebase(self, force=False, branch=None, strategy_option=None):
Run ‘git rebase’ with the given arguments.
Args: force (bool): If True, set --force. branch (str): If set, rebase from specific branch. strategy_option (str): If set, sets the --strategy-option flag. See ‘git help rebase’ for details.
— def remote(self):
Return the name of the remote.
Returns: (str): name of the remote, e.g. ‘origin’ or ‘cros’.
— def remote_head(self, remote=‘.’, test_stdout=None):
Returns the HEAD ref of the given remote.
Args: remote (str): remote name to query, by default remote of current branch
Returns: (str): ref contained in the remote HEAD (ie the default branch), or None on error.
@exponential_retry(retries=20, delay=timedelta(minutes=1))
— def remote_update(self, step_name, timeout_sec=None):
Runs ‘git remote update’.
Args: step_name (str): Name of the step to display. timeout_sec (int): Timeout in seconds.
— def remote_url(self, remote=‘origin’):
Get the URL for a defined remote.
Args: remote (str): The name of the remote to query
Returns: URL to the remote on success
— def repository_root(self, step_name=None):
Return the git repository root for the current directory.
Args: step_name (str): the step name to use instead of the default.
Returns: (str): The path to the git repository.
— def set_global_config(self, args):
Runs ‘git config --global’ to set global config.
Args: args (list[str]): args for ‘git config’.
— def set_upstream(self, remote, branch):
Set the upstream for the given branch.
Args: remote (str): The remote repository to track. branch (str): The remote branch to push to.
Returns: (StepData): See ‘step.call’.
— def show_file(self, rev, path, test_contents=None):
Returns the contents of the given file path at the given revision.
Args: rev (str): The revision to return the contents from. path (str): The file path to return the contents of.
Returns: (str): The contents of the file, None if the file does not exist in |rev|.
DEPS: depot_tools/depot_tools, depot_tools/git_cl, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with git cl.
A module for interacting with git cl.
— def issues(self):
Run ‘git cl issue’.
Returns: dict: Map between ref and issue number, e.g. {‘refs/heads/main’: ‘3402394’}.
— def status(self, field=None, fast=False, issue=None, **kwargs):
Run ‘git cl status’ with the given arguments.
Args: field: Set --field to this value. fast: Set --fast. issue: Set --issue to this value. kwargs: Passed to recipe_engine/step. May NOT set stdout.
Returns: str: The command output.
— def upload(self, topic=None, reviewers=None, ccs=None, hashtags=None, send_mail=False, target_branch=None, dry_run=False, **kwargs):
Run ‘git cl upload’.
--force and --bypass-hooks are always set to remove the need to enter confirmations and address nits.
Args: topic (str): Optional --topic to set. reviewers (list[str]): Optional list of --reviewers to set. ccs (list[str]): Optional list of --cc to set. hashtags (list[str]): Optional list of --hashtags to set. send_mail (bool): If true, set --send-mail. target_branch (str): Optional --target-branch to send to. Needs to be a full ref (e.g. refs/heads/branch), not the branch name (e.g. branch). kwargs (dict): Forwarded to recipe_engine/step. May NOT set stdout. dry_run (bool): If true, set --cq-dry-run.
Returns: str: The command output.
DEPS: gerrit, depot_tools/depot_tools, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API wrapping the git_footers script.
A module for calling git_footers.
— def __call__(self, *args, **kwargs):
Call git_footers.py with the given args.
Args: args: Arguments for git_footers.py kwargs: Keyword arguments for python call.
Returns: list[str]: All matching footer values, or None.
— def edit_add_change_description(self, change_message, footer, footer_text):
Edit or add the given footer to the change_message.
Args: change_message (str): The gerrit change message. footer (str): The name of the footer, e.g. “Cq-Depends” footer_text (str): The value of the footer. If footer_text starts with "{footer}: ", that prefix will be ignored.
Returns: str: Modified change_message.
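The edit-or-add behavior can be sketched as a line scan over the change message (a minimal illustration; the real module shells out to git_footers.py):

```python
def edit_add_change_description(change_message, footer, footer_text):
    # Replace an existing '<footer>: ...' line, or append one if absent.
    # If footer_text already starts with '<footer>: ', drop that prefix.
    prefix = footer + ': '
    if footer_text.startswith(prefix):
        footer_text = footer_text[len(prefix):]
    lines = change_message.rstrip('\n').split('\n')
    for i, line in enumerate(lines):
        if line.startswith(prefix):
            lines[i] = prefix + footer_text
            break
    else:
        lines.append(prefix + footer_text)
    return '\n'.join(lines) + '\n'

msg = edit_add_change_description(
    'Fix the build\n\nBug: b/1234\n', 'Cq-Depend', 'chromium:5678')
# A second edit replaces the footer instead of duplicating it.
msg2 = edit_add_change_description(msg, 'Cq-Depend', 'Cq-Depend: chromium:9999')
```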
— def from_gerrit_change(self, gerrit_change, key=None, memoize=True, **kwargs):
Return the footer value(s) in the commit message for the given key.
Args: gerrit_change (GerritChange): The change of interest. key (str): The footer key to look for. If not set, returns all footers found in the Gerrit change message. Note that if this parameter is set, it is EXCLUDED from the returned footer string(s). If it is not set, the footers are formatted as ‘key: value’. memoize (bool): Should we memoize the call (default: True).
Returns: list[str]: The footer value(s) found in the commit message.
— def from_message(self, message, key=None, **kwargs):
Return the footer value(s) in the commit message for the given key.
Args: message (str): The git commit message. key (str): The footer key to look for. If not set, returns all footers found in the message. Note that if this parameter is set, it is EXCLUDED from the returned footer string(s). If it is not set, the footers are formatted as ‘:’.
Returns: list[str]: The footer value(s) found in the commit message.
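A minimal sketch of the extraction semantics described above, assuming the git convention that footers occupy the final paragraph of the message. The function name and regex are hypothetical, not the module's actual code.

```python
import re

def footers_from_message(message, key=None):
    """Sketch: return footer values from the last paragraph of a message."""
    # Footers live in the final paragraph, one "Key: value" per line.
    last_para = message.strip().split("\n\n")[-1]
    footers = []
    for line in last_para.splitlines():
        m = re.match(r"^([A-Za-z][A-Za-z0-9-]*): (.*)$", line)
        if not m:
            continue
        if key is None:
            # No key requested: return formatted "key: value" strings.
            footers.append("%s: %s" % (m.group(1), m.group(2)))
        elif m.group(1) == key:
            # Key requested: the key itself is excluded from the result.
            footers.append(m.group(2))
    return footers
```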
— def from_ref(self, ref, key=None, **kwargs):
Return the footer value(s) in the given ref for the given key.
Args: ref (str): The git ref. key (str): The footer key to look for. See from_message docstring.
Returns: list[str]: The footer value(s) found in the ref's commit message.
— def get_footer_values(self, gerrit_changes, key, **kwargs):
Gets a list of values from a footer.
Fetches the named footer from the gerrit changes, and returns a set of all of the (comma-separated) values found.
Args: gerrit_changes ([common_pb2.GerritChange]): Gerrit changes applied to this run. key (str): The footer name (key) to fetch. kwargs (dict): Other keyword arguments, passed to git_footers.from_gerrit_change.
Returns: values (set(str)): A set of values. May be empty.
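The comma-splitting described above can be sketched as follows; the helper name is hypothetical.

```python
def footer_value_set(footer_values):
    """Sketch: split comma-separated footer values into a flat set."""
    values = set()
    for raw in footer_values:
        for item in raw.split(","):
            item = item.strip()
            if item:  # drop empty fragments from trailing commas
                values.add(item)
    return values
```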
— def position_num(self, ref, test_position_num=None, **kwargs):
Return the footer value for Cr-Commit-Position.
Args: ref (str): The git ref. test_position_num (int): The test value. step_test_data, if given, will override this. **kwargs (dict): keyword arguments for self.call().
Returns: list[str]: The position number for the ref.
DEPS: gerrit, git, repo, recipe_engine/context, recipe_engine/file, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for updating remote git repositories transactionally.
A module for executing git transactions.
— def update_ref(self, remote, update_callback, step_name='update ref', ref=None, dry_run=False, automerge=False, retries=3):
Transactionally update a remote git repository ref.
|update_callback| will be called and should update the checked out HEAD by e.g. committing a new change. Then this new HEAD will be pushed back to the |remote| |ref|. If this push fails because the remote ref was modified in the meantime, and automerge is off, the new ref is fetched and checked out, and the process will repeat up to |retries| times.
The common case is that there's no issue updating the ref, so we don't do a fetch and checkout before attempting to update. This means that the function assumes that the repo is already checked out to the target ref.
This step expects to be run with cwd inside a git repo.
Args: remote (str): The remote repository to update. update_callback (callable): The callback function that will update the local repo's HEAD. The callback is passed no arguments. If the callback returns False, the update is cancelled but the call still succeeds. step_name (str): Step name to be displayed in the logs. ref (str): The remote ref to update. If it does not start with ‘refs/’ it will be treated as a branch name. If not specified, the HEAD ref for the remote of the current repo project will be used. dry_run (bool): If set, pass --dry-run to git push. automerge (bool): Whether to use Gerrit's “auto-merge” feature. retries (int): Number of update attempts to make before failing.
Returns: bool: True if the transaction succeeded, false if it explicitly aborts.
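The transactional retry loop described above can be sketched in isolation. The callables stand in for git operations (push, and fetch-plus-checkout) and are hypothetical; the real module drives actual git commands.

```python
def update_ref_with_retries(push, refresh, update_callback, retries=3):
    """Sketch of the transactional retry loop: commit, push, and on a
    lost race refresh the checkout and try again."""
    for _attempt in range(retries):
        if update_callback() is False:
            return True  # callback cancelled the update; still a success
        if push():  # try to push the new HEAD to the remote ref
            return True
        refresh()  # remote moved underneath us: fetch + checkout, retry
    raise RuntimeError("TooManyAttempts: gave up after %d attempts" % retries)
```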
— def update_ref_write_file(self, remote, message, dest, data, automerge=False, ref=None):
Transactionally update a file in a remote git repository ref.
See ‘self.update_ref’. Instead of running a callback, this will attempt to update the contents of a file.
Args: remote (str): The remote repository to update. message (str): The commit message to use. dest (Path): The path of the file to write. data (str): The data to write. automerge (bool): Whether to use Gerrit's “auto-merge” feature. ref (str): The remote ref to update. If it does not start with ‘refs/’ it will be treated as a branch name. If not specified, the HEAD ref for the remote of the current repo project will be used.
Returns: bool: True if the transaction succeeded, false if the file didn't change.
Raises: TooManyAttempts: if the number of attempts exceeds |retries|.
DEPS: easy, support, recipe_engine/json, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for dealing with Gitiles.
A module for Gitiles helpers.
— def fetch_revision(self, host, project, branch, test_output_data=None):
Call gitiles-fetch-ref support tool.
Args: host (str): Gerrit host, e.g. ‘chrome-internal’. project (str): Gerrit project, e.g. ‘chromiumos/chromite’. branch (str): Gerrit branch, e.g. ‘main’. test_output_data (dict): Test output for gitiles-fetch-ref.
Returns: str: the current revision hash of the specified branch
— def file_url(self, commit, file_path=None):
Return the url for a file in a GitilesCommit.
Args: commit (GitilesCommit): The gitiles commit to use. file_path (str): The file path to append, if any.
Returns: (str) The url for the file.
— def get_file(self, host, project, path, ref=None, public=True, credential_cookie_location=None, test_output_data=None):
Return the contents of a file hosted on Gitiles.
Curl can return a zero exit status whenever the server responded, even if the response isn't what you expected. On success the server returns base64, so a failure to decode the response is a good indication that something is wrong.
Args: host (str): Gerrit host, e.g. chrome-internal.googlesource.com. project (str): Gerrit project, e.g. chromiumos/chromite. path: (str): The path to the file e.g. api/controller/something.py. ref: (str): The ref you should return the file from, default: HEAD. public: (bool): If False, will look in .git-credential-cache for an authorization cookie and use it in the curl. Default: True. credential_cookie_location: (str): The credential cookie location. Default: ‘~/.git-credential-cache/cookie’. test_output_data (str): Test output for curl.
Returns: (str) The contents of the file as a string or raise StepFailure on unexpected curl return.
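The decode-as-validation idea described above can be sketched as follows; the function name and error type are hypothetical stand-ins for the module's StepFailure handling.

```python
import base64
import binascii

def decode_gitiles_file(curl_output):
    """Sketch: Gitiles serves file contents base64-encoded, so a decode
    failure is treated as a failed fetch even when curl exited 0."""
    try:
        return base64.b64decode(curl_output, validate=True).decode("utf-8")
    except (binascii.Error, ValueError) as e:
        raise RuntimeError("unexpected Gitiles response: %s" % e)
```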
— def repo_url(self, commit):
Return the url for the repo in a GitilesCommit.
Args: commit (GitilesCommit): The gitiles commit to use.
Returns: (str) The url for the repo.
DEPS: support, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with goma.
A module for working with goma.
@property
— def default_bqupload_dir(self):
@property
— def goma_approach(self):
@property
— def goma_client_json(self):
@property
— def goma_dir(self):
Lazily fetches the goma client and returns its path.
— def initialize(self, also_bq_upload=False):
— def process_artifacts(self, install_pkg_response, goma_log_dir, build_target_name, is_staging=False):
Process goma artifacts, uploading to gsutil if they exist.
Args: install_pkg_response (chromite.api.InstallPackagesResponse): May contain goma artifacts. goma_log_dir (str): Log directory that contains the goma artifacts. build_target_name (str): Build target string. is_staging (bool): If being run in staging environment instead of prod.
Returns: tuple[GomaResults]: tuple containing the GS bucket and path used to write log files and any BQ errors when updating stats/counterz. None is returned if there were no artifacts to process.
DEPS: cros_infra_config, cros_tags, easy, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API providing a menu for calculating greenness metric.
A module to calculate greenness metric.
— def get_greenness(self, target):
Returns the greenness metric for a specific target.
Args: target (str): Name of the target.
Returns: Metric of the target or None if the target wasn't launched.
@property
— def greenness_dict(self):
— def print_step(self):
Print comprehensive greenness info in a step.
— def publish_step(self):
Publish greenness to output properties.
— def update_build_info(self, builds):
Update Greenness with build information.
Args: builds([Build]): Buildbucket.Build objects of builds that have completed.
— def update_hwtest_info(self, results):
Update Greenness with HW test information.
Args: results([SkylabResult]): Results of the HW test runs.
— def update_vmtest_info(self, results):
Update Greenness with VM test information.
Args: results([build_pb2.Build]): Builds of the VM test runs.
DEPS: urls, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for logging step output to Google Storage.
A module for logging step output to Google Storage.
@contextlib.contextmanager
— def log_step_to_gs(self, gs_prefix):
Returns a context that logs stdout of the final step to GS.
Note that only the final step is logged (i.e. the step.active_result as the context exits).
Args: gs_prefix (str): Prefix for the logged GS objects. Should contain the bucket name, but not the ‘gs://’ prefix. For example, ‘/logging’. If None, nothing is logged.
DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module for inter-process communication.
— def initialize(self):
— def make_subscription(self, topic, sub_name):
Create a subscription within a topic.
Args: topic: Pubsub topic name (string) sub_name: Pubsub subscription name (string) Returns: nothing
— def receive(self, topic, sub_name, filter_attributes=None):
Receive one message from the filtered subscription specified.
Args: topic: Pubsub topic name (string) sub_name: Pubsub subscription name (string) filter_attributes: dict of {strings: strings} encoding a ‘subtopic’; messages which do not include the required attributes will be acknowledged but the message body will be ignored Returns: Message body, as a byte string.
— def send(self, topic, message_body, attributes=None):
Send a pubsub message on the given topic.
Args: topic: Pubsub topic name (string) message_body: byte string of message to send attributes: dict of {strings: strings} encoding a ‘subtopic’; subscribers will take no action on messages outside their subtopic. Returns: nothing
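The ‘subtopic’ matching described for receive and send can be sketched as a simple attribute check; the function name is illustrative.

```python
def matches_subtopic(message_attributes, filter_attributes):
    """Sketch: a message matches when it carries every required attribute
    with the required value; an empty/None filter matches everything."""
    if not filter_attributes:
        return True
    return all(
        message_attributes.get(k) == v for k, v in filter_attributes.items()
    )
```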
PYTHON_VERSION_COMPATIBILITY: PY2+3
Utility functions for working with iterables
— def get_one(self, iterable, predicate, error_msg):
Returns the one item from iterable matching predicate.
Raises: A ValueError with error_msg if iterable doesn't have exactly one item matching predicate.
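A minimal sketch of this helper, under the assumption that “exactly one” is enforced by counting matches:

```python
def get_one(iterable, predicate, error_msg):
    """Sketch: return the single item matching predicate, else ValueError."""
    matches = [item for item in iterable if predicate(item)]
    if len(matches) != 1:
        # Zero matches and multiple matches are both errors.
        raise ValueError(error_msg)
    return matches[0]
```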
PYTHON_VERSION_COMPATIBILITY: PY2+3
API to support metadata generation and wrangling.
A module with config and support methods for metadata.
Specifically, this supports the new class of metadata we're generating as part of a build, including, but not necessarily limited to:
— def gspath(self, metadata_info, gs_bucket=None, gs_path=None):
Return full or relative path to a metadata payload depending on if bucket info is provided or not.
Args: metadata_info (MetadataInfo): Metadata config information gs_bucket (str): optional gs bucket gs_path (str): optional gs path
Returns: The relative GCS path for metadata if gs_bucket, gs_path not provided. Otherwise, returns the full GCS path to metadata.
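The full-versus-relative behavior described above can be sketched as follows. This is a hypothetical simplification: the real method takes a MetadataInfo config object, which is reduced here to a plain relative path argument.

```python
def gspath(relative_path, gs_bucket=None, gs_path=None):
    """Sketch: join into a full gs:// URI when bucket info is given,
    otherwise return the relative path unchanged."""
    if gs_bucket and gs_path:
        return "gs://%s/%s/%s" % (
            gs_bucket.strip("/"), gs_path.strip("/"), relative_path)
    return relative_path
```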
DEPS: cros_artifacts, cros_infra_config, cros_source, src_state, test_util, urls, util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/led, recipe_engine/path, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to write metadata.json into GS for GoldenEye consumption.
— def add_default_entries(self):
These fields are available at the start of the build.
— def add_entries(self, **kwargs):
Add elements to metadata.
Args: kwargs (dict): dictionary of key-values to update.
— def add_stage_results(self):
Add stage results for DebugSymbols and Unittest stages.
— def add_version_entries(self, version_dict):
Update metadata with version info.
Args: version_dict (dict): Map containing version info.
@contextlib.contextmanager
— def context(self, config, targets=()):
Returns a context that uploads the final metadata.json to GS.
Args: config (BuilderConfig): builder config of this builder. targets (list[BuildTarget]): The build targets of this builder.
— def finalize_build(self, config, targets, success):
Finish the build stats and upload metadata.json.
Args: config (BuilderConfig): builder config of this builder. targets (list[BuildTarget]): The build targets of this builder. success (bool): Whether this build passed.
— def get_metadata(self):
Get the metadata dict. Should only be used for unittesting.
Returns: dict, metadata info.
— def upload_to_gs(self, config, targets, partial=False):
Upload metadata to GS at its current state.
Args: config (BuilderConfig): builder config of this builder. targets (list[BuildTarget]): The build targets of this builder. The first element should be the build_target for the build, the entire list is used for additional publication locations. partial (bool): whether the metadata is incomplete.
— def write_to_file(self, filename):
Write metadata dict to a tempfile.
Args: filename (str): Filename to write to.
Returns: str, path to the file written.
PYTHON_VERSION_COMPATIBILITY: PY2+3
API featuring shared helpers for naming things.
A module with helpers for naming things.
— def get_build_title(self, build):
Get a string to describe the build.
Args: build (Build): The build to describe.
Returns: str: A string describing the build.
— def get_commit_title(self, commit):
Get a string to describe the commit.
This is typically the first line of the commit message.
Args: commit (Commit): The commit in question. See recipe_modules/git/api.py
Returns: str: The commit title.
@staticmethod
— def get_generation_request_title(req):
Get a presentation name for a single GenerationRequest.
Args: req (dict): Dict representing a GenerationRequest proto, containing a single payload to be created.
Returns: A string providing helpful info about that payload.
— def get_hw_test_title(self, hw_test):
Get a string to describe the HW test.
Args: hw_test (HwTest): The HW test in question.
Returns: str: The HW test title.
— def get_package_title(self, package):
Get a string to describe the package.
Args: package (PackageInfo): The package in question.
Returns: str: The package title.
@staticmethod
— def get_paygen_build_title(build_id, paygen_request_dicts):
Get a presentation name for a build running a batch of PaygenRequests.
Args: build_id (int): The ID of the Paygen build being run. paygen_request_dicts (List[dict]): Dicts representing a batch of PaygenRequests being run by a single Paygen builder.
Returns: A string providing helpful info about the paygens being run.
— def get_skylab_result_title(self, skylab_result):
Get a string to describe the HW test.
Args: skylab_result (SkylabResult): The Skylab result in question.
Returns: str: The HW test title.
— def get_skylab_task_title(self, skylab_task):
Get a string to describe the Skylab task.
Args: skylab_task (SkylabTask): The Skylab task in question.
Returns: str: The Skylab task title.
— def get_test_title(self, test):
Get a string to describe the test.
Args: test (SkylabResult|Build): The test in question.
Returns: A str describing the test.
— def get_vm_test_title(self, vm_test):
Get a string to describe the VM test.
Args: vm_test (Build): The buildbucket build for the VM test.
Returns: str: A string describing the VM test.
DEPS: bot_cost, build_menu, build_plan, cros_artifacts, cros_bisect, cros_history, cros_infra_config, cros_release, cros_resultdb, cros_source, cros_tags, cros_test_plan, cros_test_plan_v2, cros_test_proctor, cros_version, easy, failures, gerrit, git, git_footers, gitiles, greenness, metadata, naming, skylab, src_state, test_util, workspace_util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cq, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2
API providing a menu for orchestrator steps.
A module with steps used by orchestrators.
Orchestrators do not call other recipe modules directly: they always get there via this module, and are a simple sequence of steps.
— def aggregate_metadata(self, child_builds):
Aggregate metadata payloads from children.
Pull metadata message of each type from children and merge the messages together. Upload the resulting message as our own metadata.
Args: child_builds ([BuildStatus]): BuildStatus instances for child builds
Returns: (ContainerMetadata): Aggregated container metadata
@property
— def builds_status(self):
— def chrome_module_child_props(self):
@property
— def chromium_src_ref_cl_tag(self):
@property
— def config(self):
— def create_recipe_result(self, include_build_details=False):
Create the correct return value for RunSteps.
Args: include_build_details (bool): If True augment RawResults.summary_markdown with additional details about the build for both successes and failures.
Returns: (recipe_engine.result_pb2.RawResult) The return value for RunSteps.
@property
— def external_gitiles_commit(self):
@property
— def gerrit_changes(self):
@property
— def gitiles_commit(self):
— def initialize(self):
@property
— def is_bisecting_orchestrator(self):
@property
— def is_dry_run(self):
@property
— def is_postsubmit_orchestrator(self):
@property
— def is_release_orchestrator(self):
— def plan_and_run_children(self, run_step_name=None, results_step_name=None, check_critical_step_name=None, extra_child_props=None):
Plan, schedule, and run child builders.
Args: run_step_name (str): Name for “run builds” step, or None. results_step_name (str): Name for “check build results” step, or None. check_critical_step_name (str): Name for “non-critical build check” step, or None. extra_child_props (dict): If set, extra properties to append to the child builder requests. Returns: (BuildsStatus): The current status of the builds.
— def plan_and_run_tests(self, testable_builds=None, container_metadata=None):
Plan, schedule, and run tests.
Run tests on the testable_builds identified by plan_and_run_children.
Args: testable_builds (list[Build]): The list of builds to consider, or None to use the current results. container_metadata (ContainerMetadata): Information on container images used for test execution.
Returns: (BuildsStatus): The current status of the builds.
— def run_follow_on_orchestrator(self):
Run the follow_on_orchestrator, if any. Wait if necessary.
— def schedule_wait_build(self, builder, await_completion=False, properties=None, check_failures=False, step_name=None, timeout_sec=None):
Schedule a builder, and optionally await completion.
Args: builder (str): The name of the builder: one of project/bucket/builder, bucket/builder, or builder. await_completion (bool): Whether to await completion. properties (dict): Dictionary of input properties for the builder. check_failures (bool): Whether or not failures accumulate in builds_status. This is only used if await_completion is True. step_name (str): Name for the step, or None. timeout_sec (int): Timeout for the builder, in seconds.
Returns: (Build): The build that was scheduled, and possibly waited for.
@contextlib.contextmanager
— def setup_orchestrator(self, missing_ok=False, test_footers=None):
Initial setup steps for the orchestrator.
This context manager returns with all of the contexts that the orchestrator needs to have when it runs, for cleanup to happen properly.
If appropriate, any inflight orchestrator has finished before we return.
Args: missing_ok (bool): Whether it is OK if no config is found. This can be used by the caller to have a builder with no config report FAILURE (False), or SUCCESS (True). test_footers (str): test Cr-External-Snapshot footer data(values separated by newlines), or None.
Raises: StepFailure if no config is found and |missing_ok| is False.
Returns: BuilderConfig or None, with an active context.
DEPS: easy, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with OverlayFS mounts (the Linux ‘overlay’ filesystem).
See: https://www.kernel.org/doc/Documentation/filesystems/overlayfs.txt
A module for interacting with OverlayFS mounts.
— def __init__(self, props, *args, **kwargs):
Initialize OverlayfsApi.
@contextlib.contextmanager
— def cleanup_context(self):
Returns a context that cleans up any overlayfs mounts created in it.
Upon exiting the context manager, each mounted overlay is then iterated through and unmounted.
— def cleanup_overlay_directories(self, cache_name):
Remove the upper and work directories to reset a named cache mount.
Resets the status of an overlayfs mount by removing both the work and upper directories. This is typically used if the status of the lower directory changes.
Args: cache_name (str): Name of the named cache to cleanup.
— def mount(self, name, lowerdir_path, mount_path, persist=False):
Mount an OverlayFS.
As an overlay is mounted, the overlay is then added to the stack that is used by the context manager to unmount as the task ends.
Args:
— def unmount(self, name, mount_path):
Unmount an OverlayFS.
As an overlay is unmounted, the overlay is then removed from the stack that is used by the context manager to unmount as the task ends.
Args:
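The underlying mount(8) invocation for a Linux ‘overlay’ filesystem can be sketched as below. The helper name and argument layout are hypothetical; only the mount option names come from the kernel's overlayfs interface.

```python
def overlay_mount_cmd(lowerdir, upperdir, workdir, mount_path):
    """Sketch: build the mount(8) command for an overlay filesystem.
    upperdir receives writes; workdir must be an empty directory on the
    same filesystem as upperdir."""
    opts = "lowerdir=%s,upperdir=%s,workdir=%s" % (lowerdir, upperdir, workdir)
    return ["sudo", "mount", "-t", "overlay", "overlay", "-o", opts, mount_path]
```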
DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing Phosphorus commands
— def build_parallels_image_provision(self, image_gs_path, max_duration_sec=((2 * 60) * 60)):
Provisions a DUT with the given Chrome OS image and Parallels DLC.
Args: image_gs_path (str): The Google Storage path (prefix) where images are located. For example, ‘gs://chromeos-image-archive/eve-release/R86-13380.0.0’. max_duration_sec (int): Maximum duration of the provision operation, in seconds. Defaults to two hours.
— def build_parallels_image_save(self, dut_state):
Saves the given DUT state in UFS.
The state is only saved if it is safe to do so (i.e. is currently ready or needs_repair).
Args: dut_state (str): The new DUT state. E.g. “needs_repair” or “ready”.
— def fetch_crashes(self, request):
Fetch crashes via the fetch-crashes subcommand.
Args: request: a FetchCrashesRequest.
— def load_skylab_local_state(self, test_id):
Load the local DUT state file.
Raises:
— def parse(self, results_dir):
Extract test results from a results directory.
Args: results_dir: a string pointing to a directory containing test results.
Returns: Result.
— def prejob(self, request):
Run a prejob or a provision via the prejob subcommand.
Args: request: a PrejobRequest.
— def read_dut_hostname(self):
Return the DUT hostname.
— def remove_autotest_results_dir(self):
Remove the autotest results directory.
Raises:
— def run_test(self, request):
Run a test via the run-test subcommand.
Args: request: a RunTestRequest.
— def save_and_seal_skylab_local_state(self, dut_state, dut_name, peer_duts):
Update the local DUT state file and seal the results directory.
Args:
Raises:
— def save_skylab_local_state(self, dut_state, dut_name, peer_duts):
Update the local DUT state file.
Args:
Raises:
— def upload_to_gs(self, request):
Upload selected test results to GS via the upload-to-gs subcommand.
Args: request: an UploadToGSRequest.
— def upload_to_tko(self, request):
Upload test results to TKO via the upload-to-tko subcommand.
Args: request: an UploadToTkoRequest.
DEPS: src_state, util, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for CrOS Portage.
A module for CrOS Portage steps.
— def commit_package_uprevs(self):
Uprevs portage packages for all boards.
Must be run with cwd inside a chromiumos source root.
@exponential_retry(retries=3, delay=datetime.timedelta(minutes=2))
— def push_package_uprevs(self, dryrun=False):
Pushes the changes generated by |uprev_portage_packages| to remote.
Must be run with cwd inside a chromiumos source root.
Args: dryrun (bool): If set, do everything except the actual push.
DEPS: gerrit, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for PUpr.
A module for PUpr steps.
— def identify_retry(self, retry_policy, no_existing_cls_policy, open_cls):
Identify the CL to be retried based on retry_policy.
Precedence order:
Args: retry_policy (RetryClPolicy): The retry policy to follow. Can be NO_RETRY, LATEST_OR_LATEST_PINNED, or LATEST_PINNED. no_existing_cls_policy (RetryClPolicy): The policy this PUpr builder follows when no CL exists. If FULL_RUN, we will look for any successful dry runs, allowing us to retry the latest one as a full run. If no successful dry run is found or if DRY_RUN, we will look for a failed CL. open_cls (List[gerrit.PatchSet]): List of CLs.
Returns: (PatchSet, int, str, bool): The CL to be retried (or None if no retry), the CQ label to be applied, the description of the action, and whether the CL, if any, is currently passing.
— def retries_frozen(self, changes):
Examine open CLs for the HASHTAG_FREEZE_RETRIES hashtag.
Args: changes (List[gerrit.PatchSet]): List of CLs.
Returns: bool: Whether or not a HASHTAG_FREEZE_RETRIES hashtag is present.
DEPS: recipe_engine/json, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for calling ‘recipes.py analyze’
A module for calling ‘recipes.py analyze’
— def is_recipe_affected(self, affected_files, recipe):
Return True iff changes in <affected_files> affect <recipe>.
Must be called from the root of a recipes repo (i.e. recipes.py is in the cwd).
Args:
Return: Bool
DEPS: support, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with re-client for remote execution.
A module for working with re-client for remote execution.
@property
— def reclient_dir(self):
Fetches the reclient directory and returns its path.
@property
— def reproxy_cfg_file(self):
DEPS: cros_infra_config, easy, git, src_state, test_util, depot_tools/depot_tools, depot_tools/gitiles, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for working with the ‘repo’ VCS tool.
See: https://chromium.googlesource.com/external/repo/
A module for interacting with the repo tool.
— def abandon(self, branch, projects=None):
Abandon the branch in the given projects, or all projects if not set.
Args: branch (str): The branch to abandon. projects (list[str]): The projects for which to abandon the branch.
— def create_tmp_manifest(self, manifest_data):
Write manifest_data to a temporary manifest file inside the repo root.
Returns (str): path of the tmp manifest, relative to the repo root.
— def diff_manifests(self, from_manifest_str, to_manifest_str, use_merge_base=False):
Diffs the two manifests and returns an array of differences.
Given the two manifest XML strings, generates an array of ManifestDiff namedtuples. This only returns CHANGED projects; it skips over projects that were added or deleted.
Args: from_manifest_str (str): The from manifest XML string. to_manifest_str (str): The to manifest XML string. use_merge_base (bool): Whether to adjust the from_ref with git merge-base.
Returns: list[ManifestDiff]: An array of ManifestDiff namedtuples for any existing changed project (excludes added/removed projects).
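The changed-projects-only semantics can be sketched with a plain XML comparison. This is a hypothetical simplification: the real ManifestDiff namedtuple and merge-base handling are not reproduced here, only the “intersect names, compare revisions” core.

```python
import xml.etree.ElementTree as ET

def changed_projects(from_xml, to_xml):
    """Sketch: report projects present in both manifests whose pinned
    revision changed; added/removed projects are skipped."""
    def revisions(xml_text):
        root = ET.fromstring(xml_text)
        return {p.get("name"): p.get("revision") for p in root.iter("project")}
    from_revs, to_revs = revisions(from_xml), revisions(to_xml)
    return [
        (name, from_revs[name], to_revs[name])
        for name in sorted(set(from_revs) & set(to_revs))
        if from_revs[name] != to_revs[name]
    ]
```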
— def diff_manifests_informational(self, old_manifest_path, new_manifest_path):
Informational step that logs a “manifest diff”.
Args: old_manifest_path (Path): Path to old manifest file. new_manifest_path (Path): Path to new manifest file.
— def diff_remote_and_local_manifests(self, from_manifest_url, from_manifest_ref, to_manifest_str, test_from_data=None, use_merge_base=False):
Diffs the remote manifest against the local manifest string.
Diffs the ‘snapshot.xml’ at the given from_manifest_url at the ref from_manifest_ref against the local to_manifest_str.
Args: from_manifest_url (str): The manifest repo url to checkout. from_manifest_ref (str): The manifest ref to checkout. to_manifest_str (str): The string XML for the to manifest. test_from_data (str): Test data: the from_manifest contents, or None for the default. use_merge_base (bool): Whether to adjust the from_ref with git merge-base.
Returns: list[ManifestDiff]: An array of ManifestDiff namedtuples for any existing changed project (excludes added/removed projects).
@property
— def disable_source_cache_health(self):
— def ensure_pinned_manifest(self, projects=None, regexes=None, test_data=None, step_name=None):
Ensure that we know the revision info for all projects.
If the manifest is not pinned, a pinned manifest is created and logged.
Args: projects (list[str]): Project names or paths to return info for. Defaults to all projects. regexes (list[str]): list of regexes for matching projects. The matching is the same as in ‘repo forall --regex regexes...’. test_data (str): Test data for the step: the output from repo forall, or None for the default. This is passed to project_infos().
Returns: (str): The manifest XML as a string, or None if the manifest is already pinned.
— def ensure_synced_checkout(self, root_path, manifest_url, init_opts=None, sync_opts=None, projects=None, final_cleanup=False, sanitize=False):
Ensure the given repo checkout exists and is synced.
Args: root_path (Path): Path to the repo root. manifest_url (str): Manifest URL for ‘repo.init’. init_opts (dict): Extra keyword arguments to pass to ‘repo.init’. sync_opts (dict): Extra keyword arguments to pass to ‘repo.sync’. projects (list[str]): Projects of concern or None if all projects are of concern. Used to perform optimizations where possible to only operate on the given projects. final_cleanup (bool): Used by cache builder to ensure that all locks and uncommitted files are cleaned up after the sync. sanitize (bool): Should we run ‘git gc’ on all repos.
— def init(self, manifest_url, _kwonly=(), manifest_branch='', reference=None, groups=None, depth=None, repo_url=None, repo_branch=None, local_manifests=None, manifest_name=None, projects=None, verbose=True, clean=True):
Executes ‘repo init’ with the given arguments.
Args: manifest_url (str): URL of the manifest repository to clone. manifest_branch (str): Manifest repository branch to checkout. reference (str): Location of a mirror directory to bootstrap sync. groups (list): Groups to checkout (see ‘repo init --groups’). depth (int): Create a shallow clone of the given depth. repo_url (str): URL of the repo repository. repo_branch (str): Repo binary branch to use. local_manifests (list[LocalManifest]): Local manifests to add. See https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md#local-manifests. manifest_name (Path): The manifest file to use. projects (list[str]): Projects of concern or None if all projects are of concern. Ignored as of go/cros-source-cache-health. verbose (bool): Whether to produce verbose output.
— def initialize(self):
— def manifest(self, manifest_file=None, test_data=None, pinned=False, step_name=None):
Uses repo to create a manifest and returns it as a string.
By default uses the internal .repo manifest, but can optionally take another manifest to use.
Args: manifest_file (Path): If given, path to alternate manifest file to use. pinned (bool): Whether to create a pinned (snapshot) manifest. test_data (str): Test data for the step: the contents of the manifest, or None for the default. step_name (str): The name for the step, or None.
Returns: str: The manifest XML as a string.
@property
— def manifest_gitiles_commit(self):
Return a Gitiles commit for the repo manifest.
— def project_exists(self, project):
Use ‘repo info’ to determine if the project exists in the checkout.
Args: project (str): Project name or path to return info for.
Returns: (bool): whether or not the project exists.
— def project_info(self, project=None):
Use ‘repo forall’ to gather project information for one project.
Args: project (str|Path): Project name or path to return info for. If None, then use the cwd as the path for the project.
Returns: ProjectInfo: The requested project info.
— def project_infos(self, projects=None, regexes=None, test_data=None, ignore_missing=False):
Uses ‘repo forall’ to gather project information.
Note that if both projects and regexes are specified the resultant ProjectInfos are the union, without duplicates, of what each would return separately.
Args: projects (list[str]): Project names or paths to return info for. Defaults to all projects. regexes (list[str]): List of regexes for matching projects. The matching is the same as in `repo forall --regex regexes...`. test_data (str): Test data for the step: the output from repo forall, or None for the default. ignore_missing (bool): If True, skip missing projects and continue.
Returns: list[ProjectInfo]: Requested project infos.
@property
— def repo_path(self):
— def start(self, branch, projects=None):
Start a new branch in the given projects, or all projects if not set.
Args: branch (str): The new branch name. projects (list[str]): The projects for which to start a branch.
— def sync(self, _kwonly=(), force_sync=False, detach=False, current_branch=False, jobs=None, manifest_name=None, no_tags=False, optimized_fetch=False, cache_dir=None, timeout=None, retry_fetches=None, projects=None, verbose=True, no_manifest_update=False, force_remove_dirty=False, prune=None, repo_event_log=True):
Executes ‘repo sync’ with the given arguments.
Args: force_sync (bool): Overwrite existing git directories if needed. detach (bool): Detach projects back to manifest revision. current_branch (bool): Fetch only current branch. jobs (int): Projects to fetch simultaneously. manifest_name (str): Temporary manifest to use for this sync. no_tags (bool): Don't fetch tags. optimized_fetch (bool): Only fetch projects if revision doesn't exist. cache_dir (Path): Use git-cache with this cache directory. retry_fetches (int): The number of times to retry retriable fetches. projects (list[str]): Projects to limit the sync to, or None to sync all projects. verbose (bool): Whether to produce verbose output. no_manifest_update (bool): Whether to disable updating the manifest. force_remove_dirty (bool): Whether to force remove projects with uncommitted modifications if projects no longer exist in the manifest. prune (bool): Delete refs that no longer exist on the remote. repo_event_log (bool): Write the repo event log, do analysis steps.
— def sync_manifest(self, manifest_url, manifest_data, **kwargs):
Sync to the given manifest file data.
Args: manifest_url (str): URL of manifest repo to sync to (for repo init) manifest_data (str): Manifest XML data to use for the sync. kwargs: Keyword arguments to pass to ‘repo.sync’.
— def version(self):
Prints the current version information of repo.
DEPS: easy, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing result flow commands
— def pipe_ctp_data(self, request):
Pipe CTP data to TestPlanRun table in BQ.
Args:
— def pipe_test_runner_data(self, request):
Pipe test runner data to TestRun/TestCase tables in BQ.
Args:
— def publish(self, project_id, topic_id, build_type, should_poll_for_completion=False, parent_uid=''):
Run the result_flow to publish build's own build ID to Pubsub.
Args:
DEPS: recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing ServiceVersion commands
— def validate_service_version_if_exists(self):
Validate the caller's service version if they sent one.
DEPS: cros_infra_config, cros_source, cros_tags, easy, git_footers, greenness, metadata, src_state, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing commands to Skylab
— def schedule_ctp_requests(self, tagged_requests, can_outlive_parent=True, bb_tags=None, **kwargs):
Schedule a cros_test_platform build.
Args: tagged_requests (dict): Dictionary of string to test_platform.Request objects. can_outlive_parent (bool): Whether this build can outlive its parent. The default is True. bb_tags (dict or list[StringPair]): If of the type list[StringPair], will be used directly as a bb_tag list. If a dict, used to map keys to values. If the value is a list, multiple tags for the same key will be created. kwargs: List of extra named parameters to pass to buildbucket.schedule_request.
Returns: The scheduled buildbucket build.
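The `bb_tags` normalization described above (a list is used directly; a dict maps keys to values, with list values producing multiple tags for the same key) can be sketched with plain tuples standing in for `StringPair`:

```python
def normalize_bb_tags(bb_tags):
    """Turn bb_tags into a flat list of (key, value) pairs.

    A list is used as-is; a dict is expanded, with a list value
    producing one tag per element (keys sorted for determinism).
    """
    if isinstance(bb_tags, dict):
        pairs = []
        for key, value in sorted(bb_tags.items()):
            if isinstance(value, list):
                pairs.extend((key, v) for v in value)
            else:
                pairs.append((key, value))
        return pairs
    return list(bb_tags)
```

This is a sketch of the documented semantics, not the module's implementation; the real code builds `StringPair` protos.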
— def schedule_suites(self, unit_hw_tests, timeout, name=None, async_suite_run=False, container_metadata=None, require_stable_devices=False):
Schedule HW test suites by invoking the cros_test_platform recipe.
Args:
Returns: list[SkylabTask]: with buildbucket_id of the recipe launched.
— def set_qs_account(self, qs_account):
Override the quota scheduler account at runtime.
— def wait_on_suites(self, tasks, timeout):
Wait for the single Skylab multi-request to finish and return the result.
Args: tasks (list[SkylabTask]): The Skylab tasks to wait on. timeout (Duration): Timeout in timestamp_pb2.Duration.
Returns: list[SkylabResult]: The results for suites from provided tasks.
DEPS: recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API providing frequently needed values that we sometimes override.
If you are using cros_source or cros_infra_config, this module is relevant to your interests.
There are two classes of properties in this module.
Constant(ish) things that need a common home to avoid duplication, such as workspace_path, internal_manifest, and external_manifest.
Information obtained from recipe_engine, which we frequently change:
gitiles_commit: Especially when buildbucket does not give us one, we need to set it to the correct value for the build. That varies based on builder_config, and other things.
gerrit_changes: some builders add changes to the build, and others ignore the changes completely.
Source State related attributes for Chrome OS recipes.
@build_manifest.setter
— def build_manifest(self, build_manifest):
Set the manifest that will be used for the build.
Sets the manifest used by this builder.
Args: build_manifest (ManifestProject): Information about the manifest for this build.
@property
— def default_branch(self):
The default branch for Chrome OS repos
@property
— def default_ref(self):
The default ref for Chrome OS repos
@property
— def external_manifest(self):
Information about external manifest.
Provides immutable information about the Chrome OS external manifest.
Returns: (ManifestProject): information about the external manifest.
@gerrit_changes.setter
— def gerrit_changes(self, gerrit_changes):
Set the gerrit_changes that will be used for the build.
Args: gerrit_changes (list[GerritChanges]): The gerrit_changes.
@gitiles_commit.setter
— def gitiles_commit(self, gitiles_commit):
Set the gitiles_commit that will be used for the build.
Args: gitiles_commit (GitilesCommit): The value to use.
— def gitiles_commit_to_manifest(self, gitiles_commit):
Return the manifest corresponding to the gitiles_commit.
Args: gitiles_commit (GitilesCommit): The gitiles_commit.
Returns: (ManifestProject): Information about the corresponding manifest, or None.
— def initialize(self):
@property
— def internal_manifest(self):
Information about internal manifest.
Provides immutable information about the Chrome OS internal manifest.
Returns: (ManifestProject): information about the internal manifest.
@property
— def manifest_name(self):
Return the name of the manifest.
@property
— def manifest_projects(self):
Return the manifest project names.
@property
— def workspace_path(self):
The “workspace” checkout path.
The cros_source module checks out the Chrome OS source in this directory. It will contain the base checkout and any modifications made by the build, and is discarded after the build.
DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module for issuing stable_version commands
— def fetch_and_commit(self):
Fetch the up-to-date stable versions and commit them.
Returns: response: raw string as the stdout data.
— def initialize(self):
— def validate_stable_version(self):
Validate the remote stable version config file.
DEPS: easy, recipe_engine/cipd, recipe_engine/json, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
APIs for running recipes/support tools.
A module for support tool steps.
— def call(self, tool, input_data, test_output_data=None, infra_step=True, timeout=None, **kwargs):
Run a tool from the support package.
Args: tool (str): Tool name. input_data: Data to be passed as input to the tool (serialized to JSON). test_output_data (dict|list|Callable): Data to return in tests. infra_step (bool): Whether or not this is an infrastructure step. timeout (int): Timeout of the step in seconds.
Returns: Data passed as output from the tool (deserialized from JSON).
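The `call` contract above (input serialized to JSON, output deserialized from JSON) can be sketched with the tool invocation abstracted behind a callable; the real module runs a CIPD-installed binary as a step instead:

```python
import json

def call_tool(run_tool, input_data):
    """Serialize input to JSON, hand it to the tool, decode its JSON output.

    `run_tool` is a stand-in for the subprocess step: it takes the JSON
    request text and returns the tool's JSON response text.
    """
    request = json.dumps(input_data)
    response = run_tool(request)
    return json.loads(response)
```

A fake tool that echoes its input round-trips the data unchanged, which is the property the test data hooks rely on.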
— def ensure_package_installed(self):
Ensure the CIPD support package is installed.
— def initialize(self):
DEPS: easy, depot_tools/git, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module that queries Swarming via the CLI.
— def get_bot_counts(self, swarming_instance, dimensions=None):
Retrieves the count of bots from Swarming based on dimensions.
Args: swarming_instance(str): string containing the name of the Swarming instance to query. dimensions (iterable): strings formatted as “key:value” to query Swarming.
— def get_max_pending_time(self, dimensions, lookback_hours, swarming_instance):
Retrieves the list of tasks from Swarming based on dimensions.
Args: dimensions (iterable): strings formatted as “key:value” to query Swarming. lookback_hours (int): Number of hours to query swarming on. swarming_instance(str): string containing the name of the Swarming instance to query.
Returns: (float) Max pending time in hours.
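The pending-time computation reduces to the largest gap between task creation and start. A sketch over hypothetical (created_ts, started_ts) pairs in epoch seconds, not the module's Swarming query:

```python
def max_pending_hours(tasks):
    """Return the maximum (started - created) interval, in hours.

    `tasks` is an iterable of (created_ts, started_ts) epoch-second
    pairs; an empty input yields 0.0.
    """
    gaps = [(started - created) / 3600.0 for created, started in tasks]
    return max(gaps) if gaps else 0.0
```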
— def get_task_counts(self, dimensions, state, lookback_hours, swarming_instance):
Retrieves the count of tasks from Swarming based on filters.
Args: dimensions (iterable): strings formatted as ‘key:value’ to query Swarming. state (str): state of the tasks to query lookback_hours (int): Number of hours to query swarming on. swarming_instance(str): string containing the name of the Swarming instance to query.
— def get_task_list(self, dimensions, state, lookback_hours, swarming_instance, limit=None):
Retrieves the list of tasks from Swarming based on dimensions and state.
Args: dimensions (iterable): strings formatted as “key:value” to query Swarming. state (str): state of the tasks to query lookback_hours (int): Number of hours to query swarming on. swarming_instance(str): string containing the name of the Swarming instance to query. limit (int): Number of tasks to return.
DEPS: android, chrome, cros_artifacts, cros_bisect, cros_build_api, cros_infra_config, cros_sdk, failures, goma, workspace_util, recipe_engine/cq, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for various support functions for building.
A module for sysroot setup, manipulation, and use.
— def bootstrap_sysroot(self, compile_source=False, response_lambda=None, timeout_sec='DEFAULT', test_data=None, name=None):
Bootstrap the sysroot by calling InstallToolchain.
Args: compile_source (bool): Whether to compile from source. response_lambda (fn(output_proto)->str): A function that appends a string to the build api response step. Used to make failure step names unique across differing root causes. Default: cros_build_api.failed_pkg_data_names. timeout_sec (int): Step timeout, in seconds, or None for default. test_data (str): test response (JSON) from the SysrootService/InstallToolchain call, or None to use the default in cros_build_api/test_api.py. name (str): Step name to use, or None for the default name.
— def build_images(self, image_types, builder_path, disable_rootfs_verification, disk_layout, version=None, timeout_sec=((2 * 60) * 60), build_test_data=None, test_test_data=None, name=None):
Build and validate images.
Args: image_types (list[ImageType]): Image types to build. builder_path (str): Builder path in GS for artifacts. disable_rootfs_verification (bool): whether to disable rootfs verification. disk_layout (str): disk_layout to set, or empty for default. version (str): version string to pass to build API, or None. timeout_sec (int): Step timeout (in seconds). build_test_data (str): test response (JSON) from the ImageService/Create call, or None. test_test_data (str): test response (JSON) from the ImageService/Test call, or None. name (str): Step name to use, or None for default name.
— def create_sysroot(self, build_target, profile=None, chroot_current=True, replace=True, package_indexes=None, timeout_sec='DEFAULT', test_data=None, name=None):
Create the sysroot.
Args: build_target (BuildTarget): Which build_target to create a sysroot for. profile (chromiumos.Profile): The profile the sysroot is to use, or None. chroot_current (bool): Whether the chroot is current. (If not, it will be updated.) replace (bool): Whether to replace an existing sysroot. package_indexes (list[PackageIndexInfo]): Package indexes to use, or None. timeout_sec (int): Step timeout (in seconds). Default: None if a toolchain change is detected, otherwise 10 minutes. test_data (str): test response (JSON) from the SysrootService/Create call, or None to generate a default response based on the input data. name (str): Step name to use, or None for the default name.
Returns: Sysroot
— def initialize(self):
— def install_packages(self, config, dep_graph, packages=None, artifact_build=False, package_indexes=None, timeout_sec='DEFAULT', name=None, dryrun=False):
Install packages (possibly fetching Chrome source).
Args: config (BuilderConfig): The builder config. dep_graph: The dependency graph from cros_relevance.get_dependency_graph. packages (list[PackageInfo]): list of packages to install. Default: all packages for the build_target. artifact_build (bool): Whether to call update_for_artifact_build. package_indexes (list[PackageIndexInfo]): Package indexes to use, or None. timeout_sec (int): Step timeout, in seconds, or None for default. name (str): Step name to use, or None for default name. dryrun (bool): Whether to dryrun the step such that we calculate the packages which would have been built, but do not install them.
@property
— def sysroot(self):
— def update_for_artifact_build(self, chroot, artifacts, force_relevance=False, test_data=None, name=None):
Update ebuilds for artifact build.
Args: chroot (Chroot): Chroot, or None. artifacts (BuilderConfig.Artifacts): Artifact Information force_relevance (bool): Whether to always claim relevant. test_data (str): test response (JSON) from the ArtifactsService/BuildSetup call, or None. name (str): Step name to use, or None for default name.
Returns: (BuildSetupResponse): Whether the build is relevant.
DEPS: easy, gcloud, tast_results, util, depot_tools/gsutil, recipe_engine/archive, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to execute tast commands.
— def create_gce_vm_context(self, image, project, machine, zone, network, subnet, private_key_path):
Creates a context manager which performs setup/teardown of a GCE VM.
Args: image(str): GCE image to use for the instance. project(str): Google Cloud project name. machine(str): GCE machine type zone(str): GCE zone to create instance (e.g. us-central1-b). network(str): Network name to use. subnet(str): Network subnet on which to create instance. private_key_path (Path): Path to private key.
Returns: A context manager that, when entered, prepares a VM to test against and yields a VmInfo object for connecting to it; when exited, terminates the VM and performs cleanup.
— def create_qemu_vm_context(self, qcow_image_path, private_key_path, second_image_path=None):
Creates a context manager which performs setup/teardown of a QEMU VM.
Args: qcow_image_path (Path): Path to image in qcow format. private_key_path (Path): Path to private key. second_image_path (Path): Path to a second qcow disk image (optional).
Returns: A context manager that, when entered, prepares a VM to test against and yields a VmInfo object for connecting to it; when exited, terminates the VM and performs cleanup.
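The setup/teardown contract that both VM context managers share can be sketched with `contextlib`; `start_vm`, `stop_vm`, and the `VmInfo` fields here are hypothetical stand-ins for the QEMU or GCE plumbing:

```python
import contextlib
from collections import namedtuple

# Minimal stand-in for the connection info the real context yields.
VmInfo = namedtuple('VmInfo', ['hostname', 'ssh_port'])

@contextlib.contextmanager
def vm_context(start_vm, stop_vm):
    """Start the VM on entry, yield its VmInfo, always stop it on exit."""
    info = start_vm()
    try:
        yield info
    finally:
        # Teardown runs even if the test body raised.
        stop_vm(info)
```

The `finally` clause is the important design point: the VM is terminated whether or not the tests inside the `with` block succeed.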
— def download_tast(self, build_payload, test_artifacts_dir):
Downloads the tast executable from specified build artifacts.
Args: build_payload (BuildPayload): Describes where the artifact is on GS. test_artifacts_dir (str): The directory to which files should be downloaded. The tast executable will be found at tast/tast relative to this directory.
— def download_vm(self, build_payload, vm_dir, modify_image=None):
Downloads the VM image from specified build artifacts.
Args: build_payload (BuildPayload): Describes where the artifact is on GS. vm_dir (Path): The directory to which files should be downloaded. modify_image (func): Function that takes one argument, the VM image path. It will be called prior to converting the raw image to the qcow2 format. (optional).
Returns: qcow_image_path (Path): The location of the qcow image. This will be a location inside image_archive_dir. private_key_path (Path): The location of the SSH key. This will be a location inside image_archive_dir.
— def is_vm_running(self, kvm_pid_file):
Check if the specified PID is still running.
Args: kvm_pid_file (Path): File containing the PID of a QEMU process.
Returns: bool: Whether the VM process is still running.
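A common way to implement such a liveness check is signal 0, which probes a PID without delivering anything; a sketch under that assumption (the real module may inspect the process differently):

```python
import errno
import os

def pid_running(pid):
    """Return whether `pid` refers to a live process.

    os.kill(pid, 0) sends no signal; it only checks for existence.
    EPERM means the process exists but belongs to another user.
    """
    try:
        os.kill(pid, 0)
    except OSError as exc:
        return exc.errno == errno.EPERM
    return True
```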
— def run_direct(self, dut_name, tast_inputs, test_results_dir):
Run tast tests without retries or results processing.
Args: dut_name (str): The identity of the DUT to connect to, for example, my-dut-host-name or localhost:9222 (if testing a VM). tast_inputs (TastInputs): Common inputs for running tast tests. test_results_dir (Path): Path to store tast results.
Returns: list[str]: The list of tests that met the specified expression(s).
— def run_direct_vm(self, vm_context, test_results_dir, tast_inputs):
Run tast tests in a VM without retries or results processing.
Args: vm_context (contextlib.contextmanager): The VM context manager, created by create_qemu_vm_context/create_gce_vm_context. test_results_dir (Path): Path to store tast results. tast_inputs (TastInputs): Common inputs for running tast tests.
Returns: list[str]: The list of tests that met the specified expression(s).
— def run_vm(self, suite_name, vm_context, tast_inputs):
Run tast tests in a VM with one retry and upload logs to Google storage.
Args: suite_name (str): Unique name used to record test results. vm_context (contextlib.contextmanager): The VM context manager, created by create_qemu_vm_context/create_gce_vm_context. tast_inputs (TastInputs): Common inputs for running tast tests.
Returns: A tuple of list(Failures) and a bool indicating whether the results were empty.
DEPS: cros_infra_config, cros_resultdb, easy, failures, util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/resultdb, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
A module to process tast-results/ directory.
— def __init__(self, props, *args, **kwargs):
Initialize TastResultsApi.
@exponential_retry(retries=3, condition=(lambda e: getattr(e, 'had_timeout', False)))
— def archive_dir(self, dir_path, tag):
Archive dir to Google Storage.
Args: dir_path (Path): Path to dir to be uploaded. tag (str): Tag for this execution. Used to distinguish archive folders.
Returns: str, link to the archive on pantheon.
— def convert_to_testcaseresult(self, test_result):
Convert Tast's result into CTP format.
Args: test_result (TestResult): TestResult to be converted.
Returns: TestCaseResult with the same info.
— def create_missing_test_results(self, missing_test_names):
Create test results for the missing test cases.
Args: missing_test_names list(str): Tests that should have run but didn't.
Returns: list(TestCaseResult) Test results for the missing tests cases.
— def get_failures(self, task_result, exclude_tests=None):
Convert TaskResult into api.failures.Failure objects and dicts.
Args: task_result (TaskResult): TaskResult to be converted. exclude_tests list(str): List of names of tests to be excluded.
Returns: A tuple of list(Failure) and list(dict) representing failed test cases excluding the ones provided.
— def get_results(self, test_results_path, suite_name, tag, tests):
Return the test results decoded from the streamed_results.jsonl.
Args: test_results_path (Path): Path to test_results/. suite_name (str): Name of the whole test suite. tag (str): Tag for this execution. Used to distinguish archive folders. tests list(str): List of tests that should have been executed.
Returns: A consolidated Data Structure summarizing all results from a run. Currently this is a TaskResult. https://crrev.com/ee30a869473a8ee54246e0469ede2aa010fb2e48/src/test_platform/steps/execution.proto#47
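streamed_results.jsonl holds one JSON object per line, so decoding it is a line-by-line parse. A sketch (the field names in the sample are hypothetical, not the actual Tast schema):

```python
import json

def parse_streamed_results(jsonl_text):
    """Decode one test-result object per non-empty line of JSONL."""
    return [json.loads(line) for line in jsonl_text.splitlines() if line.strip()]
```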
— def get_tests_to_retry(self, task_result):
Determine which tests to retry.
Args: task_result(TaskResult): TaskResult of the test suite.
Returns: list(str): Names of tests to be retried, and a boolean indicating whether a VM restart is required before retrying.
— def print_results(self, failures, empty_result):
Print results for the user.
Args: failures(list(Failure)): Failures of this run. empty_result(bool): Were the results empty?
— def record_logs(self, sys_log_dir):
Print system logs to MILO.
Args: sys_log_dir(str): absolute dir path to copy logs from.
— def upload_to_resultdb(self, test_results_path, suite_name, missing_test_names):
Upload the test results to ResultDB.
Args: test_results_path (Path): Path to test_results/. suite_name (str): Name of the whole test suite. missing_test_names (list[str]): Tests that should have run but didn't.
DEPS: cros_tags, src_state, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
API to simplify testing Chrome OS recipes.
A module providing test methods to simplify testing Chrome OS recipes.
DEPS: recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for creating task URLs out of complex data structures.
A module for creating links to tasks.
— def get_build_link_map(self, build):
Returns the title->URL to the given buildbucket build.
Args: build (Build): The buildbucket build in question.
Returns: str->str: title->URL pointing to the build milo page.
— def get_gs_path_url(self, gs_path):
Returns the Cloud Storage Browser URL to the given GS path.
Args: gs_path (str): A string of the format “gs:///”
Returns: str: URL pointing to the Cloud Storage Browser page for the object.
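The mapping from a GS path to its browser page is string rewriting; a sketch assuming the usual Cloud Storage Browser URL shape (the console prefix is an assumption of this sketch, not taken from the module):

```python
def gs_path_to_console_url(gs_path):
    """Rewrite gs://bucket/path into a Cloud Storage Browser URL."""
    prefix = 'gs://'
    if not gs_path.startswith(prefix):
        raise ValueError('expected a gs:// path, got %r' % gs_path)
    return ('https://console.cloud.google.com/storage/browser/'
            + gs_path[len(prefix):])
```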
— def get_skylab_result_link_map(self, skylab_result):
Returns the URL to the given skylab result page.
Args: skylab_result (SkylabResult): The Skylab result in question.
Returns: str->str map: title to URL to the skylab swarming task page if the suite succeeded or entries of just the failed tests.
— def get_skylab_task_url(self, skylab_task):
Returns the URL to the given skylab task.
Args: skylab_task (SkylabTask): The Skylab task in question.
Returns: str: URL pointing to the skylab swarming task page.
— def get_state_suffix(self, task_state):
String suffix to supply info about the task.
Args: task_state (TaskState): The task state.
Returns: str, denoting more information about the task.
— def get_vm_test_link_map(self, vm_test):
Returns the title->URL results from the given VM test.
Args: vm_test (Build): The vm test in question.
Returns: str->str: title->URL pointing to the vm_test's milo page. For direct-vm tests, the individual failing tests are listed.
PYTHON_VERSION_COMPATIBILITY: PY2+3
Module providing importable utilities.
Includable utilities.
DEPS: cros_infra_config, cros_relevance, cros_source, easy, gerrit, repo, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
API for various support functions for building.
A module for workspace setup and manipulation.
— def apply_changes(self, changes=None, name='cherry-pick gerrit changes', ignore_missing_projects=False):
Apply gerrit changes.
Args: changes (list[GerritChanges]): Changes to apply. Default: changelist saved in cros_infra_config.configure_builder(). name (string): Step name. Default: “cherry-pick gerrit changes”. ignore_missing_projects (bool): If true, changes to projects that are not currently checked out (as determined by repo forall) will not be applied. An example of when this is useful: it is possible that changes includes changes to repos this builder is not allowed to read (e.g. because of Cq-Depend grouping); the changes will be discarded instead of failing during application.
— def checkout_change(self, change=None, name='checkout gerrit change'):
Check out a gerrit change using the gerrit refs/changes/... workflow.
Differs from apply_changes in that the change is directly checked out, not cherry picked (so the patchset parent will be accurate). Used for things like tricium where line number matters.
Args: change (GerritChange): Change to check out. name (string): Step name. Default: “checkout gerrit change”.
@property
— def commits(self):
— def detect_toolchain_cls(self, chroot, gitiles_commit=None, gerrit_changes=None, test_value=None, name=None):
Check for toolchain changes.
If there are any changes that affect the toolchain, set that workspace attribute.
Args: gitiles_commit (GitilesCommit): The gitiles commit to use, or none to use the value from config. gerrit_changes (list[GerritChange]): The gerrit changes in use, None to use the changes already applied via apply_changes(). chroot (Chroot): The chroot for the build. name (str): The name for the step, or None for default. test_value (bool): The value to use for tests. Default: No toolchain changes detected unless step data is provided elsewhere.
Returns: (bool) whether there are toolchain patches applied.
— def initialize(self):
@property
— def patch_sets(self):
@contextlib.contextmanager
— def setup_workspace(self, default_main=False):
Prepare the source checkout for building.
Args: default_main (bool): Whether to checkout tip-of-tree instead of snapshot when no gitiles_commit was provided.
Returns: A context where source is set up, and the current working directory is the workspace path. Note that api.cros_source.cleanup_context() is generally going to be needed.
@contextlib.contextmanager
— def sync_to_commit(self, commit=None, staging=False, projects=None):
Sync the source tree.
This context manager syncs the workspace path.
Args: commit (GitilesCommit): The gitiles_commit to sync to. Default: commit saved in cros_infra_config.configure_builder(). staging (bool): Whether this is a staging build. Default: False. projects (List[str]): Project names or paths to return info for. Defaults to all projects.
@contextlib.contextmanager
— def sync_to_manifest_groups(self, manifest_groups, local_manifests=None, cache_path_override=None, gitiles_commit=None, manifest_branch=None):
Returns a context with manifest groups checked out to cwd.
The subset of repos in the external manifest + local_manifests matching manifest_groups are synced. For example, say the external manifest contains repos:
and a local manifest contains repos:
and manifest_groups is [“g1”, “g4”]. Repos “a”, “b”, and “e” will be synced.
Note the importance of the `cache_path_override` parameter. For cases where the number of repos being synced is much smaller than a full checkout, it is more efficient to override the default cache. This is because the time to delete unused repos (which are present because of caching) is much larger than the time to sync the used repos.
Args: manifest_groups (list[str]): List of manifest groups to checkout. local_manifests (list[repo.LocalManifest]): A list of local manifests to add, or None if not syncing a local manifest. cache_path_override (Path): Path to sync into. If None, the default caching of cros_source.ensure_synced_cache is used. gitiles_commit (GitilesCommit): The gitiles_commit to sync to. Default: commit saved in cros_infra_config.configure_builder(). manifest_branch (str): Branch to checkout. See the `--manifest-branch` option of `repo init` for details and defaults.
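The group-matching rule above (a repo is synced if any of its manifest groups is requested) can be sketched over hypothetical (name, groups) pairs; the project names and groups below are made up to mirror the example in the docstring:

```python
def select_projects(projects, manifest_groups):
    """Return the names of projects whose groups intersect manifest_groups.

    `projects` is an iterable of (name, groups) pairs; group membership
    is the only selection criterion in this sketch.
    """
    wanted = set(manifest_groups)
    return [name for name, groups in projects if wanted.intersection(groups)]
```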
@property
— def toolchain_cls_applied(self):
Whether there are toolchain CLs applied to the source tree.
@property
— def workspace_path(self):
DEPS: orch_menu, recipe_engine/properties, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe that generates artifacts using HW Test results.
All builders run against the same source tree.
— def DoRunSteps(api, properties):
— def RunSteps(api, properties):
DEPS: build_menu, cros_sdk, sysroot_util, test_util
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for building an AFDO benchmark profile.
— def DoRunSteps(api, config, properties):
— def RunSteps(api, properties):
DEPS: analysis_service, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: android, cros_build_api, gerrit, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: android, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: android, cros_source, easy, gerrit, git, orch_menu, repo, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Orchestrator for Android uprev builders.
The orchestrator determines the latest Android version for the specified Android package, then passes the info down into child builders running the build_android_uprev recipe.
Once all builds and tests pass, it submits a CL to update the Android LKGB file. The change will in turn trigger the PUpr generator to publish an actual Android uprev.
— def DoRunSteps(api, properties):
— def RunSteps(api, properties):
DEPS: cros_build_api, cros_cq_depends, cros_infra_config, cros_source, cros_tags, easy, gerrit, git, git_footers, git_txn, naming, repo, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cq, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for the Chrome OS annealing builders.
The annealing builders run in serial and do the following:
— def RunSteps(api, properties):
DEPS: bot_cost, cros_tags, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: bot_cost, cros_tags, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_cost, cros_tags, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: bot_scaling, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_branch, cros_release_config, cros_source, easy, workspace_util, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for creating a new ChromeOS branch.
— def RunSteps(api, properties):
DEPS: breakpad, cros_test_postprocess, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: breakpad, cros_test_postprocess, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: android, build_menu, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for building a BuildTarget image for Android uprev.
The target Android package/version to uprev is specified via input properties, for example:
"$chromeos/android": { "android_package": "android-vm-rvc", "android_version": "7444938" }
— def DoRunSteps(api, properties, config):
— def RunSteps(api, properties):
DEPS: build_menu, cros_source, gerrit, git, repo, depot_tools/depot_tools, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for building a Borealis rootfs image.
— def DoRunSteps(api, properties):
— def RunSteps(api, properties):
DEPS: bot_scaling, build_menu, cros_infra_config, cros_tags, easy, test_util, recipe_engine/buildbucket, recipe_engine/random, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for building a BuildTarget image for CQ.
— def DoRunSteps(api, config):
— def RunSteps(api):
DEPS: build_menu, cros_build_api, cros_sdk, easy, src_state, test_util, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe that builds and tests firmware.
This recipe lives on its own because it is agnostic of ChromeOS build targets.
— def RunSteps(api, properties):
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for generating artifacts for Informational builders.
This recipe supports the workflows needed for the ASan, UBSan, and fuzzer builder profiles.
— def RunSteps(api):
DEPS: build_menu, cros_artifacts, cros_build_api, cros_release, cros_sdk, cros_version, easy, failures, git, metadata_json, repo, src_state, test_util, depot_tools/depot_tools, recipe_engine/context, recipe_engine/cq, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe that builds chromeos-firmware on a firmware branch.
— def RunSteps(api, properties):
DEPS: build_menu, chromite, cros_build_api, cros_source, gerrit, repo, src_state, recipe_engine/step, recipe_engine/tricium
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for linting CLs with Cargo Clippy.
— def DoRunSteps(api, config, relevant_patchsets, _properties):
— def RunSteps(api, properties):
DEPS: build_menu, cros_bisect, cros_build_api, cros_history, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def DoRunSteps(api, properties):
— def RunSteps(api, properties):
DEPS: build_menu, cros_build_api, cros_relevance, git_footers, workspace_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, easy, test_util, recipe_engine/assertions, recipe_engine/properties, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: build_menu, test_util, recipe_engine/assertions, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: easy, phosphorus, tast_exec, tast_results, test_util, depot_tools/gsutil, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for building a Parallels image for testing.
This recipe runs on a lab drone and takes control of a physical DUT to build a new Parallels VM image, for later use in automated testing.
This recipe involves booting up Windows in a virtual machine on the DUT. The caller is responsible for ensuring this is only invoked in contexts where the necessary license(s) have been obtained. Contact parallels-cros@ for more details.
This recipe is invoked as part of uprev_parallels_pin.
— def RunSteps(api, properties):
— def build_vm_image(api, properties):
Builds a new VM image for testing.
Returns: image_name (str): The name of the generated image. image_size (int): The size of the generated image, in bytes. image_hash (str): The base64-encoded SHA256 hash of the generated image.
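The base64-encoded SHA256 hash mentioned for image_hash can be computed as below. This is a sketch of the stated encoding only; the helper name and the bytes-in interface are assumptions (the real recipe hashes the generated image file).

```python
import base64
import hashlib

def image_hash_b64(data):
    # Base64-encoded SHA-256 digest, matching the encoding described for
    # build_vm_image's image_hash return value (interface assumed).
    return base64.b64encode(hashlib.sha256(data).digest()).decode('ascii')
```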
— def invoke_tast(api, test_artifacts_dir, build_payload, dest_path):
Runs tast to build the new VM image.
Args: test_artifacts_dir (Path): The location of test artifacts produced by the build. build_payload (BuildPayload): Describes where the artifact is on GS. dest_path (Path): The location that the produced VM image should be copied to (on the local disk).
DEPS: build_plan, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_plan, cros_history, cros_infra_config, cros_relevance, git_footers, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: build_plan, cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, forced_rebuilds, expected_completed):
DEPS: build_plan, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_plan, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_plan, git_footers, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, expected_builders):
DEPS: build_menu, cros_history, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for building a BuildTarget image for Postsubmit.
— def DoRunSteps(api, config):
— def RunSteps(api):
DEPS: bot_scaling, build_menu, build_reporting, builder_metadata, cros_build_api, cros_infra_config, cros_release, cros_signing, cros_source, cros_tags, debug_symbols, easy, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/properties, recipe_engine/runtime, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for building images for release.
— def DoRunSteps(api, config):
— def RunSteps(api):
DEPS: build_reporting, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_reporting, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_reporting, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_reporting, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, cros_history, cros_infra_config, cros_relevance, cros_tags, test_util, recipe_engine/buildbucket, recipe_engine/runtime, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for building and testing a BuildTarget's packages.
— def DoRunSteps(api, config):
— def RunSteps(api):
DEPS: build_menu, cros_build_api, cros_sdk, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2
Builds and uploads the Chromium OS toolchain.
— def RunSteps(api):
DEPS: buildbucket_stats, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: buildbucket_stats, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: buildbucket_stats, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, builder_metadata, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Tests to verify builder_metadata.get_models.
— def RunSteps(api):
DEPS: build_menu, builder_metadata, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Tests to verify that builder_metadata is properly cached between invocations.
— def RunSteps(api):
DEPS: build_menu, builder_metadata, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Test to verify install_packages is called prior to look_up_builder_metadata.
— def RunSteps(api):
DEPS: cros_source, gerrit, repo, src_state, test_util, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Check that any binary blobs in a commit come from a valid FIT version
— def RunSteps(api, properties):
— def mock_fit_header(version):
Mock the header from the FIT tool with the given version
Args: version (str): version string to put in FIT header
Return: version file contents as string
— def mock_version_file(version='14.0.40.1206', hashes=None, delete=None):
Mock version file contents
Args: version (str): optional version string to put in FIT header hashes (dict): file => sha256 values to override/add to file delete ([str]): list of keys to remove from the file (default none)
Return: version file contents as string
— def parse_versions_file(step_name, api, path):
Parse a file containing SHA-256 hashes and binary names into a map.
This file is generated by gen_hash_references.sh in the sys-boot overlays of individual baseboards. It's a list of SHA-256 hashes and associated file names.
Args: path (str): path to versions file
Return: (Fit Version, { filename => SHA-256 hash })
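The parsing described above can be sketched roughly as follows. The layout is an assumption (first non-empty line holds the FIT version, each remaining line pairs a SHA-256 hash with a file name); the real file written by gen_hash_references.sh may differ in detail.

```python
def parse_versions_file_text(text):
    # Sketch with an assumed layout: version on the first non-empty line,
    # then "<sha256-hash>  <filename>" per line.
    lines = [line.strip() for line in text.splitlines() if line.strip()]
    version = lines[0]
    hashes = {}
    for line in lines[1:]:
        digest, filename = line.split(None, 1)
        hashes[filename] = digest
    return version, hashes
```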
DEPS: cros_source, gerrit, gs_step_logging, repo, src_state, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Checks a project conforms to its program's constraints.
— def RunSteps(api, properties):
DEPS: chrome, recipe_engine/context, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: chrome, cros_build_api, gerrit, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
— def jsonify(**kwargs):
Return the kwargs as a json string.
DEPS: chrome, recipe_engine/assertions, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: chrome, cros_build_api, recipe_engine/assertions, recipe_engine/file
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_cost, chromite, easy, gcloud, depot_tools/gitiles, recipe_engine/legacy_annotation, recipe_engine/properties, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2
— def DoRunSteps(api):
— def MakeSummaryMarkdown(api, failure):
— def RunSteps(api):
DEPS: git, git_cl, gitiles, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for the Chrome uprev builder.
Triggers a passive uprev attempt against current Chrome ToT by generating a CL that touches chromeos-chrome-9999.ebuild, adding the gardeners as reviewers, and triggering a CQ dry-run.
— def RunSteps(api):
DEPS: chromite, depot_tools/gitiles, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: recipe_engine/cipd, recipe_engine/properties, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
— def get_current_instance(api, instruction):
Get the current version of the ref.
Args: instruction: the instruction for 'cipd set-ref'.
Returns: cipd_uprev.Instance
Raises: A StepFailure if the CIPD tool call fails.
— def uprev_package(api, instruction, package_tags=None):
Change CIPD ref of a package according to the instructions.
Args: instruction: the instruction for 'cipd set-ref'.
— def validate(api, instruction):
Validate instructions for uprevving a specific package.
Args: instruction: the instruction for 'cipd set-ref'.
Raises: A ValueError if validation fails.
DEPS: cros_cq_depends, cros_source, easy, gerrit, git, repo, src_state, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
Used to create sweeping changes by creating CLs in many repos.
This recipe is currently focused on the use case of running gen_config in program and project repositories. Invocation is most easily handled via the cl_factory script in the chromiumos/config repo's bin directory:
https://chromium.googlesource.com/chromiumos/config/+/HEAD/bin/cl_factory
That script is a wrapper around the bb add command, which ends up executing something that looks like this:
bb add -cl https://chrome-internal-review.googlesource.com/c/chromeos/program/galaxy/+/3095418 -p 'repo_regexes=["src/project/galaxy"]' -p 'message_template=Hello world
BUG=chromium:1092954 TEST=None' -p 'reviewers=["reviewer@google.com"]' -p 'hashtags=["mondo-update"]' -p 'replace_strings=[{"file_glob": "*.star", "before": "_CLAMSHELL", "after": "_CONVERTIBLE"}]' chromeos/infra/ClFactory
For more details on the input properties, see cl_factory.proto.
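The replace_strings rules from the example above behave roughly as sketched below. Field names ("file_glob", "before", "after") are taken from the example; the authoritative definition lives in cl_factory.proto, and this in-memory version (filename to contents mapping) is purely illustrative.

```python
import fnmatch

def apply_replace_strings(files, replace_strings):
    # Sketch: for each rule, rewrite files whose name matches the glob,
    # replacing `before` with `after`. `files` maps filename -> contents;
    # a new mapping with substitutions applied is returned.
    out = {}
    for name, text in files.items():
        for rule in replace_strings:
            if fnmatch.fnmatch(name, rule['file_glob']):
                text = text.replace(rule['before'], rule['after'])
        out[name] = text
    return out
```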
— def RunSteps(api, properties):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cloud_pubsub, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, raise_on_failed_publish):
DEPS: git, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for building the Cloudready shim.
— def RunSteps(api):
DEPS: build_menu, cros_build_api, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, code_coverage, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, code_coverage, recipe_engine/raw_io, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, code_coverage, recipe_engine/cq, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_artifacts, cros_infra_config, cros_source, easy, git, git_txn, repo, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Copy legacy configuration and generate backfilled configuration.
— def RunSteps(api, properties):
— def backfill_project(api, config):
Backfill an individual project.
Expects to be run in the root of the chromeos checkout.
Args: config (ConfigBackfillProperties.ProjectConfig) - configuration for project
Return: BackfillStatus with the results of the backfill. The commit hash is empty if no commit is made.
— def config_merger(api, config, path_cros_repo, step_pres):
Create a closure to merge configs.
Meant to be called from git_txn.update_ref, which requires a single function taking no arguments, so close on what we need.
Args: api: Reference to recipes API config: Merge config to execute path_cros_repo: Path to root of ChromeOS checkout step_pres: Step presentation instance
Return: closure to execute merge operation
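The closure pattern this docstring describes can be sketched as below. update_ref here is only a stand-in for git_txn.update_ref (which may retry the callable on a racing ref update); the point is that the callable takes no arguments, so the merge context is captured by closing over it.

```python
def update_ref(fn):
    # Stand-in for git_txn.update_ref: invokes a zero-argument callable.
    return fn()

def config_merger(config, path_cros_repo):
    # update_ref passes no arguments, so close over everything the
    # merge needs (sketch; real closure also captures api and step_pres).
    def merge():
        return 'merged %s in %s' % (config, path_cros_repo)
    return merge
```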
— def create_download_payload(build):
Build a download payload.
Args: build: a Build message with output properties
Return: A BuildPayload if the Build message contains all the necessary information, otherwise None
— def create_portage_workaround(api):
Hack around needing a full portage environment for reef/fizz.
Reef/fizz require their baseboard overlay to include common files. We can work around this by using symlinks to simulate the overlay.
— def download_latest_config_yaml(api, builder_name):
Download latest project config.yaml from GS.
Args: api: Reference to recipes API builder_name: the full name of the builder to search for
Return: List containing the path where the downloaded config.yaml file resides, or empty list if no GS path was found for the builder.
— def format_output_markdown(commits, errors, nmissing):
Generate markdown to be shown for the build status.
Args: commits: list of (program, project, hash) values for commits errors: list of string-formattable errors nmissing: number of projects missing from manifest
Return: Formatted markdown string suitable to return via RawResult proto.
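A hypothetical rendering of these inputs might look like the sketch below. The exact bullet wording is invented; only the documented inputs (commit tuples, errors, missing-project count) come from the docstring.

```python
def format_output_markdown_sketch(commits, errors, nmissing):
    # Sketch: one bullet per backfill commit, one per error, plus a
    # trailing count of projects missing from the manifest (assumed format).
    lines = ['* %s/%s: committed %s' % c for c in commits]
    lines.extend('* error: %s' % e for e in errors)
    if nmissing:
        lines.append('%d project(s) missing from manifest' % nmissing)
    return '\n'.join(lines)
```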
— def require(cond, message):
Require a given condition be true or throw a ValueError.
— def split_overlay_project(api, repo):
Take a private overlay URL and parse out the project name.
DEPS: cros_source, easy, failures, gerrit, git, git_txn, repo, src_state, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Run miscellaneous actions on project repos.
Runs on a schedule rather than as a triggered/CQ action, so there is some latency between commits landing and this script executing its tasks.
For example, if a src/project repo has filtered public configs, there can be an action to copy these public configs to a public repo.
Each action is a function that takes a list of config repos to operate on and returns a list of repos to make commits to.
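The action contract described above can be sketched as a small runner. This is an illustration of the stated interface only (action names and the runner itself are hypothetical).

```python
def run_actions(actions, config_repos):
    # Documented contract: each action takes the list of config repos and
    # returns the repos it wants commits made to; collect the union.
    to_commit = set()
    for action in actions:
        to_commit.update(action(config_repos))
    return sorted(to_commit)
```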
— def RunSteps(api, properties):
DEPS: cq_looks, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: build_menu, cros_build_api, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_artifacts, cros_test_plan, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_artifacts, cros_build_api, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_artifacts, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_artifacts, cros_test_plan, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_artifacts, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
— def attempt_download_file(api, attempt):
— def attempt_publish_file(api, attempt):
DEPS: cros_artifacts, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
— def attempt_download_file(api, attempt):
DEPS: cros_artifacts, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_bisect, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_bisect, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_branch, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_branch, recipe_engine/assertions, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_cache, recipe_engine/assertions, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_cq_depends, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_cq_depends, cros_source, gerrit, repo, src_state, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_dupit, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_dupit, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_history, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, expected_builder_names):
DEPS: cros_history, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, is_retry):
DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, src_state, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_paygen, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_paygen, cros_storage, gitiles, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, max_batch_size, paygen_requests, expected_batches):
DEPS: cros_paygen, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, gen_req_ser, expected_test_configs_ser, delta_test_override, full_test_override, fsi):
DEPS: cros_paygen, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, cros_prebuilts, git, src_state, test_util, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, cros_prebuilts, test_util, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_prebuilts, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_provenance, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, build_reporting, cros_release, git, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api):
DEPS: cros_release, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api, fsi, expected_models):
DEPS: cros_release, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api):
DEPS: cros_release_config, repo, recipe_engine/file, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api, properties):
— def construct_legacy_config(*blocks):
— def expected_config(*blocks):
DEPS: cros_release_util, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_release_util, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, cros_relevance, cros_source, src_state, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_relevance, git_footers, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_relevance, gerrit, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, patch_sets):
DEPS: cros_history, cros_relevance, cros_source, src_state, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_history, cros_relevance, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_relevance, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_relevance, git_footers, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, expected_builders):
DEPS: cros_resultdb, cros_tags, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_resultdb, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/properties, recipe_engine/resultdb
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_resultdb, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/json
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_schedule, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_schedule, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_sdk, recipe_engine/assertions, recipe_engine/file, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_sdk, goma, remoteexec, workspace_util, recipe_engine/assertions, recipe_engine/file, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_sdk, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_sdk, git, src_state, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, is_chroot_usable):
DEPS: cros_sdk, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, cros_sdk, src_state, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_signing, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Success workflow tests for the cros_signing recipe module.
— def RunSteps(api):
DEPS: cros_signing, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Verify that instructions files are in the appropriate format.
— def RunSteps(api):
DEPS: cros_signing, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Verify that wait_for_signing is required before retrieving signed build metadata.
— def RunSteps(api):
DEPS: cros_som, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, src_state, recipe_engine/assertions, recipe_engine/context, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_source, repo, src_state, recipe_engine/assertions, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, cros_source, gerrit, src_state, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, cros_source, repo, recipe_engine/context, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, git, test_util, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, gerrit, git, repo, src_state, test_util, depot_tools/gitiles, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_source, repo, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_source, gcloud, src_state, recipe_engine/path, recipe_engine/properties, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, repo, src_state, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, repo, recipe_engine/context, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, src_state, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_storage, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_storage, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_tags, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_tags, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, keyvals, check_key, expected_value):
DEPS: cros_tags, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, keyvals, check_key, expected_values):
DEPS: cros_test_plan, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_plan, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_plan_v2, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_plan_v2, gerrit, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_plan_v2, gerrit, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_plan_v2, recipe_engine/assertions, recipe_engine/file, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_platform, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_postprocess, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_proctor, recipe_engine/assertions, recipe_engine/file
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_bisect, cros_history, cros_relevance, cros_test_proctor, easy, gerrit, skylab, src_state, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/file, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, need_tests_builds_serialized, run_async, use_test_plan_v2):
DEPS: cros_test_plan, cros_test_proctor, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, passed_tests, is_retry, expected_tests_run_count):
DEPS: cros_test_runner, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_tool_runner, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
— def mock_metadata(target=‘test-target’):
DEPS: cros_infra_config, cros_version, src_state, test_util, workspace_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/properties, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_version, git_footers, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_version, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
Tests for api.cros_version.Version.
— def RunSteps(api):
DEPS: cros_version, recipe_engine/assertions, recipe_engine/cq, recipe_engine/file
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cts_results_archive, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: debug_symbols, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: disk_usage, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_dupit, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for syncing remote, distributed tarballs to our local cache.
— def RunSteps(api, properties):
DEPS: cros_dupit, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for syncing Archlinux to our local cache for the Borealis VM image.
— def RunSteps(api):
DEPS: dut_interface, phosphorus, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time, recipe_engine/uuid
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, easy, swarming_cli, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for the Star Doctor.
Automatically updates binary config files and Goldeneye config JSON files.
— def RunSteps(api):
DEPS: easy, recipe_engine/assertions, recipe_engine/json, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: easy, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: easy, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_tags, easy, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, skylab, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, skylab, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, skylab, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: exonerate, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, failures, naming, test_util, urls, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: failures, skylab, urls, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, urls, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
— def vm_build(**kwargs):
DEPS: bot_cost, cros_infra_config, cros_source, easy, gerrit, git, orch_menu, src_state, test_util, recipe_engine/buildbucket, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe that schedules child builders and watches for failures.
— def RunSteps(api):
DEPS: gce_provider, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gce_provider, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: failures, gcloud, tast_exec, tast_results, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/random, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
An experimental recipe for running GCE tests.
— def RunSteps(api, properties):
DEPS: gcloud, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gcloud, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gcloud, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, gcloud, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_build_api, cros_cq_depends, cros_sdk, cros_source, easy, gerrit, git, git_footers, gitiles, naming, pupr, repo, src_state, test_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cq, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/scheduler, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for the PUpr generator.
PUpr is a general uprev pipeline that listens for package releases (via LUCI Scheduler gitiles triggers), generates ebuild uprev CLs for those releases, and tags the appropriate reviewers. Think of it as the CrOS autoroller.
See go/pupr and go/pupr-generator for rationale and design decisions.
— def RunSteps(api, properties):
— def response_has_changes(api, response):
Returns whether the given UprevPackagesResponse contains changes.
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, git_cl, src_state, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, src_state, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, src_state, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, src_state, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: gerrit, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: gerrit, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: gerrit, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gerrit, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git, src_state, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git, recipe_engine/assertions, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: git, src_state, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git, recipe_engine/assertions, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git, src_state, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git_cl, recipe_engine/assertions, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git_cl, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git_cl, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git_footers, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
Test git_footers calls.
— def RunSteps(api, invalid_cr_commit_position):
DEPS: git_txn, src_state, recipe_engine/context, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git_txn, src_state, recipe_engine/assertions, recipe_engine/context, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gitiles, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: test_util, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/scheduler, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe that schedules jobs based on its triggers.
— def RunSteps(api, properties):
DEPS: goma, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: goma, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: goma, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: goma, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: goma, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_tags, greenness, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: greenness, skylab, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: greenness, test_util, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: gs_step_logging, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: iterutils, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: build_menu, cros_build_api, cros_sdk, cros_source, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for testing the kernel splitconfig normalization.
The kernel split config design is documented at https://www.chromium.org/chromium-os/how-tos-and-troubleshooting/kernel-configuration/ and go/mini-splitconfigs.
— def RunSteps(api, properties):
DEPS: stable_version, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for syncing the stable version for ChromeOS build targets & models.
— def RunSteps(api, properties):
— def fetch_and_commit(api):
Fetch the newest stable version and commit it to the config file in git.
Returns: A string Gerrit CL link.
— def validate_stable_version(api):
Validate the remote stable version config file.
Returns: JSON response with the validation result.
DEPS: build_menu, cros_source, git, repo, src_state, recipe_engine/context, recipe_engine/file, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for updating the libchrome upstream branch.
— def RunSteps(api):
DEPS: cros_source, git, gs_step_logging, repo, src_state, workspace_util, depot_tools/depot_tools, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/json, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Runs the presubmit for a project with checkout per local manifest.
— def RunSteps(api, properties):
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for syncing LVFS files (https://fwupd.org/) to our local cache.
— def RunSteps(api):
DEPS: bot_cost, cros_infra_config, cros_source, repo, workspace_util, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for performing various manipulations on ChromeOS manifests.
— def RunSteps(api, properties):
— def ensure_manifest_doctor(api, properties):
DEPS: metadata, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: metadata_json, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: metadata_json, test_util, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: metadata_json, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, metadata_json, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, metadata_json, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: git, naming, skylab, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: naming, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, serialized_paygen_requests, expected_url_title):
DEPS: naming, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, serialized_paygen_request, expected_url_title):
DEPS: bot_cost, cros_source, gerrit, git, depot_tools/depot_tools, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for running presubmit on CLs for projects not in the manifest.
— def RunSteps(api, properties):
— def apply_changes_and_run_presubmits(api, project, patch_sets):
For a given project, clone the repo, apply changes and run presubmits.
Args: api (RecipeApi): See RunSteps documentation. project (str): The name of the Gerrit project. patch_sets (list[PatchSet]): List of changes to apply for the given project.
— def categorize_changes(api, properties, gerrit_changes):
Group changes by Gerrit project.
Groups changes by Gerrit project and discards any changes to Gerrit projects not supported by the builder, as specified in the “projects” input property.
Args: properties (PresubmitTestProperties): Build input properties. gerrit_changes (list[GerritChange]): The Gerrit changes passed in to the build from CQ.
Returns: project_to_patchest_map (dict{str:list[PatchSet]}): Dict mapping the name of the Gerrit project to a list of relevant PatchSets.
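The grouping described above can be sketched as follows. This is a minimal illustration, not the recipe's actual implementation: the PatchSet tuple and the supported-project set are hypothetical stand-ins for the real GerritChange protos and the “projects” input property.

```python
# Sketch of categorize_changes: bucket changes by Gerrit project and drop
# changes to projects the builder does not support.
from collections import namedtuple

# Illustrative change record; the real recipe uses GerritChange protos.
PatchSet = namedtuple("PatchSet", ["project", "change", "patch_set"])

def categorize_changes(supported_projects, gerrit_changes):
    """Group changes by Gerrit project, discarding unsupported projects."""
    project_to_patchsets = {}
    for change in gerrit_changes:
        if change.project not in supported_projects:
            continue  # builder does not run presubmits for this project
        project_to_patchsets.setdefault(change.project, []).append(
            PatchSet(change.project, change.change, change.patch_set))
    return project_to_patchsets
```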
DEPS: orch_menu, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api, properties):
DEPS: cros_source, cros_tags, cros_test_plan, gerrit, git_footers, orch_menu, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/file, recipe_engine/json, recipe_engine/properties, recipe_engine/resultdb, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api, properties):
DEPS: failures, orch_menu, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api):
DEPS: orch_menu, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api, properties):
DEPS: cros_release, cros_tags, orch_menu, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe that schedules child builders and watches for failures.
All builders run against the same source tree.
— def DoRunSteps(api):
— def RunSteps(api):
DEPS: easy, failures, tast_exec, tast_results, recipe_engine/file, recipe_engine/futures, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Test reven (aka ChromeOS Flex) installation.
Reven can be installed by end users from a USB device. This operation can be tested with the OsInstall tast test, but that test can't be run as part of the regular suite of VM tests as it requires some additional setup, which this recipe provides. Specifically, OS installation shuts the machine down at the end. The test then waits for the machine to be powered back up in the installed state to verify the installation succeeded.
Here's how the recipe operates:
An empty target disk is created as the destination to install to.
A BuildPayload is downloaded and prepped. The board being tested is reven-vmtest. That board inherits from the base reven board and is specifically intended for VM tests. The source disk must be tweaked slightly to make it look like an installer image; see make_into_installer.
A VM is launched with two disks: the source installer disk and the empty target disk.
A future is spawned to run the OsInstall tast test.
The recipe then starts polling, waiting for the VM to shut down.
Once the VM shuts down, a new VM is launched with just the target disk, which should now contain the installed image.
Meanwhile, the future with the tast test is still running. Once the VM boots back up, the test reconnects to it and verifies that the installation succeeded.
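The shutdown-polling step above can be sketched as a simple loop. This is an illustrative sketch only: `vm_is_running` is a hypothetical probe standing in for however the recipe actually checks the VM process, and the timeout values are made up.

```python
# Sketch of polling until the VM shuts down (or a deadline passes).
# The clock and sleep functions are injectable so the loop is testable.
import time

def wait_for_shutdown(vm_is_running, timeout_s=600, interval_s=5,
                      clock=time.monotonic, sleep=time.sleep):
    """Poll vm_is_running until it returns False; True on clean shutdown."""
    deadline = clock() + timeout_s
    while clock() < deadline:
        if not vm_is_running():
            return True  # VM has shut down; safe to relaunch with target disk
        sleep(interval_s)
    return False  # timed out waiting for shutdown
```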
— def RunSteps(api, properties):
— def make_into_installer(api, image_path):
Modify the raw disk image at image_path to make it installable.
The disk layout of the reven-vmtest board is slightly different from the reven board; to allow update testing to work it has a full-size ROOT-B partition. The OS installer distinguishes an installer from an installed image by checking if the ROOT-A and ROOT-B partitions have different sizes, so to make the image being tested look like an installer, shrink the ROOT-B partition down to a single block.
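The installer-detection rule described above reduces to a size comparison, sketched here. Partition sizes are modeled as plain block counts in a dict; the real recipe inspects the GPT of the disk image, so the helper names and data shape are assumptions for illustration.

```python
# Sketch of the rule: an image reads as an installer when ROOT-A and
# ROOT-B have different sizes, so shrinking ROOT-B to one block converts
# an installed-style image into an installer-style one.
def looks_like_installer(partitions):
    """Return True if the partition sizes mark the image as an installer."""
    return partitions["ROOT-A"] != partitions["ROOT-B"]

def make_into_installer(partitions):
    """Shrink ROOT-B to a single block so the image reads as an installer."""
    tweaked = dict(partitions)  # leave the caller's mapping untouched
    tweaked["ROOT-B"] = 1
    return tweaked
```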
— def run_tast(api, properties, vm, tast_inputs, test_results_dir):
DEPS: overlayfs, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, cros_paygen, cros_sdk, cros_source, cros_storage, easy, git, gitiles, naming, src_state, workspace_util, recipe_engine/futures, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for generating ChromeOS payloads (AU deltas etc).
— def RunSteps(api, properties):
DEPS: cros_paygen, cros_release_util, cros_storage, easy, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for orchestrating ChromeOS payloads (AU deltas etc).
— def RunSteps(api, properties):
— def py2_MessageToJson(obj):
DEPS: phosphorus, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: phosphorus, recipe_engine/assertions, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: portage, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, cros_tags, failures, naming, test_util, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2
Launches presubmit tests for CQ.
— def RunSteps(api, properties):
DEPS: bot_cost, cros_infra_config, cros_source, gerrit, git, gitiles, repo, src_state, test_util, workspace_util, depot_tools/depot_tools, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for running presubmit on multiple CLs.
— def RunSteps(api, properties):
DEPS: bot_cost, cros_infra_config, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for invoking the per project buildspec tool.
— def RunSteps(api, properties):
— def ensure_manifest_doctor(api, properties):
DEPS: gerrit, pupr, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
— def patch_set_from_dict(api, changes):
DEPS: gerrit, pupr, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: recipe_analyze, recipe_engine/assertions, recipe_engine/json
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_build_api, cros_sdk, cros_source, git, repo, util, workspace_util, recipe_engine/context, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for the Chrome OS Build Metadata Cache Regenerator.
— def RunSteps(api):
DEPS: remoteexec, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: easy, repo, recipe_engine/assertions, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: repo, recipe_engine/context, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: repo, recipe_engine/assertions, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: repo, recipe_engine/assertions, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
— def WithArgsTest(api, state_name, state, local_manifest):
— def WithManifestNameTest(api, state_name, state):
— def WithNonePruneTest(api, state_name, state):
— def WithretryTest(api, state_name, state, local_manifest):
DEPS: repo, src_state, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: repo, recipe_engine/assertions, recipe_engine/context, recipe_engine/file, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: repo, recipe_engine/assertions, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: repo, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
— def attempt_retry_repo(api, attempt):
DEPS: repo, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
— def attempt_retry_repo(api, attempt):
DEPS: repo, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, repo, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: repo, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: repo, recipe_engine/context, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: result_flow, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, cros_infra_config, easy, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for scaling bots in Chrome and Chrome OS pools.
— def RunSteps(api, properties):
DEPS: service_version, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, cros_tags, easy, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/random, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for signing ChromeOS images.
— def RunSteps(api, properties):
Run steps.
DEPS: cros_test_plan, git_footers, metadata, skylab, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_plan, skylab, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: skylab, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_test_plan, metadata, skylab, recipe_engine/properties, recipe_engine/raw_io
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: chrome, cros_cache, cros_infra_config, cros_release, easy, failures, gcloud, git, repo, src_state, depot_tools/depot_tools, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for generating ChromeOS source cache snapshots.
— def RunSteps(api, properties):
DEPS: src_state, test_util, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: src_state, test_util, recipe_engine/assertions, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: src_state, test_util, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: src_state, test_util, recipe_engine/assertions, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: src_state, test_util, recipe_engine/assertions, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: src_state, test_util, recipe_engine/assertions, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: src_state, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: src_state, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: stable_version, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_schedule, gerrit, git, git_cl, gitiles, depot_tools/depot_tools, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for the Star Doctor.
Automatically updates binary config files and Goldeneye config JSON files.
— def RunSteps(api, properties):
DEPS: support, recipe_engine/assertions, recipe_engine/json
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_scaling, swarming_cli, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_infra_config, sysroot_util, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_build_api, cros_infra_config, cros_relevance, sysroot_util, test_util, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, cros_sdk, sysroot_util, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_infra_config, cros_sdk, sysroot_util, recipe_engine/assertions, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: tast_exec, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: tast_results, recipe_engine/path, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: tast_results, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: tast_results, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: failures, tast_exec, tast_results, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
An experimental recipe for running Tast VM tests without a chroot or ChromeOS checkout, resulting in much faster tests. The tests use the tast executable from build_artifacts.
— def RunSteps(api, properties):
DEPS: build_menu, cros_build_api, cros_sdk, test_util
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe that tests chromite.
This recipe lives on its own because it is agnostic of ChromeOS build targets.
— def RunSteps(api):
DEPS: cros_branch, cros_source, git, repo, src_state, depot_tools/depot_tools, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cq, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Verifies a repo manifest.
— def RunSteps(api, properties):
DEPS: cros_source, gerrit, git, git_txn, repo, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Updates test plan rules to reflect new risk-based rules.
— def RunSteps(api, properties):
DEPS: cros_resultdb, cros_tags, cros_test_platform, result_flow, service_version, skylab, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/properties, recipe_engine/random, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for the ChromeOS Test Frontend.
— def RunSteps(api, properties):
— def add_container_metadata(api, requests, error_in_requests):
Add container metadata to requests when required.
Args:
— def enumerate_tests(api, requests, error_in_requests):
Resolve request into list of tests and their metadata.
Args:
Returns: {tag: EnumerationResponse} dict.
— def execute(api, requests):
Execute request in the correct backend.
Args: requests: ExecutionRequests payload.
— def link_to_parent(api):
Attach the parent buildbucket id(s) to the current buildbucket id if any parent buildbucket(s) exists.
Returns: The parent buildbucket id list if it exists. Otherwise, returns an empty list.
— def output_ctp_release_timestamp_tag(api):
Get the timestamped release tag of the cros_test_platform CIPD packages in use.
— def postprocess(api, requests, responses):
— def publish_to_result_flow(api, config, should_poll_for_completion=False):
Publish build info to result_flow PubSub.
Args:
— def set_output_properties(api, responses):
Set the output properties that are part of the cros_test_platform API.
— def sort_task_results_by_state(task_results):
— def summarize(api, enumerations, responses, error_in_requests):
— def validated_requests(api, properties):
Get and validate requests from input properties.
Returns: Struct containing requests.
DEPS: breakpad, cros_test_postprocess, urls, util, depot_tools/gsutil, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
— def RunSteps(api, properties):
DEPS: cros_test_platform, skylab, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe that triggers cros_test_platform runs.
— def RunSteps(api, properties):
DEPS: phosphorus, service_version, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: ipc, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: ipc, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: result_flow, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
— def run_test_ctp_flow(api, config, deadline):
— def run_test_runner_flow(api, config, deadline):
DEPS: cros_resultdb, cros_tags, cros_test_runner, cros_tool_runner, cts_results_archive, dut_interface, easy, phosphorus, result_flow, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/step, recipe_engine/swarming, recipe_engine/time, recipe_engine/uuid
PYTHON_VERSION_COMPATIBILITY: PY2
Recipe for the ChromeOS Skylab Test Runner.
— def RunSteps(api, properties):
— def archive_all_logs(api, interface, test_metadata, result):
Archive all test logs to Google Storage, updating result in the process.
Args:
Raises:
— def create_skylab_result(api, ctr_result, properties, dut_state):
Create skylab_result from ctr_result.
Args:
Returns: Skylab_result: The Skylab_result for the current test.
— def execution_steps_with_ctr(api, properties):
Runs all the non-UI-related steps using ctr.
Runs all tests specified in properties, saving relevant data, and returning the overall results.
Args:
Returns: DUTResult: The result for all tests run in this run.
Raises:
— def execution_steps_with_phosphorus(api, properties):
Runs all the non-UI-related steps.
Runs all tests specified in properties, saving relevant data, and returning the overall results.
Args:
Returns: DUTResult: The result for all tests run in this run.
Raises:
— def publish_to_result_flow(api, config, parent_request_uid, should_poll_for_completion=False):
Publish build info to result_flow PubSub.
Args:
— def s_link(step, name, link):
Add a link 'link' named 'name' to the step, if it exists.
Args:
— def s_log(step, name, log):
Add a log 'log' to the step's logs under 'name', if it exists.
Args:
— def set_output_properties(api, result):
Set the output properties that are part of the test_runner API.
Args:
— def summarize_results_from_ctr_results(api, result):
Display test cases (and failures) as recipe substeps through the api.
Args:
— def summarize_results_from_phosphorus_results(api, result):
Display test cases (and failures) as recipe substeps through the api.
Args:
— def validate_request(api, test):
Validate the TestRunnerProperties.
Args:
Raises:
DEPS: failures, gerrit, git, naming, recipe_analyze, depot_tools/gclient, depot_tools/tryserver, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cq, recipe_engine/file, recipe_engine/json, recipe_engine/led, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
Tests a recipe CL by running ChromeOS builders.
— def RunSteps(api, properties):
DEPS: build_menu, cros_build_api, cros_sdk, test_util
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe that runs bazel rules_cros unit tests.
This recipe lives on its own because it is agnostic of ChromeOS build targets.
— def RunSteps(api):
DEPS: build_menu, cros_build_api, cros_relevance, cros_sdk, failures, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe that runs SDK package unit tests.
This recipe lives on its own because it is agnostic of ChromeOS build targets.
— def RunSteps(api):
DEPS: gerrit, git, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe to test the UEFI shim for the reven board.
— def RunSteps(api):
DEPS: cros_tags, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: test_util, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: bot_cost, cros_infra_config, cros_source, cros_version, gerrit, git, gitiles, repo, src_state, test_util, workspace_util, depot_tools/depot_tools, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/step, recipe_engine/swarming, recipe_engine/tricium
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for running tricium on CLs.
— def RunSteps(api):
DEPS: cros_source, cros_tags, gerrit, git, repo, src_state, workspace_util, depot_tools/gsutil, recipe_engine/archive, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for Upreving Guest VM version pin files.
This recipe copies a VM image artifact from the chromeos-image-archive to the localmirror and then modifies the Guest VM's version pin to match this version.
— def CopyLegacyReleaseImage(api, board, build, vm_property_map, sanitized_version):
— def CopyPostsubmitImage(api, board, build, vm_property_map, sanitized_version):
— def FindLegacyReleaseBuilds(api, board, version_build_map):
— def FindPostsubmitBuilds(api, board, version_build_map):
— def RunSteps(api, properties):
DEPS: build_menu, cros_artifacts, cros_build_api, cros_sdk, cros_source, gerrit, git, naming, repo, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming, recipe_engine/time
PYTHON_VERSION_COMPATIBILITY: PY2+3
Recipe for generating Parallels uprev CLs.
This recipe generates CLs to uprev Parallels binaries. As part of these CLs, a new Parallels VM image is produced for testing.
This recipe involves booting up Windows in a virtual machine. The caller is responsible for ensuring this is only invoked in contexts where the necessary license(s) have been obtained.
— def RunSteps(api, properties):
— def build_os_with_uprev(api, properties, package, upstream_version):
Builds a version of Chrome OS with given version of the Parallels package.
The build will still contain an old VM image for testing.
Args: package (chromiumos.PackageInfo): the identity of the Parallels package. upstream_version (str): the version of Parallels to include in the build.
Returns: BuildPath: where the build artifacts were uploaded.
@exponential_retry(retries=2)
— def build_vm_image(api, properties, artifacts_path, parallels_version):
Builds a new VM image for testing.
Args: artifacts_path (BuildPath): The location of build output artifacts. parallels_version (str): The Parallels version included in the given build.
Returns: dict: The details of the new test image.
— def commit_pin_uprev(api, properties, package, new_version_pin):
Commits and uploads the uprev of the version-pin file.
Args: package (chromiumos.PackageInfo): the package to include in the commit message. new_version_pin (VersionPin): the new version pin data.
— def get_latest_green_snapshot_commit(api, build_target):
Finds the latest green snapshot build for the given build target and returns the corresponding manifest gitiles (input) commit.
Args: build_target (str): The name of the build target.
— def get_upstream_version(api, properties):
Gets the latest version of Parallels from the upstream bucket.
Returns: string: the latest upstream version of Parallels.
— def get_version_path(api, properties):
Gets the path of the VERSION-PIN file.
— def get_version_pin(api, properties):
Reads and returns the content of the VERSION-PIN file.
Before calling this function, ensure a synced version of the source has been checked out.
Returns: VersionPin: the pinned version data.
— def is_version_after(version, previous_version):
Returns whether version occurs logically after previous_version.
For example, is_version_after(‘1.0.3.1’, ‘1.0.2.2’) returns true.
Args: version (str): The version to compare. previous_version (str): The previous version to compare with.
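The docstring's example can be reproduced with a short sketch. This is an illustration of the comparison described, not the recipe's actual implementation, and it assumes version strings are dot-separated numeric components:

```python
def is_version_after(version, previous_version):
    """Return True if 'version' occurs logically after 'previous_version'.

    Components are compared numerically, left to right, so that
    '1.0.10' sorts after '1.0.9' (a plain string compare would not).
    """
    a = [int(part) for part in version.split('.')]
    b = [int(part) for part in previous_version.split('.')]
    return a > b  # Python compares lists lexicographically, element-wise.


# The docstring's example: is_version_after('1.0.3.1', '1.0.2.2') -> True
```

Using integer lists rather than raw strings avoids the classic pitfall where '10' compares less than '9' lexicographically.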
— def set_version_pin(api, properties, new_version):
Sets the content of the VERSION-PIN file.
Before calling this function, ensure a synced version of the source has been checked out.
Args: new_version (VersionPin): the new version pin data.
— def uprev_package(api, properties, package, to_version):
Uprevs the Parallels package to the given version.
The Parallels package will be uprevved in the local checkout to the given version.
Args: package (chromiumos.PackageInfo): the package to uprev. to_version (str): the version to uprev to.
DEPS: skylab, urls, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
Basic tests for the urls recipe module.
— def RunSteps(api):
DEPS: urls, recipe_engine/assertions
PYTHON_VERSION_COMPATIBILITY: PY2+3
Basic tests for the urls recipe module.
— def RunSteps(api):
DEPS: util, recipe_engine/assertions, recipe_engine/step
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_source, src_state, test_util, workspace_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/properties, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: cros_cache, cros_source, workspace_util, recipe_engine/buildbucket
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):
DEPS: cros_cache, cros_source, repo, test_util, workspace_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/properties
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api, properties):
DEPS: repo, workspace_util, recipe_engine/assertions, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming
PYTHON_VERSION_COMPATIBILITY: PY2+3
— def RunSteps(api):