Repo documentation for chromeos

Table of Contents

Recipe Modules

Recipes

Recipe Modules

recipe_modules / analysis_service

DEPS: cloud_pubsub, recipe_engine/buildbucket, recipe_engine/step

API for publishing events to the Analysis Service.

class AnalysisServiceApi(RecipeApi):

@staticmethod
def can_publish_event(request: Message, response: Message):

Return whether ‘request’ and ‘response’ can be published.

Based on whether the types are both part of AnalysisServiceEvent. For example, can_publish_event(InstallPackagesRequest(), InstallPackagesResponse()) is true because the AnalysisServiceEvent contains these fields.

can_publish_event(NewRequest(), NewResponse()) would not be true, because those fields are not added to AnalysisServiceEvent.

Args: request (proto in AnalysisServiceEvent ‘request’ oneof): The request to log. response (proto in AnalysisServiceEvent ‘response’ oneof): The response to log.

Return: Whether an event can be published.

def publish_event(self, request: Message, response: Message, request_time: Timestamp, response_time: Timestamp, step_data: StepData, step_output: Optional[str]=None, max_stdout_stderr_bytes: int=_MAX_STDOUT_STDERR_BYTES):

Publish request and response on Cloud Pub/Sub.

Wraps request and response in an AnalysisServiceEvent. ‘can_publish_event’ must be called first (and return true). This method returns early if disable_publish is specified.

Does not check that request and response are corresponding types, e.g. it is possible to send an InstallPackagesRequest and SysrootCreateResponse; it is up to the caller not to do this.

Args: request (proto in AnalysisServiceEvent ‘request’ oneof): The request to log. response (proto in AnalysisServiceEvent ‘response’ oneof): The response to log. request_time: The time the request was sent by the caller. response_time: The time the response was received by the caller. step_data: Data from the step that sent the request. step_output: Output for the step. max_stdout_stderr_bytes: Truncate stdout and stderr to this many bytes.
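
A minimal usage sketch (variable names like request, response, and step_data are illustrative, not part of the module):

```python
# Hypothetical recipe snippet: check publishability, then publish.
if api.analysis_service.can_publish_event(request, response):
    api.analysis_service.publish_event(
        request=request,
        response=response,
        request_time=request_time,
        response_time=response_time,
        step_data=step_data,
    )
```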

recipe_modules / android

DEPS: cros_build_api, recipe_engine/step

class AndroidApi(RecipeApi):

def get_latest_build(self, android_package: str, android_branch: Optional[str]):

Retrieves the latest Android version for the given Android package.

Args: android_package: The Android package. android_branch: The Android branch, or chromite default if set to None.

Returns: The latest Android version (build ID).

def uprev(self, chroot: Chroot, sysroot: Sysroot, android_package: str, android_version: str, android_branch: Optional[str], ignore_data_collector_artifacts: bool=False):

Uprev the given Android package to the given version.

Args: chroot: Information on the chroot for the build. sysroot: The Sysroot being used. android_package: The Android package to uprev (e.g. android-vm-rvc). android_version: The Android version to uprev to (e.g. 7123456). android_branch: The Android branch, or chromite default if set to None.

Returns: Whether the android package has been uprevved.

def uprev_if_unstable_ebuild_changed(self, chroot: Chroot, sysroot: Sysroot, patch_sets: List[PatchSet]):

Uprev Android if changes are found in the unstable ebuild.

Args: chroot: Information on the chroot for the build. sysroot: The Sysroot being used. patch_sets: List of patch sets (with FileInfo).

def write_lkgb(self, android_package: str, android_version: str, android_branch: Optional[str]):

Sets the LKGB of the given Android package to the given version.

Args: android_package: The Android package to set LKGB for. android_version: The LKGB Android version. android_branch: The LKGB Android branch.

Returns: List of modified files.
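
A hedged sketch of how these calls could chain in an uprev flow (the package name and the chroot/sysroot objects are assumptions):

```python
# Illustrative only: uprev to the latest build, then record it as LKGB.
version = api.android.get_latest_build('android-vm-rvc', android_branch=None)
if api.android.uprev(chroot, sysroot, 'android-vm-rvc', version,
                     android_branch=None):
    modified_files = api.android.write_lkgb('android-vm-rvc', version,
                                            android_branch=None)
```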

recipe_modules / auto_retry_util

DEPS: buildbucket_stats, cros_history, cros_infra_config, cros_tags, deferrals, easy, exonerate, exoneration_util, gerrit, git_footers, looks_for_green, naming, skylab_results, tast_results, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Utility function for CQ auto retry.

class AutoRetryUtilApi(RecipeApi):

A module for util functions associated with the CQ auto retries.

def analyze_build_failures(self, cq_run: build_pb2.Build):

def analyze_test_results(self, cq_run: build_pb2.Build):

Returns a list of test suite names grouped by retryable status.

Args: cq_run: The cq-orchestrator for which to analyze the test results.

Returns: A tuple containing 3 lists of test suite names grouped by whether the suite was successful, failed but is retryable, or failed and is not retryable.

def build_was_dry_run(self, build: build_pb2.Build):

Returns whether the build was a dry run.

Note this assumes the $recipe_engine/cq.runMode input property is set, which may not be true for some builds (e.g. manually triggered builds).

Args: build: The build for which to determine whether it is a dry run.

@property
def builds_comment_limit(self):

def cq_retry_candidates(self):

Returns cq-orchestrator builds which may be eligible for auto retry.

Candidate cq-orchestrator builds must meet the following criteria:

  • The build status is in RETRYABLE_STATUSES.
  • The build is the latest cq attempt for the CLs under test.
  • The build had a supported failure mode.
  • The CLs under test are active.

@property
def experimental_retries(self):

def filter_retry_candidates(self, cq_orchs: List[build_pb2.Build]):

Returns cq-orchestrator builds which meet the retry criteria.

cq-orchestrator builds must meet the following criteria:

  • No CL tested in the build opted-out via footer.

TODO(b/305741866): Define more elaborate retry limits.

  • The build was not last triggered by our service account.
  • All CLs in the build are mergeable (as defined by the Gerrit API's GetMergeable) and ready for submission.

def get_exonerated_suites(self, cq_run: build_pb2.Build, failed_test_stats: List[FailedTestStats]):

Returns the names of the exonerated test suites for the given CQ run.

Args: cq_run: The cq-orchestrator build for which to get the exonerated suites. failed_test_stats: A list of FailedTestStats to use when performing auto exoneration, rather than the FailedTestStats in the output properties of the build. This list of FailedTestStats should be updated using the latest LUCI analysis data.

Returns: The names of the exonerated test suites.

def get_failure_attributed_hw_suites(self, cq_run: build_pb2.Build, already_retryable_suites: Optional[Iterable[str]]=None):

Returns a list of suites that can have their failures attributed.

This method is based on the cq_fault_attributions output property of the CQ orchestrator; it does not do the actual attribution analysis. A suite is considered attributed if all of its failed test cases have the MATCHING_FAILURE_FOUND attribution for at least one build target. Note that this does not need to be the same build target, e.g. if test1 fails on targetA but has a matching failure on targetB, test1 is still considered attributed.

Args: cq_run: The CQ run to check for attributed tests. already_retryable_suites: Suites that have already been determined to be retryable via other methods (e.g. the suite is now exonerated). If set, these suites are removed from the returned list of attributed suites.

Returns: A list of suite names that have all failing tests attributed.

def initialize(self):

def is_experimental_feature_enabled(self, feature_name: str, build: build_pb2.Build):

Returns whether the given feature is enabled on the build.

@property
def lookback_seconds(self):

def no_retry_footer_set(self, build):

Returns whether any of an orchestrator's associated CLs have opted out via footer.

def publish_per_build_stats(self):

Write PerBuildStats to an output property.

PerBuildStats are logged for each build passed to analyze_build_failures. This function should be called after every call to analyze_build_failures is complete, to publish the stats to an output property.

To make SQL analysis easier, this function converts per_build_stats from a map to a list and adds a new field ‘build_id’. This is done because iterating a JSON object is less convenient than a list in most SQL dialects. build_id is a str to avoid integer truncation.

def retry_builds(self, retryable_runs: List[Tuple[(build_pb2.Build, RetryDetails)]]):

Performs retry on retryable builds.

Args: retryable_runs: List of tuples with failed build and retry details.

Returns: A tuple with the retried (old) builds and a list of exceptions.

@property
def suites_comment_limit(self):

def test_variant_exoneration_analysis(self, cq_run: build_pb2.Build):

Runs auto exoneration analysis and returns categorized FailedTestStats.

Uses the FailedTestStats reported in the output properties of the cq-orchestrator to query LUCI analysis for updated exoneration status of the failed test cases in a build.

Note: A test which was previously exonerated will not be updated such that it is no longer exonerated for the CQ run.

Args: cq_run: The cq-orchestrator for which to retrieve updated FailedTestStats.

Returns: A tuple containing 3 lists of FailedTestStats grouped by whether the test variant is previously exonerated, newly exonerated, or not exonerated.

def unthrottled_retries_left(self):

Returns the number of retries left below the 24-hour and 2-hour throttles.
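
Taken together, a retry pass could look roughly like the following sketch; only the method names come from this module, while the control flow and the retry_details value are assumptions:

```python
# Illustrative retry loop, not the module's actual recipe.
candidates = api.auto_retry_util.cq_retry_candidates()
retryable = api.auto_retry_util.filter_retry_candidates(candidates)
runs = []
for build in retryable:
    if api.auto_retry_util.unthrottled_retries_left() <= 0:
        break  # throttled: stop scheduling retries
    runs.append((build, retry_details))  # retry_details is hypothetical
retried, errors = api.auto_retry_util.retry_builds(runs)
api.auto_retry_util.publish_per_build_stats()
```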

recipe_modules / auto_runner_util

DEPS: gerrit, recipe_engine/step, recipe_engine/time

Helper functions for auto runner recipe.

class AutoRunnerUtilApi(RecipeApi):

def add_a_step_with_cl_links(self, step_description: str, change_infos: Set[EnhancedChangeInfo]):

Adds a recipe step with links to the provided Gerrit changes.

Args: step_description: Description for the recipe step. change_infos: A set of EnhancedChangeInfo objects representing the changes.

def auto_dry_run_cls(self, changes: Set[EnhancedChangeInfo]):

Sets the Commit-Queue label to +1 (DRY RUN) for eligible changes.

This method iterates through the provided set of EnhancedChangeInfo objects and sets the Commit-Queue label to +1 (DRY RUN) for each change. It also adds a comment to the change explaining that the change was automatically CQ+1'ed by AutoRunner.

Args: changes: A set of EnhancedChangeInfo objects representing the changes to be CQ+1'ed.

Returns: The number of CLs that were marked for CQ+1.

def get_change_infos_from_gerrit(self, host_projects: List[HostProjects], query_params: Tuple[Tuple[(str, str)]], o_params: Tuple[str], query_limit: int):

Retrieves Gerrit change information from configured hosts and projects.

Args: host_projects: List of HostProjects objects defining target hosts and projects. query_params: Tuple of tuples specifying Gerrit query parameters (key, value). o_params: Tuple of additional Gerrit query options. query_limit: Maximum number of results to fetch (None for no limit).

Returns: Set[EnhancedChangeInfo]: A set of EnhancedChangeInfo objects representing the changes.

def get_eligible_cls(self):

Retrieves Gerrit changes that are considered eligible for auto run.

This function performs the following steps:

  1. Fetches changes from configured Gerrit hosts and projects.
  2. Checks and enforces the daily quota for the auto runner.
  3. Filters out changes that:
    • Have exceeded their individual daily CQ quota.
    • Have already been CQed in their current revision.
    • Contain ‘cq-depends’ in their commit footer.
    • Lack reviewers (if _cl_signal is set to REVIEWER_ADDED).
    • Are part of a relation chain with other changes.
  4. Sorts the remaining changes and limits them to the available quota.

Returns: A set of EnhancedChangeInfos representing the eligible changes.

def get_quota_stats(self):

Calculates the current auto runner quota usage in the last 24 hours.

Returns: The total number of times auto runner CQed CLs in the last 24 hours.
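
A minimal sketch of the auto-run flow built from these methods (the step description text is illustrative):

```python
# Illustrative: find eligible CLs, CQ+1 them, and link them in a step.
eligible = api.auto_runner_util.get_eligible_cls()
marked = api.auto_runner_util.auto_dry_run_cls(eligible)
api.auto_runner_util.add_a_step_with_cl_links(
    'CQ+1 applied to %d CLs' % marked, eligible)
```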

recipe_modules / binhost_lookup_service

DEPS: cloud_pubsub, cros_infra_config, recipe_engine/step

APIs to interact with the binhost lookup service.

class BinhostLookupServiceApi(RecipeApi):

Module for operations related to the binhost lookup service.

def publish_binhost_metadata(self, build_target: common_pb2.BuildTarget, profile: common_pb2.Profile, snapshot_sha: str, gs_uri: str, gs_bucket_name: str, buildbucket_id: int, complete: bool, private: bool, raise_on_failure: bool=False):

Publish binhost metadata to Cloud Pub/Sub.

Publish a Pub/Sub message to the binhost lookup service using the cloud_pubsub recipe module.

Args: build_target: The system CrOS is being built for, also known as board. profile: Name of the profile to use with the build_target. snapshot_sha: Unique sha of the snapshot. gs_uri: Location of the binhost object in Google Storage. gs_bucket_name: Name of the Google Storage bucket which contains the binhost. buildbucket_id: ID of the postsubmit builder that created and uploaded the binhost. complete: Bool to indicate if this binhost contains all the binpkgs specified in the packages metadata file. private: Bool to indicate if the binhost is private. raise_on_failure: Whether to raise an exception on failure.

def publish_snapshot_metadata(self, snapshot_sha: str, snapshot_num: int, external: bool, buildbucket_id: int, raise_on_failure: bool=False):

Publish snapshot metadata to Cloud Pub/Sub.

Publish a Pub/Sub message to the binhost lookup service using the cloud_pubsub recipe module.

Args: snapshot_sha: Unique sha of the snapshot. snapshot_num: Snapshot number. external: Bool to denote if the snapshot is external. buildbucket_id: ID of the annealing builder that created the snapshot. raise_on_failure: Whether to raise an exception on failure.
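
An illustrative call, with placeholder values throughout:

```python
# All argument values below are placeholders.
api.binhost_lookup_service.publish_snapshot_metadata(
    snapshot_sha='deadbeef' * 5,  # 40-char sha placeholder
    snapshot_num=12345,
    external=True,
    buildbucket_id=8765432100,
    raise_on_failure=False,
)
```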

recipe_modules / bot_cost

DEPS: easy, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/led, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Module for calculating bot cost.

class BotCostApi(RecipeApi):

A module to calculate the cost of running bots.

@property
def bot_size(self):

@contextlib.contextmanager
def build_cost_context(self):

Set build cost after running.

Returns: A context that sets build_cost on exit.
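
A minimal sketch of the context in use (run_the_build is a hypothetical helper):

```python
# Build cost is computed and set as an output property on context exit.
with api.bot_cost.build_cost_context():
    run_the_build(api)  # hypothetical: the actual build steps
```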

def initialize(self):

def set_build_run_cost(self):

Wrapper function to calculate and set the cost of the run.

Calculate the cost of the run and set it as a build output property. Includes cost of any child builds.

def set_upload_size(self, gs_path: str):

Output the size of the given GS directory as an output property.

Args: gs_path: The path to the directory. Should include gs://{bucket}...

def update_upload_sizes(self):

Update the upload_sizes for all dirs that have previously been logged.

recipe_modules / bot_scaling

DEPS: cros_infra_config, easy, gce_provider, swarming_cli, recipe_engine/buildbucket, recipe_engine/futures, recipe_engine/json, recipe_engine/led, recipe_engine/random, recipe_engine/raw_io, recipe_engine/step

A module that determines how to scale bot groups based on demand.

class BotScalingApi(RecipeApi):

A module that determines how to scale bot groups.

def drop_cpu_cores(self, min_cpus_left=4, max_drop_ratio=0.75, test_rand=None):

Gather data on build's per core scaling efficiencies.

Gather data on per build CPU efficiency by dropping cores on the instances. Sets a property ‘enabled_cpu_cores’, with the final count.

Warning!: We've embedded an assumption that we will reboot between tasks, so these changes are effective for a single run only.

Note: This is only enabled on non-led staging builds.

Args: min_cpus_left (int): Do not drop below this number of cores. max_drop_ratio (float): The maximum ratio of cpus to drop relative to the overall count. test_rand (float): Set the max_drop_ratio to this instead of drawing from the uniform random distribution (for testing).

Returns: The number of cpus dropped.

@staticmethod
def get_bot_request(demand, scaling_restriction):

Core function that scales bots based on demand.

Args: demand(int): Current demand for bots. scaling_restriction(ScalingRestriction): Scaling restriction defined by the bot policy.

Returns: int, number of bots to request.

def get_current_gce_config(self, bot_policy_config):

Retrieves the current configuration from GCE Provider service.

Args: bot_policy_config(BotPolicyCfg): Config-defined policy for the RoboCrop.

Returns: ConfigResponse (named_tuple), GCE Provider config definitions and missing configs.

@staticmethod
def get_gce_bots_configured(region_restrictions: List[BotPolicy.RegionRestriction], config_map: Dict[(str, Config)]):

Sums the total number of configured bots per bot policy.

Args: region_restrictions: Regional preferences from config. config_map: Map of prefix to GCE Config.

Returns: Sum of the total number of bots in GCE Provider.

def get_num_cores(self):

Get the number of cores on the host.

@staticmethod
def get_regional_actions(bots_requested: int, region_restrictions: List[BotPolicy.RegionRestriction]):

Determines regional distribution of bot requests.

This function uses a running total and residual to ensure the per-region counts sum exactly to bots_requested.

Args: bots_requested: Total number of bots requested. region_restrictions: Regional preferences from config.

Returns: Region-wise distribution of bots requested.
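
One way to implement the running-total-plus-residual scheme described above is sketched below; the plain-float fractions and the final correction step are assumptions for illustration, not the module's exact logic:

```python
def distribute(bots_requested, fractions):
    """Split bots_requested across regions so the counts sum exactly."""
    counts = []
    residual = 0.0
    for frac in fractions:
        exact = bots_requested * frac + residual
        count = int(exact)        # floor: exact is non-negative here
        residual = exact - count  # carry the remainder to the next region
        counts.append(count)
    # Hand any units lost to floating point to the last region.
    counts[-1] += bots_requested - sum(counts)
    return counts

assert sum(distribute(10, [0.5, 0.3, 0.2])) == 10
```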

def get_robocrop_action(self, bot_policy_config, configs, swarming_stats):

Function to compute all the actions of this RoboCrop.

Args: bot_policy_config(BotPolicyCfg): Config-defined policy for the RoboCrop. configs(Configs): List of GCE Config objects. swarming_stats(SwarmingStats): Current Swarming bot and task counts, or None if there were errors fetching them.

Returns: RoboCropAction, comprehensive action to be taken by RoboCrop.

def get_scaling_action(self, demand, bot_policy, configs):

The function that creates a ScalingAction for a bot group.

Args: demand(int): Current demand for bots. bot_policy(BotPolicy): Config defined Policy for a bot group. configs(Configs): List of GCE Config objects.

Returns: ScalingAction, action to be taken on a single bot group by RoboCrop.

@staticmethod
def get_swarming_demand(swarming_stats: SwarmingStats, bot_group: str):

Return the number of bots needed to cover current tasks in a bot group.

In general, we need enough bots to cover all scheduled tasks. If a bot is busy, it can't pick up a scheduled task. This includes not only bots that are running tasks, but also bots that are dead, quarantined, and so on. Thus, the demand for bots is equal to ${the number of busy bots} plus ${the number of pending tasks}.

Note: This count does NOT include the min_idle number of bots, which is specified by the BotPolicy.

Args: swarming_stats: Dataclass containing bot and task stats from Swarming. bot_group: Name of bot group.

Returns: The current demand for bots in the group.
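
In other words (field names on the dataclass are assumed for illustration):

```python
# Demand for one bot group: busy bots plus pending tasks.
demand = (swarming_stats.busy_bots[bot_group]
          + swarming_stats.pending_tasks[bot_group])
```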

def get_swarming_stats(self, bot_policy_config):

Determines the current Swarming stats per bot group.

Args: bot_policy_config(BotPolicyCfg): Config-defined policy for the RoboCrop.

Returns: SwarmingStats: Dataclass containing bot and task stats.

@staticmethod
def reduce_bot_policy_config_for_table(bot_policy_config: BotPolicyCfg):

Reduces bot_policy_config fields prior to sending to bb tables.

Args: bot_policy_config: Config-defined policy for the RoboCrop.

Returns: Scaled-down config that only includes data needed for Plx.

@staticmethod
def unpack_policy_dimensions(dimensions):

Method to iterate through dimensions and return possible combinations.

Args: dimensions (list[dict]): BotPolicy swarming dimensions.

Returns: list, product of all swarming dimensions for querying.

@staticmethod
def update_bot_policy_limits(bot_policy_config: BotPolicyCfg, configs: Configs):

Sums the min and max bot numbers per bot policy.

Args: bot_policy_config: Config-defined policy for the RoboCrop. configs: GCE Configs.

Returns: The original bot_policy_config, updated to reflect ScalingRestriction values.

def update_gce_configs(self, robocrop_actions, configs):

Updates each GCE Provider config that is actionable.

Args: robocrop_actions(list[ScalingAction]): Repeatable ScalingAction configs to update. configs(Configs): List of GCE Config objects.

Returns: list(Config), GCE Provider config definitions.

recipe_modules / breakpad

DEPS: easy, urls, depot_tools/gsutil, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

API for breakpad.

class BreakpadApi(RecipeApi):

def initialize(self):

def symbolicate_dump(self, image_archive_path, test_results):

Converts minidumps generated by tests into text files.

For each of test_results, walks the directory, converts all found minidump files to text, and copies the text files back to test_results.

Args: image_archive_path (str): A Google Storage path to the image archive. test_results (list[(DownloadedTestResult)]): A list of DownloadedTestResult which has both GS path and local path of the test result to process.

Returns: A list[Path] of symbolicated files written.

recipe_modules / build_menu

DEPS: bot_cost, chrome, cros_artifacts, cros_build_api, cros_infra_config, cros_prebuilts, cros_relevance, cros_sdk, cros_source, cros_tags, cros_version, easy, failures, git_footers, gobin, image_builder_failures, metadata, metadata_json, observability_image_size, src_state, sysroot_archive, sysroot_util, test_util, urls, util, workspace_util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time, recipe_engine/uuid

API providing a menu for build steps

class BuildMenuApi(RecipeApi):

A module with steps used by image builders.

Image builders do not call other recipe modules directly: they always get there via this module, and are a simple sequence of steps.

@property
def artifact_build(self):

def artifacts_build_path(self):

Get the standard artifacts build path for the builder (without bucket).

For example, betty-arc-r-release/R114-15436.0.0

This method will only work if the checkout has already been initialized, as we rely on the CrOS version (and thus the version file).

def artifacts_gs_path(self):

Get the standard artifacts GS path for the builder (including bucket).

This method will only work if the checkout has already been initialized, as we rely on the CrOS version (and thus the version file).

def bootstrap_sysroot(self, config=None):

Bootstrap the sysroot by installing the toolchain.

Args: config (BuilderConfig): The Builder Config for the build. If none, will attempt to get the BuilderConfig whose id.name matches the specified Buildbucket builder from HEAD.

def build_and_test_images(self, config=None, include_version=False, is_official: bool=False, build_images_timeout_sec: Optional[int]=None):

Build the image and run ebuild tests.

This behavior is adjusted by the run_spec values in config.

Args: config (BuilderConfig): The Builder Config for the build, or None. include_version (bool): Whether or not to pass the workspace version to sysroot_util.build. is_official: Whether to produce official builds. build_images_timeout_sec (int): Step timeout, None uses default timeout.

Returns: (bool): Whether to continue with the build.

def build_images(self, config=None, include_version=False, is_official: bool=False, timeout_sec: Optional[int]=None):

Build the image.

This behavior is adjusted by the run_spec values in config.

Args: config (BuilderConfig): The Builder Config for the build, or None. include_version (bool): Whether or not to pass the workspace version to sysroot_util.build. is_official: Whether to produce official builds. timeout_sec (int): Step timeout (in seconds), None uses default timeout.

@property
def build_target(self):

@property
def chroot(self):

@property
def config(self):

@property
def config_or_default(self):

@contextlib.contextmanager
def configure_builder(self, missing_ok: bool=False, disable_sdk: bool=False, commit: Optional[bb_common_pb2.GitilesCommit]=None, targets: Iterable[common_pb2.BuildTarget]=(), lookup_config_with_bucket=False):

Initial setup steps for the builder.

This context manager returns with all of the contexts that an image builder needs to have when it runs, for cleanup to happen properly.

Args: missing_ok: Whether it is OK if no config is found. disable_sdk: This builder will not be using the SDK at all. Only for branches with broken or no Build API. commit: The GitilesCommit for the build, or None. targets: List of build_targets for metadata_json to use instead of our build_target. lookup_config_with_bucket: If true, include builder.bucket in key when looking up the BuilderConfig. If the bucket is not included in the key and there are builders with the same name (in different buckets), it is undefined which BuilderConfig is returned. The bucket will eventually be included in the key by default, see b/287633203.

Returns: BuilderConfig for the active build (or None if the active build has no corresponding config), with an active context.
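
A hedged skeleton of an image builder using this context (commit is assumed to be available; error handling omitted):

```python
# Illustrative builder outline, not a real recipe.
with api.build_menu.configure_builder(commit=commit) as config:
    with api.build_menu.setup_workspace_and_chroot() as relevant:
        if relevant and api.build_menu.build_and_test_images(config):
            api.build_menu.upload_artifacts(config)
```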

@property
def container_version(self):

Return the version string for containers.

Run through the format string, and replace any allowed fields with their runtime values. If any unknown fields are encountered, a RuntimeError is raised.

def create_containers(self, builder_config=None):

Call the BuildTestServiceContainers endpoint to build test containers.

The build API itself handles uploading generated container images to the container registry, but we handle collecting the metadata and uploading it with the build artifacts.

Args: builder_config (BuilderConfig): The BuilderConfig for this build, or None

Returns: None

@property
def dep_graph(self):

def determine_relevance(self):

Determines build relevance.

Returns: An object containing: pointless (bool): Whether the build is pointless. packages (list[PackageInfo]): The packages for this build, or an empty list.

@property
def gerrit_changes(self):

def get_cl_affected_sysroot_packages(self, packages=None, include_rev_deps=False):

Gets the list of sysroot packages affected by the input CLs.

Calculates the list of packages which were changed by the CLs and their reverse dependencies. The list is cached to avoid recalculating the list during subsequent calls.

Args: packages (list[PackageInfo]): The list of packages for which to get dependencies. If none are specified the standard list of packages is used. include_rev_deps (bool): Whether to also calculate reverse dependencies.

Returns: (List[PackageInfo]): A list of packages affected by the CLs.

def get_dep_graph_and_validate_sdk_reuse(self):

Fetch the dependency graph, and validate the SDK for reuse.

Note that failure to validate the SDK for reuse is not considered fatal, but the SDK will be marked as dirty out of an abundance of caution.

Returns: The dependency graph from cros_relevance.get_dependency_graph.

@property
def gitiles_commit(self):

def initialize(self):

def install_packages(self, config=None, packages=None, timeout_sec='DEFAULT', name=None, force_all_deps=False, include_rev_deps=False, dryrun=False):

Install packages as appropriate.

The config determines whether to call install packages. If installing packages, fetch Chrome source when needed.

Args: config (BuilderConfig): The Builder Config for the build. packages (list[PackageInfo]): List of packages to install. Default: all packages for the build_target. timeout_sec (int): Step timeout, in seconds, or None for default. name (string): Step name for install packages, or None for default. force_all_deps (bool): Whether to force building of all dependencies. include_rev_deps (bool): Whether to also install reverse dependencies. Ignored if config specifies ALL_DEPENDENCIES or force_all_deps is True. dryrun (bool): Dryrun the install packages step.

Returns: (bool): Whether to continue with the build.

@property
def is_staging(self):

def publish_centralized_suites(self, builder_config: Optional[BuilderConfig]=None):

Call the PublishCentralizedSuites endpoint to build test containers.

Args: builder_config: The BuilderConfig for this build, or None.

Raises: recipe_api.StepFailure: If the FetchCentralizedSuites endpoint doesn't exist.

def publish_image_size_data(self, config):

Retrieve, assemble, and publish information about package and image size.

Only expected to produce results after a successful completion of ImageService/Create and PackageService/GetTargetVersions.

Args: config: A BuilderConfig object.

def publish_latest_files(self, gs_bucket, gs_path):

Write LATEST-... files to GS.

Writes version information to the LATEST-{version} and LATEST-{branch} files in the specified GS dir. Will only write LATEST-{branch} if the version is more recent than the existing contents.

Args: gs_bucket (str): GS bucket to write to. gs_path (str): GS path/template to write to (relative to the bucket), e.g. eve-release or {target}-release.

@property
def resultdb_gitiles_commit(self):

Return the GitilesCommit for Sources metadata.

The GitilesCommit is either passed in by the parent via input property when the build was scheduled or can be derived for non-release builders after syncing the source.

def run_unittests(self, config=None):

Run ebuild tests as specified by config.

Args: config (BuilderConfig): The Builder Config for the build, or None.

def setup_chroot(self, no_chroot_timeout: bool=False, sdk_version: Optional[str]=None, bootstrap: bool=False, uprev_packages: bool=True, setup_toolchains_if_no_update: bool=True, force_update: bool=False, force_no_chroot_upgrade: Optional[bool]=None, no_delete_out_dir: Optional[bool]=False):

Setup the chroot for the builder.

Args: no_chroot_timeout: Whether to allow unlimited time to create the chroot. sdk_version: Specific SDK version to include in the sdk CreateRequest: for example, 2022.01.20.073008. bootstrap: Whether to bootstrap the chroot. uprev_packages: Whether to uprev packages. setup_toolchains_if_no_update: If True, and the function skips updating the chroot (whether due to the update kwarg or due to the builder config), then it will setup toolchains instead. force_update: Pass force_update to chroot update. force_no_chroot_upgrade: If True, chroot update is skipped, regardless of the builder config. no_delete_out_dir: If True, out directory will be preserved.

Returns: Whether the build is relevant.

def setup_sysroot(self, with_sysroot=True, sysroot_archive=None):

Sets up the sysroot for the build.

Args: with_sysroot (bool): Whether to create a sysroot. Default: True. (Some builders do not require a sysroot.) sysroot_archive (str): The gs path of a sysroot archive, used to replace the whole sysroot folder.

def setup_sysroot_and_determine_relevance(self, with_sysroot=True, sysroot_archive=None):

Setup the sysroot for the build and determine build relevance.

Args: with_sysroot (bool): Whether to create a sysroot. Default: True. (Some builders do not require a sysroot.) sysroot_archive (str): The gs path of a sysroot archive, used to replace the whole sysroot folder.

Returns: An object containing: pointless (bool): Whether the build is pointless. packages (list[PackageInfo]): The packages for this build, or an empty list.

def setup_toolchains(self):

Setup toolchains on the builder.

ToolchainService.SetupToolchains was added in R117. If this function runs on an older branch, then Chromite will not have the endpoint implementation, so the build will fail. At time of writing, this function is not expected to run on any branches older than that. If that changes, consider cherry-picking SetupToolchains into your branch: https://crrev.com/c/4659850.

This is a noop if we have no build targets to setup.

Raises: InfraFailure: If the endpoint is not available.

@contextlib.contextmanager
def setup_workspace(self, cherry_pick_changes=True, ignore_changes=False):

Setup the workspace for the builder.

Args: cherry_pick_changes (bool): Whether to apply gerrit changes on top of the checkout using cherry-pick. If set to False, will directly checkout the changes using the gerrit fetch refs. ignore_changes (bool): Whether to apply gerrit changes. Set to True to completely skip application of gerrit changes.

@contextlib.contextmanager
def setup_workspace_and_chroot(self, no_chroot_timeout: bool=False, cherry_pick_changes: bool=True, bootstrap_chroot: bool=False, force_update: bool=False, force_no_chroot_upgrade: Optional[bool]=None):

Setup the workspace and chroot for the builder.

This context manager sets up the workspace path.

Args: no_chroot_timeout: Whether to allow unlimited time to create the chroot. cherry_pick_changes: Whether to apply gerrit changes on top of the checkout using cherry-pick. If set to False, will directly checkout the changes using the gerrit fetch refs. bootstrap_chroot: Whether to bootstrap the chroot. force_update: Pass force_update flag to chroot upgrade. force_no_chroot_upgrade: Whether to prevent the chroot upgrading at all.

Returns: Whether the build is relevant.

@property
def sysroot(self):

@property
def target_versions(self):

Get the current GetTargetVersionsResponse.

Only set after setup_sysroot_and_determine_relevance().

Returns: (GetTargetVersionsResponse): A GetTargetVersionsResponse or None.

def unit_test_images(self, config=None):

Run ebuild tests.

Args: config (BuilderConfig): The Builder Config for the build, or None.

Returns: (bool): Whether to continue with the build.

def upload_artifacts(self, config=None, private_bundle_func=None, sysroot=None, report_to_spike=False, name='upload artifacts', previously_uploaded_artifacts=None, ignore_breakpad_symbol_generation_errors=False):

Upload artifacts from the build.

Args: config (BuilderConfig): The Builder Config for the build, or None. private_bundle_func (func): If a private bundling method is needed (such as when there is no Build API on the branch), this will be called instead of the internal bundling method. sysroot (Sysroot): Use this sysroot. Defaults to the primary Sysroot for the build. report_to_spike (bool): If True, will call bcid_reporter to report artifact information and trigger Spike to upload the provenance. name (str): The step name. Defaults to ‘upload artifacts’. previously_uploaded_artifacts (UploadedArtifacts): The UploadedArtifacts from a previous call to upload_artifacts; if set, these artifact types will not be re-uploaded. This is used to avoid re-bundling artifacts if upload_artifacts is called multiple times. ignore_breakpad_symbol_generation_errors: If True, the BREAKPAD_DEBUG_SYMBOLS step will ignore any errors during symbol generation.

Returns: (Optional[UploadedArtifacts]) information about uploaded artifacts, if any exist.

def upload_chrome_prebuilts(self, config: Optional[BuilderConfig]=None):

Upload Chrome prebuilts from the build.

Args: config: The Builder Config for the build, or None.

def upload_devinstall_prebuilts(self, config=None):

Upload dev_install prebuilts from the build.

Args: config (BuilderConfig): The Builder Config for the build, or None.

def upload_host_prebuilts(self, config: Optional[BuilderConfig]=None):

Upload host prebuilts from the build.

Upload prebuilts if the configuration has uploadable prebuilts.

Args: config: The Builder Config for the build, or None.

def upload_prebuilts(self, config=None):

Upload prebuilts from the build.

Upload prebuilts if the configuration has uploadable prebuilts.

Args: config (BuilderConfig): The Builder Config for the build, or None.

def upload_sources(self, config: BuilderConfig):

Add the Sources file to the build metadata artifact dir.

Note: This should only be called after syncing to the manifest.

Args: config: The builder config of this builder.

Returns: sources: The Sources uploaded.

recipe_modules / build_plan

DEPS: chrome, cros_build_api, cros_history, cros_infra_config, cros_relevance, cros_source, cros_tags, easy, failures, future_utils, gerrit, git, git_footers, looks_for_green, repo, src_state, test_util, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cv, recipe_engine/led, recipe_engine/step, recipe_engine/swarming

Functions related to build planning.

class BuildPlanApi(RecipeApi):

A module to plan the builds to be launched.

@property
def additional_chrome_pupr_builders(self):

def get_build_plan(self, child_specs: List[BuilderConfig.Orchestrator.ChildSpec], enable_history: bool, gerrit_changes: List[GerritChange], internal_snapshot: GitilesCommit, external_snapshot: GitilesCommit):

Return a two-tuple of completed and needed builds.

This will be split into specialized functions for cq, release, others.

Args: child_specs: List of child specs of the child builders. enable_history: Enables history lookup in the orchestrator. gerrit_changes: List of patches applied to the build. internal_snapshot: The GitilesCommit of the internal manifest passed to child builds syncing to the internal manifest. external_snapshot: The GitilesCommit of the public manifest passed to child builds syncing to the external manifest.

Returns: A tuple of two lists: A list of Build objects of successful builds with refreshed criticality. A list of ScheduleBuildRequests that have to be scheduled.

def get_completed_builds(self, child_specs, forced_rebuilds):

Get the list of previously passed child builds with criticality refreshed.

Args: child_specs list(ChildSpec): List of child specs of cq-orchestrator. forced_rebuilds list(str): List of builder names that cannot be reused.

Returns: A list of build_pb2.Build objects corresponding to the latest successful child builds with the same patches as the current cq orchestrator with refreshed critical values.

def get_forced_rebuilds(self, gerrit_changes: List[bb_common_pb2.GerritChange]):

Gets a list of builders whose builds should not be reused.

Compiles a list of all builders whose builds should not be reused as indicated by the Gerrit changes' commit messages. For multiple changes, the union of these lists is returned.

Args: gerrit_changes: Gerrit changes applied to this run.

Returns: forced_rebuilds: A set of builder names or ‘all’ if no builds can be reused.

def get_relevant_builders(self, builders: List[BuilderConfig], gerrit_changes: List[GerritChange]):

Returns builders deemed relevant by the RelevancyService.

@staticmethod
def get_slim_builder_name(builder_name: str):

Returns the name of the slim variant of the builder.

Args: builder_name: The name of the builder for which to get the slim builder variant name.

Returns: The slim builder name.

recipe_modules / build_reporting

DEPS: build_menu, checkpoint, cloud_pubsub, cros_tags, easy, signing, signing_utils, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Contains functions for building and sending build status to a pub/sub topic.

The messages for build reporting are defined in: infra/proto/src/chromiumos/build_report.proto

And are specifically designed to be aggregated as a build progresses to create the current status. This means we can focus on sending out just the status pieces that we need without worrying about maintaining the state of the entire message.

The pub/sub topic to send status to is configurable through the pubsub_project and pubsub_topic properties for the module. If not set, these default to chromeos-build-reporting and chromeos-builds-all, which is intended to be the unfiltered top-level topic for all builds.

class BuildReportingApi(RecipeApi):

API implementation for build reporting.

@staticmethod
def add_version_msg(build_config, kind, value):

@property
def build_type(self):

def create_build_report(self):

Create BuildReport instance that can be .published().

Return: _MessageDelegate wrapping BuildReport instance

def create_step_info(self, step_name, start_time=None, end_time=None, status=BuildReport.StepDetails.STATUS_RUNNING, raise_on_failed_publish=False):

Create a StepDetails instance to publish information for a step.

Args: step_name (StepDetails.StepName): The predefined step name. start_time (Datetime): UTC datetime indicating step start time. end_time (Datetime): UTC datetime indicating step end time. status (StepDetails.Status): Step status (default: STATUS_RUNNING). raise_on_failed_publish (bool): Should this publish fail, fail the whole build.

Return: _MessageDelegate wrapping StepDetails instance

@disable_pubsub.setter
def disable_pubsub(self, value):

def init_report_from_previous_build(self):

Initialize the build report from an existing report in GS.

Used for retries. Don't publish; we'll wait until our first legitimate publish.

@property
def merged_build_report(self):

def publish(self, build_report, raise_on_failed_publish=False):

Send a BuildReport to the pubsub topic.

Also aggregates the published BuildReport which is then available through the merged_build_report property.

Args: build_report (BuildReport): Instance to send to pub/sub. raise_on_failed_publish (bool): Should this publish fail, fail the whole build.

Return: Reference to BuildReport input message.

def publish_branch(self, branch: str):

Publish the build's branch.

Args: branch: The branch.

def publish_build_artifacts(self, uploaded_artifacts: UploadedArtifacts, artifact_dir: config_types.Path):

Publish metadata about the specified artifacts(s).

Args: uploaded_artifacts: Information about uploaded artifacts as returned by cros_artifacts.upload_artifacts. artifact_dir: Local dir where artifacts are staged.

def publish_build_target_and_model_metadata(self, branch, builder_metadata):

Publish and merge info about the build target and models of a build.

Args: branch (str): The branch name (e.g. release-R97-14324.B). builder_metadata (GetBuilderMetadataResponse): Builder metadata from the build-api.

def publish_channels(self, channels: List['common_pb2.Channel']):

Publish the build's channels.

Args: channels: The channels.

def publish_dlc_artifacts(self, dlc_artifacts: Dict[(str, Dict[(str, str)])]):

Publish DLC artifacts to pubsub, including URL and hash.

Args: dlc_artifacts: DLC locations in GS and file hashes.

def publish_signed_build_metadata(self, signed_build_metadata_list: List[Union[(dict, BuildReport.SignedBuildMetadata)]]):

Publish metadata about the signed build image(s).

Args: signed_build_metadata_list (list[dict]): List of signed build metadata.

def publish_status(self, status):

Publish and merge build status.

@contextlib.contextmanager
def publish_to_gs(self, gs_path=None):

Create a context manager to automatically publish to gs.

Args: gs_path (str): Path to the directory to upload the build report to. Defaults to build.menu.artifacts_gs_path().

Return: Handle which is used to publish to GS.

def publish_toolchain_info(self, toolchain_info: cros_sdk_api.ToolchainInfo):

Publish metadata about SDK/toolchain usage.

Args: toolchain_info: Information about sdk/toolchain usage.

def publish_versions(self, gtv_response):

Publish and merge versions, sourced from a GetTargetVersionsResponse.

Args: gtv_response (GetTargetVersionsResponse): Response to a build api request.

Return: Nothing

@property
def pubsub_project(self):

@property
def pubsub_topic(self):

def reset_build_report(self, build_target, build_type=None):

Resets build properties.

Sets the build report to a new BuildReport object, _build_target and _build_type to the given args, and clears _build_preamble_sent.

def set_build_type(self, build_type, build_target):

Set the type for the build; it must be set once and only once using this method.

@contextlib.contextmanager
def status_reporting(self):

Create a context manager to automatically publish overall status.

Return: Handle which is used to publish overall status.

@staticmethod
def step_as_str(step_name):

Convert a BuildReport.StepDetails.StepName to a canonical string.

@contextlib.contextmanager
def step_reporting(self, step_name, raise_on_failed_publish=False):

Create a context manager to automatically send out step status.

When created, initial step status is published with the current time and a status of STATUS_RUNNING.

When the context is exited, the step endtime is set and status is set to STATUS_SUCCESS by default.

A handle is returned from the context manager which can be used to set the return status to STATUS_FAILURE or STATUS_INFRA_FAILURE via the fail() and infra_fail() methods respectively.

If a StepFailure occurs, status is set to STATUS_FAILURE automatically, and similarly, InfraFailure sets status to STATUS_INFRA_FAILURE.

Args: step_name (StepDetails.StepName): The predefined step name. raise_on_failed_publish (bool): Should this publish fail, fail the whole build.

Return: Handle which is used to set the step status.
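
A minimal sketch of the context in use (do_work is hypothetical):

```python
# STATUS_RUNNING is published on entry; on exit, STATUS_SUCCESS unless
# fail()/infra_fail() was called or an exception escaped.
with api.build_reporting.step_reporting(step_name) as handle:
    if not do_work():  # hypothetical step body
        handle.fail()  # report STATUS_FAILURE
```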

recipe_modules / buildbucket_stats

DEPS: cros_infra_config, recipe_engine/buildbucket, recipe_engine/step, recipe_engine/time

A collection of functions that poll Buildbucket for stats and output properties.

class BuildbucketStatsApi(RecipeApi):

A module to get statistics from buildbucket.

@staticmethod
def get_bot_demand(status_map: Dict[(str, int)]):

Return the demand for bots in a bot group.

Args: status_map: Map of Buildbucket status to count.

Returns: The current demand for bots in the group.

def get_bucket_status(self, bucket: str):

Return the number of builds in the bucket and their statuses.

Args: bucket (str): Buildbucket bucket.

Returns: Map of status to number of builds with that status in the bucket.

def get_build_count(self, bucket: str, status: common_pb2.Status):

Return the number of builds in the bucket with a specific status.

Args: bucket: Buildbucket Bucket to search on. status: The status of builds to search for.

Returns: The number of builds in the given bucket with given status.

def get_snapshot_greenness(self, commit: str, pres: StepPresentation, bucket: Optional[str]=None, builder: Optional[str]=None, end_bbid: Optional[int]=None, retries: int=9, wait_for_complete: bool=False):

Returns greenness for the specified commit, if found.

Note this function may poll for up to retries * 30 minutes waiting for the specified greenness to be published.

Args: commit: Commit to search for greenness for. Must be in chrome-internal.googlesource.com/chromeos/manifest-internal. The builder publishing greenness must have a buildset tag with this commit. pres: StepPresentation to write logs, etc. to. bucket: If specified, the bucket to search in. Defaults to self._snapshot_bucket. builder: If specified, the builder to search for. Defaults to self._snapshot_builder. end_bbid: If specified, return all runs that are older than the specified bbid (inclusive). retries: Number of times to retry the query. There is a 30 minute sleep before each retry. This allows polling until the specified orchestrator publishes greenness. wait_for_complete: If true, wait until the build is completed. Otherwise, wait until the build publishes the build greenness output property. Note that the build will publish the build greenness before the test greenness; i.e. this should be set true if test greenness is required.

Returns: An ordered dict mapping builder -> Greenness message as a dict. Dict will be empty if no greenness is found.

def initialize(self):

def reformat_greenness_dict(self, list_value: struct_pb2.ListValue):

Reformat ListValue to a dictionary, using builder as key.

This makes buildbucket properties like builderGreenness easier to work with.

recipe_modules / builder_metadata

DEPS: build_menu, cros_build_api, cros_sdk, recipe_engine/raw_io, recipe_engine/step

class BuilderMetadataApi(RecipeApi):

A module to get builder metadata.

def get_models(self):

Finds all model names associated with the active build_target.

Returns: List[str]: The names of all models used by this build target.

def look_up_builder_metadata(self):

Looks up builder metadata for the provided build_target.

Builder metadata does not change within the lifecycle of a build, so builder metadata is looked up once and cached.

Returns: builder_metadata proto describing build and model for the current target.

recipe_modules / checkpoint

DEPS: easy, recipe_engine/buildbucket, recipe_engine/step

Module for Checkpoints, which enable partially retriable (release) builds.

class CheckpointApi(RecipeApi):

A module for managing release build checkpoints.

See go/release-checkpoints-dd for context.

def builder_children(self):

Get the BBIDs of the child builders that are image builders.

def builder_retry_props(self, builder: str):

Return the checkpoint module properties to set for the child builder.

@staticmethod
def cascade(requested_steps: List['RetryStep']):

Process step cascades for the requested steps.

Returns: All the steps that are meant to be run.

def failed_builder_children(self):

Returns the list of child builders that failed.

Returns: Names of child builders that failed, e.g. eve-release-main.

def is_retry(self):

Return whether the build is a retry build.

def is_run_step(self, step: 'RetryStep'):

Return whether the step will be run in this retry.

@property
def original_build_bbid(self):

Return the BBID of the original build as set in input properties.

def register(self):

Perform initial setup for checkpoint / mark the build as a retry.

@contextmanager
def retry(self, step: 'RetryStep'):

Context to handle retry logic / status reporting.

def successful_builder_children_bbids(self):

Get the BBIDs of the child builders that were successful.

def update_summary(self, step: 'RetryStep', status: str):

Update the retry_summary output property with the given step/status.
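
A hedged sketch of guarding a retriable step with this module (step is assumed to be a RetryStep value, run_step a hypothetical helper):

```python
# Illustrative: only run the step if this retry is supposed to run it.
api.checkpoint.register()
if api.checkpoint.is_run_step(step):
    with api.checkpoint.retry(step):
        run_step(api)  # hypothetical step body
```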

recipe_modules / chrome

DEPS: cros_build_api, cros_sdk, cros_version, easy, future_utils, gerrit, git_footers, workspace_util, depot_tools/depot_tools, depot_tools/gclient, recipe_engine/cas, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step, recipe_engine/time

Module for dealing with chrome source.

class ChromeApi(RecipeApi):

Module for managing chrome source code.

Note: in general, ChromeOS builders check out the source code at a specific version. That version is obtained by hitting the BuildAPI to read an overlay, which requires a sysroot. This module includes some code to support speeding up those chrome checkouts by doing the following:

  • check out the chrome source to the latest commit in the cache asynchronously with all refs and tags (sync_chrome_async).
  • wait for the async main sync (wait_for_sync_chrome_source_async).
  • delete the checkout of main (delete_chrome_checkout) for cases where it is unused.

This supports the general workflow on all builders of:

  1. As early as possible, kick off an async chrome checkout to the latest commit in the cache.
  2. Whenever chrome source is definitely going to be needed, block and wait for that async checkout to complete.
  3a. If the chrome source code isn't needed, delete the checkout.
  3b. If the chrome source code is needed, check out the specific version needed (much faster with the repo already checked out).
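
A sketch of that workflow (the inputs to needs_chrome_source and the sync arguments are assumed to be in scope):

```python
# Illustrative ordering of the async checkout workflow.
api.chrome.sync_chrome_async(config, build_target)  # step 1: start early
# ... other build steps run in the meantime ...
if api.chrome.needs_chrome_source(request, dep_graph, presentation):
    api.chrome.wait_for_sync_chrome_source_async()   # step 2: block
    api.chrome.sync(chrome_root, chroot, build_target,
                    internal=True, cache_dir=cache_dir)  # step 3b
else:
    api.chrome.delete_chrome_checkout()              # step 3a
```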

def cache_sync(self, cache_path: Path, sync: bool=True, step_name: str='sync chrome'):

Sync Chrome cache using existing cached repositories.

Args: cache_path: Path to mount of cache. sync: whether or not to call sync after setting up the cache. Defaults to true. step_name: the name to use for the surrounding step. Defaults to “sync chrome”.

def delete_chrome_checkout(self):

Delete unnecessary chrome checkout.

Allows deleting the chrome source synced by the sync_chrome_async() function when it turns out to be unnecessary for the build.

def diffed_files_requires_rebuild(self, patch_sets: Optional[List[PatchSet]]=None):

Returns whether patch_sets includes files that require rebuilding.

The patch_sets object supplied must have been constructed with the file information populated.

Args: patch_sets: List of patch sets (with FileInfo).

Returns: A bool that indicates a rebuild should be triggered.

def follower_lacks_prebuilt(self, build_target: BuildTarget, chroot: Chroot, packages: List[PackageInfo]):

Returns whether we need the chrome source to be synced.

Returns whether or not this run needs chrome source to be synced locally. This is independent of whether we need to actually build chrome, as we've allowed ‘follower’ packages to be built out of chrome’s source.

Args: build_target: Build target of the build. chroot: Information on the chroot for the build. packages: Packages that the builder needs to build.

Returns: bool: Whether or not this run needs chrome.

@property
def gclient_sync_timeout_seconds(self):

def has_chrome_prebuilt(self, build_target: BuildTarget, chroot: Chroot, internal: bool=False, ignore_prebuilts: bool=False):

def is_chrome_pupr_atomic_uprev(self, gerrit_change: GerritChange):

def maybe_uprev_local_chrome(self, build_target: BuildTarget, chroot: Chroot, patch_sets: List[PatchSet]):

Checks the patch_sets for chrome 9999 ebuild changes and uprevs if so.

Args: build_target: Build target of the build. chroot: Information on the chroot for the build. patch_sets: A list of patch sets to examine.

Returns: bool: Whether we uprevved the local Chrome.

def needs_chrome(self, build_target: BuildTarget, chroot: Chroot, packages: Optional[List[PackageInfo]]=None):

Returns whether or not this run needs chrome.

Returns whether or not this run needs chrome, that is, will require a prebuilt, or will need to build it from source.

Args: build_target: Build target of the build. chroot: Information on the chroot for the build. packages: Packages that the builder needs to build, or empty / None for default packages.

Returns: bool: Whether or not this run needs chrome.

def needs_chrome_source(self, request: InstallPackagesRequest, dep_graph: DepGraph, presentation: StepPresentation, patch_sets: Optional[List[PatchSet]]=None):

Checks whether chrome source is needed.

Args: request: InstallPackagesRequest for the build. dep_graph: From cros_relevance.get_dependency_graph. presentation: Step to update. patch_sets: Applied patchsets. Default: the list from workspace_util.

Returns: bool: Whether Chrome source is needed.

def sync(self, chrome_root: Path, chroot: Chroot, build_target: BuildTarget, internal: bool, cache_dir: str, override_version: Optional[str]=None, omit_version: bool=False):

Sync Chrome source code.

Must be run with cwd inside a chromiumos source root.

Args: chrome_root: Directory to sync the Chrome source code to. chroot: Information on the chroot for the build. build_target: Build target of the build. internal: True for internal checkout. cache_dir: Path of the chrome cache. override_version: Specific git ref/hash to sync to. omit_version: Omit the version from the sync command. Defaults to False.

def sync_chrome_async(self, config: BuilderConfig, build_target: BuildTarget):

Sync chrome source async.

This intentionally checks out the chrome source at main instead of the appropriate version for the build, in order to speed up the subsequent chrome source sync.

Args: config: The Builder Config for the build. build_target: Build target of the build.

def wait_for_sync_chrome_source_async(self):

Wait for async chrome source sync.

recipe_modules / chromite

DEPS: gitiles, goma, repo, depot_tools/bot_update, depot_tools/gclient, depot_tools/tryserver, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/legacy_annotation, recipe_engine/path, recipe_engine/properties, recipe_engine/step

API for running cbuildbot chromite scripts.

class ChromiteApi(RecipeApi):

def build_packages(self, board, args=None, **kwargs):

Run the build_packages script inside the chroot.

Used by the internal goma recipe.

def cbuildbot(self, name, config, args=None, **kwargs):

Runs the cbuildbot command defined by the arguments.

Args: name: (str) The name of the command step. config: (str) The name of the ‘cbuildbot’ configuration to invoke. args: (list) If not None, additional arguments to pass to ‘cbuildbot’.

Returns: (Step) The step that was run.

def check_repository(self, repo_type_key, value):

Scans through registered repositories for a specified value.

Args: repo_type_key (str): The key in the ‘repositories’ config to scan through. value (str): The value to scan for.

Returns: (bool): True if the value was found.

def checkout(self, manifest_url=None, repo_url=None, branch=None):

def checkout_chromite(self):

Checks out the configured Chromite branch.

@property
def chromite_branch(self):

@property
def chromite_path(self):

def configure(self, **kwargs):

Loads configuration from build properties into this recipe config.

def cros_sdk(self, name, cmd, args=None, environ=None, chroot_cmd=None, **kwargs):

Return a step to run a command inside the cros_sdk.

Used by the internal goma recipe.

@property
def depot_tools_path(self):

@property
def depot_tools_pin(self):

def gclient_config(self):

Generate a ‘gclient’ configuration to check out Chromite.

Return: (config) A ‘gclient’ recipe module configuration.

def get_config_defaults(self):

def run(self, goma_dir=None):

Runs the configured ‘cbuildbot’ build.

This workflow uses the registered configuration dictionary to make group- and builder-specific changes to the standard workflow.

The specific workflow paths that are taken are also influenced by several build properties.

TODO(dnj): When CrOS migrates away from BuildBot, replace property inferences with command-line parameters.

This workflow:

  • Checks out the specified ‘cbuildbot’ repository.
  • Pulls information based on the configured change's repository/revision to pass to ‘cbuildbot’.
  • Executes the ‘cbuildbot’ command.

Args: goma_dir: Goma client path used for simplechrome. The Goma client for the ChromeOS chroot should be located in a sibling directory so that cbuildbot can find it automatically. Returns: (Step) the ‘cbuildbot’ execution step.

def setup_board(self, board, args=None, **kwargs):

Run the setup_board script inside the chroot.

Used by the internal goma recipe.

def with_system_python(self):

Prepare a directory with the system python binary available.

This is designed to make it possible to mask “bundled python” out of the standard path without hiding any other binaries.

Returns: (context manager) A context manager that inserts system python into the front of PATH.
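
For example, a step that must see the system Python at the front of PATH might look like this (a minimal sketch; the step command is illustrative):

# Inside a recipe with 'chromite' in DEPS...
with api.chromite.with_system_python():
  # PATH now resolves 'python' to the system binary first.
  api.step('system python version', ['python', '--version'])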

recipe_modules / cloud_pubsub

DEPS: support, recipe_engine/context, recipe_engine/step, recipe_engine/time

APIs for using Cloud Pub/Sub

class CloudPubsubApi(RecipeApi):

A module for Cloud Pub/Sub

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def publish_message(self, project_id, topic_id, data, ordering_key=None, endpoint=None, raise_on_failed_publish=True):

Publish a message to Cloud Pub/Sub

Note that if the request is larger than the Pub/Sub request limit, this method will return before it even tries to send a request, raising an exception if raise_on_failed_publish is true.

When specifying an ordering key to ensure message ordering, an explicit endpoint needs to be specified, and only messages going through the same endpoint are guaranteed to be ordered.

Args:

  • project_id (str): The project name.
  • topic_id (str): The topic name.
  • data (str): The data to put in the message. The input must be encodable with utf8, as it will be sent to the publish-message binary via JSON.
  • ordering_key (str): ordering key to be sent with message
  • endpoint (str): specific pub/sub endpoint to use eg: “us-east1-pubsub.googleapis.com”
  • raise_on_failed_publish (bool): If True, raise exception on failure.

Raises: InfraFailure: If the publish fails and raise_on_failed_publish is True.
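
A minimal usage sketch (the project, topic, and payload values are illustrative; the endpoint is the example from above):

# Inside a recipe with 'cloud_pubsub' in DEPS...
api.cloud_pubsub.publish_message(
    'my-project', 'my-topic', '{"event": "build-complete"}',
    ordering_key='builder-eve',  # ordering requires an explicit endpoint
    endpoint='us-east1-pubsub.googleapis.com',
    raise_on_failed_publish=False)  # log failures instead of raising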

recipe_modules / code_coverage

DEPS: cros_infra_config, cros_source, easy, gerrit, gitiles, depot_tools/gsutil, recipe_engine/archive, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/cv, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Recipe definition for code coverage recipe.

class CodeCoverageApi(RecipeApi):

This module contains apis to generate code coverage data.

@property
def metadata_dir(self):

A temporary directory for the metadata.

Temp dir is created on first access to this property.

def process_coverage_data(self, tarfile, coverage_type, merger_flow_enabled=False, gs_artifact_bucket=None, gs_artifact_path=None, step_name=‘upload code coverage data’, incremental_settings=None, absolute_cs_settings=None, absolute_chromium_settings=None):

Uploads code coverage data to the requested external sources.

Args: tarfile (Path): path to tarfile. coverage_type (str): type of coverage being uploaded (LCOV, LLVM, or GO_COV). merger_flow_enabled (bool): whether merger flow is enabled or not. gs_artifact_bucket (str): artifact bucket (eg. chromeos-image-archive). gs_artifact_path (str): artifact bucket path (eg. builderName/version-builderID). step_name (str): name for the step. incremental_settings (CoverageFileSettings): incremental coverage settings. absolute_cs_settings (CoverageFileSettings): absolute coverage settings. absolute_chromium_settings (CoverageFileSettings): absolute chromium coverage settings.

def update_e2e_metadata(self, gs_artifact_bucket: str, gs_artifact_path: str, board: str, version: str):

Uploads metadata needed for e2e coverage.

Args: gs_artifact_bucket (str): artifact bucket (eg. chromeos-image-archive). gs_artifact_path (str): artifact bucket path (eg. builderName/version-builderID). board: Board used for generating artifacts. version: CROS version used to build artifacts.

def upload_active_version(self, active_date: str):

Whether we need to upload active version.

Args: active_date: The date in ISO format present in the uploaded active_version.

def upload_code_coverage(self, tarfile, coverage_type, gs_artifact_bucket, gs_artifact_path, step_name=‘upload code coverage data’):

Uploads code coverage llvm json and golang.

Args: tarfile (Path): path to tarfile. step_name (str): name for the step. coverage_type (str): type of coverage being uploaded (LCOV, LLVM, or GO_COV). gs_artifact_bucket (str): artifact bucket (eg. chromeos-image-archive). gs_artifact_path (str): artifact bucket path (eg. builderName/version-builderID).

def upload_firmware_lcov(self, tarfile, step_name=‘upload code coverage data (firmware lcov)’):

Uploads firmware lcov code coverage.

Args: tarfile (Path): path to tarfile. step_name (str): name for the step.

recipe_modules / conductor

DEPS: easy, gobin, recipe_engine/file, recipe_engine/path, recipe_engine/step

API wrapping the conductor tool.

class ConductorApi(RecipeApi):

A module for calling conductor.

def __call__(self, cmd: List[str], step_name: str=None, timeout: int=3600, **kwargs):

Call conductor with the given args.

Args: cmd: Command to be run with conductor. step_name: Message to use for step. Optional. timeout: Timeout, in seconds. Defaults to one hour. kwargs: Keyword arguments for recipe_engine/step.

def collect(self, collect_name: str, bbids: List[Union[(str, int)]], initial_retry: bool=False, **kwargs):

Calls conductor collect with the given args.

Args: collect_name: Name of this collection (used to find collect config). bbids: List of BBIDs to collect. initial_retry: Whether to pass --initial_retry to conductor for an unconditional retry at the start of the run.

Returns: Final set of BBIDs.
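
For example (the collection name and BBIDs are illustrative):

# Inside a recipe with 'conductor' in DEPS...
final_bbids = api.conductor.collect(
    'release-builds',  # must match a collect config entry
    [8765432100000000001, 8765432100000000002],
    initial_retry=True)  # unconditionally retry once at the start of the run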

def collect_config(self, collect_name: Union[(str, None)]):

@property
def dryrun(self):

@property
def enabled(self):

def initialize(self):

Initializes the module.

recipe_modules / cq_fault_attribution

DEPS: cros_infra_config, easy, failures, looks_for_green, recipe_engine/buildbucket, recipe_engine/resultdb, recipe_engine/step

A module for attributing failures based on snapshot build comparisons.

class CqFailureAttributionApi(RecipeApi):

A module for ascribing build and test failure attributes based on snapshot build comparisons.

@property
def cq_test_failure_attributes(self):

Returns determined failure attributes

def set_cq_fault_attribute_properties(self, test_results: MetaTestTuple, orch_snapshot: GitilesCommit):

Compares test failures between a snapshot and CQ build, and assigns failure attributes and a flakiness status to each failure if a comparison snapshot is found. Sets and returns failure attributes.

Args: test_results: HW and VM test results. orch_snapshot: The manifest snapshot at the orchestrator level.

recipe_modules / cros_artifacts

DEPS: code_coverage, cros_build_api, cros_infra_config, cros_snapshot, cros_source, cros_version, disk_usage, dlc_utils, easy, failures, metadata, depot_tools/gsutil, recipe_engine/bcid_reporter, recipe_engine/cv, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

API for uploading CrOS build artifacts to Google Storage.

class CrosArtifactsApi(RecipeApi):

A module for bundling and uploading build artifacts.

@property
def artifacts_by_image_type(self):

Return a map from image type to artifact name.

def artifacts_gs_path(self, builder_name, target, kind=BuilderConfig.Id.TYPE_UNSPECIFIED, template=None):

Returns the GS path for artifacts of the given kind for the given target.

The resulting path will NOT include the GS bucket.

Args: builder_name (str): The builder name, e.g. octopus-cq. target (BuildTarget): The target whose artifacts will be uploaded. kind (BuilderConfig.Id.Type): The kind of artifacts being uploaded, e.g. POSTSUBMIT. May be used as a descriptor in formatting paths. Required if ‘{label}’ or ‘{kind}’ are present in |template|. template (str): The string to format, or None. If set to None, the default ‘{gs_path}’ will be used.

Returns: The formatted template. Default: The GS path at which artifacts should be uploaded.

def download_artifact(self, build_payload, artifact, name=None):

Download the given artifact from the given build payload.

Args: build_payload (BuildPayload): Describes where the artifact is on GS. artifact (ArtifactType): The artifact to download. name (string): step name. Defaults to ‘download |artifact_name|’.

Returns: list[Path]: Paths to the files downloaded from GS.

Raises: ValueError: If the artifact is not found in the build payload.

def download_artifacts(self, build_payload, artifact_types, name=None):

Download the given artifacts from the given build payload.

Args: build_payload (BuildPayload): Describes where build artifacts are on GS. artifact_types (list[ArtifactTypes]): The artifact types to download. name (str): The step name. Defaults to ‘download artifacts’.

Returns: dict: Maps ArtifactType to list[Path] representing downloaded files.

Raises: ValueError: If any artifact is not found in the build payload.
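
A sketch of fetching several artifact types from one payload (the artifact type names are assumptions based on the ChromeOS protos):

# Inside a recipe with 'cros_artifacts' in DEPS...
downloaded = api.cros_artifacts.download_artifacts(
    build_payload,  # BuildPayload describing where the artifacts are on GS
    [ArtifactTypes.IMAGE_ZIP, ArtifactTypes.EBUILD_LOGS],  # assumed enum values
    name='download image artifacts')
# 'downloaded' maps each artifact type to the list of local Paths fetched.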

@property
def gs_upload_path(self):

Return the gs upload path, if one was set in properties.

@staticmethod
def has_output_artifacts(artifacts_info: ArtifactsByService):

Return whether there are output artifacts.

Args: artifacts_info: The artifacts config to check.

Returns: Whether there are any output artifacts.

def merge_artifacts_properties(self, properties: List[UploadedArtifacts]):

Combine uploaded artifacts to produce a final value.

Args: properties (list[UploadedArtifacts]): the values to merge.

def prepare_for_build(self, chroot, sysroot, artifacts_info, forced_build_relevance=False, test_data=None, name=None):

Prepare the build for the given artifacts.

This function calls the Build API to have it prepare to build artifacts of the given types.

Args: chroot (Chroot): The chroot to use, or None if not yet created. sysroot (Sysroot): The sysroot to use, or None if not yet created. artifacts_info (ArtifactsByService): artifact information. forced_build_relevance (bool): Whether the builder will be ignoring the response. test_data (str): JSON data to use for ArtifactsService call. name (str): The step name. Defaults to ‘prepare artifacts’.

Returns: PrepareForToolchainBuildResponse.BuildRelevance indicating that the build is NEEDED (regardless of the cq relevance check), UNKNOWN (pointless build check applies), or POINTLESS (just exit now.)

def publish_latest_files(self, gs_bucket: str, gs_path: str):

Write LATEST-... files to GS.

Writes version information to the following files to locate the location of artifacts:

  • LATEST-{version}
  • LATEST-{branch}
  • LATEST-SNAPSHOT-{snapshot identifier} (only on snapshot builds)

Will only write if the version is more recent than the existing contents.

Args: gs_bucket (str): GS bucket to write to. gs_path (str): GS path to write to (relative to the bucket), e.g. eve-release.
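
For example, after uploading release artifacts (the bucket and path values are illustrative):

# Writes LATEST-{version}, LATEST-{branch} and, on snapshot builds,
# LATEST-SNAPSHOT-{id}, only if the version is newer than the existing contents.
api.cros_artifacts.publish_latest_files('chromeos-image-archive', 'eve-release')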

def push_image(self, chroot, gs_image_dir, sysroot, dryrun=False, profile=None, sign_types=None, dest_bucket=None, channels=None):

Call the PushImage build API endpoint.

Args: chroot (Chroot): The chroot to use, or None if not yet created. gs_image_dir (string): The source directory (a gs path) to push from. sysroot (Sysroot): The sysroot (build target) to use. profile (Profile): The profile to use, or None. sign_types (list(ImageType)): The sign types to use, or None. dest_bucket (string): The destination bucket to use, or None. channels (list(Channel)): The channels to use, or empty list.

For more context on these parameters, see chromite/scripts/pushimage.py.

Returns: PushImageResponse

@property
def skip_publish(self):

Return whether to skip publish, if set in properties.

@property
def timestamp_micros(self):

Return the value of {time} in GS templates.

def upload_artifacts(self, builder_name, kind, gs_bucket, *, artifacts_info=None, chroot=None, sysroot=None, name=‘upload artifacts’, test_data=None, private_bundle_func=None, report_to_spike=False, attestation_eligible=False, upload_coverage=True, previously_uploaded_artifacts=None, ignore_breakpad_symbol_generation_errors=False):

Bundle and upload the given artifacts for the given build target.

This function sets the “artifacts” output property to include the GS bucket, the path within that bucket, and a dict mapping artifact to a list of artifact paths (relative to the GS path) for each artifact type that was uploaded.

Args: builder_name (str): The builder name, e.g. octopus-cq. kind (BuilderConfig.Id.Type): The kind of artifacts being uploaded, e.g. POSTSUBMIT. This affects where the artifacts are placed in Google Storage. gs_bucket (str): Google storage bucket to upload artifacts to. artifacts_info (ArtifactsByService): Information about artifacts. chroot (Chroot): chroot to use. sysroot (Sysroot): sysroot to use (this contains the build target). name (str): The step name. Defaults to ‘upload artifacts’. test_data (str): Some data for this step to return when running under simulation. The string “@@DIR@@” is replaced with the output_dir path throughout. private_bundle_func (func): If a private bundling method is needed (such as when there is no Build API on the branch), this will be called instead of the internal bundling method. report_to_spike (bool): If True, will call bcid_reporter to report artifact information and trigger Spike to upload the provenance as [artifact-name].attestation if attestation_eligible is true. attestation_eligible (bool): Will call bcid_reporter to report artifact information if report_to_spike is also true. This is set in BuilderConfig.Artifacts.AttestationEligible. upload_coverage (bool): If True, run the upload coverage step and store coverage information. This should be set to False when we don't run unit tests and hence have no coverage information to store. previously_uploaded_artifacts (UploadedArtifacts): If set, the UploadedArtifacts from a previous call to upload_artifacts; these artifact types will not be re-uploaded. This is used to avoid re-bundling artifacts if upload_artifacts is called multiple times. ignore_breakpad_symbol_generation_errors: If True, the BREAKPAD_DEBUG_SYMBOLS step will ignore any errors during symbol generation.

Returns: (UploadedArtifacts) information about uploaded artifacts. (Path) path to local dir where artifacts are staged.
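
A sketch of a typical call (the builder name, kind, and bucket are illustrative; artifacts_info, chroot, and sysroot come from earlier steps):

uploaded, staging_dir = api.cros_artifacts.upload_artifacts(
    'eve-postsubmit',             # builder_name
    BuilderConfig.Id.POSTSUBMIT,  # kind
    'chromeos-image-archive',     # gs_bucket
    artifacts_info=artifacts_info,
    chroot=chroot,
    sysroot=sysroot)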

def upload_metadata(self, name, builder_name, target, gs_bucket, filename, message, template=None):

Materialize a protobuffer message as a jsonpb artifact in GCS.

Convert the message to a jsonpb file and upload it to the appropriate location in GCS with the other build artifacts.

Args: name (str): Human readable metadata name for step name. builder_name (str): The builder name, e.g. octopus-cq. target (str): Build target, e.g. octopus-kernelnext. gs_bucket (str): Google storage bucket to upload artifacts to. filename (str): Filename for the metadata. message (Message): Protobuffer message to serialize and upload. template (str): The string to format for artifacts_gs_path, or None. If set to None, the default artifacts_gs_path template will be used.

Returns: GS path inside bucket to uploaded file

recipe_modules / cros_branch

DEPS: cros_version, gobin, recipe_engine/raw_io, recipe_engine/step

API wrapping the cros branch tool.

class CrosBranchApi(RecipeApi):

A module for calling cros branch.

def __call__(self, cmd, step_name=None, force=False, push=False, **kwargs):

Call cros branch with the given args.

Args: cmd: Command to be run with cros branch. step_name (str): Message to use for step. Optional. force (bool): If True, cros branch will be run with --force. push (bool): If True, cros branch will be run with --push. kwargs: Keyword arguments for recipe_engine/step.

Returns: branch_name (string): The name of the created branch, or None.

def create_from_buildspec(self, source_version, branch, **kwargs):

Call cros branch create, branching from the appropriate buildspec manifest.

Args: source_version (str): Version to branch from. Must have a valid manifest in manifest-versions/buildspecs or branch_util will fail. branch (chromiumos.Branch): Branch to be created. kwargs: Keyword arguments for recipe_engine/step. Accepts the same keyword arguments as __call__.

Returns: branch_name (string): The name of the created branch, or None.
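
For example (the version is illustrative and the Branch proto field is hypothetical):

# Inside a recipe with 'cros_branch' in DEPS...
branch = chromiumos.Branch(name='release-R99-14469.B')  # hypothetical field values
branch_name = api.cros_branch.create_from_buildspec('14469.0.0', branch)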

def create_from_file(self, manifest_file, branch, **kwargs):

Call cros branch create, branching from the file specified in manifest_file.

Args: manifest_file (recipe_engine.config_types.Path): Path to manifest file. This recipe assumes that it is at the top level of a ChromeOS checkout. branch (chromiumos.Branch): Branch to be created. kwargs: Keyword arguments for recipe_engine/step. Accepts the same keyword arguments as __call__.

Returns: branch_name (string): The name of the created branch, or None.

def delete(self, branch, **kwargs):

Call cros branch delete with the appropriate arguments.

Args: branch (chromiumos.Branch): Branch to be deleted. kwargs: Keyword arguments for cros branch/recipe_engine/step. Accepts the same keyword arguments as __call__.

def rename(self, branch, new_branch_name, **kwargs):

Call cros branch rename with the appropriate arguments.

Args: branch (chromiumos.Branch): Branch to be renamed. new_branch_name (str): New branch name. kwargs: Keyword arguments for cros branch/recipe_engine/step. Accepts the same keyword arguments as __call__.

recipe_modules / cros_build_api

DEPS: analysis_service, cros_infra_config, failures, git, portage, src_state, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

API for working with the protobuf-based Build API.

class CrosBuildApiApi(RecipeApi):

This recipe module exposes client stubs for all build API services.

To add a service endpoint, create a class INSIDE THIS MODULE extending Stub. Make sure the class name is the same as the service name.

To call a service endpoint, call the corresponding method on the stub. It will “magically” know what to do and fail gracefully if it does not. Example:

# Inside recipes/my_recipe.py...
my_request_proto = BundleRequest()
# Set up your request proto, and then...
api.cros_build_api.ArtifactsService.BundleFirmware(my_request_proto)

The stub will perform some validation and then call the build API command.

def GetVersion(self, test_data=None):

Get the Build API version.

The version is always queried, and the result cached.

Args: test_data: A string representation of a VersionGetResponse dict.

Returns: The version of the Build API.

def __call__(self, endpoint: str, input_proto: message.Message, output_type: descriptor.Descriptor, test_output_data: Optional[str]=None, test_teelog_data: Optional[str]=None, name: Optional[str]=None, infra_step: bool=False, timeout: Optional[int]=None, response_lambda: Optional[Callable[([message.Message], str)]]=None, pkg_logs_lambda: Optional[Callable[([message.Message, message.Message], Tuple[(str, str)])]]=None, step_text: Optional[str]=None, retcode_fn: Optional[Callable[([int], None)]]=None):

Call the build API with the given input proto.

This function tries to be as dumb as possible. It does not validate that the endpoint exists, nor that the input_proto has the correct type. While clients may call this function directly, they should ALMOST ALWAYS call the build API through the appropriate stub.

Args: endpoint: The full endpoint to call, e.g. chromite.api.MyService/MyMethod. input_proto: The input proto object. output_type: The output proto type. test_output_data: String of JSON to use as a response during testing. test_teelog_data: Text to use as tee-log contents during testing. name: Name for the step. Generated automatically if not specified. infra_step: Whether this build API call should be treated as an infrastructure step. timeout: Timeout in seconds to be supplied to the Build API call. response_lambda: A function that appends a string to the build API response step. Used to make failure step names unique across differing root causes. pkg_logs_lambda: A function to produce log information about failed packages. It should take two arguments: the request message and the response message. It should return a list of tuples (failed_package, logs), where failed_package is the category-package for a package that failed, and logs is the corresponding build logs contents. step_text: text to put on the step for the call. retcode_fn: Called with the return code from Build API. This is useful for when the return code is 2 (RETURN_CODE_UNSUCCESSFUL_RESPONSE_AVAILABLE).

Returns: The parsed response proto.
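
Although the service stubs are almost always preferred, a direct call would look roughly like this (the endpoint is the placeholder from above; the proto names are illustrative):

response = api.cros_build_api(
    'chromite.api.MyService/MyMethod',  # full endpoint name
    my_request_proto,                   # input proto instance
    MyMethodResponse,                   # output proto type
    name='call MyService/MyMethod',
    timeout=30 * 60)                    # seconds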

@staticmethod
def failed_pkg_data_names(output_proto: message.Message):

Function to append a list of failed packages to the failure step.

To use this, pass response_lambda=api.cros_build_api.failed_pkg_data_names to the build api call.

Args: output_proto: A Response object that has a ‘failed_package_data’ attribute.

Returns: A string to append to the response step name.

def failed_pkg_logs(self, input_proto: message.Message, output_proto: message.Message):

Function to cat log file and retrieve package name.

To use this, pass pkg_logs_lambda=api.cros_build_api.failed_pkg_logs to the build api call.

Args: input_proto: A Request object that contains a chromiumos.Chroot attribute called ‘chroot’. output_proto: A Response object that has a ‘failed_package_data’ attribute.

Returns: A list of tuples (package_name, build_log).

@staticmethod
def failed_pkg_names(output_proto: message.Message):

Function to append a list of failed packages to the failure step.

To use this, pass response_lambda=api.cros_build_api.failed_pkg_names to the build api call.

Args: output_proto: A Response object that has a ‘failed_packages’ attribute.

Returns: A string to append to the response step name.

def has_endpoint(self, stub: ‘Stub’, method: str):

Verifies that the given endpoint can be called.

Args: stub: Stub instance to check whether method can be called on it. method: Name of method to check for.

Returns: Whether method can be called on stub.

def initialize(self):

Expose all client stubs defined in this module.

def is_at_least_version(self, major=1, minor=0, bug=0):

Return whether the Build API version is at least major.minor.bug.
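
For example, to guard a call that only exists in newer Build API versions (the version numbers are illustrative):

if api.cros_build_api.is_at_least_version(1, 5, 0):
  pass  # safe to call an endpoint introduced in Build API 1.5.0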

@property
def log_level(self):

Return the log level used when calling Build API.

def new_result_path(self):

Create a ResultPath for the BAPI to extract output files into.

@contextlib.contextmanager
def parallel_operations(self):

Sets up the build API for running operations in parallel.

Since we check out the chromite commit before making calls, parallel calls can clobber each other, so this context does the checkout once.

def reset_checkout(self):

@property
def version(self):

Return the version that this build API uses.

recipe_modules / cros_cache

DEPS: easy, depot_tools/gsutil, recipe_engine/file, recipe_engine/path

API for working with CrOS cache.

class CrosCacheApi(RecipeApi):

A module for CrOS-specific cache steps.

def create_cache_dir(self, directory):

Creates a working directory outside of recipe structure.

Args: directory (Path): Full path to directory to create.

def write_and_upload_version(self, gs_bucket, version_file, version):

Write local version file and uploads to Google Storage.

Args: gs_bucket (str): Target Google Storage bucket. version_file (str): Version file name. version (str): Version to write to tracking file.

recipe_modules / cros_cq_additional_tests

DEPS: cros_infra_config, git, git_footers, recipe_engine/buildbucket, recipe_engine/step

Functions configuring additional CQ testing via footer.

class CrosCqAdditionalTests(RecipeApi):

def append_user_provided_test_suites_to_test_plan(self, builds: List[build_pb2.Build], gerrit_changes: List[common_pb2.GerritChange], test_plan: GenerateTestPlanResponse):

Reads test suites-related Git footers and appends them to test_plan.

The following footers are read:

  • Cros-Add-Test-Suites - list of test suites to run.
  • Cros-Add-TS-Boards-BuildTarget - list of boards and the build targets to test for each. For example, coral, octopus|octopus|octopus-kernelNext runs the test suites for board coral with build target coral, and for board octopus with build targets octopus and octopus-kernelNext.
  • Cros-Add-TS-Pool - pool to run against; defaults to DUT_POOL_QUOTA.

Args: builds: Builds to test. gerrit_changes: Changes that resulted in the provided builds, or None. test_plan: Test plan generated by CTP.

Raises: CrosCqAdditionalTestsInvalidFooterError: When Cros-Add-Test-Suites footer is present but Cros-Add-TS-Boards-BuildTarget is missing. CrosCqAddnlTestsMissingBuildTargetsError: When there are test suites not run due to failed or not built build targets.

def get_additional_test_builders(self, builds: List[build_pb2.Build], gerrit_changes: List[common_pb2.GerritChange]):

Returns the builders that had additional testing specified via footer.

recipe_modules / cros_cq_depends

DEPS: cros_source, easy, gerrit, git, repo, recipe_engine/context, recipe_engine/step

APIs for interacting with Cq-Depends.

class CrosCqDependsApi(RecipeApi):

A module for checking that Cq-Depend has been fulfilled.

def ensure_manifest_cq_depends_fulfilled(self, manifest_diffs: List[ManifestDiff]):

Checks that Cq-Depend deps between manifests are met.

Checks that all Cq-Depend in all CLs in the given manifest diffs are met.

Args: manifest_diffs: An array of ManifestDiff namedtuples.

Raises: StepFailure: If any dependencies cannot be found on the branch, and the allow_missing_depends input property is False.

def get_cq_depend(self, gerrit_changes: List[GerritChange], chunk_size: int=4):

Get Cq-Depend string for the given list of Gerrit changes.

Args: gerrit_changes: The changes on which to depend. chunk_size: The number of CLs per ‘Cq-Depend:’ line.

Return: The full Cq-Depend string.

@staticmethod
def get_cq_depend_reference(gerrit_change: GerritChange):

Return the Cq-Depend reference string for the given change.

Args: gerrit_change: The change of interest.

Returns: The reference string for the change, e.g. chromium:12345
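
A small sketch combining the two (the change values are illustrative):

ref = api.cros_cq_depends.get_cq_depend_reference(gerrit_change)
# e.g. 'chromium:12345'
cq_depend = api.cros_cq_depends.get_cq_depend(gerrit_changes, chunk_size=4)
# Produces 'Cq-Depend:' lines with at most four references each.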

def get_mutual_cq_depend(self, gerrit_changes: List[GerritChange]):

Mutually Cq-Depend all given Gerrit changes.

Args: gerrit_changes: Changes to mutually CQ-depend.

Return: Cq-Depend strings in same order as changes.

recipe_modules / cros_debug

DEPS: cros_infra_config, easy, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/step, recipe_engine/time

Various methods for debugging recipes builds.

class CrosDebugApi(RecipeApi):

A module to be used for debugging builders.

def patch_chromite_head(self, cl_number: int, patchset: int=1):

Patch the chromite-HEAD checkout with the given CL.

Intended for use in a led job. To use this, call this function at the end of cros_build_api.reset_checkout.

Args: cl_number: Number of the (chromite) CL, e.g. 5095955. patchset: Patchset of the change.

def pause_and_wait_for_signal(self, timeout=(10 * 60), override_led_launch_only_staging=False, test_location_override=None):

Halt the builder and wait for a signal to continue.

This method is meant to be used to debug a builder. The thought here is that we've exhausted all other possibilities and our course of action is to try to run the recipe and then ssh into the bot.

To use this, call the method right where you want the builder to halt. When you run the build, it will halt there and wait. If you look at the builder's step output, it should tell you that it is sleeping for a time span, and specify a sentinel file to create when you are ready to release the bot and continue the execution.

By default, this will only work for a led launch against a staging builder, but if you are determined to run it against a prod builder, or to merge the code through and run it in staging, you can specify the override flag.

Args: timeout (int): how long to wait for a continue signal, in seconds. Defaults to 10 minutes. override_led_launch_only_staging (bool): override flag to allow outside led and/or outside of staging. test_location_override (Path): by default this method creates its own temp location to look for the resume file, but for tests the location can be passed in with this property for ease of verification.
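
For example, to halt a staging builder for a 30-minute debugging window:

# Place this at the exact point where the builder should halt.
api.cros_debug.pause_and_wait_for_signal(timeout=(30 * 60))
# The step output names a sentinel file; create it on the bot to resume.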

recipe_modules / cros_dupit

DEPS: easy, depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

API for DupIt script. See the design of this recipe in go/cros-dupit.

class DupItApi(RecipeApi):

A module for the DupIt script.

def configure(self, rsync_mirror_address, rsync_mirror_rate_limit, gs_distfiles_uri, ignore_missing_args=False, filter_missing_links=False, regex_for_archival_sync=None, gs_uri_for_archival_sync=None, path_datetime_for_archival_sync=None, gs_topdir_backfill=False):

Configure the DupIt script module.

Args:

  • rsync_mirror_address: the rsync mirror address that contains Gentoo distfiles.
  • rsync_mirror_rate_limit: the rate limit of syncing from public mirror.
  • gs_distfiles_uri: the Google cloud storage URI which stores all Gentoo distfiles.
  • ignore_missing_args: have rsync ignore files that go missing during synchronization.
  • filter_missing_links: filter out symlinks that are missing (such as directories).
  • regex_for_archival_sync: if this string is non-empty, sync any files from the remote mirror that match the regex.
  • gs_uri_for_archival_sync: the base GS URI used for syncing files matching ‘regex_for_archival_sync’.
  • path_datetime_for_archival_sync: an additional path for archival syncing that is interpreted by datetime strftime (using UTC). Gets added to the end of ‘gs_uri_for_archival_sync’.
  • gs_topdir_backfill: enable a workaround for Gentoo distfiles. See b/302226413 for more information.

@property
def gs_distfiles_uri(self):

@property
def rsync_mirror_address(self):

@property
def rsync_mirror_rate_limit(self):

def run(self):

@property
def tmp_distfiles_path(self):

recipe_modules / cros_history

DEPS: cros_tags, easy, naming, skylab_results, recipe_engine/buildbucket, recipe_engine/step, recipe_engine/time

A module to use build history to avoid redundant builds.

class CrosHistoryApi(RecipeApi):

A module to use build history to avoid redundant builds.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def get_annealing_from_snapshot(self, snapshot_id: str):

Find the annealing build that created the snapshot with the given ID.

Args: snapshot_id: Manifest snapshot commit ID.

Returns: If an Annealing build is found, then a proto message of that build. Otherwise, None.

def get_matching_builds(self, build: build_pb2.Build, statuses: Optional[List[‘bb_common_pb2.Status’]]=None, start_build_id: Optional[int]=None, limit: Optional[int]=None):

Get builds with the matching builder and gerrit_changes.

Args: build: Build to match for. statuses: Query for builds with these statuses. start_build_id: Exclude builds older than this ID. limit: Number of results to return. Latest first.

Returns: List of builds which meet the conditions ordered from latest to oldest.

def get_passed_builds(self, tags: Optional[List[bb_common_pb2.StringPair]]=None):

Retrieve passed builds with the same patches as current build.

Args: tags: Get builds with these tags.

Returns: Passed builds with the most recent build per builder.

def get_passed_tests(self):

Find all tests that have passed with the given patches.

Returns: Names of passed tests, if any.

def get_previous_test_results(self, test_plan: GenerateTestPlanResponse):

Get the tests from the previous run.

Args: test_plan: The test plan which contains the tests for which to retrieve the results from previous runs.

Returns: A tuple containing the list of the previous VM test builds and the list of the previous HW test results.

def get_previous_test_task_ids(self):

Get the task ids of the latest test invocations.

Returns: A tuple (vm_build_ids, hw_build_ids), where:

  • vm_build_ids is a list of buildbucket IDs for all Tast VM tests for the latest invocation of this builder with the same set of Gerrit changes.
  • hw_build_ids is a list of buildbucket IDs for all Skylab tests for the latest invocation of this builder with the same set of Gerrit changes.

def get_snapshot_builds(self, snapshot: bb_common_pb2.GitilesCommit, builder_list: Optional[Set[str]]=None, statuses: Optional[List[‘bb_common_pb2.Status’]]=None, patches: Optional[List[chromiumos_common_pb2.GerritChange]]=None):

Get builds run at the given snapshot, with additional optional filtering.

Args: snapshot: Snapshot to search on. builder_list: List of builder names to filter by. If falsy, no name filtering is performed. statuses: The statuses of builds to return. If falsy, no status filtering is performed. patches: Patches applied to the snapshot to search on. If falsy, no patch filtering is performed.

Returns: Builds with the same snapshot and additional filtering.

def get_test_failure_builders(self):

Get builders with the given patches that failed tests in the last run.

Returns: Names of builders with HW or VM testing failures, if any.

@staticmethod
def get_upreved_pkgs(annealing_build: build_pb2.Build):

Retrieve the packages upreved by the annealing build.

Args: annealing_build: The Annealing build in question.

Returns: List of upreved packages.

def hours_since_breakage(self, broken_until: str):

Determine how long it has been since the tree was fixed.

Args: broken_until: manifest snapshot SHA from builderconfig.

Returns: hours since broken_until snapshot creation.

def is_build_broken(self, build_snapshot: str, broken_until_snapshot: str):

Determine whether the build to be recycled is broken.

Args: build_snapshot: GitilesCommit id of the build to be recycled. broken_until_snapshot: SHA of the manifest snapshot from the broken_until config.

Returns: Whether to recycle the build.

@functools.cached_property
def is_retry(self):

Determine if this build is being retried.

Returns: Boolean indicating if it is a retry.

def set_passed_tests(self, tests: Iterable[str]):

Record the tests that passed in the current run.

This exposes the tests to history, so future runs may know which tests have passed and which have not.

Args: tests: Unique names of the tests that passed.
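
A sketch of how a retry can use this history (planned_tests and passed_test_names are illustrative):

already_passed = api.cros_history.get_passed_tests()
to_run = [t for t in planned_tests if t not in already_passed]
# ... run 'to_run' and collect the names of the tests that passed ...
api.cros_history.set_passed_tests(passed_test_names)  # visible to future retries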

@property
def start_time_in_seconds(self):

Generate start time in seconds.

recipe_modules / cros_infra_config

DEPS: easy, gitiles, src_state, depot_tools/gitiles, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/led, recipe_engine/step, recipe_engine/time

Module providing builder config.

class CrosInfraConfigApi(RecipeApi):

A module for accessing data in the chromeos/infra/config repo

go/robocrop-chrome-browser-proposal: This module is temporarily used to access the Chrome Browser infradata/config repo

@property
def build_id(self):

Returns the build ID of this build.

def build_target_dict(self, builds: List[Build]):

Take a list of builds and return a map of build_target names to build.

This function will omit any builds that don't define input build targets.

Args: builds: builds to extract build_target.name set from.

Returns: A dict mapping build target names to builds.

@property
def config(self):

Return the config for this builder.

This convenience property wraps cros_infra_config.get_builder_config, which caches the data.

Returns: BuilderConfig for this builder.

@property
def config_or_default(self):

Config or default config.

The default config is empty, except for:

  • id.name = this builder
  • chrome.internal = True
  • build.install_packages.run_spec = RUN
  • build.use_flags = ‘chrome_internal’

def configure_builder(self, commit: Optional[GitilesCommit]=None, changes: Optional[List[GerritChange]]=None, name: str=‘configure builder’, choose_branch: bool=True, config_ref: Optional[str]=None, lookup_config_with_bucket: bool=False):

Configure the builder.

Fetch the builder config. Determine the actual commit and changes to use. Set the bisect_builder and use_flags.

Args: commit: The gitiles commit to use. Default: GitilesCommit(.... ref=‘refs/heads/snapshot’). changes: The gerrit changes to apply. Default: the gerrit_changes from buildbucket. name: Step name. Default: “configure builder”. choose_branch: If true, choose a branch for the gitiles commit if none is given. config_ref: Override properties.config_ref (for config CLs). lookup_config_with_bucket: If true, include builder.bucket in key when looking up the BuilderConfig. If the bucket is not included in the key and there are builders with the same name (in different buckets), it is undefined which BuilderConfig is returned. The bucket will eventually be included in the key by default, see b/287633203.

Returns: The BuilderConfig for this builder, if one was found.
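
Typically called near the start of a recipe (a minimal sketch):

config = api.cros_infra_config.configure_builder(
    lookup_config_with_bucket=True)  # disambiguates same-named builders, see b/287633203
# 'config' is the BuilderConfig for this builder, if one was found.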

def determine_if_staging(self, config: Optional[BuilderConfig]=None):

Configure the builder's knowledge of whether it's running in staging.

Args: config: This build's BuilderConfig.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=1), condition=(lambda e: getattr(e, ‘had_timeout’, False)))
def download_binproto(self, filename: str, step_test_data: recipe_test_api.StepTestData, timeout: Optional[int]=None, repo: str=CHROME_OS_INFRA_CONFIG_REPO_URL, msg: Optional[message.Message]=None):

Helper method to fetch a file from gitiles.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=1), condition=(lambda e: getattr(e, ‘had_timeout’, False)))
def download_txt(self, filename: str, step_test_data: recipe_test_api.StepTestData, timeout: Optional[int]=None, repo: str=CHROME_OS_INFRA_CONFIG_REPO_URL):

Helper method to fetch a txt file from gitiles.

@property
def experiments(self):

Return the list of experiments active for this build.

@property
def experiments_for_child_build(self):

Return value for bb schedule_request experiments arg.

def force_reload(self):

Force a reload of the config map from ToT.

@property
def fresh_config(self):

Return a freshly loaded config for this builder.

Returns: BuilderConfig for this builder, freshly reloaded.

@property
def gerrit_changes(self):

def get_bot_policy_config(self, application: str=‘ChromeOS’):

Get BotPolicies as defined in infra/config. If application is Chrome, BotPolicies will be fetched from infradata/config.

Returns: BotPolicyCfg as defined in the config repo.

def get_build_target(self, build: Optional[Build]=None):

Return the build target from input properties.

Args: build: A buildbucket build, which is expected to have a ‘build_target’ input property, or None for the current build.

Returns: The build target, or None.

def get_build_target_name(self, build: Optional[Build]=None):

Return the build target name from input properties.

Args: build: A buildbucket build, which is expected to have a ‘build_target’ input property, or None for the current build.

Returns: The name of the build target, or None.

def get_builder_config(self, builder_name: str, *, bucket_name: Optional[str]=None, missing_ok: bool=False):

Gets the BuilderConfig for the specified builder from HEAD.

Finds the BuilderConfig whose id.name matches the specified Buildbucket builder. If bucket_name is specified, looks up by (bucket_name, builder_name). Note that looking up by just builder name is potentially ambiguous as builders in different buckets can have the same name, see b/287633203. Eventually, bucket will be required in the lookup.

This function loads the checked in proto and forms a map from id.name and (id.bucket, id.name) to BuilderConfig on the first call. Subsequent calls just look up in the map, so will be much faster than the first call. This is meant for the case when many lookups are needed, e.g. a parent builder looks up all child configs.

Args: builder_name: The Buildbucket builder to look for, matched against BuilderConfig's id.name. bucket_name: The Buildbucket bucket to look in, matched against BuilderConfig's id.bucket. If not set, only builder_name is used in the lookup. Will eventually be required. missing_ok: Whether to allow a missing config.

Returns: A BuilderConfig proto.

Raises: A LookupError if a BuilderConfig is not found for the specified builder.
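
For example (the builder and bucket names are illustrative; returning None for a missing config is an assumption implied by missing_ok):

cfg = api.cros_infra_config.get_builder_config(
    'eve-cq', bucket_name='cq', missing_ok=True)
if cfg is None:
  pass  # no BuilderConfig checked in for this builder; fall back as appropriate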

def get_ctp2_pools_config(self):

Download ctp2 pools config and return list of ctp2 pools.

Returns: List[str]: List of allowed ctp2 pools.

def get_dut_tracking_config(self):

Get TrackingPolicyCfg as defined in infra/config.

Returns: TrackingPolicyCfg as defined in the config repo.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=1), condition=(lambda e: getattr(e, ‘had_timeout’, False)))
def get_realms_list(self):

Helper method to fetch the list of chromeos realms from gitiles.

def get_test_filter_config(self):

Download config files and return the extracted config protos.

Returns: TestDisablementCfg object of the config.

def get_vm_retry_config(self):

Get SuiteRetryCfg as defined in infra/config for tast vm.

Returns: SuiteRetryCfg as defined in the config repo.

@property
def gitiles_commit(self):

def initialize(self):

Perform one-time initialization.

This method automatically runs when the build begins.

Set whether the builder is staging, and align properties with experiments. Hold off on other fields until they are used, to avoid unnecessary clutter in the expectation files.

@property
def is_configured(self):

@property
def is_staging(self):

@property
def override_release_channels(self):

@property
def package_git_revision(self):

@property
def props_for_child_build(self):

Return properties dict meant to be passed to child builds.

Preserve $chromeos/cros_infra_config when launching a child build.

def safe_get_builder_configs(self, builder_names: List[str]):

Gets the BuilderConfigs for the specified builder names from HEAD.

The returned dict will not contain key/values for builder names that could not be found in config.

Args: builder_names: Buildbucket builders to look for, matched against BuilderConfig id.name.

Returns: Dict mapping builder names to found BuilderConfigs.

def set_build_criticality(self, critical: Optional[‘Trinary’]=None, override: bool=False):

Set the buildbucket.build.critical value.

Args: critical: The value to set for the build criticality. If None, will read the value from the builder config. override: Whether to override the existing criticality value.

def should_exit(self, run_spec: ‘BuilderConfig.RunSpec’):

@property
def should_override_release_channels(self):

def should_run(self, run_spec: ‘BuilderConfig.RunSpec’, default: bool=False):

Return whether run_spec represents a step that should run.

Args: run_spec: The RunSpec enum value to check. default: The value to return if run_spec is UNSPECIFIED.

recipe_modules / cros_lkgm

DEPS: cros_infra_config, cros_release, cros_schedule, cros_source, cros_version, recipe_engine/buildbucket, recipe_engine/step

Module for ChromeOS LKGM (Last Known Good Manifest).

class CrosLkgmApi(RecipeApi):

A module to handle the LKGM process and other interactions between the Release & Public builders.

def cleanup_cls(self):

Performs the LKGM clean-up process.

This performs only the clean-up of LKGM CLs; in contrast, do_lkgm also performs the actual uprev process.

def collect_public_build(self):

Collects results from the public build.

Returns: (common_pb2.Build) The scheduled build.

def do_lkgm(self, release_build_results, use_branch=False):

Performs the LKGM process if the build is an LKGM candidate.

This should only be called from a release orchestrator.

Args: release_build_results (list(common_pb2.Build)): list of release build results as returned by api.orch_menu.plan_and_run_children. use_branch (bool): if set, upload the LKGM CL to the Chrome branch (e.g. refs/branch-heads/5204) instead of ToT.

@property
def has_public_build(self):

Check if a public build was scheduled.

def schedule_public_build(self):

Schedules a public build.

Returns: (common_pb2.Build) The scheduled build.

recipe_modules / cros_lvfs_mirror

DEPS: depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

API for LvfsMirror script.

class LvfsMirror(RecipeApi):

A module for the LvfsMirror script.

def configure(self, mirror_address, gs_uri):

Configure the LvfsMirror script module.

Args:

  • mirror_address: The mirror address for the LVFS repository.
  • gs_uri: The Google Cloud Storage URI to mirror the LVFS repository into.

@property
def gs_uri(self):

@property
def local_cache(self):

@property
def mirror_address(self):

def run(self):

recipe_modules / cros_prebuilts

DEPS: binhost_lookup_service, cros_build_api, cros_infra_config, cros_source, cros_version, git, git_footers, git_txn, repo, src_state, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step, recipe_engine/swarming, recipe_engine/time

API for uploading CrOS prebuilts to Google Storage.

class CrosPrebuiltsApi(RecipeApi):

A module for uploading package prebuilts.

def set_binhosts(self, binhosts: List[Tuple[(BuildTarget, str)]], private: bool, key: binhost_pb.BinhostKey, overriding_max_uris: Optional[Dict[(str, int)]]=None):

Set the target's Portage binhosts to point to the given URIs.

This function updates a conf file within the target's overlay, commits the change, and pushes it.

Args: binhosts: List of tuples of build targets and their new URIs. private: Whether the target's binhost is private. key: The binhost key, e.g. POSTSUBMIT_BINHOST. overriding_max_uris: Dict to override max_uris per board: the key is the build target name, and the value is the max_uris to use for that build target. None to use the default value.

@exponential_retry(retries=GIT_PUSH_MAX_RETRY_COUNT, delay=datetime.timedelta(seconds=1))
def set_binhosts_retry(self, binhosts: List[Tuple[(BuildTarget, str)]], private: bool, key: binhost_pb.BinhostKey, target_project: ProjectInfo, branch: str, overriding_max_uris: Optional[Dict[(str, int)]]=None):

Utility method to update the target's Portage binhosts.

This function is intended to be called from set_binhosts.

Args: binhosts: List of tuples of build targets and their new URIs. private: Whether the target's binhost is private. key: The binhost key, e.g. POSTSUBMIT_BINHOST. target_project: Project of the binhosts. branch: Branch name to update. overriding_max_uris: Dict to override max_uris per board: the key is the build target name, and the value is the max_uris to use for that build target. None to use the default value.

def upload_chrome_prebuilts(self, target: BuildTarget, sysroot: Sysroot, chroot: Chroot, profile: Optional[Profile], kind: BuilderConfig.Id.Type, gs_bucket: str, private: bool):

Upload Chrome binary prebuilts for the build target to Google Storage.

Args: target: The build target to upload prebuilts for. sysroot: The sysroot whose prebuilts are being uploaded. chroot: Chroot to work with. profile: The Profile, or None. kind: Kind of prebuilts to upload. gs_bucket: Google storage bucket to upload prebuilts to. private: Whether or not the target prebuilts are private.

Raises: ValueError: If a gs bucket was not specified.

def upload_devinstall_prebuilts(self, target, sysroot, chroot, gs_bucket):

Upload binary devinstall prebuilts for build target to Google Storage.

Args: target (BuildTarget): The build target to upload prebuilts for. sysroot (Sysroot): The sysroot whose prebuilts are being uploaded. chroot (chromiumos.common.Chroot): Chroot to work with. gs_bucket (str): Google storage bucket to upload prebuilts to.

def upload_host_prebuilts(self, target: BuildTarget, chroot: Chroot, kind: BuilderConfig.Id.Type, gs_bucket: str, profile: Optional[Profile]=None):

Upload host binary prebuilts to Google Storage.

Args: target: The build target to upload prebuilts for. chroot: Chroot to work with. kind: Kind of prebuilts to upload. gs_bucket: Google storage bucket to upload prebuilts to. profile: The build target profile, or None.

Raises: StepFailure: If a gs bucket was not specified.

def upload_target_prebuilts(self, target, sysroot, chroot, profile, kind, gs_bucket, private=True):

Upload binary prebuilts for the build target to Google Storage.

Determines what to upload, uploads it, and points Portage to the upload URI. This step works entirely within the workspace checkout.

Args: target (BuildTarget): The build target to upload prebuilts for. sysroot (Sysroot): The sysroot whose prebuilts are being uploaded. chroot (chromiumos.common.Chroot): Chroot to work with. profile (chromiumos.Profile): The Profile, or None. kind (BuilderConfig.Id.Type): Kind of prebuilts to upload. gs_bucket (str): Google storage bucket to upload prebuilts to. private (bool): Whether or not the target prebuilts are private.

recipe_modules / cros_release

DEPS: build_menu, build_reporting, checkpoint, conductor, cros_artifacts, cros_infra_config, cros_release_util, cros_source, cros_version, easy, failures, gerrit, git, git_footers, gobin, paygen_orchestration, repo, signing, skylab, src_state, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

An API for providing release related operations (e.g. paygen, signing).

class CrosReleaseApi(RecipeApi):

@buildspec.setter
def buildspec(self, buildspec: ManifestLocation):

@property
def channels(self):

Return the channels as passed into input properties.

def check_buildspec(self, fatal: bool=False):

Checks that the build was given a buildspec, and that no build already exists for this buildspec (unless this build is a retry).

Args: fatal: Whether or not to kill the build if the build already ran.

def check_channel_override(self):

def create_buildspec(self, specs_dir=‘buildspecs’, step_name=‘create buildspec’, dry_run=False, gs_location=None):

Create a pinned manifest and upload to manifest-versions and/or GS.

If the buildspec is uploaded to GS, this function also creates a public buildspec using Manifest Doctor.

Args: specs_dir (str): Relative path in manifest-versions in which to place the pinned manifest. step_name (str): The step name to use. dry_run (bool): Whether the git push is --dry-run. gs_location (string): If set, will also upload the pinned manifest to GS.

def emit_release_buckets(self, build_target, step):

Emit the release buckets for the configured channels in step logs.

Args: build_target (str): build target to include in the path. step (StepPresentation): step to log into.

def get_image_dir(self, config, sysroot, step):

Determine the image directory unsigned artifacts are uploaded in.

Args: config (BuilderConfig): The Builder Config for the build. sysroot (Sysroot): sysroot to use. step (StepPresentation): the step to log into.

Returns: GS image directory as a gs:// uri.

def push_and_sign_images(self, config, sysroot):

Call the Push Image Build API endpoint for the build.

This pushes the image files to the appropriate bucket and prepares them for signing. The actual execution of these procedures is handled in the underlying script, chromite/scripts/push_image.py. Must be used in the context of a build.

Args: config (BuilderConfig): The Builder Config for the build. sysroot (Sysroot): sysroot to use.

Return: Tuple of (gs_image_dir, instructions_uris): gs_image_dir is the GS directory the image was pushed from. instructions_uris is a list of URIs to instructions files for the pushed images.

@property
def release_bucket(self):

Return the release_bucket as passed into input properties.

@property
def resultdb_gitiles_commit(self):

Return the gitiles commit used for ResultDB as created by this module, or None.

def run_payload_generation(self, use_split_paygen: bool=False):

Run the generation of release payloads using the context of a build.

This is blocking: it will launch the paygen orchestrator, and wait for it to finish. This function assumes that it is run after a new release image has been built.

Args: use_split_paygen: Whether to use the new split paygen flow.

def set_output_properties(self):

Set release-related output properties for the build.

def set_release_qs_account(self):

Fetches the RC schedule and determines which QS account to use.

If the schedule cannot be fetched or is malformatted, reasonable defaults will be used. See go/dynamic-rc-prio for more context.

def set_resultdb_gitiles_commit(self, repo_url: str, repo_host: str, project: str, branch: str, position: int):

Set the gitiles commit used for ResultDB.

Args: repo_url: URL where the repo is hosted. repo_host: The identity of the gitiles host. project: Repository name on the host. branch: Branch where commit was fetched. position: Used to define a total order of commits on the ref.

@property
def sign_types(self):

Return the sign types as passed into input properties.

def uprev_packages(self):

Uprev any packages that contain differences.

Intended to be run by non-ToT release orchestrators.

Return: all_uprevs_passed(bool): True if all uprevs succeed, False if ANY failed.

def validate_sign_types(self):

Checks whether the configured sign types are valid for signing.

Raises: StepFailure: If any of the given image types is not supported for signing.

recipe_modules / cros_release_config

DEPS: cros_schedule, cros_source, gerrit, git, repo, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/step, recipe_engine/time

An API for managing release config.

class CrosReleaseConfigApi(RecipeApi):

def update_config(self, branch: str, auto_submit: bool, dryrun: bool=False):

Creates CLs updating config file to include new release branch.

While Rubik is being turned-up, this endpoint modifies both the legacy config in chromite as well as the Rubik starlark config in infra/config.

Args: branch: Release, stabilize or firmware branch, e.g. “release-R89-13729.B”, “stabilize-15129.B”, or “firmware-R126-12345.B”. auto_submit: Whether to autosubmit the config change. dryrun: If in dryrun mode, we'll abandon the change.

recipe_modules / cros_release_util

DEPS: cros_infra_config, cros_source

An API for providing release related utility functions.

class CrosReleaseUtilApi(RecipeApi):

@staticmethod
def channel_long_string_to_enum(str_channel):

Convert long channel name strings (e.g. ‘beta-channel’) to enum values.

@staticmethod
def channel_short_string_to_enum(str_channel):

Convert short channel name strings (e.g. ‘beta’) to enum values.

@staticmethod
def channel_strip_prefix(channel):

Takes a common_pb2.Channel and returns an unprefixed str (e.g. beta).

@staticmethod
def channel_to_long_string(channel):

Takes a common_pb2.Channel and returns a suffixed str (e.g. dev-channel).

@staticmethod
def channel_to_short_string(channel):

Takes a common_pb2.Channel and returns an unsuffixed str (e.g. dev).

def image_type_to_str(self, image_type: ImageType):

Extracts the image type as a lowercase string.

E.g. IMAGE_TYPE_RECOVERY -> recovery.

@staticmethod
def match_channels(channel1, channel2):

Determine if two channels are equal, even if represented differently.

Args: channel1 (str|Channel): A representation of a channel, either as a Channel enum (e.g. Channel.CHANNEL_BETA), a short string (e.g. “beta”), or a long string (e.g. “beta-channel”). channel2 (str|Channel): As above.

Returns: bool: Whether the two args describe the same channel.
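
For example (assuming Channel is the common_pb2.Channel enum):

api.cros_release_util.match_channels('beta', 'beta-channel')        # True
api.cros_release_util.match_channels(Channel.CHANNEL_BETA, 'beta')  # True
api.cros_release_util.match_channels('dev-channel', 'beta')         # False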

def release_builder_name(self, build_target, branch=None, staging=False):

Determine the Rubik child builder name for the given build_target.

Args: build_target (string): name of the build target, e.g. zork or kevin-kernelnext. branch (string): optional, branch we're on. staging (bool): optional, whether or not we're in staging.

Return: The Rubik child builder, e.g. zork-release-main.
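
For example, on the default branch (return value per the docstring above):

api.cros_release_util.release_builder_name('zork')  # -> 'zork-release-main'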

recipe_modules / cros_relevance

DEPS: cros_build_api, cros_history, cros_infra_config, cros_source, easy, git_footers, gobin, src_state, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/file, recipe_engine/path, recipe_engine/step

Module for determining if a build is unnecessary.

class CrosRelevanceApi(RecipeApi):

A module for determining if a build is unnecessary.

def call_pointless_build_checker(self, check_request, step_presentation, is_pointless_test_value=False):

Returns the result of calling the Pointless Build Checker.

Args: check_request (PointlessBuildCheckRequest): The request to pass into the pointless build checker. step_presentation (StepPresentation): The parent step presentation. This is used for adding logs to the UI. is_pointless_test_value (bool): The return value when testing. The default is False.

Returns: check_result (PointlessBuildCheckResponse): The response from calling the Pointless Build Checker.

def check_force_relevance_footer(self, gerrit_changes, configs):

Check the incoming gerrit changes to determine if we force relevance.

Args: gerrit_changes (list[GerritChange]): The gerrit changes. configs (list[BuilderConfig]): The Builder Configs for the build.

Returns: A list of target names, derived from configs, to be forced relevant.

def get_affected_paths(self, patch_sets):

Returns the union of all paths in the list of patchsets.

Args: patch_sets (List[PatchSet]): List of Gerrit Patchsets to be applied to the build, if any.

Returns: List[str]: The union of all paths in the patchsets.

def get_dependency_graph(self, sysroot, chroot, packages=None):

Calculates the dependency graph for the build target & SDK.

Args: sysroot (Sysroot): The Sysroot being used. chroot (chromiumos.Chroot): The chroot it is being run in. packages (list[chromiumos.PackageInfo]): The packages for which to generate the dependency graph.

Returns: (chromite.api.DepGraph, chromite.api.DepGraph): A tuple of opaque dependency graph objects, with the first element being the dependency graph for the target and the second element the graph for the SDK/chroot.

def get_package_dependencies(self, sysroot, chroot, patch_sets=None, packages=None, include_rev_deps=False):

Calculates the dependencies for the build target.

Args: sysroot (Sysroot): The Sysroot being used. chroot (chromiumos.Chroot): The chroot it is being run in. patch_sets (List[PatchSet]): The changes applied to the build. Used to determine the affected paths. If empty / None returns package dependencies for all paths. packages (list[chromiumos.PackageInfo]): The list of packages for which to get dependencies. If none are specified the standard list of packages is used.

Returns: (List[str]): A list of package dependencies for the build target.

def is_cq_build_relevant(self, patch_sets: List[PatchSet], dep_graph: DepGraph, force_relevant: bool=False, is_pointless_test_value: bool=False):

Determines if changes are relevant to the CQ run.

If build_target is set, then the chromiumos workspace must have been checked out prior to calling this method. This is a requirement for BuildDependencyGraph checks.

Args: patch_sets: The PatchSets applied to the build. dep_graph: The dependency graph to compare the changes against to test for build relevancy. force_relevant: Whether to always declare the build relevant. is_pointless_test_value: The test return value of the pointless build checker. Default is False, meaning the build is not pointless.

Returns: bool: Whether the changes are relevant to the CQ run.

def is_depgraph_affected(self, gerrit_changes, gitiles_commit, dep_graph, test_value=None, name=None):

Determines if a Gerrit Change affects a given dependency graph.

Args: gerrit_changes (bbcommon_pb2.GerritChange): The Gerrit Changes to be applied for the build, if any. gitiles_commit (bbcommon_pb2.GitilesCommit): The manifest-internal snapshot Gitiles commit. dep_graph (chromite.api.DepGraph): The dependency graph to compare the Gerrit changes against to test for build relevancy. test_value (bool): The answer to use for testing, or None. name (str): The step name to display, or None for default.

Returns: bool: Whether the given Gerrit Change affects the given dependency graph.

def postsubmit_relevance_check(self, gitiles_commit, dep_graph):

Determines if postsubmit builder is relevant for given snapshot.

Args: gitiles_commit (bbcommon_pb2.GitilesCommit): The manifest-internal snapshot Gitiles commit. dep_graph (chromite.api.DepGraph): The dependency graph to compare the Gerrit changes against to test for build relevancy.

Returns: bool: Whether any packages that the target depends on have been uprevved in the latest snapshot, or the build was forced relevant.

def run_build_planner(self, builder_configs, gerrit_changes, gitiles_commit, name=None):

Determines which builders must be run (and which can be skipped).

This filters on preconfigured RunWhen rules, as well as on rules allowing skipping of image builders. Image builders are those that run the build_target recipe, producing an IMAGE_ZIP CrOS artifact.

Args: builder_configs (list[chromiumos.BuilderConfig]): builder configs to consider for skipping. gerrit_changes (bbcommon_pb2.GerritChange): The Gerrit Changes to be applied for the build, if any. gitiles_commit (bbcommon_pb2.GitilesCommit): The manifest-internal snapshot Gitiles commit. name (str): The step name.

Returns: PlannedBuilders: Necessary and skipped builders as a tuple.

@toolchain_cls_applied.setter
def toolchain_cls_applied(self, value: Optional[bool]):

recipe_modules / cros_resultdb

DEPS: cros_infra_config, exonerate, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/step

cros_resultdb is a module to ease interaction with ResultDB for ChromeOS. It extends the functionality in the resultdb module with ChromeOS-specific utilities.

class ResultDBCommand(RecipeApi):

Module for chromium tests on skylab to upload results to ResultDB.

def apply_exonerated_exonerations(self, invocation_ids):

Exonerate already exonerated test failures for the given invocations.

Args: invocation_ids (list(str)): The ids of the invocations whose results we should try to exonerate.

def apply_exonerations(self, invocation_ids, default_behavior=Request.Params.TestExecutionBehavior.BEHAVIOR_UNSPECIFIED, behavior_overrides_map=None, variant_filter=None):

Exonerate unexpected test failures for the given invocations.

Currently only supports exonerating tests based on criticality. First attempts to exonerate based on the test run's default behavior. If the default behavior is not exonerable, tries to apply a test case behavior override.

Args: invocation_ids (list(str)): The ids of the invocations whose results we should try to exonerate. default_behavior (TestExecutionBehavior): The default behavior for all tests in the test_runner build. behavior_overrides_map (dict{str: TestExecutionBehavior}): Test-specific behavior overrides that supersede the default behavior. variant_filter (dict): Attributes which must all be present in the test result variant definition in order to exonerate.
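A rough sketch of the documented precedence, where a test-specific override supersedes the run-level default (names are illustrative, not the module's internals):

    def effective_behavior(test_name, default_behavior, behavior_overrides_map=None):
        # A test-specific override, if present, supersedes the default
        # behavior for that test.
        overrides = behavior_overrides_map or {}
        return overrides.get(test_name, default_behavior)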

@property
def current_invocation_id(self):

Return the current invocation's id.

def export_invocation_to_bigquery(self, bigquery_exports=None):

Modifies the current invocation to be exported to BigQuery (along with its children) once it is finalized.

This should only be called on top-level invocations; if it is called on both a parent and a child, all test results in the child will be exported twice.

Note that this should normally be configured on the builder definition in infra/config rather than in the recipe. Only use this when a builder cannot be determined to always export to BigQuery at configuration time, but needs to determine it at recipe runtime.

Args: bigquery_exports (list(resultdb.BigQueryExport)): The BigQuery export configurations of tables and predicates of what to export.

def extract_chromium_resultdb_settings(self, test_args):

Extract resultdb settings from test_args for chromium test results.

Extracts resultdb settings from test_args. Also converts base_tags from a list of strings [‘key:value’] into a list of string tuples [(key, value)] as is expected by resultdb.wrap().

Args: test_args (string): Extra autotest arguments, e.g. “key1=val1 key2=val2”. Chromium tests use test_args to pass runtime parameters to our autotest wrapper. We reuse it to pipe resultDB arguments, because it is easy to access in the test runner recipe. test_args must contain resultdb_settings, which is a base64-encoded compressed JSON string wrapping all resultdb parameters.

Returns: A dictionary wrapping all ResultDB upload parameters.

Raises: ValueError: If resultdb settings are not found in the test_args.
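A minimal sketch of the extraction described above, assuming resultdb_settings arrives as a key=value pair holding base64-encoded JSON (whether the payload is additionally compressed is not modeled here; the helper name is hypothetical):

    import base64
    import json

    def extract_settings(test_args):
        # test_args looks like 'key1=val1 resultdb_settings=<base64 blob>'.
        args = dict(kv.split('=', 1) for kv in test_args.split())
        if 'resultdb_settings' not in args:
            raise ValueError('resultdb settings not found in test_args')
        settings = json.loads(base64.b64decode(args['resultdb_settings']))
        # Convert base_tags from ['key:value', ...] into [(key, value), ...]
        # as expected by resultdb.wrap().
        settings['base_tags'] = [
            tuple(tag.split(':', 1)) for tag in settings.get('base_tags', [])
        ]
        return settings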

def get_drone_artifact_directory(self, base_dir, result_format=None, artifact_directory='', is_cft=False):

Get the path to the test results artifact directory on the drone.

Currently only supports Tast and Gtest.

Args: base_dir (str): The path of the base test results on the drone server. For example, Chromium gtest result can be found at base_dir/autoserv_test/chromium/results. result_format (str): The format of the test results. artifact_directory (str): Path relative to the autotest result folder, ONLY for gtest, e.g. chromium/debug. For tast tests, this module relies on it to pass the runtime result path to the adapter, so user-defined artifact directories are not accepted. is_cft (bool): True if it's for CFT test results.

Returns: Path to the test results artifact directory on the drone server.

def get_drone_result_file(self, base_dir, result_format, autotest_name=‘chromium’, is_cft=False):

Get the path to the test results file on the drone.

These are hardcoded for tast and gtest in this module.

Args: base_dir (Path): The path of the base test results on the drone server. For example, Chromium gtest result can be found at base_dir/autoserv_test/chromium/results. result_format (str): The format of the test results. autotest_name: The autotest name for non-tast tests. By default, it is ‘chromium’, the generic wrapper name for browser gtests. is_cft (bool): True if it's for CFT test results.

Returns: Path to the test results file on the drone server.

def report_filtered_test_cases(self, test_names, base_variant, base_tags=None, reason=‘filtered’):

Upload test results for filtered test cases to ResultDB.

These filtered test cases should not run, so their result status is marked as SKIP and the expected field is True.

Args: test_names (list[str]): The names of the tests that should not run. base_variant (dict): Variant key-value pairs to attach to the test results. base_tags (list[tuples]): List of tags to attach to the test results.

def report_missing_test_cases(self, test_names, base_variant, base_tags=None):

Upload test results for missing test cases to ResultDB. These missing test cases should have run but unexpectedly did not, so their result status is marked as SKIP and the expected field is False.

Args: test_names (list[str]): The names of the tests that should have run but did not. base_variant (dict): Variant key-value pairs to attach to the test results. base_tags (list[tuples]): List of tags to attach to the test results.

def upload(self, config, testhaus_url=None, step_name=‘upload test results to rdb’):

Wrapper for uploading test results to resultDB.

Args: config (dict) A dict wrapping all resultdb parameters. testhaus_url (string): Link to the Testhaus logs for the test run. step_name (str): The name of the step or None for default.

recipe_modules / cros_schedule

DEPS: easy, recipe_engine/step, recipe_engine/time

API for working with CrOS's Schedule.

class CrosScheduleApi(RecipeApi):

A module for reading, committing, and manipulating the release schedule.

def fetch_chromiumdash_schedule(self, start_mstone=None, fetch_n=10):

Return the json schedule from chromiumdash.

Args: start_mstone (int): start with this milestone. Default: last branched milestone. fetch_n (int): Number of milestones to return. Default: 10.

Returns: (str): JSON string representing the results of the query, or None.

def get_chrome_branch(self, mstone):

Get the associated chrome branch for a milestone.

Args: mstone (int): Milestone to fetch.

Returns: (str): The chromium branch number or None.

def get_last_branched_mstone(self):

Gets the last branched milestone.

Returns: A chromiumos.chromiumdash.FetchMilestoneScheduleResponse.

Raises: StepFailure if not able to find mstone.

def get_last_branched_mstone_n(self):

Gets the last branched milestone number as an int.

def json_to_proto(self, sched_str_json):

Returns a FetchMilestoneScheduleResponse from JSON repr.
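This is presumably a thin wrapper over standard protobuf JSON parsing; a sketch under that assumption (the generated proto module name is a guess):

    from google.protobuf import json_format
    from chromiumos import chromiumdash_pb2  # assumed generated-module name

    def json_to_proto(sched_str_json):
        # Parse the JSON representation into the proto message.
        return json_format.Parse(
            sched_str_json, chromiumdash_pb2.FetchMilestoneScheduleResponse())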

recipe_modules / cros_sdk

DEPS: cros_build_api, cros_infra_config, cros_relevance, cros_source, cros_version, easy, git, goma, image_builder_failures, remoteexec, src_state, workspace_util, depot_tools/depot_tools, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step

API for interacting with cros_sdk, the interface to the CrOS SDK.

class CrosSdkApi(RecipeApi):

A module for interacting with cros_sdk.

def __call__(self, name, args, **kwargs):

Executes ‘cros_sdk’ with the supplied arguments.

Args:

  • name (str): The name of the step.
  • args (list): A list of arguments to supply to ‘cros_sdk’.
  • kwargs: Keyword arguments to pass to the ‘step’ call.

Returns: See ‘step.call’.

def build_chmod_chroot(self):

Chroot needs to be tightened to 755 for the build process.

@property
def chrome_root(self):

@property
def chroot(self):

Return a chromiumos.common.Chroot.

Note that use of the Chroot's component fields (e.g., Chroot.path or Chroot.out_path) on an individual basis (such as os.path.join(Chroot.path, “tmp”)) is usually incorrect. Recipes should not be making assumptions about the chroot path structure, and instead should funnel requests through the Build API, where inputs and outputs are represented in proto messages, and the API layer does any translation or copying of artifacts in and out of the chroot.
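To make the guidance concrete, a sketch of the discouraged pattern (the preferred route is to pass the Chroot message in a Build API request proto and let the API layer translate paths in and out of the chroot):

    import os

    def chroot_tmp_dir(chroot):
        # Usually incorrect: bakes in assumptions about the chroot's internal
        # path structure instead of funneling the request through the Build API.
        return os.path.join(chroot.path, 'tmp')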

@contextlib.contextmanager
def cleanup_context(self, checkout_path=None):

Returns a context that cleans the SDK chroot named cache.

This may be called before cros_source.ensure_synced_cache, since it yields immediately, and only accesses checkout_path during cleanup.

Args: checkout_path (Path): Path to source checkout. Default: cros_source.workspace_path.

def cleanup_sysroot(self):

def configure(self, chroot_parent_path):

Configure CrosSdkApi.

Args: chroot_parent_path (Path): Parent for chroot directory.

def configure_goma(self):

Configure goma for Chrome.

This is a helper function to do the various bits of cros_sdk configuration needed for Chrome to be built with goma.

Must be run with cwd inside a chromiumos source root.

def configure_remoteexec(self):

Configure remoteexec for Chrome.

def create_chroot(self, version=None, bootstrap=False, sdk_version=None, timeout_sec=‘DEFAULT’, test_data=None, test_toolchain_cls=None, name=None, no_delete_out_dir=False):

Initialize the chroot and link it into the workspace.

Create a chroot if one does not already exist in the chroot path. If one already exists, but is not reusable by this build (see _ensure_cache_state) or replace is True, delete the existing chroot and create a new one.

Args: version (int): Required SDK cache version, if any. Some recipes do not care what version the SDK is, they just need any SDK. bootstrap (boolean): Whether to bootstrap the chroot. Default: False. sdk_version (string): Optional. Specific SDK version to include in the CreateSdkRequest, e.g. 2022.01.20.073008. timeout_sec (int): Step timeout (in seconds). Default: None if bootstrap is True, otherwise 3 hours. test_data (str): test response (JSON) from the SdkService.Create call, or None to use the default in cros_build_api/test_api.py. test_toolchain_cls (bool): Test answer for detect_toolchain_cls. name (str): Step name. Default: ‘init sdk’. no_delete_out_dir (boolean): If True, out directory will be preserved.

Returns: chromiumos_pb2.Chroot protobuf for the chroot.
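A typical call from a recipe might look like this (argument values are illustrative; sdk_version reuses the example from the Args above):

    chroot = api.cros_sdk.create_chroot(
        bootstrap=False,
        sdk_version='2022.01.20.073008',
        name='init sdk',
    )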

@property
def cros_sdk_path(self):

Returns a Path to the cros_sdk script.

@property
def default_sdk_sysroot(self):

Returns the default SDK Sysroot.

@property
def force_off_toolchain_changed(self):

Return whether we are forcing toolchain_cls off for testing.

def get_toolchain_info(self, build_target: str):

Retrieve metadata about SDK/toolchain usage.

Args: build_target: Name of the build target.

Returns: Information about sdk/toolchain usage.

def goma_config(self):

def has_goma_config(self):

def has_remoteexec_config(self):

def initialize(self):

Cache the chroot path.

def link_chroot(self, checkout_path, chroot_path=None):

Link the chroot to a chromiumos checkout.

Args: checkout_path (Path): Path to the checkout root. chroot_path (Path): Path to the chroot, or None for the default.

@long_timeouts.setter
def long_timeouts(self, value):

Set long_timeouts.

This boolean is sticky.

def mark_sdk_as_dirty(self):

@property
def remoteexec_config(self):

def run(self, name, cmd, env=None, **kwargs):

Runs a command in a cros_sdk chroot.

It is assumed the current working directory is within a chromiumos checkout.

Args:

  • name (str): The name of the step.
  • cmd (list): A command and arguments to run.
  • env (dict): A dict of environment variables to pass to the command.
  • kwargs: Keyword arguments to pass to call.

Returns: See ‘step.call’.

@property
def sdk_cache_state(self):

Returns default values if not set and cache state file does not exist.

@property
def sdk_is_dirty(self):

Return whether the SDK is dirty.

def set_chrome_root(self, chrome_root):

Set chrome root with synced sources.

This is a helper function to set up a chrome root.

Args: chrome_root (Path): Directory with the Chrome source.

def set_goma_config(self, goma_dir, goma_approach, log_dir, stats_file, counterz_file):

Set the goma config.

Args: goma_dir (Path): Path to the goma install location. goma_approach (chromiumos.GomaConfig.GomaApproach): Goma Approach. log_dir (Path): Path to the log directory. stats_file (str): Name of the goma stats file, relative to log_dir. counterz_file (str): Name of the goma counterz file, relative to log_dir.

def set_remoteexec_config(self, reclient_dir, reproxy_cfg_file):

Set the remoteexec config.

def set_use_flags(self, use_flags):

def swarming_chmod_chroot(self):

Chroot is deployed as root; therefore, change permissions to allow for Swarming cache uninstall/install.

def unlink_chroot(self, checkout_path):

Unlink the chroot from the chromiumos checkout.

Args: checkout_path (Path): Path to the checkout root.

def update_chroot(self, build_source=False, toolchain_targets=None, timeout_sec=‘DEFAULT’, test_data=None, test_toolchain_cls=None, name=None, force_update=False):

Update the chroot.

Args: build_source (boolean): Whether to compile from source. Default: False. toolchain_targets (list[BuildTarget]): List of toolchain targets needed, or None. timeout_sec (int): Step timeout (in seconds), or None for no step timeout. Default: 24 hours if building from source or a toolchain change is detected, otherwise 3 hours. test_data (str): test response (JSON) from the SdkService.Update call, or None to use the default in cros_build_api/test_api.py. test_toolchain_cls (bool): Test answer for detect_toolchain_cls. name (string): Step name. Default: “update sdk”. force_update (bool): Pass force_update to the SdkService/Update call, causing update_chroot to be called.

recipe_modules / cros_snapshot

DEPS: cros_infra_config, git_footers, src_state, recipe_engine/context, recipe_engine/step

API for working with CrOS snapshot builds.

class CrosSnapshotApi(RecipeApi):

A module for steps that manipulate CrOS snapshot information.

def is_snapshot_build(self):

Return True if this build is a snapshot build running on a snapshot builder.

def snapshot_identifier(self, test_value: Optional[str]=None):

The snapshot identifier of the workspace checkout.

Args: test_value: test value of the snapshot identifier.

recipe_modules / cros_som

DEPS: recipe_engine/service_account, recipe_engine/time, recipe_engine/url

class CrosSomApi(RecipeApi):

A module for interacting with the ChromeOS Sheriff-o-Matic.

def get_annotation(self, step_name):

Return a SomAnnotation for step_name.

None if there is no annotation for the step.

def get_silence_reason(self, annotation):

Return the reason an annotation is silenced, or None if there is no silence.

Note that if an annotation is in a group that is silenced, it will also be considered silenced.

Args: annotation (SomAnnotation): The annotation to analyze.

Returns: A str, or None.

recipe_modules / cros_source

DEPS: bot_cost, cros_build_api, cros_infra_config, easy, gcloud, gerrit, git, git_footers, gitiles, overlayfs, repo, src_state, test_util, depot_tools/gitiles, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/cas, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

API for working with CrOS source.

class CrosSourceApi(RecipeApi):

A module for CrOS-specific source steps.

def apply_gerrit_changes(self, gerrit_changes, include_files=False, include_commit_info=False, ignore_missing_projects=False, test_output_data=None):

Apply GerritChanges to the workspace.

Args: gerrit_changes (list[GerritChange]): list of gerrit changes to apply. include_files (bool): whether to include information about changed files. include_commit_info (bool): whether to include info about the commit. ignore_missing_projects (bool): Whether to ignore projects that are not in the source tree. (For example, the builder uses the external manifest, but the CQ run includes private changes.) test_output_data (dict): Test output for gerrit-fetch-changes.

Returns: List[PatchSet]: A list of commits from cherry-picked patch sets.

def apply_patch_set(self, patch, project_path, is_abs_path=False):

Apply a PatchSet to the git repo in ${CWD}.

Args: patch (PatchSet): The PatchSet to apply. project_path (str): The path in which to apply the change. is_abs_path (bool): Whether the project path is an absolute path. The default is False meaning the project_path is relative to the workspace.

@property
def branch_manifest_file(self):

Returns the Path to the manifest_file for this build.

@property
def cache_path(self):

The cached checkout path.

This is the cached version of source (the internal manifest checkout), usually updated once at the beginning of a build and then mounted into the workspace path.

def checkout_branch(self, manifest_url, manifest_branch, projects=None, init_opts=None, sync_opts=None, step_name=None):

Check out a branch of the current manifest.

Note: If there are changes applied when this is called, repo will try to rebase them to the new branch.

Args:

  • manifest_url (str): The manifest url.
  • manifest_branch (str): The branch to check out, such as ‘release-R86-13421.B’
  • projects (List[str]): Projects to limit the sync to, or None to sync all projects.
  • init_opts (dict): Extra keyword arguments to pass to ‘repo.init’.
  • sync_opts (dict): Extra keyword arguments to pass to ‘repo.sync’.
  • step_name (str): Name for the step, or None for default.

def checkout_external_manifest(self, commit_id: str, force: bool=True):

Checkout the external manifest at the given commit.

Args: commit_id: The commit of the external manifest to checkout. force: If true, throw away any local changes.

def checkout_gerrit_change(self, change):

Check out a gerrit change using the gerrit refs/changes/... workflow.

Differs from apply_changes in that the change is directly checked out, not cherry-picked (so the patchset parent will be accurate). Used for things like tricium where line numbers matter.

Args: change (GerritChange): Change to check out. name (string): Step name. Default: “checkout gerrit change”.

def checkout_manifests(self, commit=None, is_staging=False, checkout_internal=True, checkout_external=False):

Check out the manifest projects.

Syncs the manifest projects into the workspace, at the appropriate revision. This is intended for builders that only need the manifest projects, not for builders that have other projects checked out as well.

If |commit| is on an unpinned branch, there is no reasonable way to discern which revision of the external manifest is correct. The branch's copy of the external manifest is unbranched. As such, the return will have an empty commit id, and the external manifest source tree may be dirty (mirrored manifest files will be copied from the internal manifest, but not committed.)

Args: commit (GitilesCommit): The commit to use, or None for the default (from cros_infra_config.configure_builder). is_staging (bool): Whether this is staging. checkout_internal (bool): Whether to checkout the internal manifest. Defaults to true. checkout_external (bool): Whether to checkout the external manifest. Defaults to false.

Returns: (GitilesCommit) The GitilesCommit to use for the external manifest.

@contextlib.contextmanager
def checkout_overlays_context(self, mount_cache=True, disk_type=‘pd-ssd’):

Returns a context where overlays can be mounted.

Args: mount_cache (bool): Whether to mount the chromiumos cache. Default: True. disk_type (str): GCE disk type to use. Default: pd-ssd

def checkout_tip_of_tree(self):

Check out the tip-of-tree in the workspace.

def configure_builder(self, commit: Optional[bb_common_pb2.GitilesCommit]=None, changes: Optional[List[bb_common_pb2.GerritChange]]=None, default_main: bool=False, name: str=‘configure builder’, lookup_config_with_bucket=False):

Configure the builder.

Fetch the builder config. Determine the actual commit and changes to use. Set the bisect_builder and use_flags.

Args: commit: The gitiles commit to use. Default: GitilesCommit(.... ref=‘refs/heads/snapshot’). changes: The gerrit changes to apply. Default: the gerrit_changes from buildbucket. default_main: Whether the default branch should be ‘main’. Default: use the appropriate snapshot branch. name: Step name. lookup_config_with_bucket: If true, include builder.bucket in key when looking up the BuilderConfig. If the bucket is not included in the key and there are builders with the same name (in different buckets), it is undefined which BuilderConfig is returned. The bucket will eventually be included in the key by default, see b/287633203.

Returns: BuilderConfig for the active build, or None if the active build does not have a BuilderConfig.

def ensure_synced_cache(self, manifest_url: Optional[str]=None, init_opts: Optional[Dict[(str, Any)]]=None, sync_opts: Optional[Dict[(str, Any)]]=None, cache_path_override: Optional[Path]=None, is_staging: bool=False, projects: Optional[List[str]]=None, gitiles_commit: Optional[bb_common_pb2.GitilesCommit]=None, manifest_branch_override: Optional[str]=None):

Ensure the configured repo cache exists and is synced.

Args: manifest_url: Manifest URL for ‘repo.init’. init_opts: Extra keyword arguments to pass to ‘repo.init’. sync_opts: Extra keyword arguments to pass to ‘repo.sync’. cache_path_override: Path to sync into. If None, the cache_path property is used. is_staging: Flag to indicate canary staging environment. projects: Projects to limit the sync to, or None to sync all projects. gitiles_commit: The gitiles_commit, or None to use the current value. manifest_branch_override: If provided, override the manifest_branch value in init_opts. Otherwise, use the value returned from configure_builder().

def fetch_snapshot_shas(self, count: int=((7 * 24) * 2), snapshot: Optional[bb_common_pb2.GitilesCommit]=None):

Return snapshot SHAs for the manifest.

Return SHAs for the most recent |count| commits in the manifest. The default is to fetch 7 days worth of snapshots, based on (an assumed) 2 snapshots per hour.

Args: count: How many SHAs to return. snapshot: The latest snapshot to fetch, or None.

Returns: The list of snapshot SHAs.
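The default count is just that arithmetic spelled out:

    DEFAULT_COUNT = 7 * 24 * 2  # 7 days x 24 hours/day x 2 snapshots/hour = 336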

def find_project_paths(self, project, branch, empty_ok=False):

Find the source paths for a given project in the workspace.

Will only include multiple results if the same project and branch are mapped more than once in the manifest.

Args: project (str): The project name to find a source path for. branch (str): The branch name to find a source path for. empty_ok (bool): If no paths are found, return an empty list rather than raising StepFailure.

Returns: list(str), The path values for the found project.

def get_external_snapshot_commit(self, internal_manifest_path: Path, snapshot_commit_id: str):

Return the Cr-External-Snapshot for the given internal snapshot commit.

The internal snapshot commit contains a footer ‘Cr-External-Snapshot’ which contains the corresponding snapshot commit in the external manifest.

Note: This function assumes the internal manifest is synced.

Args: internal_manifest_path: The path where the internal manifest is checked out. snapshot_commit_id: The internal snapshot commit id for which to return the external snapshot commit counterpart.

Raises: StepFailure if there is not exactly one Cr-External-Snapshot footer.

Returns: The corresponding external manifest snapshot commit.

def initialize(self):

Initialization that follows all module loading.

@property
def is_source_dirty(self):

Returns whether the source is dirty.

The source is dirty if it was checked out to a custom snapshot from isolate, has had patches applied, or has been moved to a branch.

@property
def is_tot(self):

Return whether or not the builder is on ToT.

@property
def manifest_branch(self):

Returns any non-default manifest branch that is checked out.

@property
def manifest_push(self):

Returns the manifest branch to push changes to.

@property
def mirrored_manifest_files(self):

Returns the names of files that are mirrored into the public manifest.

The files returned are owned by chromeos/manifest-internal, and are copied into chromiumos/manifest when they are changed.

Annealing does this as part of creating the snapshot, and the various builders do it when applying manifest changes.

Returns: (list[MirroredManifestFile]) with files we mirror.

@property
def pinned_manifest(self):

Return the pinned manifest for this build.

def push_uprev(self, uprev_response, dry_run, commit_only=False, is_staging=False, discard_unpushed_changes=False):

Commit and push any uprevved packages to their remotes.

Args: uprev_response (list[PushUprevRequest]): Named tuple containing the modified ebuild and associated message subject. dry_run (bool): Dry run git push or not. commit_only (bool): Whether to skip the push step. is_staging (bool): Whether the builder is a staging builder. discard_unpushed_changes (bool): Whether to discard unpushed commits when commit_only is True, necessary for release builders where we need buildspecs to contain valid commits.

Return: all_uprevs_passed (bool): True if all uprevs succeeded, False if ANY failed.

def related_changes_to_apply(self, gerrit_changes: List[bb_common_pb2.GerritChange], all_related_changes: OrderedDict_type[(str, Dict[(str, Any)])]):

Based on what is already included, figure out which related changes are implicitly depended on by gerrit_changes.

Note that only related changes that precede the changes in gerrit_changes will be applied. Related changes that follow a change in gerrit_changes will not be applied unless explicitly included.

Args: gerrit_changes: Changes to apply for the builder. all_related_changes: All related changes for all gerrit_changes. May include duplicates.

Returns: De-duplicated changes that are implicitly depended on by gerrit_changes in a relation chain, but not already included in gerrit_changes.
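A rough sketch of that selection rule: for each explicitly requested change, every earlier change in its relation chain is an implicit dependency unless it was itself requested (data shapes are simplified, and the chain is assumed oldest-first; Gerrit itself returns related changes newest-first):

    def implicit_dependencies(requested_ids, relation_chain):
        # requested_ids: set of change ids explicitly included in the build.
        # relation_chain: change ids ordered oldest-first.
        deps, seen = [], set()
        for idx, change_id in enumerate(relation_chain):
            if change_id in requested_ids:
                # Everything earlier in the chain precedes this requested
                # change, so it is implicitly depended on.
                for dep in relation_chain[:idx]:
                    if dep not in requested_ids and dep not in seen:
                        seen.add(dep)
                        deps.append(dep)
        return deps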

@property
def snapshot_cas_digest(self):

Returns the snapshot digest in use or None.

def sync_checkout(self, commit=None, manifest_url=None, **kwargs):

Sync a checkout to the appropriate manifest.

If the module properties contain the sync_to_manifest field, that will be used. Otherwise the given commit/manifest_url will be used.

Args: commit (GitilesCommit): The gitiles_commit to sync to. Default: commit saved in cros_infra_config.configure_builder(). manifest_url: URL of manifest repo. Default: internal manifest

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=1), condition=retry_timeouts)
def sync_to_gitiles_commit(self, gitiles_commit, manifest_url=None, **kwargs):

Sync a checkout to the specified gitiles commit.

Will first attempt to sync to snapshot.xml, then default.xml.

Args: gitiles_commit (GitilesCommit): commit to sync to. manifest_url: URL of manifest repo. Default: internal manifest. kwargs (dict): additional args for repo.sync_manifest.

@property
def sync_to_manifest(self):

Returns the manifest being synced to as specified in properties, or None.

Uses the sync_to_manifest property.

Returns: ManifestLocation, or None.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=1), condition=retry_timeouts)
def sync_to_pinned_manifest(self, manifest_url='', manifest_branch='', manifest_path='', manifest_gs_path='', **kwargs):

Sync a checkout to the specified [pinned] manifest.

The manifest will be downloaded directly from the source using gitiles.

Args: manifest_url (string): URL of the project the manifest is in, e.g. https://chrome-internal.googlesource.com/chromeos/manifest-versions manifest_branch (string): Branch of repository to get manifest from, e.g. ‘main’. manifest_path (string): Path (relative to repository root) of manifest file, e.g. buildspecs/91/13818.0.0.xml. manifest_gs_path (string): GS Path of manifest, e.g. gs://chromeos-manifest-versions/release/91/13818.0.0.xml. Takes precedence over manifest_url/branch/path.

def uprev_packages(self, workspace_path=None, build_targets=None, timeout_sec=(10 * 60), name=‘uprev packages’):

Uprev packages.

Args: workspace_path (Path): Path to the workspace checkout. build_targets (list[BuildTarget]): List of build_targets whose packages should be uprevved, or None for all build_targets. timeout_sec (int): Step timeout (in seconds). Default: 10 minutes. name (string): Name for step.

Returns: UprevPackagesResponse

@property
def use_external_source_cache(self):

Returns whether the builder is configured to use the external cache.

@property
def workspace_path(self):

The “workspace” checkout path.

This is where the build is processed. It will contain the target base checkout and any modifications made by the build.

recipe_modules / cros_storage

DEPS: depot_tools/gsutil, recipe_engine/raw_io, recipe_engine/step

API featuring shared helpers for locating and naming stored artifacts.

Much of the inspiration for this module came from: chromite/lib/paygen/gspaths.py

As long as there are two versions of the path construction, any changes to one of them need to be reflected in the other.

class CrosStorageApi(RecipeApi):

Apis for working with stored images, payloads, and artifacts.

def discover_gs_artifacts(self, prefix_uri, parse_types=None):

Discover and return all the GS artifacts found in a given ArtifactRoot.

We assume that each uri will match at most a single ParserOption and we greedily take the first one. GS exceptions are represented as an empty return list.

Args: prefix_uri (str): The gs path prefix recursively crawled. parse_types list(parse_uri()): A list of uri parser fn()s to consider.

Returns: list[artifact_type]: list of artifacts found in the prefix.

recipe_modules / cros_tags

DEPS: recipe_engine/buildbucket, recipe_engine/cv

API for generating tags.

class CrosTagsApi(RecipeApi):

A module for generating tags.

def add_tags_to_current_build(self, **tags):

Adds arbitrary tags during the runtime of a build.

Args: **tags (dict): Dict mapping keys to values. If the value is a list, multiple tags for the same key will be created.

@property
def cq_cl_group_key(self):

Return the cq_cl_group_key, if any.

Returns: (str) cq_cl_group_key, or None

def cq_cl_tag_value(self, cl_tag_key, tags):

Returns the value for the given cq_cl_tag, if it is found.

@property
def cq_equivalent_cl_group_key(self):

Return the cq_equivalent_cl_group_key, if any.

Returns: (str) cq_equivalent_cl_group_key, or None

def get_single_value(self, key, tags=None, default=None):

Return a single value from a list of tags.

If the key has more than one value, only the first value will be returned.

Args: key (str): The key to look up values for. tags ([StringPair]): A list of tags in which to look up values. (defaults to tags for current build) default (str): A default value to return if no values found.

Returns: str|None, the first value found for the key among the tags.

def get_values(self, key, tags=None, default=None):

Return a value from a list of tags.

Since tags are able to have multiple values for the same key, the return value is always a list, even for a single item.

Args: key (str): The key to look up values for. tags ([StringPair]): A list of tags in which to look up values. (defaults to tags for current build) default (str): A default value to return if no values found.

Returns: List of tag values, or [default] if none found.

def has_entry(self, key, value, tags):

Returns whether tags contains a tag with key and value.

def make_schedule_tags(self, snapshot, inherit_buildsets=True):

Returns the tags typically added to scheduled child builders.

Args: snapshot (GitilesCommit): snapshot the build was synced on inherit_buildsets (bool): whether to include non-gitiles_commit buildsets.

Returns: list[StringPair] to pass as buildbucket tags

def tags(self, **tags):

Helper for generating a list of StringPair messages.

Args: tags (dict): Dict mapping keys to values. If the value is a list, multiple tags for the same key will be created.

Returns: (list[StringPair]) tags.
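For example, a list value expands into one tag per element (expected output shown as a comment; key names are illustrative):

    pairs = api.cros_tags.tags(label='postsubmit', build_target=['eve', 'kevin'])
    # -> [StringPair(key='label', value='postsubmit'),
    #     StringPair(key='build_target', value='eve'),
    #     StringPair(key='build_target', value='kevin')]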

recipe_modules / cros_test_plan

DEPS: cros_infra_config, cros_source, easy, gobin, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step

Functions for end-to-end test planning.

class CrosTestPlanApi(RecipeApi):

A module for generating and parsing test plans.

def generate(self, builds, gerrit_changes, manifest_commit, name=None):

Generate test plan.

Args:

  • name (str): The step name.
  • builds (list[build_pb2.Build]): builds to test.
  • gerrit_changes (list[common_pb2.GerritChange]): changes that were inputs for these builds, or empty.
  • manifest_commit (common_pb2.GitilesCommit): manifest commit for build.

Returns: GenerateTestPlanResponse of test plan.

def generate_target_test_requirements_config(self, builders=None):

Generate target test requirements config in config-internal using ./board_config/generate_test_config. Assumes config-internal is checked out at src_state.workspace_path/CONFIG_INTERNAL_CHECKOUT.

Args: builders (list[str]): optional list of builder names to generate config for, e.g. coral-release-main or staging-kevin-release-main. If not specified, either the invoking builder or its children (if the invoking builder name contains ‘orchestrator’) will be used.

Returns: JSON structure of target test requirements or None.

def get_test_plan_summary(self, test_plan):

Return a mapping of display name to criticality.

Args: test_plan (GenerateTestPlanResponse): The test plan to summarize.

Returns: test_to_crit_map (dict{string: bool}): Map of test display name to criticality.

recipe_modules / cros_test_plan_v2

DEPS: cros_infra_config, cros_test_plan, easy, gerrit, gobin, src_state, depot_tools/gitiles, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Functions for end-to-end test planning.

class CrosTestPlanV2Api(RecipeApi):

A module for generating and parsing test plans for CTP v2.

def dirmd_update(self, table: str):

Call test_plan chromeos-dirmd-update.

Args:

  • table: BigQuery table to upload to, in the form ... Required. The table will be created if it doesn't already exist, and the schema will be updated if it doesn't match the DirBQRow schema.

def enabled_on_changes(self, gerrit_changes):

Returns true if test planning v2 is enabled on gerrit_changes.

Config controlling what changes are enabled is in the ProjectMigrationConfig of this module's properties.

@property
def generate_ctpv1_format(self):

def generate_hw_test_plans(self, starlark_packages: List[StarlarkPackage], generate_test_plan_request: Optional[GenerateTestPlanRequest]=None):

Runs the test_plan Go infra binary to get HWTestPlans.

Args:

  • starlark_packages (list[StarlarkPackage]): Paths to Starlark files to evaluate to get HWTestPlans. Note that StarlarkPackages must be used instead of single files because the Starlark files can import each other. If there are duplicate StarlarkPackages (same root and main file) each unique package will only be added once.
  • generate_test_plan_request (GenerateTestPlanRequest): A GenerateTestPlanRequest for calling testplan with CTPV1 compatibility.

Returns: A list of generated HWTestPlans or GenerateTestPlanResponse if generate_ctpv1_format is true.

def get_testable_builders(self, starlark_packages: List[StarlarkPackage], builds: List[Build]):

Runs the test_plan Go infra binary to get a list of testable builders.

Args: starlark_packages: Paths to Starlark files to evaluate to get testable builders. Note that StarlarkPackages must be used instead of single files because the Starlark files can import each other. If there are duplicate StarlarkPackages (same root and main file) each unique package will only be added once. builds: The list of builds considered for this CQ run.

Returns: A list of the names of the testable builders.

def is_bazel_builder(self, builder_name: str):

Returns whether builder_name is a Bazel builder.

Bazel builders are filtered out of testing right now; this is a simple filter that just works on the name. In the long term, Bazel builders will need to be differentiated from Portage builders in test planning.

def relevant_plans(self, gerrit_changes):

Call test_plan relevant-plans.

Args:

  • gerrit_changes (list[common_pb2.GerritChange]): Changes to test, must be non-empty.

Returns: A list of relevant SourceTestPlans

def validate(self, directory: str):

Call test_plan validate on directory.

Raises a StepFailure if validation fails, otherwise returns None.

Args: directory: Path to a directory to validate. Note that this should be a directory, not a DIR_METADATA file. Any DIR_METADATA files in a subdirectory of directory will also be validated.

recipe_modules / cros_test_platform

DEPS: cros_infra_config, easy, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

Module for interacting with cros_test_platform.

class CrosTestPlatformCommand(RecipeApi):

Module for issuing cros_test_platform commands

def cipd_package_version(self):

Return the CTP CIPD package version (e.g. prod/staging/latest).

def enumerate(self, request):

Enumerate test cases via enumerate subcommand.

Args: request: an EnumerationRequest.

Returns: EnumerationResponse.

def execute_luciexe(self, properties, request):

Execute work via the luciexe binary for cros_test_platform.

crbug.com/1112514: This is an alternative binary target for cros_test_platform which will eventually replace all the subcommands of the cros_test_platform binary.

Args: request: an ExecuteRequests. properties: CrosTestPlatformProperties.

Returns: ExecuteResponses, Dict[string][string].

def skylab_execute(self, request):

Execute work via skylab-execute subcommand.

Args: request: an ExecuteRequest.

Returns: ExecuteResponse.

recipe_modules / cros_test_postprocess

class CrosTestPostProcessApi(RecipeApi):

Data structures used by the cros_test_postprocess recipe.

def downloaded_test_result(self, gs_path, local_path):

Create an object of DownloadedTestResult.

The object created is passed to each post process api to consume.

Args: gs_path: A string of GS path of test result to download. local_path: A Path object to save downloaded test result locally.

Returns: A named tuple of (gs_path, local_path).

recipe_modules / cros_test_proctor

DEPS: cq_fault_attribution, cros_cq_additional_tests, cros_history, cros_infra_config, cros_resultdb, cros_tags, cros_test_plan, cros_test_plan_v2, easy, exonerate, future_utils, git, git_footers, greenness, naming, skylab, skylab_results, src_state, test_failures, urls, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/path, recipe_engine/step, recipe_engine/swarming

Functions for sending requests and processing results from cros test platform.

class CrosTestProctorApi(RecipeApi):

@property
def builders_tested_in_this_run(self):

def get_test_failures(self, test_results):

Logs all test failures to the UI and raises on failed tests.

Args: test_results: MetaTestTuple of the tests on the changes.

Returns: list[Failure]: All failures discovered in the given run.

def get_testable_builders(self, gerrit_changes: List[GerritChange], builds: List[Build]):

Returns the names of the builders whose images may be tested in this run.

Uses the builds being considered by this CQ run and the relevant test plans based on the Gerrit Changes applied in order to determine which builders produce images that could be tested in this run.

Args: gerrit_changes: Changes being tested in this CQ run. builds: The list of builds considered for this CQ run.

Returns: The names of the builders whose images may be tested in this CQ run.

def run_proctor(self, need_tests_builds, snapshot, gerrit_changes, enable_history, run_async=False, container_metadata=None, require_stable_devices=False, use_test_plan_v2=False, build_target_critical_allowlist=None):

Runs the test platform for a given bunch of builds.

This is the entry point into the CrOS infra test platform via recipes.

Args: need_tests_builds (list[Build]): builds that are eligible for testing, i.e. ones that didn't suffer build failures. snapshot (common_pb2.GitilesCommit): the manifest snapshot at the time the included builds were created. gerrit_changes (list[common_pb2.GerritChange]): the changes that resulted in the provided builds, or None. enable_history (bool): whether to prune test history for previously successful tests on images with the same build inputs. run_async (bool): whether to stop and collect; if set we return no failures (an empty list). container_metadata (ContainerMetadata): Information on container images used for test execution. require_stable_devices (bool): whether to only run on devices with label-device-stable: True. use_test_plan_v2 (bool): whether to use the v2 testplan tool in cros test platform v1 compatibility mode. The v2 testplan tool will return GenerateTestPlanResponse protos, so it is interchangeable with the v1 testplan tool. build_target_critical_allowlist: If set (including empty list), only the build targets specified can have tests run as critical. If None, criticality will not be modified for any build targets.

Returns: list[failures.Failure]: failures encountered running tests.

def schedule_tests(self, test_plan, passed_tests, previously_failed_now_exonerable_hw_suites, timeout, is_retry=False, run_async=False, container_metadata=None, require_stable_devices=False, build_target_critical_allowlist=None):

Schedule all tests from the test_plan.

Args: test_plan (GenerateTestPlanResponse): A plan for all tests to be scheduled. passed_tests (list[string]): A list of names for the tests that have passed before. previously_failed_now_exonerable_hw_suites (list[string]): Previously failed tests that are now eligible for exoneration. timeout (Duration): Timeout in duration_pb2.Duration. is_retry (bool): Whether this is a CQ retry. run_async (bool): whether to stop and collect; if set we return no failures (an empty list). container_metadata (ContainerMetadata): Information on container images used for test execution. require_stable_devices (bool): whether to only run on devices with label-device-stable: True. build_target_critical_allowlist: If set (including empty list), only the build targets specified can have tests run as critical. If None, criticality will not be modified for any build targets.

Returns: MetaTestTuple of lists of the tests scheduled.

@test_summary.setter
def test_summary(self, test_summary):

Set the test_summary for this build.

Args: test_summary (list[map{string: string}]): The test_summary for this build.

recipe_modules / cros_test_runner

DEPS: recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/step

Module for interacting with cros_test_runner.

class CrosTestRunnerCommand(RecipeApi):

Module for issuing cros_test_runner commands

def cipd_package_label(self):

Return the CTP CIPD package version (e.g. prod/staging/latest).

def ensure_cros_test_runner(self):

Ensure the cros_test_runner CLI is installed.

def execute_luciexe(self):

Execute work via cros_test_runner luciexe binary.

def is_dynamic(self):

Checks if cros_test_runner contains the dynamic TRv2 request.

Returns: bool

def is_enabled(self):

Checks if cros_test_runner is enabled for use.

Returns: bool

recipe_modules / cros_test_sharding

DEPS: recipe_engine/raw_io, recipe_engine/step

cros_test_sharding recipe module

Provides optimization algorithm for distributing tests among shards

class CrosTestShardingAPI(RecipeApi):

@staticmethod
def bucket_by_dependencies(test_cases, suite_name):

Creates a list of buckets grouping test_cases by their dependencies.

Args:

  • test_cases: List[test_case].
  • suite_name: string.

Returns: List[List[test_case]].
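A plausible sketch of the grouping, keyed on each test case's (order-insensitive) dependency set; the dependency field name is an assumption:

    from collections import defaultdict

    def bucket_by_dependencies(test_cases, suite_name):
        # Group test cases that share an identical dependency set, so every
        # bucket can be scheduled onto compatible hardware. suite_name is
        # presumably used for step naming and is unused in this sketch.
        buckets = defaultdict(list)
        for tc in test_cases:
            key = frozenset(getattr(tc, 'dependencies', ()))  # assumed field
            buckets[key].append(tc)
        return list(buckets.values())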

def optimized_shard_allocation(self, test_suite, suite_name, board, total_shards):

def optimized_shard_allocation_deps(self, test_buckets, suite_name, board, max_number_of_shards):

recipe_modules / cros_tool_runner

DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

API for cros_tool_runner interface.

class CrosToolRunnerCommand(RecipeApi):

Module for issuing CrosToolRunner commands

def create_file_with_container_metadata(self, container_metadata):

Create a temp file with provided container metadata.

Args: container_metadata: (ContainerMetadata) container metadata.

def ensure_cros_tool_runner(self):

Ensure the CrosToolRunner CLI is installed.

def find_tests(self, request):

Find tests via test-finder subcommand.

Args: request: a CrosToolRunnerTestFinderRequest.

def post_process(self, request):

Run post process via post_process subcommand.

Args: request: a CrosToolRunnerPostTestRequest.

def pre_process(self, request):

Pre process commands via pre-process subcommand.

Args: request: a CrosToolRunnerPreTestRequest.

def provision(self, request):

Run provision via provision subcommand.

Args: request: a CrosToolRunnerProvisionRequest.

def read_dut_hostname(self):

"Return the DUT hostname.

def test(self, request):

Run test(s) via test subcommand.

Args: request: a CrosToolRunnerTestRequest.

def upload_to_tko(self, autotest_dir, results_dir):

Upload test results to TKO via tko-parse. This command does not call into CTR; it directly invokes tko-parse in autotest. To have parity with the phosphorus package, the implementation lives here so that any recipe consuming this module can benefit from it.

Args: autotest_dir (str): path to autotest package. results_dir (str): path to test results to upload.

recipe_modules / cros_try

DEPS: recipe_engine/buildbucket, recipe_engine/step

API for working with cros try-initiated jobs.

class CrosTryApi(RecipeApi):

A module for checking cros try builds.

def check_try_version(self):

Checks that this specific cros try invocation is supported.

def get_invoker(self):

Get the email of the tryjob invoker, if any.

recipe_modules / cros_version

DEPS: cros_source, easy, gerrit, git, git_footers, gobin, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cv, recipe_engine/file, recipe_engine/step, recipe_engine/time

API for working with CrOS version numbers.

class CrosVersionApi(RecipeApi):

A module for steps that manipulate CrOS versions.

def bump_version(self, dry_run=True, use_local_diff=False):

Bumps the chromeos version (as represented in chromeos_version.sh) and pushes the change to the chromiumos-overlay repo.

Which component is bumped depends on the branch the invoking recipe is running for (main/tot --> build, release-* --> branch).

Args: dry_run (bool): Whether the git push is --dry-run. use_local_diff (bool): If true, use the local diff instead of diff taken against tip-of-branch for the version bump CL.

def read_workspace_version(self, name=‘read chromeos version’):

Read the CrOS version from the workspace.

Returns: a Version read from the workspace.

Args: name (str): The name to use for the step.

Raises: ValueError: if the version file had unexpected formatting.

@property
def version(self):

The Version of the workspace checkout.

recipe_modules / ctpv2

DEPS: recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/step

API to call into the CTPv2 binary

class Ctpv2Command(RecipeApi):

Module for issuing ctpv2 commands

def cipd_package_label(self):

Return the CTPv2 CIPD package version (e.g. prod/staging/latest).

def ensure_ctpv2(self):

Ensure the ctpv2 CLI is installed.

def execute_luciexe(self, use_legacy=False, runningAsync=False):

Execute work via ctpv2 luciexe binary.

def filter_legacy_requests(self, requests, reverse=False):

Filter out the legacy requests based on allowed pools.

Args:

  • requests: Dict of legacy v1 requests.
  • reverse: boolean to flip the filter result.

Returns dict of filtered legacy v1 requests.

def get_val_from_obj_or_dict(self, obj_or_dict, field, key=None):

Retrieves the value from the obj/dict using the field/key.

This is needed because filter_legacy_requests is called with both a proto object and a proto dict.

def is_enabled(self):

Checks if ctpv2 is enabled for use.

Returns: bool

def set_allowed_pools(self, allowed_pools):

Set the allowed CTPv2 pools.

recipe_modules / cts_results_archive

DEPS: cros_tags, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/step

API to archive test results to CTS specific buckets

class CTSResultsArchive(RecipeApi):

API to archive test results to CTS specific buckets

def archive(self, d_dir):

Archive CTS result files to CTS specific GS buckets.

This module determines if any CTS result files should be uploaded to the CTS GS buckets and archives them if required.

Args: d_dir: The results directory to process.

recipe_modules / debug_symbols

DEPS: cros_infra_config, failures, gobin, recipe_engine/raw_io, recipe_engine/step

Module for working with debug symbols.

class DebugSymbols(RecipeApi):

Module for working with debug symbols.

def upload_debug_symbols(self, gs_path=None):

Upload debug symbols to the crash service.

recipe_modules / deferrals

DEPS: recipe_engine/step

API for deferring things (mainly failures).

Usage:

context mode:

    with api.deferrals.raise_exceptions_at_end():
      with api.deferrals.defer_exceptions():
        do_a_thing_that_raises_an_exception()
      do_another_thing_that_should_happen_regardless()
    # exception raised at this point

explicit mode:

    api.deferrals.defer_exception(StepFailure('error'))
    do_another_thing_that_should_happen_regardless()
    api.deferrals.raise_exceptions()  # <- this raises the exception

combo mode:

    with api.deferrals.defer_exceptions():
      do_a_thing_that_raises_an_exception()
    do_another_thing_that_should_happen_regardless()
    api.deferrals.raise_exceptions()  # <- this raises the exception

specific exception mode:

    with api.deferrals.raise_exceptions_at_end():
      with api.deferrals.defer_exceptions(exception_types=[StepFailure]):
        raise StepFailure('error')  # <- deferred
      with api.deferrals.defer_exceptions(exception_types=[StepFailure]):
        raise InfraFailure('error')  # <- not deferred and raises immediately

class DeferralsApi(RecipeApi):

A module for deferring actions, such as raising Exceptions.

def are_exceptions_pending(self):

Returns whether any exceptions would be raised by raise_exceptions.

def defer_exception(self, exception):

Takes an exception and defers it until raised (see usage above).

Should be used in conjunction with either deferrals.raise_exceptions_at_end or deferrals.raise_exceptions; otherwise this will essentially just swallow exceptions (and if that's your intent, we prefer api.failures.ignore_exceptions).

Args: exception (Exception): the exception to raise later.

@contextlib.contextmanager
def defer_exceptions(self, exception_types=None):

Catches exceptions and defers them until raised (see usage above).

Should be used in conjunction with either deferrals.raise_exceptions_at_end or deferrals.raise_exceptions; otherwise this will essentially just swallow exceptions (and if that's your intent, we prefer api.failures.ignore_exceptions).

Note: this can only catch a single exception, so if the intent is for exceptions to be swallowed by a long block of code, be aware that every step that can throw should be wrapped.

For more on this see examples/defer_exceptions_block_incorrect.py.

Args: exception_types (Optional[List[Type]]): types of exceptions to defer (allowing all others through).

def raise_exceptions(self, prefer_first_type: bool=False):

Explicitly raise any deferred exceptions.

This is the non-context manager approach to using this module. Simply call this method at the point where you want deferred exceptions to be raised.

If multiple exceptions were deferred, the class of the raised exception depends on the value of prefer_first_type and the order in which the exceptions were deferred:

  • An InfraFailure will be raised if:
    • prefer_first_type is False and any deferred exception was an InfraFailure, or
    • prefer_first_type is True and the first deferred exception was an InfraFailure.
  • Otherwise, a StepFailure will be raised.
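As an illustrative sketch of that rule (hypothetical recipe; StepFailure and InfraFailure come from recipe_engine.recipe_api):

from recipe_engine.recipe_api import InfraFailure, StepFailure

DEPS = ['deferrals']

def RunSteps(api):
  api.deferrals.defer_exception(StepFailure('step broke'))
  api.deferrals.defer_exception(InfraFailure('infra broke'))
  # prefer_first_type=True: the first deferred exception was a StepFailure,
  # so a StepFailure is raised here despite the deferred InfraFailure.
  api.deferrals.raise_exceptions(prefer_first_type=True)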

@contextlib.contextmanager
def raise_exceptions_at_end(self, prefer_first_type: bool=False):

Sets up a context manager to raise deferred failures at the end.

Note: while using this context manager, if an exception is thrown that is not caught by the defer_exceptions call above, that exception will take precedence over the deferred one. However, the deferred one will still be logged.

recipe_modules / dirmd

DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step

Functions for using the dirmd tool.

class DirmdApi(RecipeApi):

A module for using the dirmd tool.

def validate_dir(self, directory: str):

Find and validate all DIR_METADATA files in a directory.

Raises a StepFailure if validation fails, otherwise returns None.

Args: directory: Path to a directory to validate. Note that this should be a directory, not a DIR_METADATA file. Any DIR_METADATA files in a subdirectory of directory will also be validated.

recipe_modules / disk_usage

DEPS: recipe_engine/step

class DiskUsageApi(RecipeApi):

A module to process the tast-results/ directory.

def track(self, step_name=None, depth=0, timeout=(10 * 60), d=None):

Print out the disk usage under the current directory.

Args: step_name (str): Name of the step. depth (int): The depth to traverse within the subdirs. timeout (int): Timeout in seconds. d (str): Absolute dir path to start from. If empty, use cwd.

@contextlib.contextmanager
def tracking_context(self):

A context wrapper for track().

recipe_modules / dlc_utils

DEPS: cros_build_api, future_utils, gcloud, depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

class DlcUtilsApi(RecipeApi):

A module to handle special operations around DLCs.

@artifacts_local_path.setter
def artifacts_local_path(self, artifacts_local_path: str):

Set the local path where artifacts are placed by BAPI.

def copy_prebuilt_dlcs(self, bucket: str, sysroot: ArtifactsByService.Sysroot, chroot: common_pb2.Chroot, is_staging: bool):

Retrieves the list of prebuilt DLCs and copies them to the bucket.

Args: bucket: GS bucket to copy into. sysroot: The sysroot to use. chroot: The chroot to use. is_staging: Whether this is running in the staging environment.

Returns: Dict mapping DLC locations to file hashes.

def get_dlc_artifacts(self, gs_path: str):

Retrieves DLC artifact locations and corresponding file hashes.

Args: gs_path: GS path used to report uploaded artifacts.

Returns: Dict mapping DLC locations to file hashes.

def get_dlcs_in_path(self, path: str, use_local_path: Optional[bool]=False):

Retrieves a list of DLCs in the provided local or GS path.

Args: path: Location to search for DLCs. use_local_path: Whether the path is local (vs. GS).

Returns: List of fully qualified paths of DLCs within the path.

recipe_modules / dut_interface

class DUTInterface(RecipeApi):

def create(self, api, properties):

Factory constructor for interfaces.

Args:

  • api (RecipeScriptApi): Ubiquitous recipe API
  • properties (TestRunnerProperties): Input recipe properties.

Returns: DUTInterface

recipe_modules / easy

DEPS: cros_tags, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/raw_io, recipe_engine/step

APIs for easy steps.

class EasyApi(RecipeApi):

A module for easy steps.

def log_parent_step(self, log_if_no_parent: bool=True):

Creates a short step to log the current builder's parent build ID.

Args: log_if_no_parent: If True and there is no parent build, create an empty step stating that there's no parent build. If False and there is no parent build, do nothing.

def set_properties_step(self, step_name: Optional[str]=None, **kwargs):

An empty step to set properties in output.properties.

Args: step_name: The name of the step. kwargs: Keyword arguments to set as properties, key is property name and value is property value. Key must be a string, value may be str, int, float, list, or dict. If value is bytes, it will be cast to string.

Returns: See ‘step.call’.
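For example, a minimal sketch (property names are hypothetical):

DEPS = ['easy']

def RunSteps(api):
  # Creates an empty step whose only effect is setting output properties.
  api.easy.set_properties_step(board='eve', artifact_count=3)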

def stdout_json_step(self, name: str, cmd: List[str], step_test_data: Optional[Any]=None, test_stdout: Optional[Union[(str, Any)]]=None, ignore_exceptions: bool=False, add_json_log: bool=True, **kwargs):

Runs an easy.step and returns stdout data deserialized from JSON.

Args:

  • name: The name of the step.
  • cmd: The command to run.
  • step_test_data: Should be ‘callable’, See ‘step.call’.
  • test_stdout: Data to return in tests.
  • ignore_exceptions: Whether to ignore any exceptions.
  • add_json_log: Log the content of the output json.
  • kwargs: Keyword arguments to pass to the ‘step’ call.

Returns: dict|list: JSON-deserialized stdout data.
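A minimal usage sketch (the command and its output are hypothetical):

DEPS = ['easy']

def RunSteps(api):
  # Runs the command and JSON-deserializes its stdout; in tests,
  # test_stdout is parsed instead of running anything.
  info = api.easy.stdout_json_step(
      'describe build',
      ['my-tool', 'describe', '--json'],
      test_stdout='{"version": 42}',
  )
  version = info['version']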

def stdout_jsonpb_step(self, name: str, cmd: List[str], message_type: type, test_output: Optional[Any]=None, parse_before_str: str='', **kwargs):

Runs an easy.step and returns stdout jsonpb-deserialized proto data.

Args:

  • name (str): The name of the step.
  • cmd (list[str]): The command to run.
  • message_type: A type (and also constructor) of proto message, indicating the type of proto to be returned.
  • test_output: Data, of type(message_type), to return in tests.
  • parse_before_str: Parse value only up to this string. Used to skip stray binary data appended after the proto.
  • kwargs: Keyword arguments to pass to the ‘step’ call.

Returns: message_type: JSON-pb deserialized proto message.

def stdout_step(self, name: str, cmd: List[str], step_test_data: Optional[Any]=None, test_stdout: Optional[Union[(str, Any)]]=None, **kwargs):

Runs an easy.step and returns stdout data.

Args:

  • name: The name of the step.
  • cmd: The command to run.
  • step_test_data: Should be ‘callable’, See ‘step.call’.
  • test_stdout: Data to return in tests.
  • kwargs: Keyword arguments to pass to the ‘step’ call.

Returns: bytes: Raw stdout data.

def step(self, name: str, cmd: List[str], stdin: Optional[Any]=None, stdin_data: Optional[str]=None, stdin_json: Optional[Any]=None, **kwargs):

Convenience features on top of the normal ‘step’ call.

At most one of |stdin|, |stdin_data|, or |stdin_json| may be specified.

Args:

  • name: The name of the step.
  • cmd: The command to run.
  • stdin: Placeholder to read step stdin from.
  • stdin_data: Bytes to pass to stdin.
  • stdin_json: Object to JSON-serialize to stdin.
  • kwargs: Keyword arguments to pass to the ‘step’ call.

Returns: See ‘step.call’.
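For instance, a sketch of the stdin_json convenience (command and payload hypothetical):

DEPS = ['easy']

def RunSteps(api):
  # The dict is JSON-serialized and fed to the tool's stdin; only one of
  # stdin, stdin_data, or stdin_json may be given.
  api.easy.step(
      'apply config',
      ['my-tool', 'apply', '-'],
      stdin_json={'target': 'eve', 'dry_run': True},
  )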

recipe_modules / exonerate

DEPS: cros_history, cros_infra_config, easy, exoneration_util, naming, rdb_util, urls, depot_tools/gitiles, recipe_engine/buildbucket, recipe_engine/step

Functions for exonerating test failures.

class ExonerateApi(RecipeApi):

def auto_exoneration_analysis(self, fake_data: bool=False):

Analyze failed tests to see if they can be exonerated.

Args: fake_data: If true, return true immediately. Should only be used for unit testing.

Returns: A boolean indicating if auto exoneration was enabled without dry_run and did not exceed any of the limits.

def auto_exoneration_analysis_v2(self, failed_tests: Set[FailedTest]=None, fake_data=None):

Analyze failed tests to see if they can be exonerated.

Args: failed_tests: Optional override of tests to be analyzed. Default use self._failed_tests. fake_data: Mocked LUCI Analysis response data to be used for tests. Tuple of (List[TestVariantStabilityAnalysis], TestStabilityCriteria).

@property
def auto_exoneration_v2_enabled(self):

def clear_failed_tests(self):

Clear failed_tests entries.

def enable_excludes(self):

Enable use of the excludes config.

def exonerate_hwtests(self, hw_test_results: List[SkylabResult]):

Exonerate the list of HW Test failures based on configs.

Args: hw_test_results: list of failures from the proctor.

Returns: [SkylabResult] with exonerated tests modified and [str] names of tests that should be treated as success.

def fetch_config(self, mock_data=None, mock_excludes_data=None):

Download config files and return the extracted config protos.

Args: mock_data: step_test_data for the exoneration config download step. mock_excludes_data: step_test_data for the excludes config download step.

Returns: TestDisablementCfg object of the config.

def generate_failed_test_stats(self, failure_rates: TestVariantFailureRateAnalysis):

Returns FailedTestStats for the given LUCI Analysis failure rates.

def get_consistent_failure_count_from_verdicts(self, recent_verdicts: List[TestVariantFailureRateAnalysis.RecentVerdict]):

Get the number of failures in the last 10 independent runs from LUCI Analysis.

Args: recent_verdicts: 10 most recent verdicts from LUCI Analysis.

Returns: Number of failures in the last 10 runs.

def get_flake_percent_from_interval_stats(self, interval_stats: List[TestVariantFailureRateAnalysis.IntervalStats]):

Get the flake count & percent of the test for the last 24 hr period.

Args: interval_stats: Verdict stats of the test over interval ranges.

Returns: Percent of verdicts with a flaky result in the last 24 hr period, rounded to the nearest integer.

def get_prev_failed_now_exonerable_test_results(self, test_plan: GenerateTestPlanResponse, dry_run=False):

Get the tests from the previous failed runs that are now exonerable.

Args: test_plan: The test plan which contains the tests for which to retrieve the results from previous runs.

Returns: A list of exonerable HW test results.

def get_test_variant_dict(self, test_id: str, board: str, build_target: str, model: str):

Create test_variant dict for LUCI Analysis from inputs.

Args: test_id: Name of the test. board: Name of the board. build_target: Name of the build_target. model: Name of the model.

Returns: A dict that contains the test & variant info.

@property
def is_enabled(self):

Returns whether exoneration is enabled.

def is_exonerated(self, test_result):

Whether the test_result was exonerated.

Args: test_result[TestResult]: Test result to check for.

Returns: boolean indicating if test_result was exonerated.

def is_hw_result_exonerable(self, hw_test_result: SkylabResult, exoneration_configs_override: Optional[Dict]=None, exonerate_prejob_failures: Optional[bool]=False, excludes_enabled_override: Optional[bool]=None):

Checks to see if hw result is exonerable.

Args: hw_test_result: The skylab result to check if it is exonerable. exoneration_configs_override: Alternate exoneration configs to use when determining if the result is exonerable. exonerate_prejob_failures: Whether to exonerate prejob failures. These failures are not exonerable by default. excludes_enabled_override: Whether to take into account the excludes configs when exonerating. Overrides the module-level setting.

Returns: True if and only if the result is a failure AND exonerable. Note that it will return False if result is a success.

def load_configs(self, mock_data=None):

Load configs from binary/json files.

@property
def manual_exoneration_configs(self):

Returns configs for the manually exonerated tests.

@property
def overall_autoex_limit(self):

Returns the max number of tests to auto exonerate.

@property
def per_target_autoex_limit(self):

Returns the max number of tests to auto exonerate on a single target.

def print_stats(self, property_name: str):

Write exoneration stats to output properties & reset counts.

Args: property_name: Name of the property to populate.

recipe_modules / exoneration_util

DEPS: recipe_engine/luci_analysis, recipe_engine/resultdb

A module for util functions associated with cq test exoneration.

class ExonerationUtilApi(RecipeApi):

A module for util functions associated with exoneration.

def check_overall_limit(self, test_stats: List[FailedTestStats], overall_limit: int):

Check if automated exoneration exceeded overall limit.

Args: test_stats: List of failed tests stats. overall_limit: Number of auto exonerations should not exceed this limit.

Returns: Boolean indicating if number of exonerations has exceeded overall_limit.

def check_per_target_limit(self, test_stats: List[FailedTestStats], per_target_limit: int):

Check if automated exoneration exceeded per target limit.

Args: test_stats: List of failed tests stats. per_target_limit: Number of auto exonerations per target should not exceed this limit.

Returns: Tuple of boolean indicating if per-target exonerations have exceeded per_target_limit and offending build_target. If multiple targets have exceeded, return any one.
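A sketch of consuming that tuple return value (the empty stats list and limit are hypothetical):

DEPS = ['exoneration_util']

def RunSteps(api):
  # test_stats would normally come from generate_failed_test_stats over
  # LUCI Analysis failure rates.
  exceeded, offending_target = api.exoneration_util.check_per_target_limit(
      test_stats=[], per_target_limit=5)
  # If exceeded is True, offending_target names one build target over the
  # limit (any one, if several exceeded).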

def get_tastless_name(self, test_name):

Return test_name without the tast prefix.

def get_updated_configs(self, test_stats: List[FailedTestStats], manual_configs: dict):

Update exoneration configs dict based on autoex analysis.

Args: test_stats: List of failed tests stats. manual_configs: Configs for manual exoneration in the format of test_name -> [list of build_targets]

Returns: A map of the same format as manual_configs but is updated to include autoex tests.

def match_test_variants_sources(self, test_variant_list: List[dict], fake_query_func: Callable[([Optional[str]], QueryTestVariantsResponse)]=None):

Query the ResultDB test variants API and populate sources on the given test variants (dicts). The returned data is ready to be consumed by the query_stability method.

Args: test_variant_list: list of test variant dicts to be matched. fake_query_func: a fake query function to be used for unit testing.

Returns: a new list of test variant dicts that have sources populated. The list can be compared to the original to identify any missed matches.

def override_calculation(self, test_stats: List[FailedTestStats], overall_limit: int, per_target_limit: int):

Populate and return OverrideInfo based on stats of auto exoneration.

Args: test_stats: List of failed tests stats. overall_limit: Number of auto exonerations should not exceed this limit. per_target_limit: Number of auto exonerations per target should not exceed this limit.

Returns: OverrideInfo based on auto exoneration statistics.

def query_failure_rate(self, test_variant_list: List[dict]):

Query failure rate from luci_analysis.

Args: test_variant_list: A list of dicts with test name and variant def to query on.

Returns: List of TestVariantFailureRateAnalysis for each input.

def query_stability(self, test_variant_position_list: List[dict], fake_data=None):

Query stability from luci_analysis. Batched client.

Args: test_variant_position_list (list[TestVariantPosition]): List of dicts containing testId, variant, and source position. fake_data: Fake data to be returned for unit testing.

Returns: Tuple of List[TestVariantStabilityAnalysis] and the TestStabilityCriteria configured in LUCI Analysis.

recipe_modules / factory_util

DEPS: build_menu, cros_artifacts, cros_source, cros_version, depot_tools/gsutil, recipe_engine/context, recipe_engine/path, recipe_engine/step

A module for util functions associated with factory builds.

class FactoryUtilApi(RecipeApi):

def compress_test_image(self, artifact_dir: str, images_path: Path, version_str: str):

Compress artifacts for chromiumos_test_image.tar.xz

def upload_factory(self, config: BuilderConfig, artifact_dir: str):

Compress and upload factory artifacts.

Used for older factory branches which predate ArtifactsService.

Args: config: Contains builder info used to construct the GS path. artifact_dir: Local dir containing build artifacts.

def upload_factory_artifacts(self, config: BuilderConfig, artifact_dir: str):

Upload factory.zip and chromiumos_test_image.tar.xz

def zip_factory_image(self, artifact_dir: str, images_path: Path, bundle_path: Path, version_str: str):

Zip up artifacts for factory.zip

recipe_modules / failures

DEPS: cros_infra_config, cros_tags, easy, failures_util, naming, src_state, urls, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/step

API for raising failures and presenting them in cute ways.

class FailuresApi(RecipeApi):

A module for presenting errors and raising StepFailures.

def aggregate_build_failures(self, build_failures: List[Failure], failure_count: int, total_count: int):

Returns aggregate test failure markdown text for build results.

Args: build_failures: List of all the build failures. failure_count: Number of build failures. total_count: Number of all build results.

Returns: List of summary markdown lines.

def aggregate_failure_group(self, kind: str, failure_group: List[Failure], failure_count: int, total_count: int):

Returns aggregate failure markdown text.

Args: kind: The failure kind. failure_group: List of all the failures for the specific kind. failure_count: Number of failures total_count: Number of all entries

Returns: List of summary markdown lines.

def aggregate_failures(self, results, ignore_build_test_failures=False):

Returns a recipe result based on the given failures.

Only fatal failures cause the whole recipe to fail.

Args: results (Results): An object containing all failures encountered during execution and a dictionary mapping a test kind with the number of successes. Only tests considered as critical are counted. ignore_build_test_failures (bool): If True, we will still produce a summary of failures if present, but we will not set the build status to FAILURE.

Returns: RawResult: The recipe result, including a human-readable failure summary.

def aggregate_hw_test_failures(self, hw_test_failures: List[Failure]):

Returns aggregate test failure markdown text for HW tests, distinguishing between test and shard / suite failures.

Args: hw_test_failures: List of all the hw test failures.

Returns: List of summary markdown lines.

def format_step_failures(self, step_failures):

Helper function to format the collected failures for presentation.

Args: step_failures (list[Failure]): Collected error messages from exceptions. Returns: formatted markdown string for UI presentation.

def format_summary_markdown(self, summary_lines):

Aggregate individual failure summary lines.

There is a 4000 byte limit on the summary_markdown field in buildbucket. This function ensures that we do not go over that limit when summarizing the failures which occurred in the build.

Args: summary_lines (list[str]): Text lines to add to the summary markdown.

Returns: summary_markdown (str): A markdown text that can be used to set the result of the step or build.

def get_build_results(self, builds, relevant_child_builder_names=None):

Verify all builds completed successfully.

Args: builds (list[build_pb2.Build]): List of completed builds. relevant_child_builder_names (list(str)): List of relevant child builder names.

Returns: A Results object containing the list[Failure] of all failures discovered in the given runs and a dict mapping a task kind with the number of successes.

def get_build_status(self, build: build_pb2.Build):

Retrieve the status of the build.

def get_results(self, kind, runs, get_status, is_critical, get_title, get_link_map, get_id, build_detailed_kind=None):

def get_test_failure_main_line(self, failure_group: List[Failure]):

Returns the main line of the summary markdown.

Args: failure_group: List of all the failures for the specified kind.

Returns: Main line of the summary markdown.

def get_test_fault_attribution_text(self, target_identifier: str, test_id: str):

Returns the fault attribution text of a summary markdown line.

Args: target_identifier: The title of the failure, consisting of build_target name, suite name and optionally a model name. test_id: The ID of the test.

Returns: Fault attribution text for a summary line.

@contextlib.contextmanager
def ignore_exceptions(self):

Catches exceptions and logs them instead.

Should only be used temporarily to prevent new features from crashing the entire recipe. Remove once new feature is stable.

def is_critical_build_failure(self, build):

Determine if the build failed and was critical.

Args: build (Build): The buildbucket build in question.

Returns: bool: True if the build failed and was critical.

def set_test_variant_to_fault_attribute(self, test_variant_to_fault_attribute: Dict[(Tuple[(str, str, str)], FaultAttributedBuildTarget)]):

Sets dictionary information for test variant to the corresponding fault attribute.

Args: test_variant_to_fault_attribute: defaultdict of tuples of the format: (test_id, build_target, model), to fault attribution.

recipe_modules / failures_util

DEPS: recipe_engine/step

API for failures util functions.

class FailuresUtilApi(RecipeApi):

A module for util functions associated with failures printing & processing.

def present_run(self, title, link_map, status, critical=True):

recipe_modules / future_utils

DEPS: recipe_engine/futures

class FutureUtilsApi(RecipeApi):

A module to run things in parallel.

This module is intended to be used by any task that should run async. It supports maximum concurrency, and also waits for responses if asked to.

The biggest draw of using this module is the pairing of request object to response object. As such, it differs from the futures interface by requiring the request to be a single object. This object could easily be a dict for multiple inputs if needed.

Usage:

runner = api.future_utils.create_parallel_runner(2)
# First function.
runner.run_function_async(
  lambda req, _: "response 1",
  "request 1",
)
# Second function (in parallel).
runner.run_function_async(
  lambda req, _: "response 2",
  "request 2",
)
# Third function (will run after one of the first two completes).
runner.run_function_async(
  lambda req, _: "response 3",
  "request 3",
)
# Get responses.
responses = runner.wait_for_and_get_responses()
assert responses == [
  CallResponse('request 1', 'response 1', 1),
  CallResponse('request 2', 'response 2', 1),
  CallResponse('request 3', 'response 3', 1),
]

For more options, read the docs of ParallelRunner below.

@staticmethod
def create_custom_response(req, resp, call_count, errored=False):

Create a custom CallResponse, for convenience in logic.

Args: req: request object to include. resp: response object to include. call_count: number of calls. errored: whether the call errored or not.

Returns: A custom CallResponse for the given properties.

def create_parallel_runner(self, max_concurrent_requests=None):

Create a parallel runner, bounded to the provided number of concurrency.

Args: max_concurrent_requests: max number of requests to run in parallel. Defaults to None, which means unlimited.

Returns: A ParallelRunner that can be used to kick off async jobs, and wait for them to resolve.

recipe_modules / gce_provider

DEPS: deferrals, easy, recipe_engine/futures

A module that interacts with GCE Provider.

class GceProvider(RecipeApi):

A module that interacts with the GCE Provider config service.

Depends on ‘prpc’ binary available in $PATH: https://godoc.org/go.chromium.org/luci/grpc/cmd/prpc

def get_current_config(self, ids):

Function to retrieve the current config from GCE Provider.

Args: ids (list): A list of all the config prefixes to retrieve.

Returns: ConfigResponse containing: configs: Configs, list of GCE Provider Config objects. missing_configs: list[str] of ids for which there is no config.

def update_gce_config(self, bid, config):

Function to update the config in GCE Provider.

Args: bid (str): bot group prefix to update. config (Config): GCE Provider config object.

Returns: Config, GCE Provider Config definition with updated values.

recipe_modules / gcloud

DEPS: cros_infra_config, easy, overlayfs, depot_tools/gsutil, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming, recipe_engine/time

API for gcloud commands.

class GcloudApi(RecipeApi):

A module to interact with Google Cloud.

def __init__(self, properties, *args, **kwargs):

Initialize GcloudApi.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def attach_disk(self, name, instance, disk, zone):

Attach a disk to a GCE instance.

As a disk is attached, the disk is then added to the stack that is used by the context manager to detach as the task ends.

Args: name (str): An alphanumeric name for the mount, used for display. instance (str): GCE instance on which disk will be attached. disk (str): Google Cloud disk name. zone (str): GCE zone to create instance (e.g. us-central1-b).

def auth_list(self, step_name=None):

Print out the auth creds currently on the bot.

Args: step_name (str): Name of the step.

@property
def branch(self):

@cache_action.setter
def cache_action(self, val):

def check_for_disk_mount(self, mount_path):

Check whether there is a disk mounted on given path.

Args: mount_path (str): System path on which the disk is mounted.

Returns: Bool indicating whether there is a disk mounted on the path.

@contextlib.contextmanager
def cleanup_gce_disks(self):

Wrap disk cleanup in a context handler to ensure they are handled.

Upon exiting the context manager, each attached disk is then iterated through to unmount, detach, and delete the disk.
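A hedged sketch of the full disk lifecycle under this context manager (all disk, instance, and path names are hypothetical):

DEPS = ['gcloud']

def RunSteps(api):
  with api.gcloud.cleanup_gce_disks():
    api.gcloud.create_disk('scratch-disk', 'us-central1-b', size='100GB')
    api.gcloud.attach_disk('scratch', 'builder-vm-1', 'scratch-disk',
                           'us-central1-b')
    api.gcloud.mount_disk('scratch', '/mnt/scratch', chown=True)
    # ... use the disk ...
  # On exit, each attached disk is unmounted, detached, and deleted.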

@contextlib.contextmanager
def cleanup_mounted_disks(self):

Wrap disk cleanup in a context handler to ensure they are unmounted.

Upon exiting the context manager, each mounted disk is then iterated through and unmounted.

def create_disk(self, disk, zone, image=None, disk_type=None, size=None):

Create a GCE disk.

Create a GCE disk using the provided options.

Args: disk (str): Google Cloud disk name. zone (str): GCE zone to create disk (e.g. us-central1-b). image (str): Image version used to create the disk. If none, will create a blank disk. disk_type (str): Type of GCE disk to create. size (str): Size of disk in gcloud format (e.g. 100GB).

Returns: The stdout of the gcloud command.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def create_gcloud_image(self, step_name: str, image_name: str, props: List[str]):

Create an image.

Args: image_name: The name to give the image. props: Additional props to pass in to the create command. step_name: Step name to use.

def create_image(self, image_name, source_uri=None, licenses=None):

Creates an image in the GCE project.

Args: image_name (str): Name of the image. source_uri (str, optional): Sets the --source-uri flag if the image source is a tarball. See gcloud docs for detail. licenses (list[str], optional): List of image licenses to apply.

def create_image_from_disk(self, disk, image_name, zone):

Create an image from specified disk.

Args: disk (str): Google Cloud disk name. image_name (str): The name to give the image. zone (str): GCE zone to create instance (e.g. us-central1-b).

def create_image_from_image(self, image_name: str, existing_image: str):

Create an image from an existing image.

Args: image_name: The name to give the image. existing_image: The name of the existing image.

def create_instance(self, image, project, machine, zone, network=None, subnet=None, external_ip=False):

Create an instance in the GCE project.

Args: image (str): GCE image to use for the instance. project (str): Google Cloud project name. machine (str): GCE machine type zone (str): GCE zone to create instance (e.g. us-central1-b). network (str): Network name to use. subnet (str): Network subnet on which to create instance. external_ip (bool): Enables external IP address.

Returns: Tuple[str, str, str|None]: (name, ip_addr, ext_ip_addr) of the instance.
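A sketch of unpacking that return value (all values hypothetical):

DEPS = ['gcloud']

def RunSteps(api):
  # With external_ip=False (the default), ext_ip_addr is None.
  name, ip_addr, ext_ip_addr = api.gcloud.create_instance(
      image='builder-image-v1',
      project='my-gcp-project',
      machine='n1-standard-8',
      zone='us-central1-b',
  )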

def delete_disk(self, disk, zone):

Delete a GCE disk.

Permanently delete a GCE disk from the project.

Args: disk (str): Google Cloud disk name. zone (str): GCE zone to create instance (e.g. us-central1-b).

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def delete_image(self, image_name):

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def delete_images(self, images):

Delete the list of provided images from GCE.

Args: images (list|str): A list of image names.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def delete_instance(self, instance, project, zone):

Delete a GCE instance.

Args: instance (str): GCE instance to be deleted. project (str): Google Cloud project name. zone (str): GCE zone to create instance (e.g. us-central1-b).

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def detach_disk(self, instance, disk, zone):

Detach a disk from a GCE instance.

As a disk is detached, the disk is then removed from the stack that is used by the context manager to detach as the task ends.

Args: instance (str): GCE instance the disk is attached to. disk (str): Google Cloud disk name. zone (str): GCE zone of the instance (e.g. us-central1-b).

def determine_disks_to_delete(self, disks, instances):

Determines the list of orphaned disks to delete.

Args: disks (dict): List of all GCE disks and zones. instances (list|str): List of all GCE instances.

Returns: Dictionary containing disk name and zone to delete.

def disk_attached(self, disk_name):

Check whether a disk is attached to an instance.

Args: disk_name (str): Disk name to match for device id.

Returns: Bool of whether the disk is attached or not.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def disk_exists(self, disk, zone):

Check whether a disk exists.

Args: disk (str): Name of the disk to check. zone (str): GCE zone in which the disk exists.

Returns: Bool of whether the disk exists or not.

@property
def disk_short_name(self):

def download_file(self, source_path: str, dest_path: str):

Download Google Storage object.

Args: source_path: Google Storage path to copy (e.g. ‘gs://my-bucket/file.txt’) dest_path: Local path to save the object (e.g. ‘/tmp’)

@property
def gce_disk(self):

@property
def gce_disk_blkid(self):

@property
def gce_name_limit(self):

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def get_expired_images(self, retention_days, prefixes, protected_images=None):

Calculate the list of images that have expired.

Args: retention_days (int): Number of days to retain. prefixes (list|str): List of prefixes to filter. protected_images (list|str): List of images to preserve.

def get_instance_serial_output(self, instance, project, zone):

@property
def host_zone(self):

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def image_exists(self, image):

Check whether an image exists.

Args: image (str): Name of the image to check.

Returns: Bool of whether the image exists or not.

@property
def infra_host(self):

def initialize(self):

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def list_all_disks(self):

Pulls a list of all disks that exist.

Returns: A dictionary containing disk name and zone.

def list_all_instances(self):

Pulls a list of all instances that exist.

def lookup_device_id(self, disk_name):

Look up the device id in /dev/disk/by-id by name.

Args: disk_name (str): The name associated with the attached device. Returns: The device id of the provided disk name, or None if no disk can be found.

def mount_disk(self, name, mount_path, recipe_mount=False, chown=False):

Mount an attached disk to host.

As a disk is mounted, the disk is then added to the stack that is used by the context manager to unmount as the task ends.

Args: name (str): An alphanumeric name for the mount, used for display. mount_path (str): Directory to mount the disk. recipe_mount (bool): Whether mount needs to be in the path to use within a recipe. chown (bool): Whether or not to chown the disk after mounting. This is needed for newly created disks.

@property
def mounted_snapshot(self):

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def resize_disk(self, disk, zone, size):

Resize the GCE disk above the default of 200GB.

Args: disk (str): Google Cloud disk name. zone (str): GCE zone in which the disk is located. size (str): New size of the disk in GB.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def set_disk_autodelete(self, instance, name, zone):

Set a disk to autodelete when a GCE instance is deleted.

GCE disks do not delete by default when the instance is deleted; to ensure cleanup, we can flip the metadata so the disks are deleted when the instance is removed.

Args: instance (str): GCE instance on which disk is attached. name (str): Google Cloud disk name. zone (str): GCE zone to create instance (e.g. us-central1-b).

def set_gce_project(self, project):

Set the default project for the gcloud command. Args: project (str): Google Cloud project name.

def setup_cache_disk(self, cache_name, branch=‘main’, disk_type=‘pd-standard’, disk_size=None, recipe_mount=False, disallow_previously_mounted=False, mount_existing=False, recovery_snapshot=None):

Create disk from snapshot, reuse if still attached.

Check if disk is attached, otherwise grab the matching snapshot, create, attach, and mount the source disk.

Args: cache_name (str): Name of the cache file to use. branch (str): Git branch. disk_type (str): Type of GCE disk to create, defaults to standard persistent disk. disk_size (str): Size of the disk to create in GB, defaults to image size. recipe_mount (bool): Whether mount needs to be in the path to use within a recipe. disallow_previously_mounted (bool): If set, this step will fail if the cache is already mounted. mount_existing (bool): If we find an existing cache, mount it immediately instead of relying on subsequent step. recovery_snapshot (str): Recovery snapshot to fall back to, defaults to initial-{cache_name}-source-snapshot.

@property
def snapshot_builder_mount_path(self):

Returns a Path to the base mount directory for cache builder.

@property
def snapshot_mount_path(self):

The path to mount the snapshot disks.

This is the path that the disks created from image will be mounted.

@property
def snapshot_suffix(self):

@property
def snapshot_version_file(self):

@property
def snapshot_version_path(self):

The path to the local version file.

This is the path to the local version file that contains the image version that was used to create the local named cache.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=30))
def storage_cp(self, source: str, dest: str, step: str=‘gcloud storage cp’, flags: Optional[List[str]]=None, **kwargs):

Do a gcloud storage cp.

Args: source: source location to copy. dest: destination location to copy. step: step name. flags: additional command line flags. kwargs: additional arguments.

def storage_ls(self, path: str):

Do a gcloud storage ls.

Args: path: the path to ls.

def sync_disk_cache(self, name):

Force a local disk cache sync before snapshotting.

Args: name (str): Disk name to use to lookup the mount location.

def transactionally_update_recovery_image(self, image: str, cache_name: str, recovery_image: Optional[str]):

Transactionally update the recovery image to the provided image.

The image name provided must already exist in GCP. The update does this:

  • quick double check to make sure the image exists.
  • copy the image to the recovery fallback path.
  • delete the old recovery image.
  • copy the newly created fallback image to the recovery image.
  • return the fallback image for later deletion.

Args: image: the image to replace the recovery image with. cache_name: the name of the cache (just in case recovery image isn't provided). recovery_image: the name of the recovery image.

Returns: The temp recovery disk to delete later.

def unmount_disk(self, name, mount_path):

Unmount an attached disk from the host.

As a disk is unmounted, the disk is then removed from the stack that is used by the context manager to unmount as the task ends.

Args: name (str): An alphanumeric name for the mount, used for display. mount_path (str): Directory to mount the disk.

def update_fstab(self, mount_path, name):

Update the fstab entry for an attached disk.

As a disk is mounted, the disk is then added to the stack that is used by the context manager to unmount as the task ends.

Args: mount_path (str): Directory to mount the disk. name (str): An alphanumeric name for the mount, used for display.

recipe_modules / gerrit

DEPS: easy, git, git_cl, gobin, repo, src_state, support, depot_tools/gerrit, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

APIs for managing Gerrit changes.

class GerritApi(RecipeApi):

A module for Gerrit helpers.

def __init__(self, *args, **kwargs):

Initialize GerritApi.

def abandon_change(self, gerrit_change: GerritChange, message: Optional[str]=None):

Abandon the given change.

Args: gerrit_change: The change to abandon. message: Optional message to post to change.

def add_change_comment(self, gerrit_change: GerritChange, comment: str, project_path: Optional[Path]=None):

Add a comment to the given Gerrit change.

Args: gerrit_change: The change to post to. comment: The comment to post. project_path: If set, will use this as the project path rather than any value inferred from the gerrit_change.

def add_change_comment_remote(self, gerrit_change: GerritChange, comment: str):

Add comment to gerrit_change.

The comment is left as a resolved patchset level comment.

Args: gerrit_change: The change to comment on. comment: The comment to leave.

def changes_submittable(self, gerrit_changes: List[GerritChange], test_output_data: Union[(Callable, Dict, List, None)]=None):

Check whether the given changes can be merged onto their Git branches.

Args: gerrit_changes: The changes to check. test_output_data: Mock response for the git-test-submit support tool.

def create_change(self, project: Union[(str, Path)], reviewers: Optional[List[str]]=None, ccs: Optional[List[str]]=None, topic: Optional[str]=None, ref: Optional[str]=None, hashtags: Optional[List[str]]=None, project_path: Path=None, use_local_diff: bool=False, non_repo_checkout: bool=False):

Create a Gerrit change for the most recent commits in the given project.

Assumes one or more local commits exist in the project. The commit message is always used as the CL description.

Args: project: Any path within the project of interest, or the project name. reviewers: List of emails to mark as reviewers on Gerrit. ccs: List of emails to mark as cc on Gerrit. topic: Topic to set for the CL. ref: --target-branch argument to be passed to git cl upload. Should be a full git ref, such as ‘refs/heads/main’ -- NOT just ‘main’. hashtags: List of hashtags to set for the CL. project_path: If set, will use this as the project path rather than any value inferred from the gerrit_change. use_local_diff: If true, use the local diff instead of diff taken against tip-of-branch for the CL. non_repo_checkout: If true, means that the checkout described by project_path is not within a repo checkout (and thus the method will skip repo calls used to gather optional information).

Returns: The newly created change.
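A hedged sketch, assuming a local commit already exists in the checkout (project, topic, and emails are hypothetical):

DEPS = ['gerrit']

def RunSteps(api):
  change = api.gerrit.create_change(
      'chromiumos/chromite',
      reviewers=['reviewer@example.com'],
      topic='my-topic',
      ref='refs/heads/main',
  )
  # The returned GerritChange can be fed back into other helpers.
  api.gerrit.add_change_comment_remote(change, 'Uploaded automatically.')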

def fetch_patch_set_from_change(self, change: GerritChange, include_files: bool=False, include_commit_info: bool=False, include_detailed_labels: bool=False, test_output_data: Optional[Callable]=None):

Fetch and return PatchSet associated with the given GerritChange.

Assumes that change.patchset is set (which is not always the case).

Args: change: Buildbucket GerritChange to fetch. include_files: If True, include information about changed files. test_output_data: Test output for gerrit-fetch-changes. include_commit_info: If True, include information about the commit. include_detailed_labels: If True, include information about the labels applied to the change.

Raises: StepFailure: If the requested patch set is not found.

def fetch_patch_sets(self, gerrit_changes: List[GerritChange], include_files: bool=False, include_commit_info: bool=False, include_messages: bool=False, include_submittable: bool=False, include_detailed_labels: bool=False, step_name: Optional[str]=None, test_output_data: Optional[Callable]=None):

Fetch and return PatchSets from Gerrit.

Args: gerrit_changes: Buildbucket GerritChanges to fetch. include_files: If True, include information about changed files. include_commit_info: If True, include information about the commit. include_messages: If True, include messages attached to the commit. include_submittable: If True, include information about possible submission. include_detailed_labels: If True, include information about the labels applied to the change. step_name: The name of the step. test_output_data: Test output for gerrit-fetch-changes.

Returns: List of PatchSets in requested order.

Raises: StepFailure: If any of the requested patch sets is not found.

def gerrit_related_changes(self, gerrit_change: GerritChange):

Fetch and return related changes given a Gerrit change.

Uses the gerrit_related_changes CIPD package.

Returns: The JSON for ‘related’ outputted by gerrit_related_changes.

@exponential_retry(retries=4, delay=timedelta(seconds=5))
@functools.lru_cache
def get_account_id(self, email: str, gerrit_host: str):

Get the Gerrit account id for the given email on the given host.

def get_change_description(self, gerrit_change: GerritChange, memoize: bool=False):

Get the description of the given Gerrit change.

Args: gerrit_change: The change of interest. memoize: Whether to consult a local cache for the change ID instead of fetching from gerrit.

Returns: The change description.

@exponential_retry(retries=4, delay=timedelta(seconds=5))
def get_change_mergeable(self, change_num: int, gerrit_host: str, revision: str=‘current’):

Get the mergeable status of the given Gerrit change.

Args: change_num: The number of the change to check. gerrit_host: Base URL to curl against. revision: The revision of the change to check.

Returns: Whether the revision of the change is mergeable.

@exponential_retry(retries=1, delay=timedelta(seconds=5))
def is_merge_commit(self, change_num: int, gerrit_host: str, revision: str=‘current’):

Returns whether the given change list contains a merge commit.

This is determined by looking at the change's merge list. If the list is empty then it is not a merge commit.

Args: change_num: The number of the change to check. gerrit_host: Base URL to curl against. revision: The revision of the change to check.

Returns: Whether the given change contains a merge commit.

def parse_gerrit_change(self, gerrit_change_url: str):

Return a GerritChange proto, parsed from the gerrit change URL.

This function expects the URL to be in one of these formats:

https://<host>/c/<project>/+/<change_num>
https://<host>/<change_num>

...where <host> is something like ‘chromium-review.googlesource.com’, <project> is something like ‘chromiumos/chromite’, and <change_num> is something like 12345. The https:// is optional.

def parse_gerrit_change_url(self, gerrit_change: GerritChange):

Return a Gerrit change URL, parsed from a GerritChange proto.

def parse_qualified_gerrit_host(self, gerrit_change: GerritChange):

Return a fully qualified host parsed from a GerritChange proto.

def query_change_infos(self, host: str, query_params: List[Tuple[(str, str)]], label_constraints: Optional[List[LabelConstraint]]=None, o_params: Optional[List[str]]=None, limit: Optional[int]=None):

Query gerrit for change meeting certain constraints, and return them.

Args: host: The Gerrit host to query. query_params: Query parameters as list of (key, value) tuples to form a query, as documented here: https://gerrit-review.googlesource.com/Documentation/user-search.html#search-operators label_constraints: Constraints on the changes' labels, to be used as a filter before returning. o_params: A list of additional output specifiers, as documented here: https://gerrit-review.googlesource.com/Documentation/rest-api-changes.html#list-changes limit: The maximum number of changes to return. Returns: A list of ChangeInfo objects that meet the specified query parameters and label constraints. If no changes meet the criteria, an empty list is returned.

def query_changes(self, host: str, query_params: List[Tuple[(str, str)]], label_constraints: Optional[List[LabelConstraint]]=None):

Query gerrit for change meeting certain constraints, and return them.

Args: host: The Gerrit host to query. query_params: Query parameters as list of (key, value) tuples to form a query, as documented here: https://gerrit-review.googlesource.com/Documentation/user-search.html#search-operators label_constraints: Constraints on the changes' labels, to be used as a filter before returning. Returns: A list of GerritChange objects that meet the specified query parameters and label constraints. If no changes meet the criteria, an empty list is returned.
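For example, a sketch querying open changes by hashtag (host and values hypothetical):

DEPS = ['gerrit']

def RunSteps(api):
  changes = api.gerrit.query_changes(
      'chromium-review.googlesource.com',
      query_params=[('status', 'open'), ('hashtag', 'my-feature')],
  )
  # changes is a (possibly empty) list of GerritChange protos.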

def set_change_description(self, gerrit_change: GerritChange, description: str, amend_local: bool=False, project_path: Optional[Path]=None):

Set the description of the given Gerrit change.

Args: gerrit_change: The change of interest. description: The new description, in full. Be sure this still includes the Change-Id and other essential metadata. amend_local: Whether to amend the description of the HEAD local change as well. project_path: If set, use this as the project path rather than any value inferred from the gerrit_change.

def set_change_labels(self, gerrit_change: GerritChange, labels: Dict[(Label, int)], branch: Optional[str]=None, ref: Optional[str]=None):

(Deprecated) Set the given labels for the given Gerrit change.

This function is deprecated. Use set_change_labels_remote where possible.

Args: gerrit_change: The change of interest. labels: Mapping from label (Label) to value (int). branch: The remote branch to update. ref: The remote ref to update.

Returns: The ref used to push the labels.

def set_change_labels_remote(self, gerrit_change: GerritChange, labels: Dict[(Label, int)]):

Set the given labels for the given Gerrit change.

set_change_labels only works when the change exists in the local checkout. This function should be used in other cases.

Args: gerrit_change: The change of interest. labels: Mapping from label to value.

Returns: The applied labels (primarily for testing).

def submit_change(self, gerrit_change: GerritChange, retries: int=0, project_path: Optional[Path]=None):

Submit the given change.

Args: gerrit_change: The change to submit. retries: How many times to retry git cl land should it fail. project_path: If set, use this as the project path rather than any value inferred from the gerrit_change.

recipe_modules / git

DEPS: src_state, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

API for working with git.

class GitApi(RecipeApi):

A module for interacting with git.

def add(self, paths):

Add/stage paths.

Stages paths for commit. Note that this will fail if a file is tracked and not modified; you can use diff_check to check for this.

Args: paths (list[str|Path]): The file paths to stage.

def add_all(self):

Add/stage all changed files.

def amend_head_message(self, message, **kwargs):

Runs ‘git commit --amend’ with the given description.

Args: message (str): The commit message. kwargs (dict): Passed to recipe_engine/step.

def author_email(self, commit_id):

Returns the email of the author of the given commit.

Args:

  • commit_id (str): The commit sha.

Returns: (str): commit author email.

def branch_exists(self, branch):

Check if a branch exists.

Args:

  • branch (str): Name of the branch to check.

Returns: (bool) Whether or not the branch exists.

def checkout(self, commit=None, force=False, branch=None, **kwargs):

Runs ‘git checkout’.

Args: commit (Optional[str]): The commit (technically “tree-like”) to checkout. force (bool): If True, throw away local changes (--force). branch (Optional[str]): The branch to check out a commit from.

def cherry_pick(self, commit, **kwargs):

Runs ‘git cherry-pick’.

Args: commit (str): The commit to cherry pick. kwargs (dict): Passed to recipe_engine/step.

def cherry_pick_abort(self):

Runs ‘git cherry-pick --abort’.

def cherry_pick_silent_fail(self, commit, **kwargs):

Runs ‘git cherry-pick’ and returns whether the cherry-pick succeeded.

Args: commit (str): The commit to cherry pick. kwargs (dict): Passed to recipe_engine/step.

def clone(self, repo_url, target_path=None, reference=None, dissociate=False, branch=None, single_branch=False, depth=None, timeout_sec=None, verbose=False, progress=False):

Clones a Git repo into the current directory.

Args: repo_url (str): The URL of the repo to clone. target_path (Path): Path in which to clone the repo, or None to specify current directory. reference (Path): Path to the reference repo. dissociate (bool): Whether to dissociate from reference. branch (string): If set, performs a single branch clone of that branch. single_branch (bool): If set, performs a single branch clone of the default branch. depth (int): If set, creates a shallow clone at the specified depth. timeout_sec (int): Timeout in seconds. verbose (bool): If set, run git clone as verbose. progress (bool): If set, print progress to stdout.

def commit(self, message, files=None, author=None, **kwargs):

Runs ‘git commit’ with the given files.

Args: message (str): The commit message. files (list[str|Path]): A list of file paths to commit. author (str): The author to use in the commit. Ordinarily not used, added to test permission oddities by forcing forged commit failure. kwargs (dict): Passed to recipe_engine/step.

def create_branch(self, branch, remote_branch=None):

Create a branch.

Args:

  • branch (str): Name of the branch to be created, e.g. mybranch. This branch may not already exist.
  • remote_branch (str): Name of the remote branch to track, e.g. origin/main or cros/mybranch.

def create_bundle(self, output_path, from_commit, to_ref):

Creates a git bundle file.

Creates a git bundle (see man git-bundle) containing the commits from |from_commit| (exclusive) to |to_ref| (inclusive).

Args: output_path (Path): Path to create bundle file at. from_commit (str): Parent commit (exclusive) for bundle. to_ref (str): Reference to put in bundle.

def current_branch(self):

Returns the currently checked out branch name.

Returns: (str): The branch name pointed to by HEAD. None: If HEAD is detached.

def delete_local_branch(self, branch):

Deletes the local branch (if it exists). Args: branch (str): Name of the branch to be deleted.

def diff_check(self, path):

Check if the given file changed from HEAD.

Args: path (str|Path): The file path to check for changes.

Returns: (bool): True if the file changed from HEAD (or doesn't exist), False otherwise.

def extract_branch(self, refspec, default=None):

Splits the branch from the refspec.

Splits the branch from a refs/heads refspec and returns it. Returns default if the refspec is not of the required format.

Args: refspec (str): refspec to split the branch from. default (str): value to return if refspec not of required format.

Returns: (str): the extracted branch name.

def fetch(self, remote=None, refs=None, timeout_sec=None, retries=2):

Runs ‘git fetch’.

Args: remote (str): The remote repository to fetch from. refs (list[str]): The refs to fetch. timeout_sec (int): Timeout in seconds. retries (int): Number of times to retry.

def fetch_ref(self, remote, ref, timeout_sec=None):

Fetch a ref, and return the commit ID (SHA).

Args: remote (str): The remote repository to fetch from. ref (str): The ref to fetch. timeout_sec (int): Timeout in seconds.

Returns: (str): The commit ID (SHA) of the fetched ref.

def fetch_refs(self, remote, ref, timeout_sec=None, count=1, test_ids=None):

Fetch a list of remote refs.

Args: remote (str): The remote repository to fetch from. ref (str): The ref to fetch. timeout_sec (int): Timeout in seconds. count (int): The number of commit IDs to return. test_ids (list[str]): List of test commit IDs, or None.

Returns: (list[str]): The commit IDs, starting with the fetched ref.

def get_branch_ref(self, branch):

Creates the full ref for a branch.

Returns a ref of the form refs/heads/{branch}.

Args: branch (str): The branch to create the ref for.

Returns: (str): The ref for the branch.

def get_diff_files(self, from_rev=None, to_rev=None, test_stdout=None):

Runs ‘git diff’ to find files changed between two revs.

Revs are passed directly to ‘git diff’, which has the following effect:

  • 0 revs: Changes between working directory and index.
  • 1 rev: Changes between working directory and given commit.
  • 2 revs: Changes between the two commits.

Note that this does not include new/untracked files.

Args: from_rev (str): First revision (see ‘man 7 gitrevisions’) to_rev (str): Second revision

Returns: (list[str]): changed files.
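A sketch of that rev-count behavior (revisions hypothetical):

DEPS = ['git']

def RunSteps(api):
  # Two revs: files changed between the two commits.
  changed = api.git.get_diff_files(from_rev='HEAD~1', to_rev='HEAD')
  # No revs: changes between the working directory and the index.
  dirty = api.git.get_diff_files()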

def get_parents(self, commit_id, test_contents=None):

Runs git log to determine the parents of a git commit.

Args: commit_id (str): The commit hash.

Returns: (list[str]): parent commit hash(es).

def get_working_dir_diff_files(self):

Finds all changed files (including untracked).

def gitiles_commit(self, test_remote=‘cros-internal’, test_url=None):

Return a GitilesCommit for HEAD.

Args: test_remote (str): The name of the remote, for tests. test_url (str): URL for the remote, for tests.

Returns: (GitilesCommit): The GitilesCommit corresponding to HEAD.

def head_commit(self):

Returns the HEAD commit ID.

@contextlib.contextmanager
def head_context(self):

Returns a context that will revert HEAD when it exits.

def is_merge_commit(self, commit_id):

Determines if the commit_id is a merge commit.

Args: commit_id (str): The commit sha.

Returns: (bool): whether the commit has more than 1 parent.

def is_reachable(self, revision, head=‘HEAD’):

Check if the given revision is reachable from HEAD.

Args: revision (str): A git revision to search for. head (str): The starting revision. Default: HEAD.

Returns: (bool): Whether the revision is reachable from (is an ancestor of) |head|.

def log(self, from_rev, to_rev, limit=None, paths=None):

Returns all the Commits between from_rev and to_rev.

Args: from_rev (str): From revision to_rev (str): To revision limit (int): Maximum number of commits to log. paths (list[str]): pathspecs to use.

Returns: (list[Commit]): A list of commit metas.

def ls_remote(self, refs, repo_url=None):

Return ls-remote output for a repository.

Args: refs (list[str]): The refs to list. repo_url (str): The url of the remote, or None to use CWD.

Returns: (list[Reference]): A list of Refs.

def merge(self, ref, message, *args, **kwargs):

Runs git merge.

Args: ref (str): The ref to merge. message (str): The merge commit message. args (tuple): Additional arguments to git merge. kwargs (dict): Passed to recipe_engine/step.

def merge_abort(self):

Runs ‘git merge --abort’.

def merge_base(self, *args, **kwargs):

Return the output from git merge-base.

Args: args (tuple): Additional arguments to git merge. kwargs (dict): Passed to recipe_engine/step.

Returns: (str) stdout of the command, or None for errors.

def push(self, remote, refspec, dry_run=False, capture_stdout=False, capture_stderr=False, retry=True, force=False, **kwargs):

Runs ‘git push’.

Args: remote (str): The remote repository to push to. refspec (str): The refspec to push. dry_run (bool): If true, set --dry-run on git command. capture_stdout (bool): If True, return stdout in step data. capture_stderr (bool): If True, return stderr in step data. retry (bool): Whether to retry. Default: True force (bool): add force flag for git push kwargs (dict): Passed to api.step.

Returns: (StepData): See ‘step.call’.

def rebase(self, force=False, branch=None, strategy_option=None):

Run git rebase with the given arguments.

Args: force (bool): If True, set --force. branch (str): If set, rebase from specific branch. strategy_option (str): If set, sets the --strategy-option flag. See git help rebase for details.

def remote(self):

Return the name of the remote.

Returns: (str): name of the remote, e.g. ‘origin’ or ‘cros’.

def remote_head(self, remote=‘.’, test_stdout=None):

Returns the HEAD ref of the given remote.

Args: remote (str): remote name to query, by default remote of current branch

Returns: (str): ref contained in the remote HEAD (i.e. the default branch), or None on error.

@exponential_retry(retries=19, delay=timedelta(minutes=1))
def remote_update(self, step_name, timeout_sec=None):

Runs ‘git remote update’.

Args: step_name (str): Name of the step to display. timeout_sec (int): Timeout in seconds.

def remote_url(self, remote=‘origin’):

Get the URL for a defined remote.

Args: remote (str): The name of the remote to query

Returns: URL to the remote on success

def repository_root(self, step_name=None):

Return the git repository root for the current directory.

Args: step_name (str): the step name to use instead of the default.

Returns: (str): The path to the git repository.

def set_global_config(self, args):

Runs git config --global to set global config.

Args: args (list[str]): args for git config.

def set_upstream(self, remote, branch):

Set the upstream for the given branch.

Args: remote (str): The remote repository to track. branch (str): The remote branch to push to.

Returns: (StepData): See ‘step.call’.

def show_file(self, rev, path, test_contents=None):

Returns the contents of the given file path at the given revision.

Args: rev (str): The revision to return the contents from. path (str): The file path to return the contents of.

Returns: (str): The contents of the file, None if the file does not exist in |rev|.

def stash(self):

Stash changes.

recipe_modules / git_cl

DEPS: depot_tools/depot_tools, depot_tools/git_cl, recipe_engine/raw_io

API for working with git cl.

class GitClApi(RecipeApi):

A module for interacting with git cl.

def issues(self):

Run git cl issue.

Returns: dict: Map between ref and issue number, e.g. {‘refs/heads/main’: ‘3402394’}.

def status(self, field: str=None, fast: bool=False, issue: str=None, **kwargs):

Run git cl status with given arguments.

Args: field: Set --field to this value. fast: Set --fast. issue: Set --issue to this value. kwargs: Passed to recipe_engine/step. May NOT set stdout.

Returns: The command output.

def upload(self, topic: Optional[str]=None, reviewers: Optional[List[str]]=None, ccs: Optional[List[str]]=None, hashtags: Optional[List[str]]=None, send_mail: bool=False, target_branch: Optional[str]=None, dry_run: bool=False, use_local_diff: bool=False, message: Optional[str]=None, **kwargs):

Run git cl upload.

--force and --bypass-hooks are always set to remove the need to enter confirmations and address nits.

Args: topic: --topic to set. reviewers: list of --reviewers to set. ccs: list of --cc to set. hashtags: list of --hashtags to set. send_mail: If true, set --send-mail. target_branch: --target-branch to send to. Needs to be a full ref (e.g. refs/heads/branch), not the branch name (e.g. branch). kwargs: Forwarded to recipe_engine/step. May NOT set stdout. dry_run: If true, set --cq-dry-run. use_local_diff: If true, use git diff args to upload the local diff instead of diff taken against tip-of-branch. message: Message for patchset. (-m)

Returns: The command output.
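
As an illustration, a recipe might upload a locally committed change like this (all values are hypothetical; as noted above, --force and --bypass-hooks are always added):

result = api.git_cl.upload(
    reviewers=['reviewer@example.com'],
    hashtags=['automated-uprev'],
    send_mail=True,
    target_branch='refs/heads/main',  # must be a full ref, not a branch name
    message='Automated uprev')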

recipe_modules / git_footers

DEPS: gerrit, depot_tools/depot_tools, recipe_engine/raw_io, recipe_engine/step

API wrapping the git_footers script.

class GitFootersApi(RecipeApi):

A module for calling git_footers.

def __call__(self, *args, **kwargs):

Call git_footers.py with the given args.

Args: args: Arguments for git_footers.py kwargs: Keyword arguments for python call.

Returns: list[str]: All matching footer values, or None

def edit_add_change_description(self, change_message, footer, footer_text):

Edit or add the given footer to the change_message.

Args: change_message (str): The gerrit change message. footer (str): The name of the footer, e.g. “Cq-Depends” footer_text (str): The value of the footer. If footer_text starts with "{footer}: ", that prefix will be ignored.

Returns: str: Modified change_message.
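
A sketch of a typical call, with a hypothetical change message and footer value; per the Args above, a leading “Cq-Depends: ” prefix on footer_text would be stripped:

new_message = api.git_footers.edit_add_change_description(
    change_message,     # e.g. a commit message fetched from Gerrit
    'Cq-Depends',
    'chromium:12345')   # hypothetical footer value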

def from_gerrit_change(self, gerrit_change, key=None, memoize=True, **kwargs):

Return the footer value(s) in the commit message for the given key.

Args: gerrit_change (GerritChange): The change of interest. key (str): The footer key to look for. If not set, returns all footers found in the Gerrit change message. Note that if this parameter is set, it is EXCLUDED from the returned footer string(s). If it is not set, the footers are formatted as ‘key: value’. memoize (bool): Should we memoize the call (default: True).

Returns: list[str]: The footer value(s) found in the commit message.

def from_message(self, message, key=None, **kwargs):

Return the footer value(s) in the commit message for the given key.

Args: message (str): The git commit message. key (str): The footer key to look for. If not set, returns all footers found in the message. Note that if this parameter is set, it is EXCLUDED from the returned footer string(s). If it is not set, the footers are formatted as ‘key: value’.

Returns: list[str]: The footer value(s) found in the commit message.

def from_ref(self, ref, key=None, **kwargs):

Return the footer value(s) in the given ref for the given key.

Args: ref (str): The git ref. key (str): The footer key to look for. See from_message docstring.

Returns: list[str]: The footer value(s) found in the ref's commit message.

def get_footer_values(self, gerrit_changes: List[common_pb2.GerritChange], key: str, **kwargs):

Gets a list of values from a footer.

Fetches the named footer from the gerrit changes, and returns a set of all of the (comma-separated) values found.

Args: gerrit_changes: Gerrit changes applied to this run. key: The footer name (key) to fetch. kwargs: Other keyword arguments, passed to git_footers.from_gerrit_change.

Returns: values: A set of values found for the given key. May be empty.

def position_num(self, ref, test_position_num=None, **kwargs):

Return the footer value for Cr-Commit-Position.

Args: ref (str): The git ref. test_position_num (int): The test value. step_test_data, if given, will override this. **kwargs (dict): keyword parameters for self.call()

Returns: list[str]: The position number for the ref.

def position_ref(self, ref, test_position_ref=None, **kwargs):

Return the footer ref for Cr-Commit-Position.

Args: ref (str): The git ref. test_position_ref (int): The test value. step_test_data, if given, will override this. **kwargs (dict): keyword parameters for self.call()

Returns: str: The position ref for the ref.

recipe_modules / git_txn

DEPS: gerrit, git, repo, recipe_engine/file, recipe_engine/step

API for updating remote git repositories transactionally.

class GitTxnApi(RecipeApi):

A module for executing git transactions.

def update_ref(self, remote, update_callback, step_name=‘update ref’, ref=None, dry_run=False, automerge=False, retries=3):

Transactionally update a remote git repository ref.

|update_callback| will be called and should update the checked out HEAD by e.g. committing a new change. Then this new HEAD will be pushed back to the |remote| |ref|. If this push fails because the remote ref was modified in the meantime, and automerge is off, the new ref is fetched and checked out, and the process will repeat up to |retries| times.

The common case is that there’s no issue updating the ref, so we don’t do a fetch and checkout before attempting to update. This means that the function assumes that the repo is already checked out to the target ref.

This step expects to be run with cwd inside a git repo.

Args: remote (str): The remote repository to update. update_callback (callable): The callback function that will update the local repo’s HEAD. The callback is passed no arguments. If the callback returns False the update will be cancelled but succeed. step_name (str): Step name to be displayed in the logs. ref (str): The remote ref to update. If it does not start with ‘refs/’ it will be treated as a branch name. If not specified, the HEAD ref for the remote of the current repo project will be used. dry_run (bool): If set, pass --dry-run to git push. automerge (bool): Whether to use Gerrit’s “auto-merge” feature. retries (int): Number of update attempts to make before failing.

Returns: bool: True if the transaction succeeded, false if it explicitly aborts.
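
A minimal sketch of the intended call pattern, assuming the cwd is already a checkout of the target ref (the remote URL and callback body are hypothetical):

def RunSteps(api):
  def _commit_update():
    # ... modify files and commit them onto the local HEAD here ...
    return True  # returning False cancels the update, which still succeeds

  api.git_txn.update_ref(
      'https://chromium.googlesource.com/example',  # hypothetical remote
      _commit_update,
      ref='refs/heads/main',
      retries=3)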

def update_ref_write_files(self, remote: str, message: str, changes: List[Tuple[(Path, str)]], automerge: bool=False, ref: Optional[str]=None):

Transactionally update files in a remote git repository ref.

See ‘self.update_ref’. Instead of running a callback, this will attempt to update the contents of files.

Args: remote: The remote repository to update. message: The commit message to use. changes: List of the tuple of the file paths and the contents to write. automerge: Whether to use Gerrit's “auto-merge” feature. ref: The remote ref to update. If it does not start with ‘refs/’ it will be treated as a branch name. If not specified, the HEAD ref for the remote of the current repo project will be used.

Returns: bool: True if one or more files changed and the transaction succeeded, false if no file changed and no transaction took place.

Raises: TooManyAttempts: if the number of attempts exceeds |retries|.
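
The file-writing variant wraps the same transaction; a sketch with hypothetical paths and contents:

changed = api.git_txn.update_ref_write_files(
    'https://chromium.googlesource.com/example',  # hypothetical remote
    'Update pinned version',
    [(api.path.start_dir / 'VERSION', '2.0\n')],  # (path, contents) tuples
    ref='refs/heads/main')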

recipe_modules / gitiles

DEPS: easy, support, recipe_engine/path, recipe_engine/step

APIs for working with Gitiles.

class GitilesApi(RecipeApi):

A module for Gitiles helpers.

def fetch_revision(self, host, project, branch, test_output_data=None):

Call gitiles-fetch-ref support tool.

Args: host (str): Gerrit host, e.g. ‘chrome-internal’. project (str): Gerrit project, e.g. ‘chromiumos/chromite’. branch (str): Gerrit branch, e.g. ‘main’. test_output_data (dict): Test output for gitiles-fetch-ref.

Returns: str: The current revision hash of the specified branch.

def file_url(self, commit, file_path=None):

Return the url for a file in a GitilesCommit.

Args: commit (GitilesCommit): The gitiles commit to use. file_path (str): The file path to append, if any.

Returns: (str) The url for the file.

def get_file(self, host, project, path, ref=None, public=True, credential_cookie_location=None, test_output_data=None):

Return the contents of a file hosted on Gitiles.

curl will often return a zero exit status as long as the server responded, even if the response isn't what you expected. When the request succeeds the server returns base64, so failing to decode the response is a good indication that something is wrong.

Args: host (str): Gerrit host, e.g. chrome-internal.googlesource.com. project (str): Gerrit project, e.g. chromiumos/chromite. path (str): The path to the file, e.g. api/controller/something.py. ref (str): The ref to return the file from, default: HEAD. public (bool): If False, will look in .git-credential-cache for an authorization cookie and use it in the curl. Default: True. credential_cookie_location (str): The credential cookie location. Default: ‘~/.git-credential-cache/cookie’. test_output_data (str): Test output for curl.

Returns: (str) The contents of the file as a string or raise StepFailure on unexpected curl return.
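
A sketch using the example values given in the Args above; public=False is shown because the example host is internal:

contents = api.gitiles.get_file(
    'chrome-internal.googlesource.com',
    'chromiumos/chromite',
    'api/controller/something.py',
    ref='HEAD',
    public=False)  # use the credential cookie for the internal host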

def repo_url(self, commit):

Return the url for the repo in a GitilesCommit.

Args: commit (GitilesCommit): The gitiles commit to use.

Returns: (str) The url for the repo.

recipe_modules / gobin

DEPS: git, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

API for interacting with Go binaries built from infra/infra.

class GobinAPI(RecipeApi):

Module for interacting with Go binaries built from infra/infra.

def call(self, package: str, cmd: List[str], step_name: str=None, **kwargs):

Call a binary with the given args.

def ensure_package(self, package: str):

Ensure that the specified package is installed.

Looks up the instance associated with the infra/infra commit stored in infrainfra-golang.version.

Args: package: The package to ensure.

Returns: The path to the relevant cipd binary.

def get_latest_pin_value(self, current_pin: str):

Returns the most recent infra/infra SHA that is a viable pin.

Specifically, returns the latest SHA for which there is a CIPD instance for each of SUPPORTED_PACKAGES.

def mirror_prod_refs_to_latest(self):

Mirror the ‘prod’ ref to the ‘latest’ ref for PROD_MIRRORED_PACKAGES.

See the comment on PROD_MIRRORED_PACKAGES for context on why this is needed for some packages.

@property
def supported_packages(self):

Return the golang packages supported by this module.

recipe_modules / golucibin

DEPS: recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step

API to call into Go luci binaries.

class GoLuciBinAPI(RecipeApi):

Module for running Go luci binaries.

def ensure_package(self, package: str, cipdLabel: str):

def execute_luciexe(self, package: str, cipdLabel: str, args: List[str]=None, runningAsync: bool=False):

Execute go luciexe binary.

recipe_modules / goma

DEPS: support, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

API for working with goma.

class GomaApi(RecipeApi):

A module for working with goma.

@property
def default_bqupload_dir(self):

@property
def goma_approach(self):

@property
def goma_dir(self):

Lazily fetches the goma client and returns its path.

def initialize(self, also_bq_upload=False):

def process_artifacts(self, install_pkg_response: sysroot_pb2.InstallPackagesResponse, goma_log_dir: str, build_target_name: str, is_staging: bool=False):

Process goma artifacts, uploading to gsutil if they exist.

Args: install_pkg_response: May contain goma artifacts. goma_log_dir: Log directory that contains the goma artifacts. build_target_name: Build target string. is_staging: If being run in staging environment instead of prod.

Returns: Tuple containing the GS bucket and path used to write log files. None is returned if there were no artifacts to process.

recipe_modules / greenness

DEPS: buildbucket_stats, cros_infra_config, cros_tags, easy, failures, git, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/step

API providing a menu for calculating the greenness metric.

class GreennessApi(RecipeApi):

A module to calculate the greenness metric.

@property
def builder_greenness_dict(self):

def get_local_build_greenness(self, is_bazel: bool):

Returns a filtered list of greenness tuples for local builds.

It is assumed that builder_greenness_dict is prepopulated (i.e. update_build_info was previously called); otherwise, an empty list will be returned.

def is_green_for_local(self, is_bazel: bool=False):

Returns whether the current snapshot is green for local builds.

If there are irrelevant builders for the current snapshot, the greenness score from the last relevant build is used. It is assumed that builder_greenness_dict is prepopulated (i.e. update_build_info was previously called); otherwise, a false positive will be returned.

def print_step(self):

Print comprehensive greenness info in a step.

def publish_step(self):

Publish greenness to output properties.

def update_build_info(self, builds: List[build_pb2.Build]):

Update greenness with build information.

If there are any irrelevant builders, this method finds the last snapshot and propagates build greenness forward. Note that the last snapshot may not have completed, and thus test greenness may not get propagated.

Args: builds: List of builds that have completed.

def update_hwtest_info(self, results: List[SkylabResult]):

Update greenness with HW test information.

Args: results: Results of the HW test runs.

def update_irrelevant_scores(self):

Update scores in the greenness dict for irrelevant builds.

Build scores are propagated forward for irrelevant builds when builds are added in update_build_info. update_build_info does not wait for the previous snapshot orchestrator to complete, and thus test scores may not be propagated forward. This function waits for the previous snapshot orchestrator to complete and propagates test scores forward.

def update_vmtest_info(self, results: List[build_pb2.Build]):

Update greenness with VM test information.

Args: results: Builds of the VM test runs.

recipe_modules / gs_step_logging

DEPS: urls, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/step

APIs for logging step output to Google Storage.

class GSStepLoggingApi(RecipeApi):

A module for logging step output to Google Storage.

@contextlib.contextmanager
def log_step_to_gs(self, gs_prefix):

Returns a context that logs stdout of the final step to GS.

Note that only the final step is logged (i.e. the step.active_result as the context exits).

Args: gs_prefix (str): Prefix for the logged GS objects. Should contain the bucket name, but not the ‘gs://’ prefix, e.g. ‘some-bucket/logging’. If None, nothing is logged.
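
A sketch of wrapping a step so that its stdout is uploaded (the bucket prefix and step are hypothetical):

with api.gs_step_logging.log_step_to_gs('some-bucket/logging'):
  api.step('noisy build step', ['make', 'all'])
# Only the final step inside the context (step.active_result) is logged.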

recipe_modules / image_builder_failures

DEPS: cros_infra_config, cros_tags, failures, naming, src_state, urls, recipe_engine/buildbucket, recipe_engine/step

API for raising image builder failures and presenting them.

class ImageBuilderFailuresApi(RecipeApi):

A module for presenting errors and raising StepFailures.

@property
def package_failures(self):

def raise_failed_image_tests(self, failed_images):

Display failed image tests and raise a failure.

Displays the images that failed tests and raises a failure if there are failed image tests. If there are no failed image tests, a success message is output.

Args: failed_images: (list[chromite.image.Image]): The images that failed tests.

Raises: StepFailure: If failed_images is not empty.

def set_compile_failed_packages(self, enclosing_step: Step, packages: List[Tuple[(common_pb2.PackageInfo, str)]], cl_affected_packages: Optional[List[common_pb2.PackageInfo]]=None):

If any packages failed compilation, set presentation and raise a failure.

Args: enclosing_step: The enclosing step to mutate. packages: The failed packages. cl_affected_packages: The packages which are affected by the changes under test.

Raises: StepFailure: If failed_packages is not empty.

def set_test_failed_packages(self, enclosing_step: Step, packages: List[Tuple[(common_pb2.PackageInfo, str)]], cl_affected_packages: Optional[List[common_pb2.PackageInfo]]=None):

If any packages failed unit tests, set presentation and raise a failure.

Args: enclosing_step: The enclosing step to mutate. packages: The failed packages. cl_affected_packages: The packages which are affected by the changes under test.

Raises: StepFailure: If failed_packages is not empty.

recipe_modules / incremental

DEPS: build_menu, cros_sdk, cros_source, git, repo, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

Recipe module to perform a build on an old checkout state.

class IncrementalApi(RecipeApi):

Module for performing builds on old checkout state.

def DoOldBuild(self, api: RecipeApi, config: BuilderConfig, properties: IncrementalProperties):

Rewind the checkout, install packages, and then forward the checkout.

Args: api: The recipe API. config: The BuilderConfig for this incremental builder. properties: Input properties for this build.

recipe_modules / ipc

DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

A module for inter-process communication.

class IPCApi(RecipeApi):

def initialize(self):

def make_subscription(self, topic: str, sub_name: str):

Create a subscription within a topic.

Args: topic: Pubsub topic name. sub_name: Pubsub subscription name.

def receive(self, topic: str, sub_name: str, filter_attributes: Dict[(str, str)]=None):

Receive one message from the filtered subscription specified.

Args: topic: Pubsub topic name. sub_name: Pubsub subscription name. filter_attributes: dict encoding a ‘subtopic’; messages which do not include the required attributes will be acknowledged, but the message body will be ignored.

Returns: The message body.

def send(self, topic: str, message_body: bytes, attributes: Optional[Dict[(str, str)]]=None):

Send a pubsub message on the given topic.

Args: topic: Pubsub topic name. message_body: Message to send. attributes: dict encoding a ‘subtopic’; subscribers will take no action on messages outside their subtopic.
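
A sketch of a round trip through a subtopic, with hypothetical topic and subscription names:

api.ipc.make_subscription('builder-events', 'demo-sub')
api.ipc.send('builder-events', b'hello', attributes={'kind': 'demo'})
# Messages that lack kind='demo' are acknowledged but their bodies ignored.
body = api.ipc.receive('builder-events', 'demo-sub',
                       filter_attributes={'kind': 'demo'})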

recipe_modules / iterutils

class IterutilsApi(RecipeApi):

Utility functions for working with iterables.

def get_one(self, iterable, predicate, error_msg):

Returns the one item from iterable matching predicate.

Raises: A ValueError with error_msg if iterable doesn't have exactly one item matching predicate.
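
For example (the builds list, predicate, and message are illustrative):

build = api.iterutils.get_one(
    builds,
    lambda b: b.builder.builder == 'amd64-generic-snapshot',
    'expected exactly one snapshot build')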

recipe_modules / key_value_store

DEPS: recipe_engine/step

Module to interact with key-value store files.

Key-value stores are used commonly by Chromite, including some locations in our codebase and in our Google Storage files.

Key-value store files comprise a series of key="value" pairs, each on distinct lines. Values can span multiple lines; they end when they reach a line that ends with the same quote character (' or ") that started the value. Lines may be blank. Comments are lines beginning with “#”. Any line may also begin with whitespace, and there may be whitespace surrounding the “=”.

Below is a sample valid key-value store.

# Copyright 2023 The ChromiumOS Authors
# Etc etc etc
simple_value_1="hello"
    simple_value_2   =      'hello'
simple_value_3 = "I contain an internal quote (")!"

multiline_1 = "Hello,
world!"

multiline_2 = "Mismatched end quote: '
That didn't end the value because it didn't match the starting quote."

multiline_3 = "Check this one out...
not_really_a_value = 'Did I fool you?'
The above line wasn't parsed because it's part of a value."

Note: If you’re designing a new data store, please use JSON rather than this format. This library is designed to work with legacy/external files where JSON isn’t an option.

class KeyValueStoreApi(RecipeApi):

def parse_contents(self, contents: str, source: str=''):

Return the contents of a key-value store interpreted as a dict.

Args: contents: The complete contents of the key-value store. source: Optional string describing where the contents came from. If provided, this will be included in the step name.

Returns: A dictionary of {key: value} containing the key-values from contents, in the order that the keys were found.

def update_one_value(self, original_contents: str, key: str, new_value: str, append_if_missing: bool=False):

Update a single value in the contents of a key-value store.

Right now, this function will not work if the existing value spans multiple lines. Implement that if it becomes necessary.

Other lines, such as comments and newlines, will be preserved.

Args: original_contents: The complete contents of a key-value store file. key: The key whose value will be updated. new_value: The new value to set for the key. append_if_missing: If True and the key is not in original_contents, then the key and value will be appended to the file. If False and the key is not in original_contents, then an exception will be raised.

Returns: A new string containing the contents of an updated key-value store, with the key set to the new value.

Raises: StepFailure: If append_if_missing is False and the key is not found. StepFailure: If the key is assigned multiple times in original_contents. InfraFailure: If the key cannot be wrapped in single or double quotes. InfraFailure: If the key's value in original_contents is multiline. If you ever see this failure mode in production, consider implementing multiline support!
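
A sketch of parsing and updating a store, assuming contents like the sample above:

values = api.key_value_store.parse_contents(contents, source='sample store')
# values['simple_value_1'] == 'hello'
updated = api.key_value_store.update_one_value(
    contents, 'simple_value_1', 'goodbye', append_if_missing=False)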

recipe_modules / labpack

DEPS: cros_tags, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/step

This is the labpack API.

It downloads the labpack CIPD executable to a shared area and manages access to it.

class LabpackCommand(RecipeApi):

Labpack command is a singleton whose methods invoke the labpack CIPD executable.

Labpack has the following public attributes:

  • cipd_label

  • cipd_package

  • downloaded_executable_path: config_types.Path

@staticmethod
def convert_step_data_to_status(step_data, dut_state):

Utility method to convert step data into a status like “ready”.

def ensure_labpack(self):

Ensures that the labpack executable exists.

We create a cipd package area inside the cleanup directory, add labpack to the manifest file, and then ensure the resulting manifest.

Args: No arguments

Returns: Dictionary

def execute_ile_de_france(self, common_config, dut_state, models=None, hostnames=None):

Run Ile-de-France, if applicable.

Args:

  • common_config: the test runner properties
  • dut_state: the incoming dut state
  • models: the models in question
  • hostnames: the hostnames in question

Returns:

  • the outgoing dut_state

def get_augmented_build(self):

Return a build augmented with additional fields.

def get_build(self):

get_build gets a copy of the input build.

def get_cipd_executable_name(self):

get_cipd_executable_name gets the executable name from the CIPD path.

def get_cipd_path(self):

Get the path of the cipd package.

def get_dut_name(self):

Get the DUT name from the Swarming bot dimensions.

@staticmethod
def get_ufs_host():

@staticmethod
def get_use_ile_de_france(models, common_config):

Whether to use Ile-de-France or not.

Args:

  • models: a list of models
  • common_config: the common config

Returns: bool, whether to use Ile-de-France or not.

def has_downloaded_package(self):

def run_labpack(self, labpack_input: LabpackInput, only_run_once=False, **kwargs):

Run labpack command.

The kwargs are sent along without modification to easy.step.__call__. Note that the most important miscellaneous arg is “timeout”.

Args: labpack_input: a LabpackInput instance. only_run_once: whether to only run once or not. kwargs: the remaining keyword arguments, handed to sub_build.

Returns: see step.call or None

recipe_modules / lfg_util

DEPS: cros_source, easy, failures, gerrit, git_footers, depot_tools/gerrit, recipe_engine/step

Utility functions for looks for green.

class LFGUtilApi(RecipeApi):

A module for util functions associated with LFG.

def get_depended_cls(self, gerrit_changes: List[GerritChange]):

Gets the depended CLs for the given list of changes.

Note that the depended CLs returned are ones that are not included in the CQ run for the given list of changes.

Args: gerrit_changes: Gerrit changes to analyze.

Returns: List of depended Gerrit changes.

def get_parent_changes(self, merge_commit_cls: List[GerritChange], step_test_data=None):

Retrieve the changes corresponding to the parent commits.

Args: merge_commit_cls: merge commit changes to find parents of.

Returns: submitted changes corresponding to the parent commits of input changes.

def json_to_gerritchanges(self, changes_dict: Dict):

Create GerritChange objects from JSON objects returned from GerritAPI.

Args: changes_dict: A map from each gerrit_change in this run to the changes related to it on the stack.

Returns: A list of the related changes in proto format.

def latest_submission_time(self, gerrit_changes: List[GerritChange], step_test_data=None):

Find the last submitted change and return the submit time.

Args: gerrit_changes: Gerrit changes to analyze.

Returns: Submit time of the last submitted change among the inputs, or None if any of them has not been submitted yet.

def not_included_cq_depend_cls(self, gerrit_changes: List[GerritChange]):

Find the CQ-Depended changes that are not included in this run.

Args: gerrit_changes: Gerrit changes included in this CQ run.

Returns: List of changes that input CLs CQ-depend on but are not included in this run.

recipe_modules / looks_for_green

DEPS: buildbucket_stats, cros_history, cros_infra_config, easy, failures, gerrit, git_footers, greenness, lfg_util, naming, depot_tools/gerrit, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/step, recipe_engine/time

Functions implementing looks for green.

class LooksForGreenApi(RecipeApi):

A module to look for green snapshots.

def calc_approx_snap_age_hours(self, orch_start_time: datetime.datetime):

Returns how many hours ago the latest scored snap-orch started.

This is used as an approximation of snapshot manifest age since a snapshot-orchestrator run starts within ~30 minutes of snapshot creation.

Returns: Approx age in hours of snapshot used by latest scored snap-orch.

def find_green_snapshot(self, latest_start: Optional[timestamp_pb2.Timestamp]=None, bucket: Optional[str]=None, builder: Optional[str]=None, requested_snapshot_builders: Optional[List[str]]=None, lookback_hours: Optional[float]=None):

Find a green snapshot within the lookback period if one exists.

Optionally specify a latest_start time in UTC for builds.

Args: latest_start: The latest start time of snapshot orchestrators to consider, or the current time if not set. bucket: If specified, the bucket to search in. Defaults to self._greenness_bucket. builder: If specified, the builder to search for. Defaults to self._greenness_builder. requested_snapshot_builders: Optional, a list of builders to consider when aggregating greenness. lookback_hours: Optional, the amount of time to look back for a green snapshot.

Returns: A green snapshot, if one was found.

def found_disallow_lfg_footer(self, gerrit_changes: List[common_pb2.GerritChange]):

Check the incoming gerrit changes for disallow looks for green footer.

Args: gerrit_changes: The gerrit changes.

Returns: Whether the disallow LFG footer is included and not set to false.

def get_latest_snapshot_greenness(self, bucket: Optional[str]=None, builder: Optional[str]=None):

Returns the latest scored Snapshot, or None if no scored snapshot is found.

If the latest snapshot-orchestrator has just started, we won't have greenness yet, so look back at the most recent snapshot-orchestrator that does have greenness populated.

Args: bucket: If specified, the bucket to search in. Defaults to self._greenness_bucket. builder: If specified, the builder to search for. Defaults to self._greenness_builder.

Returns: Snapshot from the latest scored snapshot-orchestrator, or None if not found.

def get_requested_snapshot_builders(self, necessary_builders: List[str]):

Gets the requested snapshot builders for find_green_snapshot.

Returns: The names of the snapshot equivalents of the builders in necessary_builders if all necessary_builders have a corresponding snapshot builder. Otherwise, return None.

@lookback_hours.setter
def lookback_hours(self, lookback_hours):

@property
def now_utc(self):

Returns the current UTC time.

Initialized once and used throughout for any time calculations. Zero out the microseconds to use seconds as level of precision.

def resize_lfg_lookback(self, gerrit_changes: List[GerritChange], builders_to_be_scheduled: List[str], lookback_hours: Optional[float]=None):

Change LFG lookback based on the given parameters.

Lookback can be resized based on depended changes and/or broken_until.

Args: gerrit_changes: Gerrit changes to analyze. builders_to_be_scheduled: List of builders that need to be scheduled. lookback_hours: If specified, use the specified value when finding the minimum lookback hours. Only used by the auto-retrier.

Returns: The resized lookback hours.

@property
def seconds_utc(self):

Returns the current UTC time in seconds.

Initialized once and used throughout for any time calculations. Cast to int to use seconds as level of precision.

def set_stats(self):

Sets the LFG output property based on latest info.

def should_lfg(self, gerrit_changes: List[GerritChange]):

Returns whether looks for green logic should be run.

@property
def stats(self):

Returns looks for green stats.

recipe_modules / mass_deploy

DEPS: build_menu, recipe_engine/buildbucket, recipe_engine/step

An API for triggering the mass deploy builder.

class MassDeployApi(RecipeApi):

def run_mass_deploy_generation(self, signing_metadata):

Run the generation of the mass deployment image, but don't wait for it.

This assumes signed builds have already been generated.

recipe_modules / metadata

DEPS: cros_build_api, util, recipe_engine/file, recipe_engine/path

API to support metadata generation and wrangling.

class MetadataApi(RecipeApi):

A module with config and support methods for metadata.

Specifically, this supports the new class of metadata we're generating as part of a build, including, but not necessarily limited to:

  • container metadata
  • software metadata
  • hardware metadata
  • test metadata

def fetch_test_harness_metadata(self, chroot: Chroot, sysroot: Sysroot, mock_metadata_file: bool=True):

Fetch and return test harness metadata.

Args: chroot: Proto representing the chroot. sysroot: Proto representing the sysroot. mock_metadata_file: Whether the returned filepath should be mocked as existing during tests. This should always be True unless testing the case when a metadata file is not found.

Returns: A TestHarnessMetadataList containing the metadata of all harnesses, or None if the ArtifactsService/FetchTestHarnessMetadata endpoint is unavailable or all metadata files are empty.

def fetch_test_metadata(self, chroot: Chroot, sysroot: Sysroot, mock_metadata_file: bool=True):

Fetch and return test case metadata.

Args: chroot: Proto representing the chroot. sysroot: Proto representing the sysroot. mock_metadata_file: Whether the returned filepath should be mocked as existing during tests. This should always be True unless testing the case when a metadata file is not found.

Returns: A TestCaseMetadataList containing the metadata of all tests, or None if the ArtifactsService/FetchMetadata endpoint is unavailable or all metadata files are empty.

def gspath(self, metadata_info, gs_bucket=None, gs_path=None):

Return the full or relative path to a metadata payload, depending on whether bucket info is provided.

Args: metadata_info (MetadataInfo): Metadata config information gs_bucket (str): optional gs bucket gs_path (str): optional gs path

Returns: The relative GCS path for metadata if gs_bucket, gs_path not provided. Otherwise, returns the full GCS path to metadata.

recipe_modules / metadata_json

DEPS: cros_artifacts, cros_infra_config, cros_source, test_util, urls, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/led, recipe_engine/path, recipe_engine/step, recipe_engine/time

API to write metadata json file into GS for GoldenEye consumption.

class MetadataJsonApi(RecipeApi):

A module to write metadata.json into GS for GoldenEye consumption.

def add_default_entries(self):

These fields are available at the start of the build.

def add_entries(self, **kwargs):

Add elements to metadata.

Args: kwargs (dict): dictionary of key-values to update.

def add_stage_results(self):

Add stage results for DebugSymbols and Unittest stages.

def add_version_entries(self, version_dict):

Update metadata with version info.

Args: version_dict (dict): Map containing version info.

@contextlib.contextmanager
def context(self, config, targets=()):

Returns a context that uploads the final metadata.json to GS.

Args: config (BuilderConfig): builder config of this builder. targets (list[BuildTarget]): The build targets of this builder.
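
A sketch of the context manager in use (config, build_targets, and run_build come from the builder's setup and are hypothetical here):

with api.metadata_json.context(config, targets=build_targets):
  run_build(api)  # hypothetical: the build steps go here
# The final metadata.json is uploaded to GS when the context exits.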

def finalize_build(self, config, targets, success):

Finish the build stats and upload metadata.json.

Args: config (BuilderConfig): builder config of this builder. targets (list[BuildTarget]): The build targets of this builder. success (bool): Whether the build passed.

def get_metadata(self):

Get the metadata dict. Should only be used for unit testing.

Returns: dict, metadata info.

def upload_to_gs(self, config, targets, partial=False):

Upload metadata to GS at its current state.

Args: config (BuilderConfig): builder config of this builder. targets (list[BuildTarget]): The build targets of this builder. The first element should be the build_target for the build, the entire list is used for additional publication locations. partial (bool): whether the metadata is incomplete.

def write_to_file(self, filename):

Write metadata dict to a tempfile.

Args: filename (str): Filename to write to.

Returns: str, path to the file written.

recipe_modules / naming

API featuring shared helpers for naming things.

class NamingApi(RecipeApi):

A module with helpers for naming things.

@staticmethod
def get_build_title(build: build_pb2.Build):

Get a string to describe the build.

Args: build: The build to describe.

Returns: A string describing the build.

@staticmethod
def get_commit_title(commit: Commit):

Get a string to describe the commit.

This is typically the first line of the commit message.

Args: commit: The commit in question. See recipe_modules/git/api.py

Returns: The commit title.

@staticmethod
def get_generation_request_title(req):

Get a presentation name for a single GenerationRequest.

Args: req (dict): Dict representing a GenerationRequest proto, containing a single payload to be created.

Returns: A string providing helpful info about that payload.

@staticmethod
def get_hw_test_title(hw_test: HwTestCfg.HwTest):

Get a string to describe the HW test.

Args: hw_test: The HW test in question.

Returns: The HW test title.

@staticmethod
def get_package_title(package: common_pb2.PackageInfo):

Get a string to describe the package.

Args: package: The package in question.

Returns: The package title.

@staticmethod
def get_paygen_build_title(build_id: int, paygen_request_dicts: List[Dict]):

Get a presentation name for a build running a batch of PaygenRequests.

Args: build_id (int): The ID of the Paygen build being run. paygen_request_dicts (List[dict]): Dicts representing a batch of PaygenRequests being run by a single Paygen builder.

Returns: A string providing helpful info about the paygens being run.

def get_skylab_result_title(self, skylab_result: SkylabResult):

Get a string to describe the HW test.

Args: skylab_result: The Skylab result in question.

Returns: The HW test title.

def get_skylab_task_title(self, skylab_task: SkylabTask):

Get a string to describe the Skylab task.

Args: skylab_task: The Skylab task in question.

Returns: The Skylab task title.

@staticmethod
def get_snapshot_builder_name(cq_builder_name: str):

Converts a cq builder name to matching snapshot builder name.

For example, ‘amd64-generic-cq’ -> ‘amd64-generic-snapshot’ or ‘amd64-generic-slim-cq’ -> ‘amd64-generic-snapshot’.

def get_test_title(self, test: Union[(SkylabResult, build_pb2.Build)]):

Get a string to describe the test.

Args: test: The test in question.

Returns: A str describing the test.

recipe_modules / observability_image_size

DEPS: cloud_pubsub, cros_build_api, cros_version, easy, src_state, recipe_engine/buildbucket, recipe_engine/step

API for retrieving image and package size data.

class ObservabilityImageSizeApi(RecipeApi):

Collect image size data.

def publish(self, config, build_target, target_versions, built_images, chroot):

Collect and publish the image size data.

recipe_modules / orch_menu

DEPS: bot_cost, build_menu, build_plan, checkpoint, conductor, cros_artifacts, cros_cq_additional_tests, cros_history, cros_infra_config, cros_lkgm, cros_release, cros_source, cros_tags, cros_test_plan, cros_test_plan_v2, cros_test_proctor, cros_version, easy, failures, failures_util, gerrit, git, git_footers, gitiles, gobin, greenness, looks_for_green, metadata, naming, skylab, skylab_results, src_state, test_util, workspace_util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cv, recipe_engine/futures, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

API providing a menu for orchestrator steps.

class OrchMenuApi(RecipeApi):

A module with steps used by orchestrators.

Orchestrators do not call other recipe modules directly: they always get there via this module, and are a simple sequence of steps.

def add_child_info_to_output_property(self, relevant_child_builder_names: List[str]=None):

Add child information to output property of current build.

Args: relevant_child_builder_names: List of relevant child builders.

def aggregate_metadata(self, child_builds):

Aggregate metadata payloads from children.

Pull metadata message of each type from children and merge the messages together. Upload the resulting message as our own metadata.

Args: child_builds ([BuildStatus]): BuildStatus instances for child builds

Returns: (ContainerMetadata): Aggregated container metadata

@property
def builds_status(self):

def categorize_builds_by_collect_handling(self, child_specs: List[BuilderConfig.Orchestrator.ChildSpec], builds: List[build_pb2.Build]):

Group builds by CollectHandling value.

Args: child_specs: The list of ChildSpecs used for build planning. builds: The list of builds spawned by this CQ run.

Returns: A dict mapping CollectHandling to the list of builds which fall into that category.

@property
def chrome_module_child_props(self):

@property
def chromium_src_ref_cl_tag(self):

@property
def config(self):

def cq_relevant(self, build: build_pb2.Build):

Whether the CQ child build was critical and relevant.

Args: build: The child build.

def create_cq_orch_recipe_result(self):

Create the correct return value for RunSteps.

def create_recipe_result(self, include_build_details: bool=False, ignore_build_test_failures: bool=False):

Create the correct return value for RunSteps.

Args: include_build_details: If True augment RawResults.summary_markdown with additional details about the build for both successes and failures. ignore_build_test_failures: If True, we will still produce a summary of failures if present, but we will not set the build status to FAILURE.

Returns: (recipe_engine.result_pb2.RawResult) The return value for RunSteps.

@property
def external_gitiles_commit(self):

@property
def gerrit_changes(self):

@property
def gitiles_commit(self):

def initialize(self):

@property
def is_dry_run(self):

@property
def is_factory_orchestrator(self):

@property
def is_public_orchestrator(self):

@property
def is_release_orchestrator(self):

@property
def is_snapshot_orchestrator(self):

def plan_and_run_children(self, run_step_name=None, results_step_name=None, extra_child_props=None):

Plan, schedule, and run child builders.

Args: run_step_name (str): Name for “run builds” step, or None. results_step_name (str): Name for “check build results” step, or None. extra_child_props (dict): If set, extra properties to append to the child builder requests.

Returns: (BuildsStatus): The current status of the builds.

def plan_and_run_tests(self, testable_builds: Optional[List[build_pb2.Build]]=None, ignore_gerrit_changes: bool=False):

Plan, schedule, and run tests.

Run tests on the testable_builds identified by plan_and_run_children.

Args: testable_builds: The list of builds to consider or None to use the current results. ignore_gerrit_changes: Whether to drop gerrit changes from the test plan request, primarily used for tryjobs (which are release builds and thus shouldn't test based on any patches applied).

Returns: BuildsStatus updated with any test failures.

def plan_and_wait_for_images(self, run_step_name: Optional[str]=None, extra_child_props: Optional[Dict[(str, Any)]]=None):

Plan and schedule children, and wait until they have produced images.

Args: run_step_name: Name for “run builds” step, or None. extra_child_props: If set, extra properties to append to the child builder requests.

Returns: A list of builds that have produced images and are ready for testing.

@property
def relevant_child_builder_names(self):

def run_follow_on_orchestrator(self, check_failures=False):

Run the follow_on_orchestrator, if any. Wait if necessary.

def schedule_wait_build(self, builder, await_completion=False, properties=None, check_failures=False, step_name=None, timeout_sec=None):

Schedule a builder, and optionally await completion.

Args: builder (str): The name of the builder: one of project/bucket/builder, bucket/builder, or builder. await_completion (bool): Whether to await completion. properties (dict): Dictionary of input properties for the builder. check_failures (bool): Whether or not failures accumulate in builds_status. This is only used if await_completion is True. step_name (str): Name for the step, or None. timeout_sec (int): Timeout for the builder, in seconds.

Returns: (Build): The build that was scheduled, and possibly waited for.
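
A sketch of scheduling a child build and waiting for it (the builder name and properties are hypothetical):

build = api.orch_menu.schedule_wait_build(
    'chromeos/staging/staging-build',  # project/bucket/builder
    await_completion=True,
    properties={'some_input_prop': True},
    check_failures=True,
    timeout_sec=3600)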

@contextlib.contextmanager
def setup_cq_orchestrator(self):

Initial setup steps for the cq-orchestrator.

This context manager returns with all of the contexts that the orchestrator needs to have when it runs, for cleanup to happen properly.

If appropriate, any inflight orchestrator has finished before we return.

Raises: StepFailure if no config is found.

@contextlib.contextmanager
def setup_orchestrator(self):

Initial setup steps for the orchestrator.

This context manager returns with all of the contexts that the orchestrator needs to have when it runs, for cleanup to happen properly.

If appropriate, any inflight orchestrator has finished before we return.

Raises: StepFailure if no config is found.

Returns: BuilderConfig or None, with an active context.

@property
def skip_paygen(self):

recipe_modules / overlayfs

DEPS: easy, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step

API for working with OverlayFS mounts (the Linux ‘overlay’ filesystem).

See: https://www.kernel.org/doc/Documentation/filesystems/overlayfs.txt

class OverlayfsApi(RecipeApi):

A module for interacting with OverlayFS mounts.

def __init__(self, props, *args, **kwargs):

Initialize OverlayfsApi.

@contextlib.contextmanager
def cleanup_context(self):

Returns a context that cleans up any overlayfs mounts created in it.

Upon exiting the context manager, each mounted overlay is then iterated through and unmounted.

def cleanup_overlay_directories(self, cache_name):

Remove the upper and work directories to reset a named cache mount.

Resets the status of an overlayfs mount by removing both the work and upper directories. This is typically used if the status of the lower directory changes.

Args: cache_name (str): Name of the named cache to cleanup.

def mount(self, name, lowerdir_path, mount_path, persist=False):

Mount an OverlayFS.

As an overlay is mounted, the overlay is then added to the stack that is used by the context manager to unmount as the task ends.

Args:

  • name (str): An alphanumeric name for the mount, used for display and implementation details. Underscores are allowed. Should usually be unique within a recipe.
  • lowerdir_path (Path): Path to the OverlayFS “lowerdir”. See mount(8) “Mount options for overlay”.
  • mount_path (Path): Path to mount the OverlayFS at. Will be created if it doesn't exist.
  • persist (bool): Whether to persist the mount beyond one execution.

def unmount(self, name, mount_path):

Unmount an OverlayFS.

As an overlay is unmounted, the overlay is then removed from the stack that is used by the context manager to unmount as the task ends.

Args:

  • name (str): The name used for |mount|.
  • mount_path (Path): Path to unmount the OverlayFS from.
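
A sketch tying mount and cleanup together (the mount name and paths are hypothetical):

with api.overlayfs.cleanup_context():
  api.overlayfs.mount(
      'build_cache',                  # display name for the mount
      api.path.cache_dir / 'lower',   # the OverlayFS lowerdir
      api.path.start_dir / 'mnt')     # created if it doesn't exist
  # ... use the overlay; it is unmounted when the context exits ...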

recipe_modules / paygen_orchestration

DEPS: conductor, cros_infra_config, cros_sdk, cros_storage, cros_version, naming, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/led, recipe_engine/random, recipe_engine/raw_io, recipe_engine/step

API for orchestrating payload generation. Used by paygen_orchestrator.

class PaygenOrchestrationApi(RecipeApi):

A module for CrOS-specific paygen orchestration steps.

@property
def default_delta_types(self):

def get_builder_configs(self, builder_name: str, **kwargs):

Return the configs matching the query or [].

Note that all comparisons are made in lower case!

Args: builder_name: The name of the builders to return configuration for. **kwargs: Match keyword to top level dictionary contents. For example, passing delta_payload_tests=true will match only configs where delta_payload_tests is true.

Returns: A list of dictionaries of the matching configurations. For example:

[
  {
    "board": {
      "public_codename": "cyan",
      "is_active": true,
      "builder_name": "cyan"
    },
    "delta_type": "MILESTONE",
    "channel": "stable",
    "chrome_os_version": "13020.87.0",
    "chrome_version": "83.0.4103.119",
    "generate_delta": true,
    "delta_payload_tests": true,
    "full_payload_tests": false
  },
  {...},
  {...}
]

def get_delta_requests(self, payload_def: PaygenConfig, src_artifacts: List[Image], tgt_artifacts: List[Image], bucket: str, verify: bool, dryrun: bool, minios: bool=True, keyset: str=None):

Examine the payload def, source, and target, and return a list of GenerationRequests.

If there isn't a matching source and target available, then return [].

bucket, verify, and dryrun are all used to fill out the GenerationRequest().

Args: payload_def: A singular configuration from pulled config. src_artifacts: Available src images. tgt_artifacts: Available tgt images. bucket: The bucket containing the requests (and destination). verify: Should we run payload verification. dryrun: Should we not upload resulting artifacts. minios: Should we generate minios payloads. keyset: The keyset to use for signing, or None.

Returns: A completed list[GenerationRequest] or [].

def get_full_requests(self, tgt_artifacts: List[Image], bucket: str, verify: bool, dryrun: bool, minios: bool=True, keyset: str=None):

Get the configured full requests for a set of artifacts.

Args: tgt_artifacts: Available tgt images. bucket: The bucket containing the requests (and destination). verify: Should we run payload verification. dryrun: Should we not upload resulting artifacts. minios: Should we generate minios payloads. keyset: The keyset to use for signing, or None.

Returns: A completed list[GenerationRequest] or [].

def get_n2n_requests(self, tgt_artifacts: List[Image], bucket: str, verify: bool, dryrun: bool, minios: bool=True, keyset: str=None):

Generate N2N testing payloads.

We will examine all the artifacts in tgt_artifacts for unsigned test images and generate N2N requests (a request that updates to and from the same version).

Args: tgt_artifacts: Available tgt images. bucket: The bucket containing the requests (and destination). verify: Should we run payload verification. dryrun: Should we not upload resulting artifacts. minios: Should we generate minios payloads. keyset: The keyset to use for signing, or None.

Returns: A list[GenerationRequest] or [].

@property
def paygen_children_timeout_sec(self):

Get the currently configured paygen timeout in seconds.

@property
def paygen_orchestrator_timeout_sec(self):

Get the currently configured paygen orchestrator timeout in seconds.

This contains the duration expected for paygen children.

Returns: The maximum number of seconds the paygen orchestrator should take.

def run_paygen_builders(self, paygen_reqs: List[PaygenProperties.PaygenRequest], paygen_mpa: Optional[bool]=False, use_split_paygen: bool=False, max_bb_elements: int=200):

Launch paygen builders to generate payloads and run configured tests.

Args: paygen_reqs: Protos containing the payloads to generate and the corresponding tests to launch. paygen_mpa: Use the MPA bot pool builder. use_split_paygen: Whether to use the new split paygen flow. max_bb_elements: number of elements per buildbucket call. Default is 200 but for testing we can use a smaller number.

Returns: A list of completed builds.

recipe_modules / paygen_testing

DEPS: cros_release_util, depot_tools/gsutil, recipe_engine/raw_io

API for working with Paygen. Used by paygen.py.

class PaygenTestingApi(RecipeApi):

A module for CrOS-specific paygen testing steps.

def create_paygen_build_report_payload(self, req: PaygenProperties.PaygenRequest, payload_uri: str, recovery_key_version: Optional[int]=None):

Prepare payload information for the release pubsub.

Args: req: The paygen request that was used. payload_uri: The uri of the generated payload. recovery_key_version: version of the recovery key if provided. Default None.

Returns: A Payload containing payload information for the pubsub.

recipe_modules / phosphorus

DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Module for issuing Phosphorus commands.

class PhosphorusCommand(RecipeApi):

def build_parallels_image_provision(self, image_gs_path, max_duration_sec=((2 * 60) * 60)):

Provisions a DUT with the given CrOS image and Parallels DLC.

Args: image_gs_path (str): The Google Storage path (prefix) where images are located. For example, ‘gs://chromeos-image-archive/eve-release/R86-13380.0.0’. max_duration_sec (int): Maximum duration of the provision operation, in seconds. Defaults to two hours.

def build_parallels_image_save(self, dut_state):

Saves the given DUT state in UFS.

The state is only saved if it is safe to do so (i.e. is currently ready or needs_repair).

Args: dut_state (str): The new DUT state. E.g. “needs_repair” or “ready”.

def fetch_crashes(self, request):

Fetch crashes via the fetch-crashes subcommand.

Args: request: a FetchCrashesRequest.

def load_skylab_local_state(self, test_id):

Load the local DUT state file.

Raises:

  • InfraFailure

def parse(self, results_dir):

Extract test results from a results directory.

Args: results_dir: a string pointing to a directory containing test results.

Returns: Result.

def prejob(self, request):

Run a prejob or a provision via prejob subcommand.

Args: request: a PrejobRequest.

def read_dut_hostname(self):

"Return the DUT hostname.

def remove_autotest_results_dir(self):

Remove the autotest results directory.

Raises:

  • InfraFailure

def run_test(self, request):

Run a test via run-test subcommand.

Args: request: a RunTestRequest.

def save_and_seal_skylab_local_state(self, dut_state, dut_name, peer_duts, repair_requests=None):

Update the local DUT state file and seal the results directory.

Args:

  • dut_state: DUT state string (e.g. ‘ready’).
  • dut_name: Hostname of the primary DUT.
  • peer_duts: A list of hostnames for peer DUTs.
  • repair_requests (array): Requests to enforce repair actions.

Raises:

  • InfraFailure

def save_skylab_local_state(self, dut_state, dut_name, peer_duts, repair_requests=None):

Update the local DUT state file.

Args:

  • dut_state: DUT state string (e.g. ‘ready’).
  • dut_name: Hostname of the primary DUT.
  • peer_duts: A list of hostnames for peer DUTs.
  • repair_requests (array): Requests to enforce repair actions.

Raises:

  • InfraFailure

def upload_to_gs(self, request):

Upload selected test results to GS via upload-to-gs subcommand.

Args: request: an UploadToGSRequest.

def upload_to_tko(self, request):

Upload test results to TKO via upload-to-tko subcommand.

Args: request: an UploadToTkoRequest.

recipe_modules / portage

DEPS: cros_infra_config, easy, util, recipe_engine/buildbucket, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

APIs for CrOS Portage.

class PortageApi(RecipeApi):

A module for CrOS Portage steps.

def initialize(self):

def publish_emerge_stats(self, step_name: str, step_stdout: str, set_output_prop: bool=False, publish_to_bq: bool=False):

Reads portage stdout and tries to glean facts about emerge performance.

Portage gives us a stdout stream and outputs a formatted representation of the number of packages that it will emerge, and the method in which it will emerge them. In lieu of a good system (formatted output, bapi response, etc.) we can sniff through this stdout and gather facts about prebuilt use, among other things. We produce an output property keyed on the provided step_name.

Args: step_name: A unique step name (e.g. the bapi's step), used as the output key in the metrics dictionary. step_stdout: The full standard out for the bapi call. set_output_prop: Should we set an output property with the results of the text analysis. Be mindful: it may be quite large. publish_to_bq: Should we publish to the configured ‘portage_stats’ dataset.

Returns: The current value of the output property.
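
A sketch of feeding a step's stdout through the analyzer (the step name and the captured stdout are hypothetical):

metrics = api.portage.publish_emerge_stats(
    'install packages',   # also used as the key in the output property
    step_result.stdout,   # full stdout of the emerge/bapi call
    set_output_prop=True)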

recipe_modules / pupr

APIs for PUpr.

class PuprApi(RecipeApi):

A module for PUpr steps.

def identify_retry(self, retry_policy, no_existing_cls_policy, open_cls):

Identify the CL to be retried based on retry_policy.

Precedence order:

  • If a CL is already run for CQ+2, that CL. (This means no retry should happen, unless rebase_before_retry is set)
  • If a pinned CL exists, most recent pinned CL if failed, or None if most recent pinned CL is not failed.
  • Most recent CL with a passed dry run, if no_existing_cls_policy == FULL_RUN.
  • Most recent CL with a failed full run.
  • Most recent CL with a failed dry run.

Args: retry_policy (RetryClPolicy): The retry policy to follow. Can be NO_RETRY, LATEST_OR_LATEST_PINNED, or LATEST_PINNED. no_existing_cls_policy (SendToCqPolicy): The policy this PUpr builder follows when no CL exists. If FULL_RUN, we will look for any successful dry runs, allowing us to retry the latest one as a full run. If no successful dry run is found or if DRY_RUN, we will look for a failed CL. open_cls (List[PatchSet]): List of CLs.

Returns: (PatchSet, int, str, bool, bool): (the CL to be retried, or None if no retry; the CQ label to be applied; a description of the action; whether the CL, if any, has currently passed; whether the CL, if any, is currently running for CQ+1 or +2).
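
A hedged sketch of consuming that tuple inside RunSteps, assuming a recipe that DEPS on ‘pupr’ and ‘pupr_gerrit_interface’ (the policy and CL variables are prepared elsewhere):

```python
cl, cq_label, description, is_passed, is_running = api.pupr.identify_retry(
    retry_policy, no_existing_cls_policy, open_cls)
if cl and not is_running and not is_passed:
    # Re-apply the chosen CQ label (+1 or +2) to send the CL back to CQ.
    api.pupr_gerrit_interface.retry_cl(cl, cq_label)
```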

@staticmethod
def retries_frozen(changes):

Examine open CLs for the HASHTAG_FREEZE_RETRIES hashtag.

Args: changes (List[PatchSet]): List of CLs.

Returns: bool: Whether or not a HASHTAG_FREEZE_RETRIES hashtag is present.

recipe_modules / pupr_gerrit_interface

DEPS: cros_cq_depends, cros_infra_config, cros_source, easy, gerrit, git_cl, git_footers, pupr, pupr_local_uprev, repo, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/step

Module to interface with Gerrit for PUpr (Parallel Uprevs).

class PuprGerritInterfaceApi(RecipeApi):

A module to interface between PUpr builders and Gerrit.

def __init__(self, *args, **kwargs):

Initialize the module's attributes.

def apply_retry_policy(self, open_changes: List[GerritChange], most_recent_uprev: Optional[PatchSet], policy: BranchPolicy, topic: str, retry_only_run: bool):

Retry any open uprev CLs based on the retry policy.

Args: open_changes: A list of currently open, relevant PUpr CLs. most_recent_uprev: The most recently merged uprev CL. policy: The policy selected by this PUpr run. topic: A short string with which to tag all generated uprev commits. retry_only_run: Despite its name, if True this only causes a run in OUTDATED_LEAVE_COMMENT mode to not actually leave a comment.

def create_uprev_cls(self, repo_projects: List[ProjectInfo], open_changes: List[GerritChange], existing_cls: bool, policy: BranchPolicy, topic: str):

Create appropriate CLs for the uprevs.

Args: repo_projects: The projects to create uprev CLs for. open_changes: Open uprev CLs. existing_cls: Whether any open CLs remain after abandoning. policy: The branch policy set for the builder. topic: Gerrit topic name added to the Changes managed by this builder.

Returns: Human-readable summary of the operation.

def find_most_recently_merged_uprev(self, projects_by_remote: ProjectsByRemote, topic: str):

Return the most recently merged relevant uprev.

Args: projects_by_remote: A mapping from Git remotes to repo projects on that remote that are relevant to this PUpr. topic: A string to match against CLs' Gerrit topic, which is used to identify relevant PUpr CLs.

def find_open_uprev_cls(self, projects_by_remote: ProjectsByRemote, topic: str):

Return any open uprev CLs matching the right projects and topic.

Args: projects_by_remote: A mapping from Git remotes to repo projects on that remote that are relevant to this PUpr. topic: A string to match against CLs' Gerrit topic, which is used to identify relevant PUpr CLs.

def handle_outdated_changes(self, open_changes: List[GerritChange], most_recent_uprev: PatchSet, policy: BranchPolicy, retry_only_run: bool):

Abandon already-open and outdated uprev CLs.

Args: open_changes: A list of currently open, relevant PUpr CLs. most_recent_uprev: The most recently merged uprev CL. policy: The policy selected by this PUpr run. retry_only_run: Despite its name, if True this only causes a run in OUTDATED_LEAVE_COMMENT mode to not actually leave a comment.

Returns: A bool stating whether any open CLs remain after abandoning.

def retry_cl(self, patch_set: PatchSet, cq_label: int):

Retry sending the CL through CQ by setting its Gerrit labels.

def set_generator_attributes(self, rebase_before_retry: bool):

Set attributes whose values are determined in Generator.

Args: rebase_before_retry: Whether to rebase open changes before retrying. See documentation in generator.proto.

TODO(b/259445191): All of these attributes should be moved from generator.proto to pupr_local_uprev.proto.

def sort_projects_by_remote(self, projects: List[ProjectInfo]):

Return a dict which sorts the given projects by their remote.

def upload_new_patch_set(self, gerrit_patch_set: PatchSet, message: Optional[str]=None):

Upload a new revision onto an existing Gerrit PatchSet.

@property
def workspace_path(self):

Return the checkout path where the build is processed.

recipe_modules / pupr_local_uprev

DEPS: cros_build_api, cros_sdk, cros_source, gerrit, git, naming, repo, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/step

Module to create uprevs on the local checkout for PUpr (Parallel Uprevs).

class PuprLocalUprevApi(RecipeApi):

A module to create local uprevs for PUpr.

def __init__(self, properties, *args: Any, **kwargs: Any):

Initialize the module's attributes.

def rebase_cl(self, open_changes: List[bb_common_pb2.GerritChange], topic: str, change_num: int):

Create a new uprev patch (locally) for change_id.

Args: open_changes: List of currently open uprev CLs. change_num: Change number of the CL to rebase, as in crrev.com/c/#####. topic: A short string with which to tag all generated commits.

Raises: StepFailure: If the uprev does not generate any changes, or if the uprev only uprevs some packages and allow_partial_uprev is False, or if the uprev requires a multi-repo commit.

def set_generator_attributes(self, additional_commit_message: str='', additional_commit_footer: str='', allow_partial_uprev: bool=False, packages: Optional[List[common_pb2.PackageInfo]]=None, build_targets: Optional[List[common_pb2.BuildTarget]]=None):

Set attributes whose values are determined in Generator.

Args: additional_commit_message: Additional text to be added in the commit description. additional_commit_footer: Additional footer to be added in the commit description. allow_partial_uprev: Whether to generate CLs when either of the packages had no modified file. packages: The packages that this build should uprev. build_targets: The build targets to uprev. Only relevant if required by endpoint.

TODO(b/262302698): All of these attributes should be moved from generator.proto to pupr_local_uprev.proto.

def uprev_packages(self, versions: List[packages_pb2.UprevVersionedPackageRequest.GitRef], topic: str, change_id: str=''):

Try to uprev the specified packages. If successful, commit the uprev.

Args: versions: The versions to consider for an update. change_id: If given, set Change-Id to the commit message, so that the commit is uploaded as a new patch set of an existing Change. When this is set, the uprev should not span multiple repositories. topic: A short string with which to tag all generated commits.

Returns: If packages are successfully uprevved, return a list of ProjectInfos for all repo projects with modified code. If not all packages are uprevved and allow_partial_uprev==False, return None. This signifies that the PUpr run should terminate immediately.

def uprev_sdk(self, topic: str):

Uprev the SDK on the local filesystem, and commit the uprev.

Args: topic: A short string with which to tag all generated commits.

Returns: A list of repo projects with modified code.

@property
def workspace_path(self):

Return the checkout path where the build is processed.

recipe_modules / rdb_util

class RDBUtilApi(RecipeApi):

A module for util functions associated with ResultDB.

@staticmethod
def get_board_from_variant(variant: Variant):

Get board info from Variant definition.

Args: variant: Variant definition of the test.

Returns: board of the test or ''.

@staticmethod
def get_build_target_from_variant(variant: Variant):

Get build_target info from Variant definition.

Args: variant: Variant definition of the test.

Returns: build_target of the test or ''.

@staticmethod
def get_model_from_variant(variant: Variant):

Get model info from Variant definition.

Args: variant: Variant definition of the test.

Returns: model of the test or ''.

@staticmethod
def get_shardless_test_config(test_config: str):

Get the test config without shard suffix.

Args: test_config: Full test config name with the shard info, e.g. ‘betty-cq.tast_vm.tast_vm_default_shard_5_of_5’.

Returns: A string of just the test config name registered in RDB without the shard info.

@staticmethod
def get_suite(composite_name: str):

Get the name of the suite from the composite name.

Args: composite_name: Composite name of the builder, e.g. ‘betty-cq.tast_vm.tast_vm_default_shard_5_of_5’ or ‘zork-cq.hw.bvt-inline’.

Returns: A string of just the suite name registered in RDB.
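
The shard-stripping and suite extraction these helpers describe can be approximated in plain Python. A standalone sketch, assuming the composite format ‘builder.environment.suite[_shard_M_of_N]’ implied by the examples above (this mirrors the documented behavior, not the module's actual implementation):

```python
import re


def shardless_test_config(test_config: str) -> str:
    """Drop a trailing '_shard_M_of_N' suffix, if present."""
    return re.sub(r'_shard_\d+_of_\d+$', '', test_config)


def suite_from_composite(composite_name: str) -> str:
    """Return the suite portion of a 'builder.env.suite' composite name."""
    return shardless_test_config(composite_name.split('.')[-1])


assert shardless_test_config(
    'betty-cq.tast_vm.tast_vm_default_shard_5_of_5'
) == 'betty-cq.tast_vm.tast_vm_default'
assert suite_from_composite('zork-cq.hw.bvt-inline') == 'bvt-inline'
```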

recipe_modules / recipe_analyze

DEPS: recipe_engine/json, recipe_engine/step

API for calling ‘recipes.py analyze’

class RecipeAnalyzeApi(RecipeApi):

A module for calling ‘recipes.py analyze’

def is_recipe_affected(self, affected_files: List[str], recipe: str):

Return True iff changes in <affected_files> affect <recipe>.

Must be called from the root of a recipes repo (i.e. recipes.py is in the cwd).

Args:

  • affected_files: A list of changed files. Paths may be absolute or relative (to the root of the recipes repo), and should use forward slashes only.
  • recipe: The name of the recipe to analyze.
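
A hypothetical usage fragment inside a recipe's RunSteps (the checkout path, file list, and recipe name are illustrative; the recipe would DEPS on ‘recipe_analyze’, ‘recipe_engine/context’, ‘recipe_engine/path’, and ‘recipe_engine/step’):

```python
recipes_root = api.path.start_dir / 'recipes'  # must contain recipes.py
with api.context(cwd=recipes_root):
    affected = ['recipe_modules/repo/api.py', 'recipes/build_firmware.py']
    if api.recipe_analyze.is_recipe_affected(affected, 'build_firmware'):
        api.step.empty('build_firmware is affected')
```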

recipe_modules / remoteexec

DEPS: depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

API for working with re-client for remote execution.

class RemoteexecApi(RecipeApi):

A module for working with re-client for remote execution.

@property
def enable_logs_upload(self):

def process_artifacts(self, install_pkg_response: InstallPackagesResponse, log_dir: str, build_target_name: str, is_staging: bool=False):

Process remoteexec artifacts, uploading to Google Cloud Storage if they exist.

Args: install_pkg_response: Result from InstallPackage call. May contain remoteexec artifacts. log_dir (str): Log directory that contains the artifacts. build_target_name (str): Build target string. is_staging (bool): If being run in staging environment instead of prod.

@property
def reclient_dir(self):

Fetches the reclient directory and returns its path.

@property
def reproxy_cfg_file(self):

recipe_modules / repo

DEPS: cros_infra_config, easy, git, src_state, depot_tools/depot_tools, depot_tools/gitiles, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

API for working with the ‘repo’ VCS tool.

See: https://chromium.googlesource.com/external/repo/

class RepoApi(RecipeApi):

A module for interacting with the repo tool.

def abandon(self, branch: str, projects: Optional[List[str]]=None):

Abandon the branch in the given projects, or all projects if not set.

Args: branch: The branch to abandon. projects: The projects for which to abandon the branch.

def create_tmp_manifest(self, manifest_data: str):

Write manifest_data to a temporary manifest file inside the repo root.

Args: manifest_data: The contents to write into the new manifest file.

Returns: Path to the new manifest file.

def diff_manifests(self, from_manifest_str: str, to_manifest_str: str, use_merge_base: bool=False):

Diff the two manifests and return an array of differences.

Given two manifest XML strings, generates an array of ManifestDiff. This only returns CHANGED projects; it skips projects that were added or deleted.

Args: from_manifest_str: The manifest XML string to diff from. to_manifest_str: The manifest XML string to diff against. use_merge_base: Whether to adjust the from_ref with git merge-base.

Returns: An array of ManifestDiff namedtuple for any existing changed project (excludes added/removed projects).

def diff_manifests_informational(self, old_manifest_path: Path, new_manifest_path: Path):

Informational step that logs a “manifest diff”.

Args: old_manifest_path: Path to old manifest file. new_manifest_path: Path to new manifest file.

def diff_remote_and_local_manifests(self, from_manifest_url: str, from_manifest_ref: str, to_manifest_str: str, test_from_data: Optional[str]=None, use_merge_base: bool=False):

Diff the remote manifest against the local manifest string.

Diffs the ‘snapshot.xml’ at the given from_manifest_url at the ref from_manifest_ref against the local to_manifest_str.

Args: from_manifest_url: The manifest repo url to checkout. from_manifest_ref: The manifest ref to checkout. to_manifest_str: The string XML for the to manifest. test_from_data: Test data: The from_manifest contents, or None for the default. use_merge_base: Whether to adjust the from_ref with git merge-base.

Returns: An array of ManifestDiff namedtuples for any existing changed project (excludes added/removed projects).

def ensure_pinned_manifest(self, projects: Optional[List[str]]=None, regexes: Optional[List[str]]=None, test_data: Optional[str]=None):

Ensure that we know the revision info for all projects.

If the manifest is not pinned, a pinned manifest is created and logged.

Args: projects: Project names or paths to return info for. Defaults to all projects. regexes: list of regexes for matching projects. The matching is the same as in repo forall --regex regexes.... test_data: Test data for the step: the output from repo forall, or None for the default. This is passed to project_infos().

Returns: The manifest XML as a string, or None if the manifest is already pinned.

def ensure_synced_checkout(self, root_path: Path, manifest_url: str, init_opts: Optional[Dict[(str, Any)]]=None, sync_opts: Optional[Dict[(str, Any)]]=None, projects: Optional[List[str]]=None, final_cleanup: bool=False, sanitize: bool=False):

Ensure the given repo checkout exists and is synced.

Args: root_path: Path to the repo root. manifest_url: Manifest URL for ‘repo.init’. init_opts: Extra keyword arguments to pass to ‘repo.init’. sync_opts: Extra keyword arguments to pass to ‘repo.sync’. projects: Projects of concern, or None if all projects are of concern. Used to perform optimizations where possible to only operate on the given projects. final_cleanup: Used by the cache builder to ensure that all locks and uncommitted files are cleaned up after the sync. sanitize: Whether to run ‘git gc’ on all repos.

Returns: Whether the sync succeeded.

def init(self, manifest_url: str, *, manifest_branch: str='', reference: Optional[str]=None, groups: Optional[List[str]]=None, depth: Optional[int]=None, repo_url: Optional[str]=None, repo_branch: Optional[str]=None, local_manifests: Optional[List[LocalManifest]]=None, manifest_name: Optional[Path]=None, projects: Optional[List[str]]=None, verbose: bool=True, clean: bool=True, manifest_depth: Optional[str]=None):

Execute ‘repo init’ with the given arguments.

Args: manifest_url: URL of the manifest repository to clone. manifest_branch: Manifest repository branch to checkout. reference: Location of a mirror directory to bootstrap sync. groups: Groups to checkout (see repo init --groups). depth: Create a shallow clone of the given depth. repo_url: URL of the repo repository. repo_branch: Repo binary branch to use. local_manifests: Local manifests to add. See https://gerrit.googlesource.com/git-repo/+/HEAD/docs/manifest-format.md#local-manifests. manifest_name: The manifest file to use. projects: Projects of concern or None if all projects are of concern. Ignored as of go/cros-source-cache-health. verbose: Whether to produce verbose output. manifest_depth: Value to pass in as manifest-depth to repo.

def initialize(self):

Initialize the module after the recipe engine has been loaded.

def manifest(self, manifest_file: Optional[Path]=None, test_data: Optional[str]=None, pinned: bool=False, step_name=None):

Use repo to create a manifest and return it as a string.

By default uses the internal .repo manifest, but can optionally take another manifest to use.

Args: manifest_file: If given, path to alternate manifest file to use. test_data: Test data for the step: the contents of the manifest, or None for the default. pinned: Whether to create a pinned (snapshot) manifest. step_name: The name for the step, or None to use a default.

Returns: str: The manifest XML as a string.

@property
def manifest_gitiles_commit(self):

Return a Gitiles commit for the repo manifest.

def project_exists(self, project: str):

Use ‘repo info’ to determine if the project exists in the checkout.

Args: project: Project name or path to check for.

Returns: Whether or not the project exists.

def project_info(self, project: Optional[Union[(str, Path)]]=None, **kwargs: Dict[(str, Any)]):

Use ‘repo forall’ to gather project information for one project.

Args: project: Project name or path to return info for. If None, then use the cwd as the path for the project. kwargs: Additional keyword arguments to pass into self.project_infos.

Returns: ProjectInfo: The requested project info.

def project_infos(self, projects: Optional[List[str]]=None, regexes: Optional[List[str]]=None, test_data: Optional[str]=None, ignore_missing: bool=False):

Use ‘repo forall’ to gather project information.

Note that if both projects and regexes are specified, the resultant ProjectInfos are the union of what each would return separately.

Note that this doesn't guarantee that the return value has no duplicates; the caller will need to handle that themselves.

Args: projects: Project names or paths to return info for. Defaults to all projects. regexes: List of regexes for matching projects. The matching is the same as in repo forall --regex regexes.... test_data: Test data for the step: the output from repo forall, or None for the default. ignore_missing: If True, skip missing projects and continue.

Returns: Requested project infos.

@property
def repo_path(self):

def report_manifest_branch_state(self, projects: Optional[List[str]]=None, test_data: str=‘Repo: info’, test_failure: bool=False):

Use ‘repo info’ to output manifest state to stdout.

Args: projects: Projects to limit the info call to, or None to get info for all projects. test_data: Optional data for testing stdout. test_failure: Whether to raise StepFailure.

Returns: Full info on the manifest branch, current branch or unmerged branches.

def start(self, branch: str, projects: Optional[List[str]]=None):

Start a new branch in the given projects, or all projects if not set.

Args: branch: The new branch name. projects: The projects for which to start a branch.

def sync(self, *, force_sync: bool=False, detach: bool=False, current_branch: bool=False, jobs: Optional[int]=None, manifest_name: Optional[Path]=None, no_tags: bool=False, optimized_fetch: bool=False, timeout: Optional[int]=None, retry_fetches: Optional[int]=None, projects: Optional[List[str]]=None, verbose: bool=True, no_manifest_update: bool=False, force_remove_dirty: bool=False, force_checkout: bool=True, prune: bool=None, repo_event_log: bool=True, manifest_branch_state: bool=True, test_manifest_branch_state_failure: bool=False):

Execute ‘repo sync’ with the given arguments.

Args: force_sync: Overwrite existing git directories if needed. detach: Detach projects back to manifest revision. current_branch: Fetch only current branch. jobs: Projects to fetch simultaneously. manifest_name: Temporary manifest to use for this sync. no_tags: Don‘t fetch tags. optimized_fetch: Only fetch projects if revision doesn’t exist. timeout: Number of seconds before the recipe engine should kill the step. retry_fetches: The number of times to retry retryable fetches. projects: Projects to limit the sync to, or None to sync all projects. verbose: Whether to produce verbose output. no_manifest_update: Whether to disable updating the manifest. force_remove_dirty: Whether to force remove projects with uncommitted modifications if projects no longer exist in the manifest. force_checkout: Whether to checkout with the force option. prune: Delete refs that no longer exist on the remote. repo_event_log: Write the repo event log, do analysis steps. manifest_branch_state: Write repo info to stdout. test_manifest_branch_state_failure: Raise StepFailure in repo-info step and confirm it does not fail the entire build.
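
A hedged sketch of a typical checkout flow with this module inside RunSteps; the manifest URL and option values are illustrative only:

```python
api.repo.init(
    'https://chromium.googlesource.com/chromiumos/manifest',
    manifest_branch='main',
    groups=['default'],
)
api.repo.sync(
    detach=True,
    current_branch=True,
    jobs=16,
    retry_fetches=2,
)
```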

def sync_manifest(self, manifest_url: str, manifest_data: str, **kwargs):

Sync to the given manifest file data.

Args: manifest_url: URL of manifest repo to sync to (for repo init) manifest_data: Manifest XML data to use for the sync. kwargs: Keyword arguments to pass to ‘repo.sync’.

def version(self):

Get the version info retrieved by running repo version.

Returns: The repo version as a numeric string, such as “2.39”.

def version_at_least(self, version_string: str):

Check to make sure repo version is as least the specified version.

Args: version_string: the minimum version in format #.#(.#).

Returns: True if the minimum version is satisfied, otherwise False.

recipe_modules / result_flow

DEPS: easy, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

Module for issuing result flow commands

class ResultFlowCommand(RecipeApi):

def pipe_ctp_data(self, request):

Pipe CTP data to TestPlanRun table in BQ.

Args:

  • request: a test_platform.result_flow.CTPRequest

Returns: JSON proto of test_platform.result_flow.CTPResponse

def pipe_test_runner_data(self, request):

Pipe test runner data to TestRun/TestCase tables in BQ.

Args:

  • request: a test_platform.result_flow.TestRunnerRequest

Returns: JSON proto of test_platform.result_flow.TestRunnerResponse

def publish(self, project_id, topic_id, build_type, should_poll_for_completion=False, parent_uid=''):

Run result_flow to publish the build's own build ID to Pub/Sub.

Args:

  • project_id (str): The project name
  • topic_id (str): The topic name
  • build_type (str): Allowed values are “ctp” and “test_runner”
  • should_poll_for_completion (bool): If true, the consumers should not ACK the message until the build is complete.
  • parent_uid (str): An attribute placed inside the message

Returns: JSON proto of test_platform.result_flow.PublishResponse

recipe_modules / satlab

DEPS: cros_version, recipe_engine/file, recipe_engine/service_account, recipe_engine/step

class Satlab(RecipeApi):

Module for Satlab related functionality

def stage_build(self, build, target_bucket):

Stage a build to a partner bucket. Will raise an exception on failure.

Args: build: string, formatted as <board>-release/RXXX-YYY.ZZ.A. target_bucket: string, name of the GS bucket to stage builds to.

recipe_modules / service_version

DEPS: recipe_engine/step

class ServiceVersionCommand(RecipeApi):

Module for issuing ServiceVersion commands

def validate_service_version_if_exists(self):

Validate the caller's service version if they sent one.

recipe_modules / signing

DEPS: bot_cost, build_menu, cros_artifacts, cros_build_api, cros_infra_config, cros_release_util, cros_version, easy, signing_utils, depot_tools/gitiles, depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Module providing signing functionality.

class SigningApi(RecipeApi):

A module to encapsulate signing operations.

def add_kms_logs_as_step_logs(self, presentation: StepPresentation, result_path: Path):

Add the CloudKMS logs to the given step presentation.

Args: presentation: The step presentation to add logs to. result_path: The result_path passed to the signing call.

def artifact_name_by_image_type(self, image_type: common_pb2.ImageType):

Mapping of image type to artifact name.

def download_release_artifacts(self, relevant_signing_configs: List[SigningConfig]):

Download artifacts so we can support retries with conductor.

As opposed to in situ builds with local artifacts already present.

Args: relevant_signing_configs: Build target configs with supported sign types.

Returns:

  • relevant_signing_configs with local artifact paths populated.
  • dir containing input artifacts

def get_config(self):

Fetch signing config from the appropriate branch of config-internal.

def get_paygen_keyset(self):

Return the keyset for use in paygen.

Can only be called after setup_signing.

Raises: ValueError, if there is no keyset configured for paygen.

def get_signed_build_metadata(self, instructions_metadata: Dict[(str, InstructionsMetadata)]):

Get the metadata of the signed build.

Note - this requires that wait_for_signing has been called and is complete.

Args: instructions_metadata: The metadata dict returned from wait_for_signing.

Returns: List of signed build metadata dicts (one per signed build image).

@property
def get_use_dev_keys(self):

@exponential_retry(retries=GSUTIL_MAX_RETRY_COUNT, delay=datetime.timedelta(seconds=1))
def gs_download_if_present(self, gs_dir: str, local_dir: str, artifact_names: List[str]):

Download from Google Storage if present.

Returns a list of skipped artifacts.

@property
def gs_upload_bucket(self):

def initialize(self):

Initialize method for setup that needs the modules instantiated.

@property
def local_signing(self):

def setup_signing(self, sign_types: List[‘common_pb2.ImageType’], channels: List[‘common_pb2.Channel’]):

Set up the working dir for signing.

Copies all necessary artifacts (based on signing config) from GS into a new temp dir and populates the artifact path field in each individual signing config.

Also drops configs for irrelevant sign types.

Returns:

  • signing configs
  • dir containing input artifacts

def sign_artifacts(self, sign_types: List[‘common_pb2.ImageType’], channels: List[‘common_pb2.Channel’], include_paygen: bool=True):

Implementation for local signing flow.

@property
def signing_docker_image(self):

def stage_paygen_artifacts(self, build_target_config: BuildTargetSigningConfig, channels: List[‘common_pb2.Channel’]):

Copy the artifacts needed for paygen into the appropriate GS locations.

Returns: List of GS dirs that were pushed to.

def upload_signed_artifacts(self, response: SignImageResponse):

Uploads all files in output_dir to GS using gsutil cp.

@exponential_retry(retries=GSUTIL_MAX_RETRY_COUNT, delay=datetime.timedelta(seconds=1))
def upload_unsigned_artifacts(self, archive_dir: Path, build_target_config: BuildTargetSigningConfig, channels: List[‘common_pb2.Channel’]):

Uploads files from archive_dir to GS based on signing config.

Returns: List of GS dirs that were pushed to.

def verify_signing_success(self, instructions_metadata: Dict[(str, InstructionsMetadata)], pres: StepPresentation):

Verifies that the signing operation succeeded.

def wait_for_signing(self, instructions_list: List[str]):

Wait for signing to complete for a set of instructions files.

This method polls each instructions file for metadata, and waits for that metadata to become present, then checks to see if a terminal passing or failed state has been achieved. This method returns when either a) signing is complete for all of the provided instructions files, or b) the configured timeout has elapsed.

Args: instructions_list: List of GS URIs for instructions files.

Returns: A dict of instruction file location -> instruction metadata for all complete signing operations.
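
A hypothetical fragment inside RunSteps tying this together with the signing_utils helpers below; the instructions URI is illustrative:

```python
instructions = ['gs://some-bucket/instructions/eve-release.instructions']
metadata = api.signing.wait_for_signing(instructions)
if api.signing_utils.signing_failed(metadata):
    failure = api.signing_utils.get_failure(metadata)
    raise api.step.StepFailure('signing failed: %s' % failure)
```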

recipe_modules / signing_utils

DEPS: cros_version, recipe_engine/file, recipe_engine/path, recipe_engine/step

Module providing helpers for signing functionality.

class SigningUtilsApi(RecipeApi):

A module to encapsulate helpers for signing operations.

@staticmethod
def any_empty(instructions_meta: Dict[(str, InstructionsMetadata)]):

Checks to see if any values in the provided dict are None.

Args: instructions_meta: Dict of instructions url -> metadata.

Returns: True if any are None, otherwise False.

@staticmethod
def get_failure(metadata: Dict[(str, InstructionsMetadata)]):

Given an instructions file, pull out the failure of signing.

Args: metadata: An instructions metadata file.

Returns: The failure of the signing, or None if not available.

def get_keyset_version_list(self, archive: signing_pb2.ArchiveArtifacts):

def get_milestone_version(self):

def get_platform_version(self):

@staticmethod
def get_status_from_instructions(metadata: Dict[(str, InstructionsMetadata)]):

Given an instructions file, pull out the status of the signing operation.

Args: metadata: An instructions metadata file.

Returns: The status of the signing, or None if not available.

@staticmethod
def is_terminal_status(status: str):

@staticmethod
def signing_failed(metadata: Dict[(str, InstructionsMetadata)]):

Whether the provided metadata contains a failed signing operation.

Args: metadata: Metadata from the instructions file.

Returns: True/False whether the signing failed.

def signing_response_to_metadata(self, sign_image_response: SignImageResponse):

Translate signing response to metadata of the signed build.

Used for BuildReport in pubsub.

Args: sign_image_response: Response from SignImage.

Returns: List of signed builds (one per signed build image).

@staticmethod
def signing_succeeded(metadata: Dict[(str, InstructionsMetadata)]):

Whether the provided metadata contains a successful signing operation.

Args: metadata: Metadata from the instructions file.

Returns: True/False whether the signing succeeded.

recipe_modules / skylab

DEPS: cros_history, cros_infra_config, cros_source, cros_tags, git_footers, metadata, skylab_results, src_state, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/step, recipe_engine/swarming

Module for issuing commands to Skylab

class SkylabApi(RecipeApi):

def apply_qs_account_overrides(self, gerrit_changes: List[GerritChange]):

Apply any QS account overrides the build is eligible for.

Currently supports overriding the account if any of the Gerrit changes have the ‘Testing-Override’ footer or if the current build is testing a Chrome PUpr CL.

Args: gerrit_changes: The gerrit changes applied to the build.

@property
def last_run_tast_first_class_tests(self):

Returns the hw tests which ran as Tast first class in the last run.

@property
def qs_account(self):

Get the quota scheduler account the module is configured to use.

def schedule_ctp_requests(self, tagged_requests, can_outlive_parent=True, bb_tags=None, **kwargs):

Schedule a cros_test_platform build.

Args: tagged_requests (dict): Dictionary of string to test_platform.Request objects. can_outlive_parent (bool): Whether this build can outlive its parent. The default is True. bb_tags (dict or list[StringPair]): If of the type list[StringPair], will be used directly as a bb_tag list. If a dict, used to map keys to values. If the value is a list, multiple tags for the same key will be created. kwargs: List of extra named parameters to pass to buildbucket.schedule_request.

Returns: The scheduled buildbucket build.
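
A hypothetical fragment inside RunSteps; the Request proto import path and the request contents are assumptions, shown only to illustrate the tagged_requests and bb_tags shapes:

```python
from PB.test_platform.request import Request  # assumed proto import path

tagged_requests = {'bvt-inline': Request()}  # normally fully populated
build = api.skylab.schedule_ctp_requests(
    tagged_requests,
    can_outlive_parent=False,
    # A dict value that is a list produces multiple tags for the same key.
    bb_tags={'label-pool': ['DUT_POOL_QUOTA'], 'suite': 'bvt-inline'},
)
```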

def schedule_suites(self, unit_hw_tests: List[UnitHwTest], timeout: Duration, name: str=None, async_suite_run: bool=False, container_metadata: ContainerMetadata=None, require_stable_devices: bool=False, previous_results: Dict[(str, ExecuteResponse)]=None, build_target_critical_allowlist: List[str]=None):

Schedule HW test suites by invoking the cros_test_platform recipe.

Args: unit_hw_tests: Hardware test suites to execute timeout: Timeout in timestamp_pb2.Duration. name: The step name. Defaults to ‘schedule skylab tests v2’ async_suite_run: If set, indicates that caller does not intend to wait for the scheduled suites to complete, and the child build can outlive the parent build. container_metadata: Information on container images used for test execution. require_stable_devices (bool): If set, only run on devices with ‘label-device-stable: True’ previous_results: The results of the previous invocation. The results are a dict mapping the unit_hw_test's display name to an ExecuteResponse. build_target_critical_allowlist: If set (including empty list), only the build targets specified can have tests run as critical. If None, criticality will not be modified for any build targets.

Returns: A list of SkylabTasks with buildbucket_id of the recipe launched.

def set_qs_account(self, qs_account):

Override the quota scheduler account at runtime.

def wait_on_suites(self, tasks, timeout):

Wait for the single Skylab multi-request to finish and return the result

Args: tasks (list[SkylabTask]): The Skylab tasks to wait on. timeout (Duration): Timeout in timestamp_pb2.Duration.

Returns: list[SkylabResult]: The results for suites from provided tasks.

recipe_modules / skylab_results

DEPS: recipe_engine/buildbucket, recipe_engine/step

Util functions for parsing HW test results.

class SkylabResultsApi(RecipeApi):

Module for working with Skylab Structs and Hw Test Results.

def extract_failed_test_case_names(self, hw_test_results: typing.List[ExecuteResponse.TaskResult], ensure_complete: bool=True):

Returns the names of the test cases which failed all attempts.

Args: hw_test_results: The results from which to extract the failed test cases. ensure_complete: Only return failed test case names if at least one attempt for each shard terminated uninterrupted. This is to ensure the list of failed test cases returned is complete.

Returns: The names of the failed test cases.

def extract_failed_test_shard_names(self, hw_test_results: typing.List[ExecuteResponse.TaskResult]):

Returns the names of the test shards which failed all attempts.

Args: hw_test_results: The results from which to extract the failed shard names.

Returns: The names of the failed shards.

def get_per_board_prejob_stats(self, skylab_results: typing.List[SkylabResult]):

Returns PrejobStats for each board tested in skylab_results.

def get_previous_results(self, task_ids: typing.List[str], unit_hw_tests: typing.List[UnitHwTest]):

Get the results from the previous tasks with the specified task_ids.

Args: task_ids: The list of Skylab task IDs for which to retrieve results. unit_hw_tests: The list of unit_hw_tests for which to retrieve results.

Returns: The list of Skylab results for the specified unit_hw_tests that ran in the tasks with the specified task_ids.

def get_tagged_execute_responses_from_build(self, build):

@staticmethod
def request_tag(hw_test):

def translate_result(self, result, task):

Translates result to a Skylab result.

recipe_modules / snapshot_orch_menu

DEPS: bot_cost, build_menu, build_plan, checkpoint, conductor, cros_artifacts, cros_cq_additional_tests, cros_history, cros_infra_config, cros_lkgm, cros_release, cros_resultdb, cros_source, cros_tags, cros_test_plan, cros_test_plan_v2, cros_test_proctor, cros_version, easy, failures, failures_util, gerrit, git, git_footers, gitiles, gobin, greenness, looks_for_green, metadata, naming, orch_menu, skylab, skylab_results, src_state, test_util, workspace_util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cv, recipe_engine/futures, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

API providing a menu for snapshot orchestrator steps

class SnapshotOrchMenuApi(RecipeApi):

A module with steps used by orchestrators.

Orchestrators do not call other recipe modules directly: they always go through this module, and are a simple sequence of steps.

@property
def builds_status(self):

@property
def config(self):

def create_recipe_result(self):

Create the correct return value for RunSteps.

Returns: (recipe_engine.result_pb2.RawResult) The return value for RunSteps.

@property
def external_gitiles_commit(self):

@property
def gerrit_changes(self):

@property
def gitiles_commit(self):

def initialize(self):

def output_local_greenness(self, should_update: bool, should_update_bazel: bool):

Outputs info about local greenness.

def plan_and_run_children(self, run_step_name=None, results_step_name=None, extra_child_props=None):

Plan, schedule, and run child builders.

Args: run_step_name (str): Name for “run builds” step, or None. results_step_name (str): Name for “check build results” step, or None. extra_child_props (dict): If set, extra properties to append to the child builder requests.

Returns: (BuildsStatus): The current status of the builds.

def plan_and_run_tests(self, testable_builds: Optional[List[build_pb2.Build]]=None, ignore_gerrit_changes: bool=False):

Plan, schedule, and run tests.

Run tests on the testable_builds identified by plan_and_run_children.

Args: testable_builds: The list of builds to consider or None to use the current results. ignore_gerrit_changes: Whether to drop gerrit changes from the test plan request, primarily used for tryjobs (which are release builds and thus shouldn't test based on any patches applied).

Returns: BuildsStatus updated with any test failures.

def ps_relevant(self, build: build_pb2.Build):

Whether the postsubmit child build was critical and relevant.

Args: build: The child build.

@property
def relevant_child_builder_names(self):

@contextlib.contextmanager
def setup_orchestrator(self):

Initial setup steps for the orchestrator.

This context manager returns with all of the contexts that the orchestrator needs to have when it runs, for cleanup to happen properly.

If appropriate, this waits for any inflight orchestrator to finish before returning.

Raises: StepFailure if no config is found.

Returns: BuilderConfig or None, with an active context.

recipe_modules / src_state

DEPS: recipe_engine/buildbucket, recipe_engine/path, recipe_engine/step

API providing frequently needed values that we sometimes override.

If you are using cros_source or cros_infra_config, this module is relevant to your interests.

There are two classes of properties in this module.

  1. Constant(ish) things that need a common home to avoid duplication, such as workspace_path, internal_manifest, and external_manifest.

  2. Information obtained from recipe_engine, which we frequently change:

  • gitiles_commit: Especially when buildbucket does not give us one, we need to set it to the correct value for the build. That varies based on builder_config, and other things.

  • gerrit_changes: some builders add changes to the build, and others ignore the changes completely.

class SrcStateApi(RecipeApi):

Source State related attributes for CrOS recipes.

@build_manifest.setter
def build_manifest(self, build_manifest):

Set the manifest that will be used for the build.

Sets the manifest used by this builder.

Args: build_manifest (ManifestProject): Information about the manifest for this build.

@property
def default_branch(self):

The default branch for CrOS repos

@property
def default_ref(self):

The default ref for CrOS repos

@property
def external_manifest(self):

Information about external manifest.

Provides immutable information about the CrOS external manifest.

Returns: (ManifestProject): information about the external manifest.

@gerrit_changes.setter
def gerrit_changes(self, gerrit_changes):

Set the gerrit_changes that will be used for the build.

Args: gerrit_changes (list[GerritChange]): The gerrit_changes.

@gitiles_commit.setter
def gitiles_commit(self, gitiles_commit):

Set the gitiles_commit that will be used for the build.

Args: gitiles_commit (GitilesCommit): The value to use.

def gitiles_commit_to_manifest(self, gitiles_commit):

Return the manifest corresponding to the gitiles_commit.

Args: gitiles_commit (GitilesCommit): The gitiles_commit.

Returns: (ManifestProject): Information about the corresponding manifest, or None.

def initialize(self):

@property
def internal_manifest(self):

Information about internal manifest.

Provides immutable information about the CrOS internal manifest.

Returns: (ManifestProject): information about the internal manifest.

@property
def manifest_name(self):

Return the name of the manifest.

@property
def manifest_projects(self):

Return the manifest project names.

@property
def workspace_path(self):

The “workspace” checkout path.

The cros_source module checks out the CrOS source in this directory. It will contain the base checkout and any modifications made by the build, and is discarded after the build.

recipe_modules / support

DEPS: easy, gobin

APIs for running recipes/support tools.

class SupportApi(RecipeApi):

A module for support tool steps.

def call(self, tool, input_data, test_output_data=None, infra_step=True, timeout=None, add_json_log=True, name=None, **kwargs):

Run a tool from the support package.

Args: tool (str): Tool name. input_data: Data to be passed as input to the tool (serialized to JSON). test_output_data (dict|list|Callable): Data to return in tests. infra_step (bool): Whether or not this is an infrastructure step. timeout (int): Timeout of the step in seconds. add_json_log (bool): Log the content of the output json. name (str): The step name to display, or None for default. kwargs: Keyword arguments to pass to the ‘step’ call.

Returns: Data passed as output from the tool (deserialized from JSON).
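
A hypothetical fragment inside RunSteps; ‘example_tool’ is illustrative and not a real tool in the support package:

```python
output = api.support.call(
    'example_tool',
    {'build_target': 'eve'},  # serialized to JSON for the tool
    timeout=300,
    name='run example_tool',
)
```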

recipe_modules / swarming_cli

DEPS: easy, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/step, recipe_engine/time

Wrapper functions for calling the swarming CLI.

class SwarmingCli(RecipeApi):

A module that queries Swarming via the CLI.

def get_bot_counts(self, swarming_instance: str, dimensions: Optional[Iterable[str]]=None, bot_group: Optional[str]=None):

Retrieves the count of bots from Swarming based on dimensions.

Args: swarming_instance: The name of the Swarming instance to query. dimensions: Iterable of strings formatted as “key:value” to query Swarming. bot_group: The name of the bot group for which to get the count.

def get_max_pending_time(self, dimensions, lookback_hours, swarming_instance):

Retrieves the list of tasks from Swarming based on dimensions.

Args: dimensions (iterable): strings formatted as “key:value” to query Swarming. lookback_hours (int): Number of hours to query swarming on. swarming_instance (str): string containing the name of the Swarming instance to query.

Returns: (float) Max pending time in hours.

def get_task_counts(self, dimensions: Iterable[str], state: str, lookback_hours: int, swarming_instance: str, bot_group: Optional[str]=None):

Retrieves the count of tasks from Swarming based on filters.

Args: dimensions: Iterable of strings formatted as ‘key:value’ to query Swarming. state: The state of the tasks to query. lookback_hours: Number of hours to query swarming on. swarming_instance: The name of the Swarming instance to query. bot_group: The name of the bot group for which to get the count.

def get_task_list(self, dimensions, state, lookback_hours, swarming_instance, limit=None):

Retrieves the list of tasks from Swarming based on dimensions and state.

Args: dimensions (iterable): strings formatted as “key:value” to query Swarming. state (str): state of the tasks to query. lookback_hours (int): Number of hours to query swarming on. swarming_instance (str): string containing the name of the Swarming instance to query. limit (int): Number of tasks to return.

def initialize(self):

Perform one-time module setup.

This method is automatically called by the recipe engine once at the start of every recipe that depends on this module.

recipe_modules / sysroot_archive

DEPS: cros_build_api, cros_infra_config, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/led, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

Sysroot archive functions.

class SysrootArchiveApi(RecipeApi):

A module for interacting with sysroot archive.

def archive_sysroot_build(self, chroot: common_pb2.Chroot, sysroot: Sysroot, build_target: common_pb2.BuildTarget):

Archives the sysroot into a GS bucket.

The gs path format of the archive should be: gs://bucket/board/chromeos_version~cl_diff_count-build_id/archive_name.

Args: chroot: The chroot to use. sysroot: The sysroot to use. build_target: The build target of the sysroot archive.

def extract_best_archive(self, chroot: common_pb2.Chroot, build_target: common_pb2.BuildTarget):

Finds and extracts the sysroot archive closest to the given version.

Args: chroot: The chroot to use. build_target: build target.

def extract_sysroot_build(self, chroot: common_pb2.Chroot, build_target: common_pb2.BuildTarget, sysroot_archive_gs_path: str=''):

Downloads the provided archive from GS and places it in the sysroot.

Args: chroot: The chroot to use. build_target: The build target of the sysroot archive. sysroot_archive_gs_path: Path of archive to be unarchived.

Raises: StepFailure: The archive does not exist.

def find_best_archive(self, build_target: common_pb2.BuildTarget):

Returns the sysroot archive path closest to the given version.

The function calculates the CL diff count between the given CrOS version and the sysroot archives, and returns an arbitrary sysroot archive with the smallest CL diff count.

Args: build_target: The build target of the sysroot archive.

Returns: A string indicating the best GS archive path, or None if not found.

def parse_archive_path(self, gs_path: str):

Parses the ChromeOS version and CL diff count from a GS archive path.

Args: gs_path: Path of the sysroot archive.

Returns: Parsed ChromeOS version and cl diff count.
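
A standalone sketch of parsing the archive path format documented above (gs://bucket/board/chromeos_version~cl_diff_count-build_id/archive_name); this mirrors the documented layout, not the module's actual parser, and the example values are invented:

```python
import re

# Matches the documented layout:
#   gs://bucket/board/chromeos_version~cl_diff_count-build_id/archive_name
_ARCHIVE_RE = re.compile(
    r'^gs://(?P<bucket>[^/]+)/(?P<board>[^/]+)/'
    r'(?P<version>[^~]+)~(?P<cl_diff_count>\d+)-(?P<build_id>[^/]+)/'
    r'(?P<archive_name>[^/]+)$')


def parse_archive_path(gs_path: str):
    """Return (chromeos_version, cl_diff_count), or None if unparsable."""
    m = _ARCHIVE_RE.match(gs_path)
    if not m:
        return None
    return m.group('version'), int(m.group('cl_diff_count'))


assert parse_archive_path(
    'gs://some-bucket/eve/15000.0.0~42-8791234567890/sysroot.tar.zst'
) == ('15000.0.0', 42)
```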

recipe_modules / sysroot_util

DEPS: android, chrome, cros_artifacts, cros_build_api, cros_infra_config, cros_relevance, cros_sdk, cros_source, easy, goma, image_builder_failures, remoteexec, src_state, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cv, recipe_engine/file, recipe_engine/path, recipe_engine/step, recipe_engine/time

API for various support functions for building.

class SysrootUtilApi(RecipeApi):

A module for sysroot setup, manipulation, and use.

def bootstrap_sysroot(self, compile_source=False, response_lambda=None, timeout_sec=‘DEFAULT’, test_data=None, name=None):

Bootstrap the sysroot by calling InstallToolchain.

Args: compile_source (bool): Whether to compile from source. response_lambda (fn(output_proto)->str): A function that appends a string to the build api response step. Used to make failure step names unique across differing root causes. Default: cros_build_api.failed_pkg_data_names. timeout_sec (int): Step timeout, in seconds, or None for default. test_data (str): test response (JSON) from the SysrootService/InstallToolchain call, or None to use the default in cros_build_api/test_api.py. name (str): Step name to use, or None for the default name.

def build_images(self, image_types: List[‘common_pb2.ImageType’], builder_path: str, disable_rootfs_verification: bool, disk_layout: str, base_is_recovery: bool=False, version: Optional[str]=None, timeout_sec: Optional[int]=None, build_test_data: Optional[str]=None, test_test_data: Optional[str]=None, name: Optional[str]=None, skip_image_tests: bool=False, verify_image_size_delta: bool=False, bazel: bool=False, is_official: bool=False):

Build and validate images.

Args: image_types: Image types to build. builder_path: Builder path in GS for artifacts. disable_rootfs_verification: whether to disable rootfs verification. disk_layout: disk_layout to set, or empty for default. base_is_recovery: copy the base image to recovery_image.bin. version: version string to pass to build API, or None. timeout_sec: Step timeout (in seconds), None uses default timeout. build_test_data: test response (JSON) from the ImageService/Create call, or None. test_test_data: test response (JSON) from the ImageService/Test call, or None. name: Step name to use, or None for default name. skip_image_tests: Whether to skip tests of the built image via ImageService/Test. verify_image_size_delta: Whether to verify the image size delta. bazel: Whether to use Bazel to build the images. is_official: Whether to produce official builds.

Returns: The images built during the stage.

def create_netboot_image(self):

Create a netboot image for the factory build.

def create_sysroot(self, build_target, profile=None, chroot_current=True, replace=True, timeout_sec=‘DEFAULT’, use_cq_prebuilts: bool=False, test_data=None, name=None):

Create the sysroot.

Args: build_target (BuildTarget): Which build_target to create a sysroot for. profile (chromiumos.Profile): The profile the sysroot is to use, or None. chroot_current (bool): Whether the chroot is current. (If not, it will be updated.) replace (bool): Whether to replace an existing sysroot. timeout_sec (int): Step timeout (in seconds). Default: None if a toolchain change is detected, otherwise 10 minutes. use_cq_prebuilts (bool): Whether to use CQ prebuilts. test_data (str): test response (JSON) from the SysrootService/Create call, or None to generate a default response based on the input data. name (str): Step name to use, or None for the default name.

Returns: Sysroot

def initialize(self):

def install_packages(self, config, dep_graph, packages=None, artifact_build=False, timeout_sec=‘DEFAULT’, name=None, dryrun=False):

Install packages (possibly fetching Chrome source).

Args: config (BuilderConfig): The builder config. dep_graph: The dependency graph from cros_relevance.get_dependency_graph. packages (list[PackageInfo]): list of packages to install. Default: all packages for the build_target. artifact_build (bool): Whether to call update_for_artifact_build. timeout_sec (int): Step timeout, in seconds, or None for default. name (str): Step name to use, or None for default name. dryrun (bool): Whether to dryrun the step such that we calculate the packages which would have been built, but do not install them.

@property
def sysroot(self):

def update_for_artifact_build(self, chroot, artifacts, force_relevance=False, test_data=None, name=None):

Update ebuilds for artifact build.

Args: chroot (Chroot): Chroot, or None. artifacts (BuilderConfig.Artifacts): Artifact Information force_relevance (bool): Whether to always claim relevant. test_data (str): test response (JSON) from the ArtifactsService/BuildSetup call, or None. name (str): Step name to use, or None for default name.

Returns: (BuildSetupResponse): Whether the build is relevant.

recipe_modules / tast_exec

DEPS: easy, failures, gcloud, git, tast_results, depot_tools/gsutil, recipe_engine/archive, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/step, recipe_engine/time

class TastExecApi(RecipeApi):

A module to execute tast commands.

def add_ssh_key(self, path):

Registers an SSH key for use during test execution.

Args: path (Path): Path to the SSH key.

def create_gce_vm_context(self, image, project, machine, zone, network, subnet):

Creates a context manager which performs setup/teardown of a GCE VM.

Args: image (str): GCE image to use for the instance. project (str): Google Cloud project name. machine (str): GCE machine type. zone (str): GCE zone to create instance in (e.g. us-central1-b). network (str): Network name to use. subnet (str): Network subnet on which to create instance.

Returns: A context manager that:

  • when entered, prepares a VM to test against, and yields a VmInfo object for connecting to it.
  • when exited, terminates the VM and performs cleanup.

def create_qemu_vm_context(self, qcow_image_path, second_image_path=None):

Creates a context manager which performs setup/teardown of a QEMU VM.

Args: qcow_image_path (Path): Path to image in qcow format. second_image_path (Path): Path to a second qcow disk image (optional).

Returns: A context manager that:

  • when entered, prepares a VM to test against, and yields a VmInfo object for connecting to it.
  • when exited, terminates the VM and performs cleanup.
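
A hypothetical fragment inside RunSteps; tast_inputs and the qcow image path are assumed to have been prepared earlier in the recipe:

```python
vm_ctx = api.tast_exec.create_qemu_vm_context(qcow_image_path)
results_dir = api.path.mkdtemp('tast-results')
# run_direct_vm enters/exits the VM context around the test run.
tests_run = api.tast_exec.run_direct_vm(vm_ctx, results_dir, tast_inputs)
```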

def download_tast(self, build_payload, test_artifacts_dir):

Downloads the tast executable from specified build artifacts.

Args: build_payload (BuildPayload): Describes where the artifact is on GS. test_artifacts_dir (str): The directory to which files should be downloaded. The tast executable will be found at tast/tast relative to this directory.

def download_vm(self, build_payload, vm_dir, modify_image=None):

Downloads the VM image from specified build artifacts.

Args: build_payload (BuildPayload): Describes where the artifact is on GS. vm_dir (Path): The directory to which files should be downloaded. modify_image (func): Function that takes one argument, the VM image path. It will be called prior to converting the raw image to the qcow2 format. (optional).

Returns: The location of the qcow image. This will be a location inside vm_dir.

def fetch_partner_key(self):

Fetch the partner key from the private ChromeOS tree.

def is_vm_running(self, kvm_pid_file):

Check if the specified PID is still running.

Args: kvm_pid_file (Path): File containing the PID of a QEMU process.

Returns: bool: Whether the VM process is still running.

def run_direct(self, dut_name, tast_inputs, test_results_dir):

Run tast tests without retries or results processing.

Args: dut_name (str): The identity of the DUT to connect to, for example, my-dut-host-name or localhost:9222 (if testing a VM). tast_inputs (TastInputs): Common inputs for running tast tests. test_results_dir (Path): Path to store tast results.

Returns: list[str]: The list of tests that met the specified expression(s).

def run_direct_vm(self, vm_context, test_results_dir, tast_inputs):

Run tast tests in a VM without retries or results processing.

Args: vm_context (contextlib.contextmanager): The VM context manager, created by create_qemu_vm_context/create_gce_vm_context. test_results_dir (Path): Path to store tast results. tast_inputs (TastInputs): Common inputs for running tast tests.

Returns: list[str]: The list of tests that met the specified expression(s).

def run_vm(self, suite_name, vm_context, tast_inputs):

Run tast tests in a VM with one retry and upload logs to Google storage.

Args: suite_name (str): Unique name used to record test results. vm_context (contextlib.contextmanager): The VM context manager, created by create_qemu_vm_context/create_gce_vm_context. tast_inputs (TastInputs): Common inputs for running tast tests.

Returns: A tuple of list(Failure), a bool indicating whether the results were empty, and a dict mapping a task kind to the number of successes.

recipe_modules / tast_results

DEPS: cros_infra_config, cros_resultdb, cros_tags, failures_util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/resultdb, recipe_engine/step, recipe_engine/swarming, recipe_engine/time

Functions for reporting and parsing Tast VM test results.

class TastResultsApi(RecipeApi):

A module to process tast-results/ directory.

def __init__(self, props, *args, **kwargs):

Initialize TastResultsApi.

@exponential_retry(retries=2, delay=datetime.timedelta(seconds=1), condition=(lambda e: getattr(e, ‘had_timeout’, False)))
def archive_dir(self, dir_path, tag):

Archive dir to Google Storage.

Args: dir_path (Path): Path to dir to be uploaded. tag (str): Tag for this execution. Used to distinguish archive folders.

Returns: str, link to the archive on pantheon.

def convert_results(self, task_result, exclude_tests=None):

Convert TaskResult into api.failures.Results object and dicts.

Args: task_result (TaskResult): TaskResult to be converted. exclude_tests list(str): List of names of tests to be excluded.

Returns: A tuple of api.failures.Results object and list(dict) representing failed test cases excluding the ones provided.

def convert_to_testcaseresult(self, test_result):

Convert Tast's result into CTP format.

Args: test_result (TestResult): TestResult to be converted.

Returns: TestCaseResult with the same info.

def create_missing_test_results(self, missing_test_names):

Create test results for the missing test cases.

Args: missing_test_names list(str): Tests that should have run but didn't.

Returns: list(TestCaseResult) Test results for the missing test cases.

def extract_failed_test_names(self, vm_test_build: Build):

Returns the failed test names from the output properties of the build.

def get_results(self, test_results_path, suite_name, tag, tests, build_artifacts_url=None, new_invocation=False):

Return the test results decoded from the streamed_results.jsonl.

Args: test_results_path (Path): Path to test_results/. suite_name (str): Name of the whole test suite. tag (str): Tag for this execution. Used to distinguish archive folders. tests list(str): List of tests that should have been executed. build_artifacts_url (str): GS Link to the artifacts of the build. Ex: ‘gs://chromeos-image-archive/betty-cq/R111-2349872/’ new_invocation (bool): Whether the test results should be uploaded in a new invocation.

Returns: A consolidated data structure summarizing all results from a run. Currently this is a TaskResult. https://crrev.com/ee30a869473a8ee54246e0469ede2aa010fb2e48/src/test_platform/steps/execution.proto#47

def get_tests_to_retry(self, task_result):

Determine which tests to retry.

Args: task_result(TaskResult): TaskResult of the test suite.

Returns: list(str) of names of tests to be retried, and a boolean indicating whether a VM restart is required before the retry.

def had_no_unexpected_skips(self, vm_test_build: Build):

Returns whether all test cases were attempted.

def print_results(self, failures, empty_result):

Print results for the user.

Args: failures(list(Failure)): Failures of this run. empty_result(bool): Were the results empty?

def record_logs(self, sys_log_dir):

Print system logs to MILO.

Args: sys_log_dir(str): absolute dir path to copy logs from.

def upload_to_resultdb(self, test_results_path, suite_name, missing_test_names, tag, build_artifacts_url=None, new_invocation=False):

Upload the test results to ResultDB.

Args: test_results_path (Path): Path to test_results. suite_name (str): Name of the whole test suite. missing_test_names: Names of the tests that should have run but didn't. tag (str): Tag for this execution. Used to distinguish archive folders. build_artifacts_url (str): GS Link to the artifacts of the build. Ex: ‘gs://chromeos-image-archive/betty-cq/R111-2349872/’ new_invocation (bool): Whether the test results should be uploaded in a new invocation.

recipe_modules / test_failures

DEPS: cros_infra_config, cros_tags, failures, failures_util, naming, src_state, urls, recipe_engine/buildbucket, recipe_engine/step

API for raising e2e test failures and presenting them.

class TestFailuresApi(RecipeApi):

A module for presenting test failures.

def get_additional_hw_test_not_run_failures(self, not_runnable_addtnl_tests):

def get_hw_test_results(self, hw_tests):

Logs hardware test status to UI, and raises on failed tests.

Args: hw_tests (list[SkylabResult]): List of Skylab suite results.

Returns: A Results object containing the list[Failure] of all failures discovered in the given runs and a dict mapping each task kind to its number of successes.

def get_hwtest_status(self, hw_test: SkylabResult):

Get the status of the hw_test.

def is_critical_test_failure(self, test):

Determine if the test is critical and has failed.

Args: test (SkylabResult): The test in question.

Returns: bool: True if the test is critical and has failed.

def is_hw_test_critical(self, hw_test):

Determine if the hw test was critical.

Args: hw_test (SkylabResult): The hardware test result in question.

Returns: bool: True if the test was critical.

recipe_modules / test_util

DEPS: cros_tags, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/properties

API to simplify testing CrOS recipes.

class TestUtilApi(RecipeApi):

A module providing test methods to simplify testing CrOS recipes.

recipe_modules / urls

DEPS: recipe_engine/buildbucket

API for creating task URLs out of complex data structures.

class UrlsApi(RecipeApi):

A module for creating links to tasks.

def get_build_link_map(self, build: build_pb2.Build):

Returns a {title->URL} for the given buildbucket build.

Args: build: The buildbucket build in question.

Returns: Dict of {title: URL} pointing to the build's MILO page.

@staticmethod
def get_gs_bucket_url(gs_bucket: str, gs_path: str):

Returns the Cloud Storage Browser URL given a bucket and path.

Args: gs_bucket: A string of the gs bucket name to use gs_path: A string matching a path within that bucket

Returns: URL pointing to the Cloud Storage Browser page matching the input.

@staticmethod
def get_gs_path_url(gs_uri):

Returns the Cloud Storage Browser URL to the given GS path.

Args: gs_uri: A string of the format “gs://<bucket>/<path>”

Returns: URL pointing to the Cloud Storage Browser page for the object.
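A hedged sketch of both URL constructions, assuming the standard Cloud Storage Browser URL scheme (the module's exact output formatting may differ):

def get_gs_bucket_url(gs_bucket: str, gs_path: str) -> str:
    # Cloud Storage Browser pages live under console.cloud.google.com.
    return f'https://console.cloud.google.com/storage/browser/{gs_bucket}/{gs_path}'

def get_gs_path_url(gs_uri: str) -> str:
    # Split 'gs://<bucket>/<path>' into its bucket and path components.
    bucket, _, path = gs_uri[len('gs://'):].partition('/')
    return get_gs_bucket_url(bucket, path)

# get_gs_path_url('gs://chromeos-image-archive/betty-cq/R111-2349872/')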

def get_logdog_url(self, step: step_data.StepData, log_name: str, use_top_level_step: bool=True):

Returns the LogDog URL for a step's log.

buildbucket.build.infra.logdog is used to find the LogDog hostname, project, and prefix.

Args: step: The step containing the log. Note that this should be the StepData for the step that actually ran, even if the log is attached to a higher-level nested step, see use_top_level_step. log_name: The name of the log added to a step. This can be a log that is automatically added to the step (e.g. “stdout”) or a log added to StepPresentation.logs by the recipe. use_top_level_step: If true, point the URL to the highest-level step, otherwise point the URL to the step that actually ran (which may be nested). For example, if the step is “outer step|run cmd” and use_top_level_step is true, the URL will be “.../outer_step/<log_name>”, otherwise it will be “.../outer_step/run_cmd/<log_name>”.

Returns: The LogDog URL.
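The mapping from step name to URL path component is the interesting part. A hedged sketch of just that piece (the full URL also needs the LogDog hostname, project, and prefix from buildbucket.build.infra.logdog, which this sketch omits):

def logdog_step_path(step_name: str, log_name: str,
                     use_top_level_step: bool = True) -> str:
    # Nested recipe step names are '|'-separated, e.g. 'outer step|run cmd'.
    parts = step_name.split('|')
    if use_top_level_step:
        parts = parts[:1]
    # LogDog paths replace spaces with underscores.
    return '/'.join(p.replace(' ', '_') for p in parts) + '/' + log_name

# logdog_step_path('outer step|run cmd', 'stdout')        -> 'outer_step/stdout'
# logdog_step_path('outer step|run cmd', 'stdout', False) -> 'outer_step/run_cmd/stdout'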

def get_skylab_result_link_map(self, skylab_result: SkylabResult):

Returns the URL to the given skylab result page.

Args: skylab_result: The Skylab result in question.

Returns: Dict of {title: URL} for the Skylab swarming task page if the suite succeeded, or entries for just the failed tests.

@staticmethod
def get_skylab_task_url(skylab_task: SkylabTask):

Returns the URL to the given skylab task.

Args: skylab_task: The Skylab task in question.

Returns: URL pointing to the Swarming task page for the Skylab task.

@staticmethod
def get_state_suffix(task_state: TaskState):

Returns a string suffix to supply info about the task.

Args: task_state: The task state.

Returns: A string denoting more information about the task.

recipe_modules / util

DEPS: recipe_engine/path

Module providing importable utilities.

class UtilApi(RecipeApi):

Includable utilities.

def proto_path_to_recipes_path(self, proto_path: common_pb2.Path):

Return a config_types.Path equivalent to the common_pb2.Path.

Args: proto_path: A Path proto message, as might be returned by the build API. Must be absolute, and location must be specified as either INSIDE or OUTSIDE.

Raises: ValueError: If proto_path.location is OUTSIDE and proto_path.path is not relative to any recipe anchor point. See the path API for more info about those anchor points. This exception is raised during self.m.path.abs_to_path(). ValueError: If proto_path.location is INSIDE. Chroot paths are migrated to different outside locations according to Chromite logic that is subject to change. We deliberately do not replicate that logic here. Instead, if a recipe needs to reference INSIDE paths returned by a build API endpoint, the endpoint should return OUTSIDE paths. If you need to refactor in this way, consider using the build API's ResultPath functionality: add a ResultPath field to the request message, and ensure that the endpoint runs inside the chroot via either service_chroot_assert or method_chroot_assert. The API router will automatically extract any Path (or repeated Path) fields to the given ResultPath. ValueError: If proto_path.location is not specified (i.e. NO_LOCATION).
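A hedged sketch of the conversion rules spelled out above (ProtoPath and the location constants are illustrative stand-ins for the common_pb2 message; abs_to_path plays the role of self.m.path.abs_to_path):

from dataclasses import dataclass

NO_LOCATION, INSIDE, OUTSIDE = 0, 1, 2  # hypothetical enum values

@dataclass
class ProtoPath:
    path: str
    location: int

def proto_path_to_recipes_path(proto_path: ProtoPath, abs_to_path):
    if proto_path.location == INSIDE:
        # Chroot-to-outside path migration is Chromite logic we deliberately
        # do not replicate; endpoints should return OUTSIDE paths instead.
        raise ValueError('INSIDE paths are not supported')
    if proto_path.location != OUTSIDE:
        raise ValueError('proto_path.location must be specified')
    # abs_to_path itself raises ValueError for paths that are not relative
    # to any recipe anchor point.
    return abs_to_path(proto_path.path)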

recipe_modules / vmlab

DEPS: recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

A module to interact with CrOS VMLab.

class VmlabApi(RecipeApi):

def __init__(self, properties, *args, **kwargs):

Initialize VmlabApi.

def clean_images(self, name, dry_run, rate=1):

Clean up VM images in the GCP project.

Removes expired images, and returns an error if any unknown image is found.

Args: name: name of the step. dry_run: don't really delete images if true. rate: maximum number of requests per second.

Returns: Object containing the result of the cleanup, including the total number of images, deleted images, images that failed to import, and unknown images.

def cleanup_vm(self, name, config, swarming_bot_name, dry_run=False, rate=1, allow_failure=False):

Cleanup orphan VM instances.

Args: name: name of the step. config: config name preconfigured in vmlab CLI. swarming_bot_name: only cleanup instances created by the given swarming bot. dry_run: dry run mode: only list instances, without deleting any. rate: rate limit for instance deletion requests. allow_failure: if set to True, the step will not raise if the CLI returns a non-zero result.

def delete_vm(self, name, config, instance_name):

Deletes a given VM instance.

Args: name: name of the step. config: config name preconfigured in vmlab CLI. instance_name: name of the instance returned by lease_vm.

def import_image(self, name, build_path, wait, assert_ready=False):

Import a VM image from GCS to GCE.

Args: name: name of the step. build_path: build path of the image in GCS without bucket, for example betty-arc-r-cq/R108-15164.0.0-71927-8801111609984657185 wait: whether to wait for the image import to complete. assert_ready: raise StepFailure if image is not in READY state.

Returns: Object containing project, name, status, source of the imported image.

def lease_vm(self, name, config, image_name, image_project=DEFAULT_IMAGE_PROJECT, swarming_bot_name=None):

Lease a VM.

Args: name: name of the step. config: config name preconfigured in vmlab CLI. image_name: name of the image to use. image_project: GCP project where the image is stored. swarming_bot_name: name of the swarming bot. cleanup_vm may not work correctly on some backends if an empty swarming_bot_name is provided.

recipe_modules / workspace_util

DEPS: cros_build_api, cros_infra_config, cros_relevance, cros_source, easy, src_state, recipe_engine/context, recipe_engine/step

API for various support functions for building.

class WorkspaceUtilApi(RecipeApi):

A module for workspace setup and manipulation.

def apply_changes(self, changes: Optional[List[GerritChange]]=None, name: str=‘cherry-pick gerrit changes’, ignore_missing_projects: bool=False):

Apply gerrit changes.

Args: changes: Changes to apply. Default: changelist saved in cros_infra_config.configure_builder(). name: Step name. ignore_missing_projects: If true, changes to projects that are not currently checked out (as determined by repo forall) will not be applied. An example of when this is useful: it is possible that changes includes changes to repos this builder is not allowed to read (e.g. because of Cq-Depend grouping); the changes will be discarded instead of failing during application.

def checkout_change(self, change: Optional[GerritChange]=None, name: str=‘checkout gerrit change’):

Check out a gerrit change using the gerrit refs/changes/... workflow.

Differs from apply_changes in that the change is directly checked out, not cherry-picked, so the patchset parent will be accurate. Used for things like tricium where line numbers matter.

Args: change: Change to check out. name: Step name.

def detect_toolchain_cls(self, chroot: Chroot, test_value: Optional[bool]=None, name: Optional[str]=None):

Check for toolchain changes.

If there are any changes that affect the toolchain, set that workspace attribute.

Args: chroot: The chroot for the build. test_value: The value to use for tests, or None to detect toolchain changes unless step data is provided elsewhere. name: The name for the step, or None for default.

Returns: Whether there are toolchain patches applied.

@property
def patch_sets(self):

The patch sets (with commit and file info) applied to the build.

@contextlib.contextmanager
def setup_workspace(self, default_main: bool=False):

Prepare the source checkout for building.

Args: default_main: Whether to checkout tip-of-tree instead of snapshot when no gitiles_commit was provided.

Yields: A context where source is set up, and the current working directory is the workspace path. Note that api.cros_sdk.cleanup_context() is generally going to be needed.

@contextlib.contextmanager
def sync_to_commit(self, commit: Optional[GitilesCommit]=None, staging: bool=False, projects: Optional[List[str]]=None):

Sync the source tree.

Args: commit: The gitiles_commit to sync to. Default: commit saved in cros_infra_config.configure_builder(). staging: Whether this is a staging build. projects: Project names or paths to return info for. Defaults to all projects.

Yields: A context manager which syncs the workspace path.

@contextlib.contextmanager
def sync_to_manifest_groups(self, manifest_groups: List[str], local_manifests: Optional[List[LocalManifest]]=None, cache_path_override: Optional[Path]=None, gitiles_commit: Optional[GitilesCommit]=None, manifest_branch: Optional[str]=None):

Return a context with manifest groups checked out to cwd.

The subset of repos in the external manifest + local_manifests matching manifest_groups is synced. For example, say the external manifest contains repos “a” and “b” in group “g1”, “c” in group “g2”, and “d” in group “g3”; a local manifest contains repo “e” in group “g4”; and manifest_groups is [“g1”, “g4”]. Repos “a”, “b”, and “e” will be synced.

Note the importance of the cache_path_override parameter. For cases where the number of repos being synced is much smaller than a full checkout it is more efficient to override the default cache. This is because the time to delete unused repos (which are present because of caching) is much larger than the time to sync the used repos.

Args: manifest_groups: List of manifest groups to checkout. local_manifests: A list of local manifests to add, or None if not syncing a local manifest. cache_path_override: Path to sync into. If None, the default caching of cros_source.ensure_synced_cache is used. gitiles_commit: The gitiles_commit to sync to. Default: commit saved in cros_infra_config.configure_builder(). manifest_branch: Branch to checkout. See the --manifest-branch option of repo init for details and defaults.
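A hedged sketch of the group-matching behavior, with manifests modeled as simple dicts (the repo names and group assignments are hypothetical, chosen only to match the example above; the real implementation operates on repo XML manifests):

def repos_to_sync(external_manifest, local_manifests, manifest_groups):
    # Merge the external manifest with any local manifests, then keep the
    # repos whose group set intersects the requested manifest_groups.
    wanted = set(manifest_groups)
    combined = dict(external_manifest)
    for local in local_manifests:
        combined.update(local)
    return sorted(name for name, groups in combined.items() if groups & wanted)

external = {'a': {'g1'}, 'b': {'g1'}, 'c': {'g2'}, 'd': {'g3'}}
local = [{'e': {'g4'}}]
assert repos_to_sync(external, local, ['g1', 'g4']) == ['a', 'b', 'e']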

@property
def toolchain_cls_applied(self):

Whether there are toolchain CLs applied to the source tree.

@property
def workspace_path(self):

The path to the workspace.

Recipes

recipes / afdo_orchestrator

DEPS: orch_menu, test_util, recipe_engine/properties, recipe_engine/swarming

Recipe that generates artifacts using HW Test results.

All builders run against the same source tree.

def DoRunSteps(api: RecipeApi, properties: AfdoOrchestratorProperties):

def RunSteps(api: RecipeApi, properties: AfdoOrchestratorProperties):

recipes / afdo_process

DEPS: build_menu, cros_sdk, sysroot_util, test_util

Recipe for building an AFDO benchmark profile.

def DoRunSteps(api: RecipeApi, config: BuilderConfig, properties: AfdoProcessProperties):

def RunSteps(api: RecipeApi, properties: AfdoProcessProperties):

recipes / analysis_service:examples/full

DEPS: analysis_service, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api: RecipeApi, properties: FullProperties):

recipes / analysis_service:tests/publish_events_filter_payload

DEPS: analysis_service, cros_build_api, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api):

recipes / android:examples/full

DEPS: android, cros_build_api, gerrit, recipe_engine/properties

def RunSteps(api: RecipeApi, properties: TestProperties):

recipes / android:examples/misc

DEPS: android, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / android:examples/uprev

DEPS: android

def RunSteps(api: RecipeApi):

recipes / android_uprev_orchestrator

DEPS: android, cros_infra_config, cros_source, easy, gerrit, git, orch_menu, repo, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step

Orchestrator for Android uprev builders.

The orchestrator determines the latest Android version for the specified Android package, then passes the info down into child builders running the build_android_uprev recipe.

Once all builds and tests pass, it submits a CL to update the Android LKGB file. The change will in turn trigger the PUpr generator to publish an actual Android uprev.

def DoRunSteps(api: RecipeApi, properties: AndroidUprevProperties):

def RunSteps(api: RecipeApi, properties: AndroidUprevProperties):

recipes / annealing

DEPS: binhost_lookup_service, cros_build_api, cros_cq_depends, cros_infra_config, cros_source, cros_tags, easy, gerrit, git, git_footers, git_txn, naming, repo, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/cv, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Recipe for the CrOS annealing builders.

The annealing builders run in serial and do the following:

  1. Checkout ToT
  2. Rewind (i.e. checkout an ancestor) projects with missing dependencies; this prevents a bad tree state due to e.g. Gerrit replication latency.
  3. Uprev portage packages (for each board)
  4. Make a manifest snapshot (aka “pinned manifest”), and push it
  5. Perform post-submit tasks like:
  • push metadata for e.g. Goldeneye, findit

def RunSteps(api, properties):

recipes / auto_retry_util:tests/analyze_build_failures

DEPS: auto_retry_util, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/analyze_test_results

DEPS: auto_retry_util, cros_history, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/cq_retry_candidates

DEPS: auto_retry_util, gerrit, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/properties

Tests for the cq_retry_candidates function.

def RunSteps(api):

recipes / auto_retry_util:tests/filter_candidates

DEPS: auto_retry_util, gerrit, git_footers, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Tests for the filter_candidates function.

def RunSteps(api):

recipes / auto_retry_util:tests/footers

DEPS: auto_retry_util, git_footers, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/get_exonerated_suites

DEPS: auto_retry_util, cros_history, skylab_results, test_util, depot_tools/gitiles, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/get_failure_attributed_suites

DEPS: auto_retry_util, cros_infra_config, skylab_results, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Tests for the get_failure_attributed_hw_suites method.

def RunSteps(api):

recipes / auto_retry_util:tests/is_experimental_feature_enabled

DEPS: auto_retry_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/multi_retry

DEPS: auto_retry_util, gerrit, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/time

Tests of multi-retry filtering.

def RunSteps(api):

recipes / auto_retry_util:tests/not_at_fault_package_failures

DEPS: auto_retry_util, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Tests retrying package failures unrelated to the changes under test.

def RunSteps(api):

recipes / auto_retry_util:tests/retry_build

DEPS: auto_retry_util, gerrit, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/sdk_failures

DEPS: auto_retry_util, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Tests involving SDK failure retries.

def RunSteps(api):

recipes / auto_retry_util:tests/snapshot_greenness_cache

DEPS: auto_retry_util, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Tests caching greenness for the latest snapshot.

def RunSteps(api):

recipes / auto_retry_util:tests/submission_blocking_builders

DEPS: auto_retry_util, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/test_variant_exoneration_analysis

DEPS: auto_retry_util, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/luci_analysis, recipe_engine/properties

def RunSteps(api):

recipes / auto_retry_util:tests/throttle

DEPS: auto_retry_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / auto_runner_util:tests/current_revision_number

DEPS: auto_runner_util, recipe_engine/assertions, recipe_engine/properties

Tests for current_revision_number prop.

def RunSteps(api):

recipes / auto_runner_util:tests/dry_run

DEPS: auto_runner_util, gerrit, recipe_engine/properties

Tests for executing dry runs.

def RunSteps(api):

recipes / auto_runner_util:tests/filter_dependent_changes

DEPS: auto_runner_util, recipe_engine/assertions, recipe_engine/properties

Tests for filtering cq-dependent changes.

def RunSteps(api):

recipes / auto_runner_util:tests/filter_no_reviewer_changes

DEPS: auto_runner_util, recipe_engine/assertions, recipe_engine/properties

Tests for the CLs-with-reviewers functionality.

def RunSteps(api):

recipes / auto_runner_util:tests/get_change_infos_from_gerrit

DEPS: auto_runner_util, recipe_engine/assertions, recipe_engine/properties

Tests for get_change_infos_from_gerrit function.

def RunSteps(api):

recipes / auto_runner_util:tests/get_eligible_cls

DEPS: auto_runner_util, gerrit, recipe_engine/assertions, recipe_engine/properties

Tests for get_eligible_cls function.

def RunSteps(api):

recipes / auto_runner_util:tests/quota_limit

DEPS: auto_runner_util, gerrit, recipe_engine/assertions, recipe_engine/properties, recipe_engine/time

Tests for quota limits.

def RunSteps(api):

recipes / binhost_lookup_service:examples/publish_binhost_data

DEPS: binhost_lookup_service, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step

Test the publish binhost metadata functionality of the module.

def RunSteps(api: RecipeApi):

recipes / binhost_lookup_service:examples/publish_snapshot_data

DEPS: binhost_lookup_service, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step

Test the publish snapshot metadata functionality of the module.

def RunSteps(api: RecipeApi):

recipes / bot_cost:examples/calculate_build_cost

DEPS: bot_cost, cros_tags, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Tests for bot_cost.

def RunSteps(api: RecipeApi):

recipes / bot_cost:tests/bot_size

DEPS: bot_cost, cros_tags, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api: RecipeApi):

recipes / bot_scaling:examples/drop_cpus

DEPS: bot_scaling, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

Unit tests for the drop_cpu_cores function.

def RunSteps(api, properties):

recipes / bot_scaling:examples/get_bot_request

DEPS: bot_scaling, recipe_engine/assertions

def RunSteps(api):

recipes / bot_scaling:examples/get_gce_config

DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions

def RunSteps(api):

recipes / bot_scaling:examples/get_quota_usage

DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / bot_scaling:examples/get_robocrop_action

DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / bot_scaling:examples/get_scaling_action

DEPS: bot_scaling, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / bot_scaling:examples/get_swarming_demand

DEPS: bot_scaling, recipe_engine/assertions

Test case for api.bot_scaling.get_swarming_demand().

def RunSteps(api: recipe_api.RecipeApi):

def make_irrelevant_bot_stats(bot_group: str):

Return a BotStats object whose numbers don't matter.

This will help us ensure that we only calculate demand for the bot group we care about, and ignore all others. The state numbers are so large that if we accidentally use them, it should be immediately obvious that something is wrong.

def make_irrelevant_task_stats(bot_group: str, task_state: str):

Return a TaskStats object whose numbers don't matter.

This will help us ensure that we only calculate demand for the bot group we care about, and ignore all others. The count number is so large that if we accidentally use it, it should be immediately obvious that something is wrong.

recipes / bot_scaling:examples/get_swarming_stats

DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions

def RunSteps(api):

recipes / bot_scaling:examples/update_bot_policy_config

DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions

def RunSteps(api):

recipes / bot_scaling:examples/update_gce_configs

DEPS: bot_scaling, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / brancher

DEPS: cros_branch, cros_release_config, cros_source, easy, workspace_util, depot_tools/gsutil, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Recipe for creating a new ChromeOS branch.

def RunSteps(api: RecipeApi, properties: BrancherProperties):

def is_108_or_greater(source_version: str):

def is_unsupported_rubik_build(api: RecipeApi, source_version: str):

recipes / breakpad:examples/full

DEPS: breakpad, cros_test_postprocess, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io

def RunSteps(api):

recipes / breakpad:examples/no_symbols

DEPS: breakpad, cros_test_postprocess, recipe_engine/assertions, recipe_engine/path, recipe_engine/raw_io

def RunSteps(api):

recipes / build_android_uprev

DEPS: android, build_menu, recipe_engine/buildbucket, recipe_engine/properties

Recipe for building a BuildTarget image for Android uprev.

The target Android package/version to uprev is specified via input properties, for example:

“$chromeos/android”: { “android_package”: “android-vm-rvc”, “android_version”: “7444938” }

def DoRunSteps(api: RecipeApi, properties: AndroidUprevProperties, config: BuilderConfig):

def RunSteps(api: RecipeApi, properties: AndroidUprevProperties):

recipes / build_bisector

DEPS: bot_scaling, build_menu, cros_history, cros_infra_config, future_utils, sysroot_archive, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/properties, recipe_engine/raw_io

Recipe for building a BuildTarget image for Bisector.

def DoRunSteps(api: RecipeApi, config: BuilderConfig):

def RunSteps(api: RecipeApi, properties: BuildBisectorProperties):

recipes / build_borealis_rootfs

DEPS: build_menu, cros_sdk, cros_source, easy, gerrit, git, repo, src_state, depot_tools/depot_tools, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Recipe for building a Borealis rootfs image.

def DoRunSteps(api: RecipeApi, properties: BuildBorealisRootfsProperties):

def RunSteps(api: RecipeApi, properties: BuildBorealisRootfsProperties):

recipes / build_chromiumos

DEPS: build_menu, build_reporting, builder_metadata, cros_sdk, cros_source, cros_tags, debug_symbols, easy, recipe_engine/properties, recipe_engine/step

Recipe for building public ChromiumOS images.

def DoRunSteps(api: RecipeApi, config: BuilderConfig, properties: BuildChromiumosProperties):

def RunSteps(api: RecipeApi, properties: BuildChromiumosProperties):

recipes / build_compilation_database

DEPS: cros_sdk, recipe_engine/step

def RunSteps(api):

recipes / build_cq

DEPS: bot_scaling, build_menu, chrome, cros_infra_config, deferrals, easy, future_utils, gerrit, src_state, test_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/led, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Recipe for building a BuildTarget image for CQ.

def DoRunSteps(api: RecipeApi, config: BuilderConfig, upload_state: ArtifactUploadState):

def RunSteps(api: RecipeApi):

recipes / build_factory

DEPS: build_menu, build_reporting, cros_build_api, cros_infra_config, cros_release, cros_version, factory_util, signing, src_state, test_util, recipe_engine/file, recipe_engine/properties, recipe_engine/step

Recipe for generating artifacts for Factory builders.

This recipe supports the workflow necessary to support factory builders.

def RunSteps(api, properties: BuildFactoryProperties):

recipes / build_firmware

DEPS: build_menu, cros_artifacts, cros_build_api, cros_infra_config, cros_release, cros_sdk, cros_source, cros_version, easy, failures, src_state, test_util, depot_tools/gsutil, recipe_engine/bcid_reporter, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/step

Recipe that builds and tests firmware.

This recipe lives on its own because it is agnostic of ChromeOS build targets. This recipe should only be used for ToT firmware builds; build_legacy_fw (which is not deprecated) should be used for branch firmware builds. It is also acceptable to use this recipe for short-lived EC branches. There is DANGER that there could be unexpected interactions between unbranched recipes and branched cros_build_api calls. You are on your own if you attempt to use this recipe on a branch, and that branch should be as short-lived as possible.

def CreateContainers(api, config):

def CreateTi50TastArtifacts(api, location, config):

Create directories and files of artifacts needed by Ti50 Tast tests.

def RunSteps(api, properties):

def UploadTestResults(api, location, builder_name):

recipes / build_firmware_historical_db

DEPS: build_menu, cros_artifacts, cros_build_api, cros_infra_config, cros_sdk, cros_source, easy, failures, gcloud, src_state, test_util, depot_tools/gsutil, recipe_engine/bcid_reporter, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/step, recipe_engine/time

Recipe that manages Zephyr EC firmware's historical token database.

This recipe builds firmware and merges its unified database with the historical database in GCS. This database maintenance runs on a 24 hour cadence.

def CopyVersionedDatabase(api: RecipeApi, source: str):

def RunSteps(api, properties):

def UpdateHistoricalTokenDatabase(api: RecipeApi, location: common_pb2.FwLocation, uploaded_artifacts: UploadedArtifacts):

Updates the historical token database in GCS.

Updates the historical database in GCS using preconditions to avoid any race conditions between other builders.

Args: api: RecipesAPI object for dependencies. location: The firmware location. uploaded_artifacts: Artifacts uploaded.
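The recipe drives this through build steps, but the precondition idea can be illustrated with the standalone google-cloud-storage client: if_generation_match makes GCS reject the write when another builder has updated the object in the meantime. A hedged sketch of the technique, not the recipe's actual mechanism (bucket and object names are hypothetical):

from google.cloud import storage

def upload_if_unchanged(bucket_name: str, blob_name: str,
                        local_path: str, expected_generation: int) -> None:
    # GCS returns HTTP 412 if the object's generation no longer matches,
    # so concurrent writers cannot silently clobber each other's merge.
    client = storage.Client()
    blob = client.bucket(bucket_name).blob(blob_name)
    blob.upload_from_filename(local_path,
                              if_generation_match=expected_generation)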

recipes / build_incremental

DEPS: build_menu, build_plan, cros_build_api, cros_infra_config, cros_prebuilts, cros_sdk, cros_source, cros_tags, easy, git, incremental, repo, src_state, sysroot_util, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Recipe for building a BuildTarget incrementally.

def DoRunSteps(api: RecipeApi, config: BuilderConfig, properties: IncrementalProperties):

Tests the reliability of incremental builds by performing two builds.

First, revert the checkout back in time, run build_packages for that old state (generating local artifacts), and move the checkout back to ToT. Then, attempt to build from the current state with the old state intact.

Args: api: The recipe API. config: The BuilderConfig for this incremental builder. properties: Input properties for this build.

Returns: A list of relevant packages built.

def RunSteps(api: RecipeApi, properties: IncrementalProperties):

recipes / build_informational

DEPS: build_menu, test_util

Recipe for generating artifacts for Informational builders.

This recipe supports the workflow necessary to support asan, UBsan, and fuzzer builder profiles.

def RunSteps(api: RecipeApi):

recipes / build_kabuto_shadercache

DEPS: build_menu, easy, failures, git, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Recipe for building Borealis shadercache using Kabuto.

def DoRunSteps(api: RecipeTestApi, properties: BuildKabutoShadercacheProperties):

def RunSteps(api: RecipeApi, properties: BuildKabutoShadercacheProperties):

recipes / build_legacy_factory

DEPS: build_menu, src_state, recipe_engine/context, recipe_engine/properties

Recipe that builds factory images/artifacts on a factory branch.

def RunSteps(api, properties):

recipes / build_legacy_fw

DEPS: build_menu, build_reporting, cros_artifacts, cros_build_api, cros_infra_config, cros_release, cros_sdk, cros_version, easy, failures, git, metadata_json, repo, src_state, test_util, depot_tools/depot_tools, recipe_engine/bcid_reporter, recipe_engine/context, recipe_engine/cv, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Recipe that builds chromeos-firmware on a firmware branch.

This recipe is not deprecated. All branched firmware builds should use this recipe.

def RunSteps(api, properties):

recipes / build_linters

DEPS: build_menu, chrome, chromite, cros_build_api, cros_sdk, cros_source, gerrit, repo, src_state, recipe_engine/path, recipe_engine/step, recipe_engine/tricium

Recipe for linting CLs.

def DoRunSteps(api: RecipeApi, config: BuilderConfig, relevant_patchsets_by_linter: Dict[(str, Dict[(PatchSet, List[str])])]):

def RunSteps(api: RecipeApi, properties: BuildLintersProperties):

recipes / build_mass_deploy

DEPS: cros_infra_config, easy, failures, depot_tools/gsutil, recipe_engine/archive, recipe_engine/bcid_reporter, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step

Recipe for modifying images for mass deployment. Intended for use with ChromeOS Flex.

def RunSteps(api, properties):

recipes / build_menu:examples/full

DEPS: build_menu, cros_build_api, cros_history, cros_relevance, cros_version, gerrit, repo, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming

def DoRunSteps(api, config, properties):

def RunSteps(api, properties):

def step_data_complete_cached_container_gcs(api):

def step_data_incomplete_cached_container_gcs(api):

def step_data_no_cached_container_gcs(api):

recipes / build_menu:tests/is_staging

DEPS: build_menu, easy, recipe_engine/buildbucket

def RunSteps(api: recipe_api.RecipeApi):

Run main test logic.

recipes / build_menu:tests/no_dep_graph

DEPS: build_menu, test_util, recipe_engine/assertions, recipe_engine/swarming

def RunSteps(api):

recipes / build_menu:tests/publish_centralized_suites

DEPS: build_menu, cros_build_api, recipe_engine/buildbucket

Tests for build_menu.publish_centralized_suites

def RunSteps(api):

recipes / build_menu:tests/setup_chroot

DEPS: build_menu, cros_build_api, cros_infra_config, recipe_engine/assertions, recipe_engine/properties

Test coverage for BuildMenuApi.setup_chroot().

def RunSteps(api: RecipeApi, properties: SetupChrootProperties):

Setup the chroot, like a builder might do.

recipes / build_menu:tests/upload_sources

DEPS: build_menu, cros_infra_config, cros_source, src_state, test_util, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api, properties):

recipes / build_parallels_image

DEPS: easy, phosphorus, tast_exec, tast_results, test_util, depot_tools/gsutil, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Recipe for building a Parallels image for testing.

This recipe runs on a lab drone and takes control of a physical DUT to build a new Parallels VM image, for later use in automated testing.

This recipe involves booting up Windows in a virtual machine on the DUT. The caller is responsible for ensuring this is only invoked in contexts where the necessary license(s) have been obtained. Contact parallels-cros@ for more details.

This recipe is invoked as part of uprev_parallels_pin.

def RunSteps(api: RecipeApi, properties: BuildParallelsImageProperties):

def build_vm_image(api: RecipeApi, properties: BuildParallelsImageProperties):

Builds a new VM image for testing.

Returns: image_name(str): The name of the generated image. image_size(int): The size of the generated image, in bytes. image_hash(str): The base64-encoded SHA256 hash of the generated image.

def invoke_tast(api: RecipeApi, test_artifacts_dir: Path, build_payload: BuildPayload, dest_path: Path):

Runs tast to build the new VM image.

Args: test_artifacts_dir (Path): The location of test artifacts produced by the build. build_payload (BuildPayload): Describes where the artifact is on GS. dest_path (Path): The location that the produced VM image should be copied to (on the local disk).

recipes / build_plan:examples/cq_build_plan

DEPS: build_plan, cros_build_api, cros_history, cros_infra_config, cros_relevance, gerrit, git_footers, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, properties):

recipes / build_plan:examples/get_completed_builds

DEPS: build_plan, cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Unit tests for the get_completed_builds function.

def RunSteps(api, properties):

recipes / build_plan:examples/snapshot_build_plan

DEPS: build_plan, cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / build_plan:tests/cq_looks

DEPS: build_plan, cros_infra_config, gerrit, git_footers, orch_menu, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/properties, recipe_engine/step

def RunSteps(api, properties):

recipes / build_plan:tests/cq_looks_with_request_builders

DEPS: build_plan, cros_infra_config, git_footers, orch_menu, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cq, recipe_engine/properties, recipe_engine/time

Unit tests for choosing snapshots with requested builders.

def RunSteps(api):

recipes / build_plan:tests/get_forced_rebuilds

DEPS: build_plan, git_footers, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api):

recipes / build_plan:tests/get_relevant_builders

DEPS: build_plan, cros_build_api, cros_infra_config, recipe_engine/assertions, recipe_engine/properties

Unit tests for the get_relevant_builders function.

def RunSteps(api):

recipes / build_release

DEPS: bot_cost, bot_scaling, build_menu, build_reporting, builder_metadata, checkpoint, cros_build_api, cros_infra_config, cros_prebuilts, cros_release, cros_release_util, cros_sdk, cros_source, cros_tags, cros_test_plan, cros_try, cros_version, debug_symbols, dlc_utils, easy, failures, mass_deploy, signing, src_state, vmlab, recipe_engine/bcid_reporter, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/led, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step

Recipe for building images for release.

def DoRunSteps(api, config, properties):

def RunSteps(api, properties):

recipes / build_reporting:examples/contexts_1

DEPS: build_reporting, recipe_engine/properties

def RunSteps(api):

recipes / build_reporting:examples/contexts_2

DEPS: build_reporting, recipe_engine/properties

def RunSteps(api):

recipes / build_reporting:examples/full

DEPS: build_reporting, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/time

Exercise all the functionality of the build_reporting module.

def RunSteps(api):

recipes / build_reporting:tests/full

DEPS: build_reporting, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/time

def RunSteps(api):

recipes / build_reporting:tests/init_report_from_previous_build

DEPS: build_reporting, checkpoint, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/time

Test init_report_from_previous_build.

def RunSteps(api: RecipeApi):

recipes / build_reporting:tests/publish_dlcs

DEPS: build_reporting

def RunSteps(api):

recipes / build_reporting:tests/publish_signing

DEPS: build_reporting, recipe_engine/properties

Tests for publish_signed_build_metadata.

def RunSteps(api):

recipes / build_reporting:tests/publish_to_gs

DEPS: build_reporting, recipe_engine/assertions, recipe_engine/step

def RunSteps(api):

recipes / build_reporting:tests/publish_to_pubsub

DEPS: build_reporting, recipe_engine/assertions, recipe_engine/step

Tests for disabling/enabling pubsub update

def RunSteps(api):

recipes / build_reporting:tests/status_reporting

DEPS: build_reporting, recipe_engine/properties

def RunSteps(api):

recipes / build_sdk

DEPS: build_menu, cros_build_api, cros_sdk, deferrals, easy, git_footers, key_value_store, src_state, util, workspace_util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/scheduler, recipe_engine/step, recipe_engine/time

Recipe that builds a ChromiumOS SDK and cross-compilers.

def RunSteps(api: recipe_api.RecipeApi, properties: build_sdk_pb2.BuildSDKProperties):

recipes / build_sdk_subtools

DEPS: build_menu, cros_build_api, cros_sdk, cros_source, failures, image_builder_failures, src_state, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step

Recipe that runs the Subtools Builder.

The Subtools builder starts with an SDK, builds some additional host packages, then uploads build artifacts to external locations, such as CIPD.

def RunSteps(api: recipe_api.RecipeApi, properties: BuildSdkSubtoolsProperties):

def create_package_info(atom: str):

Parse category/package-name into a PackageInfo.
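A hedged sketch of that parse (PackageInfo here is an illustrative stand-in for the proto message):

from dataclasses import dataclass

@dataclass
class PackageInfo:
    category: str
    package_name: str

def create_package_info(atom: str) -> PackageInfo:
    # Portage atoms look like 'category/package-name'.
    category, _, package_name = atom.partition('/')
    if not category or not package_name:
        raise ValueError(f'not a category/package atom: {atom!r}')
    return PackageInfo(category=category, package_name=package_name)

# create_package_info('chromeos-base/chromeos-chrome')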

recipes / build_slim_cq

DEPS: bot_scaling, build_menu, cros_history, cros_infra_config, cros_relevance, cros_tags, easy, test_util, recipe_engine/buildbucket, recipe_engine/runtime, recipe_engine/step

Recipe for building and testing a BuildTarget's packages.

def DoRunSteps(api: RecipeApi, config: BuilderConfig):

def RunSteps(api: RecipeApi):

recipes / build_snapshot

DEPS: bot_scaling, build_menu, build_reporting, cros_history, cros_infra_config, cros_sdk, future_utils, test_util, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/properties, recipe_engine/step

Recipe for building a BuildTarget image for Snapshot.

def DoRunSteps(api: RecipeApi, config: BuilderConfig):

def RunSteps(api: RecipeApi):

recipes / build_toolchain

DEPS: build_menu, cros_build_api, cros_sdk, cros_version, gerrit, repo, test_util, workspace_util, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Builds and uploads the CrOS toolchain.

def RunSteps(api: RecipeApi, properties: BuildToolchainProperties):

recipes / buildbucket_stats:examples/get_bot_demand

DEPS: buildbucket_stats, recipe_engine/assertions

def RunSteps(api):

recipes / buildbucket_stats:examples/get_bucket_status

DEPS: buildbucket_stats, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / buildbucket_stats:examples/get_build_count

DEPS: buildbucket_stats, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / buildbucket_stats:tests/get_snapshot_greenness

DEPS: buildbucket_stats, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / builder_metadata:tests/get_models

DEPS: build_menu, builder_metadata, recipe_engine/assertions, recipe_engine/step

Tests to verify builder_metadata.get_models.

def RunSteps(api):

recipes / builder_metadata:tests/lookup_is_cached

DEPS: build_menu, builder_metadata, recipe_engine/assertions, recipe_engine/step

Tests to verify that builder_metadata is properly cached between invocations.

def RunSteps(api):

recipes / builder_metadata:tests/no_install_packages

DEPS: build_menu, builder_metadata, recipe_engine/assertions, recipe_engine/step

Test to verify install_packages is called prior to look_up_builder_metadata.

def RunSteps(api):

recipes / check_fit_image

DEPS: cros_source, gerrit, repo, src_state, test_util, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Check that any binary blobs in a commit come from a valid FIT version

def RunSteps(api, properties):

def mock_fit_header(version):

Mock the header from the FIT tool with given version

Args: version (str): version string to put in FIT header

Return: version file contents as string

def mock_version_file(version=‘14.0.40.1206’, hashes=None, delete=None):

Mock version file contents

Args: version (str): optional version string to put in FIT header hashes (dict): file => sha256 values to override/add to file delete ([str]): list of keys to remove from the file (default none)

Return: version file contents as string

def parse_versions_file(step_name, api, path):

Parse a file containing SHA-256 hashes and binary names into a map.

This file is generated by gen_hash_references.sh in the sys-boot overlays of individual baseboards. It's a list of SHA-256 hashes and associated file names.

Args: path (str): path to versions file

Return: (Fit Version, { filename => SHA-256 hash })
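A hedged sketch of the parse, assuming one “<sha256> <filename>” pair per line plus a line carrying the FIT version; the real layout produced by gen_hash_references.sh may differ:

def parse_versions_text(contents: str):
    # Returns (fit_version, {filename: sha256}).
    fit_version = None
    hashes = {}
    for line in contents.splitlines():
        parts = line.split()
        if len(parts) == 2 and len(parts[0]) == 64:  # 64 hex chars = SHA-256
            hashes[parts[1]] = parts[0]
        elif parts:
            fit_version = parts[-1]  # e.g. '14.0.40.1206' (assumed layout)
    return fit_version, hashes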

recipes / check_fpp_build

DEPS: recipe_engine/path, recipe_engine/step

Recipe to check fpp builds.

def RunSteps(api):

recipes / check_project_config

DEPS: cros_source, gerrit, gs_step_logging, repo, src_state, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Checks a project conforms to its program's constraints.

def RunSteps(api, properties):

recipes / checkpoint:examples/retry

DEPS: checkpoint, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/step

def RunSteps(api: RecipeApi):

recipes / checkpoint:tests/build_target_retry_props

DEPS: checkpoint, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / checkpoint:tests/builder_children

DEPS: checkpoint, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / checkpoint:tests/cascade

DEPS: checkpoint, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api: RecipeApi, properties: TestProperties):

recipes / checkpoint:tests/update_summary

DEPS: checkpoint, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / chrome:examples/cache_sync

DEPS: chrome, recipe_engine/context, recipe_engine/path, recipe_engine/step

def RunSteps(api):

recipes / chrome:examples/chrome_sync

DEPS: chrome, cros_infra_config, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step

def RunSteps(api):

recipes / chrome:examples/full

DEPS: chrome, cros_build_api, gerrit, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step

def RunSteps(api, properties):

def jsonify(**kwargs):

Return the kwargs as a json string.

recipes / chrome:tests/follower_needs_chrome_no_has_prebuilt

DEPS: chrome, cros_build_api, recipe_engine/assertions, recipe_engine/file

def RunSteps(api):

recipes / chrome:tests/gclient_retry

DEPS: chrome, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / chrome:tests/is_chrome_pupr_atomic_uprev

DEPS: chrome, gerrit, repo, recipe_engine/assertions, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api, properties):

recipes / chromeos_cbuildbot

DEPS: bot_cost, chromite, easy, gcloud, depot_tools/gitiles, recipe_engine/legacy_annotation, recipe_engine/properties, recipe_engine/step, recipe_engine/swarming

def DoRunSteps(api: RecipeApi, properties: ChromeosCbuildbotProperties):

def MakeSummaryMarkdown(api: RecipeTestApi, failure: StepFailure):

def RunSteps(api: RecipeApi, properties: ChromeosCbuildbotProperties):

recipes / chromite:examples/full

DEPS: chromite, depot_tools/gitiles, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/swarming

def RunSteps(api):

recipes / chromium_ide_pre_release

DEPS: build_menu, cros_source, gerrit, git, repo, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/url

Recipe for creating pre-release CL for ChromiumIDE.

def RunSteps(api: RecipeApi, properties: ChromiumIDEPreReleaseProperties):

recipes / chromiumos_codesearch

DEPS: build_menu, cros_build_api, cros_infra_config, cros_sdk, cros_source, easy, depot_tools/bot_update, depot_tools/gclient, infra/codesearch, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Upload a kzip so Kythe can provide language support services for ChromeOS.

This recipe checks out a Chrom(e|ium)OS manifest, syncs it to the latest snapshot, and builds packages with FEATURES=noclean so that build artifacts are preserved. Then the package_index_cros script creates a compilation database for all supported packages, and the package_index script bundles it into a kzip file. That file gets uploaded to Kythe (go/kythe), which will index it to serve cross-references to both Code Search and Cider G.

Note: the “ChromiumOS” in the name is outdated. Originally this recipe was written with the assumption that it used the public ChromiumOS manifest. Now the internal manifest can be specified via input properties.

def RunSteps(api, properties):

def generate_compilation_database(api: recipe_api.RecipeApi, build_dir: config_types.Path, build_target: str, packages: Iterable[str]):

Generate a compilation database for all the given packages.

Args: api: The recipe API. build_dir: The directory that should contain files from the build process. build_target: The build target to build packages for. packages: A list of package names to build.

recipes / chromiumos_codesearch_initiator

DEPS: depot_tools/git, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/scheduler

Builder that launches the chromiumos_codesearch builders.

Each chromiumos_codesearch builder generates and uploads a kzip for a different build target. This builder finds the latest snapshot manifest to check out, and launches all the build targets' builders to ensure that they use the same manifest.

Note: the “ChromiumOS” in the name is outdated. Originally this recipe was written with the assumption that it used the public ChromiumOS manifest. Now the internal manifest can be specified via input properties.

def RunSteps(api: recipe_api.RecipeApi, properties: ChromiumosCodesearchInitiatorProperties):

def latest_ref_info(api: recipe_api.RecipeApi, clone_dir: str, repo: str, branch: str):

Return the hash and timestamp of the latest commit on a branch.

recipes / cipd_uprev

DEPS: deferrals, golucibin, recipe_engine/cipd, recipe_engine/json, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

Recipe for upreving cipd packages.

def RunSteps(api: RecipeApi, properties: cipd_uprev.Properties):

def get_current_instance(api: RecipeApi, instruction: cipd_uprev.Instruction):

Get the current version of the ref.

Args:

  • instruction (cipd_uprev.Instruction): A complete set of args for cipd set-ref.

Returns: cipd_uprev.PackageInstance

Raises: A StepFailure if the CIPD tool call fails.

def uprev_package(api: RecipeApi, instruction: cipd_uprev.Instruction, package_tags=None):

Change CIPD ref of a package according to the instructions.

Args:

  • instruction (cipd_uprev.Instruction): A complete set of args for cipd set-ref.
  • package_tags: Tags to add to the package.

Returns: cipd_uprev.PackageInstance

Raises: A StepFailure if the CIPD tool call fails.

def validate(api: RecipeApi, instruction: cipd_uprev.Instruction):

Validate instructions for uprevving a specific package.

Args:

  • instruction (cipd_uprev.Instruction): A complete set of args for cipd set-ref.

Raises: A ValueError if validation fails.
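Both helpers ultimately wrap the cipd CLI. A hedged sketch of the two underlying operations as standalone subprocess calls (the output parsing is approximate, and the recipe itself uses the recipe_engine/cipd module rather than subprocess):

import subprocess

def current_instance(package: str, ref: str) -> str:
    # 'cipd resolve' maps a ref to the instance ID it currently points at.
    out = subprocess.run(['cipd', 'resolve', package, '-version', ref],
                         check=True, capture_output=True, text=True).stdout
    # Output ends with lines like '  <package>:<instance_id>' (approximate).
    return out.strip().rsplit(':', 1)[-1]

def set_ref(package: str, ref: str, instance_id: str) -> None:
    # 'cipd set-ref' moves the ref; this is the uprev itself.
    subprocess.run(['cipd', 'set-ref', package, '-ref', ref,
                    '-version', instance_id], check=True)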

recipes / cl_factory

DEPS: cros_cq_depends, cros_source, easy, gerrit, git, repo, src_state, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/swarming

Used to create sweeping changes by creating CLs in many repos.

This recipe is currently focused on the use case of running gen_config in program and project repositories. Invocation is most easily handled via the cl_factory script in the chromiumos/config repo's bin directory:

https://chromium.googlesource.com/chromiumos/config/+/HEAD/bin/cl_factory

That script is a wrapper around the bb add command which ends up executing something that looks like this:

bb add -cl https://chrome-internal-review.googlesource.com/c/chromeos/program/galaxy/+/3095418 -p ‘repo_regexes=[“src/project/galaxy”]’ -p 'message_template=Hello world

BUG=chromium:1092954 TEST=None' -p ‘reviewers=[“reviewer@google.com”]’ -p ‘hashtags=[“mondo-update”]’ -p ‘replace_strings=[{“file_glob”: “*.star”, “before”: “_CLAMSHELL”, “after”: “_CONVERTIBLE”}]’ chromeos/infra/ClFactory

For more details on the input properties, see cl_factory.proto.

def RunSteps(api: RecipeApi, properties: ClFactoryProperties):

recipes / clean_up_lkgm_uprev_cls

DEPS: bot_scaling, cros_infra_config, cros_lkgm, cros_source, easy, failures, test_util, recipe_engine/properties, recipe_engine/step

Recipe that cleans up LKGM uprev CLs.

This recipe lives on its own because it is agnostic of ChromeOS build targets.

def RunSteps(api: RecipeApi):

recipes / clean_vm_images

DEPS: build_menu, vmlab

Recipe for cleaning up stale GCP VM images.

def RunSteps(api: RecipeApi):

recipes / cloud_pubsub:examples/full

DEPS: cloud_pubsub

def RunSteps(api):

recipes / cloud_pubsub:tests/raises_on_failed_publish

DEPS: cloud_pubsub, recipe_engine/properties

def RunSteps(api):

recipes / cloud_pubsub:tests/request_size_too_large

DEPS: cloud_pubsub, recipe_engine/properties

Tests the case where a request is larger than the Pub/Sub limit.

def RunSteps(api):

recipes / code_coverage:examples/firmware_lcov

DEPS: build_menu, cros_build_api, recipe_engine/swarming

def RunSteps(api):

recipes / code_coverage:examples/full

DEPS: build_menu, code_coverage, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/swarming

def RunSteps(api):

recipes / code_coverage:examples/upload_active_version

DEPS: build_menu, code_coverage, recipe_engine/assertions, recipe_engine/time

Tests to test e2e coverage uploads active version.

def RunSteps(api):

recipes / code_coverage:examples/upload_code_coverage_llvm_json

DEPS: build_menu, code_coverage, recipe_engine/raw_io, recipe_engine/swarming

def RunSteps(api):

recipes / code_coverage:examples/upload_e2e_coverage

DEPS: build_menu, code_coverage, recipe_engine/file, recipe_engine/raw_io, recipe_engine/swarming

Tests to test e2e coverage uploads.

def RunSteps(api):

recipes / code_coverage:examples/upload_firmware_lcov

DEPS: build_menu, code_coverage, recipe_engine/cv, recipe_engine/swarming

def RunSteps(api):

recipes / collect_preuprev_test_results

DEPS: cros_source, cros_tags, gerrit, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Recipe that retrieves the results of tests executed before a Chrome uprev to CrOS and warns on failure.

def DoRunSteps(api: RecipeApi, current_build_id: int, uprev_cl_number_overridden_for_testing: Optional[int]):

def GetPreUprevTestBuilders(api: RecipeApi, release_task_id: int):

def GetPuprGeneratorBuilder(api: RecipeApi, pupr_cordinator_task_id: int):

def GetUprevClNumber(pupr_generator_task: build_pb2.Build):

def RunSteps(api: RecipeApi, properties: CollectPreuprevTestResults):

def ToBuilderIds(builders: List[build_pb2.Build]):

recipes / conductor:examples/full

DEPS: conductor, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / conductor:tests/conductor_failures

DEPS: conductor, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / conductor:tests/no_bbids

DEPS: conductor, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / conductor:tests/no_config

DEPS: conductor, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api: RecipeApi):

recipes / conductor:tests/not_enabled

DEPS: conductor, recipe_engine/assertions

def RunSteps(api: RecipeApi):

recipes / config_backfill

DEPS: cros_artifacts, cros_infra_config, cros_source, easy, git, git_txn, repo, src_state, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Copy legacy configuration and generate backfilled configuration.

def RunSteps(api, properties):

def backfill_project(api, config):

Backfill an individual project.

Expects to be run in the root of the chromeos checkout.

Args: config (ConfigBackfillProperties.ProjectConfig) - configuration for project

Return: BackfillStatus with results of backfill. The commit hash is empty if no commit is made.

def config_merger(api, config, path_cros_repo, step_pres):

Create a closure to merge configs.

Meant to be called from git_txn.update_ref, which requires a single function taking no arguments, so we close over what we need.

Args: api: Reference to recipes API config: Merge config to execute path_cros_repo: Path to root of ChromeOS checkout step_pres: Step presentation instance

Return: closure to execute merge operation
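A minimal sketch of that closure pattern, assuming a hypothetical merge command (only the zero-argument contract of git_txn.update_ref comes from this doc):

def config_merger(api, config, path_cros_repo, step_pres):
    # git_txn.update_ref expects a callable taking no arguments, so
    # capture everything the merge needs in a closure.
    def _do_merge():
        with api.context(cwd=path_cros_repo):
            # 'merge_tool' is a placeholder command for illustration.
            api.step('merge configs', ['merge_tool', config.name])
        step_pres.step_text = 'merged %s' % config.name
    return _do_merge

# Hypothetical usage:
#   api.git_txn.update_ref(repo_url, config_merger(api, cfg, path, pres))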

def create_download_payload(build):

Build a download payload.

Args: build: a Build message with output properties

Return: A BuildPayload if the Build message contains all the necessary information, otherwise None

def create_portage_workaround(api):

Hack around needing a full portage environment for reef/fizz.

Reef/fizz require their baseboard overlay to include common files. We can work around this by using symlinks to simulate the overlay.
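A sketch of that symlink trick using the recipe_engine/file module's symlink call; the overlay paths are illustrative assumptions:

# Instead of copying the baseboard's common files into the board overlay,
# point a symlink at them so portage sees a complete-looking overlay.
common = api.path.start_dir / 'overlays' / 'baseboard-common' / 'files'
overlay = api.path.start_dir / 'overlays' / 'overlay-reef' / 'files'
api.file.symlink('simulate overlay via symlink', common, overlay)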

def download_latest_config_yaml(api, builder_name):

Download latest project config.yaml from GS.

Args: api: Reference to recipes API builder_name: the full name of the builder to search for

Return: List containing the path where the downloaded config.yaml file resides, or an empty list if no GS path was found for the builder.

def format_output_markdown(commits, errors, nmissing):

Generate markdown to be shown for the build status.

Args: commits: list of (program, project, hash) values for commits errors: list of string-formattable errors nmissing: number of projects missing from manifest

Return: Formatted markdown string suitable to return via RawResult proto.

def require(cond, message):

Require a given condition be true or throw a ValueError.
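Given that description, the helper presumably amounts to the following (a sketch, not copied from the source):

def require(cond, message):
    # Raise ValueError carrying 'message' when the condition does not hold.
    if not cond:
        raise ValueError(message)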

def split_overlay_project(api, repo):

Take a private overlay URL and parse out project name.

recipes / config_postsubmit

DEPS: bot_scaling, cros_source, easy, failures, future_utils, gerrit, git, git_txn, repo, src_state, workspace_util, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Run miscellaneous actions on project repos.

Runs on a schedule rather than as a triggered/CQ action, so there is some latency between commits landing and this script executing its tasks.

For example, if a src/project repo has filtered public configs, there can be an action to copy these public configs to a public repo.

Each action is a function that takes a list of config repos to operate on and returns a list of repos to make commits to.
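A hedged sketch of that contract; the action name and helper are illustrative assumptions, since only the signature (list of repos in, list of repos-to-commit out) comes from this doc:

def sync_filtered_public_configs(repo):
    # Hypothetical helper: mirror the repo's filtered public configs to
    # its public counterpart; return True when anything changed.
    return bool(getattr(repo, 'has_public_configs', False))

def copy_public_configs(config_repos):
    # Example action: take the list of config repos to operate on, and
    # return the list of repos that now need commits.
    return [r for r in config_repos if sync_filtered_public_configs(r)]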

def RunSteps(api, properties):

recipes / cop

DEPS: cros_infra_config, easy, gerrit, src_state, support, test_util, depot_tools/gerrit, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/tricium

Recipe for CoP: A CL validator based on Google Cloud Build. go/cros-cop

def RunSteps(api: RecipeApi, properties: CopProperties):

recipes / copybot

DEPS: build_menu, cros_build_api, easy, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/runtime, recipe_engine/step, recipe_engine/time

Recipe for Copybot.

This recipe calls the RunCopybot endpoint from the Build API CopybotService.

def RunSteps(api: RecipeApi, properties: CopybotProperties):

def run_copybot(api: RecipeApi, properties: CopybotProperties):

Call the RunCopybot endpoint.

recipes / cq_auto_retrier

DEPS: auto_retry_util, deferrals, easy, future_utils, gerrit, test_util, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Recipe for analyzing and retrying failed CQ runs.

def RunSteps(api: RecipeApi):

recipes / cq_auto_runner

DEPS: auto_runner_util, easy

Recipe that opportunistically tries CQ runs on qualified CLs.

def RunSteps(api: RecipeApi):

recipes / cq_fault_attribution:tests/comparison_snapshots_retrieval

DEPS: cq_fault_attribution, looks_for_green, skylab_results, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / cq_fault_attribution:tests/disabled_fault_attribution

DEPS: cq_fault_attribution, skylab_results, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/resultdb, recipe_engine/step

def RunSteps(api, properties):

recipes / cq_fault_attribution:tests/ignores_failures

DEPS: failures, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / cq_fault_attribution:tests/no_comparison_snapshots_found

DEPS: cq_fault_attribution, looks_for_green, skylab_results, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / cq_fault_attribution:tests/set_test_failure_fault_attributes

DEPS: cq_fault_attribution, looks_for_green, skylab_results, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/resultdb, recipe_engine/step

def RunSteps(api):

def create_expected_fault_attribute_properties(test_name, snapshot_comparison_fault_attribution, likely_flaky, expected_snapshot_comparison_properties):

def create_expected_snapshot(source_build_id, source_completed_unix_timestamp):

def get_build_id_from_invocation(invocation_id: str):

def get_build_target_index(items: List[FaultAttributedBuildTarget], build_target: str, model: Union[(str, None)]):

def get_rdb_test_result_name(invocation_id: str, test_name: str):

recipes / cq_fault_attribution:tests/too_many_or_no_failed_tests

DEPS: cq_fault_attribution, skylab_results, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/resultdb, recipe_engine/step

def RunSteps(api, properties):

recipes / cq_orchestrator

DEPS: build_menu, cros_tags, easy, orch_menu, recipe_engine/buildbucket

Recipe that schedules CQ verifiers.

def RunSteps(api: RecipeApi):

recipes / cros_artifacts:examples/code_coverage

DEPS: build_menu, cros_build_api, recipe_engine/raw_io, recipe_engine/swarming

def RunSteps(api):

recipes / cros_artifacts:examples/download_artifacts

DEPS: cros_artifacts, cros_test_plan, recipe_engine/assertions

def RunSteps(api):

recipes / cros_artifacts:examples/full

DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_artifacts:examples/prepare_for_build

DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties

def RunSteps(api, properties):

recipes / cros_artifacts:examples/publish_latest_files

DEPS: cros_artifacts, cros_infra_config, git_footers, test_util, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_artifacts:tests/artifacts_gs_path

DEPS: cros_artifacts, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api, properties):

recipes / cros_artifacts:tests/download_artifacts

DEPS: cros_artifacts, cros_test_plan, recipe_engine/assertions

def RunSteps(api):

recipes / cros_artifacts:tests/failed_artifacts

DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions, recipe_engine/cv, recipe_engine/properties

def RunSteps(api):

recipes / cros_artifacts:tests/gsutil_retry_fail

DEPS: cros_artifacts, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

def attempt_download_file(api, attempt):

def attempt_publish_file(api, attempt):

recipes / cros_artifacts:tests/gsutil_retry_success

DEPS: cros_artifacts, recipe_engine/buildbucket

def RunSteps(api):

def attempt_download_file(api, attempt):

recipes / cros_artifacts:tests/has_artifacts

DEPS: cros_artifacts, recipe_engine/assertions

def RunSteps(api):

recipes / cros_artifacts:tests/previously_uploaded_artifacts

DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions

def RunSteps(api):

recipes / cros_artifacts:tests/upload_artifacts

DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions

def RunSteps(api):

recipes / cros_artifacts:tests/upload_attestations

DEPS: cros_artifacts, cros_build_api, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties

def RunSteps(api, properties):

recipes / cros_branch:examples/full

DEPS: cros_branch, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api):

recipes / cros_branch:tests/errors

DEPS: cros_branch, recipe_engine/assertions, recipe_engine/path, recipe_engine/step

def RunSteps(api):

recipes / cros_build_api:examples/full

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_build_api:examples/has_endpoint

DEPS: cros_build_api, recipe_engine/assertions

def RunSteps(api):

recipes / cros_build_api:examples/ok_retcodes

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/step

def RunSteps(api):

recipes / cros_build_api:examples/parallel_operations

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_build_api:examples/publish_events

DEPS: cros_build_api, recipe_engine/assertions

def RunSteps(api):

recipes / cros_build_api:examples/set_api_return

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step

def RunSteps(api, properties):

recipes / cros_build_api:examples/set_upreved_ebuilds

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step

def RunSteps(api, properties):

recipes / cros_build_api:tests/bad_retcodes

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / cros_build_api:tests/failed_pkg_data_names

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api):

recipes / cros_build_api:tests/failed_pkg_log_retrieval

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/file, recipe_engine/properties

def RunSteps(api):

recipes / cros_build_api:tests/failed_pkg_names

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api):

recipes / cros_build_api:tests/misc

DEPS: cros_build_api, recipe_engine/assertions, recipe_engine/properties

def RunSteps(api):

recipes / cros_build_api:tests/publish_events_throws

DEPS: cros_build_api, recipe_engine/assertions

def RunSteps(api):

recipes / cros_build_api:tests/remove_endpoints

DEPS: cros_build_api, recipe_engine/assertions

def RunSteps(api):

recipes / cros_build_api:tests/version

DEPS: cros_build_api, recipe_engine/assertions

def RunSteps(api):

recipes / cros_cache:examples/full

DEPS: cros_cache, recipe_engine/assertions, recipe_engine/path

def RunSteps(api):

recipes / cros_cq_additional_tests:examples/existing_in_test_response_exact_match

DEPS: cros_cq_additional_tests, git_footers, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_cq_additional_tests:examples/existing_in_test_response_not_exact_match

DEPS: cros_cq_additional_tests, git_footers, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_cq_additional_tests:examples/incorrect_footer

DEPS: cros_cq_additional_tests, git_footers, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_cq_additional_tests:examples/no_additional_ts

DEPS: cros_cq_additional_tests, git_footers, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_cq_additional_tests:examples/ts_for_bt_in_test_response

DEPS: cros_cq_additional_tests, git_footers, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_cq_additional_tests:examples/ts_for_bt_not_in_test_response

DEPS: cros_cq_additional_tests, git_footers, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_cq_additional_tests:examples/unmatched_build_targets

DEPS: cros_cq_additional_tests, git_footers, gitiles, repo, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_cq_additional_tests:tests/get_additional_test_builders

DEPS: cros_cq_additional_tests, git_footers, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / cros_cq_depends:examples/cq_depend_strings

DEPS: cros_cq_depends, recipe_engine/assertions

def RunSteps(api):

recipes / cros_cq_depends:examples/ensure_manifest_cq_depends_fulfilled

DEPS: cros_cq_depends, cros_source, gerrit, repo, src_state, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_debug:examples/patch_chromite_head

DEPS: cros_debug, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Success workflow tests for the cros_debug recipe module.

def RunSteps(api):

recipes / cros_debug:tests/pause_and_wait_for_signal

DEPS: cros_debug, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Success workflow tests for the cros_debug recipe module.

def RunSteps(api, properties):

recipes / cros_debug:tests/pause_and_wait_for_signal_timeout

DEPS: cros_debug, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Success workflow tests for the cros_debug recipe module.

def RunSteps(api):

recipes / cros_dupit:examples/arch

DEPS: cros_dupit, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_dupit:examples/full

DEPS: cros_dupit, recipe_engine/raw_io

def RunSteps(api):

recipes / cros_history:examples/get_annealing_from_snapshot

DEPS: cros_history, recipe_engine/buildbucket, recipe_engine/step

def RunSteps(api: recipe_api.RecipeApi):

recipes / cros_history:examples/get_matching_builds

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cv

def RunSteps(api):

recipes / cros_history:examples/get_passed_builds

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/cv, recipe_engine/properties

def RunSteps(api, properties):

recipes / cros_history:examples/get_passed_tests

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / cros_history:examples/get_previous_test_results

DEPS: cros_history, cros_test_plan, skylab_results, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / cros_history:examples/get_snapshot_builds

DEPS: cros_history, cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket

def RunSteps(api):

recipes / cros_history:examples/get_test_failure_builders

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / cros_history:examples/get_upreved_pkgs

DEPS: cros_history, recipe_engine/assertions

def RunSteps(api):

recipes / cros_history:examples/hours_since_breakage

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Unittests for hours_since_breakage() function.

def RunSteps(api, properties):

recipes / cros_history:examples/is_build_broken

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

Unittests for is_build_broken() function.

def RunSteps(api):

recipes / cros_history:examples/is_retry

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api):

recipes / cros_history:examples/set_passed_tests

DEPS: cros_history, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/step

def RunSteps(api):

recipes / cros_infra_config:examples/builder

DEPS: cros_infra_config, src_state, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api, properties):

recipes / cros_infra_config:examples/config_ref

DEPS: cros_infra_config, recipe_engine/assertions, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / cros_infra_config:examples/full

DEPS: cros_infra_config, test_util, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/properties

def RunSteps(api, properties):

recipes / cros_infra_config:examples/get_bot_policy_config

DEPS: cros_infra_config, recipe_engine/assertions

Test getting bot policy configs for ChromeOS and ChromeOSMPA projects.

def RunSteps(api):

recipes / cros_infra_config:examples/get_bot_policy_config_chrome

DEPS: cros_infra_config, recipe_engine/assertions

def RunSteps(api):

recipes / cros_infra_config:examples/get_build_id

DEPS: cros_infra_config, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/led, recipe_engine/properties

Test build_id property in cros_infra_config module.

def RunSteps(api, properties):

recipes / cros_infra_config:examples/get_dut_tracking_config

DEPS: cros_infra_config, recipe_engine/assertions

def RunSteps(api):

recipes / cros_infra_config:examples/get_vm_retry_config

DEPS: cros_infra_config, recipe_engine/assertions

def RunSteps(api):

recipes / cros_infra_config:examples/no_builder_config

DEPS: cros_infra_config, test_util, recipe_engine/assertions,