Package documentation for depot_tools

Table of Contents

Recipe Modules


Recipe Modules

recipe_modules / bot_update

DEPS: depot_tools, gclient, gerrit, tryserver, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/source_manifest, recipe_engine/step

Recipe module to ensure a checkout is consistent on a bot.

class BotUpdateApi(RecipeApi):

def __call__(self, name, cmd, **kwargs):

Wrapper for easy calling of bot_update.

def apply_gerrit_ref(self, root, gerrit_no_reset=False, gerrit_no_rebase_patch_ref=False, gerrit_repo=None, gerrit_ref=None, step_name='apply_gerrit', **kwargs):

def deapply_patch(self, bot_update_step):

Deapplies a patch, taking care of DEPS and solution revisions properly.

def ensure_checkout(self, gclient_config=None, suffix=None, patch=True, update_presentation=True, patch_root=None, with_branch_heads=False, with_tags=False, refs=None, patch_oauth2=None, oauth2_json=None, use_site_config_creds=None, clobber=False, root_solution_revision=None, rietveld=None, issue=None, patchset=None, gerrit_no_reset=False, gerrit_no_rebase_patch_ref=False, disable_syntax_validation=False, manifest_name=None, **kwargs):

Args:
  • gclient_config: The gclient configuration to use when running bot_update. If omitted, the current gclient configuration is used.
  • disable_syntax_validation: (legacy) Disables syntax validation for DEPS. Needed as a migration path for recipes dealing with older revisions, such as bisect.
  • manifest_name: The name of the manifest to upload to LogDog. This must be unique for the whole build.
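A minimal recipe sketch of a typical ensure_checkout call. The 'chromium' config name and the 'src' project path are illustrative assumptions, not taken from this document:

```python
# Hypothetical recipe: sync a gclient solution consistently on a bot.
DEPS = [
    'bot_update',
    'gclient',
]

def RunSteps(api):
  # 'chromium' is an assumed gclient config name for illustration.
  api.gclient.set_config('chromium')
  # Applies the pending patch (if any) and syncs solutions and DEPS.
  result = api.bot_update.ensure_checkout()
  # Property names holding the checked-out revision of 'src', if any.
  rev_props = api.bot_update.get_project_revision_properties('src')
```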

def get_project_revision_properties(self, project_name, gclient_config=None):

Returns all property names used for storing the checked-out revision of a given project.

Args:
  • project_name (str): The name of a checked-out project as a deps path, e.g. src or src/v8.
  • gclient_config: The gclient configuration to use. If omitted, the current gclient configuration is used.

Returns (list of str): All properties that'll hold the checked-out revision of the given project. An empty list if no such properties exist.

def initialize(self):

def last_returned_properties(self):

recipe_modules / cipd

DEPS: infra_paths, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step

class CIPDApi(RecipeApi):

CIPDApi provides basic support for CIPD.

This assumes that cipd (or cipd.exe or cipd.bat on windows) has been installed somewhere in $PATH. This will be true if you use depot_tools, or if your recipe is running inside of chrome-infrastructure's systems (buildbot, swarming).

def build(self, input_dir, output_package, package_name, install_mode=None):

Builds, but does not upload, a cipd package from a directory.

Args:
  • input_dir (Path): the directory to build the package from.
  • output_package (Path): the file to write the package to.
  • package_name (str): the name of the cipd package as it would appear when uploaded to the cipd package server.
  • install_mode (None|'copy'|'symlink'): the mechanism that the cipd client should use when installing this package. If None, defaults to the platform default ('copy' on Windows, 'symlink' on everything else).

def create_from_pkg(self, pkg_def, refs=None, tags=None):

Builds and uploads a package based on a PackageDefinition object.

This builds and uploads the package in one step.

Args:
  • pkg_def (PackageDefinition): The description of the package we want to create.
  • refs (list(str)): A list of ref names to set for the package instance.
  • tags (dict(str, str)): A map of tag name -> value to set for the package instance.

Returns the JSON 'result' section, e.g.:

```
{
  "package": "infra/tools/cipd/android-amd64",
  "instance_id": "433bfdf86c0bb82d1eee2d1a0473d3709c25d2c4"
}
```

def create_from_yaml(self, pkg_def, refs=None, tags=None):

Builds and uploads a package based on an on-disk YAML package definition file.

This builds and uploads the package in one step.

Args:
  • pkg_def (Path): The path to the yaml file.
  • refs (list(str)): A list of ref names to set for the package instance.
  • tags (dict(str, str)): A map of tag name -> value to set for the package instance.

Returns the JSON 'result' section, e.g.:

```
{
  "package": "infra/tools/cipd/android-amd64",
  "instance_id": "433bfdf86c0bb82d1eee2d1a0473d3709c25d2c4"
}
```

def default_bot_service_account_credentials(self):

def describe(self, package_name, version, test_data_refs=None, test_data_tags=None):

def ensure(self, root, packages):

Ensures that packages are installed in a given root dir.

packages must be a mapping from package name to its version, where

  • name must be for the right platform (see also platform_suffix),
  • version may be either an instance_id, a ref, or a unique tag.

If installing a package requires credentials, call set_service_account_credentials before calling this function.
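A sketch of a typical ensure call; the package names and versions below are illustrative placeholders, not from this document:

```python
# Hypothetical recipe: install pinned packages into a root directory.
DEPS = [
    'cipd',
    'recipe_engine/path',
]

def RunSteps(api):
  root = api.path['start_dir'].join('packages')
  # Keys are full package names (for the right platform); values are an
  # instance_id, a ref, or a unique tag.
  packages = {
      'infra/example/tool/%s' % api.cipd.platform_suffix(): 'latest',
      'infra/example/data': 'version:1.2.3',
  }
  api.cipd.ensure(root, packages)
```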

def executable(self):

def initialize(self):

def platform_suffix(self, name=None, arch=None, bits=None):

Use to expand a platform-independent package name template into the full, platform-specific name.


```python
'my/package/%s' % api.cipd.platform_suffix()
# => 'my/package/linux-amd64'
```

Optional platform bits and architecture may be supplied to generate CIPD suffixes for other platforms. If any are omitted, the current platform parameters will be used.

def register(self, package_name, package_path, refs=None, tags=None):

def search(self, package_name, tag):

def set_ref(self, package_name, version, refs):

def set_service_account_credentials(self, path):

def set_tag(self, package_name, version, tags):

recipe_modules / depot_tools

DEPS: recipe_engine/context, recipe_engine/platform, recipe_engine/runtime

The depot_tools module provides safe functions to access paths within the depot_tools repo.

class DepotToolsApi(RecipeApi):

def cros_path(self):

def download_from_google_storage_path(self):

def gn_py_path(self):

def gsutil_py_path(self):

def ninja_path(self):

def on_path(self):

Use this context manager to put depot_tools on $PATH.


```python
with api.depot_tools.on_path():
  # run some steps
```

def presubmit_support_py_path(self):

def root(self):

Returns (Path): The "depot_tools" root directory.

def upload_to_google_storage_path(self):

recipe_modules / gclient

DEPS: infra_paths, tryserver, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/step

class GclientApi(RecipeApi):

def __call__(self, name, cmd, infra_step=True, **kwargs):

Wrapper for easy calling of gclient steps.

def break_locks(self):

Removes all index.lock files. If a previous run of git crashed, the bot was reset, etc., we might end up with leftover index.lock files.

def calculate_patch_root(self, patch_project, gclient_config=None, patch_repo=None):

Returns the path where a patch should be applied, based on patch_project.

Maps the patch's repo to a path of directories relative to the checkout's root, which describe where to place the patch. If no mapping is found for the repo url, falls back to trying to find a mapping for the old-style "patch_project".

For now, considers only first solution ([0]), but in theory can be extended to all of them.

See patch_projects and repo_path_map solution config property.

Returns: Relative path, including the solution's root. If patch_project is not given or not recognized, it'll be just the first solution root.

def checkout(self, gclient_config=None, revert=RevertOnTryserver, inject_parent_got_revision=True, extra_sync_flags=None, **kwargs):

Return a step generator function for gclient checkouts.

def config_to_pythonish(cfg):

def get_config_defaults(self):

def got_revision_reverse_mapping(cfg):

Returns the merged got_revision_reverse_mapping.

Returns (dict): A mapping from property name -> project name. It merges the values of the deprecated got_revision_mapping and the new got_revision_reverse_mapping.

def inject_parent_got_revision(self, gclient_config=None, override=False):

Match gclient config to build revisions obtained from build_properties.

Args: gclient_config (gclient config object) - The config to manipulate. A value of None manipulates the module's built-in config (self.c). override (bool) - If True, will forcibly set revision and custom_vars even if the config already contains values for them.

def is_blink_mode(self):

Indicates whether the caller is to use the Blink config rather than the Chromium config. This may happen for one of two reasons:

  1. The builder is configured to always use TOT Blink. (factory property top_of_tree_blink=True)
  2. A try job comes in that applies to the Blink tree. (patch_project is blink)

def resolve_revision(self, revision):

def runhooks(self, args=None, name='runhooks', **kwargs):

def set_patch_project_revision(self, patch_project, gclient_config=None):

Updates config revision corresponding to patch_project.

Useful for bot_update only, as this is the only consumer of gclient's config revision map. This doesn't overwrite the revision if it was already set.

def spec_alias(self):

def sync(self, cfg, extra_sync_flags=None, **kwargs):

def use_mirror(self, val):

recipe_modules / gerrit

DEPS: recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step

class GerritApi(RecipeApi):

Module for interacting with Gerrit endpoints.

def __call__(self, name, cmd, infra_step=True, **kwargs):

Wrapper for easy calling of gerrit_utils steps.

def create_gerrit_branch(self, host, project, branch, commit, **kwargs):

Creates a new branch from a given project and commit.

Returns: the ref of the branch created

def get_change_description(self, host, change, patchset):

Get the description for a given CL and patchset.

Args:
  • host: Gerrit host to query.
  • change: The change number.
  • patchset: The patchset number.

Returns: The description corresponding to given CL and patchset.

def get_change_destination_branch(self, host, change, name=None, step_test_data=None):

Get the upstream branch for a given CL.

Result is cached.

Args:
  • host: Gerrit host to query.
  • change: The change number.

Returns: the name of the branch

def get_changes(self, host, query_params, start=None, limit=None, o_params=None, step_test_data=None, **kwargs):

Query changes for the given host.

Args:
  • host: Gerrit host to query.
  • query_params: Query parameters, as a list of (key, value) tuples, to form a query as documented here:
  • start: How many changes to skip (starting with the most recent).
  • limit: Maximum number of results to return.
  • o_params: A list of additional output specifiers, as documented here:
  • step_test_data: Optional mock test data for the underlying gerrit client.

Returns: A list of change dicts as documented here:
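A sketch of a get_changes query; the host, project, and output specifier values are illustrative, not from this document:

```python
# Hypothetical recipe: list open changes for a project on a Gerrit host.
DEPS = ['gerrit']

def RunSteps(api):
  changes = api.gerrit.get_changes(
      'https://chromium-review.googlesource.com',
      query_params=[
          ('project', 'chromium/src'),
          ('status', 'open'),
      ],
      limit=10,
      o_params=['LABELS'],
  )
  # Each element is a change dict, e.g. change['_number'], change['subject'].
```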

def get_gerrit_branch(self, host, project, branch, **kwargs):

Gets a branch from a given project.

Returns: the revision of the branch

def get_revision_info(self, host, change, patchset):

Returns the info for a given patchset of a given change.

Args:
  • host: Gerrit host to query.
  • change: The change number.
  • patchset: The patchset number.

Returns: A dict for the target revision as documented here:

recipe_modules / git

DEPS: infra_paths, recipe_engine/context, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step

class GitApi(RecipeApi):

def __call__(self, *args, **kwargs):

Return a git command step.

def bundle_create(self, bundle_path, rev_list_args=None, **kwargs):

Run ‘git bundle create’ on a Git repository.

Args:
  • bundle_path (Path): The path of the output bundle.
  • rev_list_args (list): Arguments for 'git rev-list' selecting the refs to include in the bundle. If None, all refs in the Git checkout will be bundled.
  • kwargs: Forwarded to 'call'.

def cat_file_at_commit(self, file_path, commit_hash, remote_name=None, **kwargs):

Outputs the contents of a file at a given revision.

def checkout(self, url, ref=None, dir_path=None, recursive=False, submodules=True, submodule_update_force=False, keep_paths=None, step_suffix=None, curl_trace_file=None, can_fail_build=True, set_got_revision=False, remote_name=None, display_fetch_size=None, file_name=None, submodule_update_recursive=True, use_git_cache=False, progress=True):

Performs a full git checkout and returns sha1 of checked out revision.

Args:
  • url (str): url of the remote repo to use as upstream
  • ref (str): ref to fetch and check out
  • dir_path (Path): optional directory to clone into
  • recursive (bool): whether to recursively fetch submodules or not
  • submodules (bool): whether to sync and update submodules or not
  • submodule_update_force (bool): whether to update submodules with --force
  • keep_paths (iterable of strings): paths to ignore during git-clean; paths are gitignore-style patterns relative to checkout_path.
  • step_suffix (str): suffix to add to each step name
  • curl_trace_file (Path): if not None, dump GIT_CURL_VERBOSE=1 trace to that file. Useful for debugging git issues reproducible only on bots. It has a side effect of sending all stderr output of 'git fetch' to that file.
  • can_fail_build (bool): if False, ignore errors during fetch or checkout.
  • set_got_revision (bool): if True, resolves HEAD and sets the got_revision property.
  • remote_name (str): name of the git remote to use
  • display_fetch_size (bool): if True, run git count-objects before and after fetch and display the delta. Adds two more steps. Defaults to False.
  • file_name (str): optional path to a single file to checkout.
  • submodule_update_recursive (bool): if True, updates submodules recursively.
  • use_git_cache (bool): if True, the git cache will be used for this checkout. WARNING: this is EXPERIMENTAL. This wasn't tested with submodules; since the origin url is modified to a local path, it may cause problems with scripts that do "git fetch origin" or "git push origin"; and arbitrary refs (such as refs/whatever/not-fetched-by-default-to-cache) may not work.
  • progress (bool): whether to show progress for fetch or not

Returns: If the checkout was successful, this returns the commit hash of the checked-out-repo. Otherwise this returns None.
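A sketch of a pinned checkout; the repository URL and ref below are illustrative assumptions:

```python
# Hypothetical recipe: check out a repository at a given ref.
DEPS = [
    'git',
    'recipe_engine/path',
]

def RunSteps(api):
  sha = api.git.checkout(
      'https://chromium.googlesource.com/chromium/tools/depot_tools.git',
      ref='refs/heads/master',
      dir_path=api.path['start_dir'].join('depot_tools'),
      set_got_revision=True,
  )
  # sha is the checked-out commit hash, or None if the checkout failed.
```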

def config_get(self, prop_name, **kwargs):

Returns: (str) The Git config output, or None if no output was generated.

Args:
  • prop_name (str): The name of the config property to query.
  • kwargs: Forwarded to 'call'.

def count_objects(self, previous_result=None, can_fail_build=False, **kwargs):

Returns git count-objects result as a dict.

Args:
  • previous_result (dict): the result of a previous count_objects call. If passed, the delta is reported in the log and step text.
  • can_fail_build (bool): if True, may fail the build and/or raise an exception. Defaults to False.

Returns: A dict of count-object values, or None if count-object run failed.

def fetch_tags(self, remote_name=None, **kwargs):

Fetches all tags from the remote.

def get_remote_url(self, remote_name=None, **kwargs):

Returns: (str) The URL of the remote Git repository, or None.

Args:
  • remote_name (str): The name of the remote to query, defaults to 'origin'.
  • kwargs: Forwarded to 'call'.

def get_timestamp(self, commit='HEAD', test_data=None, **kwargs):

Find and return the timestamp of the given commit.

def new_branch(self, branch, name=None, upstream=None, **kwargs):

Runs git new-branch on a Git repository, to be used before git cl upload.

Args:
  • branch (str): new branch name, which must not yet exist.
  • name (str): step name.
  • upstream (str): upstream branch; defaults to origin/master.
  • kwargs: Forwarded to 'call'.

def rebase(self, name_prefix, branch, dir_path, remote_name=None, **kwargs):

Runs rebase of HEAD onto branch.

Args:
  • name_prefix (str): a prefix used for the step names
  • branch (str): a branch name or a hash to rebase onto
  • dir_path (Path): directory to clone into
  • remote_name (str): the remote name to rebase from if not origin

recipe_modules / git_cl

DEPS: recipe_engine/context, recipe_engine/raw_io, recipe_engine/step

class GitClApi(RecipeApi):

def get_description(self, patch_url=None, codereview=None, **kwargs):

DEPRECATED. Consider using gerrit.get_change_description instead.

def issue(self, **kwargs):

def set_description(self, description, patch_url=None, codereview=None, **kwargs):

def upload(self, message, upload_args=None, **kwargs):

recipe_modules / gitiles

DEPS: recipe_engine/json, recipe_engine/path, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step, recipe_engine/url

class Gitiles(RecipeApi):

Module for polling a git repository using the Gitiles web interface.

def commit_log(self, url, commit, step_name=None, attempts=None):

Returns: (dict) the Gitiles commit log structure for a given commit.

Args:
  • url (str): The base repository URL.
  • commit (str): The commit hash.
  • step_name (str): If not None, override the step name.
  • attempts (int): Number of times to try the request before failing.

def download_archive(self, repository_url, destination, revision='refs/heads/master'):

Downloads an archive of the repo and extracts it to destination.

If the gitiles server attempts to provide a tarball with paths which escape destination, this function will extract all valid files and then raise StepFailure with an attribute StepFailure.gitiles_skipped_files containing the names of the files that were skipped.

Args:
  • repository_url (str): Full URL to the repository.
  • destination (Path): Local path to extract the archive to. Must not exist prior to this call.
  • revision (str): The ref or revision in the repo to download. Defaults to 'refs/heads/master'.
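A sketch of handling the skipped-files failure described above; the repository URL is illustrative, and api.step.StepFailure is assumed to be the failure type visible to recipes:

```python
# Hypothetical recipe: download an archive and inspect skipped files.
DEPS = [
    'gitiles',
    'recipe_engine/path',
    'recipe_engine/step',
]

def RunSteps(api):
  dest = api.path['start_dir'].join('repo_copy')
  try:
    api.gitiles.download_archive(
        'https://chromium.googlesource.com/chromium/src', dest)
  except api.step.StepFailure as ex:
    # Files whose paths would have escaped dest were skipped; their names
    # are attached to the failure.
    skipped = getattr(ex, 'gitiles_skipped_files', [])
    raise
```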

def download_file(self, repository_url, file_path, branch='master', step_name=None, attempts=None, **kwargs):

Downloads raw file content from a Gitiles repository.

Args:
  • repository_url (str): Full URL to the repository.
  • branch (str): Branch of the repository.
  • file_path (str): Relative path to the file from the repository root.
  • step_name (str): Custom name for this step (optional).
  • attempts (int): Number of times to try the request before failing.

Returns: Raw file content.

def log(self, url, ref, limit=0, cursor=None, step_name=None, attempts=None, **kwargs):

Returns the most recent commits under the given ref with properties.

Args:
  • url (str): URL of the remote repository.
  • ref (str): Name of the desired ref (see Gitiles.refs).
  • limit (int): Number of commits to limit the fetching to. Gitiles does not return all commits in one call; instead paging is used. 0 means return whatever the first response contains. Otherwise, paging will be used to fetch at least this many commits, but all fetched commits will be returned.
  • cursor (str or None): The paging cursor used to fetch the next page.
  • step_name (str): Custom name for this step (optional).

Returns: A tuple of (commits, cursor). Commits are a list of commits (as Gitiles dict structure) in reverse chronological order. The number of commits may be higher than limit argument. Cursor can be used for subsequent calls to log for paging. If None, signals that there are no more commits to fetch.
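The paging behavior above can be sketched as a loop over the returned cursor; the repository URL and ref are illustrative:

```python
# Hypothetical recipe: page through commits using the returned cursor.
DEPS = ['gitiles']

REPO = 'https://chromium.googlesource.com/chromium/src'

def RunSteps(api):
  commits, cursor = api.gitiles.log(REPO, 'refs/heads/master', limit=100)
  while cursor:
    # A non-None cursor means more commits are available.
    more, cursor = api.gitiles.log(
        REPO, 'refs/heads/master', limit=100, cursor=cursor)
    commits.extend(more)
```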

def refs(self, url, step_name='refs', attempts=None):

Returns a list of refs in the remote repository.

recipe_modules / gsutil

DEPS: recipe_engine/path, recipe_engine/python

class GSUtilApi(RecipeApi):

def __call__(self, cmd, name=None, use_retry_wrapper=True, version=None, parallel_upload=False, multithreaded=False, **kwargs):

A step to run arbitrary gsutil commands.

Note that this assumes that gsutil authentication environment variables (AWS_CREDENTIAL_FILE and BOTO_CONFIG) are already set, though if you want to set them to something else you can always do so using the env={} kwarg.

Note also that gsutil does its own wildcard processing, so wildcards are valid in file-like portions of the cmd. See 'gsutil help wildcards'.

Arguments:
  • cmd: list of (string) arguments to pass to gsutil. Include gsutil-level options first (see 'gsutil help options').
  • name: the (string) name of the step to use. Defaults to the first non-flag token in the cmd.
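A sketch of the call described above, with a gsutil-level option placed before the subcommand; the bucket and file names are illustrative placeholders:

```python
# Hypothetical recipe: upload a file with a gsutil-level option (-m)
# ahead of the 'cp' subcommand.
DEPS = [
    'gsutil',
    'recipe_engine/path',
]

def RunSteps(api):
  api.gsutil(
      ['-m', 'cp',
       api.path['start_dir'].join('out.zip'),
       'gs://my-example-bucket/out.zip'],
      name='upload out.zip')
```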

def cat(self, url, args=None, **kwargs):

def copy(self, source_bucket, source, dest_bucket, dest, args=None, link_name='gsutil.copy', metadata=None, unauthenticated_url=False, **kwargs):

def download(self, bucket, source, dest, args=None, **kwargs):

def download_url(self, url, dest, args=None, **kwargs):

def gsutil_py_path(self):

def list(self, url, args=None, **kwargs):

def remove_url(self, url, args=None, **kwargs):

def signurl(self, private_key_file, bucket, dest, args=None, **kwargs):

def upload(self, source, bucket, dest, args=None, link_name='gsutil.upload', metadata=None, unauthenticated_url=False, **kwargs):

recipe_modules / infra_paths

DEPS: recipe_engine/path, recipe_engine/properties

class InfraPathsApi(RecipeApi):

infra_paths module is glue for design mistakes. It will be removed.

def default_git_cache_dir(self):

Returns the location of the default git cache directory.

This property should be used instead of using path['git_cache'] directly.

It returns git_cache path if it is defined (Buildbot world), otherwise uses the more generic [CACHE]/git path (LUCI world).

def initialize(self):

recipe_modules / presubmit

DEPS: recipe_engine/context, recipe_engine/path, recipe_engine/python, recipe_engine/step

class PresubmitApi(RecipeApi):

def __call__(self, *args, **kwargs):

Return a presubmit step.

def presubmit_support_path(self):

recipe_modules / tryserver

DEPS: gerrit, git, git_cl, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step

class TryserverApi(RecipeApi):

def add_failure_reason(self, reason):

Records a more detailed reason why build is failing.

The reason can be any JSON-serializable object.

def get_files_affected_by_patch(self, patch_root, **kwargs):

Returns list of paths to files affected by the patch.

Argument:
  • patch_root: path relative to api.path['root'], usually obtained from api.gclient.calculate_patch_root(patch_project)

Returned paths will be relative to patch_root.

def get_footer(self, tag, patch_text=None):

Gets a specific tag from a CL description.

def get_footers(self, patch_text=None):

Retrieves footers from the patch description.

Footers are machine-readable tags embedded in commit messages. See the git-footers documentation for more information.

def is_gerrit_issue(self):

Returns true iff the properties exist to match a Gerrit issue.

def is_patch_in_git(self):

def is_tryserver(self):

Returns true iff we have a change to check out.

def normalize_footer_name(self, footer):

def set_compile_failure_tryjob_result(self):

Mark the tryjob result as a compile failure.

def set_failure_hash(self):

Context manager that sets a failure_hash build property on StepFailure.

This can be used to easily compare whether two builds have failed for the same reason. For example, if a patch is bad (breaks something), we'd expect it to always break in the same way. Different failures for the same patch are usually a sign of flakiness.
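A sketch of using the context manager around failing steps; the compile command is an illustrative placeholder:

```python
# Hypothetical recipe: identical failures produce the same failure_hash.
DEPS = [
    'tryserver',
    'recipe_engine/step',
]

def RunSteps(api):
  with api.tryserver.set_failure_hash():
    # Any StepFailure raised in this block gets a failure_hash
    # build property derived from the failure.
    api.step('compile', ['ninja', '-C', 'out/Release'])
```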

def set_invalid_test_results_tryjob_result(self):

Mark the tryjob result as having invalid test results.

This means we ran some tests, but the results were not valid (e.g. no list of the specific test cases that failed, too many tests failing, etc.).

def set_patch_failure_tryjob_result(self):

Mark the tryjob result as failure to apply the patch.

def set_subproject_tag(self, subproject_tag):

Adds a subproject tag to the build.

This can be used to distinguish between builds that execute different steps depending on what was patched, e.g. blink vs. pure chromium patches.

def set_test_failure_tryjob_result(self):

Mark the tryjob result as a test failure.

This means we started running actual tests (not prerequisite steps like checkout or compile), and some of these tests have failed.


recipes / bot_update:examples/buildbucket

DEPS: bot_update, gclient, recipe_engine/buildbucket

def RunSteps(api):

recipes / bot_update:examples/full

DEPS: bot_update, gclient, gerrit, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/runtime

def RunSteps(api):

recipes / cipd:examples/full

DEPS: cipd, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunSteps(api, use_pkg, pkg_files, pkg_dirs, ver_files, install_mode):

recipes / cipd:examples/platform_suffix

DEPS: cipd, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunSteps(api, arch_override, bits_override, expect_error):

recipes / depot_tools:examples/full

DEPS: depot_tools, recipe_engine/path, recipe_engine/platform, recipe_engine/runtime, recipe_engine/step

def RunSteps(api):

recipes / fetch_end_to_end_test

DEPS: bot_update, gclient, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/python, recipe_engine/step

def RunSteps(api):

recipes / gclient:examples/full

DEPS: gclient, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / gclient:tests/patch_project

DEPS: gclient, recipe_engine/properties

def RunSteps(api, patch_project, patch_repository_url):

recipes / gerrit:examples/full

DEPS: gerrit, recipe_engine/step

def RunSteps(api):

recipes / git:examples/full

DEPS: git, recipe_engine/context, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api):

recipes / git_cl:examples/full

DEPS: git_cl, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api):

recipes / gitiles:examples/full

DEPS: gitiles, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / gsutil:examples/full

DEPS: gsutil, recipe_engine/path

def RunSteps(api):

Move things around in a loop!

recipes / infra_paths:examples/full

DEPS: infra_paths, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / presubmit:examples/full

DEPS: presubmit

def RunSteps(api):

recipes / tryserver:examples/full

DEPS: tryserver, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api):