Repo documentation for infra

Table of Contents

Recipe Modules

Recipes

Recipe Modules

recipe_modules / buildenv

DEPS: recipe_engine/file, recipe_engine/golang, recipe_engine/nodejs, recipe_engine/path

A helper for bootstrapping Go and Node environments.

class BuildEnvApi(RecipeApi):

API for bootstrapping Go and Node environments.

@contextlib.contextmanager
def __call__(self, root, go_version_file: Optional[str]=None, nodejs_version_file: Optional[str]=None):

A context manager that activates the build environment.

Used to build code in standalone git repositories that don't have Go or Node.js available via some other mechanism (like via gclient DEPS).

It reads the Golang version from <root>/<go_version_file> and Node.js version from <root>/<nodejs_version_file>, bootstraps the corresponding versions in cache directories, adjusts PATH and other environment variables and yields to the user code.

Assuming the requested Go version is `1.16.10` and the Node.js version is `16.13.0`, this call will use the following cache directories (note that `.` is replaced with `_`, since `.` is not allowed in cache names):

  • go1_16_10: to install Go under.
  • gocache: for Go cache directories.
  • nodejs16_13_0: to install Node.js under.
  • npmcache: for NPM cache directories.

For best performance the builder must map these directories as named caches using e.g.

luci.builder(
    ...
    caches = [
        swarming.cache("go1_16_10"),
        swarming.cache("gocache"),
        swarming.cache("nodejs16_13_0"),
        swarming.cache("npmcache"),
    ],
)

Args:

  • root (Path) - path to the checkout root.
  • go_version_file (str) - path within the checkout to a text file with the Go version to bootstrap, or None to skip bootstrapping Go.
  • nodejs_version_file (str) - path within the checkout to a text file with the Node.js version to bootstrap, or None to skip bootstrapping Node.js.
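The cache naming scheme above can be sketched as a small helper (illustrative only; the module derives these names internally):

```python
def cache_name(prefix, version):
    # '.' is not allowed in cache names, so it is replaced with '_'.
    return prefix + version.replace(".", "_")

print(cache_name("go", "1.16.10"))      # go1_16_10
print(cache_name("nodejs", "16.13.0"))  # nodejs16_13_0
```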

recipe_modules / cloudbuildhelper

DEPS: depot_tools/depot_tools, depot_tools/git, depot_tools/git_cl, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/commit_position, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

API for calling ‘cloudbuildhelper’ tool.

See https://chromium.googlesource.com/infra/infra/+/main/build/images/.

class CloudBuildHelperApi(RecipeApi):

API for calling ‘cloudbuildhelper’ tool.

def build(self, manifest, canonical_tag=None, build_id=None, infra=None, restrictions=None, labels=None, tags=None, checkout_metadata=None, step_test_image=None, cost=None):

Calls cloudbuildhelper build <manifest> interpreting the result.

Args:

  • manifest (Path) - path to YAML file with definition of what to build.
  • canonical_tag (str) - tag to push the image to if we built a new image.
  • build_id (str) - identifier of the CI build to put into metadata.
  • infra (str) - what section to pick from ‘infra’ field in the YAML.
  • restrictions (Restrictions) - restrictions to apply to manifests.
  • labels ({str: str}) - labels to attach to the docker image.
  • tags ([str]) - tags to unconditionally push the image to.
  • checkout_metadata (CheckoutMetadata) - to get revisions.
  • step_test_image (Image) - image to produce in training mode.
  • cost (ResourceCost) - an estimated resource cost of this call.

Returns: Image instance or NotUploadedImage if the YAML doesn't specify a registry.

Raises: StepFailure on failures.

@command.setter
def command(self, val):

Can be used to tell the module to use an existing binary.

def discover_manifests(self, root, entries, test_data=None):

Returns a list with paths to all manifests we need to build.

Each entry is either a directory to scan recursively, or a reference to a concrete manifest (if ends with “.yaml”).

Args:

  • root (Path) - gclient solution root.
  • entries ([str]) - paths relative to the solution root to scan.
  • test_data ([str]) - paths to put into each dir in training mode.

Returns: [Path].

def do_roll(self, repo_url, root, callback, ref='main'):

Checks out a repo, calls the callback to modify it, uploads the result.

Args:

  • repo_url (str) - repo to checkout.
  • root (Path) - where to check it out to (can be a cache).
  • callback (func(Path)) - will be called as callback(root) with cwd also set to root. It can modify files there and either return None to skip the roll or RollCL to attempt the roll. If no files are modified, the roll will be skipped regardless of the return value.
  • ref (str) - a ref to update (e.g. “main”).

Returns:

  • (None, None) if no CL was created (because nothing has changed).
  • (Issue number, Issue URL) if a CL was created.

def get_version_label(self, path, revision, ref=None, commit_position=None, template=None):

Computes a version string identifying a commit.

To calculate a commit position it either uses the Cr-Commit-Position footer, if available, or falls back to `git number <rev>`.

Uses the given template string to format the label. It is a Python format string, with the following placeholders available:

  • {rev}: a string with the short git revision hash.
  • {ref}: a string with the last component of the commit ref (e.g. 'main').
  • {cp}: an integer with the commit position number.
  • {date}: "YYYY.MM.DD" UTC date when the build started.
  • {build}: an integer build number, or 0 if not available.

Defaults to {cp}-{rev}.

This label is used as a version name for artifacts produced based on this checked out commit.

Args:

  • path (Path) - path to the git checkout root.
  • revision (str) - checked out revision.
  • ref (str) - checked out git ref if known.
  • commit_position (str) - Cr-Commit-Position footer value if available.
  • template (str) - a Python format string to use to format the version.

Returns: A version string.
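The default template can be demonstrated with plain Python string formatting (the revision and commit position values below are hypothetical):

```python
fields = {
    "rev": "4e0f8a7",     # short git revision hash (hypothetical)
    "ref": "main",        # last component of the commit ref
    "cp": 123456,         # commit position number (hypothetical)
    "date": "2024.01.31",
    "build": 42,
}

# The default template is "{cp}-{rev}"; unused placeholders are simply ignored.
label = "{cp}-{rev}".format(**fields)
print(label)  # 123456-4e0f8a7
```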

def report_version(self):

Reports the version of cloudbuildhelper tool via the step text.

Returns: None.

def update_pins(self, path):

Calls cloudbuildhelper pins-update <path>.

Updates the file at path in place if some docker tags mentioned there have moved since the last pins update.

Args:

  • path (Path) - path to a pins.yaml file to update.

Returns: List of strings with updated “:” pairs, if any.

def upload(self, manifest, canonical_tag, build_id=None, infra=None, restrictions=None, checkout_metadata=None, step_test_tarball=None, cost=None):

Calls cloudbuildhelper upload <manifest> interpreting the result.

Args:

  • manifest (Path) - path to YAML file with definition of what to build.
  • canonical_tag (str) - tag to apply to a tarball if we built a new one.
  • build_id (str) - identifier of the CI build to put into metadata.
  • infra (str) - what section to pick from ‘infra’ field in the YAML.
  • restrictions (Restrictions) - restrictions to apply to manifests.
  • checkout_metadata (CheckoutMetadata) - to get revisions.
  • step_test_tarball (Tarball) - tarball to produce in training mode.
  • cost (ResourceCost) - an estimated resource cost of this call.

Returns: Tarball instance.

Raises: StepFailure on failures.

recipe_modules / cloudkms

DEPS: recipe_engine/cipd, recipe_engine/path, recipe_engine/step

class CloudKMSApi(RecipeApi):

API for interacting with CloudKMS using the LUCI cloudkms tool.

@property
def cloudkms_path(self):

Returns the path to LUCI cloudkms binary.

The first time the property is accessed, cloudkms will be installed using CIPD.

def decrypt(self, kms_crypto_key, input_file, output_file):

Decrypt a ciphertext file with a CloudKMS key.

Args:

  • kms_crypto_key (str) - The name of the encryption key, e.g. projects/chops-kms/locations/global/keyRings/[KEYRING]/cryptoKeys/[KEY]
  • input_file (Path) - The path to the input (ciphertext) file.
  • output_file (Path) - The path to the output (plaintext) file. It is recommended that this is inside api.path.cleanup_dir to ensure the plaintext file will be cleaned up by the recipe.

def sign(self, kms_crypto_key, input_file, output_file, service_account_creds_file=None):

Processes a plaintext and uploads the digest for signing by Cloud KMS.

Args:

  • kms_crypto_key (str) - The name of the cryptographic key, e.g. projects/[PROJECT]/locations/[LOC]/keyRings/[KEYRING]/cryptoKeys/[KEY]
  • input_file (Path) - Path to file with data to operate on. Data for sign and verify cannot be larger than 64KiB.
  • output_file (Path) - Path to a JSON file to write the output signature to.
  • service_account_creds_file (str) - Path to JSON file with service account credentials to use.

def verify(self, kms_crypto_key, input_file, signature_file, output_file='-', service_account_creds_file=None):

Verifies a signature that was previously created with a key stored in CloudKMS.

Args:

  • kms_crypto_key (str) - The name of the cryptographic public key, e.g. projects/[PROJECT]/locations/[LOC]/keyRings/[KEYRING]/cryptoKeys/[KEY]
  • input_file (Path) - Path to file with data to operate on. Data for sign and verify cannot be larger than 64KiB.
  • signature_file (Path) - Path to read signature from.
  • output_file (Path) - Path to write operation results (successful verification or signature mismatch) to (use '-' for stdout).
  • service_account_creds_file (str) - Path to JSON file with service account credentials to use.

recipe_modules / codesearch

DEPS: depot_tools/depot_tools, depot_tools/git, depot_tools/gsutil, depot_tools/tryserver, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/commit_position, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step

class CodesearchApi(RecipeApi):

def add_kythe_metadata(self):

Adds inline Kythe metadata to Mojom generated files.

This metadata is used to connect things in the generated file to the thing in the Mojom file which generated it. This is made possible by annotations added to the generated file by the Mojo compiler.

def checkout_generated_files_repo_and_sync(self, copy, revision, kzip_path=None, ignore=None):

Check out the generated files repo and sync the generated files into this checkout.

Args: copy: A dict that describes how generated files should be synced. Keys are paths to local directories and values are where they are copied to in the generated files repo.

  {
      '/path/to/foo': 'foo',
      '/path/to/bar': 'baz/bar',
  }

The above copy config would result in a generated files repo like:

  repo/
  repo/foo/
  repo/baz/bar/

kzip_path: Path to a kzip that will be used to prune uploaded files.
ignore: List of paths that shouldn't be synced.
revision: A commit hash to be used in the commit message.
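The copy config above is just a source-to-destination mapping; a minimal sketch of how the destinations resolve inside the generated files repo (illustrative only, not the module's implementation):

```python
import posixpath

def synced_paths(copy, repo_root):
    # Maps each local source directory to its destination
    # inside the generated files repo.
    return {src: posixpath.join(repo_root, dst) for src, dst in copy.items()}

dests = synced_paths({"/path/to/foo": "foo", "/path/to/bar": "baz/bar"}, "repo")
print(dests["/path/to/bar"])  # repo/baz/bar
```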

def cleanup_old_generated(self, age_days: int=7, checkout_dir: Optional[config_types.Path]=None):

Clean up generated files older than the specified number of days.

Args:

  • age_days: Minimum age in days for files to delete (integer).
  • checkout_dir: The directory where code is checked out. If not specified, defaults to the checkout_dir initialized in the path module.

def clone_clang_tools(self, clone_dir):

Clone chromium/src clang tools.

def create_and_upload_kythe_index_pack(self, commit_hash: str, commit_timestamp: int, commit_position: Optional[str]=None, clang_target_arch: Optional[str]=None, checkout_dir: Optional[config_types.Path]=None):

Create the kythe index pack and upload it to google storage.

Args:

  • commit_hash: Hash of the commit at which we're creating the index pack; if None, use got_revision.
  • commit_timestamp: Timestamp of the commit at which we're creating the index pack, in integer seconds since the UNIX epoch.
  • clang_target_arch: Target architecture to cross-compile for.
  • checkout_dir: The directory where code is checked out. If not specified, defaults to the checkout_dir initialized in the path module.

Returns: Path to the generated index pack.

def get_config_defaults(self):

def run_clang_tool(self, clang_dir: Optional[config_types.Path]=None, run_dirs: Optional[Iterable[config_types.Path]]=None, target_architecture: Optional[str]=None):

Download and run the clang tool.

Args:

  • clang_dir: Path to clone clang into.
  • run_dirs: Dirs in which to run the clang tool.
  • target_architecture: If given, the architecture to transpile for.

recipe_modules / docker

DEPS: recipe_engine/path, recipe_engine/raw_io, recipe_engine/service_account, recipe_engine/step

class DockerApi(RecipeApi):

Provides steps to connect and run Docker images.

def __call__(self, *args, **kwargs):

Executes specified docker command.

Make sure to call api.docker.login beforehand if the specified command requires authentication.

Args:

  • args: arguments passed to the `docker` command, including the subcommand name, e.g. api.docker('push', 'my_image:latest').
  • kwargs: arguments passed down to the api.step module.

def ensure_installed(self, **kwargs):

Checks that the docker binary is in the PATH.

Raises StepFailure if the binary is not found.

def get_version(self):

Returns the installed Docker version, or None if detection failed.

def login(self, server='gcr.io', project='chromium-container-registry', service_account=None, step_name=None, **kwargs):

Connect to a Docker registry.

This step must be executed before any other step in this module that requires authentication.

Args:

  • server: GCP container registry to pull images from. Defaults to 'gcr.io'.
  • project: Name of the Cloud project where Docker images are hosted.
  • service_account: service_account.api.ServiceAccount used for authenticating with the container registry. Defaults to the task's associated service account.
  • step_name: Override step name. Default is 'docker login'.

def pull(self, image, step_name=None):

Pull a docker image from a remote repository.

Args:

  • image: Name of the image to pull.
  • step_name: Override step name. Default is 'docker pull'.

def python(self, name, script, args, **kwargs):

def run(self, image, step_name=None, cmd_args=None, dir_mapping=None, env=None, inherit_luci_context=False, **kwargs):

Run a command in a Docker image as the current user:group.

Args:

  • image: Name of the image to run.
  • step_name: Override step name. Default is 'docker run'.
  • cmd_args: Command to run in the image, as a list of arguments. If not specified, the default command embedded into the image is executed.
  • dir_mapping: List of (host_dir, docker_dir) tuples mapping host directories to directories in the Docker container. Directories are mapped as read-write.
  • env: dict of env variables.
  • inherit_luci_context: Inherit the current LUCI Context (including auth). CAUTION: removes network isolation between the container and the docker host. Read more at https://docs.docker.com/network/host/.
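A rough sketch of how such a call might translate into a `docker run` command line (illustrative only; the real module also handles user/group mapping and other details, and `--volume`/`--env` are standard docker CLI flags):

```python
def docker_run_args(image, cmd_args=None, dir_mapping=None, env=None):
    # Assembles a plausible `docker run` invocation from the same inputs
    # the module's run() accepts.
    args = ["docker", "run"]
    for host_dir, docker_dir in dir_mapping or []:
        args += ["--volume", "%s:%s" % (host_dir, docker_dir)]  # read-write
    for key, value in sorted((env or {}).items()):
        args += ["--env", "%s=%s" % (key, value)]
    args.append(image)
    args += list(cmd_args or [])
    return args

print(docker_run_args(
    "my_image:latest",
    cmd_args=["echo", "hi"],
    dir_mapping=[("/host/out", "/out")],
    env={"FOO": "bar"},
))
```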

recipe_modules / infra_checkout

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/gerrit, depot_tools/git, depot_tools/presubmit, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/raw_io, recipe_engine/step, recipe_engine/tricium

class InfraCheckoutApi(RecipeApi):

Stateless API for using public infra gclient checkout.

def apply_golangci_lint(self, co, go_module_root=None):

Apply golangci-lint to existing diffs and emit lint warnings via tricium.

go_module_root is an absolute path to the root of a Go module to lint. It should be under patch_root_path. If None, patch_root_path itself will be used.

def checkout(self, gclient_config_name, patch_root=None, path=None, internal=False, generate_py2_env=False, go_version_variant=None, **kwargs):

Fetches infra gclient checkout into a given path OR named_cache.

Arguments:

  • gclient_config_name (string) - name of gclient config.
  • patch_root (path or string) - path inside infra checkout to git repo in which to apply the patch. For example, ‘infra/luci’ for luci-py repo. If None (default), no patches will be applied.
  • path (path or string) - path to where to create/update infra checkout. If None (default) - path is cache with customizable name (see below).
  • internal (bool) - False by default, meaning the infra gclient checkout layout is assumed; otherwise infra_internal. This affects the default named_cache and which repo's Go environment the ./go/env.py command runs inside.
  • generate_py2_env uses the “infra/3pp/tools/cpython” package to create the infra/ENV python 2.7 virtual environment. This is only needed in specific situations such as running tests for python 2.7 GAE apps.
  • go_version_variant - can be set to “legacy” or “bleeding_edge” to force the builder to use a non-default Go version. What exact Go versions correspond to “legacy”, “bleeding_edge”, and the default is defined in bootstrap.py in infra.git.
  • kwargs - passed as is to bot_update.ensure_checkout.

Returns: a Checkout object with commands for common actions on infra checkout.

def get_footer_infra_deps_overrides(self, gerrit_change, step_test_data=None):

Returns revision overrides for infra repos parsed from the gerrit footer.

Checks the commit message for lines like `Try-<deps_name>: <revision>`, e.g. 'Try-infra: 123abc456def'.

Allowed values for <deps_name> are: ‘infra’ for infra/infra, ‘infra_internal’ for infra/infra_internal, ‘.’ for infra/infra_superproject

These deps names are based on what's found in infra/infra_superproject/DEPS.
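The footer format described above can be sketched as a small parser (illustrative only; the actual module delegates footer parsing to the recipe engine):

```python
import re

def deps_overrides(commit_message):
    # Parses 'Try-<deps_name>: <revision>' footer lines; '[\w.]+' also
    # accepts the special '.' deps name for infra/infra_superproject.
    overrides = {}
    for m in re.finditer(r"^Try-([\w.]+):\s*(\S+)\s*$", commit_message, re.M):
        overrides[m.group(1)] = m.group(2)
    return overrides

msg = "Fix a bug\n\nTry-infra: 123abc456def\nTry-.: deadbeef\n"
print(deps_overrides(msg))  # {'infra': '123abc456def', '.': 'deadbeef'}
```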

recipe_modules / infra_cipd

DEPS: recipe_engine/buildbucket, recipe_engine/context, recipe_engine/json, recipe_engine/runtime, recipe_engine/step

class InfraCIPDApi(RecipeApi):

API for building packages defined in infra's public and internal repos.

Essentially a shim around scripts in https://chromium.googlesource.com/infra/infra.git/+/main/build/ and its internal counterpart.

def build(self, sign_id=None):

Builds packages.

@contextlib.contextmanager
def context(self, path_to_repo, goos=None, goarch=None):

Sets the context for building CIPD packages.

Arguments:

  • path_to_repo (path): path to the infra or infra_internal repo root dir. Expects to find build/build.py inside the provided dir.
  • goos, goarch (str): allows setting GOOS and GOARCH for cross-compiling Go code.

Doesn't support nesting.

def tags(self, git_repo_url, revision):

Returns tags to be attached to uploaded CIPD packages.

def test(self):

Tests the integrity of previously built packages.

def upload(self, tags, step_test_data=None):

Uploads previously built packages.

recipe_modules / omahaproxy

DEPS: recipe_engine/raw_io, recipe_engine/url

class OmahaproxyApi(RecipeApi):

APIs for interacting with omahaproxy.

def history(self, min_major_version=None, exclude_platforms=None):

@staticmethod
def split_version(text):

recipe_modules / powershell

DEPS: recipe_engine/json, recipe_engine/step

class PowershellAPI(RecipeApi):

API to execute powershell scripts

def __call__(self, name, command, logs=None, args=None, ret_codes=None):

Execute a command through powershell.

Args:

  • name (str) - name of the step being run
  • command (str|path) - powershell command or windows script/exe to run
  • logs ([]str) - List of logs to read on completion. Specifying a dir reads all logs in that dir.
  • args ([]str) - List of args supplied to the command.
  • ret_codes ([]int) - List of return codes to be treated as success.

Returns: Dict containing ‘results’ as a key.

Raises: StepFailure if a failure is detected. See resources/psinvoke.py.

recipe_modules / qemu

DEPS: recipe_engine/cipd, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/raw_io, recipe_engine/step

class QEMUAPI(RecipeApi):

API to manage qemu VMs

def cleanup_disks(self):

cleanup_disks deletes all the disks in the disks dir. This is meant to be used for cleanup after using the VM

def create_disk(self, disk_name, fs_format='fat', min_size=0, include=None):

create_disk creates a virtual disk with the given name, format and size.

Optionally, it is possible to specify a list of paths to copy to the disk. If the given size is too small to fit the files, it may be increased to fit them all.

Args:

  • disk_name: name of the disk image file
  • fs_format: one of [exfat, ext3, fat, msdos, vfat, ext2, ext4, ntfs]
  • min_size: minimum size of the disk in megabytes (1048576 bytes); a bigger size is used if required
  • include: sequence of files and directories to copy to image

def create_empty_disk(self, disk_name, fs_format, size):

create_empty_disk creates an empty disk image and formats it

Args:

  • disk_name: name of the disk image file
  • fs_format: one of [ext3, fat, ntfs]
  • size: size of the disk image in bytes

@property
def disks(self):

def init(self, version):

Initialize the module, ensure that qemu exists on the system

Note:

  • QEMU is installed in cache/qemu
  • Virtual disks are stored in cleanup/qemu/disks

Args:

  • version: the cipd version tag for qemu

def mount_disk_image(self, disk, partitions=[1]):

mount_disk_image mounts the given image and returns the mount location and loop file used for mounting

Args:

  • disk: name of the disk image file
  • partitions: list of partitions to mount; if None, attempt to mount the whole image

Returns: loop file used for the disk and list of mount locations

@property
def path(self):

def powerdown_vm(self, name):

powerdown_vm sends a shutdown signal to the given VM, similar to pressing the power button on a physical device.

Args:

  • name: name of the vm to shutdown

Returns: True if powerdown signal was sent to VM. False otherwise

def quit_vm(self, name):

quit_vm sends a quit signal to the qemu process. Use this if your VM doesn't respond to powerdown signal.

Args:

  • name: name of the vm to quit

Returns: True if quit signal was sent to VM. False otherwise

def start_vm(self, arch, qemu_vm, kvm=False):

start_vm starts a qemu vm

QEMU is started with qemu_monitor running a qmp service. It also connects the serial port of the machine to a tcp port.

Args:

  • arch: The arch that the VM should be based on
  • qemu_vm: QEMU_VM proto object containing all the config for starting the vm
  • kvm: If true, the VM is run with hardware acceleration (KVM); otherwise it is emulated.

def unmount_disk_image(self, loop_file, partitions=[1]):

unmount_disk_image unmounts the disk mounted using the given loop_file

Args:

  • loop_file: Loop device used to mount the image

def vm_status(self, name):

vm_status returns a dict describing the status of the vm. The return value is the QMP response to query-status

Args:

  • name: name of the vm

Returns: QMP json response for status query

recipe_modules / recipe_autoroller

DEPS: depot_tools/depot_tools, depot_tools/git, depot_tools/git_cl, depot_tools/gsutil, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/json, recipe_engine/path, recipe_engine/proto, recipe_engine/random, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time

class RecipeAutorollerApi(RecipeApi):

def roll_projects(self, projects, db_gcs_bucket):

Attempts to roll each project from the provided list.

If rolling any of the projects leads to failures, other projects are not affected.

Args:

  • projects: list of (project_id, project_url) tuples, where project_id (string) is the id as found in recipes.cfg and project_url (string) is the Git repository URL of the project.
  • db_gcs_bucket (string): The GCS bucket used as a database for previous roll attempts.

recipe_modules / support_3pp

DEPS: depot_tools/git, depot_tools/osx_sdk, depot_tools/tryserver, depot_tools/windows_sdk, recipe_engine/archive, recipe_engine/bcid_reporter, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/defer, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step, recipe_engine/url

Allows uniform cross-compilation, version tracking and archival for third-party software packages (libs+tools) for distribution via CIPD.

The purpose of the Third Party Packages (3pp) recipe/module is to generate CIPD packages of statically-compiled software for distribution in our continuous integration fleets, as well as software distributed to our developers (e.g. via depot_tools).

The target OS and architecture use the CIPD “${os}-${arch}” (a.k.a. “${platform}”) nomenclature, which is currently defined in terms of Go's GOOS and GOARCH runtime variables (with the unfortunate exception that CIPD uses ‘mac’ instead of ‘darwin’). This is somewhat arbitrary, but has worked well so far for us.

Package Definitions

The 3pp module loads package definitions from a folder containing subfolders. Each subfolder defines a single software package to fetch, build and upload. For example, you might have a folder in your repo like this:

my_repo.git/
  3pp/  # "root folder"
    .vpython3            # common vpython file for all package scripts
    zlib/                # zlib "package folder"
      3pp.pb             # REQUIRED: the Spec.proto definition for zlib
      install.sh         # a script to build zlib from source
      extra_resource_file
    other_package/
      3pp.pb             # REQUIRED
      fetch.py           # a script to fetch `other_package` in a custom way
      install.sh
      install-win.sh     # windows-specific build script
    ...

This defines two packages (zlib, and other_package). The 3pp.pb files have references to the fetch/build scripts, and describe what dependencies the packages have (if any).

NOTE: Only one layer of package folders is supported currently.

Packages are named by the folder that contains their definition file (3pp.pb) and build scripts. It's preferable to have a package named after the software it contains. However, sometimes you want multiple major versions of the software to exist side-by-side (e.g. pcre and pcre2, python and python3, etc.). In this case, use two separate package definition folders.

Each package folder contains a package spec (3pp.pb), as well as scripts, patches and/or utility tools to build the software from source.

The spec is a Text Proto document specified by the spec.proto schema.

The spec is broken up into two main sections, “create” and “upload”. The create section allows you to specify how the package software gets created, and allows specifying differences in how it's fetched/built/tested on a per-target basis, and the upload section has some details on how the final result gets uploaded to CIPD.
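As an illustration, a minimal 3pp.pb for a hypothetical package might look like the sketch below (the download URL, version and prefix are made up; field names follow the spec.proto schema):

```textproto
create {
  platform_re: "linux-.*|mac-.*"
  source {
    url {
      download_url: "https://example.com/foo-1.2.3.tar.gz"
      version: "1.2.3"
    }
    unpack_archive: true
  }
  build {
    install: "install.sh"
  }
}

upload { pkg_prefix: "tools" }
```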

Creation Stages

The 3pp.pb spec begins with a series of create messages, each with details on how to fetch+build+test the package. Each create message contains a “platform_re” field which works as a regex on the ${platform} value. All matching patterns apply in order, and non-matching patterns are skipped. Each create message is applied with a dict.update for each member message (i.e. [‘source’].update, [‘build’].update, etc.) to build a singular create message for the current target platform. For list values (e.g. ‘tool’, ‘dep’ in the Build message), you can clear them by providing a new empty value (e.g. tool: "")

Once all the create messages are merged (see schema for all keys that can be present), the actual creation takes place.

Note that “source” is REQUIRED in the final merged instruction set. All other messages are optional and have defaults as documented in spec.proto.

The creation process is broken up into 4 different stages:

  • Source
  • Build
  • Package
  • Verify
Envvars

All scripts described below are invoked with the following environment variables set:

  • $_3PP_PACKAGE_NAME - the name of the package currently building
  • $_3PP_PATCH_VERSION - the patch_version set for the version we're building (if any patch version was set).
  • $_3PP_PLATFORM - the platform we're targeting
  • $_3PP_TOOL_PLATFORM - the platform that we're building on (will differ from $_3PP_PLATFORM if we're cross-compiling)
  • $_3PP_VERSION - the version we're building, e.g. 1.2.3
  • $GOOS - The golang OS name we're targeting
  • $GOARCH - The golang architecture we're targeting
  • $MACOSX_DEPLOYMENT_TARGET - On OS X, set to 10.10, for your semi-up-to-date OS X building needs. This needs to be consistently set for all packages or it will cause linker warnings/errors when linking in static libs that were targeting a newer version (e.g. if it was left unset). Binaries built with this set to 10.10 will not run on 10.9 or older systems.

Additionally, on cross-compile environments, the $CROSS_TRIPLE environment variable is set to a GCC cross compile target triplet of cpu-vendor-os.

Source

The source is used to fetch the raw sources for assembling the package. In some cases the sources may actually be binary artifacts (e.g. prebuilt windows installers).

The source is unpacked to a checkout directory, possibly in some specified subdirectory. Sources can either produce the actual source files, or they can produce a single archive file (e.g. zip or tarball), which can be unpacked with the ‘unpack_archive’ option. In addition, patches can be applied to the source with the ‘patch_dir’ option (the patches should be in git format-patch format, and will be applied with git apply).

  • git - This checks out a semver tag in the repo.
  • cipd - This fetches data from a CIPD package.
  • url - This is used for packages that do not provide a stable distribution mechanism like git for their source code. The original download URL is passed to this method to download the source artifact from the third-party distribution.
  • script - Used for “weird” packages which are distributed via e.g. an HTML download page or an API. The script must be able to return the ‘latest’ version of its source, as well as to actually fetch a specified version. Python fetch scripts will be executed with vpython3, and so may have a .vpython3 file (or similar) in the usual manner to pull in dependencies like requests.

Additionally the Source message contains a patch_version field to allow symver disambiguation of the built packages when they contain patches or other alterations which need to be versioned. This string will be joined with a ‘.’ to the source version being built when uploading the result to CIPD.

Build

The build message allows you to specify deps, and tools, as well as a script install which contains your logic to transform the source into the result package.

Deps are libraries built for the target ${platform} and are typically used for linking your package.

Tools are binaries built for the host; they're things like automake or sed that are used during the configure/make phase of your build, but aren't linked into the built product. These tools will be available on $PATH (both ‘$tools’ and ‘$tools/bin’ are added to $PATH, because many packages are set up with their binaries at the base of the package, and some are set up with them in a /bin folder).

Installation occurs by invoking the script indicated by the ‘install’ field (with the appropriate interpreter, depending on the file extension) like:

<interpreter> "$install[*]" "$PREFIX" "$DEPS_PREFIX"

Where:

  • The current working directory is the base of the source checkout w/o subdir.
  • $install[*] are all of the tokens in the ‘install’ field.
  • $PREFIX is the directory which the script should install everything to; this directory will be archived into CIPD verbatim.
  • $DEPS_PREFIX is the path to a prefix directory containing the union of all of your packages' transitive deps. For example, all of the headers of your deps are located at $DEPS_PREFIX/include.
  • All tools are in $PATH

If the ‘install’ script is omitted, it is assumed to be ‘install.sh’.

If the ENTIRE build message is omitted, no build takes place. Instead the result of the ‘source’ stage will be packaged.

During the execution of the build phase, the package itself and its dependent packages (e.g. “dep” and “tool” in PB file) will be copied into the source checkout in the .3pp directory, and the script will be invoked as /path/to/checkout/.3pp/<cipd_pkg_name>/$script_name. If the package has shared resources (like .vpython3 files or helper scripts) which are outside of the package directory, you would need to create a symbolic link for it. See chromium.googlesource.com/infra/infra/+/main/3pp/cpython_common/ssl_suffix.py as an example.

Package

Once the build stage is complete, all files in the $PREFIX folder passed to the install script will be zipped into a CIPD package.

It is strongly recommended that if your package is a library or tool with many files that it be packaged in the standard POSIXey PREFIX format (e.g. bin, lib, include, etc.). If your package is a collection of one or more standalone binaries, it's permissible to just have the binaries in the root of the output $PREFIX.

If the build stage is skipped (i.e. the build message is omitted) then the output of the source stage will be packaged instead (this is mostly useful when using a ‘script’ source).

Verify

After the package is built it can be optionally tested. The recipe will run your test script in an empty directory, passing the path to the packaged-but-not-yet-uploaded cipd package file, and the script can do whatever testing it needs (exiting non-zero if something is wrong). You can use the cipd pkg-deploy command to deploy it (or whatever cipd commands you like, though I wouldn't recommend uploading it to CIPD, as the 3pp recipe will do that after the test exits 0).

Additionally, vpython3 for the tool platform will be guaranteed to be in $PATH.

Upload

Once the test comes back positive, the CIPD package will be uploaded to the CIPD server and registered with the prefix indicated in the upload message. The full CIPD package name is constructed as:

<prefix>/<pkg_name>/${platform}

So for example with the prefix infra, the bzip2 package on linux-amd64 would be uploaded to infra/bzip2/linux-amd64 and tagged with the version that was built (e.g. version:1.2.3.patch_version1).

You can also mark the upload as a universal package, which will:

  • Omit the ${platform} suffix from the upload name
  • Only build the package on the ‘linux-amd64’ platform. This was chosen to ensure that “universal” packages build consistently.
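The naming and tagging rules above can be sketched as follows (illustrative helper functions, not part of the 3pp API):

```python
def cipd_package_name(prefix, pkg_name, platform, universal=False):
    """Constructs the full CIPD package name: <prefix>/<pkg_name>/${platform}.

    Universal packages omit the ${platform} suffix.
    """
    if universal:
        return f"{prefix}/{pkg_name}"
    return f"{prefix}/{pkg_name}/{platform}"


def version_tag(semver, patch_version=None):
    """Builds the version tag, e.g. version:1.2.3.patch_version1."""
    if patch_version:
        return f"version:{semver}.{patch_version}"
    return f"version:{semver}"
```

So `cipd_package_name("infra", "bzip2", "linux-amd64")` reproduces the infra/bzip2/linux-amd64 example above.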

Versions

Every package will try to build the latest identifiable semver of its source, or will attempt to build the semver requested as an input property to the 3pp recipe. This semver is also used to tag the uploaded artifacts in CIPD.

Because some of the packages here are used as dependencies for others (e.g. curl and zlib are dependencies of git, and zlib is a dependency of curl), each package used as a dependency for others should specify its version explicitly (currently this is only possible to do with the ‘cipd’ source type). So e.g. zlib and curl specify their source versions, but git and python float at ‘head’, always building the latest tagged version fetched from git.

When building a floating package (e.g. python, git) you may explicitly state the semver that you wish to build as part of the recipe invocation.

The semver of a package (either specified in the package definition, in the recipe properties, or discovered while fetching its source code (e.g. latest git tag)) is also used to tag the package when it's uploaded to CIPD (plus the patch_version in the source message).

Cross Compilation

Third party packages are currently compiled on linux using the ‘infra.tools.dockerbuild’ tool from the infra.git repo. This uses a slightly modified version of the dockcross Docker cross-compile environment. Windows and OS X targets are built using the ‘osx_sdk’ and ‘windows_sdk’ recipe modules, each of which provides a hermetic (native) build toolchain for those platforms.

For linux, we can support all the architectures implied by dockerbuild, including:

  • linux-arm64
  • linux-armv6l
  • linux-mips32
  • linux-mips64
  • linux-amd64

Dry runs / experiments

If the recipe is run with force_build it will always build all packages indicated. Dependencies will be built if they do not exist in CIPD. None of the built packages will be uploaded to CIPD.

The recipe must always be run with a package_prefix (by assigning to the .package_prefix property on the Support3ppApi). If the recipe is run in experimental mode, ‘experimental/’ will be prepended to this. Additionally, you may specify experimental: true in the Create message for a package, which will have the same effect when running the recipe in production (to allow adding new packages or package/platform combinations experimentally).

Examples

As an example of the package definition layout in action, take a look at the 3pp folder in this infra.git repo.

Caches

This module uses the following named caches:

  • 3pp_cipd - Caches all downloaded and uploaded CIPD packages. Currently tag lookups are performed every time against the CIPD server, but this will hold the actual package files.
  • osx_sdk - Cache for depot_tools/osx_sdk. Only on Mac.
  • windows_sdk - Cache for depot_tools/windows_sdk. Only on Windows.

class Support3ppApi(RecipeApi):

def ensure_uploaded(self, packages=(), platform='', force_build=False, tryserver_affected_files=(), use_pkgbuild=False):

Executes the entire {fetch,build,package,verify,upload} pipeline for all the packages listed, targeting the given platform.

Args:

  • packages (seq[str]) - A sequence of packages to ensure are uploaded. Packages must be listed as either ‘pkgname’ or ‘pkgname@version’. If empty, builds all loaded packages.
  • platform (str) - If specified, the CIPD ${platform} to build for. If unspecified, this will be the appropriate CIPD ${platform} for the current host machine.
  • force_build (bool) - If True, all applicable packages and their dependencies will be built, regardless of the presence in CIPD. The source and built packages will not be uploaded to CIPD.
  • tryserver_affected_files (seq[Path]) - If given, run in tryserver mode where the specified files (which must correspond to paths that were passed to load_packages_from_path) have been modified in the CL. All affected packages, and any that depend on them (recursively) are built. If any files are modified which cannot be mapped to a specific package, all packages are rebuilt. Overrides ‘packages’, and forces force_build=True (packages are never uploaded in this mode).
  • use_pkgbuild (bool) - If True, use the experimental pkgbuild to build 3pp packages and skip the rest of the 3pp recipe. This will not upload packages in any case.

Returns (list[(cipd_pkg, cipd_version)], set[str]) of built CIPD packages and their tagged versions, as well as a list of unsupported packages.

def initialize(self):

def load_packages_from_path(self, base_path, glob_pattern=‘**/3pp.pb’, check_dup=True):

Loads all package definitions from the given base_path and glob pattern inside the git repository.

To include package definitions across multiple git repositories or git submodules, load_packages_from_path needs to be called for each repository.

This will parse and intern all the 3pp.pb package definition files so that packages can be identified by their cipd package name. For example, if you pass:

path/
  pkgname/
    3pp.pb
    install.sh

And the file “path/pkgname/3pp.pb” has the following content:

upload { pkg_prefix: “my_pkg_prefix” }

Its cipd package name will be “my_pkg_prefix/pkgname”.

Args:

  • base_path (Path) - A path of a directory where the glob_pattern will be applied to.
  • glob_pattern (str) - A glob pattern to look for package definition files “3pp.pb” whose behavior is defined by 3pp.proto. Defaults to “**/3pp.pb”.
  • check_dup (bool) - When set, raises a DuplicatePackage error if a package spec is already loaded. Defaults to True.

Returns a set(str) containing the cipd package names of the packages which were loaded.

Raises a DuplicatePackage exception if this function encounters a cipd package name which is already registered. This could occur if you call load_packages_from_path multiple times, and one of the later calls tries to load a package which was registered under one of the earlier calls.
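The naming rule from the example above can be sketched as follows (a hypothetical helper for illustration, not part of the module's API):

```python
import posixpath


def cipd_name_from_definition(pkg_prefix, pb_path):
    """Derives the interned CIPD package name for a 3pp.pb file (sketch).

    The name of the directory containing the 3pp.pb file is appended to
    the pkg_prefix from its upload message.
    """
    pkg_dir = posixpath.basename(posixpath.dirname(pb_path))
    return posixpath.join(pkg_prefix, pkg_dir)
```

With pkg_prefix “my_pkg_prefix” and the file “path/pkgname/3pp.pb”, this yields “my_pkg_prefix/pkgname” as described above.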

def package_prefix(self, experimental=False):

Returns the CIPD package name prefix (str), if any is set.

This will prepend ‘experimental/’ to the currently set prefix if:

  • The recipe is running in experimental mode; OR
  • You pass experimental=True
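A minimal sketch of this prefix logic (illustrative only; the function name is not part of the API):

```python
def effective_prefix(prefix, running_experimental, experimental=False):
    """Returns the CIPD package name prefix, with 'experimental/'
    prepended when the recipe runs in experimental mode or when
    experimental=True is passed explicitly."""
    if running_experimental or experimental:
        return "experimental/" + prefix
    return prefix
```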

def set_experimental(self, experimental):

Set the experimental mode (bool).

def set_package_prefix(self, prefix):

Set the CIPD package name prefix (str).

All CIPDSpecs for built packages (not sources) will have this string prepended to them.

def set_source_cache_prefix(self, prefix):

Set the CIPD namespace (str) to store the source of the packages.

recipe_modules / windows_adk

DEPS: powershell, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step

class WindowsADKApi(RecipeApi):

API for using Windows ADK distributed via CIPD.

def cleanup(self):

Remove the ADK and WinPE.

def cleanup_win_adk(self):

Cleanup the Windows ADK.

def cleanup_winpe(self):

Cleanup WinPE.

def ensure(self, install=True):

Ensure the presence of the Windows ADK.

def ensure_win_adk(self, refs):

Downloads & Installs the Windows ADK.

def ensure_win_adk_winpe(self, refs):

Ensures that the WinPE add-on is available.

recipe_modules / windows_scripts_executor

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/git, depot_tools/gitiles, depot_tools/gsutil, powershell, qemu, recipe_engine/archive, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/raw_io, recipe_engine/step

class WindowsPSExecutorAPI(RecipeApi):

API for using Windows PowerShell scripts.

def download_all_packages(self, custs):

download_all_packages downloads all the packages referenced by the given custs.

Args:

  • custs: List of Customizations object from customizations.py

def execute_customizations(self, custs):

Executes the windows image builder user config.

Args:

  • custs: List of Customizations object from customizations.py

def filter_executable_customizations(self, customizations):

filter_executable_customizations generates a list of customizations that need to be executed.

Args:

  • customizations: List of Customizations object from customizations.py

def gen_canonical_configs(self, customizations):

gen_canonical_configs strips all the names in the config and returns individual configs containing one customization per image.

Example: Given an Image

Image{
  arch: x86,
  name: "windows10_x86_GCE",
  customizations: [
    Customization{
      OfflineWinPECustomization{
        name: "winpe_networking"
        image_dest: GCSSrc{
          bucket: "chrome-win-wim"
          source: "rel/win10_networking.wim"
        }
        ...
      }
    },
    Customization{
      OfflineWinPECustomization{
        name: "winpe_diskpart"
        image_src: Src{
          gcs_src: GCSSrc{
            bucket: "chrome-win-wim"
            source: "rel/win10_networking.wim"
          }
        }
        ...
      }
    }
  ]
}

Writes two configs: windows10_x86_GCE-winpe_networking.cfg with

Image{
  arch: x86,
  name: "",
  customizations: [
    Customization{
      OfflineWinPECustomization{
        name: ""
        image_dest: GCSSrc{
          bucket: "chrome-win-wim"
          source: "rel/win10_networking.wim"
        }
        ...
      }
   }
  ]
}

and windows10_x86_GCE-winpe_diskpart.cfg with

Image{
  arch: x86,
  name: "",
  customizations: [
    Customization{
      OfflineWinPECustomization{
        name: ""
        image_src: Src{
          gcs_src: GCSSrc{
            bucket: "chrome-win-wim"
            source: "rel/win10_networking.wim"
          }
        }
        ...
      }
    }
  ]
}

to disk, calculates the hash for each config and sets the key for each of them. The strings representing the names of the image, customizations, etc. are set to empty before calculating the hash, so that the hash depends only on the actual contents.

Args:

  • customizations: List of Customizations object from customizations.py
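The name-blanking-then-hashing behavior can be sketched as follows. This is a simplified illustration using JSON dicts and SHA-256; the real module operates on wib.Image protos and its exact serialization and hash scheme may differ:

```python
import hashlib
import json


def canonical_hash(config):
    """Hashes a config with all 'name' fields blanked (sketch).

    Blanking names before hashing means two configs that differ only
    in their names produce the same key.
    """
    def blank_names(node):
        # Recursively set every 'name' field to the empty string.
        if isinstance(node, dict):
            return {k: ("" if k == "name" else blank_names(v))
                    for k, v in node.items()}
        if isinstance(node, list):
            return [blank_names(v) for v in node]
        return node

    canonical = json.dumps(blank_names(config), sort_keys=True)
    return hashlib.sha256(canonical.encode()).hexdigest()
```

For example, the two customizations above that share the same GCSSrc but have different names would hash to the same key under this scheme.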

def gen_executable_configs(self, custs):

gen_executable_configs generates wib.Image configs that can be executed.

Given a list of custs that can be run on a builder, generates wib.Image proto configs, making sure that they can be executed independently. Some customizations are dependent on others for inputs and can only be executed if the dependent input is generated in the same wib.Image proto or is already available.

Args:

  • custs: list of customization objects from customizations.py that can be executed on the same builder

Returns a list of tuples, each containing a config and the set of customization hashes that can be executed at that time.

def get_executable_configs(self, custs):

get_executable_configs returns a list of images that can be executed at this time.

Each image is generated after determining which customizations can be executed as part of it. A list of keys representing the customizations to be executed is also returned with the image.

Args:

  • custs: List of customizations to be processed.

Returns a dict mapping builder name to the (image, key_list) tuple.

def init(self, try_job=False):

init initializes all the dirs and sub modules required.

def init_customizations(self, config):

init_customizations initializes the given config and returns a list of customizations.

Args:

  • config: wib.Image proto config

def pin_customizations(self, customizations, ctx):

pin_customizations pins all the sources in the customizations

Args:

  • customizations: List of Customizations object from customizations.py
  • ctx: dict containing the context for the customization

def process_customizations(self, custs, ctx, inputs=()):

process_customizations pins all the volatile srcs and generates canonical configs.

Args:

  • custs: List of customizations from customization.py
  • ctx: dict containing the context for the customization
  • inputs: List of inputs that are required

Returns the list of customizations in the order they were processed.

def trim_uploads(self, customizations):

trim_uploads removes the user specified uploads from a config.

def update_context(self, custs, ctx):

update_context returns a dict with all the contexts updated.

Args:

  • custs: List of customizations from customization.py
  • ctx: Current context

Returns updated context dict

recipe_modules / windows_sdk

DEPS: recipe_engine/cipd, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/step

class WindowsSDKApi(RecipeApi):

API for using Windows SDK distributed via CIPD.

@contextmanager
def __call__(self, path=None, version=None, enabled=True):

Sets up the SDK environment when enabled.

Args:

  • path (Path) - Path to a directory where to install the SDK (default is ‘[start_dir]/cipd/windows_sdk’).
  • version (str) - CIPD instance ID, tag or ref of the SDK (default is set via the $infra/windows_sdk.version property).
  • enabled (bool) - Whether the SDK should be used or not.

Raises: StepFailure or InfraFailure.

recipe_modules / zip

DEPS: recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/raw_io, recipe_engine/step

class ZipApi(RecipeApi):

Provides steps to zip and unzip files.

def directory(self, step_name, directory, output, comment=None):

Step to compress a single directory.

Args:

  • step_name: display name of the step.
  • directory: path to a directory to compress; it becomes the root of the archive, i.e. |directory|/file.txt would be named ‘file.txt’ in the archive.
  • output: path to a zip file to create.
  • comment: the archive comment to set on the created ZIP file.

def get_comment(self, step_name, zip_file):

Returns the archive comment from |zip_file|.

Args:

  • step_name: display name of a step.
  • zip_file: path to a zip file to read; should exist.

def make_package(self, root, output):

Returns ZipPackage object that can be used to compress a set of files.

Usage:

    pkg = api.zip.make_package(root, output)
    pkg.add_file(root / 'file')
    pkg.add_directory(root / 'directory')
    yield pkg.zip('zipping step')

Args:

  • root: a directory that becomes the root of the package; all files added to the archive will have archive paths relative to this directory.
  • output: path to a zip file to create.

Returns: ZipPackage object.

def unzip(self, step_name, zip_file, output, quiet=False):

Step to uncompress |zip_file| into |output| directory.

The zip package will be unpacked to |output| so that the root of the archive is |output|, i.e. archive.zip/file.txt will become |output|/file.txt.

Step will FAIL if |output| already exists.

Args:

  • step_name: display name of a step.
  • zip_file: path to a zip file to uncompress; should exist.
  • output: path to a directory to unpack to; it should NOT exist.
  • quiet (bool): If True, print terse output instead of the name of each unzipped file.

def update_package(self, root, output):

Returns ZipPackage object that can be used to update an existing package.

Usage:

    pkg = api.zip.update_package(root, output)
    pkg.add_file(root / 'file')
    pkg.add_directory(root / 'directory')
    yield pkg.zip('updating zip step')

Args:

  • root: the root directory for adding new files/dirs to the package; all files/dirs added to the archive will have archive paths relative to this directory.
  • output: path to a zip file to update.

Returns: ZipPackage object.

Recipes

recipes / 3pp

DEPS: depot_tools/git, depot_tools/tryserver, support_3pp, recipe_engine/bcid_reporter, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step

This recipe builds and packages third party software, such as Git.

def RunSteps(api, package_locations, to_build, platform, force_build, package_prefix, source_cache_prefix, use_pkgbuild):

recipes / build_from_tarball

DEPS: depot_tools/gsutil, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / build_wheels

DEPS: depot_tools/gclient, depot_tools/git, depot_tools/osx_sdk, depot_tools/tryserver, depot_tools/windows_sdk, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

@contextmanager
def PlatformSdk(api, platforms):

def RunSteps(api, platforms, dry_run, rebuild):

recipes / buildenv:examples/simple

DEPS: buildenv, recipe_engine/path, recipe_engine/step

def RunSteps(api):

recipes / chromium_bootstrap/test

This recipe verifies importing of chromium bootstrap protos.

The protos are exported via a symlink in //recipe/recipe_proto/infra/chromium.

def RunSteps(api):

recipes / cloudbuildhelper:examples/discover

DEPS: cloudbuildhelper, recipe_engine/path

def RunSteps(api):

recipes / cloudbuildhelper:examples/full

DEPS: cloudbuildhelper, recipe_engine/json, recipe_engine/path, recipe_engine/step

def RunSteps(api):

def build(api):

def repo_checkout_metadata(api):

def restrictions(api):

def upload(api):

recipes / cloudbuildhelper:examples/roll

DEPS: cloudbuildhelper, recipe_engine/path, recipe_engine/properties

def RunSteps(api, commit):

recipes / cloudbuildhelper:examples/version_label

DEPS: cloudbuildhelper, recipe_engine/path, recipe_engine/properties, recipe_engine/step

def RunSteps(api, commit_position):

recipes / cloudkms:examples/usage

DEPS: cloudkms, recipe_engine/path

def RunSteps(api):

recipes / codesearch:examples/full

DEPS: depot_tools/bot_update, depot_tools/gclient, codesearch, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/path, recipe_engine/runtime, recipe_engine/step

def RunSteps(api):

recipes / codesearch:tests/checkout_generated_files_repo_and_sync

DEPS: depot_tools/bot_update, depot_tools/gclient, codesearch, recipe_engine/properties

def RunSteps(api):

recipes / codesearch:tests/clone_and_run_clang_tool

DEPS: codesearch, recipe_engine/path, recipe_engine/properties

def RunSteps(api, properties):

recipes / codesearch:tests/configs

DEPS: codesearch, recipe_engine/properties

def RunSteps(api):

recipes / codesearch:tests/create_and_upload_kythe_index_pack

DEPS: depot_tools/bot_update, depot_tools/gclient, codesearch, recipe_engine/properties

def RunSteps(api):

recipes / cv_testing/tryjob

DEPS: recipe_engine/cv, recipe_engine/properties, recipe_engine/step

Recipe to test LUCI CQ/CV itself.

def RunSteps(api, properties):

recipes / depot_tools_builder

DEPS: depot_tools/git, depot_tools/gsutil, zip, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Recipe to build windows depot_tools bootstrap zipfile.

def RunSteps(api):

recipes / docker:examples/full

DEPS: docker, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api):

recipes / docker_image_builder

DEPS: depot_tools/bot_update, depot_tools/gclient, docker, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/service_account, recipe_engine/step, recipe_engine/time

def RunSteps(api, arch_type):

recipes / fleet_systems/dhcp

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/tryserver, docker, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

Test chrome-golo repo DHCP configs using dhcpd binaries via docker.

def RunSteps(api):

recipes / gae_tarball_uploader

DEPS: depot_tools/git, buildenv, cloudbuildhelper, infra_checkout, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

def RunSteps(api, properties):

recipes / gerrit_hello_world

DEPS: recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step, recipe_engine/time

Pushes a trivial CL to Gerrit to verify git authentication works on LUCI.

def RunSteps(api):

recipes / gerrit_plugins

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/gsutil, zip, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/step

def RunSteps(api):

recipes / git_cache_updater

DEPS: depot_tools/depot_tools, depot_tools/git, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step, recipe_engine/url

Updates the Git Cache zip files.

def RunSteps(api, inputs):

recipes / gsutil_hello_world

DEPS: depot_tools/depot_tools, depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/step, recipe_engine/time

Pushes a trivial CL to Gerrit to verify git authentication works on LUCI.

def RunSteps(api):

recipes / images_builder

DEPS: depot_tools/gerrit, depot_tools/git, buildenv, cloudbuildhelper, infra_checkout, recipe_engine/buildbucket, recipe_engine/futures, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time

def RunSteps(api, properties):

recipes / images_pins_roller

DEPS: depot_tools/git, depot_tools/git_cl, cloudbuildhelper, infra_checkout, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/properties

def RunSteps(api, properties):

recipes / infra_checkout:examples/ci

DEPS: infra_checkout, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / infra_checkout:examples/try

DEPS: depot_tools/gerrit, infra_checkout, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/platform, recipe_engine/raw_io

def RunSteps(api):

recipes / infra_cipd:examples/usage

DEPS: infra_cipd, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / infra_continuous

DEPS: depot_tools/bot_update, depot_tools/depot_tools, depot_tools/gclient, depot_tools/osx_sdk, infra_checkout, infra_cipd, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/defer, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/resultdb, recipe_engine/runtime, recipe_engine/step

def RunSteps(api):

def build_main(api, checkout, buildername, project_name, repo_url, rev):

def run_python_tests(api, checkout, project_name):

def should_run_python_tests(api, builder_name):

recipes / infra_frontend_tester

DEPS: depot_tools/bot_update, depot_tools/gclient, infra_checkout, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/nodejs, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunFrontendTests(api, cwd, app_name):

def RunInfraFrontendTests(api, root_path):

This function runs the UI tests in infra project.

def RunInfraInternalFrontendTests(api, root_path):

This function runs UI tests in infra_internal project.

def RunLuciGoTests(api, root_path):

This function runs UI tests in the luci-go project.

def RunSteps(api):

recipes / infra_repo_trybot

DEPS: depot_tools/depot_tools, depot_tools/osx_sdk, infra_checkout, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/defer, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/runtime, recipe_engine/step

def RunSteps(api, go_version_variant, run_lint, skip_python_tests):

def should_run_python_tests(api):

recipes / luci_go

DEPS: depot_tools/osx_sdk, infra_checkout, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/resultdb, recipe_engine/step, recipe_engine/tricium

def RunSteps(api, GOARCH, go_version_variant, run_integration_tests, run_lint):

recipes / luci_py

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/git, infra_checkout, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/platform, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api):

recipes / powershell:examples/test

DEPS: powershell, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties

def RunSteps(api):

recipes / qemu:examples/basic

DEPS: qemu, recipe_engine/path

def RunSteps(api):

recipes / qemu:examples/create_disk

DEPS: qemu, recipe_engine/path, recipe_engine/raw_io

def RunSteps(api):

recipes / qemu:examples/mount_disk_image

DEPS: qemu, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, inputs):

recipes / qemu:examples/powerdown_vm

DEPS: qemu, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io

def RunSteps(api):

recipes / qemu:examples/quit_vm

DEPS: qemu, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io

def RunSteps(api):

recipes / qemu:examples/start_vm

DEPS: qemu, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, qemu_vm):

recipes / qemu:examples/status_vm

DEPS: qemu, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io

def RunSteps(api):

recipes / recipe_autoroller

DEPS: recipe_autoroller, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/properties, recipe_engine/proto, recipe_engine/time

Rolls recipes.cfg dependencies for public projects.

def RunSteps(api, projects, db_gcs_bucket):

recipes / recipe_autoroller:examples/full

DEPS: recipe_autoroller, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/properties, recipe_engine/proto, recipe_engine/time

def RunSteps(api, projects, db_gcs_bucket):

recipes / recipe_bundler

DEPS: recipe_engine/cipd, recipe_engine/path, recipe_engine/properties, recipe_engine/step

def RunSteps(api, recipe_bundler_pkg, recipe_bundler_vers, repo_specs, repo_specs_optional, package_name_prefix, package_name_internal_prefix):

recipes / recipe_roll_tryjob

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/git, depot_tools/tryserver, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api, upstream_id, upstream_url, downstream_id, downstream_url):

recipes / recipe_simulation

DEPS: depot_tools/bot_update, depot_tools/gclient, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

A continuous builder which runs recipe tests.

def RunSteps(api, git_repo):

recipes / recipes_py_continuous

DEPS: depot_tools/bot_update, depot_tools/gclient, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/path, recipe_engine/properties

def RunSteps(api):

recipes / support_3pp:tests/full

DEPS: depot_tools/tryserver, support_3pp, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def RunSteps(api, GOOS, GOARCH, experimental, load_dupe, package_prefix, source_cache_prefix, to_build, tryserver_affected_files, use_pkgbuild):

recipes / tricium_infra

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/gerrit, depot_tools/tryserver, infra_checkout, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step, recipe_engine/tricium

def RunSteps(api, inputs):

This recipe runs legacy analyzers for the infra repo.

recipes / update_submodules_mirror

DEPS: depot_tools/gclient, depot_tools/git, depot_tools/gitiles, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step

def GetSubmodules(api, deps, source_checkout_name, overlays):

def RefToRemoteRef(ref):

def RunSteps(api, source_repo, target_repo, extra_submodules, cache_name, overlays, internal, with_tags, ref_patterns, refs_to_skip, push_to_refs_cs):

recipes / windows_adk:examples/ensure

DEPS: windows_adk, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties

def RunSteps(api):

recipes / windows_image_builder/offline

DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/gitiles, depot_tools/tryserver, windows_adk, windows_scripts_executor, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/proto, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step

def RunSteps(api, inputs):

This recipe runs image builder for a given user config.

def mock_lsdir(path):

def mock_tests(config):

def url_title(build):

url_title is a helper function to display the customization name over the build link in the schedule process. Returns a string formatted with the builder name and customization.

recipes / windows_image_builder/online_windows_customization

DEPS: depot_tools/gitiles, depot_tools/tryserver, windows_adk, windows_scripts_executor, recipe_engine/json, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, image):

This recipe executes offline_winpe_customization.

recipes / windows_image_builder/winpe_customization

DEPS: depot_tools/gitiles, depot_tools/tryserver, windows_adk, windows_scripts_executor, recipe_engine/json, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, image):

This recipe executes offline_winpe_customization.

recipes / windows_scripts_executor:examples/add_windows_driver

DEPS: windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/add_windows_package

DEPS: windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/cipd_test

DEPS: windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/customization_mode

DEPS: depot_tools/gsutil, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/edit_offline_registry_test

DEPS: windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/execute_online_customization

DEPS: depot_tools/gsutil, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/gcs_test

DEPS: depot_tools/gsutil, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/get_executable_configs

DEPS: depot_tools/bot_update, depot_tools/gclient, windows_adk, windows_scripts_executor, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/proto, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step

def RunSteps(api, config):

def lsdir(path):

def tests(config):

recipes / windows_scripts_executor:examples/git_test

DEPS: depot_tools/gitiles, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/online_windows_customization

DEPS: depot_tools/gitiles, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/powershell_expression

DEPS: depot_tools/gsutil, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/process_customizations

DEPS: windows_scripts_executor, recipe_engine/platform, recipe_engine/properties

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/shutdown_vm

DEPS: depot_tools/gsutil, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/test

DEPS: depot_tools/gitiles, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/trim_uploads

DEPS: windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_scripts_executor:examples/windows_iso

DEPS: depot_tools/gsutil, windows_scripts_executor, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io

def RunSteps(api, config):

recipes / windows_sdk:examples/full

DEPS: windows_sdk, recipe_engine/platform, recipe_engine/properties, recipe_engine/step

def RunSteps(api):

recipes / zip:examples/full

DEPS: zip, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/step

def RunSteps(api):