DEPS: depot_tools/git, depot_tools/git_cl, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/raw_io, recipe_engine/step
API for calling 'cloudbuildhelper' tool.
See https://chromium.googlesource.com/infra/infra/+/master/build/images/.
— def build(self, manifest, canonical_tag=None, build_id=None, infra=None, labels=None, tags=None, step_test_image=None):
Calls 'cloudbuildhelper build <manifest>', interpreting the result.
Args:
Returns: Image instance or NotUploadedImage if the YAML doesn't specify a registry.
Raises: StepFailure on failures.
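For orientation, a minimal recipe sketch of building an image from a manifest; the DEPS list, manifest path, and tag are illustrative assumptions, not taken from the module docs:

```python
DEPS = ['cloudbuildhelper', 'recipe_engine/path']

def RunSteps(api):
  # Build the image described by a (hypothetical) manifest YAML. On success
  # the returned Image describes what was uploaded; if the YAML doesn't
  # specify a registry, a NotUploadedImage is returned instead.
  img = api.cloudbuildhelper.build(
      api.path['start_dir'].join('build', 'images', 'example.yaml'),
      canonical_tag='ci-2021.06.01-1234')
```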
@command.setter
— def command(self, val):
Can be used to tell the module to use an existing binary.
— def discover_manifests(self, root, dirs, test_data=None):
Returns a list with paths to all manifests we need to build.
Args: dirs: paths (relative to root) to scan for manifests. test_data: paths to use in training mode. Returns: [Path].
— def do_roll(self, repo_url, root, callback, ref='master'):
Checks out a repo, calls the callback to modify it, uploads the result.
Args:
callback(root): invoked with cwd also set to root. It can modify files there and either return None to skip the roll or a RollCL to attempt the roll. If no files are modified, the roll will be skipped regardless of the return value.
Returns:
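A hedged usage sketch of do_roll combined with update_pins; the repo URL and paths are placeholders, and the fields of RollCL are deliberately not shown since they aren't documented here:

```python
DEPS = ['cloudbuildhelper', 'recipe_engine/path']

def RunSteps(api):
  def update(root):
    # Invoked with cwd set to `root`; mutate files here. Return None to skip
    # the roll, or a RollCL to attempt it.
    if not api.cloudbuildhelper.update_pins(root.join('pins.yaml')):
      return None  # nothing changed => the roll is skipped anyway
    return None  # replace with a RollCL(...) to actually upload a CL

  api.cloudbuildhelper.do_roll(
      'https://chromium.googlesource.com/infra/infra',
      api.path['cache'].join('roll'), update)
```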
— def report_version(self):
Reports the version of cloudbuildhelper tool via the step text.
Returns: None.
— def update_pins(self, path):
Calls 'cloudbuildhelper pins-update <path>'. Updates the file at <path> in place if some docker tags mentioned there have moved since the last pins update.
Args: path: path to the pins.yaml file to update. Returns: List of strings with updated '<image>:<tag>' pairs, if any.
— def upload(self, manifest, canonical_tag, build_id=None, infra=None, step_test_tarball=None):
Calls 'cloudbuildhelper upload <manifest>', interpreting the result.
Args:
Returns: Tarball instance.
Raises: StepFailure on failures.
DEPS: recipe_engine/cipd, recipe_engine/path, recipe_engine/step
API for interacting with CloudKMS using the LUCI cloudkms tool.
@property
— def cloudkms_path(self):
Returns the path to LUCI cloudkms binary.
When the property is accessed the first time, cloudkms will be installed using cipd.
— def decrypt(self, kms_crypto_key, input_file, output_file):
Decrypt a ciphertext file with a CloudKMS key.
Args:
— def sign(self, kms_crypto_key, input_file, output_file, service_account_creds_file=None):
Processes a plaintext and uploads the digest for signing by Cloud KMS.
Args:
— def verify(self, kms_crypto_key, input_file, signature_file, output_file='-', service_account_creds_file=None):
Verifies a signature that was previously created with a key stored in CloudKMS.
Args:
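A minimal usage sketch of this module; the key resource name and file paths are placeholders:

```python
DEPS = ['cloudkms', 'recipe_engine/path']

def RunSteps(api):
  # Standard Cloud KMS key resource path (all segments are placeholders).
  key = 'projects/my-proj/locations/global/keyRings/my-ring/cryptoKeys/my-key'
  api.cloudkms.decrypt(
      key,
      api.path['start_dir'].join('secret.enc'),  # ciphertext input
      api.path['cleanup'].join('secret.txt'))    # plaintext output
```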
DEPS: recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/python, recipe_engine/step, recipe_engine/url
Functions to work with Miniconda python environment.
See http://conda.pydata.org/miniconda.html
— def install(self, version, path):
Downloads Miniconda installer for given version and executes it.
Args: version: version of Miniconda to install, e.g. 'Miniconda2-3.18.3'. path: prefix to install Miniconda into.
Returns: an instance of CondaEnv, which can also be used as a context manager that deletes the environment on exit.
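A sketch of installing Miniconda and disposing of the environment afterwards; the version string and install path are illustrative:

```python
DEPS = ['conda', 'recipe_engine/path']

def RunSteps(api):
  with api.conda.install('Miniconda2-3.18.3',
                         api.path['cache'].join('conda')) as env:
    # Use `env` here, e.g. to install packages into it and bundle it up.
    pass  # the environment is deleted on exiting the context manager
```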
DEPS: recipe_engine/path, recipe_engine/python, recipe_engine/raw_io, recipe_engine/service_account, recipe_engine/step
Provides steps to connect and run Docker images.
— def __call__(self, *args, **kwargs):
Executes specified docker command.
Make sure to call the api.docker.login method beforehand if the specified command requires authentication.
Args: args: arguments passed to the 'docker' command including subcommand name, e.g. api.docker('push', 'my_image:latest'). kwargs: arguments passed down to api.step module.
— def ensure_installed(self, **kwargs):
Checks that the docker binary is in the PATH.
Raises StepFailure if binary is not found.
— def get_version(self):
Returns the installed Docker version, or None if it could not be detected.
— def login(self, server='gcr.io', project='chromium-container-registry', service_account=None, step_name=None, **kwargs):
Connect to a Docker registry.
This step must be executed before any other step in this module that requires authentication.
Args: server: GCP container registry to pull images from. Defaults to 'gcr.io'. project: Name of the Cloud project where Docker images are hosted. service_account: service_account.api.ServiceAccount used for authenticating with the container registry. Defaults to the task's associated service account. step_name: Override step name. Default is 'docker login'.
— def pull(self, image, step_name=None):
Pull a docker image from a remote repository.
Args: image: Name of the image to pull. step_name: Override step name. Default is 'docker pull'.
— def run(self, image, step_name=None, cmd_args=None, dir_mapping=None, env=None, inherit_luci_context=False, **kwargs):
Run a command in a Docker image as the current user:group.
Args: image: Name of the image to run. step_name: Override step name. Default is 'docker run'. cmd_args: Used to specify command to run in an image as a list of arguments. If not specified, then the default command embedded into the image is executed. dir_mapping: List of tuples (host_dir, docker_dir) mapping host directories to directories in a Docker container. Directories are mapped as read-write. env: dict of env variables. inherit_luci_context: Inherit current LUCI Context (including auth). CAUTION: removes network isolation between the container and the docker host. Read more: https://docs.docker.com/network/host/.
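A sketch of the typical login/pull/run flow; the image name and directory mapping are placeholders:

```python
DEPS = ['docker', 'recipe_engine/path']

def RunSteps(api):
  api.docker.ensure_installed()
  # Authentication must happen before pulling/running private images.
  api.docker.login(server='gcr.io', project='chromium-container-registry')
  image = 'gcr.io/chromium-container-registry/example:latest'
  api.docker.pull(image)
  api.docker.run(
      image,
      cmd_args=['echo', 'hello'],
      dir_mapping=[(api.path['start_dir'], '/workspace')])  # host -> container
```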
DEPS: depot_tools/bot_update, depot_tools/depot_tools, depot_tools/gclient, depot_tools/git, depot_tools/presubmit, infra_system, recipe_engine/buildbucket, recipe_engine/commit_position, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step
Stateless API for using public infra gclient checkout.
— def checkout(self, gclient_config_name, patch_root=None, path=None, internal=False, named_cache=None, generate_env_with_system_python=False, go_version_variant=None, go_modules=False, **kwargs):
Fetches infra gclient checkout into a given path OR named_cache.
Arguments: named_cache: by default, derived from the internal argument value. Note: your cr-buildbucket.cfg should specify named_cache for swarming to prioritize bots which actually have this cache populated by prior runs. Otherwise, using a named cache isn't particularly useful, unless your pool of builders is very small. Returns: a Checkout object with commands for common actions on infra checkout.
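A minimal sketch of fetching the public infra checkout; the gclient config name and cache path are assumptions:

```python
DEPS = ['infra_checkout', 'recipe_engine/path']

def RunSteps(api):
  co = api.infra_checkout.checkout(
      'infra',  # gclient config name (assumed here)
      path=api.path['cache'].join('builder'))
  # `co` is a Checkout object exposing commands for common actions on the
  # infra checkout (running hooks, presubmit, etc.).
```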
DEPS: recipe_engine/buildbucket, recipe_engine/context, recipe_engine/json, recipe_engine/python, recipe_engine/runtime, recipe_engine/step
API for building packages defined in infra's public and internal repos.
Essentially a shim around scripts in https://chromium.googlesource.com/infra/infra.git/+/master/build/ and its internal counterpart.
— def build_without_env_refresh(self):
Builds packages.
Prevents build.py from refreshing the python ENV.
@contextlib.contextmanager
— def context(self, path_to_repo, goos=None, goarch=None):
Sets up the context for building CIPD packages.
Arguments: path_to_repo (path): path to the infra or infra_internal repo root dir. Expects to find build/build.py inside the provided dir. goos, goarch (str): allows setting GOOS and GOARCH for cross-compiling Go code.
Doesn't support nesting.
— def tags(self, git_repo_url, revision):
Returns tags to be attached to uploaded CIPD packages.
— def test(self):
Tests the integrity of previously built packages.
— def upload(self, tags, step_test_data=None):
Uploads previously built packages.
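A sketch of the cross-compile build/test/upload flow with this module; the checkout path, target platform, and repo URL are illustrative:

```python
DEPS = ['infra_cipd', 'recipe_engine/buildbucket', 'recipe_engine/path']

def RunSteps(api):
  repo = api.path['cache'].join('builder', 'infra')
  with api.infra_cipd.context(repo, goos='linux', goarch='arm64'):
    api.infra_cipd.build_without_env_refresh()
    api.infra_cipd.test()
    tags = api.infra_cipd.tags(
        'https://chromium.googlesource.com/infra/infra',
        api.buildbucket.gitiles_commit.id)
    api.infra_cipd.upload(tags)
```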
DEPS: recipe_engine/context, recipe_engine/path, recipe_engine/platform
API for interacting with a provisioned infrastructure system.
@property
— def sys_bin_path(self):
DEPS: recipe_engine/raw_io, recipe_engine/url
APIs for interacting with omahaproxy.
— def history(self, min_major_version=None, exclude_platforms=None):
@staticmethod
— def split_version(text):
DEPS: recipe_engine/cipd, recipe_engine/path, recipe_engine/step
API for interacting with Provenance using the provenance tool.
— def generate(self, kms_crypto_key, input_file, output_file):
Generate an attestation file with a built artifact.
Args:
@property
— def provenance_path(self):
Returns the path to provenance binary.
When the property is accessed the first time, the latest released provenance tool will be installed using cipd and verified using the provenance tool built into the OS image (if available).
DEPS: depot_tools/depot_tools, depot_tools/git, depot_tools/git_cl, depot_tools/gsutil, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/json, recipe_engine/path, recipe_engine/proto, recipe_engine/python, recipe_engine/random, recipe_engine/raw_io, recipe_engine/step, recipe_engine/time
— def roll_projects(self, projects, db_gcs_bucket):
Attempts to roll each project from the provided list.
If rolling any of the projects leads to failures, other projects are not affected.
Args: projects: a list of (project_id, project_url) tuples, where project_id (string) is the id as found in recipes.cfg and project_url (string) is the Git repository URL of the project. db_gcs_bucket (string): The GCS bucket used as a database for previous roll attempts.
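A sketch of rolling a single project; the project tuple and bucket name are placeholders:

```python
DEPS = ['recipe_autoroller']

def RunSteps(api):
  api.recipe_autoroller.roll_projects(
      [('depot_tools',
        'https://chromium.googlesource.com/chromium/tools/depot_tools')],
      'my-roller-db-bucket')  # GCS bucket recording previous roll attempts
```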
DEPS: depot_tools/git, depot_tools/osx_sdk, depot_tools/windows_sdk, provenance, recipe_engine/archive, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step, recipe_engine/url
Allows uniform cross-compilation, version tracking and archival for third-party software packages (libs+tools) for distribution via CIPD.
The purpose of the Third Party Packages (3pp) recipe/module is to generate CIPD packages of statically-compiled software for distribution in our continuous integration fleets, as well as software distributed to our developers (e.g. via depot_tools).
Target OS and architecture use the CIPD "${os}-${arch}" (a.k.a. "${platform}") nomenclature, which is currently defined in terms of Go's GOOS and GOARCH runtime variables (with the unfortunate exception that CIPD uses 'mac' instead of 'darwin'). This is somewhat arbitrary, but has worked well so far for us.
The 3pp module loads package definitions from a folder containing subfolders. Each subfolder defines a single software package to fetch, build and upload. For example, you might have a folder in your repo like this:
my_repo.git/
  3pp/                    # "root folder"
    .vpython              # common vpython file for all package scripts
    zlib/                 # zlib "package folder"
      3pp.pb              # REQUIRED: the Spec.proto definition for zlib
      install.sh          # a script to build zlib from source
      extra_resource_file
    other_package/
      3pp.pb              # REQUIRED
      fetch.py            # a script to fetch `other_package` in a custom way
      install.sh
      install-win.sh      # windows-specific build script
    ...
This defines two packages (zlib and other_package). The 3pp.pb files have references to the fetch/build scripts, and describe what dependencies the packages have (if any).
NOTE: Only one layer of package folders is supported currently.
Packages are named by the folder that contains their definition file (3pp.pb) and build scripts. It's preferable to name a package after the software it contains. However, sometimes you want multiple major versions of the software to exist side-by-side (e.g. pcre and pcre2, python and python3, etc.). In this case, have two separate package definition folders.
Each package folder contains a package spec (3pp.pb), as well as scripts, patches and/or utility tools to build the software from source.
The spec is a Text Proto document specified by the spec.proto schema.
The spec is broken up into two main sections, “create” and “upload”. The create section allows you to specify how the package software gets created, and allows specifying differences in how it's fetched/built/tested on a per-target basis, and the upload section has some details on how the final result gets uploaded to CIPD.
The 3pp.pb spec begins with a series of create messages, each with details on how to fetch+build+test the package. Each create message contains a "platform_re" field which works as a regex on the ${platform} value. All matching patterns apply in order, and non-matching patterns are skipped. Each create message is applied with a dict.update for each member message (i.e. ['source'].update, ['build'].update, etc.) to build a singular create message for the current target platform. For list values (e.g. 'tool', 'dep' in the Build message), you can clear them by providing a new empty value (e.g. tool: "").
Once all the create messages are merged (see schema for all keys that can be present), the actual creation takes place.
Note that "source" is REQUIRED in the final merged instruction set. All other messages are optional and have defaults as documented in spec.proto.
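To make the spec shape concrete, here is a minimal sketch of a 3pp.pb in the spirit of the zlib example above; the repo URL, tag pattern, and prefix are illustrative assumptions:

```
create {
  platform_re: "linux-.*|mac-.*"
  source {
    git {
      repo: "https://chromium.googlesource.com/external/github.com/madler/zlib"
      tag_pattern: "v%s"
    }
  }
  build {}  # build with the default ./install.sh
}

upload { pkg_prefix: "tools" }
```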
The creation process is broken up into 4 different stages: Source, Build, Package, Verify.
All scripts described below are invoked with a number of environment variables set, including the patch_version set for the version we're building (if any patch version was set). Additionally, in cross-compile environments, the $CROSS_TRIPLE environment variable is set to a GCC cross-compile target triplet of cpu-vendor-os.
The source is used to fetch the raw sources for assembling the package. In some cases the sources may actually be binary artifacts (e.g. prebuilt windows installers).
The source is unpacked to a checkout directory, possibly in some specified subdirectory. Sources can either produce the actual source files, or they can produce a single archive file (e.g. zip or tarball), which can be unpacked with the 'unpack_archive' option. In addition, patches can be applied to the source with the 'patch_dir' option (the patches should be in git format-patch format, and will be applied with git apply).
There are several source methods:
* git - checks out a semver tag in the repo.
* cipd - fetches data from a CIPD package.
* url - used for packages that do not provide a stable distribution (such as a git repo) for their source code; the original download URL is passed to this method to download the source artifact from the third-party distribution.
* script - used for "weird" packages which are distributed via e.g. an HTML download page or an API. The script must be able to return the 'latest' version of its source, as well as to actually fetch a specified version. Python fetch scripts will be executed with vpython, and so may have a .vpython file (or similar) in the usual manner to pull in dependencies like requests.

Additionally, the Source message contains a patch_version field to allow semver disambiguation of the built packages when they contain patches or other alterations which need to be versioned. This string will be joined with a '.' to the source version being built when uploading the result to CIPD.
The build message allows you to specify deps and tools, as well as an install script which contains your logic to transform the source into the result package.
Deps are libraries built for the target ${platform} and are typically used for linking your package.
Tools are binaries built for the host; they're things like automake or sed that are used during the configure/make phase of your build, but aren't linked into the built product. These tools will be available on $PATH (both '$tools' and '$tools/bin' are added to $PATH, because many packages are set up with their binaries at the base of the package, and some are set up with them in a /bin folder).
Installation occurs by invoking the script indicated by the ‘install’ field (with the appropriate interpreter, depending on the file extension) like:
<interpreter> "$install[*]" "$PREFIX" "$DEPS_PREFIX"
Where:
* $install[*] are all of the tokens in the 'install' field.
* $PREFIX is the directory which the script should install everything to; this directory will be archived into CIPD verbatim.
* $DEPS_PREFIX is the path to a prefix directory containing the union of all of your packages' transitive deps. For example, all of the headers of your deps are located at $DEPS_PREFIX/include.
* tools are in $PATH.

If the 'install' script is omitted, it is assumed to be 'install.sh'.
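For illustration, a typical autoconf-style install script might look like this sketch; the configure flags are assumptions and are package-specific:

```sh
#!/bin/bash
set -e
# $1 is $PREFIX (archived verbatim into CIPD); $2 is $DEPS_PREFIX (union of
# all transitive deps), whose headers/libs live under $2/include and $2/lib.
./configure --prefix="$1" CPPFLAGS="-I$2/include" LDFLAGS="-L$2/lib"
make -j"$(nproc)"
make install
```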
If the ENTIRE build message is omitted, no build takes place. Instead the result of the ‘source’ stage will be packaged.
During the execution of the build phase, the package itself and its dependent packages (e.g. "dep" and "tool" in the PB file) will be copied into the source checkout in the .3pp directory, and the script will be invoked as /path/to/checkout/.3pp/<cipd_pkg_name>/$script_name. If the package has shared resources (like .vpython files or helper scripts) which are outside of the package directory, you will need to create a symbolic link for them. See chromium.googlesource.com/infra/infra/+/master/3pp/cpython/ssl_suffix.py as an example.
Once the build stage is complete, all files in the $PREFIX folder passed to the install script will be zipped into a CIPD package.
If your package is a library or tool with many files, it is strongly recommended that it be packaged in the standard POSIXey PREFIX format (e.g. bin, lib, include, etc.). If your package is a collection of one or more standalone binaries, it's permissible to just have the binaries in the root of the output $PREFIX.
If the build stage is skipped (i.e. the build message is omitted) then the output of the source stage will be packaged instead (this is mostly useful when using a ‘script’ source).
After the package is built it can be optionally tested. The recipe will run your test script in an empty directory with the path to the packaged-but-not-yet-uploaded cipd package file, and it can do whatever testing it needs (exiting non-zero if something is wrong). You can use the cipd pkg-deploy command to deploy it (or whatever cipd commands you like, though I wouldn't recommend uploading it to CIPD, as the 3pp recipe will do that after the test exits 0).
Additionally, vpython for the tool platform will be guaranteed to be in $PATH.
Once the test comes back positive, the CIPD package will be uploaded to the CIPD server and registered with the prefix indicated in the upload message. The full CIPD package name is constructed as:
<prefix>/<pkg_name>/${platform}
So for example with the prefix infra, the bzip2 package on linux-amd64 would be uploaded to infra/bzip2/linux-amd64 and tagged with the version that was built (e.g. version:1.2.3.patch_version1).
You can also mark the upload as a universal package, which will:
* omit the ${platform} suffix from the upload name;
* build the package on linux-amd64, regardless of what platform you run the recipe on.

This was chosen arbitrarily to ensure that "universal" packages build consistently. You can override this behavior (and bypass the normal docker environment entirely) by setting the no_docker_env flag to true in your Create.Build message.

Every package will try to build the latest identifiable semver of its source, or will attempt to build the semver requested as an input property to the 3pp recipe. This semver is also used to tag the uploaded artifacts in CIPD.
Because some of the packages here are used as dependencies for others (e.g. curl and zlib are dependencies of git, and zlib is a dependency of curl), each package used as a dependency for others should specify its version explicitly (currently this is only possible to do with the 'cipd' source type). So e.g. zlib and curl specify their source versions, but git and python float at 'head', always building the latest tagged version fetched from git.
When building a floating package (e.g. python, git) you may explicitly state the semver that you wish to build as part of the recipe invocation.
The semver of a package (either specified in the package definition, in the recipe properties, or discovered while fetching its source code, e.g. the latest git tag) is also used to tag the package when it's uploaded to CIPD (plus the patch_version in the source message).
Third party packages are currently compiled on linux using the 'infra.tools.dockerbuild' tool from the infra.git repo. This uses a slightly modified version of the dockcross Docker cross-compile environment. Windows and OS X targets are built using the 'osx_sdk' and 'windows_sdk' recipe modules, each of which provides a hermetic (native) build toolchain for those platforms.
For linux, we can support all the architectures implied by dockerbuild.
If the recipe is run with force_build, it will always build all packages indicated (and their dependencies), and will not upload any of them to the central server.
The recipe must always be run with a package_prefix (by assigning to the .package_prefix property on the Support3ppApi). If the recipe is run in experimental mode, 'experimental/' will be prepended to this. Additionally, you may specify experimental: true in the Create message for a package, which will have the same effect when running the recipe in production (to allow adding new packages or package/platform combinations experimentally).
As an example of the package definition layout in action, take a look at the 3pp folder in this infra.git repo.
This module uses the following named caches:
* 3pp_cipd - Caches all downloaded and uploaded CIPD packages. Currently tag lookups are performed every time against the CIPD server, but this will hold the actual package files.
* osx_sdk - Cache for depot_tools/osx_sdk. Only on Mac.
* windows_sdk - Cache for depot_tools/windows_sdk. Only on Windows.

— def ensure_uploaded(self, packages=(), platform='', force_build=False):
Executes the entire {fetch,build,package,verify,upload} pipeline for all the packages listed, targeting the given platform.
Args:
Returns (list[(cipd_pkg, cipd_version)], set[str]): the built CIPD packages with their tagged versions, as well as the set of unsupported packages.
— def initialize(self):
— def load_packages_from_path(self, base_path, glob_pattern='**/3pp.pb', check_dup=True):
Loads all package definitions from the given base_path and glob pattern.
This will parse and intern all the 3pp.pb package definition files so that packages can be identified by their cipd package name. For example, if you pass:
path/
  pkgname/
    3pp.pb
    install.sh
And the file “path/pkgname/3pp.pb” has the following content:
upload { pkg_prefix: "my_pkg_prefix" }
Its cipd package name will be "my_pkg_prefix/pkgname".
Args:
Returns a set(str) containing the cipd package names of the packages which were loaded.
Raises a DuplicatePackage exception if this function encounters a cipd package name which is already registered. This could occur if you call load_packages_from_path multiple times, and one of the later calls tries to load a package which was registered under one of the earlier calls.
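A sketch tying load_packages_from_path and ensure_uploaded together; the prefix, base path, and platform are placeholders:

```python
DEPS = ['support_3pp', 'recipe_engine/path']

def RunSteps(api):
  api.support_3pp.set_package_prefix('3pp')  # hypothetical prefix
  pkgs = api.support_3pp.load_packages_from_path(
      api.path['start_dir'].join('3pp'))
  # Build (and, outside experimental/force_build modes, upload) everything.
  built, unsupported = api.support_3pp.ensure_uploaded(
      packages=pkgs, platform='linux-amd64')
```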
— def package_prefix(self, experimental=False):
Returns the CIPD package name prefix (str), if any is set.
This will prepend 'experimental/' to the currently set prefix if the experimental argument is True or if the module is in experimental mode (see set_experimental).
— def set_experimental(self, experimental):
Set the experimental mode (bool).
— def set_package_prefix(self, prefix):
Set the CIPD package name prefix (str).
All CIPDSpecs for built packages (not sources) will have this string prepended to them.
— def set_source_cache_prefix(self, prefix):
Set the CIPD namespace (str) to store the source of the packages.
DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/git, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step
— def __call__(self, source, source_repo_checkout_name, dest, source_ref='refs/heads/master', dest_ref='refs/heads/master', extra_submodules=None, deps_path_prefix=None, disable_path_prefix=False):
Args: source: URL of the git repository to mirror. source_repo_checkout_name: Name of the directory that the source repo should be checked out into. dest: URL of the git repository to push to. source_ref: git ref in the source repository to checkout. dest_ref: git ref in the destination repository to push to. extra_submodules: a list of "path=URL" strings; these are added as extra submodules. deps_path_prefix: path prefix used to filter DEPS; only DEPS with the prefix are included. disable_path_prefix: disable filtering DEPS by path prefix.
DEPS: recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/step
API for using Windows ADK distributed via CIPD.
— def cleanup(self):
Remove the ADK and WinPE.
— def cleanup_win_adk(self):
Cleanup the Windows ADK.
— def cleanup_winpe(self):
Cleanup WinPE.
— def ensure(self, install=True):
Ensure the presence of the Windows ADK.
— def ensure_win_adk(self, refs):
Downloads & Installs the Windows ADK.
— def ensure_win_adk_winpe(self, refs):
Ensures that the WinPE add-on is available.
DEPS: depot_tools/bot_update, depot_tools/gclient, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/step
API for using Windows PowerShell scripts.
— def execute_script(self, command, *args):
Executes the Windows PowerShell script.
— def execute_wib_config(self, config):
Executes the windows image builder user config.
— def gen_ps_script_cmd(self, command, *args):
Generate the powershell command.
— def init_win_pe_image(self, arch, dest):
Calls Copy-PE to create a WinPE media folder for the given arch.
— def perform_winpe_action(self, action):
Performs the given action.
DEPS: recipe_engine/cipd, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/step
API for using Windows SDK distributed via CIPD.
@contextmanager
— def __call__(self, path=None, version=None, enabled=True):
Sets up the SDK environment when enabled.
Args: path (path): Path to a directory where to install the SDK (default is '[start_dir]/cipd/windows_sdk'). version (str): CIPD instance ID, tag or ref of the SDK (default is set via the $infra/windows_sdk.version property). enabled (bool): Whether the SDK should be used or not.
Raises: StepFailure or InfraFailure.
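A sketch of compiling inside the SDK environment; the compile command is illustrative:

```python
DEPS = ['windows_sdk', 'recipe_engine/step']

def RunSteps(api):
  with api.windows_sdk():
    # Within this block the SDK toolchain environment is applied.
    api.step('compile', ['cl', '/nologo', 'hello.cc'])
```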
DEPS: recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/python
Provides steps to zip and unzip files.
— def directory(self, step_name, directory, output):
Step to compress a single directory.
Args: step_name: display name of the step. directory: path to a directory to compress, it would become the root of an archive, i.e. |directory|/file.txt would be named 'file.txt' in the archive. output: path to a zip file to create.
— def make_package(self, root, output):
Returns ZipPackage object that can be used to compress a set of files.
Usage:
  pkg = api.zip.make_package(root, output)
  pkg.add_file(root.join('file'))
  pkg.add_directory(root.join('directory'))
  yield pkg.zip('zipping step')
Args: root: a directory that would become root of a package, all files added to an archive will have archive paths relative to this directory. output: path to a zip file to create.
Returns: ZipPackage object.
— def unzip(self, step_name, zip_file, output, quiet=False):
Step to uncompress |zip_file| into |output| directory.
The zip package will be unpacked to |output| so that the root of the archive is in |output|, i.e. archive.zip/file.txt will become |output|/file.txt.
Step will FAIL if |output| already exists.
Args: step_name: display name of a step. zip_file: path to a zip file to uncompress, should exist. output: path to a directory to unpack to, it should NOT exist. quiet (bool): If True, print terse output instead of the name of each unzipped file.
— def update_package(self, root, output):
Returns ZipPackage object that can be used to update an existing package.
Usage:
  pkg = api.zip.update_package(root, output)
  pkg.add_file(root.join('file'))
  pkg.add_directory(root.join('directory'))
  yield pkg.zip('updating zip step')
Args: root: the root directory for adding new files/dirs to the package; all files/dirs added to an archive will have archive paths relative to this directory. output: path to a zip file to update.
Returns: ZipPackage object.
DEPS: depot_tools/git, support_3pp, recipe_engine/cipd, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/python, recipe_engine/step
This recipe builds and packages third party software, such as Git.
— def RunSteps(api, package_locations, to_build, platform, force_build, package_prefix, source_cache_prefix):
DEPS: conda, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties
Recipe to build CIPD package with sealed Conda environment.
Supposed to be used from manually triggered Buildbot builders. We don't expect to rebuild this environment often, so setting up a periodic schedule would be a waste of resources.
To build a new package for all platforms:
— def RunSteps(api):
DEPS: depot_tools/depot_tools, depot_tools/gsutil, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/step
— def RunSteps(api):
DEPS: depot_tools/gsutil, zip, recipe_engine/cipd, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step
— def RunSteps(api):
DEPS: depot_tools/gclient, depot_tools/git, depot_tools/osx_sdk, depot_tools/tryserver, depot_tools/windows_sdk, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io
@contextmanager
— def PlatformSdk(api, platforms):
— def RunSteps(api, platforms, dry_run, rebuild):
This recipe verifies importing of chromium bootstrap protos.
The protos are exported via a symlink in //recipe/recipe_proto/infra/chromium.
— def RunSteps(api):
DEPS: cloudbuildhelper, recipe_engine/path
— def RunSteps(api):
DEPS: cloudbuildhelper, recipe_engine/json, recipe_engine/step
— def RunSteps(api):
— def build(api):
— def upload(api):
DEPS: cloudbuildhelper, recipe_engine/path
— def RunSteps(api):
DEPS: cloudkms, recipe_engine/path
— def RunSteps(api):
DEPS: recipe_engine/cq, recipe_engine/properties, recipe_engine/step
Recipe to test LUCI CQ/CV itself.
— def RunSteps(api, properties):
DEPS: depot_tools/git, depot_tools/gsutil, zip, recipe_engine/buildbucket, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
Recipe to build windows depot_tools bootstrap zipfile.
— def RunSteps(api):
DEPS: docker, recipe_engine/raw_io, recipe_engine/step
— def RunSteps(api):
DEPS: depot_tools/bot_update, depot_tools/gclient, docker, recipe_engine/file, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/service_account, recipe_engine/step, recipe_engine/time
— def RunSteps(api, arch_type):
DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/tryserver, docker, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step
Test chrome-golo repo DHCP configs using dhcpd binaries via docker.
— def RunSteps(api):
DEPS: cloudbuildhelper, infra_checkout, recipe_engine/buildbucket, recipe_engine/futures, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time
— def RunSteps(api, properties):
DEPS: recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step, recipe_engine/time
Pushes a trivial CL to Gerrit to verify git authentication works on LUCI.
— def RunSteps(api):
DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/gsutil, zip, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/step
— def RunSteps(api):
DEPS: depot_tools/depot_tools, depot_tools/git, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/futures, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step, recipe_engine/url
Updates the Git Cache zip files.
— def RunSteps(api, inputs):
DEPS: depot_tools/gclient, recipe_engine/context, recipe_engine/path, recipe_engine/properties, recipe_engine/python
Runs git submodule daemon (gsubmodd) against a given source repo.
Intended to be called periodically (see CYCLE_TIME_SEC). Runs several iterations of the daemon and then quits so that the recipe has a chance to resync the source code.
— def RunSteps(api, source_repo, target_repo, limit='', epoch=''):
DEPS: depot_tools/depot_tools, depot_tools/gsutil, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/python, recipe_engine/time
Pushes a trivial CL to Gerrit to verify git authentication works on LUCI.
— def RunSteps(api):
DEPS: depot_tools/gerrit, cloudbuildhelper, infra_checkout, recipe_engine/buildbucket, recipe_engine/futures, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/step, recipe_engine/time
— def RunSteps(api, properties):
DEPS: depot_tools/git, depot_tools/git_cl, cloudbuildhelper, infra_checkout, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/properties
— def RunSteps(api, properties):
DEPS: infra_checkout, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/step
— def RunSteps(api):
DEPS: depot_tools/bot_update, infra_checkout, recipe_engine/buildbucket, recipe_engine/properties, recipe_engine/step
— def RunSteps(api):
DEPS: infra_checkout, recipe_engine/buildbucket, recipe_engine/platform, recipe_engine/raw_io
— def RunSteps(api):
DEPS: infra_cipd, recipe_engine/assertions, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/python, recipe_engine/step
— def RunSteps(api):
DEPS: depot_tools/bot_update, depot_tools/depot_tools, depot_tools/gclient, depot_tools/osx_sdk, infra_checkout, infra_cipd, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/runtime, recipe_engine/step
— def RunSteps(api):
— def build_main(api, checkout, buildername, project_name, repo_url, rev):
— def run_python_tests(api, project_name):
DEPS: depot_tools/bot_update, depot_tools/gclient, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step
— def RunFrontendTests(api, env, cwd, app_name):
— def RunInfraFrontendTests(api, env):
— def RunInfraInternalFrontendTests(api, env):
— def RunSteps(api):
DEPS: depot_tools/osx_sdk, infra_checkout, infra_system, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/runtime, recipe_engine/step
— def RunSteps(api, go_version_variant):
DEPS: infra_system, recipe_engine/context, recipe_engine/platform, recipe_engine/step
— def RunSteps(api):
DEPS: depot_tools/osx_sdk, infra_checkout, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/context, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step, recipe_engine/tricium
— def RunSteps(api, GOARCH, go_version_variant, go_modules, run_integration_tests, run_lint):
— def apply_golangci_lint(api, co):
DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/git, infra_checkout, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/path, recipe_engine/platform, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step
— def RunSteps(api):
DEPS: provenance, recipe_engine/path
— def RunSteps(api):
DEPS: recipe_autoroller, recipe_engine/json, recipe_engine/properties, recipe_engine/proto, recipe_engine/time
Rolls recipes.cfg dependencies for public projects.
— def RunSteps(api, projects, db_gcs_bucket):
DEPS: recipe_engine/cipd, recipe_engine/path, recipe_engine/properties, recipe_engine/step
— def RunSteps(api, recipe_bundler_pkg, recipe_bundler_vers, repo_specs, repo_specs_optional, package_name_prefix, package_name_internal_prefix):
DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/git, depot_tools/tryserver, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/python, recipe_engine/raw_io, recipe_engine/step
— def RunSteps(api, upstream_id, upstream_url, downstream_id, downstream_url):
DEPS: depot_tools/bot_update, depot_tools/gclient, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/python, recipe_engine/step
A continuous builder which runs recipe tests.
— def RunSteps(api, git_repo):
DEPS: depot_tools/bot_update, depot_tools/gclient, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/path, recipe_engine/properties
— def RunSteps(api):
DEPS: recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step
— def RunSteps(api):
— def get_value(pairs, key):
Returns the value for the given key in the given pairs.
Args: pairs: A list of {"key": key, "value": value} dicts. key: A key whose value to get. If the key appears more than once, only the first value is returned.
Returns: The value for the given key.
DEPS: recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step, recipe_engine/time, recipe_engine/url, recipe_engine/uuid
— def RunSteps(api):
— def normalize(s):
Normalizes a string for use in a resource label.
DEPS: support_3pp, recipe_engine/buildbucket, recipe_engine/cipd, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
— def RunSteps(api, GOOS, GOARCH, experimental, load_dupe, package_prefix, source_cache_prefix):
DEPS: sync_submodules, recipe_engine/properties, recipe_engine/runtime
— def RunSteps(api, disable_path_prefix):
DEPS: depot_tools/bot_update, depot_tools/gclient, depot_tools/gerrit, depot_tools/tryserver, infra_checkout, recipe_engine/buildbucket, recipe_engine/json, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step, recipe_engine/tricium
— def RunSteps(api, inputs):
This recipe runs legacy analyzers for the infra repo.
DEPS: depot_tools/gclient, depot_tools/git, depot_tools/gitiles, recipe_engine/buildbucket, recipe_engine/context, recipe_engine/file, recipe_engine/json, recipe_engine/path, recipe_engine/properties, recipe_engine/raw_io, recipe_engine/step
— def GetSubmodules(api, deps, source_checkout_name, overlays):
— def RefToRemoteRef(ref):
— def RunSteps(api, source_repo, target_repo, extra_submodules, refs, overlays):
DEPS: windows_adk, recipe_engine/file, recipe_engine/path, recipe_engine/properties
— def RunSteps(api):
DEPS: depot_tools/bot_update, depot_tools/gclient, windows_adk, windows_scripts_executor, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/properties, recipe_engine/step
— def RunSteps(api, inputs):
This recipe runs windows offline builder for a given user config.
DEPS: windows_scripts_executor, recipe_engine/json, recipe_engine/properties
— def RunSteps(api, image):
DEPS: windows_sdk, recipe_engine/platform, recipe_engine/properties, recipe_engine/step
— def RunSteps(api):
DEPS: zip, recipe_engine/context, recipe_engine/file, recipe_engine/path, recipe_engine/platform, recipe_engine/step
— def RunSteps(api):