DEPS: json, path, platform, python, step
Provides steps to manipulate archive files (tar, zip, etc.).
— def extract(self, step_name, archive_file, output, mode='safe', include_files=()):
Step to uncompress |archive_file| into |output| directory.
Archive will be unpacked to |output| so that the root of the archive is in |output|, i.e. archive.tar/file.txt will become |output|/file.txt.
Step will FAIL if |output| already exists.
Args:
  * output (Path): the directory to extract into.
  * mode (str): if 'safe' (the default) and the archive contains files which would escape the output location, the extraction will fail (raise StepException), which contains a member StepException.archive_skipped_files (all other files will be extracted normally). If 'unsafe', then tarfiles containing paths escaping output will be extracted as-is.
  * include_files (List[str]): a list of globs to include from the archive, matched with the fnmatch module. If a file "filename" in the archive exists, include_files with "file*" will match it. All paths for the matcher are converted to posix style (forward slash).

— def package(self, root):
Returns Package object that can be used to compress a set of files.
Usage:
```python
# Archive root/file and root/directory/**
(api.archive.package(root).
  with_file(root.join('file')).
  with_dir(root.join('directory')).
  archive('archive step', output, 'tbz'))

# Archive root/**
zip_path = (
  api.archive.package(root).
  archive('archive step', api.path['start_dir'].join('output.zip'))
)
```
Returns: Package object.
Provides access to the assertion methods of the python unittest module.
Asserting non-step aspects of code (return values, non-step side effects) is expressed more naturally by making assertions within the RunSteps function of the test recipe. This api provides access to the assertion methods of unittest.TestCase to be used within test recipes.
All non-deprecated assertion methods of unittest.TestCase can be used.
An enhancement to the assertion methods is that if a custom msg is used, values for the non-msg arguments can be substituted into the message using named substitution with the format method of strings, e.g. api.assertions.assertEqual(0, 1, '{first} should be {second}') will raise an AssertionError with the message: '0 should be 1'.
The attributes longMessage and maxDiff are supported and have the same behavior as the unittest module.
Example (.../recipe_modules/my_module/tests/foo.py):

```python
DEPS = [
  'my_module',
  'recipe_engine/assertions',
  'recipe_engine/properties',
  'recipe_engine/runtime',
]

def RunSteps(api):
  value = api.my_module.foo()
  expected_value = api.properties.get('expected_value')
  api.assertions.assertEqual(value, expected_value)

def GenTests(api):
  yield (
      api.test('basic')
      + api.properties(expected_value='normal value')
  )

  yield (
      api.test('experimental')
      + api.properties(expected_value='experimental value')
      + api.properties(is_luci=True, is_experimental=True)
  )
```
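The named-substitution behavior can be illustrated with plain unittest. Note this sketch performs the formatting by hand (plain unittest does not substitute named values), whereas api.assertions does it automatically:

```python
import unittest

class _Demo(unittest.TestCase):
    # Minimal TestCase so we can borrow its assertion methods.
    def runTest(self):
        pass

t = _Demo()
try:
    # api.assertions would accept '{first} should be {second}' directly;
    # with plain unittest the values must be formatted in manually.
    t.assertEqual(0, 1, '{first} should be {second}'.format(first=0, second=1))
except AssertionError as e:
    message = str(e)

# With longMessage (the default), the custom msg is appended to the standard one.
assert '0 should be 1' in message
```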
DEPS: json, platform, properties, raw_io, runtime, step, uuid
API for interacting with the buildbucket service.
Requires the buildbucket command in $PATH: https://godoc.org/go.chromium.org/luci/buildbucket/client/cmd/buildbucket
A module for interacting with buildbucket.
@property
— def bucket_v1(self):
Returns bucket name in v1 format.
Mostly useful for scheduling new builds using V1 API.
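A sketch of the v1 naming convention, assuming the 'luci.&lt;project&gt;.&lt;bucket&gt;' scheme used by buildbucket v1 (the helper function name here is illustrative, not the module's API):

```python
def bucket_v1(project, bucket):
    # v1 bucket names embed the project, e.g. ('chromium', 'try') -> 'luci.chromium.try'.
    return 'luci.%s.%s' % (project, bucket)

assert bucket_v1('chromium', 'try') == 'luci.chromium.try'
```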
@property
— def build(self):
Returns the current build as a buildbucket.v2.Build protobuf message.

For value format, see the Build message in build.proto.

DO NOT MODIFY the returned value. Do not implement conditional logic on returned tags; they are for indexing. Use the returned build.input instead.

Pure Buildbot support: to simplify the transition to buildbucket, this returns a message even if the current build is not a buildbucket build. It provides as much information as possible, though some fields may be left empty, violating the rules described in the .proto files. If the current build is not a buildbucket build, the returned build.id is 0.
@property
— def build_id(self):
DEPRECATED, use build.id instead.
@property
— def build_input(self):
DEPRECATED, use build.input instead.
— def build_url(self, host=None, build_id=None):
Returns url to a build. Defaults to current build.
@property
— def builder_id(self):
Deprecated. Use build.builder instead.
@property
— def builder_name(self):
Returns the builder name. Shortcut for .build.builder.builder.
— def cancel_build(self, build_id, **kwargs):
— def collect_build(self, build_id, mirror_status=False, **kwargs):
Shorthand for collect_builds below, but for a single build only.

Args:
Returns: the Build message for the ended build.
— def collect_builds(self, build_ids, interval=None, timeout=None, step_name=None):
Waits for a set of builds to end and returns their details.
Args:
Returns: A map from integer build IDs to the corresponding Build for all specified builds.
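A hypothetical sketch of the wait-and-collect semantics (the real implementation shells out to the buildbucket client; get_status here is an assumed callback, and the status names are buildbucket's terminal statuses):

```python
import time

TERMINAL = ('SUCCESS', 'FAILURE', 'INFRA_FAILURE', 'CANCELED')

def collect_builds(get_status, build_ids, interval=60, timeout=3600):
    # Poll each pending build until every one reaches a terminal status.
    deadline = time.time() + timeout
    results = {}
    pending = set(build_ids)
    while pending:
        if time.time() > deadline:
            raise RuntimeError('timed out waiting for builds')
        for bid in list(pending):
            status = get_status(bid)
            if status in TERMINAL:
                results[bid] = status
                pending.discard(bid)
        if pending:
            time.sleep(interval)
    return results

# Fake status callback: every build is already done.
done = collect_builds(lambda bid: 'SUCCESS', [1, 2, 3], interval=0, timeout=5)
assert done == {1: 'SUCCESS', 2: 'SUCCESS', 3: 'SUCCESS'}
```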
— def get_build(self, build_id, **kwargs):
@property
— def gitiles_commit(self):
Returns the input gitiles commit. Shortcut for .build.input.gitiles_commit.

For value format, see the GitilesCommit message.
Never returns None, but sub-fields may be empty.
@host.setter
— def host(self, value):
— def is_critical(self, build=None):
Returns True if the build is critical. Build defaults to the current one.
@property
— def properties(self):
DEPRECATED, use build attribute instead.
— def put(self, builds, **kwargs):
Puts a batch of builds.
DEPRECATED. Use schedule() instead.
Args:
  * builds: a list of builds to put. A tag value may be None (for example, tags={'buildset': None} will ensure the build is triggered without the buildset tag).

Returns: A step that, as its .stdout property, contains the response object as returned by buildbucket.
— def run(self, schedule_build_requests, collect_interval=None, timeout=None, url_title_fn=None, step_name=None):
Runs builds and returns results.
A shortcut for schedule() and collect_builds(). See their docstrings.
Returns: A list of completed Builds in the same order as schedule_build_requests.
— def schedule(self, schedule_build_requests, url_title_fn=None, step_name=None):
Schedules a batch of builds.
Example:

```python
req = api.buildbucket.schedule_request(builder='linux')
api.buildbucket.schedule([req])
```

Hint: when scheduling builds for CQ, let CQ know about them:

```python
api.cq.record_triggered_builds(*api.buildbucket.schedule([req1, req2]))
```
Args:
  * schedule_build_requests: a list of buildbucket.v2.ScheduleBuildRequest protobuf messages. Create one by calling the schedule_request method.

Returns: A list of Build messages in the same order as the requests.

Raises: InfraFailure if any of the requests fail.
— def schedule_request(self, builder, project=None, bucket=None, properties=None, experimental=None, gitiles_commit=None, gerrit_changes=None, tags=None, inherit_buildsets=True, dimensions=None, priority=None, critical=None):
Creates a new ScheduleBuildRequest message with reasonable defaults.

This is a convenience function for creating a ScheduleBuildRequest message.
Among args, messages can be passed as dicts of the same structure.
Example:
```python
request = api.buildbucket.schedule_request(
    builder='linux',
    tags=api.buildbucket.tags(a='b'),
)
build = api.buildbucket.schedule([request])[0]
```
Args:
  * experimental: a common_pb2.Trinary or bool. Defaults to the value of the current build. Read more about the [experimental field](https://cs.chromium.org/chromium/infra/go/src/go.chromium.org/luci/buildbucket/proto/build.proto?q="bool experimental").
  * gitiles_commit: the input gitiles commit. Defaults to the current build's gitiles_commit.
  * gerrit_changes: the input gerrit changes. Defaults to the current build's gerrit_changes.
  * inherit_buildsets: if True (default), the returned request will include buildset tags from the current build.
  * priority: an int in the range [20..255]. Defaults to the value of the current build.

— def set_buildbucket_host(self, host):
DEPRECATED. Use host property.
— def set_output_gitiles_commit(self, gitiles_commit):
Sets the buildbucket.v2.Build.output.gitiles_commit field.
This will tell other systems, consuming the build, what version of the code was actually used in this build and what is the position of this build relative to other builds of the same builder.
Args:
  * gitiles_commit: the gitiles commit to set. Its ref must start with refs/.

Can be called at most once per build.
— def tags(self, **tags):
Alias for tags in util.py. See doc there.
@property
— def tags_for_child_build(self):
A dict of tags (key -> value) derived from current (parent) build for a child build.
— def use_service_account_key(self, key_path):
Tells this module to start using given service account key for auth.
Otherwise the module is using the default account (when running on LUCI or locally), or no auth at all (when running on Buildbot).
Exists mostly to support the Buildbot environment. Recipes for the LUCI environment should not use this.
Args:
DEPS: json, path, platform, properties, python, raw_io, service_account, step
API for interacting with CIPD.
Depends on the 'cipd' binary available in PATH: https://godoc.org/go.chromium.org/luci/cipd/client/cmd/cipd
CIPDApi provides basic support for CIPD.
This assumes that cipd (or cipd.exe or cipd.bat on Windows) has been installed somewhere in $PATH.
— def acl_check(self, pkg_path, reader=True, writer=False, owner=False):
Checks whether the caller has the given roles in a package.
Args:
Returns True if the caller has given roles, False otherwise.
— def build(self, input_dir, output_package, package_name, compression_level=None, install_mode=None, preserve_mtime=False, preserve_writable=False):
Builds, but does not upload, a cipd package from a directory.
Args:
Returns the CIPDApi.Pin instance.
— def build_from_pkg(self, pkg_def, output_package, compression_level=None):
Builds a package based on a PackageDefinition object.
Args:
Returns the CIPDApi.Pin instance.
— def build_from_yaml(self, pkg_def, output_package, pkg_vars=None, compression_level=None):
Builds a package based on on-disk YAML package definition file.
Args:
Returns the CIPDApi.Pin instance.
— def create_from_pkg(self, pkg_def, refs=None, tags=None, compression_level=None):
Builds and uploads a package based on a PackageDefinition object.
This builds and uploads the package in one step.
Args:
Returns the CIPDApi.Pin instance.
— def create_from_yaml(self, pkg_def, refs=None, tags=None, pkg_vars=None, compression_level=None):
Builds and uploads a package based on on-disk YAML package definition file.
This builds and uploads the package in one step.
Args:
Returns the CIPDApi.Pin instance.
— def describe(self, package_name, version, test_data_refs=None, test_data_tags=None):
Returns information about a package instance given its version: who uploaded the instance and when, and a list of attached tags.
Args:
Returns the CIPDApi.Description instance describing the package.
— def ensure(self, root, ensure_file):
Ensures that packages are installed in a given root dir.
Args:
Returns the map of subdirectories to CIPDApi.Pin instances.
@property
— def executable(self):
— def pkg_deploy(self, root, package_file):
Deploys the specified package to root.
ADVANCED METHOD: You shouldn't need this unless you're doing advanced things with CIPD. Typically you should use the ensure method here to fetch+install packages to the disk.
Args:
Returns a Pin for the deployed package.
— def pkg_fetch(self, destination, package_name, version):
Downloads the specified package to destination.
ADVANCED METHOD: You shouldn't need this unless you're doing advanced things with CIPD. Typically you should use the ensure method here to fetch+install packages to the disk.
Args:
Returns a Pin for the downloaded package.
— def register(self, package_name, package_path, refs=(), tags=None):
Uploads and registers package instance in the package repository.
Args:
Returns: The CIPDApi.Pin instance.
— def search(self, package_name, tag):
Searches for package instances by tag, optionally constrained by package name.
Args:
Returns the list of CIPDApi.Pin instances.
— def set_ref(self, package_name, version, refs):
Moves a ref to point to a given version.
Args:
Returns the CIPDApi.Pin instance.
@contextlib.contextmanager
— def set_service_account(self, service_account):
Temporarily sets the service account used for authentication to CIPD.
Implemented as a context manager to prevent one part of a recipe from overwriting another's specified service account.
Args:
— def set_tag(self, package_name, version, tags):
Tags package of a specific version.
Args:
Returns the CIPDApi.Pin instance.
Recipe module providing commit position parsing and formatting.
@classmethod
— def format(cls, ref, revision_number):
Returns a commit position string.
ref must start with 'refs/'.
@classmethod
— def parse(cls, value):
Returns (ref, revision_number) tuple.
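A minimal sketch of the round trip, assuming the conventional 'ref@{#number}' commit position format (the function names here are illustrative, not the module's API):

```python
def format_commit_position(ref, revision_number):
    # e.g. ('refs/heads/master', 123) -> 'refs/heads/master@{#123}'
    assert ref.startswith('refs/')
    return '%s@{#%d}' % (ref, revision_number)

def parse_commit_position(value):
    # Inverse of the above: split off the '@{#N}' suffix.
    ref, _, num = value.partition('@{#')
    return ref, int(num.rstrip('}'))

pos = format_commit_position('refs/heads/master', 123)
assert pos == 'refs/heads/master@{#123}'
assert parse_commit_position(pos) == ('refs/heads/master', 123)
```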
The context module provides APIs for manipulating a few pieces of ‘ambient’ data that affect how steps are run.
The pieces of information which can be modified are the current working directory, environment variables (including prefix/suffix lists), the infra-step setting, and the step name prefix/namespace.
The values here are all scoped using Python's with statement; there's no mechanism to make an open-ended adjustment to these values (i.e. there's no way to change the cwd permanently for a recipe, except by surrounding the entire recipe with a with statement). This is done to avoid the surprises that typically arise with things like os.environ or os.chdir in a normal python program.
Example:
```python
with api.context(cwd=api.path['start_dir'].join('subdir')):
  # this step is run inside of the subdir directory.
  api.step("cat subdir/foo", ['cat', './foo'])
```
@contextmanager
— def __call__(self, cwd=None, env_prefixes=None, env_suffixes=None, env=None, infra_steps=None, name_prefix=None, namespace=None):
Allows adjustment of multiple context values in a single call.
Args:
  * cwd (Path): the current working directory to use for all steps. To reset to the original directory, pass api.path['start_dir'].

Name prefixes and namespaces:
Example:
```python
with api.context(name_prefix='cool '):
  # has name 'cool something'
  api.step('something', ['echo', 'something'])

  with api.context(namespace='world', name_prefix='hot '):
    # has name 'cool world|hot other'
    api.step('other', ['echo', 'other'])

    with api.context(name_prefix='tamale '):
      # has name 'cool world|hot tamale yowza'
      api.step('yowza', ['echo', 'yowza'])

  with api.context(namespace='ocean'):
    # has name 'cool ocean|mild'
    api.step('mild', ['echo', 'mild'])
```
Environmental Variable Overrides:
Env is a mapping of environment variable name to the value you want that environment variable to have. The value is one of:
"env_prefixes" and "env_suffixes" are a list of Path or strings that get prefixed (or suffixed) to their respective environment variables, delimited with the system's path separator. This can be used to add entries to environment variables such as "PATH" and "PYTHONPATH". If prefixes are specified and a value is also defined in "env", the value will be installed as the last path component if it is not empty.
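The delimiting behavior can be sketched with the standard library (the paths shown are made up):

```python
import os

prefixes = ['/opt/tool/bin', '/usr/local/extra']
env_value = '/usr/bin'  # hypothetical value also set via "env"

# Prefixes come first; the plain "env" value lands last (and is dropped if empty).
parts = list(prefixes) + ([env_value] if env_value else [])
new_path = os.pathsep.join(parts)

assert new_path.split(os.pathsep) == ['/opt/tool/bin', '/usr/local/extra', '/usr/bin']
```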
Look at the examples in “examples/” for examples of context module usage.
@property
— def cwd(self):
Returns the current working directory that steps will run in.
Returns (Path|None) - The current working directory. A value of None is equivalent to api.path['start_dir'], though only occurs if no cwd has been set (e.g. in the outermost context of RunSteps).
@property
— def env(self):
Returns modifications to the environment.
By default this is empty; there's no facility to observe the program's startup environment. If you want to pass data to the recipe, it should be done with properties.
Returns (dict) - The env-key -> value mapping of current environment modifications.
@property
— def env_prefixes(self):
Returns Path prefix modifications to the environment.
This will return a mapping of environment key to Path tuple for Path prefixes registered with the environment.
Returns (dict) - The env-key -> value(Path) mapping of current environment prefix modifications.
@property
— def env_suffixes(self):
Returns Path suffix modifications to the environment.
This will return a mapping of environment key to Path tuple for Path suffixes registered with the environment.
Returns (dict) - The env-key -> value(Path) mapping of current environment suffix modifications.
@property
— def infra_step(self):
Returns the current value of the infra_step setting.
Returns (bool) - True iff steps are currently considered infra steps.
@property
— def namespace(self):
Gets the current namespace.
Returns (Tuple[str]) - The current step namespace plus name prefix for nesting.
— def record_step_name(self, name):
Records a step name in the current namespace.
Args:
Returns Tuple[str] of the step name_tokens that should ACTUALLY run.
Side-effect: Updates global tracking state for this step name.
This module provides the recipe API of LUCI CQ, aka the pre-commit testing system.
More information about CQ: https://chromium.googlesource.com/infra/luci/luci-go/+/master/cq
— def initialize(self):
— def record_triggered_build_ids(self, *build_ids):
Adds given Buildbucket build ids to the list of triggered builds for CQ to wait on corresponding build completion later.
Must be called after some step.
Args:
— def record_triggered_builds(self, *builds):
Adds given Buildbucket builds to the list of triggered builds for CQ to wait on corresponding build completion later.
Must be called after some step.
Expected usage:
```python
api.cq.record_triggered_builds(*api.buildbucket.schedule([req1, req2]))
```
Args:
  * builds: Build objects, typically returned by api.buildbucket.schedule.

@property
— def state(self):
CQ state pertaining to this recipe execution.
@property
— def triggered_build_ids(self):
Returns recorded Buildbucket build ids as a list of integers.
DEPS: json, path, python, raw_io, step
File manipulation (read/write/delete/glob) methods.
— def copy(self, name, source, dest):
Copies a file (including mode bits) from source to destination on the local filesystem.
Behaves identically to shutil.copy.
Args:
  * dest (Path): the destination path. If dest is an existing directory, the basename of source will be appended to derive a path to a destination file.

Raises file.Error.
— def copytree(self, name, source, dest, symlinks=False):
Recursively copies a directory tree.
Behaves identically to shutil.copytree. dest must not exist.
Args:
Raises file.Error
— def ensure_directory(self, name, dest, mode=511):
Ensures that dest exists and is a directory.
Args:
Raises file.Error if the path exists but is not a directory.
— def filesizes(self, name, files, test_data=None):
Returns list of filesizes for the given files.
Args:
Returns list[int], size of each file in bytes.
— def flatten_single_directories(self, name, path):
Flattens singular directories, starting at path.
Example:
```
$ mkdir -p dir/which_has/some/singular/subdirs/
$ touch dir/which_has/some/singular/subdirs/with
$ touch dir/which_has/some/singular/subdirs/files
$ flatten_single_directories(dir)
$ ls dir
with
files
```
This can be useful when you just want the 'meat' of a very sparse directory structure. For example, some tarballs like foo-1.2.tar.gz extract all their contents into a subdirectory foo-1.2/.
Using this function would essentially move all the actual contents of the extracted archive up to the top level directory, removing the need to e.g. hard-code/find the subfolder name after extraction (not all archives are even named after the subfolder they extract to).
Args:
Raises file.Error
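A hypothetical standalone sketch of the flattening behavior (the recipe module runs this as a step; this version uses the standard library directly):

```python
import os
import shutil

def flatten_single_directories(path):
    # Descend while the directory contains exactly one entry and it's a directory.
    cur = path
    while True:
        entries = os.listdir(cur)
        if len(entries) != 1:
            break
        sub = os.path.join(cur, entries[0])
        if not os.path.isdir(sub):
            break
        cur = sub
    if cur != path:
        # Move the real contents up to the top, then remove the now-empty chain.
        for entry in os.listdir(cur):
            shutil.move(os.path.join(cur, entry), path)
        while cur != path:
            os.rmdir(cur)
            cur = os.path.dirname(cur)
```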
— def glob_paths(self, name, source, pattern, test_data=()):
Performs glob expansion on pattern.

Glob rules for pattern follow the same syntax as the python glob stdlib module.
Args:
  * pattern (str): the glob pattern to apply under source.

Returns list[Path] - All paths found.
Raises file.Error.
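The underlying pattern syntax is just the stdlib glob module; for example:

```python
import glob
import os
import tempfile

# Make a throwaway directory with a few files to match against.
source = tempfile.mkdtemp()
for fname in ('a.txt', 'b.txt', 'c.log'):
    open(os.path.join(source, fname), 'w').close()

# '*.txt' matches only the two .txt files, not c.log.
found = sorted(glob.glob(os.path.join(source, '*.txt')))
assert [os.path.basename(p) for p in found] == ['a.txt', 'b.txt']
```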
— def listdir(self, name, source, test_data=()):
List all files inside a directory.
Args:
Returns list[Path]
Raises file.Error.
— def move(self, name, source, dest):
Moves a file or directory.
Behaves identically to shutil.move.
Args:
Raises file.Error
— def read_json(self, name, source, test_data=''):
Reads a file as UTF-8 encoded json.
Args:
Returns (object) - The content of the file.
Raises file.Error.
— def read_raw(self, name, source, test_data=''):
Reads a file as raw data.
Args:
Returns (str) - The unencoded (binary) contents of the file.
Raises file.Error
— def read_text(self, name, source, test_data=''):
Reads a file as UTF-8 encoded text.
Args:
Returns (str) - The content of the file.
Raises file.Error
— def remove(self, name, source):
Remove a file.
Does not raise Error if the file doesn't exist.
Args:
Raises file.Error.
— def rmcontents(self, name, source):
Similar to rmtree, but removes only contents not the directory.
This is useful e.g. when removing contents of current working directory. Deleting current working directory makes all further getcwd calls fail until chdir is called. chdir would be tricky in recipes, so we provide a call that doesn't delete the directory itself.
Args:
Raises file.Error.
— def rmglob(self, name, source, pattern):
Removes all entries in source matching the glob pattern.
Args:
  * pattern (str): a glob pattern to apply under source. Anything matching this pattern will be removed.

Raises file.Error.
— def rmtree(self, name, source):
Recursively removes a directory.
This uses native python on Linux/Mac, and uses rd on Windows to avoid issues w.r.t. path lengths and read-only attributes. If the directory is gone already, this returns without error.
Args:
Raises file.Error.
— def symlink(self, name, source, linkname):
Creates a symlink on the local filesystem.
Behaves identically to os.symlink.
Args:
Raises file.Error
— def symlink_tree(self, root):
Creates a SymlinkTree, given a root directory.
Args:
— def truncate(self, name, path, size_mb=100):
Creates an empty file of size size_mb (default 100 MB) at path on the local filesystem.
Args:
Raises file.Error
— def write_json(self, name, dest, data):
Writes the given json-serializable data to dest.
Args:
Raises file.Error.
— def write_raw(self, name, dest, data):
Writes the given data to dest.
Args:
Raises file.Error.
— def write_text(self, name, dest, text_data):
Writes the given UTF-8 encoded text_data to dest.
Args:
Raises file.Error.
DEPS: context, json, path, python, step
A simplistic method for running steps generated by an external script.
This module was created before there was a way to put recipes directly into another repo. It is not recommended to use this, and it will be removed in the near future.
— def __call__(self, path_to_script, *args):
Run a script and generate the steps emitted by that script.
The script will be invoked with --output-json /path/to/file.json. The script is expected to exit 0 and write steps into that file. Once the script outputs all of the steps to that file, the recipe will read the steps from that file and execute them in order. Any *args specified will be additionally passed to the script.
The step data is formatted as a list of JSON objects. Each object corresponds to one step, and contains the following keys:
The script may also be invoked with --presentation-json /path/to/file.json. This file will be used to update the step's presentation on the build status page. The file will be expected to contain a single json object, with any of the following keys:

DEPS: cipd, context, json, path, properties, raw_io, runtime, step
API for interacting with isolated.
The isolated client implements a tar-like scatter-gather mechanism for archiving files. The tool's source lives at http://go.chromium.org/luci/client/cmd/isolated.
This module will deploy the client to [CACHE]/isolated_client/; users should add this path to the named cache for their builder.
— def download(self, step_name, isolated_hash, output_dir, isolate_server=None):
Downloads an isolated tree from an isolate server.
Args:
  * step_name (str): name of the step.
  * isolated_hash (str): the hash of an isolated tree.
  * output_dir (Path): path to an output directory. If a non-existent directory, it will be created; else if already existent, conflicting files will be overwritten and non-conflicting files already in the directory will be ignored.
  * isolate_server (str|None): an isolate server to download from; if None, the module's default server will be used instead.
— def initialize(self):
@property
— def isolate_server(self):
Returns the associated isolate server.
— def isolated(self, root_dir):
Returns an Isolated object that can be used to archive a set of files and directories, relative to a given root directory.
Args:
  * root_dir (Path): directory relative to which files and directories will be isolated.
@property
— def namespace(self):
Returns the associated namespace.
@contextlib.contextmanager
— def on_path(self):
This context manager ensures the go isolated client is available on $PATH.
Example:
```python
with api.isolated.on_path():
  # do your steps which require the isolated binary on path
```
DEPS: properties, python, raw_io
Methods for producing and consuming JSON.
@returns_placeholder
— def input(self, data):
A placeholder which will expand to a file path containing the given data.
— def is_serializable(self, obj):
Returns True if the object is JSON-serializable.
@staticmethod
— def loads(data, **kwargs):
Works like json.loads, but strips out unicode objects (replacing them with utf8-encoded str objects).
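Under Python 2 (which recipes run on), json.loads returns unicode strings; a rough sketch of the stripping behavior (on Python 3 strings are already unicode, so the conversion branch is a no-op there):

```python
import json
import sys

def loads_stripped(data, **kwargs):
    # Recursively replace unicode objects with utf8-encoded str (Python 2 only).
    def strip(obj):
        if isinstance(obj, dict):
            return {strip(k): strip(v) for k, v in obj.items()}
        if isinstance(obj, list):
            return [strip(x) for x in obj]
        if sys.version_info[0] == 2 and isinstance(obj, unicode):  # noqa: F821
            return obj.encode('utf-8')
        return obj
    return strip(json.loads(data, **kwargs))

result = loads_stripped('{"a": [1, "b"]}')
assert result == {'a': [1, 'b']}
```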
@returns_placeholder
— def output(self, add_json_log=True, name=None, leak_to=None):
A placeholder which will expand to '/tmp/file'.
If leak_to is provided, it must be a Path object. This path will be used in place of a random temporary file, and the file will not be deleted at the end of the step.
Args:
  * add_json_log (True|False|'on_failure'): log the resulting JSON object in a step log named name. If this is 'on_failure', only create this log when the step has a non-SUCCESS status.

— def read(self, name, path, add_json_log=True, output_name=None, **kwargs):
Returns a step that reads a JSON file.
This method is deprecated. Use file.read_json instead.
DEPS: cipd, json, path, service_account, step
Interface to the led tool.
"led" stands for LUCI editor. It allows users to debug and modify LUCI jobs. It can be used to modify many aspects of a LUCI build, most commonly including the recipes used.
The main interface this module provides is a direct call to the led binary:
```python
led_result = api.led(
    'get-builder', ['luci.chromium.try:chromium_presubmit'])
final_data = led_result.then('edit-recipe-bundle').result
```
See the led binary for full documentation of commands.
— def __call__(self, *cmd):
Runs led with the given arguments. Wraps the result in a LedResult.
All functions related to manipulating paths in recipes.
Recipes handle paths a bit differently than python does. All path manipulation in recipes revolves around Path objects. These objects store a base path (always absolute), plus a list of components to join with it. New paths can be derived by calling the .join method with additional components.
In this way, all paths in Recipes are absolute, and are constructed from a small collection of anchor points. The built-in anchor points are:
  * api.path['start_dir'] - This is the directory that the recipe started in. It's similar to cwd, except that it's constant.
  * api.path['cache'] - This directory is provided by whatever's running the recipe. Files and directories created under here /may/ be evicted in between runs of the recipe (i.e. to relieve disk pressure).
  * api.path['cleanup'] - This directory is provided by whatever's running the recipe. Files and directories created under here /are guaranteed/ to be evicted in between runs of the recipe. Additionally, this directory is guaranteed to be empty when the recipe starts.
  * api.path['tmp_base'] - This directory is the system-configured temp dir. This is a weaker form of 'cleanup', and its use should be avoided. This may be removed in the future (or converted to an alias of 'cleanup').
  * api.path['checkout'] - This directory is set by various 'checkout' modules in recipes. It was originally intended to make recipes easier to read and make code somewhat generic or homogenous, but this was a mistake. New code should avoid 'checkout', and instead just explicitly pass paths around. This path may be removed in the future.

There are other anchor points which can be defined (e.g. by the depot_tools/infra_paths module). Refer to those modules for additional documentation.
— def __getitem__(self, name):
Gets the base path named name. See the module docstring for more information.
— def abs_to_path(self, abs_string_path):
Converts an absolute path string abs_string_path to a real Path object, using the most appropriate known base path.
This method will find the longest match in all the following:
Example:
```python
# assume [START_DIR] == "/basis/dir/for/recipe"
api.path.abs_to_path("/basis/dir/for/recipe/some/other/dir")
# -> Path("[START_DIR]/some/other/dir")
```
Raises a ValueError if the preconditions are not met, otherwise returns the Path object.
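A hypothetical sketch of the longest-prefix matching (the base paths here are invented; the real ones come from the module's anchor points):

```python
import os

# Invented base paths standing in for the module's anchor points.
BASES = {
    'START_DIR': '/basis/dir/for/recipe',
    'CACHE': '/b/cache',
}

def abs_to_tokens(abs_string_path):
    # Find every known base that prefixes the path, and keep the longest one.
    matches = [
        (name, base) for name, base in BASES.items()
        if abs_string_path == base or abs_string_path.startswith(base + os.sep)
    ]
    if not matches:
        raise ValueError('no known base path for %r' % abs_string_path)
    name, base = max(matches, key=lambda m: len(m[1]))
    rel = os.path.relpath(abs_string_path, base)
    return (name,) + tuple(p for p in rel.split(os.sep) if p != '.')

assert abs_to_tokens('/basis/dir/for/recipe/some/other/dir') == (
    'START_DIR', 'some', 'other', 'dir')
```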
— def assert_absolute(self, path):
Raises AssertionError if the given path is not an absolute path.
Args:
— def get(self, name, default=None):
Gets the base path named name. See the module docstring for more information.
— def get_config_defaults(self):
Internal recipe implementation function.
— def initialize(self):
Internal recipe implementation function.
— def mkdtemp(self, prefix=tempfile.template):
Makes a new temporary directory, returns Path to it.
Args:
Returns a Path to the new directory.
— def mkstemp(self, prefix=tempfile.template):
Makes a new temporary file, returns Path to it.
Args:
Returns a Path to the new file. Unlike tempfile.mkstemp, the file's file descriptor is closed.
— def mock_add_paths(self, path):
For testing purposes, mark that |path| exists.
— def mock_copy_paths(self, source, dest):
For testing purposes, copy |source| to |dest|.
— def mock_remove_paths(self, path, filt=(lambda p: True)):
For testing purposes, assert that |path| doesn't exist.
Args:
Mockable system platform identity functions.
Provides host-platform-detection properties.
Mocks:
@property
— def arch(self):
Returns the current CPU architecture.
TODO: This is currently always hard-coded to 'intel'... Apparently no one has actually needed this function?
@property
— def bits(self):
Returns the bitness of the userland for the current system (either 32 or 64 bit).
TODO: If anyone needs to query for the kernel bitness, another accessor should be added.
@property
— def cpu_count(self):
The number of CPU cores, according to multiprocessing.cpu_count().
@property
— def is_linux(self):
Returns True iff the recipe is running on Linux.
@property
— def is_mac(self):
Returns True iff the recipe is running on OS X.
@property
— def is_win(self):
Returns True iff the recipe is running on Windows.
@property
— def name(self):
Returns the current platform name, which will be one of 'win', 'linux', or 'mac'.
@staticmethod
— def normalize_platform_name(plat):
One of python's sys.platform values -> 'win', 'linux' or 'mac'.
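A sketch of the normalization, assuming the common sys.platform spellings (the real mapping may handle more values):

```python
def normalize_platform_name(plat):
    # sys.platform is e.g. 'linux2', 'win32', 'cygwin', or 'darwin'.
    if plat.startswith('linux'):
        return 'linux'
    if plat.startswith(('win', 'cygwin')):
        return 'win'
    if plat.startswith(('darwin', 'mac')):
        return 'mac'
    raise ValueError('unknown platform %r' % plat)

assert normalize_platform_name('linux2') == 'linux'
assert normalize_platform_name('win32') == 'win'
assert normalize_platform_name('darwin') == 'mac'
```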
Provides access to the recipes input properties.
Every recipe is run with a JSON object called "properties". These contain all inputs to the recipe. Some common examples would be properties like "revision", which the build scheduler sets to tell a recipe to build/test a certain revision.
The properties that affect a particular recipe are defined by the recipe itself, and this module provides access to them.
Recipe properties are read-only; the values obtained via this API reflect the values provided to the recipe engine at the beginning of execution. There is intentionally no API to write property values (lest they become a kind of random-access global variable).
PropertiesApi implements all the standard Mapping functions, so you can use it like a read-only dict.
— def legacy(self):
DEPRECATED: Returns a set of properties, possibly used by legacy scripts.
This excludes any recipe module-specific properties (i.e. those beginning with $).
Instead of passing all of the properties as a blob, please consider passing specific arguments to scripts that need them. Doing this makes it much easier to debug and diagnose which scripts use which properties.
— def thaw(self):
Returns a read-write copy of all of the properties.
Provides methods for running python scripts correctly.
This includes support for vpython, and knows how to specify parameters correctly for bots (e.g. ensuring that python is working on Windows, passing the unbuffered flag, etc.).
— def __call__(self, name, script, args=None, unbuffered=True, venv=None, **kwargs):
Return a step to run a python script with arguments.
TODO: We should just use a single "args" list. Having "script" separate but required/first leads to weird things like:

```python
(... script='-m', args=['module'])
```
Args:
Returns (types.StepData) - The StepData object as returned by api.step.
— def failing_step(self, name, text, as_log=None):
Runs a failing step (exits 1).
— def infra_failing_step(self, name, text, as_log=None):
Runs an infra-failing step (exits 1).
— def inline(self, name, program, add_python_log=True, **kwargs):
Run an inline python program as a step.
Program is output to a temp file and run when this step executes.
Args:
  * program (str): the literal python program text. This will be dumped to a file and run like python /path/to/file.py.

Returns (types.StepData) - The StepData object as returned by api.step.
— def result_step(self, name, text, retcode, as_log=None, **kwargs):
Runs a no-op step that exits with a specified return code.
The recipe engine will raise an exception when seeing a return code != 0.
— def succeeding_step(self, name, text, as_log=None):
Runs a succeeding step (exits 0).
Allows randomness in recipes.
This module sets up an internal instance of ‘random.Random’. In tests, this is seeded with 1234, or a seed of your choosing (using the test_api's seed() method).
All members of random.Random are exposed via this API with getattr.
NOTE: This is based on the python random module, and so all caveats which apply there also apply to this (i.e. don't use it for anything resembling crypto).
Example:
def RunSteps(api):
  my_list = range(100)
  api.random.shuffle(my_list)
  # my_list is now random!
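The determinism this seeding buys in tests can be seen with plain random.Random (a sketch of the underlying behavior, not the recipe API itself):

```python
import random

# A random.Random instance seeded with 1234 yields a reproducible
# shuffle, so test expectations stay stable across runs.
rng = random.Random(1234)
my_list = list(range(100))
rng.shuffle(my_list)

# Re-seeding with the same value reproduces the exact same order.
rng2 = random.Random(1234)
other = list(range(100))
rng2.shuffle(other)
```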
— def __getattr__(self, name):
Access a member of random.Random.
Provides objects for reading and writing raw data to and from steps.
@returns_placeholder
@staticmethod
— def input(data, suffix='', name=None):
Returns a Placeholder for use as a step argument.
This placeholder can be used to pass data to steps. The recipe engine will dump the ‘data’ into a file, and pass the filename to the command line argument.
data MUST be of type ‘str’ (not basestring, not unicode).
If ‘suffix’ is not '', it will be used when the engine calls tempfile.mkstemp.
See examples/full.py for usage example.
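The suffix handling is plain tempfile.mkstemp behavior; a self-contained sketch:

```python
import os
import tempfile

# The engine passes the suffix through to tempfile.mkstemp, so the
# backing file's name ends with it (handy when the consuming tool
# cares about file extensions).
fd, path = tempfile.mkstemp(suffix='.json')
os.close(fd)

ends_with_suffix = path.endswith('.json')
existed = os.path.exists(path)

# The engine cleans the file up after the step; simulate that here.
os.remove(path)
gone = not os.path.exists(path)
```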
@returns_placeholder
@staticmethod
— def input_text(data, suffix='', name=None):
Returns a Placeholder for use as a step argument.
data MUST be of type ‘str’ (not basestring, not unicode). The str is expected to have valid utf-8 data in it.
Similar to input(), but ensures that ‘data’ is valid utf-8 text. Any non-utf-8 characters will be replaced with �.
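The replacement behavior is the standard utf-8 ‘replace’ error handler; a plain-Python sketch:

```python
# Decoding invalid utf-8 with errors='replace' substitutes U+FFFD
# (the '�' character) for each bad byte, which is what input_text's
# sanitization amounts to.
raw = b'ok \xff\xfe bytes'
text = raw.decode('utf-8', errors='replace')
```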
@returns_placeholder
@staticmethod
— def output(suffix='', leak_to=None, name=None, add_output_log=False):
Returns a Placeholder for use as a step argument, or for std{out,err}.
If ‘leak_to’ is None, the placeholder is backed by a temporary file with a suffix ‘suffix’. The file is deleted when the step finishes.
If ‘leak_to’ is not None, then it should be a Path and placeholder redirects IO to a file at that path. Once step finishes, the file is NOT deleted (i.e. it's ‘leaking’). ‘suffix’ is ignored in that case.
Args:
add_output_log: if truthy, log a copy of the output to a step log named after the placeholder's name. If this is ‘on_failure’, only create this log when the step has a non-SUCCESS status.
@returns_placeholder
@staticmethod
— def output_dir(suffix='', leak_to=None, name=None):
Returns a directory Placeholder for use as a step argument.
If ‘leak_to’ is None, the placeholder is backed by a temporary dir with a suffix ‘suffix’. The dir is deleted when the step finishes.
If ‘leak_to’ is not None, then it should be a Path and placeholder redirects IO to a dir at that path. Once step finishes, the dir is NOT deleted (i.e. it's ‘leaking’). ‘suffix’ is ignored in that case.
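The leak_to decision for these output placeholders can be sketched as below (backing_path is a hypothetical helper, not the real implementation):

```python
import os
import tempfile

def backing_path(suffix='', leak_to=None):
    """Hypothetical sketch of the leak_to decision described above:
    with leak_to set, reuse that path and keep the file afterwards;
    otherwise create a temp file (honoring suffix) that is deleted
    once the step finishes.

    Returns (path, delete_when_done).
    """
    if leak_to is not None:
        return leak_to, False
    fd, path = tempfile.mkstemp(suffix=suffix)
    os.close(fd)
    return path, True

tmp_path, tmp_delete = backing_path(suffix='.log')
os.remove(tmp_path)  # simulate the post-step cleanup
leak_path, leak_delete = backing_path(leak_to='/tmp/leaked.out')
```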
@returns_placeholder
@staticmethod
— def output_text(suffix='', leak_to=None, name=None, add_output_log=False):
Returns a Placeholder for use as a step argument, or for std{out,err}.
Similar to output(), but uses an OutputTextPlaceholder, which expects utf-8 encoded text. Similar to input(), but tries to decode the resulting data as utf-8 text, replacing any decoding errors with �.
Args:
add_output_log: if truthy, log a copy of the output to a step log named after the placeholder's name. If this is ‘on_failure’, only create this log when the step has a non-SUCCESS status.

This module assists in experimenting with production recipes.
For example, when migrating builders from Buildbot to pure LUCI stack.
@property
— def is_experimental(self):
True if this recipe is currently running in experimental mode.
Typical usage is to modify steps which produce external side-effects so that non-production runs of the recipe do not affect production data.
Examples:
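A minimal sketch of the guard pattern described above, with hypothetical upload/dry_run callables standing in for real side-effecting steps:

```python
def run_side_effect(is_experimental, upload, dry_run):
    """Route an external side-effect based on the experimental flag.

    Hypothetical helper: 'upload' is the real production action and
    'dry_run' is its harmless stand-in for experimental runs, so
    non-production runs never touch production data.
    """
    if is_experimental:
        return dry_run()
    return upload()
```

In a real recipe the flag would come from api.runtime.is_experimental.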
@property
— def is_luci(self):
True if this recipe is currently running on LUCI stack.
Should be used only during migration from Buildbot to LUCI stack.
DEPS: buildbucket, json, platform, properties, raw_io, runtime, step, time
API for interacting with the LUCI Scheduler service.
Depends on the ‘prpc’ binary being available in $PATH: https://godoc.org/go.chromium.org/luci/grpc/cmd/prpc
Documentation for the scheduler API is at https://chromium.googlesource.com/infra/luci/luci-go/+/master/scheduler/api/scheduler/v1/scheduler.proto
The RPC Explorer is available at https://luci-scheduler.appspot.com/rpcexplorer/services/scheduler.Scheduler
A module for interacting with LUCI Scheduler service.
— def emit_trigger(self, trigger, project, jobs, step_name=None):
Emits trigger to one or more jobs of a given project.
Args:
trigger (Trigger): defines the payload to trigger jobs with.
project (str): name of the project in the LUCI Config service, which is used by the LUCI Scheduler instance. See https://luci-config.appspot.com/.
jobs (iterable of str): job names per the LUCI Scheduler config for the given project. These typically are the same as builder names.
— def emit_triggers(self, trigger_project_jobs, timestamp_usec=None, step_name=None):
Emits a batch of triggers spanning one or more projects.
Up to date documentation is at https://chromium.googlesource.com/infra/luci/luci-go/+/master/scheduler/api/scheduler/v1/scheduler.proto
Args:
trigger_project_jobs (iterable of tuples(trigger, project, jobs)): each tuple corresponds to the parameters of the emit_trigger API above.
timestamp_usec (int): unix timestamp in microseconds. Useful for idempotency of calls if your recipe is doing its own retries. https://chromium.googlesource.com/infra/luci/luci-go/+/master/scheduler/api/scheduler/v1/triggers.proto
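The timestamp_usec argument can be produced as below; this is a plain-Python sketch (a real recipe would normally go through api.time for mockability):

```python
import time

def now_usec():
    # Unix timestamp in microseconds, as an int. Passing the same
    # value on every retry keeps the batch trigger call idempotent
    # on the scheduler side.
    return int(time.time() * 1_000_000)

ts = now_usec()
```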
@property
— def host(self):
Returns the backend hostname used by this module.
— def set_host(self, host):
Changes the backend hostname used by this module.
Args: host (str): server host (e.g. ‘luci-scheduler.appspot.com’).
@property
— def triggers(self):
Returns a list of triggers that triggered the current build.
A trigger is an instance of triggers_pb2.Trigger.
DEPS: path, platform, raw_io, step
API for getting OAuth2 access tokens for LUCI tasks or private keys.
This is a thin wrapper over the luci-auth go executable ( https://godoc.org/go.chromium.org/luci/auth/client/cmd/luci-auth).
Depends on luci-auth to be in PATH.
— def default(self):
Returns an account associated with the task.
On LUCI, this is the default account exposed through the LUCI_CONTEXT[“local_auth”] protocol. When running locally, this is the account the user logged in as via the “luci-auth login ...” command prior to running the recipe.
— def from_credentials_json(self, key_path):
Returns a service account based on a JSON credentials file.
This is the file generated by Cloud Console when creating a service account key. It contains the private key inside.
Args: key_path: (str|Path) object pointing to a service account JSON key.
— def set_json_manifest(self, name, data):
Uploads a source manifest with the given name.
NOTE: Due to current implementation restrictions, this method may only be called after some step has been run from the recipe. Calling this before running any steps is invalid and will fail. We hope to lift this restriction once we no longer need to support buildbot.
Args:
Step is the primary API for running steps (external programs, scripts, etc.).
@property
— def InfraFailure(self):
InfraFailure is a subclass of StepFailure, and will translate to a purple build.
This exception is raised from steps which are marked as infra_steps when they fail.
@property
— def StepFailure(self):
This is the base Exception class for all step failures.
It can be manually raised from recipe code to cause the build to turn red.
Usage:
  # Raised manually from recipe code to fail the build:
  raise api.StepFailure("some reason")

  # Or caught when a step fails:
  try:
    ...
  except api.StepFailure:
    ...
@property
— def StepTimeout(self):
StepTimeout is a subclass of StepFailure and is raised when a step times out.
@property
— def StepWarning(self):
StepWarning is a subclass of StepFailure, and will translate to a yellow build.
@recipe_api.composite_step
— def __call__(self, name, cmd, ok_ret=None, infra_step=False, wrapper=(), timeout=None, allow_subannotations=None, trigger_specs=None, stdout=None, stderr=None, stdin=None, step_test_data=None):
Returns a step dictionary which is compatible with annotator.py.
Args:
Returns a types.StepData for the running step.
@property
— def active_result(self):
The currently active (open) result from the last step that was run. This is a types.StepData object.
Allows you to do things like:
try:
  api.step('run test', [..., api.json.output()])
finally:
  result = api.step.active_result
  if result.json.output:
    new_step_text = result.json.output['step_text']
    api.step.active_result.presentation.step_text = new_step_text
This will update the step_text of the test, even if the test fails. Without this api, the above code would look like:
try:
  result = api.step('run test', [..., api.json.output()])
except api.StepFailure as f:
  result = f.result
  raise
finally:
  if result.json.output:
    new_step_text = result.json.output['step_text']
    api.step.active_result.presentation.step_text = new_step_text
@property
— def defer_results(self):
See recipe_api.py for docs.
@contextlib.contextmanager
— def nest(self, name):
Nest allows you to nest steps hierarchically on the build UI.
Calling:
  with api.step.nest(<name>):
    ...
will generate a dummy step with the provided name. All other steps run within this with statement will be hidden from the UI by default under this dummy step in a collapsible hierarchy. Nested blocks can also nest within each other.
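The nesting behavior can be modeled with an ordinary context manager that maintains a name stack (a hypothetical sketch, not the engine's implementation):

```python
import contextlib

# A stack of currently-open nesting names, and a record of each
# "step" with its full nesting path at the time it ran.
_stack = []
recorded = []

@contextlib.contextmanager
def nest(name):
    _stack.append(name)
    try:
        yield
    finally:
        _stack.pop()

def run_step(name):
    recorded.append('|'.join(_stack + [name]))

with nest('outer'):
    run_step('a')
    with nest('inner'):
        run_step('b')
run_step('c')
```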
DEPS: cipd, context, isolated, json, path, properties, raw_io, runtime, step
API for interacting with swarming.
The tool's source lives at http://go.chromium.org/luci/client/cmd/swarming.
This module will deploy the client to [CACHE]/swarming_client/; users should add this path to the named cache for their builder.
— def collect(self, name, tasks, output_dir=None, timeout=None):
Waits on a set of Swarming tasks.
Args:
name (str): The name of the step.
tasks (list(str|TaskRequestMetadata)): A list of ids or metadata objects corresponding to tasks to wait on.
output_dir (Path|None): Where to download the tasks' isolated outputs. If set to None, they will not be downloaded; else, a given task's outputs will be downloaded to output_dir//.
timeout (str|None): The duration for which to wait on the tasks to finish. If set to None, there will be no timeout; else, timeout follows the format described by https://golang.org/pkg/time/#ParseDuration.
Returns: A list of TaskResult objects.
— def initialize(self):
@contextlib.contextmanager
— def on_path(self):
This context manager ensures the go swarming client is available on $PATH.
Example:
with api.swarming.on_path():
  # do your steps which require the swarming binary on path
— def task_request(self):
Creates a new TaskRequest object.
See documentation for TaskRequest/TaskSlice to see how to build this up into a full task.
Once your TaskRequest is complete, you can pass it to trigger in order to have it start running on the swarming server.
— def trigger(self, step_name, requests):
Triggers a set of Swarming tasks.
Args:
step_name (str): The name of the step.
requests (seq[TaskRequest]): A sequence of task request objects representing the tasks we want to trigger.
Returns: A list of TaskRequestMetadata objects.
@contextlib.contextmanager
— def with_server(self, server):
This context sets the server for Swarming calls.
Example:
with api.swarming.with_server(‘new-swarming-server.com’):
  # perform swarming calls
Args: server (str): The swarming server to call within context.
Simplistic temporary directory manager (deprecated).
@contextlib.contextmanager
— def temp_dir(self, prefix):
This makes a temporary directory which lives for the scope of the with statement.
Example:
with api.tempfile.temp_dir("some_prefix") as path:
  # use path
# path is deleted here.
Allows mockable access to the current time.
— def ms_since_epoch(self):
Returns current timestamp as an int number of milliseconds since epoch.
— def sleep(self, secs):
Suspend execution for |secs| (float) seconds. Does nothing in testing.
If secs > 60 (sleep longer than one minute), run a step to do the sleep, so that if a user looks at a build, they know what the recipe is doing.
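The over-a-minute policy can be sketched as a tiny decision helper (sleep_plan is hypothetical, purely illustrative):

```python
def sleep_plan(secs):
    """Sketch of the policy described above: short sleeps happen
    inline, while sleeps over one minute become a visible step so
    someone looking at the build can tell what the recipe is doing."""
    return 'step' if secs > 60 else 'inline'
```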
— def time(self):
Return current timestamp as a float number of seconds since epoch.
— def utcnow(self):
Return current UTC time as a datetime.datetime.
DEPS: json, properties, step
API for Tricium analyzers to use.
TriciumApi provides basic support for Tricium.
— def __init__(self, repository, ref, paths, **kwargs):
Sets up the API.
This assumes that the input is a Tricium GitFileDetails object, and the output is a Tricium Results object (see https://chromium.googlesource.com/infra/infra/+/master/go/src/infra/tricium/api/v1/data.proto for details and definitions).
— def add_comment(self, category, message, path, url='', start_line=0, end_line=0, start_char=0, end_char=0, suggestions=None):
@property
— def paths(self):
@property
— def ref(self):
@property
— def repository(self):
— def write_comments(self, dump=False):
DEPS: context, json, path, python, raw_io
Methods for interacting with HTTP(s) URLs.
— def get_file(self, url, path, step_name=None, headers=None, transient_retry=True, strip_prefix=None, timeout=None):
GETs data at the given URL and writes it to a file.
Args:
Returns (UrlApi.Response) - Response with “path” as its “output” value.
Raises:
— def get_json(self, url, step_name=None, headers=None, transient_retry=True, strip_prefix=None, log=False, timeout=None, default_test_data=None):
GETs data at the given URL and parses it as JSON.
Args:
Returns (UrlApi.Response) - Response with the JSON as its “output” value.
Raises:
— def get_text(self, url, step_name=None, headers=None, transient_retry=True, timeout=None, default_test_data=None):
GETs data at the given URL and returns it as text.
Args:
Returns (UrlApi.Response) - Response with the content as its output value.
Raises:
— def join(self, *parts):
Constructs a URL path from composite parts.
Args:
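A plausible sketch of composite URL-path joining (url_join is hypothetical; the real method's edge-case handling may differ):

```python
def url_join(*parts):
    """Join URL path components, stripping redundant slashes at the
    seams and dropping empty parts, so url_join('a/', '/b', 'c')
    yields 'a/b/c'."""
    return '/'.join(str(p).strip('/') for p in parts if str(p).strip('/'))

joined = url_join('https://example.com/', '/api/', 'v1', 'items/')
```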
— def validate_url(self, v):
Validates that “v” is a valid URL.
A valid URL has a scheme and netloc, and its scheme must be http or https.
Args:
Returns (bool) - True if the URL is considered secure, False if not.
Raises:
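The validity rule can be sketched with urllib.parse (is_valid_url is a hypothetical approximation of the check described):

```python
from urllib.parse import urlparse

def is_valid_url(v):
    """A valid URL must parse with both a scheme and a netloc, and
    the scheme must be http or https."""
    parsed = urlparse(v)
    return bool(parsed.scheme in ('http', 'https') and parsed.netloc)
```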
Allows test-repeatable access to a random UUID.
— def random(self):
Returns a random UUID string.
DEPS: archive, context, file, json, path, platform, raw_io, step
— def RunSteps(api):
DEPS: assertions, properties, step
— def RunSteps(api):
DEPS: assertions, properties, step
— def RunSteps(api):
DEPS: assertions, properties, step
— def RunSteps(api):
— def RunSteps(api):
DEPS: assertions, properties, step
— def RunSteps(api):
DEPS: buildbucket, platform, properties, raw_io, step
This file is a recipe demonstrating the buildbucket recipe module.
— def RunSteps(api):
DEPS: buildbucket, properties, step
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
DEPS: buildbucket, properties, runtime
— def RunSteps(api):
DEPS: buildbucket, properties, runtime, step
— def RunSteps(api):
DEPS: cipd, json, path, platform, properties, service_account, step
— def RunSteps(api, use_pkg, pkg_files, pkg_dirs, pkg_vars, ver_files, install_mode, refs, tags):
— def RunSteps(api):
DEPS: context, path, raw_io, step
— def RunSteps(api):
— def RunSteps(api):
DEPS: context, path, raw_io, step
— def RunSteps(api):
— def RunSteps(api):
DEPS: buildbucket, cq, step
— def RunSteps(api):
DEPS: cq, properties, step
— def RunSteps(api):
Tests that daemons that hang on to STDOUT can't cause the engine to hang.
— def RunSteps(api):
A fast-running recipe which comprehensively covers all StepPresentation features available in the recipe engine.
— def RunSteps(api):
— def named_step(api, name):
Tests that step_data can accept multiple specs at once.
— def RunSteps(api):
Engine shouldn't explode when step_test_data gets functools.partial.
This is a regression test for a bug caused by this revision: http://src.chromium.org/viewvc/chrome?revision=298072&view=revision
When this recipe is run (by run_test.py), the _print_step code is exercised.
— def RunSteps(api):
Tests that deleting the current working directory doesn't immediately fail.
— def RunSteps(api):
This test serves to demonstrate that the ModuleInjectionSite object on recipe modules (i.e. the .m) also contains a reference to the module which owns it.
This was implemented to aid in refactoring some recipes (crbug.com/782142).
— def RunSteps(api):
Tests that step_data can accept multiple specs at once.
— def RunSteps(api):
— def RunSteps(api):
Tests that recipes have access to names, resources and their repo.
— def RunSteps(api):
Tests that step presentation properties can be ordered.
— def RunSteps(api):
Tests that placeholders can't wreck the world by exhausting the step stack.
— def RunSteps(api):
DEPS: properties, python, step
— def RunSteps(api, from_recipe, attribute, module):
— def RunSteps(api):
DEPS: context, properties, step
Tests that step_data can accept multiple specs at once.
— def RunSteps(api, fakeit):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
DEPS: generator_script, json, path, properties, step
— def RunSteps(api, script_name):
DEPS: file, isolated, json, path, runtime, step
— def RunSteps(api):
DEPS: json, path, properties, python, raw_io, step
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
DEPS: path, platform, properties, step
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api, test_prop, param_name_test, from_env):
DEPS: path, python, raw_io, step
Launches the repo bundler.
— def RunSteps(api):
Tests for api.python.infra_failing_step.
— def RunSteps(api):
— def RunSteps(api):
DEPS: path, properties, python, raw_io, step
— def RunSteps(api):
— def RunSteps(api):
DEPS: json, properties, runtime, scheduler, time
This file is a recipe demonstrating emitting triggers to LUCI Scheduler.
— def RunSteps(api):
This file is a recipe demonstrating reading/mocking scheduler host.
— def RunSteps(api):
This file is a recipe demonstrating reading triggers of the current build.
— def RunSteps(api):
DEPS: path, platform, properties, raw_io, service_account
— def RunSteps(api, key_path, scopes):
— def RunSteps(api):
DEPS: context, path, properties, step
— def RunSteps(api, bad_return, access_invalid_data, timeout):
— def RunSteps(api):
— def RunSteps(api):
DEPS: context, path, properties, step
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api, timeout):
— def RunSteps(api, command):
DEPS: cipd, path, runtime, step, swarming
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
— def RunSteps(api):
DEPS: context, path, step, url
— def RunSteps(api):
— def RunSteps(api):
DEPS: properties, step, url
— def RunSteps(api):
— def RunSteps(api):