Visit the application URL in your browser, and upload a JSON dictionary with the following keys:
- action (string): the action to perform. Only trace and report are supported.
- action_params (dictionary): the parameters associated with the action. See below for more details.
- backend_params (dictionary): the parameters configuring the backend for this task. See below for more details.

backend_params

- storage_bucket (string): Name of the storage bucket used by the backend instances. Backend code and data must have been previously deployed to this bucket using the deploy.sh script.
- instance_count (int, optional): Number of Compute Engine instances that will be started for this task. If not specified, the number of instances is determined automatically depending on the size of the task and the number of available instances.
- task_name (string, optional): Name of the task, used to build the name of the output directory.
- tag (string, optional): Tag used internally to associate tasks with backend Compute Engine instances. This parameter should not be set in general, as it is mostly exposed for development purposes. If it is not specified, a unique tag is generated.
- timeout_hours (int, optional): If workers are still alive after this delay, they are forcibly killed to avoid wasting Compute Engine resources. If not specified, the timeout is determined automatically.

trace action

The trace action takes a list of URLs as input and generates a list of traces by running Chrome.
- urls (list of strings): the list of URLs to process.
- repeat_count (integer, optional): the number of traces to be generated for each URL. Defaults to 1.
- emulate_device (string, optional): the device to emulate (e.g. Nexus 4).
- emulate_network (string, optional): the network to emulate.

report action

Finds all the traces in the specified bucket and generates a report in BigQuery.
- trace_bucket (string): Name of the storage bucket where trace databases can be found. It can be either absolute or relative to the storage_bucket specified in the backend parameters.

This requires an existing clovis_dataset.report BigQuery table that will be used as a template. The schema of this template is not updated automatically and must match the format of the report (as generated by report.py). See below for how to update the schema manually.
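Putting these parameters together, here is a minimal sketch of a script that writes a task file for the trace action. The bucket name, URLs and other values are placeholders, not real defaults:

# Writes an example task file for the trace action; all values are placeholders.
import json

task = {
    'action': 'trace',
    'action_params': {
        'urls': ['http://example.com', 'http://example.org'],
        'repeat_count': 3,            # Generate 3 traces per URL.
        'emulate_device': 'Nexus 4',  # Optional device emulation.
    },
    'backend_params': {
        'storage_bucket': 'my-clovis-bucket',  # Deployed with deploy.sh.
        'instance_count': 10,
        'task_name': 'example_trace_task',
    },
}

# Upload the resulting trace_task.json through the web form.
with open('trace_task.json', 'w') as f:
  json.dump(task, f, indent=2)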
This is a python AppEngine application using Flask.
- app.yaml defines the handlers. There is a static handler for all URLs in the static/ directory, and all other URLs are handled by the clovis_frontend.py script.
- queue.yaml defines the task queues associated with the application. In particular, clovis-queue is a pull queue where tasks are added by the AppEngine frontend and consumed by the Compute Engine backend. See the TaskQueue documentation for more details.
- templates/form.html is a static HTML document allowing the user to upload a JSON file. clovis_frontend.py is then invoked with the contents of the file (see the /form_sent handler).
- clovis_task.py defines a task to be run by the backend. It is sent through the clovis-queue task queue.
- clovis_frontend.py is the script that processes the file uploaded by the form, creates the tasks and enqueues them in clovis-queue. A rough sketch of this flow is shown after the queue.yaml example below.

Running the application requires a queue.yaml file in the application directory (i.e. next to app.yaml) defining a clovis-queue pull queue that can be accessed by the Compute Engine service account associated with the project. Add your email too if you want to run the application locally. See the TaskQueue configuration documentation for more details. Example:

# queue.yaml
queue:
- name: clovis-queue
  mode: pull
  acl:
  - user_email: me@address.com # For local development.
  - user_email: 123456789-compute@developer.gserviceaccount.com # Compute Engine service account.
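The clovis-queue defined above is a pull queue: the frontend adds tasks to it, and the backend workers lease and delete them. The following is a rough sketch of that flow using the App Engine taskqueue API with a plain JSON payload; it is an illustration only, not the actual clovis_frontend.py or clovis_task.py code:

# Sketch of the pull-queue flow; illustrative only, not the Clovis code.
import json

from google.appengine.api import taskqueue


def enqueue_task(task_dict):
  """Frontend side: serialize a task and add it to the pull queue."""
  queue = taskqueue.Queue('clovis-queue')
  queue.add(taskqueue.Task(payload=json.dumps(task_dict), method='PULL'))


def process_one_task():
  """Worker side: lease a task, process it, then delete it.

  On Compute Engine the same steps go through the Task Queue REST API,
  which is why the service account is listed in the acl above.
  """
  queue = taskqueue.Queue('clovis-queue')
  for task in queue.lease_tasks(lease_seconds=3600, max_tasks=1):
    task_dict = json.loads(task.payload)
    # ... run the trace or report action described by task_dict ...
    queue.delete_tasks(task)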
# Install dependencies in the lib/ directory. Note that this will pollute your
# Chromium checkout, see the cleanup instructions below.
pip install -r requirements.txt -t lib

# Start the local server.
dev_appserver.py -A $PROJECT_NAME .
Visit the application at http://localhost:8080.
After you are done, clean up your Chromium checkout:
rm -rf $CHROMIUM_SRC/tools/android/loading/cloud/frontend/lib
# Install dependencies in the lib/ directory.
pip install -r requirements.txt -t lib

# Deploy.
gcloud preview app deploy app.yaml
To deploy to a staging/test version of the server, you can do:
gcloud preview app deploy --no-promote --version $MY_VERSION
where MY_VERSION can be something like staging, or something more unique to make sure there is no name collision. You can then access this version of the application live on the web by prefixing the URL of the service with $MY_VERSION-dot- (for example, https://$MY_VERSION-dot-$PROJECT_NAME.appspot.com).
When a change is made to the dictionary returned by report.py, the BigQuery database schema must be updated accordingly.
To update the schema, run:
bq update \
  --schema \
  $CHROMIUM_SRC/tools/android/loading/cloud/frontend/bigquery_schema.json \
  -t clovis_dataset.report
Adding a new field is harmless, but don't modify existing ones.
If the above command does not work, this is probably because you are doing more than adding fields. In this case, you can delete and recreate the clovis_dataset.report table from the BigQuery web interface:
- Select clovis_dataset from the left menu, and delete the report table.
- Create a new table in clovis_dataset, and call it report.
- Set Location to None in order to create an empty table.
- Click Edit as Text in the Schema section, and paste the contents of bigquery_schema.json there.
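For reference, bigquery_schema.json uses the standard BigQuery JSON schema format: a list of field descriptors with a name, a type and a mode. The sketch below only illustrates that format; the field names are hypothetical, and the real ones must match the dictionary generated by report.py:

# Illustration of the BigQuery JSON schema format, as accepted by
# "bq update --schema" and by the "Edit as Text" box in the web interface.
# The field names below are hypothetical placeholders.
import json

schema = [
    {'name': 'url', 'type': 'STRING', 'mode': 'NULLABLE'},
    {'name': 'repeat', 'type': 'INTEGER', 'mode': 'NULLABLE'},
    {'name': 'total_load_time_ms', 'type': 'FLOAT', 'mode': 'NULLABLE'},
]
print(json.dumps(schema, indent=2))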