# Pinpoint Architecture

## Running stuff

At its core, Pinpoint is a service that can run a list of steps at a particular commit. In a Pinpoint Job, the steps are known as Quests and the commit is known as a Change. Usually, there's at least a Build Quest and a Test Quest, but a Quest can be arbitrary Python code, as long as it conforms to the Quest API.
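For illustration, here is a minimal sketch of what a custom Quest and its Execution might look like. The method and attribute names (Start, Poll, completed, result_arguments) are assumptions drawn from this description, not the exact Quest API; see the classes under quest/ for the real interface.

```python
# A minimal, hypothetical Quest. The interface shown here is an assumption
# for illustration, not the actual Quest API.


class NoopExecution:
  """Hypothetical Execution that completes immediately and produces no output."""

  def __init__(self, change):
    self._change = change
    self.completed = False
    self.failed = False
    self.result_arguments = {}

  def Poll(self):
    # A real Execution would start or check on asynchronous work here.
    self.completed = True


class NoopQuest:
  """Hypothetical Quest: arbitrary Python code that can be bound to a Change."""

  def __str__(self):
    return 'Noop'

  def Start(self, change, **input_arguments):
    # Binding the Quest to a Change yields an Execution.
    return NoopExecution(change)
```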

When a list of Quests is bound to a Change, an Attempt is created. The Attempt binds each Quest to the Change, creating an Execution for each one. It runs the Executions serially, passing the output of each one to the input of the following Execution. If any Execution fails, the following Executions are skipped and the entire Attempt fails. There can be multiple Attempts on the same Change.

Each individual Attempt handles its own execution, so all Attempts can run in parallel.
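The serial flow inside a single Attempt can be sketched roughly as below, using the hypothetical Quest interface from the sketch above. This is a simplification of attempt.py; the real code polls Executions incrementally rather than looping each one to completion.

```python
# Simplified sketch of one Attempt: run each Quest's Execution in order,
# feeding the output of one Execution into the next, and stop on failure.


def RunAttempt(quests, change):
  executions = []
  last_arguments = {}
  for quest in quests:
    execution = quest.Start(change, **last_arguments)
    while not execution.completed:
      execution.Poll()
    executions.append(execution)
    if execution.failed:
      break  # Skip the remaining Quests; the whole Attempt fails.
    last_arguments = execution.result_arguments
  return executions
```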

## Bisection

When the Job's comparison_mode flag is set, Pinpoint automatically chooses which Changes to run. It compares the results of each pair of adjacent Changes. If any two adjacent Changes have different results, it finds the midpoint of those Changes, speculates multiple levels in (currently only two levels), and runs the tests on the additional points as well.
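The midpoint speculation can be pictured with plain commit indices standing in for Change objects. The function below is an illustrative sketch, not the actual exploration code (which lives in exploration.py and job_state.py).

```python
# Sketch of two-level midpoint speculation between two Changes whose results
# differ. Integers stand in for Changes; the real code works on Change objects.


def Speculate(low, high, levels=2):
  """Returns the interior points to test between two differing Changes."""
  if levels == 0 or high - low < 2:
    return []
  mid = (low + high) // 2
  return (Speculate(low, mid, levels - 1) + [mid] +
          Speculate(mid, high, levels - 1))


# Results differ between commits 0 and 8, so commits 2, 4 and 6 are tested next.
print(Speculate(0, 8))  # [2, 4, 6]
```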

## Data migration

The App Engine documentation provides advice on when to update entities in the Datastore.

Most of the Job state is stored in a PickleProperty. To update a pickled object, commit a change to the object's __setstate__ method, so that it is updated when loaded from the Datastore and unpickled. Then use the data migration page, /migrate, which reads and stores every completed Job, causing them all to update. If there are Jobs in flight, you may need to run multiple migrations.
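As a hypothetical example of that pattern, suppose a pickled state object renames a field. A __setstate__ along these lines converts old pickles when they are loaded, so that /migrate re-stores them in the new format. The class and field names here are made up for illustration.

```python
# Hypothetical migration: a pickled field was renamed old_name -> new_name.


class ExampleState:

  def __init__(self):
    self.new_name = []

  def __setstate__(self, state):
    # Called when the object is unpickled after being read from the Datastore.
    # Convert old-format state here; re-storing the Job then writes the new
    # format.
    if 'old_name' in state:
      state['new_name'] = state.pop('old_name')
    self.__dict__.update(state)
```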