chrome/browser/page_load_metrics/integration_tests/README.md

End-to-End Tests for Metrics

Background

Chrome's speed metrics are reported to a number of downstream consumers:

  • Web Performance APIs (typically through the PerformanceObserver)
  • UKM (as seen in chrome://ukm)
  • UMA (as seen in chrome://histograms)
  • Trace Events (as seen in chrome://tracing)

Due to the diverse use cases for and contexts required by each consumer, we can't always guarantee that the calculation of each metric is done entirely in one place.

Further, some consumers must observe distinct values (e.g. privacy and security requirements can mean that a Performance API must see a value based solely on first-party content while a more holistic value can be reported internally).

Because of this, it’s all too easy to introduce bugs where consumers see incorrect and/or inconsistent values. This is “a bad thing”™ that we’d like to avoid, so we write integration tests to assert that each consumer sees the correct value.

Metrics Integration Test Framework

To make it easier to write tests of metric emissions, we have the Metrics Integration Test Framework. The framework makes it easy to:

  • Run an integration test
    • Load a web page consisting of a given string literal
    • Serve resources that can be fetched by the above page
  • Record and verify reports of metrics as they're emitted
    • Performance APIs can be queried in-page and tested with familiar testharness.js assertions
    • UMA metrics can be observed and inspected with a HistogramTester
    • UKM metrics can be observed and inspected with a TestAutoSetUkmRecorder
    • Trace Events can be queried and aggregated with a TraceAnalyzer

Examples

See the source!

Tips and Tricks

Use content::EvalJs to pass JavaScript values back to C++ and check for consistency.

Use xvfb-run when running the browser_tests executable.

  • no more flashing windows
  • no chance to accidentally send real input to the test
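A typical invocation looks like the following (the output directory and test filter are illustrative, not prescribed by the framework):

```shell
# Run the tests under a virtual X server so no real windows appear
# and no real input can reach them. -a picks a free display number.
xvfb-run -a ./out/Default/browser_tests \
    --gtest_filter=MetricIntegrationTest.*
```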