For a detailed guide on how to write and run benchmarks in this directory, see the guide on benchmarks.
| Directory       | Purpose                                                                          |
| --------------- | -------------------------------------------------------------------------------- |
| assert          | Benchmarks for the `assert` subsystem.                                           |
| buffers         | Benchmarks for the `buffer` subsystem.                                           |
| child_process   | Benchmarks for the `child_process` subsystem.                                    |
| crypto          | Benchmarks for the `crypto` subsystem.                                           |
| dgram           | Benchmarks for the `dgram` subsystem.                                            |
| domain          | Benchmarks for the `domain` subsystem.                                           |
| es              | Benchmarks for various new ECMAScript features and their pre-ES2015 counterparts. |
| events          | Benchmarks for the `events` subsystem.                                           |
| fixtures        | Benchmark fixtures used in various benchmarks throughout the benchmark suite.    |
| fs              | Benchmarks for the `fs` subsystem.                                               |
| http            | Benchmarks for the `http` subsystem.                                             |
| http2           | Benchmarks for the `http2` subsystem.                                            |
| misc            | Miscellaneous benchmarks and benchmarks for shared internal modules.             |
| module          | Benchmarks for the `module` subsystem.                                           |
| net             | Benchmarks for the `net` subsystem.                                              |
| path            | Benchmarks for the `path` subsystem.                                             |
| process         | Benchmarks for the `process` subsystem.                                          |
| querystring     | Benchmarks for the `querystring` subsystem.                                      |
| streams         | Benchmarks for the `stream` subsystem.                                           |
| string_decoder  | Benchmarks for the `string_decoder` subsystem.                                   |
| timers          | Benchmarks for the `timers` subsystem.                                           |
| tls             | Benchmarks for the `tls` subsystem.                                              |
| url             | Benchmarks for the `url` subsystem.                                              |
| util            | Benchmarks for the `util` subsystem.                                             |
| vm              | Benchmarks for the `vm` subsystem.                                               |
The top-level files include common dependencies of the benchmarks and the tools for launching benchmarks and visualizing their output. The actual benchmark scripts should be placed in their corresponding directories.
* `_benchmark_progress.js`: implements the progress bar displayed when running `compare.js`.
* `_cli.js`: parses the command line arguments passed to `compare.js`, `run.js`, and `scatter.js`.
* `_cli.R`: parses the command line arguments passed to `compare.R`.
* `_http-benchmarkers.js`: selects and runs external tools for benchmarking the `http` subsystem.
* `common.js`: see Common API below.
* `compare.js`: command line tool for comparing performance between different Node.js binaries.
* `compare.R`: R script for statistically analyzing the output of `compare.js`.
* `run.js`: command line tool for running individual benchmark suite(s).
* `scatter.js`: command line tool for comparing performance between different parameters in benchmark configurations, for example to analyze time complexity.
* `scatter.R`: R script for visualizing the output of `scatter.js` with scatter plots.
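As a rough sketch of how these tools are typically invoked (from the root of a Node.js checkout; `./node-old` and `./node-new` are placeholder paths for the binaries under comparison, so the guard keeps this a no-op elsewhere):

```shell
# Only meaningful inside a Node.js checkout with built binaries.
if [ -e benchmark/run.js ]; then
  # Run every benchmark in one suite directory with the current node:
  node benchmark/run.js assert

  # Compare two builds across a suite, then analyze the CSV with the
  # companion R script:
  node benchmark/compare.js --old ./node-old --new ./node-new assert > compare.csv
  Rscript benchmark/compare.R < compare.csv
fi
```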
## Common API

The `common.js` module is used by benchmarks for consistency across repeated tasks. It has a number of helpful functions and properties to help with writing benchmarks.
### `createBenchmark(fn, configs[, options])`
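A benchmark script declares its configurations, obtains a `bench` object, and brackets the measured work with `bench.start()` and `bench.end(n)`. The stub below is a hypothetical, much-simplified stand-in for the real helper in `common.js` (which also forks child processes, iterates the full configuration matrix, and reports results to the launcher), so the calling pattern can be seen in a self-contained sketch:

```javascript
// Hypothetical stand-in for common.js's createBenchmark so this sketch runs
// on its own; NOT the real implementation.
function createBenchmark(main, configs) {
  const bench = {
    start() { bench._t0 = process.hrtime.bigint(); },
    end(operations) {
      const seconds = Number(process.hrtime.bigint() - bench._t0) / 1e9;
      console.log(`ops/sec: ${(operations / seconds).toFixed(0)}`);
    },
  };
  // Run only the first value of each configuration; the real runner
  // benchmarks every combination.
  const conf = {};
  for (const key of Object.keys(configs)) conf[key] = configs[key][0];
  process.nextTick(() => main(conf));
  return bench;
}

// Benchmark scripts follow this shape: declare configs, then wrap the
// measured work in bench.start()/bench.end(n) inside main().
const bench = createBenchmark(main, { n: [1e5] });

function main({ n }) {
  bench.start();
  let acc = 0;
  for (let i = 0; i < n; i++) acc += i;
  bench.end(n);
}
```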
### `default_http_benchmarker`

The default benchmarker used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.
### `PORT`

The default port used to run HTTP benchmarks. See the guide on writing HTTP benchmarks.
### `sendResult(data)`

Used in special benchmarks that can't use `createBenchmark` and the object it returns to accomplish what they need. This function reports timing data to the parent process (usually created by running one of the launchers above, such as `compare.js`).