# ANGLE Performance Tests

`angle_perftests` is a standalone microbenchmark test suite that contains tests for the OpenGL API. `angle_trace_tests` is a suite that runs captured traces for correctness and performance. Because the traces contain confidential data, they are not publicly available. For more details on ANGLE’s tracer, please see the docs.

The tests currently run on the Chromium ANGLE infrastructure and report results to the Chromium perf dashboard. Please refer to the public dashboard docs for help.

## Running the Tests

You can follow the usual instructions to check out and build ANGLE. Build the `angle_perftests` or `angle_trace_tests` targets. Note that all test scores are higher-is-better. You should also ensure `is_debug=false` in your build. Running with `angle_assert_always_on` or debug validation enabled is not recommended.
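
For example, a minimal setup might look like the following sketch (the `out/Release` output directory and the exact GN invocation are assumptions; adjust them to your checkout):

```
# A sketch, assuming an out/Release output directory.
gn gen out/Release --args='is_debug=false'
autoninja -C out/Release angle_perftests angle_trace_tests
```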

Variance can be a problem when benchmarking. We have a test harness that runs a test repeatedly to find a lower-variance measurement. See `src/tests/run_perf_tests.py`.

To use the script, first build `angle_perftests` or `angle_trace_tests`, set your working directory to your build directory, and invoke the `run_perf_tests.py` script. Use `--test-suite` to specify your test suite and `--filter` to specify a test filter.
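
A typical invocation might look like the following sketch (the `out/Release` path and the `DrawCallPerfBenchmark` filter are example values, not requirements):

```
# A sketch, assuming the build output lives in out/Release.
cd out/Release
python3 ../../src/tests/run_perf_tests.py --test-suite=angle_perftests --filter='DrawCallPerfBenchmark*'
```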

### Choosing the Test to Run

You can choose individual tests to run with `--gtest_filter=*TestName*`. To select a particular ANGLE back-end, add the name of the back-end to the test filter. For example: `DrawCallPerfBenchmark.Run/gl` or `DrawCallPerfBenchmark.Run/d3d11`. Many tests have sub-tests that run slightly different code paths. You might need to experiment to find the right sub-test and its name.
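
Running the binary directly with a back-end-specific filter might look like this (a sketch; the binary path depends on your build directory):

```
# A sketch: run one microbenchmark on the D3D11 back-end.
./angle_perftests --gtest_filter=DrawCallPerfBenchmark.Run/d3d11
```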

### Null/No-op Configurations

ANGLE implements a no-op driver for OpenGL, D3D11 and Vulkan. To run on these configurations, use the `gl_null`, `d3d11_null` or `vulkan_null` test configurations. These null drivers do no GPU work and skip the driver entirely, which makes them useful for diagnosing performance overhead in ANGLE code.
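
Following the same filter convention as above, selecting a null configuration might look like this sketch:

```
# A sketch: run the same benchmark against the Vulkan no-op driver.
./angle_perftests --gtest_filter=DrawCallPerfBenchmark.Run/vulkan_null
```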

### Command-line Arguments

Each test runs N trials and prints metrics for each trial. Trials are limited by time by default; step/frame limits can also be set. Note that performance at the beginning of a run may be affected by hitting new code paths, cold caches, etc. (see warmup below), while longer runs on some devices can trigger thermal throttling that affects performance (known cases: phones, desktop perf CI bots).

Several command-line arguments control how the tests run; their implementations are located in `ANGLEPerfTestArgs.cpp`.
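
As one concrete example, `--trial-time` (used in the trace-test command near the end of this document) bounds the length of each trial. A sketch of a shorter run, assuming the value is given in seconds, might be:

```
# A sketch: limit each trial to roughly 10 seconds.
./angle_perftests --gtest_filter=DrawCallPerfBenchmark.Run/gl --trial-time 10
```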

## Test Breakdown

### Microbenchmarks

Many other tests can be found; they are documented in their test classes.

### Trace Tests

Trace tests take command-line arguments that pick the run configuration.

For example, for an endless run with no warmup on swiftshader, run:

```
angle_trace_tests --gtest_filter=TraceTest.trex_200 --use-angle=swiftshader --trial-time 1000000
```

### Understanding the Metrics

