9eab6d32  2022-05-26T12:00:45  [benchmark-set] Another Pause/Resume

33c990f0  2022-05-27T16:57:00  Include cassert to fix bots

0fe18692  2022-05-24T17:49:15  [benchmark-set] Pause timing around set copy initialization

ce5435a8  2022-05-24T16:34:04  [benchmark-set] Remove use of rand() inside benchmark

0623aa59  2022-05-19T14:12:42  [benchmark-set] Add benchmark for set copy

5f43ce82  2022-04-29T13:37:46  [benchmark-set] Split SetLookup into an ordered and random version

ae9c7b86  2022-04-29T13:39:04  [benchmark-set] At least increase needle by one in lookup benchmark

68a9b83d  2022-04-29T13:27:42  [benchmark-set] At least increase needle by one in lookup benchmark

dd005911  2022-04-29T12:23:53  [benchmark-set] Reduce lookup benchmark overhead
Turns out 90% was overhead... Now lookup is in the 4 ns ballpark.

2b03bced  2022-04-20T21:34:45  [perf] Cleanup range specifiers in set benchmark.

178c6700  2022-04-20T21:19:54  [perf] Rework set insert test to not use pause/resume timing.
These have high overhead, which affects the result. Also change set iteration to time the individual iteration operation.

fc2027bf  2022-04-20T19:33:04  [perf] Add map benchmarks.

057ec2c9  2022-04-20T19:15:03  [perf] Add set iteration and lookup benchmarks.

cef64b94  2022-04-20T18:36:35  [perf] Add the start of a benchmark for set operations.