Better support for JMH Profiler Configuration #146
Conversation
Could you please explain what libasyncProfiler is and how it can be specified?
I am currently unfamiliar with it.
https://github.com/async-profiler/async-profiler
https://www.youtube.com/playlist?list=PLNCLTEx3B8h4Yo_WvKWdLvI9mj1XpTKBr
I will look into these and modify the PR if appropriate.
The libasyncProfiler argument is passed when a benchmark task is run from the IntelliJ Gradle panel with an embedded profiler attached. It makes sense to add a comment in the code explaining this.
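For illustration, a sketch of the kind of comment and check meant here; the helper is hypothetical, and the agent-argument shape is how IntelliJ attaches its embedded async-profiler:

// When a benchmark task is started from the IntelliJ Gradle panel with
// profiling enabled, the IDE attaches async-profiler by injecting a JVM
// argument of the form
//   -agentpath:/path/to/libasyncProfiler.so=<profiler options>
// so such an argument should be recognized rather than rejected.
fun isIdeAsyncProfilerAgent(jvmArg: String): Boolean =   // hypothetical helper
    jvmArg.startsWith("-agentpath:") && jvmArg.contains("libasyncProfiler")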
integration/src/test/kotlin/kotlinx/benchmark/integration/OptionsValidationTest.kt
Could you please add tests to check that the selected jvmProfiler affects the reported results? Also, the option should be added to the configuration options document.
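A possible shape for such a test, sketched against the integration-test helpers visible in this file; the project fixture, task name, and benchmark name are assumptions, while "gc.alloc.rate" is a secondary metric label emitted by JMH's gc profiler:

// Hypothetical test: running with the gc profiler should surface its secondary metrics.
@Test
fun jvmGcProfiler() {
    val runner = project("kotlin-multiplatform")   // assumed project() fixture helper
    runner.run("jvmGcProfilerBenchmark") {         // assumed success-path counterpart of runAndFail
        assertOutputContains("gc.alloc.rate")
    }
}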
@@ -244,6 +249,7 @@ private object ValidOptions {
    )
    val modes = setOf("thrpt", "avgt", "Throughput", "AverageTime")
    val nativeForks = setOf("perBenchmark", "perIteration")
    val jvmProfilers = setOf("stack", "gc", "cl", "comp")
Should we support other profilers described here? https://github.com/openjdk/jmh/blob/362d6579e007f0241f05c1305f0b269fcc2cc27a/jmh-samples/src/main/java/org/openjdk/jmh/samples/JMHSample_35_Profilers.java#L352C97-L352C97
Those profilers seem to require root privileges to run (at least on macOS). Such requirements can be documented in the configuration options document.
@@ -246,6 +253,9 @@ class OptionsValidationTest : GradleTest() {
        runner.runAndFail("invalidJsUseBridgeBenchmark") {
            assertOutputContains("Invalid value for 'jsUseBridge': 'x'. Expected a Boolean value.")
        }
        runner.runAndFail("invalidJvmProfiler") {
            assertOutputContains("Invalid value for 'jvmProfiler': 'x'. Accepted values: ${ValidOptions.jvmProfilers.joinToString(", ")}.")
One of the tests above now fails, because jvmProfiler was added to the "Accepted options" list:
runner.runAndFail("invalidAdvancedConfigNameBenchmark") {
assertOutputContains("Invalid advanced option name: 'jsFork'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\".")
}
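A possible fix is to update that test's expected message to include the new option (a sketch, assuming the accepted-options wording stays as in this PR):

// Updated expectation: 'jvmProfiler' is now part of the accepted advanced options.
runner.runAndFail("invalidAdvancedConfigNameBenchmark") {
    assertOutputContains("Invalid advanced option name: 'jsFork'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\", \"jvmProfiler\".")
}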
when (profilerName) {
    "gc" -> jmhOptions.addProfiler("gc")
    "stack" -> jmhOptions.addProfiler("stack")
    "cl" -> jmhOptions.addProfiler("cl")
    "comp" -> jmhOptions.addProfiler("comp")
    "perf" -> jmhOptions.addProfiler("perf")
    "perfnorm" -> jmhOptions.addProfiler("perfnorm")
    "perfasm" -> jmhOptions.addProfiler("perfasm")
    "xperfasm" -> jmhOptions.addProfiler("xperfasm")
    "dtraceasm" -> jmhOptions.addProfiler("dtraceasm")
    null -> {}
    else -> throw IllegalArgumentException("Invalid value for 'jvmProfiler': $profilerName. Accepted values: gc, stack, cl, comp")
}
Can it be changed to the following?
if (profilerName != null) {
jmhOptions.addProfiler(profilerName)
}
Voting for jmhOptions.addProfiler(profilerName) (see the pass-through sketch below):
- the list of profilers here is already incomplete;
- we can't hardcode it, as users are allowed to specify a custom JMH version that may have different profilers bundled inside;
- there's no way to specify a custom profiler bundled in a separate jar.
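A minimal sketch of that pass-through, assuming jmhOptions is a JMH ChainedOptionsBuilder (its addProfiler(String) accepts both built-in aliases and profiler class names):

// Hand the user-supplied value to JMH unchanged. JMH's profiler factory resolves
// built-in aliases ("gc", "stack", "perf", ...) as well as fully qualified class
// names of custom profilers available on the benchmark classpath.
if (profilerName != null) {
    jmhOptions.addProfiler(profilerName)
}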
    null -> {}
    else -> throw IllegalArgumentException("Invalid value for 'jvmProfiler': $profilerName. Accepted values: gc, stack, cl, comp")
}

val reportFormat = ResultFormatType.valueOf(config.reportFormat.uppercase())
val reporter = BenchmarkProgress.create(config.traceFormat)
val output = JmhOutputFormat(reporter, config.name)
When a profiler (e.g., gc) is added, each iteration result includes not just the primary result but also a secondary one. Currently, our reporters seem to ignore these secondary results from the iterations. To observe the discrepancy, run a benchmark directly with JMH with a profiler added, and compare it to running the same benchmark via the kx-benchmark plugin, also with a profiler added.
Could you please ensure that the secondary result (if not empty) of each iteration is reported as well? The secondary result of a set of iterations should also be reported.
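For illustration, a minimal sketch of reading those secondary results via the JMH results API; the reporting call here is just println, and how it plugs into JmhOutputFormat is left open:

import org.openjdk.jmh.results.IterationResult

// Sketch only: surface the secondary results of a single iteration.
// With the gc profiler attached, labels such as "·gc.alloc.rate" appear here.
fun reportSecondaryResults(data: IterationResult) {
    data.secondaryResults.forEach { (label, result) ->
        println("  $label: ${result.score} ${result.scoreUnit}")  // hand off to the real reporter instead
    }
}
// The aggregate secondary results over a set of iterations are available via
// BenchmarkResult.getSecondaryResults().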
"Invalid value for 'jvmProfiler': '$value'. Accepted values: ${ValidOptions.jvmProfilers.joinToString(", ")}." | ||
} | ||
} | ||
else -> throw IllegalArgumentException("Invalid advanced option name: '$param'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\", \"jvmProfiler\".") |
It might make sense to extract the list of advanced option names into a property of the ValidOptions object.
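A possible shape for this, sketched against the validation code above (the property name advancedOptionNames is an assumption):

// In ValidOptions: keep the accepted advanced option names in one place.
val advancedOptionNames = setOf("nativeFork", "nativeGCAfterIteration", "jvmForks", "jsUseBridge", "jvmProfiler")

// In the validation `when`, reuse it for the error message:
else -> throw IllegalArgumentException(
    "Invalid advanced option name: '$param'. Accepted options: " +
        ValidOptions.advancedOptionNames.joinToString(", ") { "\"$it\"" } + "."
)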
Closing in favour of #217
Introduces the capability for users to specify the desired JMH profiler directly as an advanced config option.
advanced("jmhProfiler", "gc")
Resolves: #50
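For reference, a sketch of how the option would be enabled from a build script, assuming the standard kotlinx-benchmark configuration DSL and the option name used above:

// build.gradle.kts (sketch)
benchmark {
    configurations {
        named("main") {
            advanced("jmhProfiler", "gc")  // attach JMH's gc profiler to JVM benchmark runs
        }
    }
}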