Better support for JMH Profiler Configuration #146

Closed
wants to merge 9 commits
Changes from 5 commits
@@ -226,6 +226,13 @@ class OptionsValidationTest : GradleTest() {
iterationTimeUnit = "ms"
advanced("jsUseBridge", "x")
}

configuration("invalidJvmProfiler") {
iterations = 1
iterationTime = 100
iterationTimeUnit = "ms"
advanced("jvmProfiler", "x")
}
}

runner.runAndFail("blankAdvancedConfigNameBenchmark") {
@@ -246,6 +253,9 @@ class OptionsValidationTest : GradleTest() {
runner.runAndFail("invalidJsUseBridgeBenchmark") {
assertOutputContains("Invalid value for 'jsUseBridge': 'x'. Expected a Boolean value.")
}
runner.runAndFail("invalidJvmProfiler") {
assertOutputContains("Invalid value for 'jvmProfiler': 'x'. Accepted values: ${ValidOptions.jvmProfilers.joinToString(", ")}.")
@qurbonzoda (Contributor) commented on Mar 28, 2024:
One of the tests above now fails, because jvmProfiler was added to the "Accepted options" list:

runner.runAndFail("invalidAdvancedConfigNameBenchmark") {
    assertOutputContains("Invalid advanced option name: 'jsFork'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\".")
}

}
}
}

@@ -260,4 +270,5 @@ private object ValidOptions {
)
val modes = setOf("thrpt", "avgt", "Throughput", "AverageTime")
val nativeForks = setOf("perBenchmark", "perIteration")
val jvmProfilers = setOf("stack", "gc", "cl", "comp")
}
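For context, a user opts into one of these profilers through the `advanced` function of the benchmark configuration DSL. A minimal `build.gradle.kts` sketch (the "profiled" configuration name is hypothetical):

```kotlin
// build.gradle.kts (sketch; the "profiled" name is illustrative)
benchmark {
    configurations {
        register("profiled") {
            iterations = 1
            iterationTime = 100
            iterationTimeUnit = "ms"
            // Attach JMH's built-in GC profiler; accepted values: stack, gc, cl, comp
            advanced("jvmProfiler", "gc")
        }
    }
}
```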
6 changes: 6 additions & 0 deletions plugin/main/src/kotlinx/benchmark/gradle/Utils.kt
@@ -228,6 +228,11 @@ private fun validateConfig(config: BenchmarkConfiguration) {
"jsUseBridge" -> require(value is Boolean) {
"Invalid value for 'jsUseBridge': '$value'. Expected a Boolean value."
}
"jvmProfiler" -> {
require(value.toString() in ValidOptions.jvmProfilers) {
"Invalid value for 'jvmProfiler': '$value'. Accepted values: ${ValidOptions.jvmProfilers.joinToString(", ")}."
}
}
else -> throw IllegalArgumentException("Invalid advanced option name: '$param'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\".")
wldeh marked this conversation as resolved.
}
}
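The hard-coded option list in the `else` branch's error message is easy to let drift from the options actually accepted (the failing test above is exactly that drift). A sketch of deriving the message from a single set instead (names here are illustrative, not the plugin's actual code):

```kotlin
// Sketch: keep the error message in sync with the accepted option names.
val acceptedAdvancedOptions = setOf(
    "nativeFork", "nativeGCAfterIteration", "jvmForks", "jsUseBridge", "jvmProfiler"
)

fun invalidOptionMessage(param: String): String =
    "Invalid advanced option name: '$param'. Accepted options: " +
        acceptedAdvancedOptions.joinToString(", ") { "\"$it\"" } + "."
```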
@@ -244,6 +249,7 @@ private object ValidOptions {
)
val modes = setOf("thrpt", "avgt", "Throughput", "AverageTime")
val nativeForks = setOf("perBenchmark", "perIteration")
val jvmProfilers = setOf("stack", "gc", "cl", "comp")
Contributor:

Those profilers seem to require root privileges to run (at least on macOS). Such requirements could be noted in the configuration options documentation.

}

internal val Gradle.isConfigurationCacheAvailable
Contributor:

Could you please explain what libasyncProfiler is and how it can be specified?

Contributor (Author):

I am currently unfamiliar with it.
https://github.com/async-profiler/async-profiler
https://www.youtube.com/playlist?list=PLNCLTEx3B8h4Yo_WvKWdLvI9mj1XpTKBr
I will look into this and modify the PR if appropriate.

Contributor:

The libasyncProfiler argument is passed when a benchmark task is run from the IntelliJ Gradle panel with an embedded profiler attached.

It makes sense to add a comment in the code explaining this.
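The suggested code comment could look like the following (a sketch; the exact placement in the plugin's argument handling is assumed):

```kotlin
// "libasyncProfiler" is not a user-facing option: IntelliJ's Gradle panel passes it
// automatically when a benchmark task is run with an embedded profiler attached,
// so it must be accepted here even though users never set it themselves.
```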

@@ -56,6 +56,15 @@ fun main(args: Array<String>) {
}
}

val profilerName = config.advanced["jvmProfiler"]
when (profilerName) {
    null -> { /* no profiler requested */ }
    "gc" -> jmhOptions.addProfiler("gc")
    "stack" -> jmhOptions.addProfiler("stack")
    "cl" -> jmhOptions.addProfiler("cl")
    "comp" -> jmhOptions.addProfiler("comp")
    else -> throw IllegalArgumentException("Invalid value for 'jvmProfiler': $profilerName. Accepted values: gc, stack, cl, comp")
wldeh marked this conversation as resolved.
}

val reportFormat = ResultFormatType.valueOf(config.reportFormat.uppercase())
val reporter = BenchmarkProgress.create(config.traceFormat)
val output = JmhOutputFormat(reporter, config.name)
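For comparison, attaching the same built-in profilers when driving JMH directly looks like this (a sketch; it requires the JMH dependency on the classpath, and the include pattern is hypothetical):

```kotlin
import org.openjdk.jmh.runner.Runner
import org.openjdk.jmh.runner.options.OptionsBuilder

fun main() {
    // Attach JMH's built-in GC profiler, matching advanced("jvmProfiler", "gc")
    val opts = OptionsBuilder()
        .include(".*MyBenchmark.*") // hypothetical benchmark filter
        .addProfiler("gc")          // also accepts "stack", "cl", "comp"
        .build()
    Runner(opts).run()
}
```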
Contributor:
When a profiler (e.g., gc) is added, each iteration result includes not only the primary result but also a secondary one. Currently, our reporters seem to ignore these secondary results. To observe the discrepancy, one can run a benchmark directly with JMH with a profiler added, and compare it to running the same benchmark via the kx-benchmark plugin, also with a profiler added.

Could you please ensure that the secondary result (if not empty) of each iteration is reported as well?
The secondary result of a set of iterations should also be reported.
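On the JMH side, these secondary results are exposed on the result model, so the reporters could read them roughly like this (a sketch against JMH's public result API; the printing is purely illustrative):

```kotlin
import org.openjdk.jmh.results.RunResult

// Sketch: surface profiler ("secondary") results alongside the primary score.
fun reportSecondaryResults(runResults: Collection<RunResult>) {
    for (run in runResults) {
        // Aggregated secondary results for the whole run,
        // e.g. "·gc.alloc.rate" when the gc profiler is attached.
        for ((label, result) in run.secondaryResults) {
            println("$label: ${result.score} ${result.scoreUnit}")
        }
        // Per-iteration secondary results live on each IterationResult.
        for (benchmarkResult in run.benchmarkResults) {
            for (iterationResult in benchmarkResult.iterationResults) {
                for ((label, result) in iterationResult.secondaryResults) {
                    println("  iteration $label: ${result.score} ${result.scoreUnit}")
                }
            }
        }
    }
}
```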
