Better support for JMH Profiler Configuration #146

Closed · wants to merge 9 commits
@@ -0,0 +1,74 @@
package kotlinx.benchmark.integration

import kotlin.test.*

class JvmProfilerTest : GradleTest() {

    @Test
    fun testGcProfiler() {
        val runner = project("kotlin-multiplatform") {
            configuration("gcProfiler") {
                iterations = 1
                iterationTime = 100
                iterationTimeUnit = "ms"
                advanced("jvmProfiler", "gc")
            }
        }

        runner.run("jvmGcProfilerBenchmark") {
            assertOutputContains("gc.alloc.rate")
            assertOutputContains("BUILD SUCCESSFUL")
        }
    }

    @Test
    fun testStackProfilerEffect() {
        val runner = project("kotlin-multiplatform") {
            configuration("stackProfiler") {
                iterations = 1
                iterationTime = 100
                iterationTimeUnit = "ms"
                advanced("jvmProfiler", "stack")
            }
        }

        runner.run("jvmStackProfilerBenchmark") {
            assertOutputContains("stack")
            assertOutputContains("BUILD SUCCESSFUL")
        }
    }

    @Test
    fun testClProfiler() {
        val runner = project("kotlin-multiplatform") {
            configuration("clProfiler") {
                iterations = 1
                iterationTime = 100
                iterationTimeUnit = "ms"
                advanced("jvmProfiler", "cl")
            }
        }

        runner.run("jvmClProfilerBenchmark") {
            assertOutputContains("class.unload.norm")
            assertOutputContains("BUILD SUCCESSFUL")
        }
    }

    @Test
    fun testCompProfilerEffect() {
        val runner = project("kotlin-multiplatform") {
            configuration("compProfiler") {
                iterations = 1
                iterationTime = 100
                iterationTimeUnit = "ms"
                advanced("jvmProfiler", "comp")
            }
        }

        runner.run("jvmCompProfilerBenchmark") {
            assertOutputContains("compiler.time.profiled")
            assertOutputContains("BUILD SUCCESSFUL")
        }
    }
}
@@ -226,6 +226,13 @@ class OptionsValidationTest : GradleTest() {
                iterationTimeUnit = "ms"
                advanced("jsUseBridge", "x")
            }

            configuration("invalidJvmProfiler") {
                iterations = 1
                iterationTime = 100
                iterationTimeUnit = "ms"
                advanced("jvmProfiler", "x")
            }
        }

        runner.runAndFail("blankAdvancedConfigNameBenchmark") {
@@ -246,6 +253,9 @@ class OptionsValidationTest : GradleTest() {
        runner.runAndFail("invalidJsUseBridgeBenchmark") {
            assertOutputContains("Invalid value for 'jsUseBridge': 'x'. Expected a Boolean value.")
        }
        runner.runAndFail("invalidJvmProfiler") {
            assertOutputContains("Invalid value for 'jvmProfiler': 'x'. Accepted values: ${ValidOptions.jvmProfilers.joinToString(", ")}.")
@qurbonzoda (Contributor) commented on Mar 28, 2024:
One of the tests above fails now, because jvmProfiler was added to the "Accepted options":

runner.runAndFail("invalidAdvancedConfigNameBenchmark") {
    assertOutputContains("Invalid advanced option name: 'jsFork'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\".")
}
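Given that the new else branch in Utils.kt now ends with "jvmProfiler", the failing assertion would presumably need to be extended along these lines (a sketch, not code from the PR):

```kotlin
// Sketch: expected message updated to include the newly accepted option name.
runner.runAndFail("invalidAdvancedConfigNameBenchmark") {
    assertOutputContains(
        "Invalid advanced option name: 'jsFork'. Accepted options: " +
            "\"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\", \"jvmProfiler\"."
    )
}
```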

        }
    }
}

@@ -260,4 +270,5 @@ private object ValidOptions {
    )
    val modes = setOf("thrpt", "avgt", "Throughput", "AverageTime")
    val nativeForks = setOf("perBenchmark", "perIteration")
    val jvmProfilers = setOf("stack", "gc", "cl", "comp", "perf", "perfnorm", "perfasm", "xperfasm", "dtraceasm")
}
8 changes: 7 additions & 1 deletion plugin/main/src/kotlinx/benchmark/gradle/Utils.kt
@@ -228,7 +228,12 @@ private fun validateConfig(config: BenchmarkConfiguration) {
        "jsUseBridge" -> require(value is Boolean) {
            "Invalid value for 'jsUseBridge': '$value'. Expected a Boolean value."
        }
-       else -> throw IllegalArgumentException("Invalid advanced option name: '$param'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\".")
+       "jvmProfiler" -> {
+           require(value.toString() in ValidOptions.jvmProfilers) {
+               "Invalid value for 'jvmProfiler': '$value'. Accepted values: ${ValidOptions.jvmProfilers.joinToString(", ")}."
+           }
+       }
+       else -> throw IllegalArgumentException("Invalid advanced option name: '$param'. Accepted options: \"nativeFork\", \"nativeGCAfterIteration\", \"jvmForks\", \"jsUseBridge\", \"jvmProfiler\".")
A contributor commented:
It might make sense to extract the list of advanced option names into a property of the ValidOptions object.
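A minimal sketch of that suggestion (the property name is hypothetical, not from the PR):

```kotlin
private object ValidOptions {
    // Hypothetical property collecting the accepted advanced option names,
    // so the validation branches and the error message stay in sync.
    val advancedOptions = setOf("nativeFork", "nativeGCAfterIteration", "jvmForks", "jsUseBridge", "jvmProfiler")
}

// The else branch in validateConfig could then build its message from it:
// else -> throw IllegalArgumentException(
//     "Invalid advanced option name: '$param'. Accepted options: " +
//         ValidOptions.advancedOptions.joinToString(", ") { "\"$it\"" } + "."
// )
```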

}
}
}
@@ -244,6 +249,7 @@ private object ValidOptions {
    )
    val modes = setOf("thrpt", "avgt", "Throughput", "AverageTime")
    val nativeForks = setOf("perBenchmark", "perIteration")
    val jvmProfilers = setOf("stack", "gc", "cl", "comp", "perf", "perfnorm", "perfasm", "xperfasm", "dtraceasm")
}

internal val Gradle.isConfigurationCacheAvailable
15 changes: 15 additions & 0 deletions runtime/jvmMain/src/kotlinx/benchmark/jvm/JvmBenchmarkRunner.kt
A contributor commented:
Could you please explain what libasyncProfiler is and how it can be specified?

The PR author replied:
I am currently unfamiliar with it:
https://github.com/async-profiler/async-profiler
https://www.youtube.com/playlist?list=PLNCLTEx3B8h4Yo_WvKWdLvI9mj1XpTKBr
I will look to dive in here and modify the PR if appropriate.

A contributor replied:
The libasyncProfiler argument is passed when a benchmark task is run from the IntelliJ Gradle panel with an embedded profiler attached.

It makes sense to add a comment in the code explaining this.
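Following that suggestion, an explanatory comment in JvmBenchmarkRunner.kt could read roughly as follows (the wording is a sketch):

```kotlin
// The `libasyncProfiler` argument is not user-configured: it is appended
// automatically when a benchmark task is launched from the IntelliJ IDEA
// Gradle panel with an embedded profiler attached, so it must be forwarded
// to the forked JVM rather than rejected by option validation.
```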

@@ -56,6 +56,21 @@ fun main(args: Array<String>) {
}
}

    val profilerName = config.advanced["jvmProfiler"]
    when (profilerName) {
        "gc" -> jmhOptions.addProfiler("gc")
        "stack" -> jmhOptions.addProfiler("stack")
        "cl" -> jmhOptions.addProfiler("cl")
        "comp" -> jmhOptions.addProfiler("comp")
        "perf" -> jmhOptions.addProfiler("perf")
        "perfnorm" -> jmhOptions.addProfiler("perfnorm")
        "perfasm" -> jmhOptions.addProfiler("perfasm")
        "xperfasm" -> jmhOptions.addProfiler("xperfasm")
        "dtraceasm" -> jmhOptions.addProfiler("dtraceasm")
        null -> {}
        else -> throw IllegalArgumentException("Invalid value for 'jvmProfiler': $profilerName. Accepted values: gc, stack, cl, comp")
    }
Comment on lines +60 to +72
A contributor commented:

Can it be changed to the following?

if (profilerName != null) {
    jmhOptions.addProfiler(profilerName)
}

Another contributor replied:
Voting for jmhOptions.addProfiler(profilerName):

  • the list of profilers here is already incomplete;
  • we can't hardcode it as users are allowed to specify a custom JMH version and it may have different profilers bundled inside;
  • there's no way to specify a custom profiler bundled in a separate jar.
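Under that approach, the whole when block collapses to a pass-through (a sketch; JMH itself then rejects unknown profiler names when the run starts, against whatever profilers the JMH version on the classpath actually bundles):

```kotlin
// Forward the configured profiler name verbatim instead of hardcoding the
// accepted list; JMH resolves it at run time against the available profilers.
val profilerName = config.advanced["jvmProfiler"]
if (profilerName != null) {
    jmhOptions.addProfiler(profilerName)
}
```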


    val reportFormat = ResultFormatType.valueOf(config.reportFormat.uppercase())
    val reporter = BenchmarkProgress.create(config.traceFormat)
    val output = JmhOutputFormat(reporter, config.name)
A contributor commented:
When a profiler (e.g., gc) is added, each iteration result includes not just the primary result but also a secondary one. Currently, our reporters seem to ignore these secondary results from the iterations. To observe the discrepancy, one can run a benchmark directly with JMH with a profiler added, and compare it to running the same benchmark via the kx-benchmark plugin, also with a profiler added.

Could you please ensure that the secondary result (if not empty) of each iteration is reported as well?
The secondary result of a set of iterations should also be reported.
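For context, JMH exposes these values through RunResult.getSecondaryResults(), a Map<String, Result>; a reporter could surface them roughly like this (a sketch against the JMH API, not code from the PR):

```kotlin
import org.openjdk.jmh.results.RunResult

// Sketch: print each secondary (profiler-produced) metric, such as
// "gc.alloc.rate" when the gc profiler is attached, next to the primary score.
fun reportWithSecondaryResults(runResult: RunResult) {
    val primary = runResult.primaryResult
    println("${primary.getLabel()}: ${primary.getScore()} ${primary.getScoreUnit()}")
    for ((label, result) in runResult.secondaryResults) {
        println("  $label: ${result.getScore()} ${result.getScoreUnit()}")
    }
}
```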
