
Compute total score without mods during standardised score conversion #28277

Merged (2 commits) May 27, 2024

Conversation

@bdach (Collaborator) commented May 21, 2024

This is going to be used by server-side flows. Note that the server-side overload of UpdateFromLegacy() was not calling LegacyScoreDecoder.PopulateTotalScoreWithoutMods().

Computing the score without mods inline reduces reflection overheads from constructing mod instances, which feels pretty important for upcoming server-side flows.

There is one weird kink in the treatment of stable scores with score V2 active - they get the *legacy* multipliers unapplied because that made the most sense. For all intents and purposes this matters mostly for client-side replays with score V2; I'm not sure whether scores with SV2 ever make it to submission in stable.

There may be minute differences in converted score due to rounding shenanigans but I don't think it's worth doing a reverify for this. They will most likely get kinked out when we perform the server-side operations.
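For illustration, "unapplying" the legacy multipliers amounts to dividing them back out of the total score. The following is a minimal sketch of that idea only - the multiplier values and function name here are hypothetical placeholders, not the actual osu! implementation:

```python
import math

# Hypothetical legacy mod score multipliers, for illustration only --
# the real values live in the osu! ruleset implementations.
LEGACY_MULTIPLIERS = {
    "HR": 1.06,
    "DT": 1.12,
    "EZ": 0.50,
}

def total_score_without_mods(total_score: int, mod_acronyms: list[str]) -> int:
    """Divide out the combined legacy multiplier to recover the modless score."""
    combined = math.prod(LEGACY_MULTIPLIERS.get(m, 1.0) for m in mod_acronyms)
    # The rounding here is exactly where the "minute differences"
    # mentioned above can creep in.
    return round(total_score / combined)
```

The rounding at the end is unavoidable because the multiplied score was itself rounded to an integer at some point, so the division generally does not land on a whole number.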

@bdach (Collaborator, Author) commented May 23, 2024

I can't conclusively prove this because I failed to turn up a concrete counterexample in an hour of trying, but I'm reasonably sure that the above empty spreadsheets are a lie. See smoogipoo/diffcalc-sheet-generator#9.

I believe that this should and will cause fluctuations in score by 1 point and no more, but without the sheet generator as a tool I am not sure how to conclusively prove that, or how to proceed further.
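The sort of 1-point fluctuation being described here can arise purely from midpoint rounding semantics. A minimal self-contained illustration (generic Python, not osu! code):

```python
import math

def round_half_to_even(x: float) -> int:
    # Python's built-in round() uses banker's rounding (round-half-to-even).
    return round(x)

def round_half_away_from_zero(x: float) -> int:
    # The "schoolbook" convention: midpoints move away from zero.
    return math.floor(x + 0.5) if x >= 0 else math.ceil(x - 0.5)

# A score landing exactly on a .5 midpoint rounds differently under the
# two conventions, producing a difference of exactly 1 point.
score = 123456.5
print(round_half_to_even(score))         # 123456
print(round_half_away_from_zero(score))  # 123457
```

Any value not sitting exactly on a midpoint rounds identically under both conventions, which is why the disagreement is bounded at 1 point.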

@bdach added the "blocked" (Don't merge this.) label May 23, 2024
github-actions bot commented May 24, 2024

@smoogipoo (Contributor) commented:

Looks like it worked this time :)

github-actions bot commented May 26, 2024

@bdach (Collaborator, Author) commented May 27, 2024

Well, the sheets would have looked good... except the catch ones don't. They contain completely unexplained score changes of more than ±1 point. The increase seems to be related to the legacy maximum score simulation changing somehow (this branch somehow had more bonus score). And in local testing it looks like merging master fixes this, which is a very scary thing to learn.

I'll investigate this in more detail, but I just know it's going to be half a day down the drain and I don't want to get into it right now, so I'll leave it for tomorrow. For now I'll just re-queue a catch sheet to see what happens...

!diffcalc
GENERATORS=score
RULESET=catch

github-actions bot commented May 27, 2024

@bdach (Collaborator, Author) commented May 27, 2024

Sheet looks better, and I'm 99% sure that whatever the aforementioned issue is, it won't have been caused by this pull, so undrafting (but I'll still look into what happened with that catch stuff tomorrow).

@bdach bdach marked this pull request as ready for review May 27, 2024 15:29
@smoogipoo smoogipoo merged commit e2b4e25 into ppy:master May 27, 2024
13 of 17 checks passed
bdach added a commit to bdach/osu that referenced this pull request May 28, 2024
Compare: ppy#26616

This came up elsewhere, namely in
ppy#28277 (comment).

As it turns out, at least one beatmap among those whose scores had unexpected changes in total score, namely https://osu.ppy.sh/beatmapsets/971028#fruits/2062131, was using slider velocity multipliers that were not multiples of 0.01 (the specific value used was 0.225x). This meant that, due to the rounding applied to `SliderVelocityMultiplierBindable` via `Precision`, the raw value was being incorrectly rounded, resulting in incorrect conversion.

The "direct" change that revealed this is most likely ppy/osu-framework#6249, by virtue of shuffling the `BindableNumber` rounding code around and accidentally changing midpoint rounding semantics in the process. But it was not at fault here, as in this specific context the rounding was just as wrong before that change as after.
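The failure mode described above can be sketched generically. This is not the actual `BindableNumber` code - the precision-snapping logic and the rounding modes shown are assumptions for illustration only:

```python
import math

def snap_half_to_even(value: float, precision: float) -> float:
    # Snap to the nearest multiple of `precision` using round-half-to-even
    # (the semantics Python's round() defaults to).
    return round(value / precision) * precision

def snap_half_away_from_zero(value: float) -> int:
    # Midpoints move away from zero instead.
    return math.floor(value + 0.5) if value >= 0 else math.ceil(value - 0.5)

def snap_half_up(value: float, precision: float) -> float:
    return snap_half_away_from_zero(value / precision) * precision

# A value like 0.225 is not a multiple of 0.01, so snapping it to 0.01
# precision forces a choice at the 22.5-step midpoint. Worse, neither
# 0.225 nor 0.01 is exactly representable in binary floating point, so
# the quotient may land on either side of 22.5 -- which is how a change
# in midpoint semantics can silently alter the snapped value.
print(snap_half_to_even(0.225, 0.01))

# With exactly-representable values the two modes disagree cleanly:
# 1.25 / 0.5 == 2.5, an exact midpoint.
print(snap_half_to_even(1.25, 0.5))  # 1.0
print(snap_half_up(1.25, 0.5))       # 1.5
```

Note that only midpoint inputs are affected; for everything else the two modes agree, which matches the observation that the framework change was not itself at fault for the larger discrepancies.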
@bdach bdach deleted the total-score-without-mods-once-more branch May 29, 2024 14:52
TextAdventurer12 pushed a commit to TextAdventurer12/osu that referenced this pull request Jul 6, 2024