
meta: setup continuous benchmarking #42

Open · 4 of 13 tasks

ankush opened this issue Dec 23, 2024 · 1 comment

ankush (Member) commented Dec 23, 2024

  1. Get a new bare-metal server
  2. Do some A/A tests to ensure <1% variance
  3. Set up a self-hosted GHA runner
  4. Write an action for benchmarking (nightly + dispatch)
  5. Do some A/A testing with the GHA setup
  6. Add custom metadata to pyperf's JSON output
  7. Evaluate Bencher vs. a custom insights-based solution
  8. Establish a baseline with this commit: frappe/frappe@7c06c84
  9. Run benchmarks nightly
  10. Upload data somewhere
  11. Ensure concurrency of 1
  12. Dashboards, pretty charts
  13. Alerts
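Steps 4 and 11 (a nightly + manually dispatchable action with a concurrency of 1) could look roughly like the workflow below. This is only a sketch: the workflow name, cron schedule, runner labels, and benchmark command are assumptions, not the repo's actual configuration.

```yaml
# Sketch of a benchmark workflow; names and paths are illustrative.
name: nightly-benchmarks

on:
  schedule:
    - cron: "0 2 * * *"   # nightly run
  workflow_dispatch: {}     # manual dispatch

concurrency:
  group: benchmarks         # only one benchmark job at a time (step 11)
  cancel-in-progress: false

jobs:
  bench:
    runs-on: [self-hosted, benchmark]
    steps:
      - uses: actions/checkout@v4
      - name: Run benchmarks
        run: python -m pyperf command -o results.json -- python benchmark.py
      - name: Upload results
        uses: actions/upload-artifact@v4
        with:
          name: benchmark-results
          path: results.json
```

Pinning the job to the dedicated self-hosted runner and serializing it via `concurrency` keeps other CI load from contaminating the timings.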
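The A/A testing in steps 2 and 5 can be sketched as comparing two runs of the *same* benchmark and checking that the relative difference of their means stays under the 1% threshold. The sample timings below are illustrative stand-ins for real pyperf measurements:

```python
# A/A check: run an identical benchmark twice and verify the relative
# difference between the two means stays under a 1% noise threshold.
from statistics import mean

def relative_difference(run_a, run_b):
    """Return |mean(a) - mean(b)| / mean(a) for two lists of timings."""
    m_a, m_b = mean(run_a), mean(run_b)
    return abs(m_a - m_b) / m_a

run_a = [1.02, 1.01, 1.03, 1.00, 1.02]  # seconds, first A run (illustrative)
run_b = [1.01, 1.02, 1.02, 1.01, 1.03]  # seconds, second A run (illustrative)

diff = relative_difference(run_a, run_b)
print(f"relative difference: {diff:.4%}")
assert diff < 0.01, "A/A variance exceeds 1%; environment is too noisy"
```

In practice you would feed this the values extracted from two pyperf JSON result files rather than hard-coded lists; a server that fails this check repeatedly is too noisy for continuous benchmarking.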
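For step 6, one stdlib-only approach is to post-process the pyperf JSON result and merge extra keys into its top-level `metadata` object (pyperf's `Runner` also exposes a `metadata` dict natively). The stand-in document and the metadata keys below are assumptions for illustration; `7c06c84` is the baseline commit from step 8:

```python
# Sketch of step 6: attaching custom metadata to a pyperf-style JSON result.
import json

def add_metadata(results: dict, **extra) -> dict:
    """Merge extra key/value pairs into a pyperf-style result's metadata."""
    meta = results.setdefault("metadata", {})
    meta.update(extra)
    return results

# Minimal stand-in for a pyperf JSON document (real ones hold run values too).
results = {"metadata": {"perf_version": "2.6.0"}, "benchmarks": []}
results = add_metadata(results, commit="7c06c84", app="frappe")
print(json.dumps(results["metadata"], sort_keys=True))
```

Tagging each result with the commit hash makes it possible to correlate regressions with specific changes once the data is uploaded (steps 9 and 10).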
@epompeii

@ankush you may want to check out Bencher: https://github.com/bencherdev/bencher

Bencher supports 2, 3, 5, & 6 out of the box, for free for public projects. Points 1 & 4 are also available. I'm the maintainer of Bencher, so I would be more than happy to help answer any questions.
