

Benchmark baseline definition, allowed deviation and continuous validation #54

Open
3 tasks
tobrun opened this issue Jan 15, 2022 · 0 comments
Labels
enhancement New feature or request

Comments


tobrun commented Jan 15, 2022

Thanks to @baparham, we now have an MVP benchmark implementation from #51. This approach uses the benchmark package, so executing `dart run benchmark` runs all the `*_benchmark.dart` files within the `benchmark` directory. This setup is great as a development tool: it allows us to quantify performance differences between implementations. Iterating on this further, we can invest in turning it into a validation system that runs in CI.

Work to be done:

  • integrate running the benchmarks in the CI environment and store the results as a CI run artifact
  • run the integration and, from it, create a baseline file that contains each benchmark's name, output value, and allowed deviation percentage
  • integrate a validation step that verifies the output didn't breach the baseline value by more than the allowed deviation, and fail the run if it did
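As a rough illustration of the second and third tasks, here is a minimal sketch of what the baseline file and the validation step could look like. The JSON layout, field names, and the `breaches_baseline`/`validate` helpers are assumptions for the sake of the example, not part of the actual implementation:

```python
import json

# Hypothetical baseline format: one entry per benchmark, recording the
# measured output value and the allowed deviation percentage.
BASELINE = json.loads("""
[
  {"name": "parse_geojson",     "value": 120.0, "allowed_deviation_pct": 10.0},
  {"name": "serialize_geojson", "value": 80.0,  "allowed_deviation_pct": 5.0}
]
""")

def breaches_baseline(name, measured, baseline):
    """Return True if `measured` deviates from the baseline value for
    `name` by more than that entry's allowed percentage."""
    entry = next(e for e in baseline if e["name"] == name)
    limit = entry["value"] * entry["allowed_deviation_pct"] / 100.0
    return abs(measured - entry["value"]) > limit

def validate(results, baseline):
    """Check every benchmark result against the baseline.

    Returns a non-zero exit code if any benchmark breached its allowed
    deviation, so a CI step can fail the run on that status.
    """
    failures = [name for name, value in results.items()
                if breaches_baseline(name, value, baseline)]
    for name in failures:
        print(f"FAIL: {name} breached its allowed baseline deviation")
    return 1 if failures else 0
```

In CI, a script along these lines would run after the benchmarks, read the stored results artifact, and exit with the returned status so the run fails whenever a baseline is breached.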

cc @lukas-h

@lukas-h lukas-h added this to the Improve GeoJSON Core milestone Jul 7, 2022
@lukas-h lukas-h added the enhancement New feature or request label Jul 7, 2022