Fix bad benchmark_config.json in frameworks #10012

Merged 1 commit on Jul 29, 2025

Conversation

@joanhey (Contributor) commented on Jul 29, 2025

Problem when running ./tfb:

 => => naming to docker.io/techempower/tfb                                                                                                                0.0s
Framework oak does not define a default test in benchmark_config.json
Framework ring-http-exchange does not define a default test in benchmark_config.json
Framework ring-http-exchange does not define a default test in benchmark_config.json
Framework ring-http-exchange does not define a default test in benchmark_config.json
Framework http4k does not define a default test in benchmark_config.json
Framework hyperlane does not define a default test in benchmark_config.json
Framework httpserver does not define a default test in benchmark_config.json
Framework oak does not define a default test in benchmark_config.json
Framework ring-http-exchange does not define a default test in benchmark_config.json
Framework ring-http-exchange does not define a default test in benchmark_config.json
Framework ring-http-exchange does not define a default test in benchmark_config.json
Framework http4k does not define a default test in benchmark_config.json
Framework hyperlane does not define a default test in benchmark_config.json
Framework httpserver does not define a default test in benchmark_config.json
================================================================================
Running Tests...

They have invalid JSON or no default test,
and they create noisy lines when running any test locally.
Perhaps we should add a test for this file.

PS: since this PR doesn't touch any code, any failure was there before.
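A check of the kind suggested above could be sketched as follows. This is a minimal sketch, not code from the repo: it assumes the TFB convention that benchmark_config.json holds a "framework" name and a "tests" array whose entries map test names to settings, one of which should be "default".

```python
import json
from pathlib import Path
from typing import Optional

def check_config(path: Path) -> Optional[str]:
    """Return an error message for a benchmark_config.json, or None if it looks fine.

    Assumes the TFB convention: a "tests" array whose entries map
    test names to settings, with one entry named "default".
    """
    try:
        config = json.loads(path.read_text())
    except json.JSONDecodeError as exc:
        # Bad JSON is the first failure mode seen in the log above.
        return f"{path}: invalid JSON ({exc})"
    tests = config.get("tests", [])
    if not any("default" in entry for entry in tests):
        # Missing "default" is the second failure mode.
        name = config.get("framework", path.parent.name)
        return f"Framework {name} does not define a default test in benchmark_config.json"
    return None
```

Run over every `frameworks/*/*/benchmark_config.json`, this would catch both failure modes in CI before they surface as noise at run time.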

Worse, and it needs a fix

Hyperlane needs to be redone.

Currently each test has its own Dockerfile and compiles the Rust code separately, which is very bad for the run time and not realistic.
https://github.com/TechEmpower/FrameworkBenchmarks/blob/master/frameworks/Rust/hyperlane/benchmark_config.json

@尤雨东

I can't reach the authors by mentioning them, and the last PRs for this problem no longer exist (#9987, #9976, #9959, ...)
https://github.com/TechEmpower/FrameworkBenchmarks/commits/master/frameworks/Rust/hyperlane

This change in Hyperlane can't pass a review.

@msmith-techempower @NateBrady23 What do we do about this situation?

Mark it as bad, revert the PR, ...?

@msmith-techempower (Member) commented

> and the last PRs for this problem don't exist

Whoa... I have never seen that. How does that even happen?

> Currently each test has its own Dockerfile and compiles the Rust code separately, which is very bad for the run time and not realistic.

Agreed. Bad oversight on my part. I can fix it - none of the Dockerfiles differ in any meaningful way; they just enable features (for routing) at compile time... sort of an odd design choice. Assuming the routing is sane, the worst case is basically a zero-cost lookup... so I'll just create one Dockerfile that enables all the routes.
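That consolidation could look roughly like the following. A sketch only, not the actual fix: the base image, cargo feature names, and binary name are hypothetical and not taken from the hyperlane directory.

```dockerfile
FROM rust:1.80

WORKDIR /build
COPY . .

# Build one binary with every route's feature enabled, replacing the
# per-route images; the feature names below are hypothetical.
RUN cargo build --release --features "plaintext json fortunes db"

EXPOSE 8080
CMD ["./target/release/hyperlane-bench"]
```

The point is that a single image build amortizes one Rust compilation across all the tests, instead of rebuilding per route.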

> They have invalid JSON or no default test.

I can also fix that - I just need to set a default for each.
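For reference, a fixed config just names one of the tests "default". A minimal sketch following the TFB benchmark_config.json convention; the framework name and field values here are illustrative only:

```json
{
  "framework": "example-framework",
  "tests": [
    {
      "default": {
        "plaintext_url": "/plaintext",
        "json_url": "/json",
        "port": 8080,
        "approach": "Realistic"
      }
    }
  ]
}
```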

@joanhey (Contributor, Author) commented on Jul 29, 2025

I fixed all the bad benchmark_config.json files.
Only the Hyperlane problem is left in this PR.

@msmith-techempower msmith-techempower merged commit 89be83d into TechEmpower:master Jul 29, 2025
6 of 7 checks passed
@joanhey joanhey deleted the fw-defaults branch July 29, 2025 18:22
AliRn76 pushed a commit to AliRn76/FrameworkBenchmarks that referenced this pull request Aug 1, 2025
@eastspire (Contributor) commented

My account was suspended and was only reinstated the day before yesterday. I just submitted an action and found that the execution failed, and I also saw an error message. My code has not changed during this period. Has the processing logic for this changed?

@eastspire (Contributor) commented

> > and the last PRs for this problem don't exist
>
> Whoa... I have never seen that. How does that even happen?
>
> > Currently each test has its own Dockerfile and compiles the Rust code separately, which is very bad for the run time and not realistic.
>
> Agreed. Bad oversight on my part. I can fix it - none of the Dockerfiles differ in any meaningful way; they just enable features (for routing) at compile time... sort of an odd design choice. Assuming the routing is sane, the worst case is basically a zero-cost lookup... so I'll just create one Dockerfile that enables all the routes.
>
> > They have invalid JSON or no default test.
>
> I can also fix that - I just need to set a default for each.

Using a separate configuration for each route is meant to minimize the impact of routing on performance as much as possible. Although this impact is minimal or even non-existent, I still hope to optimize the framework's performance through code and configuration.

@joanhey (Contributor, Author) commented on Aug 2, 2025

The problem is that for each test (route), the run needs to build a new image from a Dockerfile, which is very bad for the total run time.

Any framework could do the same trick, but this is NOT the intention of this benchmark; we want realistic results.
Would you create a new binary for each route in your app??

There is even an issue about this, and another about creating a mixed benchmark with all the routes (as long as they are all for the same framework and platform) built from the same Dockerfile.

@eastspire (Contributor) commented

I have completed the modifications in this PR: #10023
