This repository was archived by the owner on Feb 2, 2024. It is now read-only.

Your benchmarks are way off #41

@Richie765

Description


🐛 Bug Report

I reran the benchmarks; my results are way lower than yours.

To Reproduce

Rerun the benchmarks with the following bugfix applied (a minimal gateway sketch follows below):

-router.on(['GET', 'POST', 'PUT', 'PATCH', 'OPTIONS', 'DELETE'], '/service/*', (req, res) => {
+router.get('/service/*', (req, res) => {

Then run:

wrk -t8 -c50 -d20s http://127.0.0.1:8080/service/hi
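
For context, here is a minimal sketch of how the fixed route fits into a fast-proxy + 0http gateway. The downstream base URL and its port (3000) are assumptions on my part; the gateway port 8080 matches the wrk command above.

// Minimal benchmark gateway sketch (assumed setup, not the repository's exact script)
const { proxy } = require('fast-proxy')({
  base: 'http://127.0.0.1:3000' // assumed downstream service address
})
const cero = require('0http')
const { router, server } = cero()

// The bugfix: register a plain GET handler instead of
// router.on(['GET', 'POST', 'PUT', 'PATCH', 'OPTIONS', 'DELETE'], '/service/*', ...)
router.get('/service/*', (req, res) => {
  proxy(req, res, req.url, {})
})

server.listen(8080)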

Expected behavior

Somewhat similar benchmarks.

My benchmarks:

fast-proxy-undici/0http: Requests/sec 10259.59 (HTTP pipelining = 10)
fast-proxy/0http: Requests/sec 6773.80
fast-proxy/restana: Requests/sec 6460.21
fast-proxy-undici/0http: Requests/sec 9448.67 (HTTP pipelining = 1)
fastify-reply-from: Requests/sec 5635.55
http-proxy: Requests/sec 3105.40

As you can see, I'm getting at most a 3.3x performance gain instead of your 46.6x.
Without the above-mentioned bugfix, the first test clocks in at 39305.96 requests/sec (about 12x faster than http-proxy). Even then it is way slower than your benchmarks.
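
For reference, the gain figures above come from dividing each result by the http-proxy baseline from the list; a quick check:

// Speed-up ratios relative to the http-proxy baseline (3105.40 req/s)
const httpProxy = 3105.40
console.log((10259.59 / httpProxy).toFixed(1)) // ≈ 3.3x, best result with the bugfix
console.log((39305.96 / httpProxy).toFixed(1)) // ≈ 12.7x, without the bugfix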

I don't know exactly what is going on, but I think it's fair to say that your benchmarks are wrong and misleading.

Your Environment

  • node version: 12
  • os: Linux
