Backend Benchmark

Currently, this project benchmarks backends in two ways:

  1. A custom script that targets specific cases (a sketch of the parallel-GET case follows this list)
    1. Sequential requests (GET/POST)
    2. Multiple parallel requests (GET/POST)
    3. File upload (multipart requests)
    4. JSON parsing
  2. Load testing with k6
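
The custom script itself isn't reproduced here; below is a minimal sketch of how the parallel-GET case could be measured. The base URL, request count, and function name are illustrative assumptions, and the repo's actual script may differ in language and structure.

```js
// Minimal sketch of the "multiple parallel GET requests" case.
// BASE_URL and REQUEST_COUNT are illustrative values, not taken from the repo.
const BASE_URL = "http://localhost:8080/";
const REQUEST_COUNT = 1000;

async function benchmarkParallelGet() {
  const start = performance.now();

  // Fire every request at once and wait until all responses have arrived.
  await Promise.all(
    Array.from({ length: REQUEST_COUNT }, () => fetch(BASE_URL))
  );

  const elapsedMs = performance.now() - start;
  console.log(`${REQUEST_COUNT} parallel GETs took ${elapsedMs.toFixed(1)} ms`);
}

benchmarkParallelGet().catch(console.error);
```

On Node 18 or newer this runs as-is, since fetch and performance are global.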

Below are the results for both approaches.

Specific stress test

Result graphs: one request at a time (GET and POST), multiple parallel requests (GET and POST), multipart file upload, and 1.04 MB of JSON sent to the server for parsing.

Load testing

You can have a look at the config file to see how it works.

The TL;DR: the test ramps the number of simulated users from 0 to 50, holds there for a minute, then increases to 100, and so on in steps of 50 until it reaches 200; it then decreases in steps of 100 until it reaches 0.
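
As a rough sketch (not the repo's actual config file), a k6 test with that staged ramp could look like the following. Only the one-minute holds come from the description above; the ramp durations and the target URL are assumptions.

```js
// Hedged sketch of a k6 staged load test (not the repo's actual config).
// Users ramp 0 -> 50 -> 100 -> 150 -> 200 in steps of 50 with one-minute
// holds, then drop back down in steps of 100.
import http from "k6/http";

export const options = {
  stages: [
    { duration: "30s", target: 50 },
    { duration: "1m", target: 50 },
    { duration: "30s", target: 100 },
    { duration: "1m", target: 100 },
    { duration: "30s", target: 150 },
    { duration: "1m", target: 150 },
    { duration: "30s", target: 200 },
    { duration: "1m", target: 200 },
    { duration: "30s", target: 100 },
    { duration: "30s", target: 0 },
  ],
};

export default function () {
  http.get("http://localhost:8080/"); // target endpoint is an assumption
}
```

Such a script would be run with `k6 run loadtest.js` (filename assumed).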

In the graphs below, the red line represents the number of virtual (simulated) users and the blue line represents the average round-trip time in milliseconds, both on the Y-axis; the X-axis shows the elapsed test time.

Graphs (one per framework): Conduit (Dart), Dia (Dart), dart_frog (Dart), minerva (Dart), shelf (Dart), spry (Dart), Fiber (Go), expressjs (Node), and Flask (Python).
