I have written a multipurpose test server in Go.
I will use it to test various things around load balancing, proxying, and the like. This article is a short description of what it can do.
The server is a small, simple application with a total line count of 142, comments included:
➜ demoserver git:(master) cat *.go | wc -l
142
I am going to be doing some experiments with proxying and load balancing, and therefore I need something to run behind them.
Starting the server
The following flags can be used when running the server:
-http: set the IP address and port to listen on
-client: use this if the client identity is stored in a header
-id: the id/name of the server
Starting the server with no arguments makes it listen on 127.0.0.1:9001 (the port is over 9000), gives the server a random ID, and derives the client ID from the IP address and port combination of the request.
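Parsing those three flags in Go might look like the sketch below. The default listen address comes from the article; the flag-set name, the random-ID format, and the empty-string defaults are my assumptions:

```go
package main

import (
	"flag"
	"fmt"
	"math/rand"
)

// config mirrors the three flags described above.
type config struct {
	addr         string // -http: address and port to listen on
	clientHeader string // -client: header that carries the client identity
	id           string // -id: id/name of the server
}

// parseArgs parses the command-line flags; when -id is absent it
// falls back to a random ID, as the article describes.
func parseArgs(args []string) (config, error) {
	fs := flag.NewFlagSet("demoserver", flag.ContinueOnError)
	addr := fs.String("http", "127.0.0.1:9001", "IP address and port to listen on")
	client := fs.String("client", "", "header that carries the client identity")
	id := fs.String("id", "", "id/name of the server")
	if err := fs.Parse(args); err != nil {
		return config{}, err
	}
	if *id == "" {
		// no -id given: pick a random one (format is an assumption)
		*id = fmt.Sprintf("server-%04d", rand.Intn(10000))
	}
	return config{addr: *addr, clientHeader: *client, id: *id}, nil
}

func main() {
	cfg, _ := parseArgs(nil)
	fmt.Println(cfg.addr) // 127.0.0.1:9001
}
```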
All the endpoints return an X-Server-Id header that contains the ID of the server. All the endpoints only accept
The base endpoint returns a friendly message containing the ID of the server and the ID of the client.
This endpoint returns a chunk of random data of the given size. The maximum size is 10 MB; if the size is invalid, zero, negative, or missing, the maximum chunk is returned.
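The size-clamping described above could be sketched like this; the exact byte count of the 10 MB cap and the handler name are assumptions:

```go
package main

import (
	"fmt"
	"math/rand"
	"net/http"
	"net/http/httptest"
	"strconv"
)

// maxChunk is the 10 MB cap from the article; the exact byte count
// used by the real server is an assumption.
const maxChunk = 10 * 1000 * 1000

// chunk clamps the requested size: invalid, zero, negative, missing,
// or too-large sizes all fall back to the maximum chunk.
func chunk(sizeParam string) []byte {
	size, err := strconv.Atoi(sizeParam)
	if err != nil || size <= 0 || size > maxChunk {
		size = maxChunk
	}
	buf := make([]byte, size)
	rand.Read(buf) // fill with random bytes
	return buf
}

// dataChunk serves the random data over HTTP.
func dataChunk(w http.ResponseWriter, r *http.Request) {
	w.Write(chunk(r.URL.Query().Get("size")))
}

func main() {
	rec := httptest.NewRecorder()
	dataChunk(rec, httptest.NewRequest("GET", "/data_chunk?size=1024", nil))
	fmt.Println(rec.Body.Len()) // 1024
}
```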
This endpoint adds a random delay of up to 1000 ms to the request.
This endpoint adds a delay of the given time (in milliseconds) to the request.
I ran some small tests with wrk (GitHub) to see if everything was running as expected. First, the base endpoint:
➜ ~ wrk http://127.0.0.1:9001
Running 10s test @ http://127.0.0.1:9001
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency   383.83us    1.45ms  44.28ms   97.91%
    Req/Sec    19.28k     5.25k   36.64k    68.81%
  387209 requests in 10.10s, 87.52MB read
Requests/sec:  38334.61
Transfer/sec:      8.66MB
Secondly a fixed delay of one second:
➜ ~ wrk http://127.0.0.1:9001/delay\?time\=1000
Running 10s test @ http://127.0.0.1:9001/delay?time=1000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     1.00s     1.95ms    1.01s   52.00%
    Req/Sec     4.20      0.41      5.00    80.00%
  100 requests in 10.09s, 15.92KB read
Requests/sec:      9.91
Transfer/sec:      1.58KB
And finally with some data (fixed size 2 MB):
➜ ~ wrk http://127.0.0.1:9001/data_chunk\?size\=2000000
Running 10s test @ http://127.0.0.1:9001/data_chunk?size=2000000
  2 threads and 10 connections
  Thread Stats   Avg      Stdev     Max   +/- Stdev
    Latency     5.87ms    2.14ms   17.78ms   68.89%
    Req/Sec   590.24     59.19    760.00     67.68%
  11824 requests in 10.08s, 22.03GB read
Requests/sec:   1172.44
Transfer/sec:      2.18GB
It seems everything is running smoothly.
Check out the source code of the server in the GitHub repository.