HTTP 2.0 will (most likely) introduce compression for HTTP headers. To help the Internet community evaluate the potential benefits of HTTP 2.0 and its compression algorithm, this website offers a benchmark for HTTP 2.0 header compression.
The benchmark was generated by monitoring actual HTTP traffic of real-world users. However, the public data set was anonymized to ensure that no private information is exposed. At the same time, care was taken to ensure that characteristics of the original traffic were preserved wherever possible.
The benchmark consists of sequences of HTTP headers that are likely to be carried within the same HTTP 2.0 TCP stream and thus might be compressed together. While the benchmark therefore identifies series of requests, it does not identify the sources or targets of those requests.
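To illustrate why grouping the headers of one stream matters, the following sketch compares compressing each header block on its own with compressing the whole sequence under one shared compression context. It uses zlib purely as a stand-in for whatever algorithm HTTP 2.0 ultimately adopts, and the header blocks are made-up examples, not taken from the benchmark.

```python
import zlib

# Hypothetical header blocks of the kind that might share one TCP stream:
# successive requests from the same client repeat most header fields.
headers = [
    b"GET /index.html HTTP/1.1\r\nHost: example.com\r\nUser-Agent: demo\r\n\r\n",
    b"GET /style.css HTTP/1.1\r\nHost: example.com\r\nUser-Agent: demo\r\n\r\n",
    b"GET /logo.png HTTP/1.1\r\nHost: example.com\r\nUser-Agent: demo\r\n\r\n",
]

# Baseline: compress every header block independently (no shared context).
individual = sum(len(zlib.compress(h)) for h in headers)

# Shared context: one compressor for the whole sequence, with a sync
# flush after each block so it could be transmitted immediately.
c = zlib.compressobj()
shared = 0
for h in headers:
    shared += len(c.compress(h)) + len(c.flush(zlib.Z_SYNC_FLUSH))

print(individual, shared)
```

Because later header blocks are near-duplicates of earlier ones, the shared-context total comes out considerably smaller, which is exactly the redundancy a per-stream header compressor can exploit.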
This website offers five data sets with one million HTTP headers (and trace information) in the form of SQLite v3 databases, as well as access to the source code that was used to generate the traces, to perform statistical evaluations, and to run (simplistic) compression benchmarks.
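A data set can be explored with Python's built-in sqlite3 module. The sketch below is self-contained, so the file name, table name, and columns are stand-ins, not the benchmark's actual schema; with a downloaded data set you would replace `:memory:` with its path, skip the stand-in setup, and consult `sqlite_master` for the real table layout first.

```python
import sqlite3

# Replace ":memory:" with the path to a downloaded data set file.
con = sqlite3.connect(":memory:")

# Stand-in schema so this sketch runs on its own; the real database
# defines its own tables, which sqlite_master will reveal.
con.execute("CREATE TABLE headers (trace_id INTEGER, header TEXT)")
con.executemany(
    "INSERT INTO headers VALUES (?, ?)",
    [(1, "Host: example.com"), (1, "Accept: */*"), (2, "Host: example.org")],
)

# Step 1: discover which tables the database actually contains.
tables = [row[0] for row in con.execute(
    "SELECT name FROM sqlite_master WHERE type='table'")]
print(tables)

# Step 2: read headers grouped by trace, so each sequence that would
# share a stream can be fed to a compressor together.
rows = con.execute(
    "SELECT trace_id, header FROM headers ORDER BY trace_id").fetchall()
print(rows)
```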
A technical report detailing our methodology and summarizing the benchmark's properties and initial experimental results is available here.
Please note that using our benchmark is subject to the condition that you cite our technical report as the source of the data where appropriate (that is, in technical documentation and scientific papers that build on results from the benchmark, but not in product manuals for your customers).
Data Sets: 1, 2, 3, 4, 5, all (tar)
Source Code: httpbenchmark.tgz (GPLv3)