I have a few more requirements. Essentially, every data point (result) that I provide will have a list of attributes. In my storage case, for example: connectivity, read/write, block size, random/sequential I/O, and so on. Version and build too, of course.
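For concreteness, here's a minimal sketch of what one such data point could look like. All field names here are hypothetical, just to illustrate the open-ended attribute set:

```python
# Hypothetical benchmark result: a measured value plus a free-form set of
# attributes describing how it was produced (names are illustrative only,
# not an actual schema).
result = {
    "value": 512.3,        # e.g. throughput in MB/s
    "version": "2.1.0",
    "build": 1043,
    "attributes": {
        "connectivity": "iscsi",
        "operation": "read",
        "block_size_kb": 64,
        "io_pattern": "sequential",
    },
}

print(result["attributes"]["block_size_kb"])  # prints 64
```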
I'd like to be able to have:
1. BI (business intelligence) reporting that flags any result that falls below a certain threshold (the requirements), is below the previous result, or is drastically different from it (Codespeed already has some of this in its trend view).
2. The ability to compare any data point with any other, with the same or different attributes. For example, read vs. write results for the same block size, or sequential vs. random across different block sizes. This is the main requirement I find lacking in Codespeed, as it has no notion of 'attributes'.
3. A flexible schema. Since I won't know all the attributes ahead of time (every release we add features, and we'd like to test with and without each new one), we might indeed need a NoSQL solution. We already use RethinkDB in another project and might reuse it here.
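To illustrate both the threshold check from point 1 and the attribute-based comparison from point 2, here's a minimal sketch in plain Python. It uses no Codespeed or RethinkDB API; the function names and schema are my own assumptions:

```python
# Sketch under an assumed schema: each result is a value plus free-form attributes.

def select(results, **attrs):
    """Results whose attributes contain all the given key/value pairs."""
    return [r for r in results
            if all(r["attributes"].get(k) == v for k, v in attrs.items())]

def below_threshold(results, threshold):
    """Results that fall below a required minimum (point 1)."""
    return [r for r in results if r["value"] < threshold]

def drastic_change(prev, curr, tolerance=0.10):
    """True if curr differs from prev by more than the given fraction (point 1, trend)."""
    return abs(curr - prev) / prev > tolerance

results = [
    {"value": 510.0, "attributes": {"operation": "read",  "block_size_kb": 64, "io_pattern": "sequential"}},
    {"value": 180.0, "attributes": {"operation": "write", "block_size_kb": 64, "io_pattern": "sequential"}},
    {"value": 95.0,  "attributes": {"operation": "read",  "block_size_kb": 4,  "io_pattern": "random"}},
]

# Point 2: read vs. write for the same block size.
reads = select(results, operation="read", block_size_kb=64)
writes = select(results, operation="write", block_size_kb=64)
print(reads[0]["value"], writes[0]["value"])  # prints: 510.0 180.0

# Point 1: flag anything below a required 100, or a >10% drop between runs.
print(len(below_threshold(results, 100)))  # prints: 1
print(drastic_change(510.0, 430.0))        # prints: True
```

The same filter-by-attributes pattern maps naturally onto a document store's query interface, which is why a schemaless backend fits the open-ended attribute set.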
I believe the 'attributes' idea could make this appealing for performance testing of anything, not just my tests. In fact, not only pure performance numbers: anything quantifiable that you'd want to trend and compare across runs.
I have no problem with Django; it's standard and extensible.
Let me know if the above makes sense, or I'll expand on my ideas.