Project dead? Alternatives?


Gábor Csárdi

May 9, 2014, 5:14:14 PM
to code...@googlegroups.com
Hi all,

I wish this wasn't true, but it seems that the project is dead. A pity, really.

Can anyone suggest some alternatives to Codespeed?

Thanks.
Gabor

Miquel Torres

May 10, 2014, 4:03:16 PM
to code...@googlegroups.com
Hi Gabor,

Well, it is true that it hasn't been very active lately. There are some feature wishes, but in general it just works, so they weren't a big enough itch for most.

Is there something in Codespeed that doesn't work for you? Or some improvement you would wish for?

Gábor Csárdi

May 10, 2014, 7:46:05 PM
to code...@googlegroups.com
Hi Miquel, thanks for the answer. No troubles so far; it is just generally better to use projects that are alive, so I wanted to know.

But it seems that Codespeed does not really have an alternative; at least I could not find any. So if I don't want to build my own stuff, Codespeed is kind of my only choice. I'll give it a try and see.

Best,
Gabor

Miquel Torres

May 11, 2014, 6:58:57 AM
to code...@googlegroups.com
I agree, I also consider how alive a project is when evaluating new software.

Generally, whenever a problem crops up in Codespeed it gets solved, so give it a try. Any feedback is welcome!

Yaniv Kaul

May 12, 2014, 11:42:38 AM
to code...@googlegroups.com
I have several bugs to report but, to be honest, we'll be developing our own performance dashboard. Partially, maybe, because we are abusing Codespeed for something it was not meant for in the first place.
We are testing the performance of a storage appliance and need to compare against a lot of data. Some of the things we are missing:
1. You can't store results for hundreds of different tests; some pages simply would not load.
2. You can't compare many items; only very basic comparison between data points is possible. I'd like to compare not only between platforms, but also between executables and between tests (again, perhaps something Codespeed was not meant for).
3. We've seen issues with uploading a lot of data (we split it into chunks these days). It might be because our Codespeed server is a VM; not sure.
4. There is no packaged distribution of Codespeed. An RPM, DEB, or any other package would make upgrading much easier.
5. No support for multiple results for the same build.
6. No notion of the 'user' who reported the results, which is important for us.

Again, I'm quite sure we are abusing Codespeed for things it was not meant for in the first place. I'd be happy to report the smaller issues I've encountered as well.
Y.

Gábor Csárdi

May 12, 2014, 11:50:53 AM
to code...@googlegroups.com
Hi Yaniv,

is the tool you are developing open source? I was actually also considering coding up something, with the objectives of:

1) handling more benchmarks,
2) being lightweight, i.e. no need for Django or another "big" web framework,
3) a lightweight DB, i.e. some non-structured database like CouchDB or probably MongoDB (see the sketch below),
4) a RESTful API,
5) a good-looking plotting library, potentially in JS, e.g. D3.
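
Something like this is the kind of document I'm thinking of (just a sketch; all field names are made up):

    import json

    # One benchmark result as a schemaless document; a store like
    # CouchDB or MongoDB would take this as-is, and a RESTful API
    # could accept it as a JSON POST body.
    result = {
        "benchmark": "float-ops",          # hypothetical benchmark name
        "project": "my-project",
        "commit": "a1b2c3d",
        "environment": "build-server-1",
        "value": 12.34,                    # the measured number
        "unit": "seconds",
        "timestamp": "2014-05-12T11:50:53Z",
    }

    print(json.dumps(result, indent=2))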

Gabor

Yaniv Kaul

May 12, 2014, 11:58:50 AM
to code...@googlegroups.com


On Monday, May 12, 2014 6:50:53 PM UTC+3, Gábor Csárdi wrote:
> Hi Yaniv,
>
> is the tool you are developing open source? I was actually also considering coding up something, with the objectives of [...]

I don't know yet. I hope so, but it depends on company policy and on who'll implement it.

I have a few more requirements. Essentially, every data point (result) that I'll provide will have a list of attributes. For example, in my storage case: connectivity, read/write, block size, random/sequential IO, and so on. Version and build too, of course.
I'd like to be able to have:
1. BI (business intelligence) to report any result that is below a certain threshold (the requirements) or below a previous result, or drastically different (Codespeed already has some of this, in the trend view).
2. The ability to compare any data point with any other, with the same or different attributes. For example, for the same block size, results for read vs. write; or, for different block sizes, sequential vs. random, etc. This is the main requirement I find lacking in Codespeed, as it doesn't have the notion of 'attributes' (sketched below).
3. Since I won't know ALL the attributes ahead of time (every release we add features, and we'd like to test with and without a new feature), we might indeed need to use a NoSQL solution. We use RethinkDB in another project; we might reuse it here as well.

I believe the idea of 'attributes' might make this appealing for testing the performance of anything, not just my tests (and in fact not only pure performance numbers, but anything you can quantify and would like to see trends for and compare between runs).
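
To make the 'attributes' idea concrete, here is a tiny plain-Python sketch (all names and numbers are invented, just for illustration; a real store like RethinkDB would do the querying):

    # Each result carries a free-form attribute dictionary.
    results = [
        {"build": 456, "value": 210.0,
         "attributes": {"block_size": "4k", "io": "random", "op": "read"}},
        {"build": 456, "value": 95.0,
         "attributes": {"block_size": "4k", "io": "random", "op": "write"}},
        {"build": 455, "value": 205.0,
         "attributes": {"block_size": "4k", "io": "random", "op": "read"}},
    ]

    def select(results, **attrs):
        """All results whose attributes match the given key/value pairs."""
        return [r for r in results
                if all(r["attributes"].get(k) == v for k, v in attrs.items())]

    # Point 2: compare read vs. write for the same block size.
    reads = select(results, block_size="4k", op="read")
    writes = select(results, block_size="4k", op="write")
    print(reads[0]["value"], "vs.", writes[0]["value"])

    # Point 1: flag results that fall below a required threshold.
    THRESHOLD = 208.0  # invented requirement
    for r in select(results, op="read"):
        if r["value"] < THRESHOLD:
            print("below threshold:", r)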

I don't have a problem with Django; it is standard and extensible.

Let me know if the above makes sense; if not, I'll expand on my ideas.
Y.

Gábor Csárdi

May 12, 2014, 12:13:54 PM
to code...@googlegroups.com
On Mon, May 12, 2014 at 11:58 AM, Yaniv Kaul <myk...@gmail.com> wrote:
[...]

It makes perfect sense, I think. Searching over arbitrary attributes is exactly why I don't want to go with SQL. RethinkDB seems perfect for the job.

As for Django, it is fairly standard, but I would detach the interface from the DB and implement the DB first, with a simple API to upload data and make queries or reports. Then the results can be formatted in any client language you like.
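
E.g. the whole upload/query API could start as small as this (a toy sketch using only the Python 3 standard library; everything is in-memory and all names are made up):

    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    RESULTS = []  # in-memory stand-in for the real database

    class ResultsAPI(BaseHTTPRequestHandler):
        def do_POST(self):  # upload one result as a JSON body
            length = int(self.headers["Content-Length"])
            RESULTS.append(json.loads(self.rfile.read(length).decode("utf-8")))
            self.send_response(201)
            self.end_headers()

        def do_GET(self):  # query: return all stored results as JSON
            body = json.dumps(RESULTS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), ResultsAPI).serve_forever()

Any front end (D3 plots etc.) could then just consume the JSON.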

This is probably getting a bit off-topic here. Let me know if you start doing something and it is open source, and I'll do the same.

Gabor