Ranking and indexing


Alexandre Miguel Pinto

Feb 22, 2017, 12:51:06 PM
to Computational Creativity Forum
We all do research because we hope to provide valuable scientific and technological contributions. However, in this bibliometrics age we live in, indicators and rankings are the basis of most decisions, whether for funding grants and projects or for hiring people. For this reason, the time we spend working on a paper on any given topic had better also have a quantifiable and recognisable impact. Ultimately, such impact is measured by metrics like the h5-index of the venue we publish in, or its (Australian) CORE ranking, among many other possibilities, including the number of citations.

I know that DBLP has only recently started indexing ICCC, but I want to open up the following discussion: do you think ICCC should be indexed by the Australian CORE? Should it have an h5-index recognised by Google Scholar? What can we do to foster such developments? How long will it take until ICCC has such recognition? And how might that wait hinder progress in Computational Creativity?
Let me remind you that the former "International Joint Workshop on Computational Creativity" (IJWCC) was ranked B in CORE (http://portal.core.edu.au/conf-ranks/?search=creativity&by=all&source=all&sort=atitle&page=1). The IJWCC no longer exists, and ICCC is its continuation; however, ICCC is not ranked in CORE.

I'd love to hear (read) your thoughts on this.

Alexandre

Joe Corneli

Feb 22, 2017, 4:42:31 PM
to Alexandre Miguel Pinto, Computational Creativity Forum
Hi Alexandre,

I've been applying for jobs and similar thoughts crossed my mind.

I guess one possible answer is that it would be good to be in CORE if
the rating was good. Conferences like IJCAI, NIPS, and CHI are all A*.
If someone cares a lot about ratings they will surely take this into
consideration when they think about where to submit a paper...

So yeah: if we want the conference to get a good rating (or be
equivalent to one that does) then there are fairly clear guidelines
published (goo.gl/DrNxnE). To me ICCC sounds like it matches the
description of an A-grade conference:

"An A Conference may be the center of an ecosystem, including
workshops and tutorials."

"Reviews of papers are generally undertaken by people who have
published in the area of the submitted work, and provide detailed and
extended feedback."

So perhaps at this point the question becomes procedural: what does it
take to get noticed by the people who run CORE?

Regarding the h5 index ("h5-index is the h-index for articles published
in the last 5 complete years. It is the largest number h such that h
articles published in 2011-2015 have at least h citations each.")... I
would say, let's avoid doing anything whatsoever related to this metric
*directly*. There was a big scandal related to citation hacking a few
years ago; you're probably aware of that.

https://blogs.unimelb.edu.au/sciencecommunication/2013/08/31/scandal-yelling-and-mess-the-outcome-of-hyping-factors/

I guess everyone is familiar with "Goodhart's law", even if not by that
name. Variously:

"When a measure becomes a target, it ceases to be a good measure."

"Any observed statistical regularity will tend to collapse once
pressure is placed upon it for control purposes."

@incollection{goodhart1984problems,
title={Problems of monetary management: the {UK} experience},
author={Goodhart, Charles AE},
booktitle={Monetary Theory and Practice},
pages={91--121},
year={1984},
publisher={Springer}
}

Naturally, we could do more for dissemination without creating a
scandal, but to my mind this is best approached from a holistic
standpoint.

Within computational creativity we can and do have quite a few more
interesting ways to measure "quality" than bibliometrics. If I do say
so myself, I put forth several interesting suggestions in this regard
last year:

@inproceedings{corneli2016institutional,
author={Corneli, Joseph},
title={An institutional approach to computational social creativity},
editor={Cardoso, Amilcar and Pachet, Fran\c{c}ois and Corruble, Vincent and Ghedini, Fiammetta},
booktitle={Proceedings of the Seventh International Conference on Computational Creativity, ICCC 2016},
year={2016},
url={http://www.computationalcreativity.net/iccc2016/wp-content/uploads/2016/06/paper_9.pdf},
abstract={Modelling the creativity that takes place in social settings presents a range of theoretical challenges. Mel Rhodes's classic "4Ps" of creativity, the "Person, Process, Product, and Press," offer an initial typology. Here, Rhodes's ideas are connected with Elinor Ostrom's work on the analysis of economic governance to generate several "creativity design principles." These principles frame a survey of the shared concepts that structure the contexts that support creative work. The concepts are connected to the idea of computational "tests" to foreground the relationship with standard computing practice, and to draw out specific recommendations for the further development of computational creativity culture.}
}

Naturally, I'd be interested to see more discussion of this stuff!

Another point to emphasise here: we should be concerned about social
value, not just academic profiles. E.g., for a nice juicy metric,
consider how much money "superstar" AI programmers are making these days
in the field of machine learning...!

Joe

Alexandre Miguel Pinto

Feb 23, 2017, 3:28:02 AM
to Computational Creativity Forum, alexand...@gmail.com
Hi Joe,

I definitely agree with you that ICCC matches the description of an A-grade conference, and I believe you asked the right question:
"what does it take to get noticed by the people who run CORE?"

Regarding the h5-index, it never even crossed my mind to *hack* this metric or to do anything about it *directly*; but since ICCC has been running since 2010, it has existed for more than 5 complete years. So why, oh why, doesn't it have an h5-index yet? Aren't ICCC papers being cited at all?

Regarding the use, and abuse, of metrics, I couldn't agree more: I think employers and funding organisations are far too focused on these bibliometrics, which basically quantify the popularity of research rather than its scientific quality (that is a whole other discussion, and I'll add my 20 cents on it shortly), but these are the mechanisms people are using today, so we must "dance to the tune".

Regarding the metrics themselves: metrics such as the h-index and the h5-index are built on top of a single one, the number of citations. The problem with the number of citations, and with the metrics derived from it, is that they are highly biased towards currently popular and trendy fields, which actually discourages work on radically novel ideas. If anything, we want scientific publications to be rated by their quality, not their popularity. But how do we measure quality? The answer, I think, is much simpler than it may seem: the reviewers already do that! That is precisely the role of reviewers: to evaluate the quality of a paper; so the classification given by the reviewers, after curation by the editor, should be the fundamental metric, and not the number of citations (which could be added to the mix later on, e.g., by a linear combination).
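
Just to make that last idea concrete, here is a minimal sketch of such a linear combination; the weight alpha, the 0-10 review scale, and the log-scaling of citation counts are purely illustrative assumptions on my part, not an existing metric:

import math

def paper_score(review_scores, citations, alpha=0.7):
    # Combine curated reviewer scores (assumed 0-10 scale) with citations.
    # alpha weighs quality (reviews) against popularity (citations);
    # citations are log-scaled so a few viral papers do not dominate.
    review_part = sum(review_scores) / len(review_scores) / 10.0            # normalise to [0, 1]
    citation_part = min(math.log1p(citations) / math.log1p(1000), 1.0)      # saturates around 1000 citations
    return alpha * review_part + (1 - alpha) * citation_part

# e.g. a paper with reviews [7, 8, 6] and 12 citations:
# paper_score([7, 8, 6], 12)  ->  about 0.60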

I think this discussion has only just started, and I'd really like to know what more people think and how we could move forward on this.

Alex

Joe Corneli

Feb 23, 2017, 5:03:28 AM
to Alexandre Miguel Pinto, Computational Creativity Forum
It would be a bit tedious but it would of course be possible to tally up
the h5-index "by hand" (I mean, obviously the number exists, we would
just have to pin it down!).
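
For concreteness, a tally along these lines would do it, assuming we had the per-paper citation counts for everything ICCC published in 2011-2015 (the numbers in the example are made up):

def h5_index(citation_counts):
    # h5-index: the largest h such that h papers from the last 5
    # complete years have at least h citations each.
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

# made-up citation counts for papers published 2011-2015:
# h5_index([25, 14, 9, 7, 7, 3, 1, 0]) == 5   (five papers with at least 5 citations each)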

I'm not entirely sure where you're looking on Google Scholar, but e.g.,
looking at
https://scholar.google.com/citations?view_op=top_venues&hl=en&vq=eng_artificialintelligence
there's only a list of the *top 20* publications in AI.

It seems safe to say that either ICCC isn't a top-20 publication
(likely, in my opinion), [and/]or it's not being identified as a
"publication" at all by Google's crawler.

The papers themselves are being identified; there are technical tips for
webmasters here:
https://scholar.google.com/intl/en/scholar/inclusion.html#indexing
which might improve the way they are indexed.
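
If I remember those guidelines right, the core of the advice is to serve one HTML page per paper carrying Highwire-Press-style meta tags (citation_title, citation_author, and so on). Here is a small sketch of what generating them could look like; the helper name and the paper details are placeholders of mine, not a real ICCC entry:

def scholar_meta_tags(title, authors, year, pdf_url):
    # Emit Google-Scholar-friendly (Highwire Press) meta tags for one paper.
    tags = ['<meta name="citation_title" content="%s">' % title]
    tags += ['<meta name="citation_author" content="%s">' % a for a in authors]
    tags.append('<meta name="citation_publication_date" content="%s">' % year)
    tags.append('<meta name="citation_pdf_url" content="%s">' % pdf_url)
    return "\n".join(tags)

print(scholar_meta_tags("A Placeholder Paper Title",
                        ["Doe, Jane", "Roe, Richard"],
                        2016,
                        "http://example.org/paper.pdf"))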

On Thu, Feb 23 2017, Alexandre Miguel Pinto wrote:

> If anything, we want scientific publications to be rated by their
> quality, not their popularity. But how do we measure quality? The
> answer, I think, is much simpler than it may seem: the reviewers
> already do that! That is precisely the role of reviewers: to evaluate
> the quality of a paper; so the classification given by the reviewers,
> after curation by the editor, should be the fundamental metric, and
> not the number of citations (which could be added to the mix later
> on, e.g., by a linear combination).

Good points - program chairs should try to make sure that data is
all archived somewhere.

Anna Jordanous did some interesting related research using public data
(see esp. Section 3.2 of this):

@inproceedings{jordanous2016longer,
title={The longer term value of creativity judgements in computational creativity},
author={Jordanous, Anna},
booktitle={AISB Symposium on Computational Creativity (CC2016). Sheffield, UK: AISB},
pages={16--23},
year={2016}
}

diarmuid.odonoghue

Mar 9, 2017, 12:03:04 PM
to Computational Creativity Forum

Hi Alexandre and Joe,

 

I have been looking into additional avenues for getting greater recognition for the ICCC Conference series.

In particular, I have been looking at the requirements for inclusion in the Scopus database, whose list is also used by others (e.g. SJR).

Scopus appears to already have ICCC 2011, but not the others.

It also contains a few of the previous IJWCC workshops.

 

It looks like an ISSN number and possibly a formal ethics and malpractice statement are the only remaining issues (see below).

 

From http://suggestor.step.scopus.com/suggestTitle/step1.cfm

“Titles will only be considered for evaluation if they meet the following minimum criteria:

►The title should publish peer reviewed content.
►The title should be published on a regular basis (i.e. have an ISSN that has been confirmed by the ISSN International Centre). To register an ISSN, please visit this page.
►The title should have English language abstracts and article titles.
►The title should have references in Roman script.
►The title should have a publication ethics and publication malpractice statement.”


Regards,
Diarmuid.

hannu.toivonen

Mar 10, 2017, 3:25:27 AM
to Computational Creativity Forum

Good points. Getting listed in Scopus would be a great boost to the visibility of ICCC!

BTW, ICCC 2016 has in the meantime been indexed and is available in DBLP; see http://dblp2.uni-trier.de/db/conf/icccrea/

Cheers,

Hannu

Amílcar Cardoso

Mar 10, 2017, 4:03:02 AM
to hannu.toivonen, Computational Creativity Forum
The indexing in DBLP is great news, indeed.

Regarding Scopus, I suggest that the steering committee and the board of the ACC take this process in hand (one more!). The indexing would be very important for the conference. Apparently, we already meet almost all the conditions that Diarmuid mentions.

Cheers,
Amilcar


