danah boyd, of Microsoft Research and NYU, on Monday posted a much-circulated question on Twitter, asking “how many academics have a personal commitment to #openaccess” — that is, free public online access to published research. Conversation ensued:
How many academics out there have a personal commitment to #openaccess? Ex: would you join the board of a closed journal? cc: @ginasue
— danah boyd (@zephoria) August 5, 2013
@zephoria @ginasue I'm signed on to http://t.co/41Zw3YYxu2 pledge as won't publish, won't referee, won't do editorial work. Kept up so far.
— david a banks (@DA_Banks) August 5, 2013
@DA_Banks @zephoria That would cut out pretty much every top tier communication and sociology journal with very few exceptions.
— Gina Neff (@ginasue) August 5, 2013
@ginasue @DA_Banks @zephoria from public use standpoint, why need your work go into a legacy prestige journal, vs free #OA jrnl/repository?
— Tim McCormick (@tmccormick) August 5, 2013
@tmccormick @DA_Banks @zephoria There's only two metrics (research) academia cares about for scholarship: prestige and prestige.
— Gina Neff (@ginasue) August 6, 2013
@ginasue @DA_Banks @zephoria "(research) academia cares only for prestige": quite. How might we realign this dynamic with public interest?
— Tim McCormick (@tmccormick) August 6, 2013
@tmccormick @DA_Banks Absolutely! Let's work to change that!
— Gina Neff (@ginasue) August 6, 2013
@tmccormick @ginasue @DA_Banks @zephoria I do wonder sometimes abt how to change this, if prestige + closed will always be synonymous.
— JessieNYC (@JessieNYC) August 6, 2013
@JessieNYC @ginasue @DA_Banks @zephoria the wonder is they ever were, given the conflict w/scholarly, knowledge, & public values.
— Tim McCormick (@tmccormick) August 6, 2013
@DA_Banks @ginasue @tmccormick @zephoria … need academia to see "public scholarship" as tenure worthy. Public *versions* of research > OA
— Chanders (@Chanders) August 6, 2013
to encourage open scholarship, an Open Scholar/Researcher Index, 0-100, calculated from publication record? @Chanders @DA_Banks @ginasue
— Tim McCormick (@tmccormick) August 6, 2013
@tmccormick @DA_Banks @ginasue I very much like that idea …
— Chanders (@Chanders) August 6, 2013
@Chanders @DA_Banks @ginasue everyone loves lists & rankings! or possibly, might fear them in this case.. ;)
— Tim McCormick (@tmccormick) August 6, 2013
@Chanders @DA_Banks @ginasue I imagine #OSRI as a transparent calculator using std inputs, e.g. @ORCID_Org & how (soon) pubs avail publicly
— Tim McCormick (@tmccormick) August 6, 2013
ORCID is an emerging standard identifier for scientists and researchers, designed to make it easier to identify and track research work. Google Scholar, Elsevier, and others maintain their own forms of researcher ID, but ORCID is more of an open, international standards effort.
There are many issues and nuances to consider in such a rating, but I loosely imagined something like this:
- A transparent, well-defined formula (or set of defined variant formulae) which can run on standard inputs such as ORCIDs and DOIs (Digital Object Identifiers).
- For any given scholarly work (say, an article, chapter, or book), a quantitative score would be assigned based on, e.g., the degree, manner, and timeline by which the work was made publicly accessible.

Note, this would be based on time from original publication, not whether the work is publicly accessible now. So an article published in 2005, now available after a 2-year embargo, counts as a 2-year public-access delay.
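To make the idea concrete, here is a minimal sketch of how such a calculation might look. Everything in it — the field names, the 0–100 linear decay, the 5-year cutoff, and the simple averaging — is an illustrative assumption of mine, not part of any specified OSR or OAIndex formula:

```python
from dataclasses import dataclass
from datetime import date
from typing import Optional, List

@dataclass
class Work:
    doi: str
    published: date                  # original publication date
    public_since: Optional[date]     # when the work became publicly accessible (None = never)

def work_score(work: Work, max_delay_years: float = 5.0) -> float:
    """Score one work 0-100 by how quickly it became public.

    Immediate public access scores 100; the score decays linearly with the
    embargo/delay, reaching 0 at max_delay_years. Works never made public
    score 0. (The linear decay and cutoff are illustrative choices.)
    """
    if work.public_since is None:
        return 0.0
    delay_years = max((work.public_since - work.published).days / 365.25, 0.0)
    return max(0.0, 100.0 * (1.0 - delay_years / max_delay_years))

def open_scholar_rating(works: List[Work]) -> float:
    """Average per-work scores into a 0-100 researcher-level index."""
    if not works:
        return 0.0
    return sum(work_score(w) for w in works) / len(works)

# Example: one immediately open article, plus the 2-year-embargo case
# mentioned above (published 2005, public after a 2-year delay).
works = [
    Work("10.1234/a", date(2010, 1, 1), date(2010, 1, 1)),
    Work("10.1234/b", date(2005, 1, 1), date(2007, 1, 1)),
]
print(round(open_scholar_rating(works), 1))
```

In practice the inputs would come from ORCID and DOI metadata rather than hand-entered records, and the formula would need to handle nuances like green vs. gold routes, preprints, and partial accessibility — the point here is only that the calculation can be transparent and reproducible.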
I then heard from Theo Andrew, Open Scholarship Development Officer at the University of Edinburgh, who pointed to his and his colleagues’ very similar 2012 project, the Open Access Index, aka #OAIndex.
@tmccormick Something like an open index perhaps? http://t.co/thvF4aUOeZ @altmetric
— Theo (@earthowned) August 7, 2013
This was funded by a startup-style mini-grant from the UK’s Jisc Elevator programme. The team cites as a precursor a May 2012 presentation on “Metrics for Openness,” given by David Nichols of the University of Waikato (NZ) at the University of Illinois’s Graduate School of Library and Information Science.
The OAIndex team produced a nice 3-minute video explaining their project idea:
While the Open Access Index is broadly similar to what I had in mind with the Open Scholar Rating (OSR), there are a few differences I’d suggest:
1) OSR avoids being tied specifically to “Open Access,” which carries quite specific and contested meanings in various contexts. It might be useful to have an OSR rating, or variant ratings, with different criteria, such as the availability of lay/public/educational research summaries or popular-media dissemination, or that apply in contexts where no Open Access technically exists.
2) “Open Scholar Rating” emphasizes that it describes a person’s body of work, not just any collection. While I can see it might be useful to have an index describing any collection, e.g. a department’s or university’s research output, I am particularly interested in OSR as an evaluative and incentivizing measure to help individual scholars embrace more open scholarly practices.
What do you think? Is this a useful or feasible idea? What are the (fatal?) problems, and how might it be done? Please feel free to comment using the Disqus box below (log in with Twitter, Facebook, or Google+, or use/create a Disqus account), or on Twitter with a link to the post and/or me @tmccormick and/or the hashtag #oaindex; or email me at tmccormick (at) gmail.com.