Please follow this easily memorable link: http://www.bu.edu/openaccess/
Assignment of publishing rights
I hereby assign to <Copyright owner> the copyright in the manuscript identified above (government authors not electing to transfer agree to assign a non-exclusive licence) and any supplemental tables, illustrations or other information submitted therewith that are intended for publication as part of or as a supplement to the manuscript (the “Article”) in all forms and media (whether now known or hereafter developed), throughout the world, in all languages, for the full term of copyright, effective when and if the article is accepted for publication. This transfer includes the right to provide the Article in electronic and online forms and systems. No revisions, additional terms or addenda to this Agreement can be accepted without our express written consent. Authors at institutions that place restrictions on copyright assignments, including those that do so due to policies about local institutional repositories, are encouraged to obtain a waiver from those institutions so that the author can accept our publishing agreement. (Emphasis mine.)
So, first they imply that institutions with institutional repositories restrict their faculty’s publishing opportunities by placing “restrictions on copyright assignments.” Not true: most institutions aim to educate their faculty about copyright and to make sure their researchers don’t sign away all rights in perpetuity without knowing exactly what they’re doing. It’s understandable that Elsevier wouldn’t like this, since they want exclusive copyright on work they didn’t perform (though, to be fair, they are publishing it).
Then Elsevier encourages authors to opt out of an enterprise that is proving to be a significant boon to academics (first and foremost by giving their work visibility), implying that opting out is required before authors can accept Elsevier’s apparently immutable publishing agreement. No contract is immutable before it is signed, but the language here strongly suggests otherwise, counting on most people to go along with it, either because they are unaware or because they want to publish and don’t have time to pursue the matter with Elsevier.
It’s true that the very next paragraph, and its continuation later in the document, have different implications:
Retention of Rights for Scholarly Purposes (see Definitions below)
I understand that I retain or am hereby granted (without the need to obtain further permission) rights to use certain versions of the Article for certain scholarly purposes, as described and defined below (“Retained Rights”), and that no rights in patents, trademarks or other intellectual property rights are transferred to the journal.
The Retained Rights include the right to use the Pre-print or Accepted Authors Manuscript for Personal Use, Internal Institutional Use and for Scholarly Posting; and the Published Journal Article for Personal Use and Internal Institutional Use. […]
[definition of scholarly posting] Voluntary posting by an author on open Web sites operated by the author or the author’s institution for scholarly purposes, or (in connection with Pre-prints) pre-print servers, provided there is no Commercial Purpose involved. Deposit in or posting to Special Repositories (such as PubMed Central) is permitted only under specific agreements between Elsevier and the repository and only consistent with Elsevier’s policies concerning such repositories. If the author wishes to refer to the journal in connection with such posting, the Appropriate Bibliographic Citation should be used.
Further confusion: a scholar may post pre-prints to websites that fit the definition quoted above, which would seem to include institutional repositories. Except that Elsevier mentions repositories twice, both times in a permission-denied context; the second instance is the Special Repositories clause, such as PubMed Central.
Seems like language designed to mislead and bully, to me. Elsevier, would you please clarify?
The article, from the Public Library of Science, is this: “Clickstream Data Yields High-Resolution Maps of Science.” The authors collected “nearly 1 billion user interactions recorded by the scholarly web portals of some of the most significant publishers, aggregators and institutional consortia,” says the abstract. They proceeded to create maps that illustrate citations in the articles with which the users interacted. These maps “provide a detailed, contemporary view of scientific activity and correct the underrepresentation of the social sciences and humanities that is commonly found in citation data.” The most interesting illustration in this context is Figure 5—check out that big white and yellow cluster in the center. It’s worth the load time to view the larger image.
The CFP is for the next annual meeting of the Text Encoding Initiative Consortium. This year’s theme is text encoding in the era of mass digitization. The first three suggested topics are conceptually larger than TEI, and are intriguing: In-depth encoding vs. mass digitization; Is text encoding sustainable?; Is text encoding scalable? People are bound to talk about crowdsourcing metadata, which I think is the only hope we have of scaling semantic encoding. (The quality control issues, which are the first concern that usually arises when people talk about collaborative knowledge work, are real. But there are ways to deal with them, and data that can be corrected may well be better than no data at all.)
The site I came across today is FairShare. It allows people to track how their online publications are used and/or remixed. Haven’t played with it yet, but it looks promising, particularly in the context of an institutional repository. Imagine a researcher depositing an article, pointing FairShare at it and seeing others respond to her work. Just the psychological boost from that is valuable in spurring future work.
I’m not the only one blogging my day today. Happy DoDH!
Two books have appeared recently that may be of interest. One is Terry Harpold’s Ex-foliations: Reading Machines and the Upgrade Path. (University of Minnesota Press 2008, $25 paper, $75 cloth.)
A sophisticated consideration of technologies of reading in the digital age.
“Every reading is, strictly speaking, unrepeatable; something in it, of it, will vary. Recollections of reading accumulate in relation to this iterable specificity; each takes its predecessors as its foundation, each inflects them with its backward-looking futurity.” In Ex-foliations, Terry Harpold investigates paradoxes of reading’s backward glances in the theory and literature of the digital field.
The other book that has come to my attention, thanks to Grand Text Auto, is Ted Nelson’s Geeks Bearing Gifts: How the Computer World Got This Way. Nelson, one of the pioneering scholars of humanities computing and new media, has not published anything book-length in about a decade, it seems. This one is self-published, available through Lulu.com for $19.95, and runs 202 pages. There’s no short blurb about it, but the chapter summaries are available here. Nelson knows his way around words; I’ll be getting my own copy of this one.
And speaking of the computer age and its history, check out this video of Douglas Engelbart demoing the mouse (and hypertext, and an online collaboration system) for the first time. In 1968.
And who could pass up a free, downloadable (as a PDF file) Atlas of Cyberspace, by Martin Dodge and Rob Kitchin? Licensed under Creative Commons, no less; these guys did it right. Gorgeous illustrations and interesting exposition.