Digital Publishing Category


Citation controversy: does online access change citation practices?

by Philip Davis, The Scholarly Kitchen, Oct. 13, 2008

Excerpt:

Earlier this year, Davis reported on a study by sociologist James Evans suggesting that online access to scientific journals is leading to more recent citations and a narrowing of the diversity of the articles cited.

This study was not taken at face value, and three information scientists (Vincent Larivière, Yves Gingras, and Éric Archambault), all at the University of Quebec in Montreal, have released a new analysis taking aim at the diversity claim.

Their manuscript, “The decline in the concentration of citations, 1900-2007,” deposited September 30th in the arXiv, uses a simpler methodology. They report the percentage of papers that received at least one citation, the percentage of papers needed to account for 20%, 50%, and 80% of total citations, and the Herfindahl-Hirschman index, a measure used to estimate market concentration.

. . . What makes this controversy interesting is that both studies make theoretical sense.  A narrowing of science conforms to attention economics and preferential attachment (why the cited get more citations and the rest get ignored); a broadening of science conforms to information foraging theory, the principle of least effort, and the increasing ease of retrieving relevant articles.  The results of both studies imply something different about the state of science, whether scientific information is being disseminated efficiently, and whether the literature is reflecting more diversity of opinion or more conformity.
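As a rough illustration of the measures described in the excerpt, here is a minimal sketch in Python. It is not code from either study; the function name concentration_stats and the sample citation counts are hypothetical. Given a list of per-paper citation counts, it computes the share of papers cited at least once, the share of top-cited papers needed to account for 20%, 50%, and 80% of all citations, and the Herfindahl-Hirschman index (the sum of squared citation shares).

    # Minimal sketch (assumptions noted above), not the authors' code.
    def concentration_stats(citations):
        total = sum(citations)
        n = len(citations)

        # Share of papers that received at least one citation.
        cited_share = sum(1 for c in citations if c > 0) / n

        # Share of top-cited papers needed to account for 20%, 50%, 80% of citations.
        ranked = sorted(citations, reverse=True)
        thresholds = [0.2, 0.5, 0.8]
        cutoffs = {}
        running = 0
        for i, c in enumerate(ranked, start=1):
            running += c
            while thresholds and running >= thresholds[0] * total:
                cutoffs[thresholds.pop(0)] = i / n

        # Herfindahl-Hirschman index: sum of squared citation shares.
        hhi = sum((c / total) ** 2 for c in citations)

        return cited_share, cutoffs, hhi

    # Hypothetical, highly skewed citation distribution for illustration.
    print(concentration_stats([50, 10, 5, 2, 1, 0, 0, 0]))

On a skewed distribution like the sample, a small share of papers accounts for most citations and the index is high; a broadening of citation practice would show up as flatter cutoffs and a lower index.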

Read the entire post at: http://scholarlykitchen.sspnet.org/2008/10/13/citation-controversy/

Read more commentary on the topic at:

Great minds think (too much) alike. The Economist, July 19, 2008, Vol. 387, Issue 8589, p. 89 (available to UI affiliates only)


Christian Science Monitor to Publish Online Only

After a century of continuous publication, The Christian Science Monitor will abandon its weekday print edition and appear online only, its publisher announced Tuesday. The cost-cutting measure makes The Monitor the first national newspaper to largely give up on print.

The paper is currently published Monday through Friday, and will move to online only in April, although it will also introduce a weekend magazine. John Yemma, The Monitor’s editor, said that moving to a Web focus will mean it can keep its eight foreign bureaus open.

“We have the luxury — the opportunity — of making a leap that most newspapers will have to make in the next five years,” Mr. Yemma said.

The Monitor is an anomaly in journalism, a nonprofit financed by a church and delivered through the mail. But with seven Pulitzer Prizes and a reputation for thoughtful writing and strong international coverage, it long maintained an outsize influence in the publishing world, an influence that has declined as its circulation slipped to 52,000 from a high of more than 220,000 in 1970.

Read more…

New York Times, Oct. 28, 2008


U. of Michigan Places 1 Millionth Scanned Book Online

The University of Michigan has reached the 1-million-book milestone in its digitization program. That figure represents around 13% of the 7.5 million books in the library’s collections. The books are available via the library’s catalog or via Google Book Search, as part of the Michigan Digitization Project.

Most of the scanning has been done as part of the library’s controversial deal with Google. The search giant is working with dozens of major libraries around the world to scan the full text of books to add to its index. But Michigan is one of the few institutions to agree to scan every one of its holdings — even those that are still covered by copyright. Some publishers have sued Google for copyright infringement over the scanning effort, though officials from Google say their effort is legal because they are not making the full text of copyrighted books available to the public.

The Wired Campus News Blog, Feb. 4, 2008
and Open Access News, Feb. 4, 2008


Blue Ribbon Task Force on Sustainable Digital Preservation and Access

Fran Berman, director of the San Diego Supercomputer Center, and Brian Lavoie, a research scientist at OCLC, have been named co-chairs of a Blue Ribbon Task Force on Sustainable Digital Preservation and Access, which is being funded by the National Science Foundation and the Andrew W. Mellon Foundation. The Library of Congress, the National Archives and Records Administration, the Council on Library and Information Resources, and JISC will also be involved in the task force.

Here’s an excerpt from the press release:

Berman and co-chair Brian Lavoie . . . will convene an international group of prominent leaders to develop actionable recommendations on economic sustainability of digital information for the science and engineering, cultural heritage, academic, public, and private sectors. The Task Force is expected to meet over the next two years and gather testimony from a broad set of thought leaders in preparation for the Task Force’s Final Report. . . .

The Task Force will bring together a group of national and international leaders who will focus attention on this critical grand challenge of the Information Age. Task Force members will represent a cross-section of fields and disciplines including information and computer sciences, economics, entertainment, library and archival sciences, government, and business. Over the next two years, the Task Force will convene a broad set of international experts from the academic, public and private sectors who will participate in quarterly panels and discussions. . . .

In its final report, the Task Force is charged with developing a comprehensive analysis of current issues, and actionable recommendations for the future to catalyze the development of sustainable resource strategies for the reliable preservation of digital information. During its tenure, the Task Force also will produce a series of articles about the challenges and opportunities of digital information preservation, for both the scholarly community and the public.

from DigitalKoans, September 25, 2007


Committee on Institutional Cooperation (CIC) Joins Google’s Library Project

The number of libraries participating in the Google Book Search Library Project just got a whole lot bigger with today’s addition of the Committee on Institutional Cooperation (CIC). The CIC is a national consortium of 12 research universities, comprising the University of Chicago, University of Illinois, Indiana University, University of Iowa, University of Michigan, Michigan State University, University of Minnesota, Northwestern University, Ohio State University, Pennsylvania State University, Purdue University, and the University of Wisconsin-Madison. Google will work with the CIC to digitize select collections across all its libraries, up to 10 million volumes.

Readers will have access to many distinctive and unique collections held by the consortium. Users will be able to explore collections that are global in scope, like Northwestern’s Africana collection, or dive deep into the universities’ unique Midwest heritage, including the University of Minnesota’s Scandinavian and forestry collections, Michigan State’s extensive holdings in agriculture, Indiana University’s folklore collection, and the history and culture of Chicago collection at the University of Illinois-Chicago.

Google will provide the CIC with a digital copy of the public domain materials digitized for this project. With these files, the consortium will create a first-of-its-kind shared digital repository of these works held across the CIC libraries. Both readers and libraries will benefit from this group effort:

* The shared repository of public domain books will give faculty and students convenient access to a large and diverse online library of works previously housed in separate locations.
* This new collaboration will enable librarians to collectively archive materials over time, and allow researchers to access a vast array of material with searches customized for scholarly activity.

For books in the public domain, readers will be able to view, browse, and read the full texts online. For books protected by copyright, users will get basic background (such as the book’s title and the author’s name), at most a few lines of text related to their search, and information about where they can buy or borrow a book.

“This library digitization agreement is one of the largest cooperative actions of its kind in higher education,” said CIC chairman Lawrence Dumas, provost of Northwestern University. “We have a collective ambition to share resources and work together to preserve the world’s printed treasures.”

Two CIC member universities, the University of Michigan and the University of Wisconsin-Madison, are already working with Google Book Search, and this new agreement will complement the digitization work already taking place.

The CIC becomes the latest partner in the Google Books Library Project, which, in addition to the University of Michigan and the University of Wisconsin-Madison, also includes Harvard University, Stanford University, Oxford University, the New York Public Library, the University of California, the University of Texas at Austin, the University of Virginia, Princeton University Library, the Complutense University of Madrid, the Bavarian State Library, the Library of Catalonia, the University Library of Lausanne, and Ghent University Library. Google is also conducting a pilot project with the Library of Congress.

The Google Books Library Project digitizes books from major libraries around the world and makes their collections searchable on Google Book Search. More information can be found at: http://books.google.com.

Google Press Center, June 6, 2007


Amazon Will Digitize Universities’ Books and Sell Print-on-Demand Copies

Amazon, which made its name selling books online, is now entering the book-digitizing business.

Like Google and, more recently, Microsoft, Amazon will be making hundreds of thousands of digital copies of books available online through a deal with university libraries and a technology company.

But, unlike Google and Microsoft, Amazon will not limit people to reading the books online. Thanks to print-on-demand technology, readers will be able to buy hard copies of out-of-print books and have them shipped to their homes.

And Amazon will sell only books that are in the public domain or that libraries own the copyrights to, avoiding legal issues that have worried many librarians — and that have prompted publishers to sue Google for copyright infringement.

Read the article in its entirety at: http://chronicle.com/daily/2007/06/2007062206n.htm

Chronicle of Higher Education, 6/22/07


Nature: Agencies Join Forces to Share Data

From the March 22 issue of Nature. For the full text, see http://ealerts.nature.com/cgi-bin24/DM/y/hc530SpivX0HjB0BOpY0EA

Excerpt:

The US government is considering a massive plan to store almost all scientific data generated by federal agencies in publicly accessible digital repositories. The aim is for the kind of data access and sharing currently enjoyed by genome researchers via GenBank, or astronomers via the National Virtual Observatory, but for the whole of US science.

Scientists would then be able to access data from any federal agency and integrate it into their studies. For example, a researcher browsing an online journal article on the spread of a disease could not only pull up the underlying data, but mesh them with information from databases on agricultural land use, weather and genetic sequences.

Nature has learned that a draft strategic plan will be drawn up by next autumn by a new Interagency Working Group on Digital Data (IWGDD). It represents 22 agencies, including the National Science Foundation (NSF), NASA, the Departments of Energy, Agriculture, and Health and Human Services, and other government branches including the Office of Science and Technology Policy.

The group’s first step is to set up a robust public infrastructure so all researchers have a permanent home for their data. One option is to create a national network of online data repositories, funded by the government and staffed by dedicated computing and archiving professionals. It would extend to all communities a model similar to the Arabidopsis Information Resource, in which 20 staff serve 13,000 registered users and 5,000 labs.


Institutional Repositories: Evaluating the Reasons for Non-use of Cornell University’s Installation of DSpace

Philip M. Davis
Cornell University
<pmd8@cornell.edu> (corresponding author)

Matthew J. L. Connolly
Cornell University
<mjc12@cornell.edu>

D-Lib Magazine March/April 2007 Volume 13 Number 3/4

Abstract

Problem: While there has been considerable attention dedicated to the development and implementation of institutional repositories, little has been done to evaluate them, especially with regard to faculty participation.

Purpose: This article reports on a three-part evaluative study of institutional repositories. We describe the contents of and participation in Cornell’s DSpace and compare these results with those of seven other university DSpace installations. Through in-depth interviews with eleven faculty members in the sciences, social sciences and humanities, we explore their attitudes, motivations, and behaviors regarding non-participation in institutional repositories.

Results: Cornell’s DSpace is largely underpopulated and underused by its faculty. Many of its collections are empty, and most collections contain few items. Those collections that experience steady growth are collections in which the university has made an administrative investment, such as requiring deposits of theses and dissertations into DSpace. Cornell faculty have little knowledge of and little motivation to use DSpace. Many faculty use alternatives to institutional repositories, such as their personal Web pages and disciplinary repositories, which are perceived to have higher community salience than one’s affiliate institution. Faculty gave many reasons for not using repositories: redundancy with other modes of disseminating information, the learning curve, confusion with copyright, fear of plagiarism and having one’s work scooped, associating one’s work with inconsistent quality, and concerns about whether posting a manuscript constitutes “publishing”.

Conclusion: While some librarians perceive a crisis in scholarly communication as a crisis in access to the literature, Cornell faculty perceive this essentially as a non-issue. Each discipline has a normative culture, largely defined by its reward system and traditions. If the goal of institutional repositories is to capture and preserve the scholarship of one’s faculty, institutional repositories will need to address this cultural diversity.

Read the article at: http://www.dlib.org/dlib/march07/davis/03davis.html

Related article:
Harnad: Mandates Would Empower Institutional Repositories

It was probably not much of a surprise to librarians to learn in a recent D-Lib article that Cornell’s DSpace institutional repository (IR) is not filling up very quickly or catching on with faculty; nor was it a surprise that self-archiving and open access champion Stevan Harnad quickly offered a point-by-point response to D-Lib authors Phil Davis and Matthew Connolly. However, the problem of filling IRs with content, Harnad maintains, has a surprisingly easy solution. “The finding is that faculty don’t self-archive spontaneously,” Harnad posited. “The remedy, which [Davis & Connolly] do not mention at all, is to mandate that they self-archive.”

Previous studies show that when mandated to do so, faculty will populate IRs, Harnad argues. “The Swan surveys had already reported that faculty say they will not self-archive on their own,” Harnad told the LJ Academic Newswire, “but 95 percent will self-archive if mandated, over 80 percent of them willingly.” Harnad also cited Arthur Sale’s April 2006 D-Lib article that found that, while “voluntary” policies resulted in repositories collecting less than 12 percent of the available theses, “mandatory policies were well accepted and cause deposit rates to rise towards 100 percent.” There are currently 12 university or departmental mandates adopted worldwide and 11 funder mandates, plus one multi-institutional mandate and six funder mandates proposed, Harnad noted, adding that “the remedy has been tried, a number of times, and it works each time.”

“It sounds simple enough,” responded Davis on Yale University’s Liblicense-L discussion list. “Make one’s faculty do what they don’t see as necessary themselves.” Davis says his and Connolly’s study aimed not to “demonstrate that IRs are a failure,” but by focusing on “non-use” aimed to find out why they are not growing more quickly. “If we are to work at an institution where our researchers have the freedom to choose how they disseminate and archive their work, then it is important to understand the beliefs and motivations behind their behaviors,” Davis continued. “These results may lead to building better services around repositories.”

Library Journal Academic Newswire, March 20, 2007


For Oxford University Press, Online Venture Breathes New Life into the Monograph

Officials at Oxford University Press (OUP) say their Oxford Scholarship Online program (OSO), a digital database of the press’s monographs, will expand by September to include nearly all of its monograph titles. OUP vice president and publisher Niko Pfund said that the successful program will add new “modules” in math, physics, biology, psychology, business/management, history, literature, classics, and linguistics, adding nine new disciplines in all to the original four “modules” (economics, politics, religion, and philosophy). “This will in effect double the number of new titles we’re putting into OSO to approximately 400-450 new books a year,” he told the LJ Academic Newswire.

Pfund said the bold effort to launch OSO has taken “thousands” of hours of effort between publishers and authors and “lots” of money, though he declined to estimate the tab. OSO launched in 2003 as a subscription database before switching to a “perpetual access” model in 2005. The press’s massive effort, Pfund says, is paying off, with usage up over 450 percent. Pfund said OUP may even add more of its mainstream books to OSO, titles not necessarily considered monographs. “One of the challenges of academic publishing is the different way in which scholars, librarians, and publishers define the term ‘monograph,'” he conceded. “But one of the animating principles behind OSO is to maximize exposure for our books. So, if we think that inclusion of a specific academic trade or trade title in OSO would make for a good fit in the site, we’d include it.”

Of course, inclusion in OSO depends on a host of other factors as well, including the wishes of authors, some of whom may fear that an online offering will diminish book sales. In an age when Google is roiling the marketplace with its scan plan, prompting lawsuits and forcing many presses and authors into sometimes uncomfortable choices, OSO represents a significant strategic commitment of resources by OUP. And so far, authors have enthusiastically embraced the venture. “I’d say 90-95 percent of authors are initially positive, with about 20-25 percent having substantive questions that involve a few rounds of conversation,” Pfund said. Only “a very few” authors have declined inclusion. “In fact,” he added, “it’s been inspirational to see how many are willing to put in extra work on drafting abstracts and keyword lists to ensure their work is not only included in OSO but well-represented.”

It’s difficult to judge whether OSO has had an effect on book sales, Pfund said, noting that overall monograph sales have not decreased since OSO launched and that some titles appear to have benefited from increased exposure. But focusing on book sales, he suggests, is simply too narrow a measure in today’s “fragmented,” increasingly digital world. “We’re not seeing the end of the book, we’re seeing the galloping diversification of how its message can be conveyed,” he explains, describing the press’s philosophy as “format agnosticism,” that is, to deliver content in whatever format is desired. “If dissemination and influence is our primary currency, then having books available via OSO, or netlibrary, in print perpetuity via print-on-demand, or in Google Book Search and Amazon’s Search Inside the Book means that more people can access your work in more ways from more places than ever before. That does translate to dollars.”

Library Journal Academic Newswire, Feb. 1, 2007


U. of Michigan Press, Library, Scholarly Publishing Office Launch Digital Studies Imprint, Web Site

With its latest venture, the University of Michigan Press is exploring the cutting edge, both in terms of the content it publishes and how it publishes. Under a new collaborative program between the press, the library, and the Scholarly Publishing Office, the UM Press’s new Digital Culture imprint will both sell books and offer the full text of those books freely on its Digital Culture Books website. The imprint debuted with the fall 2006 publication of The Best of Technology Writing 2006, edited by journalist and Sunday New York Times business columnist Brendan Koerner, who, with UM Press editor Alison MacKeen, selected 24 pieces from several hundred submissions. “We wanted a versatile mixture of the light-hearted and serious, profiles, features, and ‘big think’ pieces,” MacKeen said of the first volume. “We also wanted to embed some articles in there that would help to make people aware of undercovered issues such as digital copyright, municipal wireless, and so on.” A 2007 volume, to be edited by Newsweek’s Steven Levy, is currently accepting nominations.

As groundbreaking as some of the ideas, however, is the Press’s decision to practice what many of its authors now preach, using the Digital Culture imprint to develop an “open and participatory publishing model” that seeks to “build a community” around its content. “Our goal is to give each project a robust online and print presence and to use the effort not only to introduce scholars to a range of publishing choices but also to collect data about how consumption habits vary on the basis of genre, age, and discipline,” MacKeen explained. “The data will help us to understand more about the economics of digital publishing, and will also, we think, offset any potential economic risks by developing the venture as a research opportunity.”

While press officials use the term “open access,” the venture is actually more “free access” than open at this stage. Open access typically does not require permission for reuse, only proper attribution. UM Press director Phil Pochoda told the LJ Academic Newswire that, while no final decision has been made, the press’s “inclination is to ask authors to request the most restrictive Creative Commons license” for their projects. That license, he noted, requires attribution and would not permit commercial use, such as using the work in a subsequent for-sale product, without permission. The Digital Culture Books web site currently states that “permission must be received for any subsequent distribution.”

The initiative is an innovative publishing strategy for university presses, which have the increasingly complex mission of serving scholarly communication needs while staying financially viable. “It will be interesting to see how it will go in terms of book sales,” MacKeen concedes. “I can imagine either an increase or a decrease.” Pochoda stressed that there is “more than a business model at stake,” however, noting that the collaborative nature of the Digital Culture imprint represents the press’s chance “to support open access in principle and practice while still acknowledging the obligation to survive as a business operation.” Nevertheless, he has reason to believe the press will sell some books. The National Academy Press, for example, offers its book content online, Pochoda notes, and its data suggests a corresponding jump in sales.

Library Journal Academic Newswire, Jan. 11, 2007