2011 CrossRef Annual Member Meeting
Tuesday, November 15, 2011
The Charles Hotel
Cambridge, Massachusetts, USA
Twitter Hashtag: #crossref11


8:30-10:00      Registration and Breakfast
9:00-9:45 Corporate Annual Meeting for Members and Board Election
  Linda Beebe, Chair, Board of Directors
  Ian Bannerman, Treasurer, Board of Directors
  Ed Pentz, Executive Director
10:00-10:20 Main Open Meeting
  Introduction and CrossRef Overview, Ed Pentz, Executive Director
  - Presentation
  - Video Recording
10:20-10:40 System Update
  Chuck Koscher, Director of Technology
  - Presentation
  - Video Recording
10:40-11:00 Strategic Initiatives Update
  Geoff Bilder, Director of Strategic Initiatives
11:00-11:30 Break
11:30-11:45 CrossRef Member Obligations (including Display Guidelines)
  Carol Anne Meyer, Business Development and Marketing
  - Presentation
  - Video Recording
11:45-12:15 CrossMark Update
  Evan Owens, American Institute of Physics
  - Presentation
  - Video Recording
  Kirsty Meddings, Product Manager
  - Presentation
  - Video Recording
12:15-12:45 ORCID Update
  Howard Ratner, Nature Publishing Group
  - Presentation
  - Video Recording
12:45-13:15 DataCite: the Perfect Complement to CrossRef
  James Mullins, Purdue University
  Just as CrossRef provides digital object identifiers for scholarly articles, DataCite, an international collaboration, provides digital object identifiers for data sets, including those that contributed to a published research article. This presentation will describe the mission, vision, challenges, and latest advances of DataCite.
  - Presentation
  - Video Recording
13:15-14:15 Lunch
14:15-15:15 Sex and the Scientific Publisher: How Journals and Journalists Collude (despite their best intentions) to Mislead the Public
  Ellen Ruppel Shell, Boston University Center for Science & Medical Journalism
  Publication bias is the tendency of researchers, editors, and pharmaceutical companies to handle the reporting of experimental results that are positive (i.e., showing a statistically significant finding) differently from results that are negative (i.e., supporting the null hypothesis) or inconclusive, leading to bias in the scientific literature overall. Indeed, statistically significant results are three times more likely to be published than papers affirming a null result. Such bias occurs despite the fact that studies with significant results do not appear to be superior in design to studies with a null result. There is evidence that some investigators actually decline to seek publication because they anticipate that scientific publishers will not be interested in null results--the so-called "file drawer" effect. Complicating matters still further, journalists tend to over-report positive scientific findings, with the result that the public is too often misled as to the purpose, scope, and consequences of a given scientific study. This talk looks at the problem through the lens of one of the most confusing--and least understood--issues of our time: the significance of innate cognitive differences between genders.
  - Presentation
  - Video Recording
15:15-15:45 The Persistence of Error: A Study of Retracted Articles on the Internet
  Phil Davis, Publishing Consultant
  Article retraction is an attempt to correct the scientific record. In practice, readers may be unaware that an article has been retracted and cite it for years as a valid study. Scientific authors have little incentive to consult the publisher’s website or a literature index for the current status of each cited reference. In addition, copies of retracted articles persist, in many versions, on public websites beyond the control of the publisher. In this talk, I report on a study to locate versions of retracted articles on the public Internet as well as in the personal collections of Mendeley users. I discuss how a series of services could be designed to more effectively alert readers to the updated status of scientific articles.
  - Presentation
  - Video Recording
15:45-16:15 Break
16:15-16:45 Results from a Global Journal Editor Survey on Detecting Plagiarism
  Helen (Y.H.) Zhang, JZUS (Journal of Zhejiang University-SCIENCE)
  1. How do journal editors use CrossCheck?
  2. How do journal editors respond to the CrossCheck similarity report?
  3. What are the attitudes of journal editors toward typical problems often encountered in different disciplines and different countries?
  4. What is the difference between Anglophone and non-Anglophone journals in dealing with plagiarism?
  5. A few interesting questions that came up in the survey.
16:45-17:15 The Good, the Bad and the Ugly: What Retractions Tell Us About Scientific Transparency
  Ivan Oransky, Retraction Watch
  Science is supposed to be self-correcting, and retractions are the most draconian of efforts to keep the scientific record up to date. They have also risen dramatically in the past decade. But there is wide variation in how journals approach the withdrawal of papers, and some of those approaches raise serious questions about the transparency with which science would like to be associated. This talk presents a number of case studies from more than a year of Retraction Watch, as well as some suggestions for improving retraction practices.
  - Presentation
  - Video Recording
17:15-17:30 Wrap up
17:30-18:30 Cocktail Reception

Updated December 21, 2011

Copyright 2015, PILA, Inc. All rights reserved.