Liberal arts colleges are known for low student-to-faculty ratios, intimate seminar classes, and highly personalized undergraduate experiences. On the surface, it is not obvious how online learning fits this picture. But these days liberal arts colleges face many of the same pressures as larger universities – resource constraints, the growth of non-traditional students with more responsibilities outside the classroom, even uncertainty about how a liberal arts education should evolve to stay relevant in a digital world. There is an urgent need to figure out how online learning technologies might help to address these pressures while honoring these institutions' core missions and values.
Council of Independent Colleges (CIC) President Richard Ekman believes that online learning presents new opportunities for liberal arts colleges to enhance their curricula and enrich their students' experience. With this in mind, CIC launched the Consortium for Online Humanities Instruction. This 20-member group is now working to develop online and hybrid courses that will be available to all members of the consortium, greatly expanding the number of courses available to their students. Focusing on upper-level humanities courses, the consortium is experimenting with new ways to deliver specialized topics in an array of disciplines.
Rebecca Griffiths, who is working with CIC to manage the program and evaluate its accomplishments, recently interviewed Richard Ekman to understand both the motivation for creating the Consortium and the challenges it is facing in its first year. We invite you to read this interview, published as our December issue brief, and to weigh in through our comments field below on the role of online learning in the liberal arts college.
Interested? Download Does Online Learning Have a Role in Liberal Arts Colleges?
Yesterday, I attended a symposium sponsored by Digital Science, Harvard, Microsoft, and MIT, called “Shaking It Up: How to thrive in – and change – the research ecosystem.” I made the trip to attend this event in person because I am focusing some attention on serving the sciences right now, and the sessions featured a remarkable array of mostly new initiatives in support of scientific research and scholarly communication.
The opening keynote featured an appropriately pointed but ultimately inspirational talk by CrossRef's Geoffrey Bilder, who spoke about funding and sustaining cyberinfrastructure. Scientific research tends to be funded in a series of grants to individual researchers or research groups, and much infrastructure is purchased – or leased – through those grants. Imagine, Geoff urged, if physical infrastructure were funded in the same way: when your grant runs out, the lights go off. Geoff argued that a lot of technical jargon, such as "open," "distributed," "framework," and even "fabric," is worrisomely misleading, used when a lack of trust prevents us from actually building the systems needed to accomplish the task at hand. When you "distribute" something, Geoff argued, someone else may centralize it, co-opting and enclosing the whole thing. And so the talk closed with a set of recommendations about how trusted entities can build and manage infrastructure, emphasizing transparency, a "living will" in case of systemic failure, a goal of generating a surplus allowing for reinvestment, and an incentive structure to wind down operations once the mission has been fulfilled.
Geoff’s argument might apply to national infrastructure like highways, airports, and the electric grid, but he is especially interested in the development of academic infrastructure like libraries, shared scientific instruments, ORCID, and HathiTrust. Some of the basic tenets of Geoff’s argument resonated for me with Ithaka S+R’s sustainability work, and in particular our definition of sustainability as “the ability to generate or gain access to the resources – financial or otherwise – needed to protect and increase the value of the content or service for those who use it.” It would be interesting to consider whether this definition could carry over to the types of infrastructure Geoff was speaking about, or if another type of definition would be needed.
Subsequent panels featured a variety of innovative services in support of scientific and scholarly research, many of which are in alpha or beta stages of development. These panels were held as lightning-rounds, so I’ll share only a few highlights.
Tim Gardner spoke about Riffyn, a system designed to eliminate noise in data collection and thereby vastly improve the quality and reproducibility of R&D. Riffyn treats the research process as a designable object on which scientists can collaborate readily, allowing them to harness all their research equipment and tasks into a managed process. By introducing quality practices into biotech research processes, it aims to increase productivity and reproducibility dramatically.
John Lees-Miller spoke about Overleaf, a well-used service for scientific writing in the cloud, somewhat akin to a Google Docs for scientific papers, but with special features to manage figures, equations, and article-like typesetting. Recently, they have been integrating with innovative scientific publishers such as F1000, FigShare, and PeerJ, so that editorial and review processes can be cloud-managed in this native format.
Rob Siegel introduced Publiscize, a platform to help scientists broaden their audience and impact by creating jargon-free synopses of their works. While there are a growing number of services trying to help the underlying research articles maximize their impact, Publiscize appears to offer the ability to create complementary publications to reach alternative audiences.
Alex Hodgson presented ReadCube principally as an access service to address the increasingly well-documented shortcomings in the ability of academic libraries to provide seamless access to all the content that scientists require. (I appreciated his language in offering a critique of ILL's "latency period.") Integrating with publishers, ReadCube allows a researcher to purchase access to an article at rates such as $6 for a 48-hour rental, "cloud access" (i.e., no downloading) for $15, or $35 to buy a PDF. He also showed an option for a library to pay some of these fees directly on behalf of its users as a sort of demand-driven access to materials.
William Gunn shared some of Mendeley’s new developments in the field of research discovery, arguing that pull methods (such as search) are reaching their limits while recommender-driven push methods (which I have elsewhere characterized as anticipatory) have great future potential.
Check out the hashtag #ShakingItUp14; I understand recordings will be made available shortly.
At the Charleston Conference, Ithaka S+R hosted a session on “The Spaces Between,” which was intended to explore our communities’ needs for research that fall between the traditional boundaries of library, publisher, and vendor. As I mentioned in my opening remarks, these spaces can prove themselves to be cracks into which important issues fall unnoticed, or opportunities to build connections between communities with ultimately many shared interests.
Our panel consisted of Joe Esposito, an independent publishing consultant, Susan Stearns, the executive director of the Boston Library Consortium, and Roger Schonfeld, program director at Ithaka S+R. Joe spoke about an upcoming research project he is leading with Ithaka S+R to establish the share of library acquisitions accounted for by Amazon as opposed to more traditional vendors, a project just being launched that has implications for scholarly publishers, academic libraries, and of course the vendors themselves. Susan shared some data that suggested the limited impact new discovery services have had, at least for some types of content at some types of libraries, and emphasized that more attention is needed on the delivery side of the equation, suggesting that a fuller understanding of user behaviors would be valuable. She suggested that one promising area for publisher-library collaboration would be thinking about how we move beyond the PDF article and the journal title to imagine the future of scholarly communications. Roger suggested that we think about discovery more broadly, thinking critically about serving as a search “starting point” and finding ways to improve scholars’ efforts to “keep up” with the literature.
The terrific discussion that followed centered on several major themes:
- There were some questions about whether we should be satisfied with digital monograph interfaces provided through institutional channels and a sense that we should consider what a more complete ecosystem for engaging with texts might look like.
- A number of questions arose about how to manage non-reading analysis, such as text mining, including the advantages of cross-publisher services such as the HathiTrust Research Center or the local loading of OhioLINK.
- We heard about some of the perceived challenges in authentication and delivery systems, including off-campus access, with a sense that improved practices and systems are urgently needed.
We welcome additional feedback about these issues and others as we plan our next directions in this program area.
Ithaka S+R offers a program of workshops to support libraries that wish to make evidence-based decisions on some of the biggest issues they face. These workshops support libraries in structuring strong decision-making processes that incorporate a diverse base of evidence, including survey findings, qualitative research, usage data, budget information, and other evidence. In the coming months, we are offering these in New York City, Chicago during ALA Midwinter, and Portland, Oregon before the ACRL conference. Follow the links below for more information and to register.
Last week, Joseph Esposito announced on The Scholarly Kitchen a new research project in partnership with Ithaka S+R to study the changing channels through which publishers sell to libraries and libraries acquire from publishers. We believe that the mechanisms for book sales and acquisitions are changing to some degree, especially at smaller libraries, with real implications for both the print and digital marketplace. We are thrilled to be launching this project in partnership with Joe, and grateful for the support of The Andrew W. Mellon Foundation that makes it possible.
Ithaka S+R brings to this project a wealth of experience in conducting surveys of libraries and in helping libraries and publishers adapt to the changing landscape for book collecting and collections. We expect to complete this project and release a full report of findings publicly by December 2015.
Evidence-Driven Decisions on Library Space in the Digital Age: A Workshop Before ACRL
A couple of weeks ago, while attending the Harvard Library Visiting Committee meeting, I participated in an amazing discussion of collection development strategies. I heard Harvard librarians saying that Harvard can no longer collect everything, indeed, shouldn't collect everything, and needed to build strong collaborative relationships so that Harvard scholars and students would be able to find the resources they need to do their work. This view—access is more important than ownership—is not new among other academic and research libraries, but that Harvard was saying this seemed truly revolutionary.
Only two years ago, at the Harvard Library Visiting Committee meeting, we had heard from the provost that the One Library program was being implemented so that resulting cost savings accrued from consolidation and streamlining of services could be used to bolster the acquisitions budget, allowing Harvard to regain its standing in building collections. With the arrival of Vice President Sarah Thomas, the emphasis has changed. She is thinking in terms of strong interdependencies with colleague institutions both for off-site storage of monographic collections and for building collections into the future. She made it clear that no library, not even Harvard, could hope to acquire everything scholars need to do their work.
Other ARL library directors also on the Visiting Committee added that they continue to place a high priority on building local collections, but the amount of scholarly information now being produced is overwhelming, and they cannot hope to keep up. All of them spoke of the need to think of the scholarly ecosystem as a whole, and to ensure that the system functions well for scholars.
Two weeks later, I attended the Charleston Conference, where at a plenary session Jim O'Donnell, former provost of Georgetown University and library director-designate of Arizona State University, had assembled a panel of faculty whose charge it was to tell librarians what they needed to know. A physicist, a classicist, and a faculty member and policy specialist on South Asia talked about what they needed from libraries. Of greatest interest to me was the way in which each of them relies on the ecosystem, not the library. The South Asia expert recalled lovingly her days at the University of Chicago, where the librarians collected everything she needed and wanted, but now, at a smaller institution, she cannot rely on the local library for the resources she needs. To her, interlibrary loan is absolutely essential. She also praised the PL480 program that brings hard-to-acquire documents from difficult parts of the world into research collections that she can then use. The physicist does not use libraries at all, and instead relies on arXiv for his news about what is going on in his field, and for access to the scholarly literature. He thinks libraries are important, but his work is independent of them. Finally, the classicist, who is a department chair in a public college, urged libraries to form consortia that go beyond similar types of institutions and geographic proximity. He called on librarians to enter into partnerships that would open up the ecosystem of scholarly resources to all scholars, wherever they happen to be.
Both of these meetings triggered a number of questions about library collections for me. In our highly decentralized system of higher education in this country, who has responsibility for maintaining and nurturing the ecosystem? Given the universal belief that no one institution can collect everything, what structures need to be in place to ensure that, collectively, as many resources as possible are being collected, preserved, and made accessible? Over the course of my career, I have seen several initiatives develop and then sputter: the National Periodicals Center, the Research Libraries Group conspectus project, and several discipline-specific attempts to rationalize collection development among groups of libraries. Why did they eventually falter? What kind of new thinking will be required in this Internet-connected world?
We librarians are dedicated to supporting the scholarly enterprise. We have done that in the past by acquiring as many resources as our budgets allowed. Now, our job is to connect scholars with resources wherever they are, but our local institutional structures may be barriers to providing the kind of support that is needed. The consortia we have supported facilitate interlibrary lending, but that seems to be an inadequate response to today's needs. I would love to hear others' ideas for how we can develop an ecosystem of scholarly support while still living within the confines of individual institutions that report acquisitions statistics to professional associations and accrediting bodies as a measure of strength.
Roger Schonfeld will speak on "What Role(s) Should the Library Play in Support of Discovery?" at CNI's Fall Meeting in Washington, DC.
Over the past two years, several member universities of the Association of Southeastern Research Libraries (ASERL) have implemented the Ithaka S+R local surveys on their campuses. At ASERL's Fall Meeting on Thursday, November 20, Roger Schonfeld will discuss the findings from those surveys.