The Library Assessment Conference took place last week in Seattle, a valuable forum for those gathering and using evidence in support of library management and planning. I attended with my colleague Alisa Rod, Ithaka S+R's surveys coordinator. The program included a diverse set of presentations on topics from information literacy to space planning, and Ithaka S+R's local surveys were featured in a number of sessions.
Developing the Ithaka S+R Student Survey
Alisa and Heather Gendron, head of Sloane Art Library and coordinator of assessment at UNC Chapel Hill, spoke about how the Ithaka S+R local student survey was developed and tested. Heather spearheaded a cognitive interview process to ensure that the language of the survey questionnaire is well-suited for the population of students who take it, resulting in a variety of modifications. Alisa reviewed other aspects of the development and testing of the questionnaire and methodology, including her use of factor analysis to quantitatively evaluate the questionnaire following the pilot testing of the survey at six higher education institutions. This testing and improvement process is a real advantage of using a community survey such as the Ithaka S+R instrument, and I was glad to see the lively interest in the methodological techniques used to develop and test our survey questionnaire.
Maximizing response rates
Later in the program, Alisa spoke with Debbie McMahon of Baylor University about techniques for maximizing levels of response to surveys of faculty members. They shared practices and tips for achieving buy-in from key library staff such as liaisons, for structuring invitations and reminder messages, and for crafting incentives that help to achieve project objectives. We work closely with each local survey participant on techniques for maximizing responses, and we are constantly adapting our implementation guidelines to provide distinctive best practices for reaching various groups of faculty members and students.
Surveys and strategic planning
Lisa Hinchliffe, coordinator for Information Literacy Services and Instruction at the University of Illinois at Urbana-Champaign, organized a panel that focused on how to "close the loop" to establish real impact from survey projects. Eric Ackerman, head of Reference Services and Library Assessment at Radford University, spoke on LibQUAL+ and some of the service adjustments that its findings made possible. Heather Gendron spoke about running both the Ithaka S+R faculty and student surveys in the context of a major strategic planning process that has refocused the UNC Chapel Hill libraries on supporting the research lifecycle. Finally, Elizabeth Edwards, assessment librarian at the University of Chicago, described the Herculean task of performing a kind of meta-analysis of ten years' worth of assessment in preparation for a new library director and in anticipation of fielding the Ithaka S+R survey of graduate students. All three institutions have translated survey findings into various types of real impact.
Heather also spoke on a panel with Andrew Asher, the assessment librarian at Indiana University Bloomington, that looked at the Ithaka S+R local surveys in different institutional contexts. Chapel Hill and Indiana—two large research-intensive public universities—have both now run the local student and faculty surveys. This gave them the opportunity to compare findings from students and faculty members within and between both universities in order to enhance situational awareness in the context of campus-specific strategic priorities.
In an earlier post, I wrote about my LAC presentation on the potential role of assessment in library decision making.
There were many other terrific presentations on other topics, and overall it was interesting to see a real focusing of effort on the outcomes of assessment, which I believe is a vitally important issue for our community in coming years.
Last week at the Library Assessment Conference in Seattle, I gave a talk on "Vision, Alignment, Impediments, Assessment." As academic libraries face a variety of strategic issues, I argued, they need to consider how to implement evidence-based decision making processes more broadly in their institutions. There's a significant role for the assessment community in building such processes, and as libraries continue to invest in assessment, they have the opportunity to use data to address their challenges.
I reviewed some of the key strategic issues that face many academic libraries today, things like
- finding the right balance between collecting and community engagement,
- future format choices for books,
- the role of the library in support of discovery,
- the role and purpose of the library's spaces, and
- what sustainable roles there may be for supporting teaching and learning as pedagogies change rapidly.
Whether or not these are exactly the right issues for any specific library, my essential message was that library leaders can build decision-making processes for strategic dilemmas such as these, and not just for operational matters. Deanna Marcum and I covered some of the ways in which such decision-making processes can be structured in our recent issue brief Driving with Data. I believe there are real opportunities for library leaders to draw on the capacities of their assessment and user experience teams to bring evidence to bear on strategic issues.
I am grateful to Anne Cooper Moore (Southern Illinois University) and Scott Walter (DePaul University) for serving as panelists and sharing the library director’s perspective. They offered valuable reactions to my remarks, helping to focus our attention on the importance of identifying university priorities, and especially student success, in developing strategy for the academic library.
Rebecca Griffiths Discusses Social Network Mapping Study at EDUCAUSE 2014
Earlier this year, Ithaka S+R, in partnership with the American Association of State Colleges and Universities, the League for Innovation in the Community College, and Syndio Social, and with funding from the Bill & Melinda Gates Foundation, embarked on a project to analyze social networks among college and university faculty, staff, and administration to see how these networks advance innovation across campus.
In 2012, Ithaka S+R partnered with Jisc and Research Libraries UK to conduct the inaugural UK Survey of Academics. The report of findings was published in May 2013, and it is freely available on our website. This project was the first in several steps to internationalize Ithaka S+R's US Faculty Survey. It developed rich findings for the UK higher education sector about discovery, open access, the print to electronic transition, research methods, and other issues of strategic relevance. As with the US Faculty Survey, we plan to run the UK Survey of Academics on a triennial cycle.
As is our practice with other surveys, Ithaka S+R deposited the dataset for the UK Survey of Academics with ICPSR for preservation and access. With the ingest process now completed, ICPSR has released the dataset.
Ithaka S+R Releases Report on Hybrid Classroom Experiments at the University System of Maryland
New York, NY—During the same month that The New York Times declared 2012 the “Year of the MOOC,” Ithaka S+R partnered with the University System of Maryland (USM) to determine the feasibility of using MOOCs in new ways—incorporating MOOCs and other online technologies into undergraduate classrooms. The results of that study are available today: Interactive Online Learning on Campus: Testing MOOCs and Other Platforms in Hybrid Formats in the University System of Maryland.
Over the course of a year, Ithaka S+R worked with faculty on seven USM campuses to set up side-by-side tests in classes from statistics to biology comparing sections using interactive online learning technologies to those taught through more traditional methods. Four of these used MOOCs and three used a course from the Open Learning Initiative (OLI) at Carnegie Mellon University. Ten additional classes incorporated MOOCs in order to gain further insight into the opportunities and challenges presented by using these materials in campus environments.
Findings indicate substantial promise for using interactive online technologies in traditional college settings. The USM faculty were enthusiastic about experimenting with MOOCs and found that they could serve as useful tools for accomplishing their goals with students; perhaps most importantly, most would like to continue using them in the future and would recommend them to their colleagues. Specifically, faculty reported that by using the lecture videos to cover content, they were able to engage in more active teaching with their students. Some instructors also found that MOOCs could be used as a tool to strengthen students' foundational skills in critical thinking. Additionally, despite the significant time faculty invested in using MOOCs for the study, they believe they can save time teaching by incorporating existing online materials into their courses.
The study also found that student outcomes were roughly the same in hybrid sections as in traditional face-to-face sections. These results held in the subgroups we examined, including those from low-income families, under-represented minorities, first-generation college students, and those with weaker academic preparation.
While these findings point to the potential benefits that MOOCs and OLI courses might offer, Ithaka S+R also found several challenges that will need to be overcome before wider implementation of these technologies becomes feasible. To better fit this type of use, MOOC providers will need to make their courseware more modular and must consider the intellectual property and licensing implications of making this content available in different contexts. At the same time, universities must provide leadership, infrastructure, support, and incentives to help faculty engage with this new type of content, much as the USM committed to doing in this study.
MJ Bishop, director of the Center for Academic Innovation at the USM, said of the project: “The University System of Maryland is especially pleased to see the positive reaction of our faculty participants in this endeavor, and to see that our comparison of MOOCs with traditional teaching settings showed roughly the same academic outcomes. As important, though, the study reflects USM’s careful stewardship in this process, because challenges remain before we can grow the use of these new technologies. We are even more confident in the soundness of the approach we are bringing to this study, an approach that values a deliberate and careful collection of feedback from students and faculty.”
As project advisor and President Emeritus of Tufts University Lawrence S. Bacow notes: “This study provides much needed data about the benefits and challenges of adapting MOOCs for traditional institutions and putting these opportunities in context. This is an important step forward, and I hope that university and college leaders and faculty take note of these findings and consider ways that they can take advantage of existing online content in their courses. I hope, too, that the institutions and platforms offering MOOCs continue to explore ways that these resources can benefit mainstream students at public universities.”
Ithaka S+R would like to thank the Bill & Melinda Gates Foundation for its generous funding of this project.
Supporting the Digital Humanities on Your Campus—A Webinar with Nancy Maron
Join Ithaka S+R for a webinar on July 17th at noon.
As faculty and students continue to build new digital projects, how can institutions develop end-to-end systems that address the needs of those projects over time?
Today, NISO is releasing the recommended practice for its Open Discovery Initiative. This important initiative is intended to bring greater order to the indexed discovery services that have achieved a market penetration of roughly three-quarters of US academic libraries, according to the most recent Ithaka S+R US Library Survey 2013 (pages 53-54). With such a high share of libraries positioning indexed discovery services as the primary discovery interface for their users, it is essential to address the concerns—both real and perceived—about how these systems work and interact with libraries and content providers. The ODI report recommends practices in the areas of usage statistics, various data sharing mechanisms, content availability, and fair or unbiased linking.
I chaired the “fair linking” subcommittee, which interpreted its mandate as addressing perceptions of bias in search results and relevancy ranking while bringing greater transparency to the business connections between content providers and discovery services. I am especially proud of the principle we have established that “Discovery service providers should offer an affirmative statement of the neutrality of their algorithms for generating result sets, relevance rankings, and link order” (page 25). While there are many ways to game any system, I believe we have set out some valuable principles regarding not only neutrality but also transparency that will allow libraries to steward the discovery interests of their user communities more effectively.
Bridging the differences in interests between content providers, discovery services, and libraries was no small task, and I thank all the members of the ODI committee, chaired by Marshall Breeding and Jenny Walker and staffed by NISO’s Nettie Lagace, for their contributions.
The University of Rochester is set to open a new data visualization lab with a direct link to the University’s supercomputers. Ithaka S+R provided support in the design phase by collecting information through interviews and workshops and identifying key requirements for the space itself and for the technology to be installed in the space.
The Visualization-Innovation-Science-Technology-Application (VISTA) Collaboratory lab will be used in many ways, including research in the sciences, engineering, and optics as well as the humanities and social sciences. The leader of the project makes a compelling case for this sort of visualization facility. To quote from the press release: “The best analytical tool we have is still the human brain,” said David Topham, Ph.D., executive director of the HSCCI and a professor in the Department of Microbiology and Immunology. “We can see relationships between data that computers cannot. But in order to do that you have to have the information in front of you so you can see the patterns and connections that matter. In other words, you need to be able to see the forest and the trees simultaneously.”
For more information on the VISTA Collaboratory Lab, please see the press release from New York Governor Andrew Cuomo.
Roger Schonfeld and Alisa Rod are presenting at several sessions during the Library Assessment Conference in Seattle, Washington. This year's conference theme is "Building Effective, Sustainable, Practical Assessment."
Monday, August 4, 11:00-12:30; Session 2
"A Mixed-Methods Approach to Questionnaire Development: Understanding Students’ Interpretations of Library Survey Questions"
Heather Gendron, University of North Carolina Chapel Hill
Alisa Rod, Ithaka S+R
New Report—Sustaining the Digital Humanities: Host Institution Support beyond the Start-Up Phase
Digital Humanities has captured the imagination of many faculty, staff and students in recent years. Experts in the field, from veterans of Digital Humanities Centers to library digitization units, know well the challenges that digital projects can pose, just to keep content and software up to date and relevant. As more scholars experiment with building digital humanities resources, how are their host institutions approaching the challenge of supporting these efforts over time?
Ithaka S+R has just published Sustaining the Digital Humanities: Host-Institution Support beyond the Start-Up Phase, undertaken with generous support from the National Endowment for the Humanities. The final report explores the question of sustainability from the institutional point of view: while many libraries and digital humanities centers think deeply about supporting digital outputs, is there a campus-wide strategy concerning who gets to build what, the resources they have access to, and the systems in place to support complex digital research resources once they are built?
The final report suggests that while many faculty report building digital projects, not everything thought of as a “DH” project poses significant sustainability challenges: some are smaller, private experiments, and some are easily executed using existing platforms and templates. Yet even for those significant digital research resources that do have long-term goals and merit ongoing support, few if any institutional strategies take into account the full range of activities needed, from project setup and management to preservation, long-term hosting, upgrading, and dissemination. Still, some approaches to managing this work are beginning to emerge. The report outlines examples of good practice and suggests three archetypes (service, lab, and network models) that illustrate campus-wide approaches to addressing these issues.
Since each campus will need to develop and adopt the strategy that fits it best, the Sustainability Implementation Toolkit offers practical tools to help understand one’s own campus landscape, identify areas of overlap and gaps, and bring together key stakeholders, including senior administrators, for a productive discussion about new, coordinated systems that take into account the range of project types and needs, as well as the resources available to support them.
We hope you will find the report and toolkit useful and welcome your feedback.