Introduction

The Council of Independent Colleges (CIC) Consortium for Online Humanities Instruction began in 2014 with the support of The Andrew W. Mellon Foundation to determine if small, independent colleges could collaborate in developing online, upper-level humanities courses that would give students at these institutions a broader range of courses from which to choose. The success of the first Consortium (2014–2016) motivated the Mellon Foundation to support a second Consortium that was formed in the summer of 2016 with teams of faculty members and administrators from 21 new institutions selected through a competitive process.[1]   Each institution is represented by a four-member team that includes a senior academic administrator, two full-time faculty members in the humanities, and a registrar or a representative of the registrar’s office.[2]  In the first year of the program (2017), faculty members from each institution developed online or hybrid courses and offered them to their students.[3] In the second year (2018), the courses were revised and taught again, but this time were open to enrollment by students from other colleges in the Consortium as well.

The CIC Consortium set out to address the following three goals:

  1. To provide an opportunity for CIC member institutions to build capacity for online humanities instruction and share their successes with other liberal arts colleges
  2. To explore how online humanities instruction can improve student learning outcomes
  3. To determine whether smaller, independent liberal arts institutions can make more effective use of their instructional resources and/or reduce costs through online humanities instruction by sharing courses with other like-minded institutions

The report that follows documents the experience of the 21 participating institutions in the second year of the Consortium II initiative.[4] The data refer mainly to the online courses developed for the project and the reactions of faculty and students to those courses. Also incorporated throughout the report are additional insights that the research team gathered from discussions that took place during a final workshop for participants in July 2018. This workshop included representatives from the 21 institutions in Consortium II and selected institutions from the first Consortium.

Summary of Findings

By the end of the second year of Consortium II, the participating institutions had made substantial progress toward each goal:

Goal 1: Building Capacity

Advanced humanities courses can be designed and delivered effectively online, although some challenges still need to be addressed.

  • Over 80 percent of the responding instructors this year indicated that they felt comfortable guiding their class toward an understanding of the course topics and helping them clarify their thinking in the online environment (compared to about 60 percent last year). Most of the instructors this year (80 percent) also indicated that online discussions helped students develop a sense of collaboration (compared to 40 percent last year). The instructors attributed their more successful experience in the second year to setting clearer expectations for students at the outset and reconfiguring their courses to promote student interest and engagement more effectively.
  • About 80 percent of instructors indicated that they were not able to form personal relationships with students similar to the kinds of relationships they form with students in in-person courses. Several instructors noted that the inability to form personal relationships with students was the least satisfying aspect of teaching online courses. The instructors also noted that longer-term projects and group assignments continue to pose challenges in online courses, especially for less-prepared students.
  • At the final workshop, participants agreed that their conversation about online humanities instruction has noticeably shifted from “Should we do this?” to “How do we do this?” The instructors unanimously agreed that the experience made them better teachers overall. Additionally, the Consortium gave some instructors the opportunity to teach in their area of specialization when their course fulfilled curricular needs of other participating institutions.

Goal 2: Enhancing Student Learning

Overall, students performed well in the Consortium courses, although the DFW rate (the rate at which students withdrew or earned D’s or F’s) for visiting students was slightly higher than that of locally enrolled students.

  • The student withdrawal rate across all Consortium courses was only about 6 percent, considerably lower than last year’s rate (11 percent). Ninety-two percent of the enrolled students completed their courses and received grades, and of those who completed, 91 percent earned a C or better (compared to 80 percent last year). The average GPA was 3.22 (grade B) and the median GPA was 3.70 (grade A-). Moreover, the average learning outcome score across the courses was 3.06, which falls within the “competent” range. The peer faculty also rated students’ work in other Consortium courses highly (3.02, competent).
  • A gap that warrants further study is that the DFW rate for visiting students was 4.36 percentage points higher than that for locally enrolled students. Some instructors noted in their survey responses that working with students who bring different calendars, cultural norms, and academic expectations was challenging.
  • Both student surveys and interviews with project team members reinforced the advantage of being able to offer online courses for commuter students as well as visiting students from institutions that have relatively limited course offerings. Several instructors at the final workshop also noted that the increased course options and diversified student population made teaching and learning more dynamic and interesting.

Goal 3: Increasing Efficiency

Instructors can save time (and therefore resources) as they gain more experience with their online courses. Administrators are confident that cross-institutional collaboration will result in greater efficiency over time, but participants agreed that more coordination and integration will be needed to manage partnerships.

  • The cross-enrollment experiment was quite successful this year with nearly 80 percent of the responding instructors indicating that students from other Consortium institutions had enrolled in their courses (compared to 60 percent in the first Consortium).
  • In 2017, 44 percent of the responding instructors indicated that they had created brand-new courses. In 2018, more than 70 percent of instructors noted that they made only slight modifications to their courses. Also, a smaller share of instructors this year noted that they sought additional training (formal or informal) to teach online and used the support of instructional designers and technologists to design their courses.
  • The Consortium courses also helped students manage their time. The students in the second year of Consortium II, like those in the first year, indicated that the primary reason for taking the courses online was convenience. Those who responded in this way were especially appreciative of the opportunity to complete coursework on their own schedules.
  • The instructors noted in their survey responses that the process for sharing courses would have to become more systematic and designed in a way that benefits all members. The administrators at the final workshop noted that although this experiment did not directly address their low enrollment and cost-saving issues, it has shown people that collaboration can work and is a promising way to diversify their course offerings and serve different populations of students in ways they couldn’t do previously.

Data Sources

In order to assess the Consortium’s success in achieving each of its explicit goals, we collected data from multiple sources. These include:

  • Registration Data [N= 21 institutions]. Student-level registration data for Consortium courses at each institution were gathered at the end of the Spring 2018 term to report course enrollments, course completion rates, and grade distributions. Institutional-level data on the total number of courses offered in person and online at each institution, as well as institutional spending on instruction, also were collected (the same data routinely reported to the federal Department of Education).[5] For more information about the specific data elements requested from the institutions, see Appendix A.
  • Instructor Learning Outcomes Scores [N= 38 instructors, 2283 scores]. Each instructor was asked to revisit the course-specific learning outcomes developed in the first year iteration and either keep or revise them before submitting them to the research team in January 2018. Instructors included with their learning outcomes a description of how they planned to use the digital tools associated with online/hybrid instruction to help students achieve course-specific learning outcomes. The number of learning outcomes provided by each instructor ranged between two and nine. At the end of the Spring 2018 term, instructors were asked to select one assignment to assess learning outcomes for each of their students and submit these scores for analysis. The scores were based on a four-point scale: Beginning (student did not meet the goal), Developing (student is approaching the goal), Competent (student met the goal), and Accomplished (student exceeded the goal). A total of 38 instructors, representing 20 institutions, submitted their scores.
  • Faculty Peer Assessment Scores [N= 51 artifacts, 504 scores]. Thirteen randomly selected instructors were asked to designate artifacts from three students in their courses to serve as evidence of all students’ performance toward each of the two predefined general learning outcomes. These general learning outcomes relate to students’ ability to interpret and analyze texts, as well as their ability to synthesize knowledge. Examples of artifacts included research papers, portfolio assignments, and creative endeavors (e.g., creative writing, multimedia assignments, online exhibits). A panel of six evaluators, selected from the cohort of participating faculty members, used the four-point scale rubric (i.e., Beginning, Developing, Competent, or Accomplished) to assess how well each artifact reflected the desired learning outcomes. This process aimed to assess whether instructors were able to successfully use the online format to achieve general goals of humanities instruction. The rubric and a full description of the learning outcomes are in Appendix B.
  • Instructor Survey [N= 40 instructors]. This survey was administered at the end of the Spring 2018 term. Sections related to student experience in the online course were derived from the Community of Inquiry survey instrument, which focuses on three constructs: instructor presence, social presence, and cognitive presence.[6] The survey attempted to understand instructors’ perceptions of their preparation to teach an online/hybrid course, their access to quality instructional supports, the amount of time spent preparing and delivering the course compared to traditional in-person courses, their beliefs about whether students’ learning outcomes and experiences in online courses are comparable to in-person courses, and their reflections on implementation success. All but one instructor who taught the same online or hybrid course for a second time in Spring 2018 completed the survey. The instructor survey instrument is in Appendix C. Note that questions 14, 15 and 51 were added to the survey developed for the first Consortium cohort to better understand (1) the benefits and challenges of having students from other institutions enroll in the courses and (2) whether instructors believed that a consortium could help institutions and departments enhance student learning outcomes and address low enrollment and the high cost of course delivery.
  • Student Survey [N= 41 courses, 257 students]. Student surveys were submitted for all 41 courses. Surveys were administered by the research team on a third-party platform, but instructors coordinated their own students’ participation. The number of student survey responses for each course varied between zero and 39. The student survey instrument is in Appendix D. Note that questions 1, 13 and 14 were added to the original survey to better understand the benefits and challenges of having students from other institutions in the classes.
  • Stakeholder Interviews [N= 9 stakeholders]. Additional views on the Consortium experience were collected during stakeholder interviews with a sample of three faculty members, four administrators, and two registrars. The list of interviewees and the interview scripts are in Appendix E.

Limitations

  • Because we were not able to measure student learning in Consortium courses against comparable traditionally taught courses, the student learning outcomes lack a baseline for comparison. As a result, the student learning outcome scores may not reflect the full impact of online instruction on student learning. Nonetheless, many faculty members (and some students) offered qualitative comparisons between face-to-face and online courses in their surveys and interviews.
  • Because all analyses in this report reflect aggregate data, it is difficult to isolate the unique challenges presented by any single institution or course.
  • The interpretation of student survey responses is hindered by a low response rate, as only 39 percent of enrolled students completed the survey.

Description of Courses and Participants

All except three courses offered through the Consortium in Spring 2017 were offered again in Spring 2018 by the same instructors.[7] Almost all courses (95 percent) were 200- and 300-level upper-division humanities courses offered in a range of disciplines including Art History, Asian Studies, Classics, French, Italian, Religious Studies, Theatre, Women’s Studies, and Writing.

Over 80 percent of the courses offered in 2017 were taught entirely online, and 78 percent of responding instructors this year indicated that their course formats did not change significantly (Figure 1). Some hybrid courses were also transformed into fully online courses this year, suggesting that almost all courses were delivered primarily online with limited in-person interactions between instructors and students.

Figure 1: Changes Made to Course Format from Year 1 to Year 2

When asked to indicate whether face-to-face interaction was built into their courses with locally enrolled students, 45 percent of the responding instructors indicated that their courses did not include opportunities to interact in person, while another 38 percent noted they had some opportunities (Figure 2). Anecdotal accounts from both students and instructors suggest that the amount of face-to-face interaction between instructors and visiting students was very limited. Some instructors noted, though, that they required or encouraged visiting students to interact with them synchronously using video conferencing tools.

Figure 2: Face-to-Face Interactions between Instructors and Local Students

Slightly more instructors this year compared to last year thought enrollment was about the same or higher than the typical enrollment in comparable face-to-face courses (Figure 3).

Figure 3: Consortium Enrollment Compared to Typical Enrollment in Year 1 and Year 2

This year, about 83 percent of responding instructors indicated that they had students from other Consortium institutions enrolled in their courses, with the majority having five or fewer visiting students (Figure 4). In the first Consortium, about 60 percent of courses attracted enrollments from other campuses.

Figure 4: Cross-Campus Enrollment

Of the 661 students enrolled in the Consortium courses in Spring 2018, 257 responded to the survey (39 percent). Of those who responded, about 12 percent were visiting students. Most of the responding students were juniors and seniors, similar to the group that took Consortium courses in Spring 2017 (Figure 5).

Figure 5: Student Survey Respondents’ Class Levels

Fifty-seven percent of local and 56 percent of visiting student respondents indicated that they had prior experience with online/hybrid courses (Figure 6). This result suggests that visiting students were not significantly different from local students in terms of their experience with online courses, although this must be interpreted with a caveat since only 39 percent of enrolled students completed the survey. About 60 percent of last year’s responding students indicated that they had prior experience with online/hybrid courses.

Figure 6: Student Survey Respondents’ Experience with Online/Hybrid Courses

As was the case in 2017, the responding students’ top reasons for enrolling in a course offered by the Consortium were: (1) It fit my schedule, (2) The course is required for my major, and (3) The quality/reputation of the instructor attracted me to this course (Figure 7). Almost 50 percent of the students chose “It fit my schedule” as their first reason for enrolling in the Consortium courses, and another 30 percent chose it as their second reason. This confirms that the flexibility afforded by the online mode of delivery continues to be a strong motivating factor for students when choosing to enroll in these courses.

Figure 7: Students’ Top Reasons for Enrolling in Consortium Courses

Students’ written responses reinforced that convenience and flexibility of scheduling were the main motivating factors in choosing to enroll in the courses. One student who took the online course from her home institution wrote:

Being a mother of 4 children and working [in] a high-stress full time job, I prefer online courses. It is not always easy or convenient for me to drive to campus because of my work schedule or family obligations. I am thankful for the few opportunities I have had to take an online course. I only wish more online courses were offered on a more consistent basis.

Another student who took the course from an institution other than his own also noted the flexibility online courses offer in terms of how he interacts with course content:

I like online classes. I get to choose when I have the time to do the work and choose how to interact with the information myself. Also, video lectures are better for me over real [sic] ones because of the ability to pause, rewind, re-watch, etc.

Such sentiments were also shared by some instructors, who noted the advantages of being able to check student progress during the slower parts of the day and not having to be in a classroom at a set time.

Instructor Experience

Because the courses were being offered for the second time in Spring 2018, the typical amount of time required to prepare each course for teaching was less than it had been a year earlier. Forty-four percent of responding instructors last year indicated that they had created brand-new courses. In contrast, 73 percent of responding instructors this year noted that they only had to make slight modifications to their courses (Figure 8).

Figure 8: Modifications Made to Courses in Year 1 and Year 2

In Year 1, more than 50 percent of responding instructors indicated that they spent “much more time” on both planning and developing their courses than they typically would for courses taught in traditional classroom settings. In Year 2, only 15 percent of the responding instructors indicated that revising their courses took much more time than revising traditionally taught courses would. Even though 50 percent of responding instructors this year indicated that revising their courses took more time or much more time, more instructors this year (about 28 percent) noted that the actual teaching of the courses took less time (Figure 9). One instructor wrote:

In all honestly, because the work was so front-loaded, teaching an online course allowed me to have more time in the week to grade, respond to discussion posts, and focus on student learning.

Some of the most time-intensive revisions that instructors noted in their written responses included creating video lectures; redesigning assignments, exams, and other course resources; changing their grading systems and the number of assignments; adding more specific instructions for online modules; and carefully planning the order of the materials to provide a clearer course road map for students. Several instructors also noted that learning the technology necessary to include visiting students, and changing their course schedules to allow for those students’ participation, took a significant amount of time and resources.

Figure 9: Amount of Time Required to Revise and Teach Consortium Courses

Last year, 87 percent of instructors reported that they had some kind of training to teach online, 74 percent indicated that they had access to instructional designers or technologists to support the design and development of their courses, and 80 percent indicated that they had access to IT support. This year, those numbers decreased: about 50 percent of the instructors indicated that they had additional training to teach online, and about the same share indicated that they had access to instructional designers or technologists to help them revise their courses. In addition, 67 percent indicated that they had access to IT support this year.

It is unclear whether these decreases reflect the instructors’ greater comfort with online teaching (and thus a need for fewer or different resources) or a curtailment of resources at their institutions. As shown in Figure 10, however, an overwhelming majority of the instructors reported that they felt adequately prepared both to revise and to offer their courses this year, and a relatively small number indicated that they experienced significant challenges in doing so.

Figure 10: Instructors’ Perceived Level of Preparation and Support

Even though over 80 percent of instructors felt comfortable guiding their class toward an understanding of the course topics and helping them to clarify their thinking in the online environment, about 40 percent indicated that they did not get to know their students as individuals in this course, and almost 80 percent indicated that they were not able to form personal relationships with students similar to those that they have with students in traditionally taught courses (Figure 11). These patterns are similar to those found last year.

Figure 11: Instructors’ Interactions with Students in Consortium Courses

Several instructors noted that the inability to form personal relationships with their students and to engage in “before class casual conversation” were the least satisfying aspects of teaching online courses. One instructor wrote:

I missed getting to know the students, and didn’t feel a “spark” like I feel when teaching students face-to-face. I enjoy getting to know them as people, laughing as a group, etc.—those are things that are difficult, and sometimes impossible, to replicate online.

Another instructor wrote:

I miss the opportunity to talk with students about one of my primary research interests and to share my passion about the subject with them personally. But the fact that we didn’t meet meant I couldn’t get “off track” during our discussions.

Similar to last year, the instructors this year had mixed feelings about whether the online environment helped create a sense of community among students (Figure 12). An interesting pattern to note is that most of the instructors this year reacted positively to the statement about online discussions helping students to develop a sense of collaboration (~80 percent compared to ~40 percent last year). This may be due in part to the instructors’ greater confidence with online teaching from the experience they gained through the Consortium and/or other online courses and training offered through their institutions.

Figure 12: Instructors’ Perceived Social Presence in Consortium Courses

At least anecdotally, more instructors this year seemed quite pleased with the level of online discussion in their courses, although some noted continuing problems:

Discussion was not nearly as good as a face-to-face class. Students are comfortable asking questions about topics they genuinely do not understand in a face-to-face class where their words are ephemeral and they get immediate feedback. They are unwilling to post what they feel might be “dumb questions” to a discussion board where their words are eternal.

The [online format limited my ability to stray off] subject, which I like to do. In the online format, discussions were more rigid.

The first part of Table 1 highlights some instructional approaches that instructors reported worked especially well in their courses this year. These approaches involve guiding students through the course content and assignments by giving weekly instructor overviews and establishing a well-defined weekly schedule that helps students make the coursework part of a daily routine. Giving very clear and targeted feedback on individual students’ work also proved to be an effective strategy, as did presenting a variety of resources and assignment types to make students’ learning experience more interesting and engaging.

There are still some lingering challenges worth noting (see the second part of Table 1). Some instructors believe that the asynchronous nature of online courses makes it difficult for them to model certain disciplinary practices that are most effective in a face-to-face context. Also, in the case of entry-level language courses, where many students have little or no prior background in the language, the online space was not ideal for providing regular opportunities for students to practice using the language with native speakers and other students. As was the case last year, instructors found that longer-term projects (e.g., research papers) and group projects are challenging, especially for weaker students and those who are not locally enrolled. For many instructors, the inability to meet with students to provide on-the-spot feedback or continuing guidance on assignments that require deeper thought and effort was the main barrier to achieving their learning goals.

Table 1: Successful Instructional Approaches and Lingering Challenges

Approaches that Worked Well

  • Giving weekly updates/overviews to the whole class to help them understand what the class as a group needs to focus on
  • Requiring regular interactions in a weekly pattern – e.g., posts due on Mondays and Wednesdays, reading quizzes due on Thursdays, replies to posts due on Fridays
  • Giving targeted responses to individual students on each assignment – e.g., pointing out certain tendencies or misunderstandings of the course material, urging students to make more meaningful responses to others’ posts (raising questions rather than offering simple agreement or disagreement), and giving helpful hints about how to approach a particularly difficult reading
  • Presenting a mix of resources and a variety of assignment types each week to help make the learning experience more interesting and engaging

Lingering Challenges

  • The asynchronous nature of online courses makes it difficult to model certain disciplinary practices – e.g., in a discipline like Theology, which is more reflective than discursive, it is difficult to model those practices in an online space
  • The asynchronous nature of online courses poses challenges for entry-level language courses, in which students could benefit from meeting the instructor and other students face-to-face to practice using the language
  • Projects or assignments that require synchronous participation are especially challenging for visiting students
  • Research projects or longer-term assignments that must be done more independently pose challenges for weaker students who could benefit from more one-on-one guidance through class interactions

When asked to rate the perceived depth and breadth of learning in the Consortium courses compared to previous courses they have taught in more traditional classroom settings, more instructors indicated that learning was shallower in their online courses (Figure 13).

Figure 13: Depth and Breadth of Learning in Consortium Courses Compared to Traditional Courses

When asked to indicate their perception of the percentage of students who either met or exceeded their learning expectations, 49 percent of responding instructors indicated that 85 to 100 percent of their students met or exceeded their expectations while another 36 percent of instructors indicated that 70 to 85 percent of their students did so (Figure 14).

Figure 14: Instructors’ Perception of the Percentage of Students Meeting Learning Expectations

When asked to rate how the performance of visiting students compared to that of local students, 55 percent of responding instructors thought that students’ performance was about the same; 31 percent thought it was somewhat worse or much worse; and 14 percent thought that it was somewhat better or much better (Figure 15). These perceptions are in fact reflected in the final grades that are presented in a later section of this report.

Figure 15: Performance of Students Enrolled from Other Institutions

A close examination of responding instructors’ written responses reveals that the number and quality of interactions they had with visiting students varied considerably. One instructor actually had more interactions with his visiting student than with his local students. He noted:

I had one student from another campus who enrolled in the course for its duration, and I met with her via Skype weekly for 30–40 minute conversations; thus, I had a much higher level of one-on-one interaction with her than the students on my own campus.

On the other hand, another instructor, who had one student from another Consortium institution and five others who were taking his course from France, wrote:

 [My visiting students] did not take advantage of repeated offers to use Skype, FaceTime or Zoom to talk to me about their papers. The on-campus students came to my physical office or face-to-face meetings. In my face-to-face classes, the repeated reminders to visit office hours usually get 15–25% of students to show up. I think [those] who did not use online conference tools to chat with me about their papers did not learn as much. They also missed paper deadlines at greater rates than the students on campus in this online class.

Another instructor noted that his visiting students had different attitudes and behaviors that were obvious in their engagement with the coursework:

This is an ethics class. Different social and cultural attitudes and behaviors were obvious in discussion. Different academic expectations in terms of quantity and quality of writing, quantity of reading, and expectations as to appropriate answers to exam questions were also obvious. 

Despite such challenges, some instructors at the final workshop spoke enthusiastically about the benefit to their local students of hearing different perspectives from visiting students from other geographic regions with different cultural norms.

Student Experience

As was true last year, when the Consortium courses were offered for the first time, student respondents gave high marks to the level of social presence in their courses (Figure 16). Many students felt comfortable interacting with other students online, disagreeing with others’ viewpoints while maintaining a sense of trust, and developing a sense of collaboration (although a smaller percentage of students agreed with the latter this year). They also rated highly the role of online discussions in helping them appreciate different perspectives, as well as the instructors’ role in facilitating productive discussions and developing a sense of community among students.

Figure 16: Students’ Perceived Social Presence in Consortium Courses

Like last year’s students, most of the responding students this year rated their learning experience in Consortium courses highly (Figure 17). Many indicated that they felt motivated to explore questions raised by the course and rated highly the instructor’s role in providing clear instructions on how to participate in course learning activities. They also appreciated the instructors providing guidance to help them understand course topics in ways that clarified their thinking. Many students also responded positively to the transferability of knowledge gained in these online/hybrid courses to other related courses and activities.

Figure 17: Students’ Perceived Cognitive and Instructor Presence in Consortium Courses

Also, like last year’s responding students, most of the students this year were comfortable using the online tools and technology that were part of their courses and believed that they had adequate access to technical support throughout the semester (Figure 18).

Figure 18: Students’ Experience with Technology in Spring 2018

When students were asked to indicate how their courses compared to other traditional in-person courses, 41 percent of responding students said that the Consortium courses were somewhat better or much better; 37 percent thought they were about the same; and the remaining 22 percent indicated that they were somewhat or much worse than most traditionally taught courses (Figure 19).

Figure 19: Comparing Consortium Courses to Traditional In-Person Courses (Students)

Those who responded favorably to the Consortium courses cited the convenience of fitting the courses into their schedules and the greater level of independence they afforded as reasons for their positive responses. One student wrote:

The course was very independent which worked best for my schedule. As a senior, the course content connected with my research so I enjoyed completing the assignments on my own time. I was able to read and write without the restriction of an in-class deadline.

Another student rated her course highly because it was well organized to help her keep up with the work and manage time effectively:

The course was organized very effectively, I liked that there were certain days for specific assignments each week because it helped me stay on top of my work, as well as manage my time to make sure I got each assignment done. I also like how I had many interactions online with my peers.

Another student noted that the Consortium course helped her develop important skills that traditional in-person classes do not necessarily promote due to the very nature of the instructional medium:

I think this class was better than a traditional in-person class, because it raised more challenges, for example, at times we would have group projects and we would be challenged with the task of communicating with students we never met before and it worked out. It helped improve my communication skills, take constructive criticism (discussion board posts) and study independently.

Similar sentiments were shared by another student who wrote,

I think learning online with other students lets me be open more in my discussions because the fear of others’ opinions of my opinions was decreased due to not having to be in a classroom face-to-face with my classmates […] I also think I learned to motivate myself and [developed] more discipline having to do work on my own and meeting deadlines […] online classes require more work because you’re not in a traditional classroom where time is limited so you have to use different methods to make sure you’re learning what you need to learn.

Those who responded less favorably to this question cited two main reasons for their negative responses:  the difficulty of keeping up with a lot of details and the demanding coursework. One student wrote:

This course was just very hard to keep track of and in my opinion way too demanding for an online course. The lectures were fine but I just really think the class needs to be better fitted for an online environment and have a drastically different curriculum. I do think that I would have enjoyed this same course more in person and think the online aspect was the biggest problem.

Another student thought that the online format did not support in-depth learning, at least in the particular course they had taken:

I felt like I didn’t really learn the material. I just read the chapter assigned and asked to answer questions. The instructor never went over anything to summarize what we read, she came off like we were supposed to already know and understand everything that we had read.

Another student wrote:

For me, it was difficult to stay on top of my work. I also felt like we had to do a lot of busy work to prove that we were listening to the lectures.

In terms of the difficulty of Consortium courses compared to other upper-level humanities courses, 36 percent of responding students thought their courses were somewhat more difficult or much more difficult, 48 percent thought they were about the same, and 16 percent thought they were somewhat easier or much easier.

Figure 20: Comparing Consortium Courses to Other Upper Level Humanities Courses (Students)

In response to a question asking them to describe the benefits or challenges of having students from other institutions in the same class, many local students noted that they were unaware of the presence of visiting students in their courses. However, many students were supportive of the idea of having students from other campuses in their courses. One student wrote:

It is interesting to interact with students from other locations because mindsets and viewpoints are so different. In the traditional lecture setting, we are used to working with students usually in the same year as us and oftentimes the same major, so it was nice to be able to interact with different students.

Another student also appreciated the opportunity to interact with students from different backgrounds and gain new perspectives:

It was interesting to gain perspectives from people outside [of my home institution] community. [My home institution’s] demographic tends to be similar (mostly young women), but students from other schools gave the opportunity for new insight in online discussions.

Many visiting students reported that communicating with professors and navigating different academic standards and schedules were challenging, but they appreciated the opportunity to take courses not offered by their home institutions. One visiting student wrote,

My home university offers almost no specialized humanities courses. I like having access to these more narrowly focused courses from another university without having to “transfer” credits.

Another visiting student wrote,

[My home institution] doesn’t have these courses and I feel like taking this class really helped advance my knowledge in something I am actually interested in. I was excited when I saw this course offered and am really glad I got to take it as I would have not had an opportunity to learn about the art in such great depth.

Like last year’s students, the student respondents this year rated their overall experience with the courses very positively, with 84 percent indicating that their experience was either good or very good (Figure 21).

Figure 21: Students’ Experience with Consortium Courses

Finally, as was also the case last year, over 80 percent of students indicated that they would probably or definitely take another online/hybrid course in the future (Figure 22).

Figure 22: Students’ Willingness to Take Online/Hybrid Course in the Future

Student Learning Outcomes

All 21 institutions supplied course-level data, which included course completion and grade outcomes for 661 students. Based on the data reported by the institutions, the student withdrawal rate across the Consortium courses was just 6 percent, considerably lower than the first iteration’s withdrawal rate (11 percent). Of the 661 students, 610 (92 percent) completed their courses and received grades. Of those who completed their courses, 553 (91 percent) received a C or better, which also is an improvement over last year’s rate (~80 percent). The average GPA was 3.22, which is equivalent to the letter grade B, and the median GPA was 3.70, which is equivalent to the letter grade A-.

Visiting students comprised about 13 percent of the total enrollment in these courses (577 local and 84 visiting students). The withdrawal rate for visiting students was slightly higher than that for the local students (7.14 percent and 5.89 percent, respectively), and a higher percentage of visiting students earned F’s (9.5 percent compared to 3.1 percent). Overall, the DFW rate for the visiting student group was 4.36 percentage points higher than that of the local students (16.67 percent and 12.31 percent, respectively). The average GPA for visiting students was 3.09 and that of local students was 3.24, both of which are equivalent to the letter grade B. The median GPA for both visiting and local students was 3.70, which is equivalent to the letter grade A-.
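For readers who want to trace the arithmetic behind these figures, the short sketch below shows how a DFW rate and the percentage-point gap are calculated. This is an illustrative calculation only, not the research team’s analysis code; the D/F/W counts used here are inferred from the reported percentages and enrollments (84 visiting and 577 locally enrolled students), so they are approximate.

    # Illustrative only: counts are inferred from the reported percentages,
    # not taken from the raw registration data.
    def dfw_rate(dfw_count, enrolled):
        """Percent of enrolled students who earned a D or F or withdrew (W)."""
        return 100 * dfw_count / enrolled

    visiting_rate = dfw_rate(14, 84)    # ~16.67 percent (count inferred)
    local_rate = dfw_rate(71, 577)      # ~12.31 percent (count inferred)
    gap = visiting_rate - local_rate    # ~4.36 percentage points, as reported
    print(f"Visiting: {visiting_rate:.2f}%  Local: {local_rate:.2f}%  Gap: {gap:.2f} points")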

Figure 23: Grade Distribution across Consortium Courses

(Note: AB’s were counted as A’s and BC’s were counted as B’s)

Course-Specific Learning Outcome Scores by Instructors

A total of 38 instructors representing 20 Consortium institutions provided learning outcome scores for their students. Instructors identified between two and nine specific learning outcomes for their individual courses at the beginning of the semester and assessed each of their students’ achievements on a particular assignment using the following scale: Beginning [1], Developing [2], Competent [3], and Accomplished [4]. The average learning outcome score was 3.06, which falls within the “competent” range.
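As a point of reference, the brief sketch below shows how ratings on the four-point scale translate into an average score like the 3.06 reported here (and the 3.02 reported in the next section). The label-to-number mapping follows the scale described above, but the code and the example ratings are hypothetical.

    # Hypothetical sketch of converting four-point-scale labels to an average score.
    SCALE = {"Beginning": 1, "Developing": 2, "Competent": 3, "Accomplished": 4}

    def average_outcome_score(ratings):
        """Average numeric score across a list of learning outcome ratings."""
        return sum(SCALE[r] for r in ratings) / len(ratings)

    example = ["Competent", "Accomplished", "Developing", "Competent", "Competent"]
    print(average_outcome_score(example))  # 3.0, within the "competent" range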

General Learning Outcome Scores by Peer Evaluators

A panel of six faculty evaluators recruited from the current cohort of instructors used the same four-point scale to assess student learning outcomes by reviewing selected students’ work from other Consortium courses. The faculty evaluators reviewed a total of 51 student artifacts submitted by 13 instructors to assess whether the students achieved two learning outcomes identified as general goals of the humanities disciplines: (1) interpreting and analyzing texts, and (2) synthesizing knowledge. The average assessment score was 3.02, which also falls within the “competent” range. Figure 24 shows the distribution of scores for each learning outcome.

One faculty evaluator noted that even though the evaluation rubric (see Appendix B) was largely content-based and had no specific component for assessing the rigor of student writing, the quality of the grammar, syntax, and style applied to an assignment inevitably affected the results. He noted that most assignments were text-based. The same evaluator also commented that there were several creatively framed projects (involving blogs, interviews, and web-based interactive discussions, for example) from which he was able to sense a higher level of student enthusiasm and learning. He suggested that the evaluation rubric be modified in the future to better assess the value of such creative projects for student learning.

Figure 24: Distribution of Learning Outcomes Scores by Peer Faculty Assessors

Overall Assessment

Compared to the previous year, more instructors thought that their courses went better than they had expected (38 percent compared to 31 percent) or about as well as they had expected (48 percent compared to 23 percent) (Figure 25). As was the case following the first iteration of the courses, faculty members in 2018 explained that having engaged and high-performing students in their courses made online teaching a positive experience.

Figure 25: Instructors’ Overall Assessment of Second Iteration of Consortium Courses

Some instructors noted that certain revisions made to their courses in the second iteration seemed to have helped enhance student outcomes this year. As one instructor wrote:

Last year’s quiz and exam results averaged a B to B-. This year’s average was a pretty consistent B+. I really think the addition of the instructor overviews played a big part in motivating students to dive into the weekly course materials with more interest and focus.

Another instructor wrote that setting clearer expectations for students this year helped improve their overall engagement and appreciation for the course:

Based on my experience from last year — which was less than I had expected/hoped — I wasn’t sure what would happen. But overall the students seemed more engaged, there were more expressions of appreciation for the course, and, I think, more learning. I’m not entirely sure why — I’ll continue to reflect on that. I think I was clearer with my expectations, though, which helped a lot.

One instructor who responded less positively to this question noted that embedding many assessment components to test students’ developing knowledge in the course produced an overwhelming amount of grading.

Too many students and too many technology issues. I couldn’t give them anywhere near the attention they deserved. I tried, but the assessment component to ensure they were doing reading and writing absolutely buried me. 400 level is just too much for an online course. Last year was better because the students all knew me and each other, and frankly were less needy. This year was just HARD.

Such sentiments were shared by other instructors who found it difficult to preserve an adequate level of rigor in their courses without making them too burdensome to teach and deliver. Some instructors also noted that many students who could have benefited from virtual office hours did not take advantage of them despite repeated offers. One instructor wrote:

I am very accessible and approachable, but somehow students didn’t make appointments for online conferencing despite repeated suggestions, on my part, that grades often improve after such appointments. Perhaps the lack of immediacy in the online environment made that offer less credible.

As was true last year, most of the responding instructors (90 percent) rated the online/hybrid instructional format as appropriate for teaching advanced humanities content (Figure 26).

Figure 26: Instructors’ Overall Assessment of Appropriateness of Online/Hybrid Format for Teaching Advanced Humanities Content

One instructor noted that many courses can be delivered online, and if designed and delivered effectively, they can engage all students, not just “a particularly good/chatty student” in discussions:

I do not think EVERY course is appropriate for online delivery, but I certainly think many courses can be delivered online. I was surprised at how much better my students performed online. There is no “hiding” in an online class. Students cannot rely on a particularly good/chatty student doing the work of discussion. It is very clear to me and to the students if they are doing the work or not.

Another instructor, who also responded positively to this question, noted that if designed and delivered properly, online courses can help students develop the skills they need to be successful:

I believe absolutely that we can deliver high-quality, upper-level courses in the humanities in an online/hybrid format. Course design is critical, as is a willingness to invest considerable “upfront” time at the beginning. When done properly, however, online/hybrid courses can engage students and help them develop the skills they need for life-long personal and professional success at the same rate as traditional courses.

One instructor noted that the greater flexibility and accessibility offered by online courses are notable but that there are issues that are more challenging to address. Depth of learning and spontaneous conversation that can occur in in-person discussions are not always present in online courses:

There are definitely benefits to an online or hybrid course for a small college, primarily the delivery of course content to students who need schedule flexibility, or who cannot be present for a traditional course. Access is a huge benefit. However, there are some significant benefits for students that are lost, I think, or at least challenging to address, in an online format. Chief among these is the depth of learning and spontaneous conversation that can occur during class discussions in person…Online can be lonelier in some ways, and not as rich an environment.

Fifty-five percent of responding instructors this year indicated that they are more likely or much more likely to encourage their colleagues to teach online as a result of their experience with the second iteration of Consortium courses in 2017–2018 (Figure 27).

Figure 27: Instructors’ Likelihood of Encouraging Other Colleagues to Teach Online Due to Consortium Experience

Similarly, a majority of instructors indicated that they would encourage students to enroll in an online/hybrid course and that they would be interested in teaching another online/hybrid course in the future (Figure 28).

Figure 28: Instructors’ Willingness to Teach Online and Encourage Students to Take Online Courses

Faculty members were asked to share their “big takeaways” after having taught the same online courses twice as part of the Consortium. One instructor noted that setting clear expectations for online classes at the outset is very important and something that should be emphasized to students:

I think I should create a more substantive first lecture about the expectations for online classes. I had one such lecture this term, but next time I’ll really emphasize that self-starters who keep a calendar with work to do every few days seem to have learned more and done better than those who just check in now and then.

Another instructor noted that it is possible to incorporate “informal” types of discourse in an online sphere in a similar way as in a traditional face-to-face context, which sometimes helps students pay attention and internalize course content:

Showing my real self—my frustrations as the instructor, hopes, and expectations for students—really makes the students pay attention more.  My face-to-face method uses a lot of “informal” types of discourse and allows for the class to go off a bit on tangents. Though that seems like it would distract from the content, it makes them internalize it more. I found that I can do that in the online sphere as well and it works in a similar way.

Another instructor noted that keeping an online course up-to-date is not an easy chore, especially for subject areas that have new and emerging materials that should be explored. This instructor also noted that the ability to make abundant materials available to students gave students more choice in determining how they engage with the course:

One thing I’ve been struck by is how much time one could spend keeping one’s online course current and relevant — and how quickly the content can become out of date or out of touch. This is no doubt especially true in a class about science fiction and the future. There is so much fascinating material to be explored! And since my course is designed so that students have quite a bit of choice about the work they do, I can put lots of that material in front of them.

Participants’ Impressions of the Consortium

The faculty, administrators, and registrars with whom the research team spoke all agreed that the consortium concept has considerable potential for helping small, independent colleges provide broader course offerings to their students at a reasonable cost. At least one dean praised the Consortium for its ability to give students more varied humanities offerings:

One of our hopes for this program was that we would be able to offer a greater variety of humanities courses to our students. It was very interesting to see that a significant number of our students—14—took online courses from another institution. So, they did have more options.

Another administrator interested in expanding offerings for students said:

We have been concerned about online instruction in part to attract more non-traditional students, but even our traditional students are very concerned about flexibility and convenience. Our nursing students, for example, struggle with the time demands of course work and labs. Having online courses they can take really helps them.

The registrars also were positive about the Consortium; all of the interviewed registrars commented on their pleasant surprise that the logistics of collaboration were easier to deal with than expected:

We were concerned that it would be difficult to deal with registering and recording grades for students from other schools, but the process proved to be surprisingly easy. Everything worked well. If the project continues, we’ll need to develop some procedures, but this pilot project was so easy.

Another registrar also commented on the Consortium’s beneficial effect on collaboration:

Collaboration between and among faculty members has been the biggest highlight for us. Our faculty were willing to step out of their comfort zone to try new things. The workshop hosted by CIC was especially helpful in promoting collaboration. Our faculty gained lots of new ideas from talking with other faculty [from the Consortium].

In their written responses, many instructors noted that the process for sharing courses would need to be more systematic and designed in a way that benefits all, not just some, participating institutions. One instructor wrote:

It would be helpful to expand course offerings but in the world of scarce resources and cuts, it is too hard to see consortium classes as helpful rather than threatening for many colleagues. I was disappointed to see how many other institutions would not accept my course (while my institution accepted all courses that we did not offer). Maybe this could help in a better environment.

Another instructor wrote:

Colleagues in small programs may fear having enrollment in their own courses reduced by students migrating to online courses from other colleges; responsible faculty (course coordinators and department heads, and even deans) want to make sure that their SLOs (Student Learning Outcomes) are being prioritized and that the faculty teaching the course on another campus is qualified. In essence, the consortium would have to create a very robust and transparent structure/protocol addressing faculty concerns and stipulating what and how many online courses on other campuses that students would be allowed to take. In my opinion, it would take an extraordinary effort to allay the fears (some justified, others not) of faculty before they would accept this.

Many also pointed out that robust logistical support should be available both at the individual institution and consortium-wide levels to help instructors redesign their courses successfully and deliver them effectively to more students. Several instructors noted that offering online courses to students who follow different academic calendars was very challenging and thought that support should be in place for students who cross institutional boundaries to take advantage of courses offered by other institutions.

Conclusion

When this experiment began four years ago, liberal arts colleges had relatively limited experience with online teaching and learning. While online learning was receiving a tremendous amount of attention from funding agencies, governing boards, and educational media—much of it fixated on MOOCs and other large-scale initiatives—faculty and administrators at smaller independent colleges and universities pointed to the superior learning experience in small face-to-face classes. Yet even at these institutions, administrators considered the possibility that online courses would help to increase course offerings without increasing instructional costs, while at least a few faculty members wanted to experiment with new pedagogical methods.

While faculty members and administrators who have participated in the Consortium have gained confidence in the efficacy of online learning, not everyone on these small campuses is convinced that online learning is an acceptable option for their institutions.  Some faculty members who have not been part of the Consortium project still worry about losing their jobs if online teaching becomes widely practiced. Others believe that the special value proposition of their institutions would be lost in an online environment. Others worry that they do not have the technological skills necessary to develop online courses, or that their institutions will not be able to provide sufficient technology support for instructors or students.

The Council of Independent Colleges’ Consortium for Online Humanities Instruction II has clearly demonstrated that online learning can be designed and delivered in ways that are academically sound. For participating faculty, there are no lingering questions about the value or validity of online instruction. They have concluded that good teaching is good teaching regardless of the delivery mode. Students have come to expect access to online courses, and they value both the unique learning experience and the convenience online courses afford.

Instructors who have participated in the Consortium readily admit that their teaching practice has improved through the process of creating and delivering online courses. They were forced to consider carefully the learning objectives of their courses and the relationship of each assignment to those objectives. They observed that students were, for the most part, deeply engaged with the course material, and that students interacted easily with their classmates and instructors. Nonetheless, faculty members continue to place a high value on personal interaction with their students, and they missed that interaction in the online environment. Several instructors at the final workshop in July 2018, for example, described how they missed seeing the facial expressions of their students, which they usually rely on to gauge comprehension and devise interventions.

Faculty and students alike were enthusiastic about opportunities to interact with students from other campuses in an online learning space. In some cases, having online courses available from other institutions meant that students were able to find the courses they needed to graduate on time. In others, it meant that students were able to take courses that were not available on their local campuses. And in some cases, homogeneous student populations had an opportunity to hear different perspectives from students in another region of the country, from another faith community, or from a different gender or ethnic background. In all cases, this exposure to “other” was perceived as valuable.

The ability to share upper-level online humanities courses is no longer in question. Institutions can schedule courses on their own campuses more efficiently by relying on online options from other institutions. Language instruction can be broadened by participating in a consortium in which different institutions focus on different languages. It may even be possible to reduce the number of adjunct instructors by relying more on online options for local students, an outcome that traditional faculty members would likely welcome as a more effective use of tenured, tenure-track, and other long-term, full-time faculty. Yet questions about the circumstances under which courses from other institutions will be offered to local students remain a political conundrum for many faculty members and their academic departments.

The project clearly demonstrated that students can learn effectively in online courses, engaging as deeply as they do in face-to-face courses, and that faculty can design online courses that result in good learning outcomes for their students. Faculty readily noted that online instruction is most effective when support structures are firmly in place. All participants in the Consortium valued the experience, but they also noted that it may take some time for online learning to become an accepted part of the academic program at all small colleges and universities.

 

Appendix A: List of Data Elements Requested

Student-Level Data

For students participating in Consortium courses:

  • Unique identifier (student IDs must be anonymous)
  • Home Institution
  • Student major field of study
  • Student minor field of study (if applicable)
  • Consortium course name and number
  • Student final course grade
  • Indicator of whether a student withdrew from the course
  • If available, the student’s withdrawal date
  • Indicator of whether the course counts towards the student’s major requirements (either core or elective)
  • Indicator of whether a student is visiting or locally enrolled

Course-Level Data

  • For each year from 2010-11 to 2015-16, the number of courses offered in person at the institution
  • For each year from 2010-11 to 2015-16, the number of courses offered online at the institution
  • For each year from 2010-11 to 2015-16, institutional spending on instruction

 

Appendix B: Rubric for Peer Assessment

CIC Consortium for Online Humanities Instruction Learning Outcomes Assessment Rubric

Outcome 1

High Level Goal: 1. Interpret meaning as it is expressed in artistic, intellectual, or cultural works

Beginning (did not meet the goal): The student
a. does not appropriately use discipline-based terminology.
b. does not summarize or describe major points or features of relevant works.
c. does not articulate similarities or differences in a range of works.

Developing (is approaching the goal): The student
a. attempts to use discipline-based terminology with uneven success, and demonstrates a basic understanding of that terminology.
b. summarizes or describes most of the major points or features of relevant works.
c. articulates some similarities and differences among assigned works.

Competent (met the goal): The student
a. uses discipline-based terminology appropriately and demonstrates a conceptual understanding of that terminology.
b. summarizes or describes the major points or features of relevant works, with some reference to a contextualizing disciplinary framework.
c. articulates important relationships among assigned works.

Accomplished (exceeded the goal): The student
a. incorporates and demonstrates command of disciplinary concepts and terminology in sophisticated and complex ways.
b. summarizes or describes the major points or features of relevant works in detail and depth, and articulates their significance within a contextualizing disciplinary framework.
c. articulates original and insightful relationships within and beyond the assigned works.

 

Outcome 2

High Level Goal: 2. Synthesize knowledge and perspectives gained from interpretive analysis (such as the interpretations referred to in goal 1)

Beginning (did not meet the goal): The student
a. makes judgments without using clearly defined criteria.
b. takes a position (perspective, thesis/hypothesis) that is simplistic and obvious.
c. does not attempt to understand or engage different positions or worldviews.

Developing (is approaching the goal): The student
a. makes judgments using rudimentary criteria that are appropriate to the discipline.
b. takes a specific position (perspective, thesis/hypothesis) that acknowledges different sides of an issue.
c. attempts to understand and engage different positions and worldviews.

Competent (met the goal): The student
a. makes judgments using clear criteria based on appropriate disciplinary principles.
b. takes a specific position (perspective, thesis/hypothesis) that takes into account the complexities of an issue and acknowledges others’ points of view.
c. understands and engages with different positions and worldviews.

Accomplished (exceeded the goal): The student
a. makes judgments using elegantly articulated criteria based on a sophisticated and critical engagement with disciplinary principles.
b. takes a specific position (perspective, thesis/hypothesis) that is imaginative, taking into account the complexities of an issue and engaging others’ points of view.
c. engages in sophisticated dialogue with different positions and worldviews.

 

 

Appendix C: Instructor Survey

Instructor Survey Instrument

Dear Consortium Colleague,

Thank you for taking the time to complete this survey. All questions in this survey refer to the course you taught this semester as part of CIC’s Consortium for Online Humanities Instruction. While we have pieces of this information from various sources (proposals, interviews, etc.), the survey will ensure that we have comprehensive information about all the participants’ courses and backgrounds. This will enable us to assess the impact of institutional and background factors on your experiences teaching online. We also wish to learn about your experiences and observations as a result of teaching the course.

This survey should take about 30-40 minutes to complete. If you wish to pause while filling out the survey, your work will be saved and you can return to it later. Click on the “Begin Survey” link below to agree to the terms of participation and start the survey.

 

Your answers to the following questions will give us a sense of your background and provide us with information about how your course changed from its first to its second iteration. 

  1. What is your institutional affiliation?
  2. How many years have you been teaching at this institution?
  3. What is your primary departmental affiliation?
  4. What is the name and number of your course?
  5. Did your course include opportunities for face-to-face interactions between the instructor and locally enrolled students?
  • My course included regular opportunities for face-to-face interactions between the instructor and locally enrolled students.
  • My course included some or ad hoc opportunities for face-to-face interactions between the instructor and locally enrolled students.
  • I delivered my course entirely online and did not have face-to-face interactions with locally enrolled students.
  • Other ________________________________________________

 

  6. Did the format of your course change significantly since last year?
  • My course changed from being a hybrid course to being delivered fully online.
  • My course changed from being a hybrid course to being delivered fully online for students enrolled from other institutions, but maintained face-to-face components for locally enrolled students.
  • My course changed from being a fully online course to having face-to-face components for locally enrolled students.
  • My course format did not change significantly.
  7. To what extent did you modify other aspects of your course (content, online tools used, assessment methods, etc.) since last year?
  • I made significant modifications to my course.
  • I made slight modifications to my course.
  • I made no modifications to my course.

 

Display Question 8 if selected “I made significant modifications to my course” or “I made slight modifications to my course” in Question 7.

  8. Which components of your course did you modify?
  • Course curriculum and/or content
  • Online tools and platforms
  • Learning outcomes
  • Assessments
  • Pedagogical approach
  • Other ________________________________________________
  9. How many students enrolled in your course this semester (final enrollment, after drops and adds)?
  • 5 or fewer
  • 6-10
  • 11-15
  • 16-20
  • 21 or more
  10. How many students enrolled from your home institution?
  • 5 or fewer
  • 6-10
  • 11-15
  • 16-20
  • 21 or more

 

  11. How many students enrolled from other institutions?
  • No students from other institutions enrolled in my course.
  • 5 or fewer
  • 6-10
  • 11-15
  • 16-20
  • 21 or more
  12. How does the number of students who enrolled in your course this semester compare to the typical enrollment for a course of this nature at your institution?
  • Fewer students enrolled in this course than typically do for a traditionally taught course of this nature.
  • About the same number of students enrolled in this course.
  • More students enrolled in this course than typically do for a traditionally taught course of this nature.
  • I am not sure.
  13. How does the number of students who enrolled this semester compare to the number of students who enrolled in your course’s first iteration?
  • Fewer students enrolled in this course than did during its first iteration.
  • About the same number of students enrolled in this course as did during its first iteration.
  • More students enrolled in this course than did during its first iteration.
  • I am not sure.

 

Display Questions 14 and 15 if selected “5 or fewer”, “6-10”, “11-15”, “16-20”, or “21 or more” in Question 11.

  14. Do you think having student(s) in the class from institutions other than your home institution enriched (or hindered) the overall learning experience of the students? Please explain.
  15. Please describe any challenges that you encountered having student(s) in the class from institutions other than your home institution this semester (if any).

Your answers to the following questions will help us understand the support you received in revising and teaching your course.

  16. Have you participated in any kind of training to teach online in the past year?
  • Yes
  • No

 

Display Question 17 if selected “Yes” in Question 16.

  17. Please describe the training you received in the past year. For example, who provided the training? What was the duration in terms of hours or weeks?
  18. Did you have access to instructional designers and/or instructional technologists at your institution to help you revise your course?
  • Yes
  • No

Display Question 19 if selected “Yes” in Question 18.

  19. Please estimate how many hours of instructional designer/instructional technologists’ time you used to revise this course.
  20. Did you have access to IT support to plan, revise and/or teach your course?
  • Yes
  • No

Display Question 21 if selected “Yes” in Question 20. 

  21. Please estimate how many hours of IT staff time you used for this course.
  22. Please indicate the extent to which you agree with each statement.
  (Scale: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree)
  • I felt adequately prepared to REVISE my online/hybrid course this semester.
  • I felt adequately prepared to OFFER my online/hybrid course this semester.
  • I had adequate access to support from instructional designers and/or instructional technologists for this course.
  • I had adequate access to support from IT for this course.
  • I experienced significant technical challenges REVISING my course.
  • I experienced significant technical challenges OFFERING my course.

 

  23. How much time did it take to revise this course relative to the revision time for a comparable face-to-face course?
  • Much less time
  • Less time
  • About the same time
  • More time
  • Much more time
  24. Which aspects of course revision were the most time intensive?
  25. How much time did it take to teach this course relative to a comparable face-to-face course?
  • Much less time
  • Less time
  • About the same time
  • More time
  • Much more time

Your answers to the following questions will help us understand your impressions of student learning in your course. 

  26. Please select the statement that best fits your sense of the depth of student learning in this course:
  • The depth of student learning in this course was greater than in most traditionally taught courses.
  • The depth of student learning in this course was about the same as in most traditionally taught courses.
  • The depth of student learning in this course was less than in most traditionally taught courses.
  27. Please select the statement that best fits your sense of the breadth of student learning in this course:
  • The breadth of student learning in this course was greater than in most traditionally taught courses.
  • The breadth of student learning in this course was about the same as in most traditionally taught courses.
  • The breadth of student learning in this course was less than in most traditionally taught courses.
  28. Did you define learning outcomes for your course and assess students based on those outcomes?
  • Yes
  • No
  29. Based on your method of assessing student learning, roughly what percentage of students met or exceeded learning expectations in your course?
  • 85%-100%
  • 70%-85%
  • 55%-70%
  • 40%-55%
  • Less than 40%
  • Not sure
  30. Based on your method of assessing student learning, did students enrolled at other institutions perform noticeably better or worse than students enrolled at your home institution?
  • Much better
  • Somewhat better
  • About the same
  • Somewhat worse
  • Much worse
  • Not sure
  31. Please indicate the extent to which you agree with each statement.
  (Scale: Strongly Disagree / Disagree / Agree / Strongly Agree / N/A)
  • I was able to form personal relationships with students in this course similar to the kind of relationships that I have with students in traditionally taught courses.
  • I was able to get to know students as individuals in this course.
  • Students felt comfortable interacting with each other in an online environment.
  • Students were able to disagree with each other in the online environment while still maintaining a sense of trust.
  • Online discussions helped students to develop a sense of collaboration.
  • There was a strong sense of community among the students in the course.
  • Students demonstrated a clear understanding of the course structure and expectations.
  • I felt comfortable guiding the class towards an understanding of course topics and helping them to clarify their thinking in the online environment.
  • Students were engaged and participated in productive dialogue in the online environment.
  • Students were motivated to explore questions raised by the course.
  • Students were comfortable using the online tools/technologies that were part of this course.

 

  32. Were your online interactions with students from other institutions noticeably different from your interactions with students from your home institution?
  • Yes
  • No

 

Display Question 33 if selected “Yes” in Question 32.

  33. Please explain your answer to the previous question.

Your answers to the following questions will help us understand your experience using technology for course design and delivery.

  34. What instructional approaches did you find worked especially well in the online environment?
  35. What instructional approaches did you find disappointing in the online environment?
  36. What technology tools did you find worked especially well in this course?
  37. What technology tools did you find did not work well in this course?

Your answers to the following questions will help us understand your overall impressions of teaching an online or hybrid course. 

  38. Please select the statement that best fits your situation:
  • Overall, my course went better than I expected.
  • Overall, my course went about as well as I expected.
  • Overall, my course did not go as well as I expected.
  • Overall, some aspects of my course went better and some things did not go as well as I expected.
  39. Please explain your answer to the previous question.
  40. What did you find most satisfying about teaching in an online/hybrid format during this iteration of your course?
  41. What did you find least satisfying about teaching in an online/hybrid format during this iteration of your course?
  42. How did teaching in an online/hybrid format this year compare to your experience during the first iteration of your course?
  43. What is your overall assessment of whether the online/hybrid format is appropriate for teaching advanced humanities content?
  • Appropriate
  • Somewhat appropriate
  • Not appropriate
  • Too early to tell

 

  44. If you were teaching this course a third time, what changes would you make in content or approach?
  45. What were the big lessons or takeaways from the second iteration of your course?
  46. Are you more or less likely to encourage your colleagues to teach online as a result of this experience?
  • Much more likely
  • More likely
  • Not any more likely or less likely
  • Less likely
  • Much less likely
  47. Based on your experience this term, please indicate the extent to which you agree with the following statements:
  (Scale: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree)
  • I would like to teach an online course in the future.
  • I would like to teach a hybrid course in the future.
  • I would teach a course that was open to students from other institutions in the future.
  • I would encourage my students to enroll in an online course.
  • I would encourage my students to enroll in a hybrid course.
  • I would encourage my students to enroll in an online course offered by another institution.

 

Your answers to the following questions will help us understand your experience working with your project team to deliver your course. 

  48. How often did you discuss the Consortium or your course with other members of your institution’s project team?
  • Never
  • Rarely
  • Sometimes
  • Often
  • Very often
  49. How often did you discuss issues related to the enrollment of students from other institutions with the registrar or other administrators?
  • Never
  • Rarely
  • Sometimes
  • Often
  • Very often
  50. Do you think a consortium, similar to the one you are participating in through CIC, could help your institution and/or department address issues of low enrollment and high cost while maintaining, or even enhancing, the quality of student learning? Please share your thoughts below.
  51. Do you have any additional comments about your course or experience that you would like to share?

Appendix D: Student Survey

Student Survey Instrument

Thank you for taking the time to complete this survey about your experience in Professor [Instructor Last Name]’s course on [Course Title] this semester.   Please note that your responses are confidential and anonymous, and results will only be reported in the aggregate. The survey should take 5-10 minutes to complete.

  1. Is [Institution Name] your home institution?
    • Yes
    • No

 

  2. Have you taken one or more online or hybrid courses before this semester?
    • Yes
    • No

 

  3. Rank the three most important reasons you chose to enroll in this course, where 1 is the most important.

______ It fit my schedule.

______ I like to interact with fellow students online.

______ The course is required for my major.

______ I thought it would be easier than a traditional in-person course.

______ I thought I would learn more than in a traditional in-person course.

______ I was curious about online or hybrid courses.

______ The quality/reputation of the instructor attracted me to the course.

______ Other (please explain): __________

 

  4. To what extent do you agree or disagree with the following statements about the course:
  (Scale: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree)
  • I felt comfortable interacting with other students in an online environment.
  • I felt comfortable disagreeing with other students while still maintaining a sense of trust.
  • Online discussions helped me to develop a sense of collaboration.
  • Online discussions were valuable in helping me appreciate different perspectives.
  • The instructor helped to keep students engaged and participating in productive dialogue.
  • The instructor helped develop a sense of community among the students in the course.

 

  5. To what extent do you agree or disagree with the following statements about the course:
  (Scale: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree)
  • I felt motivated to explore questions raised by the course.
  • The instructor provided clear instructions on how to participate in course learning activities.
  • The instructor was helpful in guiding the class towards understanding course topics in a way that helped me clarify my thinking.
  • I can apply the knowledge created in this course to other courses or non-class related activities.

 

  6. To what extent do you agree or disagree with the following statements about the course:
  (Scale: Strongly Disagree / Disagree / Neither Agree nor Disagree / Agree / Strongly Agree)
  • I felt comfortable using the online tools/technologies that were part of this course.
  • Use of technology in this course enhanced my learning.
  • I had adequate access to technical support (e.g. help in accessing online materials and making use of online tools/technology).

 

  7. How would you evaluate your experience in this course?
  • Very Good
  • Good
  • Fair
  • Poor

 

  8. How would you compare this course to a traditional in-person course?
  • Much Worse
  • Somewhat Worse
  • About the Same
  • Somewhat Better
  • Much Better

 

  9. Please explain why you answered the previous question the way you did.

 

  10. How did this course compare to other upper-level humanities courses in terms of difficulty?
  • Much more difficult
  • Somewhat more difficult
  • About the same
  • Somewhat easier
  • Much easier

 

  11. Would you take another online or hybrid course?
  • Definitely not
  • Probably not
  • Probably yes
  • Definitely yes

 

  12. Why would you or would you not take another online or hybrid course?

 

Display Question 13 if selected “No” in Question 1.

 

  13. Reflecting on your experience with this course, what do you think are the benefits and challenges of taking a course that is offered by an institution other than your own?

 

Display Question 14 if selected “Yes” in Question 1.

 

  14. Reflecting on your experience with this course, what do you think are the benefits and challenges of having students from other institutions in the same class? (Note: if you didn’t know there were students from other institutions in this class AND if you don’t have anything to comment on, please just write NA)

 

  15. What is your class level?
  • First-year
  • Sophomore
  • Junior
  • Senior
  • Unclassified

Appendix E: Interviewee List and Interview Scripts

Interviewees

We conducted 30-minute phone interviews with the following instructors, administrators, and registrars in May and June 2018.

Instructors

  • Douglas Root, Assistant Professor of English at Claflin University
  • Georgia Seminet, Associate Professor of Spanish at St. Edward’s University
  • Timothy Shannon, Chair and Professor of History at Gettysburg College

Administrators

  • Charles Byler, Dean of College of Arts and Sciences and Professor of History at Carroll University
  • Tracy Dinesen, Associate Dean of Academic Affairs and Professor of Spanish at Simpson College
  • Matthew Gordley, Dean of College of Learning and Innovation and Associate Professor of Theology at Carlow University
  • Anne Marchant, Director for Transformative Teaching and Learning at Shenandoah University

Registrars

  • Catherine Day, Associate Vice President of Academic Affairs at Carroll College
  • Jody Ragan, Registrar at Simpson College

Interview Scripts

Instructors

  1. What was your experience teaching online or hybrid courses before participating in the CIC Consortium?
  2. Tell us about the format of your course, what tools you used, what resources you used, how you interacted with students, etc.
  3. What was challenging about the delivery method of this course? Were there ways in which this format enhanced the teaching/learning experience?
  4. Did students seem engaged in the course? Were there parts of the course (e.g. discussion boards, lectures, readings) in which they were more or less engaged? How did this compare to an in-person format?
  5. How has participating in the Consortium changed your views of online teaching and learning in the humanities? How will your experience teaching online and working with the CIC Consortium for Online Humanities Instruction change your approach to teaching in the future?
  6. What are your goals for the coming year? What could be improved?

Administrators

  1. What has been your role in the CIC project on your campus?
  2. Before the CIC project started, what was the state of online learning on your campus?
  3. How much experience did your institution’s faculty have with online learning?
  4. Do you think the project is helping the college accomplish important goals? For example?
  5. What have been the most successful aspects of this project from your perspective?
  6. What has been least successful?
  7. What are your goals for the coming year?

Registrars

  1. What has been your role in the CIC project at your institution?
  2. How have you prepared to accommodate cross-enrollment next year? How could you be better supported in this?
  3. Before the CIC project started, what was the state of online learning on your campus?
  4. Before the CIC project started, were there opportunities for cross-enrollment?
  5. Do you think the project has helped or will help the college accomplish important goals? For example?

 

Endnotes

[1] The 21 participating institutions in CIC Consortium II are: Bloomfield College (NJ), Carlow University (PA), Carroll College (MT), Carroll University (WI), Claflin University (SC), Clarke University (IA), Concordia University Texas (TX), Gettysburg College (PA), Lasell College (MA), Mount Mary University (WI), Northwestern College (IA), Randolph-Macon College (VA), Rosemont College (PA), Shenandoah University (VA), Siena College (NY), Simpson College (IA), St. Edward’s University (TX), St. Olaf College (MN), Ursuline College (OH), Walsh University (OH), and Wesleyan College (GA).

[2] In the second year, only one full-time faculty member from Clarke University (IA) participated in the project.

[3] The evaluation report of Consortium II’s first year experience can be found here: http://sr.ithaka.org/publications/cic-consortium-for-online-humanities-instruction-ii/.

[4] The authors thank David Brailow, Barbara Hetrick, and Phil Katz of the Council of Independent Colleges and Catharine Bond Hill, Martin Kurzweil, and Kimberly Lutz of Ithaka S+R for their contributions to this paper.

[5] IPEDS (Integrated Postsecondary Education Data System) is a system of interrelated surveys conducted annually by the US Department of Education’s National Center for Education Statistics (NCES). For more information, visit https://nces.ed.gov/ipeds/Home/AboutIPEDS.

[6] For more details about Community of Inquiry work, visit: https://coi.athabascau.ca/coi-model/coi-survey/.

[7] One course was taught in Fall 2017 and another course was not taught again in Spring 2018 due to scheduling difficulties. Also, one instructor was replaced by another in the second iteration, though the topic and content of the course remained the same.