Facilitated by growth in the availability of data about learners, scholars in cognitive science, psychology, computer science, and other disciplines have developed sophisticated insights about how people learn and succeed in academic contexts.[1] Yet, growth in the field of “learning science” has far outpaced higher education institutions’ efforts to apply its insights to their students’ experience.

Leaders at Kaplan, Inc.,[2] a company serving over a million learners in various programs, believe that a practical corollary to learning science is needed. They call this field “learning engineering.” Analogous to the relationship between mechanical engineering and physical science, or medicine and biological science, Kaplan has sought to develop learning engineering as the systematic application of general principles about how people learn to specific contexts, through rigorous testing and refinement.

Although the learning engineering approach is manifest in each of Kaplan’s divisions, it is most fully realized in Kaplan University, a for-profit, largely online institution that served 38,000 degree- and certificate-seeking students in 2014-2015.[3] Over the past decade, Kaplan University has developed and refined a course design and assessment process based on measurable learning outcomes. In addition, it has systematized an innovative “Research Pipeline,” through which skilled researchers rigorously test interventions on student learning, study the results, and scale successful interventions. These efforts have contributed to a larger-scale culture change at Kaplan University, with stakeholders across the institution embracing evidence-based methods for designing and improving learning experiences.

To better understand the process by which Kaplan and Kaplan University operationalized the learning engineering approach, we conducted phone interviews with 14 individuals from Kaplan University and Kaplan, Inc. in April and May of 2016. In our interviews, which included employees with a diverse set of roles within the two organizations, we encountered remarkable consistency in interviewees’ understanding of and commitment to rigorous, evidence-based improvement in learning.

As a for-profit, largely online institution, Kaplan University has certain features amenable to standardizing a learning engineering approach that are not common among not-for-profit and public institutions. Its centralized curriculum has facilitated a systematic approach to course design and assessment, and its large, online courses have enabled the section-level and student-level randomized experiments that characterize Research Pipeline trials. In an era of intense—and largely deserved—scrutiny of for-profit education providers, there is a risk that the broader higher education community will look askance at even the most worthwhile innovations coming out of that sector. Yet many factors in Kaplan’s experience are generalizable: leaders with a strong vision have strategically leveraged existing infrastructure and built collaborative processes to support sustainable, continuous improvement. There is much to learn from its example.

Origins and Operations

Kaplan Inc. and the Kaplan Way

Kaplan, Inc. was established in 1938 as a test prep company. In the 1990s and 2000s, Kaplan’s operations expanded geographically and programmatically to include financial education, professional training, English language programs, a small portfolio of K-12 programs, and higher education. In 2000, Kaplan, Inc. acquired Quest Education Corporation, a network of career colleges, and in 2001 it started offering its first online higher education programs. In 2004, the college added master’s programs and rebranded itself as Kaplan University.[4]

Since 2004, Kaplan University has expanded its operations and programmatic offerings. In the 2014-2015 academic year, it offered 14 associate’s degree programs, 32 bachelor’s degree programs, 30 master’s degree programs, and 33 certificate programs. In that year, Kaplan University enrolled 38,000 degree-seeking students, 94 percent of whom took their courses online.[5] Of all students enrolled during the 2014-2015 school year, 59 percent were older than 30, 75 percent were women, and 9,000 were active military, veterans, or military spouses or dependents. Though most students take their courses online and faculty work remotely, Kaplan University maintains student support centers in Fort Lauderdale and Orlando, Florida.

Kaplan’s origins as an organization focused on test preparation contributed significantly to its evidence-based approach. In test prep, the outcomes of a course are standardized and measurable. The value of a learning experience is measured almost entirely through concrete evidence that demonstrates a course’s efficacy in improving students’ performance on the test for which they are preparing. Though this approach becomes trickier when applied to less standardized educational experiences, its basic tenet—to learn from evidence about the efficacy of instructional methods in achieving clear, measurable learning outcomes—grounds an approach to instructional design that is used in all units of Kaplan, Inc.

Committed leaders—Kaplan, Inc. CEO Andy Rosen and Chief Learning Officer Bror Saxberg, in particular—have built on this foundation to systematize a learning engineering approach in different learning environments across the organization. Rosen, previously president of Kaplan University, became CEO of Kaplan, Inc. in 2008.[6] Coming to the role with a desire to make the emerging field of learning science central to Kaplan’s approach, Rosen created the role of Chief Learning Officer to lead the articulation and operationalization of an evidence-based approach to learning throughout Kaplan, Inc. In 2009, Bror Saxberg, a cognitive scientist with a background in K-12 online learning, was hired into the role. Together, Rosen and Saxberg began to flesh out and communicate—internally and externally—the idea of “learning engineering,” characterized by evidence-based approaches, measurable and measured outcomes, and iterative processes.[7]

One early product of Rosen and Saxberg’s effort was the Kaplan Way for Learning, introduced in 2010. The Kaplan Way is an “evidence-based strategy” that “applies learning science and the latest instructional technologies to understand expertise, design and deliver content, adapt to individual learners, measure progress, and continually revise educational courses and products.”[8] Under the Kaplan Way, each learning experience at the organization is designed around three phases: a “prepare” phase in which students become acquainted with new knowledge, a “practice” phase in which students apply what they have learned through additional activities, and a “perform” phase in which students put their newly acquired knowledge to the test.

All instructional designers at Kaplan, Inc., regardless of whether they work in Kaplan University, Kaplan Test Prep, or Kaplan International, are trained extensively in this approach. Furthermore, every educational product or experience that Kaplan delivers is assessed against a “Kaplan Way Educational Product Evaluation Checklist.” The checklist includes items such as whether learning objectives are clearly stated and aligned with one another; whether assessment methods are valid and reliable; whether practice opportunities match assessments and develop knowledge; and whether course content is of high quality and aligns with objectives and assessments.[9]

In addition to developing the Kaplan Way, Saxberg and Rosen have made other high-level changes that affect all business units throughout the organization. For example, in 2015, Rosen incorporated into quarterly business unit reviews explicit discussions of each unit’s learning strategy, its investments in learning, and its investments in improving measured learning quality. Regular engagements like these make business units accountable for using evidence-based methods and demonstrating evidence of learning, while allowing individual business units to exercise ownership and flexibility in their implementation.

Kaplan University’s Competencies and Curriculum Design Process

Nearly all Kaplan University courses are delivered online, and curriculum development is centralized under a team of deans, course leads, and instructional designers. Curriculum deans work with a school or schools within Kaplan University, and manage the curriculum development process at a high level. Underneath curriculum deans are curriculum specialists, who work in both instructional design and project management capacities, and manage instructional designers and subject matter experts in the design and revision of courses according to institution-wide standard operating procedures. Instructional designers, who think through how to design and deliver educational experiences, and subject matter experts, who ensure the quality and rigor of the content, work with media specialists and a production team to produce courses. Finally, department leads and course leads share input throughout the entire design and revision process (and sometimes lead smaller revisions). Course leads ensure that courses are well integrated across programs and liaise with faculty about revisions by soliciting input, alerting faculty to new course components, and leading training for major course revisions.

Courses are designed around three types of competencies. The first are program-specific competencies, introduced in 2007. The second are general education literacies (GELs), including communication, writing, critical thinking, and problem solving, introduced in 2009. The third are professional competencies, including leadership, teamwork, multiculturalism and diversity, and presentation, developed in 2014. Each course at Kaplan University is designed to incorporate opportunities for students to prepare, practice, and perform a subset of program competencies, GELs, and professional competencies. Program curricula are designed to ensure that graduates have had the opportunity to demonstrate each program competency, GEL, and professional competency at least once during their time at Kaplan.

In addition to using grades, course instructors assess students on competencies assigned to their courses—scores referred to as course-level assessments or CLAs. All sections of each course use the same rubric for CLAs, rating students on a six-tier scale (in which 0 indicates no progress and 5 indicates mastery) for each relevant competency. Course leads and curriculum teams analyze data from CLAs to gain insight into how well course content and assignments prepare students for the relevant competencies, as well as how reliably course assessments test students on these skills. Though courses can be revised for a number of reasons, including the introduction of a new textbook, technical requirements, or the emergence of new research about the subject matter or student learning, CLA data also helps to inform this process.
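To make the kind of CLA analysis described above concrete, a short Python sketch follows. The records, field names, and mastery threshold here are hypothetical illustrations, not Kaplan’s actual data model:

```python
from statistics import mean

# Hypothetical CLA records: (student_id, competency, rubric score 0-5).
# Names and values are invented for illustration only.
cla_records = [
    ("s1", "critical_thinking", 4), ("s1", "writing", 2),
    ("s2", "critical_thinking", 5), ("s2", "writing", 1),
    ("s3", "critical_thinking", 3), ("s3", "writing", 2),
]

def summarize_by_competency(records, mastery_threshold=4):
    """Average rubric score and share of students at or above mastery."""
    by_comp = {}
    for _, competency, score in records:
        by_comp.setdefault(competency, []).append(score)
    return {
        comp: {
            "mean": round(mean(scores), 2),
            "pct_mastery": round(
                sum(s >= mastery_threshold for s in scores) / len(scores), 2
            ),
        }
        for comp, scores in by_comp.items()
    }

summary = summarize_by_competency(cla_records)
```

A course lead reviewing output like this would notice a competency whose mean score and mastery rate lag the others (here, the invented “writing” scores), signaling a possible misalignment between an assignment and the outcome it is meant to assess.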

Program competencies were designed with significant faculty and department chair input, and have been tested and iterated upon regularly since their implementation in 2007. This testing has looked for consistency across courses in CLAs and across outcomes within the same course (e.g. CLAs and course grades), and has involved regular faculty trainings led by department chairs and Kaplan University’s Center for Teaching and Learning. For example, a recent course review in the School of Education revealed that the assignment used to assess one outcome on which students typically performed poorly was misaligned to that outcome. Course leads redesigned the assignment, tested its validity, and built the new assignment into a revised curriculum. General education competencies and professional competencies, both of which were developed collaboratively by a team of deans, curriculum leads, and faculty members, have undergone a similar testing process to ensure that they are aligned and appropriate for Kaplan’s students.

Until recently, student performance on competencies was available only to faculty, the course development team, and leadership (with various levels of anonymization) for the purposes of design and revision. In 2014, Kaplan University began giving students access to their own assessment results. Initially, students were able to view their competency results on a dashboard within the learning management system. More recently, the registrar’s office developed a competency report, which aggregates student progress on course-level outcomes at the program level, and offers a visual summary of progress towards mastery.[10]

The Research Pipeline

Kaplan University’s Research Pipeline is perhaps Kaplan’s most direct realization of the learning-engineering methodology. The Pipeline was conceived in 2013 by Bror Saxberg and David Niemi, Vice President of Measurement and Evaluation for Kaplan, Inc., along with colleagues who saw a need for Kaplan University to accumulate credible evidence about what worked to measurably improve learning and outcomes for adult, online students. Though Kaplan University had looked, and continues to look, to outside research to guide interventions and course revisions, Saxberg, Niemi, and their team noticed that there was little available evidence about what worked for their specific student population. Furthermore, there existed no established institutional system for exploring how to implement and scale interventions in an effective and efficient way.

In short, the “Research Pipeline” is an institutional structure through which Kaplan University tests interventions on student learning, studies their impact, and, if successful, brings them to scale. Ideas for the Research Pipeline are usually generated by the seven members of the Research Pipeline team, prompted by other research in the field. Faculty members have also proposed several Research Pipeline experiments. Proposals are evaluated on criteria such as their purpose, measures, previous research in the area, feasibility, scalability, and beneficence. Importantly, all Research Pipeline interventions are tested through randomized controlled trials. This adds rigor to the evidence produced through trials by eliminating potential confounding factors, but also limits the types of interventions that can be tested.

Once a proposal is selected for implementation, the Research Pipeline team refines it and collaborates with a steering committee of course leads and department chairs to ensure that the intervention can be implemented feasibly, suitably, and effectively. Interventions are usually carried out in large courses with many sections, so that researchers can work with sizable treatment and control groups. Recent studies include experiments with self-regulated learning, adaptive textbooks, flipped classrooms, synchronous learning tools, and various interventions on motivation and mindset. The Research Pipeline team uses a number of instruments to measure student learning and outcomes in its trials, depending on the nature of the intervention. These include survey data, grades, assessment data, and subsequent-term persistence. To maintain the integrity of results, students remain unaware of experiments.
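The section-level randomization described above can be sketched in a few lines of Python. The section IDs, grades, and effect size below are fabricated for illustration; a real trial would use observed outcomes and a proper statistical test rather than a raw difference in means:

```python
import random
from statistics import mean

def assign_sections(section_ids, seed=0):
    """Randomly split course sections into treatment and control arms."""
    rng = random.Random(seed)      # fixed seed so the split is reproducible
    shuffled = section_ids[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

# Twenty hypothetical sections of one large course.
sections = [f"SEC-{i:02d}" for i in range(1, 21)]
treatment, control = assign_sections(sections)

# After the term, compare an outcome across arms. The grades below are
# fabricated: treatment sections are given a +2.5-point boost to show
# what a detectable effect would look like.
grades = {sec: 78.0 + (2.5 if sec in treatment else 0.0) for sec in sections}
effect = mean(grades[s] for s in treatment) - mean(grades[s] for s in control)
```

Because whole sections (not individual students) are assigned to arms, this mirrors the section-level design the Pipeline used before student-level randomization became feasible in the new learning management system.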

At the conclusion of an experiment, the Research Pipeline team compiles a report of findings with recommendations on how to move forward, and shares the report with university leaders and the project steering committee. The most common recommendation is for additional testing; it is rare for findings to be so clear-cut that immediate scale-up is recommended. In the few cases in which Kaplan University’s leaders have decided to implement a change based on the findings, the Research Pipeline team has worked with course developers to develop a scaling strategy.

Several interventions have been tested and scaled through the Research Pipeline process. In 2014, Kaplan University used the Research Pipeline to test the efficacy of a faculty dashboard. The tool alerted faculty members when students in their courses were off-track for successful completion, and prompted faculty members to send these students an email to provide support. Analyses showed that students whose instructors were assigned to use the faculty dashboard earned significantly higher grades and passed at higher rates than those in courses where it was not used. The tool was made available to all faculty, and faculty received training on how to use the tool in their courses. Currently, Kaplan University is conducting additional research on the efficacy of the faculty dashboard, which it developed internally, compared to a similar tool developed by Civitas.

Another successful and scaled intervention focused on training for CLAs. As discussed, CLA scores play an important role at Kaplan University, as they provide feedback to students and faculty, and determine what kinds of interventions may be necessary. With large populations of students, consistency can be difficult to achieve. In 2014, Kaplan conducted a study on the impact of assessment training on the reliability of CLA scores. Results indicated that exposure to even minimal training in assessing CLAs led to a marked improvement in scoring reliability. As a result of the study, a new training intervention was put in place across Kaplan University to improve the consistency with which faculty members evaluate student work.
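The study’s actual reliability statistics are not reported here, but a simple agreement check of the kind such a study might use can be sketched in Python; the ratings below are fabricated to show improvement after training:

```python
def agreement_rates(scores_a, scores_b):
    """Exact and adjacent (within one rubric point) agreement between two raters."""
    assert len(scores_a) == len(scores_b)
    n = len(scores_a)
    exact = sum(a == b for a, b in zip(scores_a, scores_b)) / n
    adjacent = sum(abs(a - b) <= 1 for a, b in zip(scores_a, scores_b)) / n
    return exact, adjacent

# Hypothetical pre- and post-training ratings on the 0-5 CLA rubric,
# with rater A's scores held fixed and rater B's scores converging.
before_a, before_b = [4, 2, 5, 1, 3, 0], [3, 2, 3, 3, 4, 2]
after_a, after_b = [4, 2, 5, 1, 3, 0], [4, 3, 5, 1, 2, 0]

exact_before, _ = agreement_rates(before_a, before_b)
exact_after, _ = agreement_rates(after_a, after_b)
```

A real reliability analysis would likely use a chance-corrected statistic such as Cohen’s kappa or an intraclass correlation, but even a raw agreement rate like this can reveal whether training moves raters toward consistent scoring.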

The Research Pipeline has developed significantly over the past three years. The team has grown to include several members with doctoral training in quantitative, social science research methods, and processes have become more formalized and organized. Multiple members with whom we spoke said that their trust in the design and results of their experiments had grown since 2013, and that the team’s skill in working with faculty members, minimizing disruptions, and scaling judiciously had noticeably improved. Though the Research Pipeline usually runs several interventions per semester, the team has intentionally reduced the number of studies that occur concurrently, focusing instead on increasing sample sizes for better results.

In addition, the focus of Research Pipeline experiments has recently shifted from interventions on student learning to interventions on motivation and mindset. This stems, in part, from the areas of academic expertise that Research Pipeline members bring to their work. For example, Matthew Braslow, who is Director of Assessment and Research at Kaplan University and is responsible for developing many of the ideas for Research Pipeline interventions, has a background in social psychology, and has generated many interventions from theories he has encountered in his academic work.

Partnerships with researchers at other universities have also informed the Research Pipeline’s development and focus. For example, in 2012, Kaplan University partnered with a researcher at Harvard to study the impact of leveraging student support networks on student outcomes. Similarly, the Research Pipeline has built an ongoing partnership with Stanford University’s Project for Education Research that Scales (PERTS), and recently completed a study on how growth mindset interventions delivered during new student orientation affect persistence to the second term. Members of the Research Pipeline team have been proactive in seeking out partnerships like these, and, because of Kaplan University’s large sample sizes and focus on real-world outcomes, researchers are often interested in collaboration. Still, the Research Pipeline team maintains a primary focus on building an internal knowledge base for application to its unique student body, rather than a broader set of findings for publication.

Evidence of Impact

Though it is hard to tease out the causal impact of evidence-based student success interventions on large-scale metrics like graduation rates, Kaplan University’s completion rates have grown for students who enrolled after the implementation of “learning engineering.” From the 2012-2013 academic year to 2014-2015, the four-year graduation rate for full-time associate’s degree students rose from 35 percent to 52 percent, and the graduation rate for part-time associate’s degree students increased from 19 percent to 28 percent. Six-year graduation rates for bachelor’s degree students have also risen, but less dramatically: for full-time students, graduation rates increased from 47 percent in 2012-2013 to 48 percent in 2014-2015; for part-time students, graduation rates rose from 16 percent to 18 percent over the same period, and the number of completers increased from 2,363 to 5,590.[11] It is worth noting that, because Kaplan University’s enrollment has fallen as the economy has recovered (enrollment tends to be countercyclical), substantially fewer students were in the later cohorts.[12]

On both institution-specific and national surveys, students report satisfaction with their experience at Kaplan University. For example, on the Kaplan University Capstone survey, 86 percent of respondents agreed or strongly agreed with the statement “Kaplan always puts my needs as a student first,” and 86 percent agreed or strongly agreed with the statement “Kaplan provides personalized support to help keep my program requirements in line with my life circumstances.” More than 90 percent agreed or strongly agreed that Kaplan would help them reach their personal or career goals, that receiving a degree from Kaplan was an efficient path to achieving their learning goals, and that Kaplan is innovative in its approach to education. Nearly 12,000 students completed the survey.[13]

Response patterns on other surveys have yielded similar findings. On a 2015 survey of new graduates and alumni, 91 percent of respondents expressed overall satisfaction with their Kaplan University experience and 89 percent reported that their educational program met their expectations. On the National Survey of Student Engagement, which measures the engagement of first- and final-year students and compares it against a national average, Kaplan University students rated Kaplan’s academic rigor, effective teaching practices, and the quality of their interactions higher than the national average.

Finally, Kaplan’s evidence-based approach to designing and measuring learning has permeated Kaplan University’s organizational culture and resulted in concrete, practical changes. This owes, in part, to the Research Pipeline. By involving multiple stakeholders from throughout the university, including instructors, department leads, and the course development team, the Research Pipeline, our interviewees told us, has contributed to a new level of thoughtfulness and circumspection when it comes to implementing changes. No longer are gut feelings or trends adequate justification for curricular or instructional revisions. Rather, changes must be backed with credible research, and applied with careful thought about how they can be made measurable and effective for Kaplan University’s unique student population.

In addition to changes in attitude and culture, the Research Pipeline and other evidence-based approaches have led to concrete curricular and organizational changes at Kaplan University. As discussed above, the faculty dashboard and CLA training, now scaled throughout Kaplan University, emerged from Research Pipeline studies that indicated their efficacy. Assessment data analysis contributes regularly to curricular revisions, and all instructional designers are trained in the Kaplan Way. Finally, multiple teams have emerged out of the Research Pipeline or similar operations: Research Pipeline interventions focused on first-term courses catalyzed the formation of a group of first-year instructors who collaboratively strategize around student support strategies and potential interventions. In addition, a “measurement council,” which gathers individuals involved in data analysis and measurement in different business units, recently began regular meetings to optimize data sharing, governance, and strategy throughout the organization.

The impact of a learning-engineering approach extends beyond Kaplan University. Evidence of culture change was apparent in each of the Kaplan, Inc. business units we investigated. For example, in recent tests of an instructional video, Kaplan Test Prep found that some of the embellishing features of the video were distracting to students. Though the organization had invested a substantial sum in the video’s production, it used the evidence it had collected during its tests to design a video built around “worked examples,” step-by-step demonstrations of solving a problem, which researchers at Kaplan and elsewhere have found improve student learning. In addition, as discussed, Rosen has incorporated explicit discussions about learning strategy into quarterly business reviews so that measurable impacts on learning are a central part of each unit’s operations. Now, each unit of Kaplan relies more heavily on learning scientists to develop its products. Key innovations driven by this approach include the development of an adaptive, computer-based assessment tool that replaced an inaccurate and inefficient method of paper-based testing in Kaplan’s English Language unit. Kaplan’s Australia unit has also developed an evidence-backed, industry diagnostic assessment for financial planning competencies and is modifying the assessment to roll it out elsewhere.

To date, Kaplan University and Kaplan, Inc.’s learning engineering approach has not had a measurable or easily parsed impact on organizational finances. In fact, revenue generated by Kaplan Higher Education (of which Kaplan University is a part) decreased slightly from 2013 to 2015. As we discuss in “Remaining Challenges,” some of this slight downturn can be attributed to increased regulation of Kaplan University as a for-profit college, which has affected program offerings and enrollment; some of the decline may also be attributed to falling enrollment in response to improving economic conditions.[14]

Success Factors

Commitment to Rigorous Methods and Research

Though much of this case study documents Kaplan, Inc.’s and Kaplan University’s demonstrated commitment to rigorous, evidence-based methods and research, it is worth reiterating the extent to which this dedication has contributed to successful change. While many organizations—academic and non-academic—have embraced “data-driven” practices as important for improving outcomes, Kaplan University stands out for the precision with which it has defined its methodology, applying an engineering framework to the process of curricular and instructional design. This commitment is evident in the Research Pipeline’s methods and development: the Research Pipeline tests interventions only through randomized controlled trials, considered the “gold standard” in social science research because they eliminate potential confounding factors through randomization.

A Sustained Focus on What Works for Kaplan Students

Another key tenet of Kaplan’s approach is the belief that the organization must adapt interventions to each unique population of learners. Thus, the Research Pipeline team at Kaplan University uses outside research to inform interventions that are then custom-designed for the adult, online learners the university serves, rather than treating outside findings as turn-key solutions. When curricular changes are implemented through less formal channels—such as through research collected by course leads and course developers—Kaplan University remains committed to measuring the impact of these changes, and iterating upon them so that they work in context.

Strong Leadership

Kaplan, Inc. and Kaplan University leaders have been integral players in strengthening and sustaining a pervasive organizational commitment to evidence-based learning design. Andy Rosen, Kaplan’s CEO, speaks with self-awareness about the importance of embedding this approach in Kaplan’s culture, and has implemented a number of processes to facilitate this transition. In addition, Rosen approaches his vision with flexibility, contextualizing it to the various functions of the organization. He understands that the extent to which learning outcomes can be standardized and objectively measured may be greater in the test prep unit than at Kaplan University, and that the two units may need different methods and processes for implementing evidence-based approaches.

Rosen is supported by a strong team that shares his vision and brings expertise to its design and implementation. As discussed, Bror Saxberg, Chief Learning Officer and co-architect of the Kaplan Way and Research Pipeline, has articulated the value of “learning engineering” both internally and externally, and has been instrumental in systematizing a research-based approach to learning design. Nearly everyone with whom we spoke cited Saxberg’s enthusiasm for and expertise in learning engineering as a primary factor in the organization’s shift towards testing and evidence-based change.

Strong leaders at Kaplan University and within the Research Pipeline have helped to translate and implement Saxberg and Rosen’s vision. Betty Vandenbosch became President of Kaplan University in 2015, but has been involved with the Research Pipeline since its inception, adding her perspective as a social scientist to Saxberg’s engineering approach. Jahna Kahrhoff is the chairperson of the Research Pipeline, and brings her background as a professor of education to the research and operational process. Finally, Larry Rudman, Vice President of Instructional Design and Research at Kaplan, Inc., has been instrumental in bringing a “learning engineering” mindset to the course development process, and Matthew Braslow, Director of Assessment and Instructional Design has led the Research Pipeline team in designing interventions, refining its approach, and building partnerships.

A Collaborative Approach to Design, Research, and Assessment

Because much of Kaplan’s coursework occurs online, and faculty and staff are dispersed around the country, it would be easy for decisions to be made in isolation, or for communication and coordination to falter. We found the opposite to be true. An intentionally collaborative approach has been crucial to Kaplan’s ability to garner buy-in and participation in research and innovation. For example, at Kaplan University, course leads and department leads are brought into the Research Pipeline design process as soon as a proposal is selected, and help to ensure smooth implementation and contextualization. Kaplan University has established clear pathways for communication with faculty members regarding curricular changes, and course leads and department chairs work closely with the course production team on course design and revision processes. In addition, through annual conferences like KU Village and the General Education Conference, Kaplan University has developed several venues for faculty members to share practices for course revisions or testable interventions and present their own pedagogical research. While cross-unit collaboration has taken root more slowly, the emergence of a Kaplan-wide measurement council, and Saxberg’s regular meetings with learning engineers across the organization, are evidence of emerging collaboration around data and measurement.

Online, Centrally Designed Curriculum

Much of the instruction in Kaplan’s units is delivered online, based on a centrally designed, standard curriculum for each course. In the context of Kaplan University, the consistency of courses and curricula across sections of large courses sets ideal conditions for the Research Pipeline’s randomized controlled trials. Centralized course design also enables Kaplan University’s competency-based approach to design and assessment.

This is not to say that features such as the Research Pipeline and a competency-based approach flow naturally from a centralized, online curriculum. Kaplan’s leaders have been strategic about leveraging this system and finding opportunities within it to advance an evidence-based approach. For example, as Kaplan University transitions to a new learning management system, provided by Desire2Learn, in fall of 2016, it will continue to leverage its technological infrastructure to support more advanced research. Research Pipeline leaders plan to use new functionality in the system to randomize interventions at the student level rather than the section level, to develop more sophisticated tests of adaptive learning tools, and to explore interactions among and the longitudinal impacts of interventions.
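The distinction between section-level and student-level randomization can be made concrete with a short sketch. The following Python fragment is purely illustrative (it is not Kaplan’s implementation, and all section and student identifiers are hypothetical): section-level assignment gives every student in a section the same condition, while student-level assignment splits each section between conditions, yielding many more independent units for analysis.

```python
import random

def randomize_by_section(sections, seed=0):
    """Section-level randomization: every student in a section
    receives the same condition (treatment or control)."""
    rng = random.Random(seed)
    assignment = {}
    for section_id, students in sections.items():
        condition = rng.choice(["treatment", "control"])
        for student in students:
            assignment[student] = condition
    return assignment

def randomize_by_student(sections, seed=0):
    """Student-level randomization: students are assigned
    independently, stratified within each section so that the
    two conditions stay balanced per section."""
    rng = random.Random(seed)
    assignment = {}
    for section_id, students in sections.items():
        shuffled = list(students)
        rng.shuffle(shuffled)
        half = len(shuffled) // 2
        for student in shuffled[:half]:
            assignment[student] = "treatment"
        for student in shuffled[half:]:
            assignment[student] = "control"
    return assignment

# Hypothetical roster: two sections of the same course.
sections = {
    "MATH101-A": ["s1", "s2", "s3", "s4"],
    "MATH101-B": ["s5", "s6", "s7", "s8"],
}
by_section = randomize_by_section(sections)
by_student = randomize_by_student(sections)
# Section-level: each section is homogeneous (one condition per section).
# Student-level: each section contains both conditions.
```

The practical payoff of the student-level design is statistical: with sections as the unit of randomization, a trial has only as many independent units as it has sections, whereas student-level assignment treats each learner as a unit, increasing power and controlling for section-specific effects such as instructor differences.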

Remaining Challenges

While Kaplan stands out for the extent to which it has operationalized learning engineering, its efforts are still very much in progress. The Research Pipeline was initiated in 2013, and GELs and professional competencies have been phased in only over the past several years. As Kaplan, Inc. and Kaplan University continue to measure learning, and to test and scale innovations, they will face a number of challenges in making this a pervasive part of their culture and operations.

One major challenge relates to developing broader-based participation in the Research Pipeline. While the process is set up so that faculty can propose ideas, in practice, most ideas are generated internally by the Research Pipeline team. This ensures that proposals are grounded in educational research and well-primed for randomized implementation, but also omits much of the experiential knowledge that faculty may accumulate while teaching.

In addition, Kaplan University will have to manage a major transition as it migrates to a new learning management system. While this transition will present new opportunities for the Research Pipeline, it is a significant undertaking for a school of nearly 40,000 students, nearly all of whom take their courses online. Kaplan University has already dedicated significant resources to this effort, and ensuring a smooth transition will require that the organization invest significantly in training instructors and instructional designers on the new system.

Finally, as a for-profit institution, Kaplan University faces a number of challenges to its credibility. Recently, several for-profit education providers and their accreditors have come under sharp criticism for their high prices, poor outcomes, and false employment claims. The recent closing of Corinthian Colleges and ITT, and the likely termination of ACICS loom large in the public perception of for-profit institutions. (Kaplan is accredited by the Higher Learning Commission, not ACICS.)

Like other for-profit colleges, Kaplan University functions in a market that has become more price-sensitive and more heavily regulated in recent years, and the institution has come under scrutiny for high student loan default and dropout rates. In response to this scrutiny, starting in 2011, Kaplan elected to close 30 programs, shut down several campuses, and consolidate others. In 2015 it sold off its Kaplan Higher Education vocational campuses to the Education Corporation of America. Some of these closures, however, reflect responsiveness to market demand and employment opportunities, and, since 2011, Kaplan has also opened 28 new programs.

There is no question that Kaplan University operates to generate a profit, but we also found strong evidence of a genuine organizational commitment to helping adult students learn and progress in their careers. In recent years, Kaplan University has taken bold actions to demonstrate this commitment, sacrificing enrollment and tuition revenue in the process. For example, in 2010, Kaplan University implemented the Kaplan Commitment, which allows students to fully enroll for three weeks with no financial obligation so they can determine whether the school is a good fit. The institution also ceased enrolling “ability to benefit” students (students who do not hold a high school diploma or GED), even before the federal government ended financial aid for this group.[15]

Kaplan has also taken measures to control costs and reduce loan defaults for its students: Kaplan employs a team of loan assistance counselors who help students navigate the repayment process and act as liaisons between students and their servicing agents. The one-year cohort default rate for all Kaplan University borrowers decreased from 25.9 percent for the 2009 cohort to 12.9 percent for the 2012 cohort (though some of this decrease likely reflects an improving economy).[16] To control costs, Kaplan also recently instituted tuition caps for many of its academic programs.[17]

Conclusion

There is a growing gap between the findings of the basic science of learning and the practices and conditions of learning at institutions of higher education. To bridge that gap, Kaplan, Inc. and its units, especially Kaplan University, have established systems to test and apply general principles of learning science in context. Facilitated by a centralized curriculum development process, a competency-based approach to assessment, and a disciplined “Research Pipeline,” Kaplan University in particular has fostered a learning climate that is increasingly shaped by evidence rather than inertia or intuition. Its efforts provide an example from which others in the higher education community—not-for-profit, public, and for-profit—can learn a great deal.

Appendix

We conducted phone interviews with the following faculty, staff, and administrators in April and May of 2016.

  • Matt Braslow, Director of Assessment and Research and Research Pipeline lead researcher
  • John Harnisher, Vice President of Data and Learning Science, Kaplan Inc.
  • Jahna Kahrhoff, Assistant Dean of Curriculum and Research Pipeline chairperson
  • Michael Lorenz, University Registrar
  • David Niemi, Vice President of Measurement and Evaluation
  • Holly Parker, Professor and Course Lead, Health Sciences
  • Andy Rosen, Kaplan, Inc. CEO
  • Leah Rosenberry, Course Lead and Faculty Advisor, Math
  • Larry Rudman, Vice President of Instructional Design and Research
  • Bror Saxberg, Chief Learning Officer, Kaplan, Inc.
  • David Starnes, Vice President, Academic Affairs
  • Betty Vandenbosch, President, Kaplan University
  • Linda Villareal, Academic Project Manager
  • Karen Watson, Course Lead and Adjunct Faculty, Social and Behavioral Sciences
  1. See, for example, Ken Koedinger, Sidney D’Mello, Elizabeth A. McLaughlin, Zachary A. Pardos, and Carolyn P. Rosé, “Data Mining and Education,” WIREs Cognitive Science, 6 (2016), http://pact.cs.cmu.edu/pubs/Reformatted-Koedinger-EDM-for-WIRES-cogsci.pdf.
  2. Kaplan, Inc. is a subsidiary of Graham Holdings Company, formerly the Washington Post Company. See “History,” Kaplan, http://kaplan.com/about-us-overview/history/.
  3. Most students at Kaplan seek bachelor’s or master’s degrees, but Kaplan University also offers doctoral degrees. The Concord Law School offers JDs and the School of Nursing offers DNPs. Both are schools within Kaplan University. As of June 30, 2015, 78 percent of enrolled students were pursuing bachelor’s degrees, certificates, or diplomas, while the remaining 22 percent were enrolled in graduate-level programs. See “Academic Report: The Year in Review, 2014-2015,” Kaplan University (2015), http://www.kaplanuniversity.edu/about/annual-report.aspx.
  4. “History,” Kaplan University, http://www.kaplanuniversity.edu/about/history.aspx.
  5. Kaplan does offer some in-person courses at campuses in Indiana, Iowa, Nebraska, Maine, Maryland, Missouri, and Wisconsin, though most students who take on-campus courses also take the majority of their courses online. See “Academic Report: The Year in Review, 2014-2015,” Kaplan University (2015), http://www.kaplanuniversity.edu/about/annual-report.aspx.
  6. Rosen stepped down in 2013 to become Chairman of Kaplan, and then returned to the CEO position in 2015.
  7. See Bror Saxberg, “Why We Need Learning Engineers,” The Chronicle of Higher Education (April 20, 2015), http://chronicle.com/article/Why-We-Need-Learning-Engineers/229391/; “Learning Engineering: A Forum with Bror Saxberg,” The Center for Innovative Research in Cyberlearning (October 5, 2015), http://circlcenter.org/events/learning-engineering-with-bror-saxberg/; Bror Saxberg and Frederick M. Hess, “Breakthrough Leadership in the Digital Age: Using Learning Science to Reboot Schooling,” (Newbury Park, CA: Corwin Publishing, 2013).
  8. “The Kaplan Way,” Kaplan University School of Professional Studies and Continuing Education, https://www.schweser.com/about-kaplan/kaplan-way; “About Us,” Kaplan, http://kaplan.com/about-us-overview/.
  9. “Kaplan Education Product Evaluation Checklist, V4, March 2012,” http://kaplan.com/wp-content/uploads/2016/01/Kaplan-Way_Checklist_Summary_Sheet.pdf.
  10. “Competency-Based Education,” Kaplan University, http://www.kaplanuniversity.edu/student-experience/competency-based-education.aspx. See also Paul Fain, “Profit and Competency,” Inside Higher Ed (May 5, 2015), https://www.insidehighered.com/news/2015/05/05/profit-kaplan-university-expands-its-competency-based-offerings-new-transcript.
  11. “Academic Report: The Year in Review, 2014-2015,” p.22, and “Academic Report: The Year in Review, 2012-2013.” Kaplan’s Academic Reports explain that IPEDS-reported graduation rates are misleading because the methodology only accounts for first-time, full-time students, who make up a small percentage of Kaplan University students. Kaplan’s methodology breaks out students by attendance status and calculates graduation rates at 150 percent of normal completion time.
  12. In academic year 2012-2013, 45,076 students were enrolled. In 2014-2015, 38,332 students were enrolled. In addition, some of this increase might be explained by the implementation of the Kaplan Commitment in November 2010, which enabled students to enroll in classes for an introductory period to assess whether coursework met their needs. Retention rates increased from 42 percent to 48 percent one year after the Kaplan Commitment was implemented, so graduation rates for subsequent cohorts may have been affected. See “Academic Report: The Year in Review, 2014-2015,” Kaplan University (2015), http://www.kaplanuniversity.edu/about/annual-report.aspx, p.22, and “Academic Report: The Year in Review, 2012-2013,” Kaplan University (2013), www.kaplanuniversity.edu/kaplan-university-academic-report-2013.pdf.
  13. “Academic Report: The Year in Review, 2014-2015.”
  14. Kaplan Higher Education revenue dropped from just over $1 billion in 2013 to $850 million in 2015. Some of this loss was due to enrollment limitations or eliminations in programs that were not in compliance with Gainful Employment regulations, though Kaplan University has implemented measures, such as career services support, financial literacy counseling, and tuition caps and reductions, to keep at-risk programs open. See the Graham Holdings 2015 Annual Report, http://www.ghco.com/phoenix.zhtml?c=62487&p=irol-reportsannual.
  15. See Paul Fain, “Kaplan 2.0,” Inside Higher Ed (August 15, 2013), https://www.insidehighered.com/news/2013/08/15/profit-kaplan-branches-out-learning-science-projects.
  16. “Academic Report: The Year in Review, 2014-2015,” Kaplan University, http://www.kaplanuniversity.edu/about/annual-report.aspx. Over the same period of time, the cohort default rate for graduates decreased from 5.7 percent to 4.4 percent.
  17. See “Tuition and Savings,” Kaplan University, http://www.kaplanuniversity.edu/paying-school/tuition-reduction.aspx.