Is Self-Exploration in College an Outdated Concept?
- October 28, 2015
- Rayane Alamuddin
Time and again, the concept of “self-exploration” as a crucial component of the college experience makes its way into discussions about restructuring undergraduate degree programs in the US. Proponents of self-exploration argue that focused career-training programs and guided pathways are too regimented and narrow, denying students the precious gift of exploration and discovery that comes from exposure to a vast array of courses of their choosing. Recent innovations in higher education may also limit certain exploratory experiences—for instance, tech-enhanced advising services can suggest courses in which students are likely to succeed, based on their prior performance in other coursework—potentially reducing students’ experience of challenge and failure, and the growth that accompanies them.
Others, however, respond that self-exploration is a luxury few can afford when it comes at the expense of completing a degree on time and entering the job market or improving one’s employment circumstances. Another common critique is that self-exploration costs taxpayers billions of dollars for little or no return in terms of skills acquired. Ideally, the undergraduate experience would offer both an efficient path to fruitful employment and a meaningful career, and personal growth and cultivation through self-exploration. And of course it can offer either, both, or neither, depending on the individual circumstances at play. But as college students are now typically older, working, or non-residential, and as opportunities for growth and exploration outside institutions of higher education become more and more accessible, I cannot help but wonder whether self-exploration as a main goal of the undergraduate experience is now an outdated concept.
As Ithaka S+R analyst Jessie Brown outlined in more detail in an earlier post this summer, 70% of college students are over the age of 24, work part-time or full-time, do not reside on campus, are financially independent, or have children, and many already have an assemblage of college credits from other institutions. These are not wide-eyed 18-year-olds with limited life experience, whose sojourn on campus will be the center of their universe for 4-6 years and a primary source of exploration and discovery—something that may no longer be true for any student. Many “21st century students” have explored, wondered, struggled, learned, and grown as employees, breadwinners, students, and parents, and are now looking to make a sound investment in their career and employment outcomes. And those who find that their structured, tech-enhanced programs limit their options for self-exploration have a practically endless supply of free online courses, discussion networks, and meetup groups with a broader reach and richness than any one college can offer.
Whether an agglomeration of life experiences and assorted coursework can replace exploration within an established intellectual institution may be doubtful, but it may offer something of different value, especially for students with limited time and money to invest. As innovative solutions to problems created by changes in higher education continue to be put forth and tested, new answers to the loss of self-exploration opportunities can also be devised, perhaps starting with adding rigor and sophistication to public high school curricula and expanding access to dual-enrollment programs.