One of the more entertaining bits of cognitive dissonance I have experienced this summer arises from the disparate “ideas of the university”—and of learning platforms in general—that emerge, implicitly and explicitly, from my simultaneous indulgence in Christensen and Eyring’s The Innovative University: Changing the DNA of Higher Education from the Inside Out and Alex Beam’s lighter A Great Idea at the Time: The Rise, Fall, and Curious Afterlife of the Great Books, a 2008 popular history of the Great Books program.
Christensen and Eyring are deadly serious, working through the parallel histories of Harvard and Brigham Young University-Idaho (formerly Ricks College). Their point is that the institutional “DNA” of American higher education can largely be traced to Harvard—and a handful of other established and prestigious universities—and has spread far and wide, reaching a point of financial and social unsustainability (even for Harvard, in some ways) that ought to motivate disruptive innovation based on a rethinking of the methods and purposes of post-secondary education as a whole. BYU-Idaho has done some of that rethinking, and the authors offer its online courses, its disinclination to scramble up the “ladder” of Carnegie Classifications, and its new approaches to measuring the quality of its work as a model for change. It’s good stuff, and thought-provoking on all kinds of levels.
Beam, on the other hand, takes aim at the hubris, idealism, and occasional plain wackiness of Mortimer Adler, Robert Maynard Hutchins, and the other founders and promulgators of the “Great Books of the Western World” program, which was for some decades a modest cash cow for the Encyclopedia Britannica and the University of Chicago (over which Hutchins presided between 1929 and 1945). The Great Books idea has spawned or sustained a handful of estimable academic ventures (e.g., Columbia University’s Core Curriculum, the St. John’s Colleges, Yale’s Directed Studies program—full disclosure: I’m a veteran) and also inspired a couple of generations of community discussion groups and not a few passionate individuals in all walks of life. (On occasion it has also been a handy weapon in the culture wars, proclaimed by conservatives to be the “canon” that “proves” the superiority of Western, classically rooted culture.)
As I write this, the University of California at Berkeley has just joined Harvard and the Massachusetts Institute of Technology in the EdX online learning initiative, while the Coursera universities and the publicity given to a handful of wildly successful MOOCs (massive open online courses) have raised the specter of a whole lot of post-secondary education being “outsourced” from university campuses to an anytime, anywhere model. As one commenter on the Chronicle’s piece on the Berkeley announcement noted, the EdX initiative comes from an “anticipation that online education is at an inflection point—that it’s starting to work; that it’s starting to be seen by employers as legitimate; and that universities that don’t get out ahead of this change will be left behind, in particular with students who won’t pay high tuition but are willing to pay for discrete skills training.” This is the sort of disruptive change, ironically given extra street cred by a Harvard connection in the case of EdX, over which Christensen and Eyring enthuse in The Innovative University.
All of this somehow comes together in my mind as a question: Are we entering a new Age of the Autodidact? I find myself slightly surprised to note parallels to the early industrial age, when self-taught men and women with curious and inventive minds, many having accessed information through fledgling scientific societies and their journals as well as public and subscription libraries, gave us everything from the steam engine to the hand eggbeater. Even if, by the time of World War II, Harvard’s and Berkeley’s graduate schools and M.I.T. as a whole had taken over much of the function of the backyard and barnyard tinkerers of a century before, and even of early think tanks like Edison’s labs, tinkerers, shade-tree mechanics, and other independent—and uncredentialed—inventors and entrepreneurs didn’t die off as a breed. One could even argue (and Peter Thiel has, even putting his money behind the argument) that famous college dropouts like Steve Jobs, Steve Wozniak (who later finished his degree at Berkeley), Bill Gates, and Mark Zuckerberg prove the point.
Today the internet makes the transfer of knowledge essentially seamless, and in no area is this transfer more effective than in the realm of the practical. Want to know how to roast corn on the cob or swap out the memory chip in your computer? Puzzling over the best way to rebuild the steering on your 1949 Ford pickup? Want to build your own Genghis Khan-style bow or plant the most colorful perennial garden? Try the internet—and be ready to decide which directions or models to follow. It’s a maker’s playground that has me wishing I had stayed with more of the hands-on hobbies of my younger days.
The internet is ideal for the delivery of what the Chronicle commenter calls “discrete skills training”; there’s nothing better. With enough time and enough curiosity, an internaut could learn how to do just about anything. It’s an awesome possibility, and it’s no wonder that universities great and small want to get on the bandwagon—they can, and they should. If necessity is the mother of invention, the dissemination of useful information is the father. Whether internet-gained knowledge is responsible for tastier meals and prettier yards or for truly new and different ways of solving critical problems—for true innovation—it’s pretty much, as they say, all good.
Seventy-five years ago the Great Books offered a comparable kind of opportunity, all built around a set of pre-screened and indexed “Great Ideas” that promised the faithful reader access to wisdom and knowledge that would render answers to all of life’s questions, from business dilemmas to personal quandaries. If it seems preposterous now (as Beam’s title, A Great Idea at the Time, certainly suggests), it wasn’t so preposterous either to the program’s founders or to the thousands who ponied up for the books and actually read and discussed at least some of them. Here was a certain kind of autodidact’s dream come true.
The great question, of course, is one of coherence. Individual entrepreneurs and self-guided scholars may not require a systematic approach to life to do their work, but one of the tenets of our civilization, and certainly of traditional education, has been the ideal of a philosophy, code, or creed that somehow undergirds both one’s work and one’s character. That such codes sometimes coexist, sometimes compete, and sometimes conflict doesn’t reduce their social significance.
Schools, in particular independent schools, have always tried to put their values at the center of the student experience—as mission, as values, as the themes of “character education” programs. We’re all about coherence, at least in our aspirations. We try, even in our blended and online efforts, to keep our academic offerings in some ways of a piece with what we say we believe about the kinds of lives we hope our students will build for themselves.
Universities, even those as large and diffuse as Harvard and Berkeley, in some ways do the same, perhaps more in protection of “brand” in 2012 than in promotion of the ideals of a “life of usefulness and purpose.” It’s going to be interesting to see what becomes of these brands, and of these ideals, as EdX goes forward. It’s going to be equally interesting to watch the ways in which independent schools follow the lead of universities, be they Harvards or BYU-Idahos, in the direction of establishing themselves as purveyors of knowledge beyond the brick-and-mortar model.
A side effect, I suspect, even of the most homogeneous approaches to online schooling, may be that we permit more of our students, like the EdX certificate seekers in India or far-flung students taking courses through BYU-Idaho, to follow their intellectual passions down paths that seem both exciting and eminently practical to them. I like the idea that I can enroll online to learn something about programming or some technical field that interests me, and I like even more the idea that my children and my students can find courses online that give them access to fields of study their physical schools can’t offer.
Of course I also like the idea that we can all pick our ways through the Great Books of the Western World, by ourselves or in groups, and that at the same time we can run a simple search to find alternative canons of work from a thousand cultures.
It’s about learning, and having control over what we learn—and why. Maybe we learn so we can get good jobs, and maybe we learn because a particular topic just tickles us. Maybe we learn because we need moral models, and maybe we learn so that we can read books—or websites—in languages we don’t know. We are all autodidacts, in our way, and we ought to celebrate the big brains, from Gutenberg to Mortimer Adler to the Google team to the masterminds of EdX, who have made and will continue to make this possible.