
Sunday, June 22, 2014

Must innovation disrupt everything so that society might have new and better things? Widespread fatigue with this idea inspired a number of headlines last week. "The Emperor of 'Disruption Theory' Is Wearing No Clothes," exclaimed one response. Paul Krugman described a "careful takedown," suggesting that the whole era of innovation might collapse from its own overhype ("Creative Destruction Yada Yada"). Jonathan Rees referenced an "absolutely devastating takedown." All three were talking about Jill Lepore's much-discussed New Yorker critique of prominent business consultant Clayton Christensen's theory of "disruptive innovation." Prof. Rees concluded, "Like MacArthur at Inchon, [Prof. Lepore] has landed behind enemy lines and will hopefully force the enemy to pull back and defend ideological territory that they thought they had already conquered." Obviously something is up when one historian compares an article by another to the "decisive" amphibious assault against the North Korean army early in the Korean War.

What's up is pervasive anger at the corporate and political classes that have used the theory of disruptive innovation to justify an endless procession of company downsizings and closings over the past thirty years (photo credit: Bill Bamberger).  People are also angry at the belief of many advocates that resistance is futile and resisters are losers.  Prof. Lepore spoke for this sense of exclusion when she wrote that in order to avoid actual debate, "disrupters ridicule doubters by charging them with fogyism."  Innovation, she wrote, has become "the idea of progress jammed into a criticism-proof jack-in-the-box."

The stakes of this debate about innovation are high. Corporate America, health care, manufacturing, and the contemporary university have all tied their reputations to their delivery of innovation. Innovation comes with lots of turmoil, unilateral management decision making, and interference with how people do their jobs. The critiques of the Lepore article didn't justify disruption as innovation so much as they affirmed that there is a lot of disruption: responses from Digitopoly, Vox, Forbes, and the Wall Street Journal tried to refight the debate to a draw. In an interview, Prof. Christensen countered some of her examples while describing her piece as a "criminal act of dishonesty--at Harvard of all places!" (He also seemed to invite her over to talk innovation theory.)

I don't want to try to referee the debate through the examples in Prof. Lepore's piece, but to provide a better socio-cultural context for it, in the hope that the debate will continue.  The main point I will make here is that we can't overcome disruptive innovation unless we realize that it isn't a theory of innovation but a theory of governance. "DI" isn't about what people actually do to innovate better, faster, and cheaper, but about what executives must do to control innovative institutions.  Prof. Lepore's work will be wasted unless we can move from disruptive to sustainable innovation, which she argued is better than the disruptive kind.  But we won't get sustainable innovation until we identify its opposition in current managerial culture.

(1) From Schumpeter to Christensen

A little backstory may help here. Prof. Christensen is now the most prominent heir of Joseph A. Schumpeter's twin definition of capitalism as the source of all meaningful innovation in life, and of innovation as "creative destruction." For both of these thinkers, the entrepreneur is the fountainhead of new value, and capital must be pulled out of less productive uses and allocated to the entrepreneur, who is the privileged source of all future wealth-creation. In Schumpeter's view, governments, publics, regulations, communities, traditions, habits, faculty senates, teacher's unions, zoning boards, homeowner's groups, professional organizations, and, last but not least, business corporations, do not create value but interfere with its creation. All that is solid must be melted into air for the entrepreneur to be free to innovate and thus transform. The resulting wreckage and waste is part of progress, and must not be reduced through regulation. This is true for shuttered factories, and also for high levels of inequality: both are part of liberating the entrepreneur to create the greater wealth of the future.

Although years of reading Prof. Christensen make me think he's personally humane, his theory is the business world's single most powerful rationalization for disrupting every type of humane condition, such as job security, tax-funded public infrastructure, or carefully nurtured, high-quality product lines. Prof. Lepore was right to state, "Disruptive innovation is competitive strategy for an age seized by terror." Disruption feeds on major and also minor terrors, like being left behind by a change deemed unavoidable, or being excluded from debate about the costs and benefits of undermining entire regional economies by offering tax breaks to companies that offshore production.

One outcome of the theory of disruptive innovation has been the shocking complacency of the U.S. political class about the national devastation wrought by deindustrialization. We have a "rust belt," and ruined cities like Newark and Detroit, and wide areas of social and economic decline amidst enormous wealth, because business and political leaders were taught by consultants like Prof. Christensen that capitalism must destroy in order to advance. Journalists might come along and chronicle the horrible human costs of the decline of the steel industry in, say, Youngstown, Ohio (see the Tammy Thomas sections in George Packer's The Unwinding (2013)). But by the time someone like Mr. Packer arrived, decline had been baked into the regional cake.

The theory of disruptive innovation was arguably head baker, for it taught politicians in Youngstown and elsewhere that industries like steel and their unionized employees had been judged by an impartial market to be uncompetitive.  Consultants would routinely opine that the only logical response to falling profits was the mass layoff and/or factory closure. In The Disposable American (2007), Louis Uchitelle pointed out that layoffs were not wars of necessity but wars of choice, and yet to say that deindustrialization expressed a cultural entitlement rather than an economic law was to stick one's finger in the dike.  Slowly but surely, Youngstown and everyplace like it no longer had economies that supported a broad, stable middle class. In addition, like Beckett's Godot, the renewal to which this disruption was to lead never actually showed up.

Thus Prof. Lepore's critique of disruptive innovation tapped into a pervasive, long-term anger about ruin in America and an anger at the corporate and political classes that deemed ruin necessary.

(2) Jill Lepore's Critique

In "The Disruption Machine," Prof. Lepore held Prof. Christensen's theory to rigorous evidentiary conditions for historical claims, and found that "none of these conditions have been met." (Score Humanities 1, B-Schools 0!--there's a disciplinary matchup in her piece that Michael or I will come back to another time.)   She suggested not just that disruptive innovation doesn't work as advertised when transferred from, say, specialty steel manufacturing to educational services, but that it didn't work well even when applied to manufacturing.

Crucially, Prof. Lepore concluded that "sustaining" innovations--which continuously improve a product--are far more successful than Prof. Christensen's theory admits. Discussing a core Christensen example, the disk-drive industry, Prof. Lepore posited a more accurate history:
In the longer term, victory in the disk-drive industry appears to have gone to the manufacturers that were good at incremental improvements, whether or not they were the first to market the disruptive new format. Companies that were quick to release a new product but not skilled at tinkering have tended to flame out.
In other words, sustainable innovation works as well as or better than disruption, but the U.S., thanks to figures like Prof. Christensen, wasn't allowed to have it. Americans could have developed advanced skills for advanced manufacturing and services as did Germany, Japan, China, Sweden, et al., but nooo--economists and business theorists taught that it was uneconomical to invest in all the Tammy Thomases of the country so that they could "tinker" brilliantly for the sustainability of U.S. manufacturing and its heartland cities.

I agree with Prof. Lepore's demonstration of Prof. Christensen's fallibility, and with the conclusion that disruption is a false idol. It has indeed produced neither social progress nor economic success as such. But it's one thing to critique disruptive innovation, and another to change it into sustainability. Prof. Christensen has promised and will continue to promise enormous irreversible change in articles like "MOOCs' disruption is only beginning"--and so will American capitalism in general. To disrupt disruption, particularly in a service sector like higher education, we need a better appreciation of the deeper purpose of disruptive innovation I mentioned above. The history suggests that Prof. Christensen became influential because he enhanced a top-down kind of innovation management, not because of his insights about innovation as such.

(3) The Revolt Against Managers 

Prof. Lepore juxtaposes Prof. Christensen to Michael Porter's strategy-based model of "competitive advantage." But Prof. Christensen isn't so much the un-Porter as he is the un-Peters--Tom Peters, that is. In the mid-1990s, the management book to beat was still In Search of Excellence (1982), which Mr. Peters co-authored with Robert H. Waterman. These two management consultants did a particularly good job of facing up to the decline of American manufacturing, particularly in relation to Japan, which had been influentially analyzed in works as different as Chalmers Johnson, MITI and the Japanese Miracle (1982), Barry Bluestone and Bennett Harrison, The Deindustrialization of America (1984), Michael J. Piore and Charles F. Sabel, The Second Industrial Divide (1984), and Rosabeth Moss Kanter, The Change Masters (1985). By the time David Harvey's The Condition of Postmodernity (1991) came along to declare a fundamental change in capitalism's mode of production, prominent business writers had been trying to revive capitalism by exposing the deficiencies of top-down corporate management.

Most famously, Mr. Peters and Mr. Waterman decorously criticized management's selfish cynicism about the capabilities of their employees. They argued that American executives adhered to an outmoded Theory X, "the assumption of the mediocrity of the masses." Executives wrongly believed, in the words of Theory X's codifier, the MIT industrial psychologist Douglas McGregor (1960), that the masses "need to be coerced, controlled, directed, and threatened with punishment to get them to put forward adequate effort." Theory Y, which Mr. Peters and Mr. Waterman upheld, like McGregor before them, "assumes . . . that the expenditure of physical and mental effort in work is as natural as in play or rest . . . and the capacity to exercise a relatively high degree of imagination, ingenuity, and creativity in the solution of organizational problems is widely, not narrowly, distributed in the population" (emphasis omitted, 95). (For a discussion of MOOC-based Theory X in higher ed, see "Quality Public Higher Ed: From Udacity to Theory Y.")

In Search of Excellence implied that American management was holding the American worker back. The way to compete with Japan, Germany, et al. was general employee empowerment. I understand that the only management book to outsell In Search of Excellence in the 1980s was Stephen Covey's The Seven Habits of Highly Effective People, which was in a different way also addressing the empowerment needs of the ordinary employee. Extending the argument, Mr. Peters titled a later tome Liberation Management (1992), claiming that a kind of self-organized worker activity would grow the company's bottom line through the creative pursuit of higher quality. Oddly enough, this kind of "human relations" management theory surged during the Reagan years. One culmination was Post-Capitalist Society (1993), in which the dean of management theorists, Peter Drucker, prophesied the replacement of the firm's managerial layers with the "intellectual capital" of knowledge workers, who would use their pension-based ownership of their companies to take capitalism away from passive capitalists and their managerial proxies.

(4) Innovation as Governance

I retell this history to help us avoid interpreting Prof. Lepore's account to suggest that Clay Christensen's rise was based on a series of misreadings  of corporate histories that never got fact-checked by his propagandistic discipline.   To the contrary, Prof. Christensen became a pivotal figure in management history by using innovation to re-empower management.  We can see him, in retrospect, as offering a comprehensive antidote to what American capitalists could only regard as the poison of neo-workplace democracy.  Some 1980s business blockbusters were telling stockholders and executives to share power with a new, insufferably smart-ass "no-collar" generation of knowledge workers, and that only this concession would turn the tables on the Japanese.  Many owners and executives must have felt that this price of recovery was too high.

Prof. Christensen was not working alone, of course: the "shareholder revolt" inspired by another Harvard B-school professor, Michael Jensen, was very important, as were other theories and corporate movements. But Prof. Christensen's role was particularly important in "learning organizations" (the subtitle of a 1990 blockbuster, by Peter M. Senge, that disruptive innovation also eclipsed). Had the future belonged to the Peters, Druckers, and Senges, universities might by now have subjected financial management to the judgments of the collegium, in Jim Sleeper's term. In a post-capitalist university, administration would execute decisions made by faculty and staff collaborating with students in everyday administration and in strategy. But universities have gone in the opposite direction, with their managers controlling not only the allocation of resources but the composition of teaching staffs while, in the online era, shaping the curriculum and its delivery. If in the 1970s it made sense for Barbara and John Ehrenreich to speak of a joint "professional-managerial class," by the end of the 1990s managers had broken away from professionals in healthcare, K-12 education, and academia. Management had the easiest time maintaining its authority when it spoke on behalf of innovation.

Prof. Christensen, in short, offered an antidote to an unexpected return of neo-workplace democracy. His work circulated widely in firms who wanted to avoid losing to more "innovative" competitors. But even in those contexts, his work was less about maximizing innovation and more about controlling it.   His theory has rested on three main axioms.

First, as I've noted, he assumed that losing major industrial sectors to other countries is a natural law of capitalism, not a mistake of American management.  This is the meaning of innovation -- it destroys incumbent sectors in the process of creating new ones.  So stop worrying and learn to love the bomb that blew up your (old, less valuable) industries (and communities).

Second, your employees' genuine love of excellence is not the solution: it's the problem.  They will keep making better, higher-quality products (Theory Y is true!).  Meanwhile, disruptive innovation will steal your market share with crappier, lower-quality products at new, low low prices.  Your employees do want to focus on higher quality and smarter technology.  But these are always, in the Christensen model, "sustaining innovations," which are bad for profits.  So a firm needs to lower the quality of goods like photocopying or college teaching.  Prof. Christensen often goaded managers with the inability of great firms with great products to develop worse stuff quickly enough to save themselves. To move downmarket fast enough, they must control their excellence-oriented, highly effective, quality-focused workers, and resubjugate them to the firm's value proposition.

Third, this control must be exerted by management.  It is management that must interpret market requirements, and do so without concern for the human interests at stake and then compel employees to comply with these. In the Christensen antidote to a kind of shared governance with knowledge workers, management had a whole new lease on life and, indirectly, a gigantic claim to company resources. Companies should manage innovation with structures like "heavyweight teams."   Prof. Christensen defined what might have seemed a return of executive Bonapartism as the objective transmission of market signals.  You don't like your product line downgraded or your laboratory closed? Don't blame the messenger! The management team is just transmitting market signals without fear or favor. In the case of university "reform," the management team transmits a preconceived mission: The Innovative University recounts how senior managers at BYU-Idaho pushed through unpopular changes like the elimination of sports teams and summer teaching breaks on the basis of unilateral decision rights--in their case rooted in collaboration with the senior leadership of the LDS Church itself.  (BYU-Idaho has an interesting teaching model that deserves independent analysis: my point here is that it was imposed through top-down managerialism wearing innovation's clothing.)

There's a further aspect of this third feature of the Christensen antidote to knowledge-worker autonomy. In contrast to professional authority, which is grounded in expertise and expert communities, managerial authority flows from its ties to owners and is formally independent of expertise. Management obviously needs to be competent, but competence seems no longer to require either substantive expertise with the firm's products or meaningful contact with employees. The absence of contact with and substantive knowledge of core activities functions, in managerial culture, as an operational strength. In universities, faculty administrators lose effectiveness when they are seen as too close to the faculty to make tough decisions. In the well-known story that Prof. Lepore retold, the head of the University of Virginia's Board of Visitors decided to fire the university president on the grounds that she would not push online tech innovation with the speed recommended by an admired Wall Street Journal article. The Christensen model does not favor university managers who understand what happens in the classroom and who bring students and faculty into the strategy process. For employees and customers are exactly the people who want to sustain and improve what they already have, which in disruptive capitalism is a loser's game.

The power of the Christensen script can be seen in the care with which MOOC advocates have been following it since 2012. They first cast universities as overbuilt incumbents, the kind of places that do indeed hire nonfaculty professionals at ten times the rate of full-time tenured faculty in order to chase high-end customers and avoid the less demanding and underserved masses. Second, MOOCsters slammed instructional employees as opposed to innovation: articles or books by analysts like Mark C. Taylor, Ann Kirschner, and Richard A. DeMillo heaped scorn on what Dr. DeMillo called "faculty-centered" universities. Third, during the 2012-2013 boom, MOOC entrepreneurs bypassed faculty to connect directly with venture capitalists, politicians, business leaders, and senior university managers. One triumph of the campaign was the Udacity-San José State contract for three MOOC courses, which must have been the first time in history in which a university's outsourcing contract for one department's remedial curriculum was signed in the presence of the state's governor. 2014's MOOC business plays have continued the outreach to academic managers and the sidelining of teaching professionals (e.g., UC Berkeley, or Udacity's "nano degree"). MOOCs moved in so easily because they fit with the managerial ascendancy over the professional authority of professors--the key institutional goal of disruptive innovation.

(5) Towards Sustainable Innovation

Let me steer this discussion back to universities. We need a new era for them, in which they are allowed to find sustainable financing and to support sustaining innovations.  (Something analogous needs to happen for industries that have huge social value, like polymer solar cells, but that can't attract private capital.)   Jill Lepore's critique of Clayton Christensen's historical errors moves us in this direction by discrediting disruption-as-such.  But her effort won't last without broad faculty support for restoring the status of professional knowledge in relation to its decades-long undermining via the theory of disruptive innovation.

One traditional ground of faculty resurgence is to affirm its professional rights and privileges.   This is important, but will not be enough to emerge from a period in which these rights are exactly what disruptive innovation discredited.  The same goes for what I've done here, which is a kind of Foucauldian analysis of innovation as a mode of governmentality.  This is a necessary but not a sufficient condition of moving toward post-disruption.

We also need faculty to tie their professional expertise to the university's anti-elitist, pro-democratic social mission. Michael and I have been posting for a while on faculty- and student-centered higher ed, in company with the MOOC-based focus on learning, which, shorn of the medium's imperial pretensions, was all to the good.

Ironically, faculty can also get help from Clay Christensen's work, where a democratic impulse survives its deep managerial bias. For example, the impetus of the BYU-Idaho experiment in The Innovative University was the democratic goal of higher quality for more students at a reasonable cost (251). Figuring out which costs are necessary and which can be dumped required, as it always does in Prof. Christensen's work, asking two questions: (1) what job does the "customer" want done (not what product does the customer want to buy); and (2) what jobs can the university "do uniquely well."

The Innovative University's answer to the first question was this:
the job that students and policymakers need done is the bestowal of the insights and skills necessary not to just make a living but to make the most of life.  A college degree creates its significant wage-earning advantage because it is designed with more than mere economic goals in mind.  Among those extra-economic goals are the jobs of discovery, memory, and mentoring, jobs that traditional colleges and universities perform as few other institutions can. (331)
This is a fairly basic statement, but it grounds even the "disruptive" (cut-rate) university in human development rather than job training.  It also leads to refocusing the university on its core elements: "(1) discovering and disseminating new knowledge, (2) remembering and recalling the achievements of the past, and (3) mentoring the rising generation" (331).  Again, the formulations are not ideal,  and yet even a university that has been businessed by an innovation gang would look, for example, to reduce the administrative bloat that has raised student costs and disempowered educational staff.

In the company of thousands of educators who've spoken out, Prof. Christensen is right that universities need to recover their educational focus. It's just not his model of disruptive innovation that will achieve this. The process cannot be led by managers and must be led by faculty and students. The historical tragedy of the Schumpeter-Christensen model is that it elevated a managerial class that opposed the democratization of invention we now can't do without. The good news is that there's no reason to make the same mistake twice.

Wednesday, June 4, 2014

In two recent posts (here and here) Chris made an educational and ethical case for the public research university shifting its attention to increasing educational intensity and to committing itself to a vision of universal capacity rather than its present practice of intensifying pre-existing social inequalities.  There have been, as far as I can tell, two primary responses to his arguments: 1) an enthusiastic agreement with the basic principle and suggestions for ways it could be extended and 2) a resigned gloom rooted in the widespread sense that our institutions are simply not receptive to the notion.

I want to suggest here that both of these responses are apt but that they should lead us not to resignation but to a recognition that thinking outside the normal structures of our institutions is both desirable and necessary in this instance.

The reasons are several:

First is the serious probability that the public university as we know it is dead. That isn't to say that it won't continue to function, producing knowledge and graduates of various kinds, teaching as it does, etc. But it is winding down. It has become clear over the last decade that the public university is not fulfilling its fundamental social functions in terms of social mobility and mass education. Nor is it clear that it will be able to continue its research funding given the commitment to austerity in both state capitals and Washington, D.C.

To a significant extent these issues are financial. In their present form, public research universities are caught in what we might call a "low-level equilibrium trap." Despite all the rhetoric about how crucial higher education is to the future, the actual vision of the political class is narrowly focused on its perception of today's job demands, a perception which is instrumentalist in teaching and indifferent to research. Comparatively, the state of California still funds a large percentage of UC's and CSU's core costs. But the state's political leadership seems willing to accept the system that we have now and slowly reduce it over time (or not so slowly, given the problems with the pension system and the uncertainties facing the medical centers).

We all know the obvious signs of this situation. Governor Brown is openly hostile to investing in higher education, and despite some increased funding in his budget he has made it clear that he has no intention of overcoming the years of austerity or aiding the University in facing its increasing costs. Given his support for Patrick Callan's latest report, Lieutenant Governor Newsom appears to think that the answer is something akin to Western Governors University. But the clearest indication of the problem, I think, lies in the Legislature's preference for scholarships over University funding. By agreeing to increase funding for Cal Grants, the state is committing to holding down effective price without increasing funding for universities. This does not necessarily mean that the state, Jerry Brown aside, is unwilling to support higher education. It does mean that the State is no longer willing to support the research university in its present form.

There is a second component to this slow death of the public research university. As science faculty have pointed out repeatedly, they spend an inordinate amount of their time applying for grants and seeking to raise the funds necessary to support their research. I do not want to revisit the arguments about the cross-subsidies (in whatever direction) that complicate the issue of indirect cost recovery. My point here is that Federal support for scientific research is in decline, that this will only increase pressure on science faculty, and that in the long run, without increased state funding for basic research, the scientific enterprise as we know it cannot be sustained. In today's "the only thing that matters is the next six months" political economy, political and business leaders may not worry about the long decline of research infrastructure, but anyone concerned with the research university must.

Now I don't think that the end of the public research university as we have come to know it is entirely a bad thing. We are all aware of its overly bureaucratic nature, the unchecked and hidden expansion of administration, the growth of an overly intrusive audit culture, the threats to faculty rights and academic freedom posed by online contracts and administrative policing of employee speech, and its rising financial burdens on students as well as the expanding size of classes. Even at its finest, it was a high modernist institution that tended toward extended and unnecessary hierarchies. The triumph of finance in the inner circles of the university has only made matters worse.

But are there alternatives?

One way to begin a discussion is to look at the extremely different notions of cost that exist between UC and Sacramento. Sacramento, in particular the LAO, is convinced that CSU and especially UC are inefficient in the way they provide education to students. They make this claim based on a fairly simple calculation--dividing total core revenues by the number of enrolled students and claiming that the result is the real expense per student. Because UC has never been willing to actually figure out how much goes into the instructional program per student, UCOP and the campuses have been unable to challenge this idea effectively. So long as the universities are unable to demonstrate to legislators and the public that the funding is necessary for instruction and that it will go to instruction, we will be unable to regain support for higher education institutions.
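The gap between the two notions of cost can be sketched in a few lines. This is a minimal illustration, and every dollar figure in it is hypothetical, not actual UC, CSU, or LAO data; the point is only that dividing all core revenue by headcount attributes research and administrative spending to instruction.

```python
# A sketch of the two calculations described above.
# All dollar figures are hypothetical illustrations, not actual data.

def naive_cost_per_student(total_core_revenue: float,
                           enrolled_students: int) -> float:
    """The LAO-style calculation: all core revenue divided by enrollment."""
    return total_core_revenue / enrolled_students

def instructional_cost_per_student(total_core_revenue: float,
                                   research_support: float,
                                   administration: float,
                                   enrolled_students: int) -> float:
    """Subtract spending that never reaches the classroom before dividing."""
    instructional_spending = total_core_revenue - research_support - administration
    return instructional_spending / enrolled_students

# Hypothetical system: $3B in core revenue, 100,000 students,
# with $1B supporting research and $0.8B going to administration.
naive = naive_cost_per_student(3_000_000_000, 100_000)
instructional = instructional_cost_per_student(
    3_000_000_000, 1_000_000_000, 800_000_000, 100_000)

print(f"Naive per-student cost:         ${naive:,.0f}")          # $30,000
print(f"Instructional per-student cost: ${instructional:,.0f}")  # $12,000
```

On these invented numbers the naive figure is two and a half times the instructional one, which is the shape of the disagreement: without transparency about the research and administrative components, the university cannot show which calculation is closer to the truth.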

There are, to be sure, two different sources for this gap in perception: the cost of research and the growth of administration. To some extent they overlap (the increased oversight demanded of research funding, questions of safety, legal issues, the growth of IT), but not entirely. And one necessary step would be to finally get transparency on where the costs lie and which ones are actually necessary for core functions.

But there is an even larger conceptual issue at stake here. We can, I think, approach it by thinking about the ideas of "faculty centered" vs. "student centered" universities. It is possible to look back at the universities of the 1950s and 1960s (during what Christopher Jencks and David Riesman called the "academic revolution") as "faculty centered" universities. In that moment of institutional expansion (and especially expansion in the importance of graduate education) universities were centered around the interests of growing disciplines and departments. Although some radical activists were able to compel the creation of new fields of study in the humanities and social sciences, for the most part academic fields were driven by faculty and academic fields shaped student experiences.

This university (and I know I am overstating its practical reality a bit) was quickly displaced by what we might think of as "Student-Centered University I." In part "Student-Centered University I" was driven by the desire for improved rankings that took off in the 1970s and by the increasingly dominant notion of consumer choice in the 1980s that turned students into customers. But the effort to attract customers, in particular, led to an increasing displacement of the classroom in student lives and the growing importance of both material surroundings in the university and the separation of student services from the instructional core. Although "Student-Centered University I" continues in part, it has been replaced by "Student-Centered University II." "Student-Centered University II" is marked by dramatically increasing economic inequality within the student body, and it means the worst of both possible worlds for many: rising costs needed to pay for administrative services and material upkeep, worsening conditions in the classroom, increased student debt, and the managerial turn to massive numbers of poorly paid instructors with little to no job security or long-range benefits that they can count on.

What we need is the end of "Student-Centered University II."  Instead, and with acknowledgments to Paul Goodman, we need a new "community of scholars." Goodman rightly argued that the core of any university or college worth its name lay in scholarship--understood as both the creation and communication of knowledge and insight through the educational process.   To achieve a new "community of scholars" increasing educational intensity would be central.  Now, I am not trying to claim that the only spaces that matter in a university are the classrooms, laboratories and libraries. But it does seem fair to me to rethink the University as a place where these spaces are the core of the University in more than name only and in which the interplay between faculty and students is the central dynamic of the institution.  

This would entail a widespread reorganization of resources--one in which student services would be reintegrated with instruction, staff moved back into departments, and faculty involved in advising. Administrations would need to provide greater transparency of costs, and undergraduate programs would be more fully integrated with research.   

It would also entail a serious engagement with students and parents.  At least one central problem would need to be addressed in terms of the infrastructure of the university: would students and parents accept a materially less rich extra-curricular apparatus in exchange for more resources in instruction and a lower overall price?  If it is possible to offer less expensive higher ed, could it be done outside of the branding race rather than--as people like Gavin Newsom propose--by eliminating the rich intellectual life that could be offered on residential campuses?  And would parents and students buy into that vision?

As even a quick look at the questions that need to be addressed will indicate, such a reorganization of the university cannot be done from the top down. We already had one version of that in the Gould Commission. If anything could demonstrate that real educational imagination and re-thinking will not come from the top, the Gould Commission, with its rush to accept all the conventional wisdom of educational austerity and its displacement into the fantasy of UC Online, should have done it. The only way that a new public research university can be created will be from the bottom up, with faculty, students, and parents attempting to create a new public discourse.