The Rise of Academic Innovationism


In “Two Cheers for Higher Education,” I show that the success of US universities during the period 1980–2015 derived from the joint impact of three “logics of development”: a logic of disciplinary specialization, a market logic, and a logic of social inclusion. These logics sometimes came into conflict, as when faculty entrepreneurs seemed to flout their academic responsibilities in favor of building their enterprises, or when the racial or gender backgrounds of candidates seemed to supersede their scholarly achievements as a basis for advancement. But accommodation was the norm, and together these three forces created a special dynamism during the period.

In this blog post, I want to focus on market logic, and especially on its most important feature: the search for new technologies with commercial potential. Professional interests in solving disciplinary puzzles continued to occupy most of the faculty during the period, but administrators began to think more and more about how universities could serve their regions by providing skills and training relevant to local labor markets and by helping to boost industry.

To think in market terms means responding to overt or latent consumer demand. The interest in labor market demand led to the proliferation of occupational-professional programs. The fastest-growing fields during the period included the three Cs: computer science, communications, and criminology. Venerable applied programs in business, engineering, and education grew much larger; about one in seven college students studied business. And the growth of professional master’s programs was arguably one of the most important changes of the period. These degrees, which combine academic study with practical applications (including internships), showed phenomenal growth through the 1990s and 2000s, with more than 7 million master’s degrees conferred by 2010, greatly exceeding government projections from a decade before. Business, education, health professions, engineering, and computer science accounted for some two-thirds of the master’s degrees awarded. Master’s enrollments expanded in such specialized health fields as physician assistant studies and nurse anesthesia, and later in data-based fields like cybersecurity, data analytics, and health informatics. Universities grew these programs not only to respond to student and employer demand but also because they could price them at rates that made them self-supporting and, prospectively, because doing so protected potential revenue streams from the designs of nearby competitors.

But the truly new turn in university life came with the deployment of academic research in the service of new technology development. I have termed this departure from traditional academic concerns “academic innovationism.” Universities had engaged in patenting since the early 20th century, when a University of Wisconsin scientist, Harry Steenbock, discovered that irradiating foods with ultraviolet light increased their vitamin D content. But the patenting and licensing of discoveries remained a peripheral activity of universities for more than 60 years, owing to the contradictory and often restrictive policies of government agencies concerning the patenting of discoveries based on federally funded research. All of that changed in 1980 with the passage of the Bayh-Dole Act, which gave universities blanket approval to retain patent rights to discoveries made by their professors and graduate students.

The Act ushered in an era of unprecedented academic entrepreneurship. US universities were generating about 300 patents a year in 1980. In 2014, they generated nearly 6,000, a twentyfold increase. A few universities made hundreds of millions of dollars licensing these inventions, including Northwestern University’s compound to ease neuropathic pain and the improved strawberry that UC Davis plant scientists developed. As important as these changes were, they represented only a drop in the bucket relative to patenting and licensing activity in the economy as a whole — and it is also true that many universities’ technology transfer offices lost money.

More important was the impetus the era created for other forms of academic entrepreneurship. Some of the most important companies in the United States had their origins in university research, including Google, Broadcom, and Akamai Technologies. University researchers began to collaborate much more actively with industry scientists, inviting them into their labs and serving as scientific advisors to new technology companies. The states also joined in, funding “eminent scholars” programs like the Georgia Research Alliance, which required star researchers to collaborate with industry, and centers of excellence such as the California Institutes for Science and Innovation, which worked with industry partners on commercially viable technologies. The goal of creating a vibrant cluster of firms adjacent to campus, following the Stanford-Silicon Valley model, became a widespread ideal, realized to a greater or lesser degree in such once-unlikely locales as Ann Arbor, Austin, Boulder, and Salt Lake City. Graduate students served as important connective tissue between researchers and firms, with many new doctorates taking jobs at companies that maintained consulting relationships with their advisors. Statistical studies showed that star scientists served as magnets, attracting new technology firms to the regions in which they worked.

Under the impetus of “academic innovationism,” universities were helping to drive new technology development — and with it, new jobs and wealth in their surrounding communities.
