
July 1, 2012

The Decaying Light of Innovation

How the decline of research and development in the United States threatens our future.

IT STARTED AS A MINOR DILEMMA for textile manufacturer Milliken & Co. The polyesters were getting mixed up with the nylons. Even experienced plant supervisors couldn't tell the difference just by eyeballing spools of the two synthetics, and orders were going out wrong. So they turned to their research department.

Could somebody come up with a washable colorant spray to mark the spools? “One of the research chemists, Hans Kuhn, said, 'I can do that,'” recalls Jeff Lane, the Spartanburg, South Carolina, company's head of research. “He invented a line of colorants, blue for nylon and pink for polyester, that would just wash off during normal processing.” Problem solved.

A more calcified management would have left it at that. But Milliken, a private company led by the innovative Roger Milliken, who died in 2010, was quick to look around for other uses for its invention. How about washable markers? Voilà. Crayola bought the idea and, in 1987, came out with a new line of non-staining felt-tipped markers, beloved nowadays by parents of preschoolers whose ideas of coloring outside the lines extend to walls, woodwork and clothing. This is a small example of how research and development can turn a routine shop-floor problem into a new product, like a magician pulling bouquets out of the couch cushions. Yet Milliken is one of only 3% of American companies that have their own R&D departments.

For the 147-year-old firm, the ability of its researchers to come up with new products has been a lifesaver. In the 1990s, as competing textile companies went belly-up, one by one, under an onslaught of cheap imports, Milliken beefed up its R&D program and pivoted toward an assortment of new products. It not only survived, it thrived.

American economists study research and development closely because it is widely perceived as an essential ingredient of innovation and economic strength. It's simple, suggests José-Marie Griffiths, a member of the National Science Board and chair of its science and engineering committee: “R&D stimulates innovation, and innovation stimulates the development of new products, services and processes,” she says. “That in turn stimulates jobs and the ability to market products.” Explains one National Science Board report: “Empirical research and surveys of business activities show that innovation leads to new products and services, better quality and lower prices.”

Griffiths offers the obvious example of the Apollo space program and its ramifications. “An R&D innovation benefits not just the slice of the economy that's directly associated with it,” she says. “There's the spillover. From the space program came new lightweight materials needed to adapt to extreme conditions and the miniaturization of components like semiconductors. These things moved from the space sector to other sectors, driving other innovation.”

Beginning to Lose the Edge

Private companies are cutting R&D in the face of hard times, and government support for it is in jeopardy under the hard glare of budget-cutting legislators. For decades, the United States has seemingly been the world's inexhaustible source of technological innovation. This was partly because there have always been lots of Americans with great ideas. But it was also because the idea mavens had ample support from private industry and from federal and state funding agencies. As Adrian Slywotzky, a well-known global management consultant, noted in an essay for BusinessWeek, “The U.S. infrastructure for scientific innovation has historically consisted of a loose public-private partnership that included legendary institutions such as Bell Labs, RCA Labs, Xerox PARC, and the research operations of IBM, along with NASA, the Defense Advanced Research Projects Agency [DARPA], and others.”

Out of these partnerships came such transformative technological developments as television transmission, the Internet, laser technology, transistors (and, subsequently, computers), photovoltaic cells and mobile telephones. But the R&D paradigm started to erode in the 1990s, as corporate overseers steered their research away from open-ended “blue sky” inquiry to short-term commercial targets, Slywotzky says.

Under the competitive pressures of an increasingly globalized economy, the acceptable horizon for commercial application began to shrink. “In general, the work that these people were doing could lead to applications 10 or 15 years out,” Slywotzky tells Forward. “That's way beyond the normal corporate commercial timetable.”

According to the National Science Board's 2012 report on science and engineering indicators, total R&D expenditures in the United States in 2009 (the most up-to-date numbers) were about $400 billion, representing an inflation-adjusted 1.7% decline from the previous year. Admittedly, that decline was probably more attributable to the 2008 recession than to a long-term trend, NSB researchers say.

Still, U.S. investment in R&D is trending downward as a percentage of GDP, suggesting it's not the priority it once was. “Since the 1960s, when the U.S. devoted 17% of the federal budget to R&D for agencies like NASA and DARPA, outlays have fallen to around 9% of the budget,” notes a recent report for the Task Force on American Innovation, an alliance of technology companies, research universities and scientific societies. Relatively high R&D expenditures in the past two years were bolstered by almost $19 billion from the American Recovery and Reinvestment Act, the Obama administration's stimulus package, and those funds have now been depleted. Canada's R&D expenditures are far more modest: about $25 billion total in 2009, representing less than 2% of its GDP.

China and Other Major Players

Meanwhile, other nations are beginning to outpace the United States. “We're still leading the race,” says Tobin Smith, vice president for policy at the Association of American Universities and a member of the Task Force on American Innovation. “But it used to be that there was no one else in the rearview mirror. Other countries are there now, some of them threatening to pass us. Are we going to speed up or continue at the same pace?”

Asia, in particular, has been in the midst of a great R&D leap, with 10 Asian economies (China, India, Indonesia, Japan, Malaysia, Singapore, South Korea, Taiwan, Thailand and Vietnam) collectively matching U.S. expenditures in 2009. Preliminary indications are that they continued to do so in 2010. The 800-pound gorilla of the group is China, which in 2009 increased its R&D expenditures by 28%. According to NSB, China's expenditure increases in the first decade of the 21st century averaged about 20% a year, while the United States and other Organisation for Economic Co-operation and Development countries only mustered between 5% and 6%.

Citing these and other developments, the Task Force on American Innovation recently released a scathing assessment of the state of R&D in America, in a report titled “American Exceptionalism, American Decline?”

“Despite a strong history of being a world leader in research and discovery,” says the report, “the United States has failed to sufficiently heed indications that our advantage is diminishing and that we may soon be overtaken by other nations in these areas, which are critical to economic growth and job creation.”

American decline, the report suggests, stretches from underperforming science and engineering departments (we're 27th among developed nations in proportion of college students receiving undergraduate degrees in those areas) to our out-competed inventors (foreigners take out more patents than Americans) to America's role as an importer of high-tech products (we import more than we export). While China has been tripling its ranks of graduating engineers and scientists, to about 800,000 a year, we're stuck somewhere around 250,000.

“The U.S. is not sending a clear investment or policy signal that there is strong support for innovation,” the report contends. The message of decline is clearly getting through to the federal government, with President Obama urging technology companies and universities to step up their game.

The president has proposed an 11.6% increase in federal funding for research this year, but only a 1.4% increase for 2013. Last year, he did sign a reauthorization of the America COMPETES Act, approving $45 billion in new spending on science research and education. But congressional funding for the measure has remained gridlocked.

Where Is the Action?

At companies like Milliken, for one. Today, Milliken has hundreds of R&D-inspired products on the market—from flame-resistant material for uniforms and specialized plastics for food storage to electrical wire insulation and the backing for duct tape. And it's profitable and debt-free, company officials say. While the privately held company does not disclose its financial statements, officials say the company's value has increased more than 30% since 2007.

But Milliken, with its corps of scientists and researchers recruited from top science universities and attracted by the company's policy of letting them devote 15% of their work time to independent research, is a relative rarity in the corporate world.

The new paradigm—actually an old one that has become the innovation standard—involves corporations or consortiums of corporations contracting for services with research universities. Thus, the American Iron and Steel Institute, representing 24 North American steelmakers and 140 suppliers and customers of the steel industry, has contracted with the Massachusetts Institute of Technology and the University of Utah to develop new technologies that eliminate carbon dioxide emissions from steelmaking, reducing industrial greenhouse gases.

Researchers from both projects, using molten oxide electrolysis at MIT and pure hydrogen in a flash oven at Utah, say their technologies will be available in the 2020s. The projects are being funded by AISI, which kicked in $6.5 million for the laboratory phase of the projects, after the Department of Energy provided part of the funding for initial feasibility studies. Similar corporate-university efforts are going on in the aluminum and semiconductor industries.

Research Flavors: Applied vs. Basic

Those who monitor R&D spending distinguish between “applied” and “basic” research. Product-minded private companies do much of the applied research, aiming for commercial objectives or solutions to specific product-focused technological problems. The federal government finances most of the basic research—that is, research that looks at subjects without specific applications in mind. And it's there—in the battle for a piece of the stretched-to-the-limit federal budget—that American innovation faces its greatest challenge, critics say. Why should the government finance a researcher's idle curiosity about the nature of things, if there will be at best a dimly perceived commercial payoff in 10 or 15 years? Why should taxpayers pay for cracking the genetic code of the fruit fly or for the search for some elusive atomic particle that may or may not actually exist? Or even for grappling with fundamental issues concerning the nature of the universe and life on other planets?

These are the kinds of questions that legislators are asking as they grimly prepare to trim R&D budgets. Scientists argue that basic research ultimately lays the foundation for game-changing commercial applications. This is how progress works, they contend. Says astrophysicist George Smoot: “People cannot foresee the future well enough to predict what's going to develop from basic research. If we only did applied research, we would still be making better spears.”

The Impact of Budget Gridlock

Such arguments have met with varying degrees of acceptance from budget makers. At the moment, researchers from coast to coast are holding their breath until after the November election. Last summer, in negotiations to raise the federal debt limit, congressional budget makers agreed to cut deficits by more than $2 trillion by 2021, and they created a bipartisan “super committee” to come up with a concrete plan to trim $1.2 trillion. But that super committee failed, and barring an eleventh-hour accord between a newly elected president and Congress, automatic cuts, including $500 billion from defense, will take effect Jan. 1. A Congress that has so far found it impossible to agree on much of anything will have about 55 days in a lame-duck session after the election to head off these draconian cuts.

This so-called budget “sequestration” measure places all discretionary spending in jeopardy. While most of the attention has focused on defense cuts—which could affect DARPA's funding level, among many others—sequestration could easily decimate research projects in energy, the physical sciences and elsewhere.

Even without the sequester, most of those “legendary institutions” Slywotzky talks about have been limping along with less funding and lower ambitions, R&D experts say.

Take the legendary Bell Labs, AT&T's super research program whose scientists have won seven Nobel Prizes, nine U.S. Medals of Science and 12 U.S. Medals of Technology. During its heyday, most of Bell Labs' operations were concentrated in three facilities in northern New Jersey, where hundreds of topnotch scientists and researchers operated like a yeast culture of interdisciplinary creativity, exchanging ideas and collaborating on projects.

When AT&T split itself into three companies in 1995, though, it parceled out the labs, with part staying at an attenuated AT&T and part going to Lucent Technologies, spun off the following year. In 2006, Lucent merged with Alcatel, and their combined research operations became Alcatel-Lucent Bell Labs. In 2008, the parent company pulled the labs out of basic science research to focus on more marketable electronic and nanotechnology research, prompting critics to pronounce the move shortsighted, a devastating blow to Bell Labs' capacity for innovation.

“For Bell Labs, yet another chapter in its storied history comes to a close, taking the once iconic institution closer to being just another research arm of a major corporation,” wrote Wired contributor Priya Ganapati in August 2008. Most of the other hubs of technological innovation have likewise been parceled out to corporations or shut down. Xerox PARC, for example—which developed the first workable personal computer (the Xerox Alto) and the Ethernet—now concentrates mostly on providing customized R&D and technological expertise for Fortune 500 companies and promising start-ups. RCA Laboratories (which became the Sarnoff Corporation in 1987) was once the research arm of RCA and the source of most innovations in television and broadcasting in the mid-20th century. It has been subsumed into SRI International and plays a role similar to Xerox PARC's in the corporate world. Meanwhile, once-freewheeling federal agencies like DARPA and NASA have been kept alive but hamstrung by lower funding levels and more restricted missions.

DARPA, known for its creation of the Internet (which began as ARPAnet, a modest network of document-sharing computers linking four research institutions), is still a research arm of the Department of Defense, but with a strong emphasis on keeping it real—that is, little speculative basic research. During the last round of budgeting, the agency was spared major cuts to its $2.8 billion budget (about 20% less than it was averaging during President George W. Bush's second term), but all bets are off if sequestration goes into effect. The same goes for NASA, which faces only a slight loss to its already badly battered $17.8 billion budget (about 5% less than what it was allotted 10 years ago), but one big enough to force it to withdraw from a planned joint project with the European Space Agency to send unmanned missions to Mars in 2016 and 2018.

Just as disturbing, none of these cash-strapped research entities, Slywotzky says, appear poised to unveil any game-changers—new technologies or products that can generate not just hundreds but thousands of new jobs and an outward splash of collateral technologies.

Notable Exceptions Exist

There is an exception, he adds: IBM, whose Watson computer has made huge strides in the study of artificial intelligence. Following Watson's victory against two “Jeopardy” TV game show champions last year, the company is collaborating with Columbia and the University of Maryland to train the machine as a physician's assistant. Because of its artificial brainpower, Watson has the potential to quickly absorb and interpret complicated medical records and to share what it finds using language rather than data. “This is probably the transistor of the 21st century,” Slywotzky says. “Think of all the applications it will evolve into.”

One other huge node of innovation that Slywotzky doesn't mention is Silicon Valley. By most standards, this California mecca of information technology is now the most successful of them all, with its vastly popular social media companies and its array of new, highly marketable electronic gadgets, like tablet computers and smartphones. Yet companies like Google and Facebook have not been generators of the kind of “spillover” technologies that Bell Labs produced. And they certainly haven't created a lot of jobs at home. The Bureau of Labor Statistics (BLS) says the region lost 19.9% of its high-tech jobs in the first eight years of the century. BLS researchers said they could not give exact numbers for the years since, but some indicators show the downward trend continuing.

Apple, the current superstar of Silicon Valley, has created 700,000 jobs in manufacturing products like the iPad and the iPhone—most of them in Asia. The company employs 43,000 in the United States.

To be sure, some R&D authorities remain deeply skeptical about broad-brush portrayals of American decline predicated on R&D funding levels.

“I reject that narrative,” says G. Pascal Zachary, professor of practice at Arizona State University, an author and frequent commentator on technology and innovation. The problem is not levels of spending, Zachary contends, but the outcomes. “The National Institutes of Health's [NIH] budget has doubled in the past 10 years,” he says. “It's more than $30 billion now. Most of it's going for basic research. No one would argue that NIH has done a stellar job at reducing health care costs or improving life spans.”

Zachary and others are looking for ways to make the research system more efficient—by streamlining NIH, for example, with its multiple institutes focused on individual diseases, or by creating a system of intra-agency competition for R&D funds. Yet there is little indication any of these ideas will be adopted.

One of America's greatest innovations of the mid-20th century was the U.S. mode of innovation itself, many experts have suggested. What the country's 21st-century research establishment and its advocates still have not developed is a strategy that will give the United States an advantage in the globalized economy. It will have to be a strategy that not only nurtures creative thinkers but also paves the way for a new wave of innovation that will strengthen and sustain the American economy.


Ed Newton is a Washington, D.C.-based writer, formerly of the L.A. Times, Newsday and the New York Post, as well as the former managing editor of New Times Broward-Palm Beach.
