The Secret to Building the Next Silicon Valley

Other regions have tried to capture the tech title for decades. Why haven't they succeeded?

Political leaders have been trying to replicate Silicon Valley’s high-tech magic since the invention of the microchip. A tech-curious Charles de Gaulle, then president of France, toured Palo Alto in his convertible limousine in 1960. Russian Federation President Dmitri Medvedev dressed business casual to meet and tweet with Valley social media tycoons in 2010. Hundreds of eager delegations, foreign and domestic, visited in between. “Silicon Valley,” inventor and entrepreneur Robert Metcalfe once remarked, “is the only place on earth not trying to figure out how to become Silicon Valley.”

In the US, too, leaders have long tried to engineer another Silicon Valley. Yet billions of dollars of tax breaks and “Silicon Something” marketing campaigns later, no place has matched the original’s track record for firm creation and venture capital investment—and these efforts often ended up benefiting multinational corporations far more than the regions themselves. Wisconsin promised more than $4 billion in tax breaks and subsidies to Taiwanese electronics manufacturer Foxconn in 2017, only to see plans for a $10 billion factory and 13,000 jobs evaporate after hundreds of millions of taxpayer dollars had already been spent to prepare for Foxconn’s arrival. Amazon’s 2017 search for a second headquarters had 238 American cities falling over each other to woo one of the world’s richest corporations with tax-and-subsidy packages, only to see HQ2 go to two places Amazon likely would have chosen anyway because of their preexisting tech talent. One of the winners, Northern Virginia, promised Amazon up to $773 million in state and local tax subsidies—a public price tag for gleaming high-tech towers that seems especially steep as Amazon joins other tech giants in indefinitely pushing back post-pandemic plans to return to the office.

While the American tech industry is vastly larger than it used to be, the list of top tech clusters—the Bay Area, Seattle, Boston, Austin—has remained largely unchanged since the days of 64K desktop computers and floppy disks. Even the disruptions of the Covid-19 pandemic have done little to alter this remarkably static and highly imbalanced tech geography.

Still, politicians are trying again. Bills working their way through Congress include the US Innovation and Competition Act (USICA), which contains big boosts to research spending, $10 billion in new grants and subsidies to develop “regional innovation hubs,” and $52 billion to expand domestic semiconductor production. The Build Back Better Act now battling its way through the Senate includes more than $43 billion for tech-inflected programs to boost local economies. These measures emphasize investment over tax breaks, and in sum invest far more in place-based economic strategies than the US has in decades. They are promising. But they are only a start.

You don’t have to travel far in Silicon Valley to find a techno-libertarian proclaiming that the sector’s success is purely the result of entrepreneurial hustle and that the best thing government can do is get out of the way. But that conclusion ignores history. In reality, public spending played an enormous role in growing high-tech economies in Silicon Valley, Seattle, Boston, and Austin. Understanding how this happened is essential to imagining where tech might grow next.

During World War II, the US government’s unprecedented mobilization of people and resources remade America’s economic map. Depression-ravaged Midwestern assembly lines jolted back to life at the government’s command, churning out Jeeps and tanks instead of passenger cars. Scientists and technologists set aside their usual research pursuits to join the wartime “army of brains.” Many were involved in the top-secret push to develop an atomic bomb, living in entirely new communities constructed by the military in places so remote that they could remain unnoticed: the New Mexico desert, the arid plains of Eastern Washington, the hollows of rural Tennessee.

World War II was the test case for using government investment to spur scientific progress and remake regional economies. The Cold War took it to scale. Military spending that had shrunk at war’s end surged back by the early 1950s amid a new atomic arms race with the Soviet Union and war in Korea. Take a stroll around an American university campus today, note the number of science buildings erected in the 1950s and 1960s, and you can see the poured-concrete results.

Initially, the regions at the top of the high-tech heap were on the East Coast; Boston was the nation’s largest tech economy well into the 1980s. The region that eventually dislodged Boston from its high-tech throne was, before the war, best known as the nation’s capital of prune production. The one thing setting the future Silicon Valley apart from its agricultural counterparts was Stanford University, which had some pretty good engineering programs and a few alumni tinkering in garage startups nearby.

As Cold War spending surged into the military installations of the Pacific West, the Valley economy transformed. Rightly anticipating the enormous sums the government would spend on academic science, Stanford administrators reorganized the university to beef up programs like physics and electrical engineering. Major East Coast electronics firms and defense contractors set up branch operations to be close to the region’s military facilities and tap into Stanford-trained engineering talent. In 1955, Los Angeles-based Lockheed opened its Missiles & Space Division a few miles down the road from Stanford’s campus. The defense giant remained Silicon Valley’s largest employer into the 1980s, doing work so top-secret that its engineers couldn’t disclose it at the family dinner table.

Unlike the big computer makers of the East, the Valley built small. Its transistorized electronics and communications devices proved essential to the development of missiles and rockets, and, in later years, the personal computer and internet industries. The billions of dollars in federal grants and contracts that flowed into a 10-by-10-mile strip of California countryside became the foundation of the Silicon Valley to come.

NASA and the Pentagon ordered silicon semiconductors and integrated circuits from startups like Fairchild Semiconductor, becoming bedrock customers for a firm whose founders went on to establish Intel, venture capital firm Kleiner Perkins, and other iconic Valley names. Like Apple cofounder Steve Wozniak, whose father was a Lockheed engineer, the offspring of Valley defense workers grew up familiar with and fascinated by advanced electronics. They tinkered in their basements, took summer jobs at Hewlett-Packard and Atari, and—as Wozniak did with a fellow child of the Valley, Steve Jobs—started tech companies of their own.

Similar things happened in other places that became and remain capitals of American tech. Defense and space spending boosted Austin, enlarged the Texas semiconductor industry, and elevated the research reputation of the University of Texas. Seattle boomed, thanks to the expanding Cold War military, its growing public research institutions, and the defense contracts rolling into Boeing (then the region’s largest employer). In the early 1970s, a teenaged Bill Gates would sneak into the University of Washington computer lab after hours to write his first software programs.

It wasn’t just tech policy that made these regions what they are, however. Social spending mattered too. In the prosperous postwar years, the GI Bill sent millions of veterans to college and helped them buy homes. States like California enlarged public higher education systems, making it easy to obtain a low-cost, top-flight university education. Schools and local infrastructure were well-funded, especially in the growing suburbs that many tech people and companies called home.

Early Silicon Valley was filled with people from modest backgrounds who benefited greatly from this broad mix of public investment. The first generation of high-tech entrepreneurs were preachers’ sons from small-town Iowa and farm boys from Texas, whose engineering smarts handed them education, economic mobility, and their pick of tech jobs. The second, Baby Boom generation graduated from college hooked on computers, unencumbered by student debt, and itching to build new things. Steve Jobs’ father never finished high school, but he could get a job as a laser technician in 1960s Silicon Valley that paid him enough to buy a suburban home and send his son to a public high school with its own computer lab.

The US government had a transformative impact on high-tech development when its leaders were willing to spend big money on research, advanced technology, and higher education—and to sustain that commitment over decades.

In recent decades, this has changed. Political leaders embraced tax cuts, not spending, as tools to grow the economy. In 1978, Californians voted for a property tax cap that has drained local governments of resources ever since, leaving schools underfunded and infrastructure crumbling. The proportion of the federal budget devoted to research and advanced technologies steadily declined, as did state budgets for higher education. Modern Silicon Valley grew wealthy amid this pullback from the public realm, so it’s not surprising that many tech leaders are dismissive (if not resentful) of government, and most believe public policy had little to do with their entrepreneurial success.

Instead of investing in places and people, would-be Silicon Valleys dangled industry subsidies and tax breaks to lure tech firms from elsewhere in the world. The academic research emphatically shows that this elaborate shell game can be a supremely costly job-creation strategy. For every job gained, regions sacrifice tax revenue and funnel limited resources away from wider public needs.

The next Silicon Valley will not come from a race to the bottom, a contest over who can offer the most tax cuts, the leanest government, the loosest regulations. It will result from the kind of broad, sustained public investment that built the original Valley.

Historically, such willingness to invest usually involved world war or a comparable geopolitical threat. The pattern holds today. Lawmakers wrangle over anti-poverty measures like the child tax credit but pass enormous military budgets with ease. Anxiety about China’s technological ambition drives spending-averse lawmakers to vote for semiconductor subsidies and greater research budgets. Yet the challenges facing the nation and the world—climate change, inequality, the erosion of democracy—require innovations beyond weapons of war.

This is a long game. There are no photo ops or glossy, buzzword-loaded promotional brochures. It shifts the focus away from simply growing the tech and toward nurturing the people who make it. It abandons the free-market mythos in favor of a recognition that public and private, working together, are how the American tech sector grows.

These are investments that spark and sustain basic research without an immediate demand to commercialize, that generously fund higher education so graduates have the economic freedom to pursue sometimes risky and iconoclastic projects. The mRNA vaccines for Covid-19 demonstrate the benefits of this kind of long-term approach. Under development for years, tested and refined outside of market demands, the technology was ready to be rapidly deployed when crisis struck.

A next generation of high-tech places will come from investments in people, as well as in technology. Because this is the 2020s, not the 1950s, those investments need to be made with equity in mind. The original Silicon Valley has a woeful record when it comes to gender and racial diversity, especially at its top ranks. The next Silicon Valley can do better.

There are things that only governments can do: invest at scale, incentivize private markets to push the technological envelope, serve as a deep-pocketed customer for high-tech products, support mass educational advancement and economic opportunity. The recent legislative proposals are a start, but not yet enough.

Ten billion dollars is a lot of money, but it is not enough to generate the kind of robust, long-lasting “regional innovation hubs” lawmakers are hoping for, especially if the investment is spread across too many places. And political leaders who embrace semiconductor subsidies but reject new social spending are missing the point: Sustainable tech-driven economies need to be about more than a flashy new factory, or about drawing educated workers from elsewhere. They should spread opportunity and prosperity wider and create space for new entrants—people, ideas, companies—to participate.

After decades of the government pulling back, it is hard to chart a more interventionist course. But when we take the long view, we see that the next Silicon Valley is out there. It just requires the leadership, political will, and policy imagination to make it so.
