Investors underwrite new era of chips

In the past couple of years, chip companies have contemplated life after Moore’s Law, the semiconductor industry’s guiding principle for half a century.

Their assumption has been that with chips no longer able to rely on ever-smaller silicon circuits to lower power consumption and improve performance, another approach had to be found.

Perhaps the most promising avenue, at least in the short term, is special purpose, or custom, silicon targeting specific tasks, such as those that take advantage of artificial intelligence, or machine learning. Big money has been following, with a new generation of well-funded chip startups now revitalizing a moribund sector of venture investing.

Whether the money keeps pace this year is an open question. More likely than not, dollars will shift to follow-on deals, as companies able to show progress seek additional capital. What’s not under debate is that chip investing looks forever changed.

The hard realization for the semiconductor industry is that the decades-long era of one-size-fits-all CPUs and GPUs has passed. Circuit shrinks continue, but at a slower pace, to nodes such as 7 nanometers, where power leakage is high and performance gains are less pronounced.

The result has been a quiet rethinking of the way chips are conceived and growing momentum behind task-specific processors.

Today “we’re putting dedicated silicon everywhere” from the data center to the edge of the network, said Eric Klein, a partner at hardware-focused venture firm Lemnos Labs. And from the new generation of chip entrepreneurs, “we’ve definitely seen a large increase in applications from people working on custom silicon.”

Investors trace the start of this new era to the 2016 sale of Nervana Systems to Intel for an estimated $408 million. While the price tag wasn’t huge, it showed that money could be made with this new type of company.

The funding boom that followed channeled $1.02 billion into young companies last year in the United States, almost double 2017, according to data from Thomson Reuters. Through mid-June this year, another $408 million was deployed, suggesting an annual total in the neighborhood of $800 million, another solid year even if a decline.

At the heart of the revival in chip investing is a four-letter word: data. The custom chips are being designed to crunch through large data sets that aren’t easily or efficiently handled by racks of CPUs or GPUs. They typically make use of AI or machine learning algorithms and boast lower power requirements.

They also serve real customer needs. Applications show up not only in stock trading, where the data inputs change second to second, but in a variety of tasks, such as credit card fraud detection, where millions of daily purchases flood into the data center.

Custom chips are also being applied to image recognition, computer vision for autonomous vehicles, robotics and edge devices, where processing on location saves the cost of transporting huge streams of data to the network core.

“Everything is about data – massive, massive data,” said Lip-Bu Tan, chairman of Walden International, who coined the term “workload-specific processors.” “That is a new class of processor coming up.”

What makes many of these chips unique from a design perspective is that they are programmable and can be adapted to a need. This flexibility is both reassuring to investors and a point of differentiation. Key to this programmability is software: not just the software used to design the chips, but software that maps new tasks directly onto the hardware circuitry to improve performance.

This heightened interdependence between software and hardware has companies relying on new skills, including a renewed focus on custom compilers, again offering a point of differentiation. By contrast, when software was routinely developed for Intel’s line of x86 chips, standard compilers could be used.

So far incumbents in the chip business haven’t gone after this type of opportunity, said Shahin Farshchi, a partner at Lux Capital, who has invested in Aeva and Mythic.

“We do see the opportunity for many of these companies to become large multi-billion-dollar standalone companies, driven by their ability to execute and by the scale of the markets they sell into,” Farshchi said.

Not surprisingly, entrepreneurs have flocked to the market. Nearly 50 companies are estimated to be working on AI inference and training chips alone.

“Machine learning is attracting a lot of money,” noted Stan Reiss, a general partner at Matrix Partners who has invested in Lightmatter. The company is developing an optical computer chip and raised an additional $22 million in February in a GV-led round.

More capital is likely. According to investors, this money isn’t likely to be chasing new bets on AI training chips, the server-based semiconductors designed to learn from large data sets, in large part because those bets have already been made and because big internet companies, including Google and Amazon, have their own chips under development.

But inference chips, or chips optimized to apply that learning to new data and draw conclusions, could see considerable funding with new companies raising capital.

Also drawing interest are lidar chips for autonomous vehicles, which have attracted plenty of money but where companies remain at an earlier stage than in machine learning. Later-stage activity could pick up.

Investors have also shown a good bit of interest in intelligent edge processors, sensors and other circuitry with industrial uses linked to the internet of things. Volumes are anticipated to be high, and on-device data crunching can be of particular value because it avoids transporting raw data to the network core.

Another factor making the investments attractive is the way startups approach chip development from the hardware perspective: through the use of programmable FPGAs (field-programmable gate arrays) and ASICs (application-specific integrated circuits).

While capital intensity is still high, ready-to-be-programmed FPGAs from chipmakers such as Lattice Semiconductor can be quickly turned into prototypes using improved electronic design automation tools. Outside vendors can even be brought in to tune ML algorithms to a specific chip.

A generation of engineers is coming to understand the steps, making talent available, investors say.

What this allows companies to do is take an FPGA design and migrate it to an ASIC, where the software can be baked permanently into the silicon. Fabs are available to spin these ASICs at low volumes.

So now a $4 million or $5 million seed round can enable a startup to get a pilot ASIC to customers, even facing the extraordinary complexity of taping out a modern chip, or laying out its design for production. Half a decade ago, a $4 million to $5 million round was a much harder deal to do.

“All of a sudden in 2019 it’s a viable round,” Klein said.

Once a design gets to market, profitability can ramp quickly, even if a company has soaked up $25 million to $50 million in venture funding. This is because the customers that place a chip in a server or network device, not the chip designer, pay the marketing and promotional costs of the end product.

On the other hand, big hurdles do face this generation of startups. AI chips targeting the data center or compute infrastructure need to be developed on the most advanced production equipment running at 7 nanometers or 10 nanometers, complicating design and production. As a result, mask costs can be astronomical, sometimes rising to $10 million.

“You need to push the envelope to the advanced node,” Tan said.

Startups also need key design wins, no easy task as sales cycles for initial customers can be long.

By and large, investors expect this year to remain a solid one for capital in the sector. As deals shift to later rounds, the number of companies funded could be down, but capital should remain strong.

“I think you’ll see continuing investment in the area,” Klein said.

Yet even if investment is slower than last year’s surge, customer demand is real. “It’s not a want, it’s a need,” he said. “The overall need for the problem will not go away.”

One significant risk is the possibility of an economic slowdown. Unsettled financial markets could pressure follow-on rounds in the quarters or years to come for companies with $30 million of funding and urgent demands for $20 million more.

“There is always a financing risk that any chip investor takes,” Reiss said.

Hrach Simonian, general partner, Canaan Partners. Photo courtesy of the firm.

But promising opportunities remain. For Hrach Simonian, a general partner at Canaan Partners, this includes the autonomous vehicle. Already, Simonian has backed Aeva, with its unusual Doppler approach to lidar.

“I think the connected car will be a big market,” he said.

Simonian said he’s interested in custom chips that can improve power efficiency as they deploy ML and process data. He’s also keeping an eye on opportunities in the connected car space, where AI might be used to connect systems within a car and perhaps monitor things such as the health and fitness of passengers.

Tan also remains active, with 23 new and follow-on semiconductor deals in 2018 and 10 new deals so far in 2019. His firm averages between 15 and 30 a year.

“I used to have a hard time talking to my brothers and sisters” about semiconductor deals, he said. “Now they join me for investments.”

It seems eyes remain open to this market in transition.
