The electronic design automation (EDA) industry started in the 1980s and was driven primarily by the test and PCB industries. The test industry was focused on simulation so that test vector sets could be developed and optimized. The PCB industry needed help managing complexity as system sizes grew.
That complexity soon was eclipsed by IC complexity, and the costs associated with making a mistake skyrocketed. The convergence of schematic capture and simulation marked the birth of the industry, and with it came the establishment of the register transfer level (RTL) of abstraction and the electronic design interchange format (EDIF) as a means of moving information between tools.
The first step in automation came with place-and-route at the gate and transistor level, followed several years later by logic synthesis, which connected RTL with the physical layer. From that point onward, the flow has been systematically refined and improved, while technology scaling brought in many new challenges.
Around 2000, the industry attempted to define and build a new, higher level of abstraction called the electronic system level (ESL). This was really the first attempt at hardware/software co-design, and while it did result in some new and successful tools, such as high-level synthesis, ESL was in most respects a failure.
But the need did not go away, and recently it has seen a resurgence, driven by the rapid adoption of RISC-V and the growing ecosystem surrounding the open-source processor architecture.
ESL attempted to define an optimum chip design by considering what should execute in software, and on which processor; what needed to be implemented in hardware to meet performance goals; and how to connect the two. It proved to be too complex a task at the time, and provided too few benefits.
With RISC-V, many people have a much tighter goal: how do you optimize a processor core for the execution of a specific task? With the right level of analysis, they can improve their overall metrics — be it area, speed, power, or some combination of them — by orders of magnitude compared to what they would achieve by using standard components or IP and optimizing an implementation.
What they can't do is speed up the implementation and verification of the processor they have defined. What frustrates many of them is that the gains they are likely to see from the traditional EDA flow are marginal compared to the optimization they already have done. They don't care whether the design ends up within 5% of optimal. They probably would be willing to give up considerably more if they could get there faster and more reliably.
This is an area where the EDA industry could be disrupted, and it is a classic disruption of the kind Clayton Christensen described in his book "The Innovator's Dilemma." Imagine an EDA flow that is nowhere near as sophisticated as the existing flows. It may impose many design restrictions, often hidden within the processor architecture, but those restrictions would enable someone to create a processor-based system in a fraction of the time and cost of the traditional EDA flow.
Yes, it would produce inferior results for existing users of EDA tools, but it would enable a new class of designer to consider a custom solution where in the past they had relied on pre-existing components. These likely would be small chips targeting IoT-type devices, but they could provide significant gains over the existing choices by being smaller, cheaper, and more highly tuned for a specific application.
It is unlikely that existing EDA companies would embrace these customers, who would not be willing to spend a large amount of money on such tools. It is equally unlikely to come from an open-source movement, which would require a fairly large number of companies to come together and essentially prepay for those tools to be built and then maintained. Many good-enough tool technologies exist today, and they would not have to deal with the challenges of the latest fabrication nodes.
It requires someone to work out a good set of limitations to simplify the flow. They have to find a way to streamline the process so that it becomes push-button, not only in its implementation but also in its verification. Maybe that comes from a new language or abstraction. Maybe the user never even has to see hardware in a way they would recognize today.
This is but one example of a domain-specific EDA flow that can exploit the inherent simplifications that come from restricting aspects of the design. There are potentially many more, and even with increased investment coming into the industry, I doubt any money would find its way into these types of flows. The companies behind them are simply not seen as big enough, or important enough.
Disruption is always possible, and the more an industry insists it is unlikely to happen, the more likely it is that the industry is not looking in new directions or at new customers. It is focused on delivering what its existing customers need.