Skills that future chip engineers should possess

March 27, 2026


Prior to the Synopsys Converge keynote address, Semiwiki interviewed Ravi Subramanian, Chief Product Management Officer at Synopsys, who highlighted several key trends shaping the future of artificial intelligence, semiconductor technology, and engineering. 

His discussion focused on how silicon chip design and systems engineering are converging, largely driven by the rapid development of AI and the need for more efficient computing infrastructure. This conversation provided an in-depth analysis of the technological, economic, and engineering challenges that will define innovation over the next decade.

The significance of integration

One of the points Ravi first explored was the significance of the "Converge" initiative. It symbolizes the merging of two traditionally separate disciplines: silicon engineering and systems engineering.

Silicon engineers focus on designing semiconductor chips, while systems engineers design complete products such as automobiles, medical devices, and industrial machinery. Historically, these two fields operated relatively independently. However, modern technologies, especially those driven by artificial intelligence, demand close collaboration between the two disciplines. 

For example, self-driving cars, robots, and smart devices all rely on the coordinated work of specialized chips, complex software, sensors, and physical systems. Therefore, the lines between hardware engineering and systems engineering are becoming increasingly blurred.

Another key theme of the interview was how to measure the performance of AI systems. Traditionally, the industry has focused on metrics such as "tokens per second," which measures how quickly an AI system processes information. However, Ravi explained that the industry is now paying more attention to efficiency-based metrics such as "tokens per dollar" and "tokens per watt." 

These metrics assess how much useful computation an AI system delivers per dollar spent and per watt consumed. This shift is crucial because running large AI systems is extremely costly and energy-intensive. For example, Ravi mentioned that AI-assisted search queries can require four to six times the energy of traditional search queries. As AI is deployed more widely, improving energy efficiency will become one of the most pressing challenges facing the technology industry.
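To make these efficiency metrics concrete, here is a minimal sketch of how they relate to one another. The functions and all the numbers in the example run are hypothetical and purely illustrative; they are not figures from the interview. Note that "tokens per watt" is conventionally throughput divided by average power draw, i.e. tokens per second per watt.

```python
# Illustrative AI serving efficiency metrics.
# All inputs below are hypothetical, for demonstration only.

def tokens_per_second(tokens: int, seconds: float) -> float:
    """Raw throughput: how fast the system generates tokens."""
    return tokens / seconds

def tokens_per_dollar(tokens: int, cost_usd: float) -> float:
    """Cost efficiency: useful output per dollar spent."""
    return tokens / cost_usd

def tokens_per_watt(tokens: int, seconds: float, avg_power_watts: float) -> float:
    """Energy efficiency: throughput divided by average power draw
    (tokens per second per watt)."""
    return tokens_per_second(tokens, seconds) / avg_power_watts

# Hypothetical run: 1,000,000 tokens generated in 500 s,
# costing $2.50 at an average power draw of 700 W.
print(tokens_per_second(1_000_000, 500))     # 2000.0 tokens/s
print(tokens_per_dollar(1_000_000, 2.50))    # 400000.0 tokens/$
print(tokens_per_watt(1_000_000, 500, 700))  # ~2.857 tokens/s per watt
```

Two deployments with identical tokens-per-second throughput can differ sharply on the latter two metrics, which is why the industry's attention is shifting toward them.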

Artificial intelligence technology and global economic growth

Ravi also linked artificial intelligence to global economic growth. He explained that the global economy currently generates approximately $117 trillion annually: about $41 trillion comes from physical products that require engineering and manufacturing, and about $60 trillion comes from the service sector. Many economists believe global GDP could double over the next 25 years, reaching approximately $250 trillion, and Ravi argued that this growth will be driven largely by productivity gains from AI. Because those AI systems depend heavily on advanced semiconductors and computing infrastructure, the semiconductor industry will play a central role in driving future economic expansion.

To understand the future direction of AI hardware, Ravi identified four key components that determine the performance of AI systems: computing, interconnect, storage, and power.

Computing refers to processors, such as GPUs and dedicated AI accelerators, which perform the computations required to train and run AI models. Interconnect refers to the technology for transferring data between chips and computing nodes; efficient data transfer is crucial because moving data typically consumes more power than the computation itself.

Storage, especially high-bandwidth memory, is another major challenge, because modern AI models require massive amounts of data to operate effectively. Ravi warned that if AI data centers consume most of the available memory supply, shortages could disrupt other industries. Finally, power is a major constraint, as large AI systems require significant amounts of electricity to operate.

The interview also highlighted the potential for major changes in the semiconductor supply chain. Ravi pointed out that the semiconductor industry is entering the first decade of a major transformation as companies adjust manufacturing processes, design methodologies, and infrastructure to support the booming AI economy. This transformation will impact everything from chip architecture to memory production and data center design.

In conclusion, Ravi emphasized that future engineers will need broad, interdisciplinary knowledge. Systems engineers must understand semiconductor technology, while chip designers must understand the physical principles and system behavior of the real world. As AI continues to expand into robotics, autonomous systems, and other forms of "physical AI," the integration of software, hardware, and physical systems will become increasingly important. 

This convergence will ultimately determine the future of technological innovation.

Source: Compiled from semiwiki
