
Shifts in INTEGRATION: 20 years of VLSI design

โœ Scribed by Ralph H.J.M Otten


Publisher
Elsevier Science
Year
2002
Tongue
English
Weight
68 KB
Volume
32
Category
Article
ISSN
0167-9260


Synopsis



Twenty years ago, activity in design automation increased almost stepwise. Besides the increased industrial effort, it was also noticeable in the birth of new conferences and new journals, whose editorials and forewords motivated the new initiatives with the very fast change in the world of integrating circuits. It was the time when many people dreamed of what was called a silicon compiler: an automatic translation from some specification into a complete mask set for an integrated circuit. Designing chips would become an effort comparable to programming: the desired functionality would be captured in some high-level language, and this compiler would produce the specification of a complete mask set for a chip with that functionality. Now, two decades later, one may ask: "Did it change so much? Are chips designed drastically differently from how they were designed then? Do we have silicon compilers?" The answer to the latter question, if taken as "Can any chip be designed from a high-level specification and provided with a mask set without further human intervention?", is certainly "No!" And sceptics may say "No!" to all of these questions. The dominant routing methodology today is maze routing, a technique known and used in the early 1960s. Force-directed placement, an even older concept, is very close to how chip components are placed today. Binary decision diagrams, which dominated the logic synthesis literature for almost a decade, are of the same age. So what is essentially new in circuit integration? Was there a significant development?
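The maze routing mentioned above is classically Lee's algorithm: a breadth-first wave is expanded from the source cell across a routing grid until it reaches the target, then a backtrace recovers a shortest path. A minimal sketch of the general technique (an illustration, not any particular router's implementation):

```python
from collections import deque

def maze_route(grid, src, dst):
    """Lee-style maze routing: breadth-first wave expansion on a grid.
    grid[r][c] == 1 marks a blocked cell. Returns a shortest path from
    src to dst as a list of (row, col) cells, or None if unroutable."""
    rows, cols = len(grid), len(grid[0])
    prev = {src: None}            # doubles as the visited set
    frontier = deque([src])
    while frontier:
        cell = frontier.popleft()
        if cell == dst:           # target reached: backtrace phase
            path = []
            while cell is not None:
                path.append(cell)
                cell = prev[cell]
            return path[::-1]
        r, c = cell
        for nxt in ((r - 1, c), (r + 1, c), (r, c - 1), (r, c + 1)):
            nr, nc = nxt
            if 0 <= nr < rows and 0 <= nc < cols \
               and grid[nr][nc] == 0 and nxt not in prev:
                prev[nxt] = cell
                frontier.append(nxt)
    return None                   # wave exhausted: no route exists

# A 4x4 grid with a wall of obstacles forcing a detour.
grid = [[0, 0, 0, 0],
        [1, 1, 1, 0],
        [0, 0, 0, 0],
        [0, 1, 1, 1]]
path = maze_route(grid, (0, 0), (2, 0))  # detours through (1, 3)
```

Because the wave expands one grid step at a time, the first time the target is reached is guaranteed to be via a shortest path, which is why the technique has survived since the early 1960s.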

To start, silicon compilers of the scope envisioned in the early 1980s are certainly within reach, and all the needed ingredients were properly developed. There is no real problem in getting a microprocessor of a few thousand transistors automatically synthesized. But that fact is hardly of significance now, for several reasons. One is of course that a microprocessor of a few thousand transistors is no longer of interest. Moore's Law, formulated in the 1960s, predicted an exponential growth of chip complexity, and this law has remained incredibly close to reality up to now. The algorithms developed for synthesizing a ten-thousand-transistor microprocessor are mostly superlinear, but even if they were linear, an exponential growth in computational capacity would already be necessary to maintain the adequacy of these algorithms for the chips of today. The truth is that most problems these algorithms try to solve do not even allow efficient computation, i.e., no polynomial-time algorithm exists for these problems (provided NP ≠ P).

Another reason for the obsolescence of the silicon compiler of the 1980s is that the models underlying the algorithms are no longer realistic. To mention just one effect that could be totally ignored then and became of extreme importance in the 1990s: wire delay. Interconnect used to be interpreted as a circuit node, meaning that the potential on an interconnect tree was the same everywhere on that tree at all times. The impact on delay was only through the wire's capacitance, and even that was considered small compared to the connected gate capacitance. The main objective was to minimize area. Smaller area allowed more functionality on
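The shift the synopsis describes, from treating a wire as a single node to modeling its distributed resistance and capacitance, is commonly captured by the first-order Elmore delay: for each resistive segment of an RC ladder, its resistance times the total capacitance downstream of it. A small sketch with made-up unit segment values (not taken from the article), illustrating why wire delay grows roughly quadratically with length:

```python
def elmore_delay(segments):
    """Elmore delay of an RC ladder. segments is a list of (R, C)
    pairs from driver to load; each R is charged with all the
    capacitance at and beyond its downstream node."""
    delay = 0.0
    downstream_c = sum(c for _, c in segments)
    for r, c in segments:
        delay += r * downstream_c  # this R sees everything downstream
        downstream_c -= c          # next R sees one segment less
    return delay

# Illustrative unit-R, unit-C segments: doubling the wire length
# roughly quadruples the delay (10.0 for 4 segments, 36.0 for 8).
short_wire = elmore_delay([(1.0, 1.0)] * 4)
long_wire = elmore_delay([(1.0, 1.0)] * 8)
```

Under the old single-node model, the same wire would contribute only its total capacitance, so its delay would scale linearly with length; the quadratic term is exactly what made wire delay impossible to ignore in the 1990s.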


SIMILAR VOLUMES


20 years of DES โ€” How it was designed
โœ Carl Meyer ๐Ÿ“‚ Article ๐Ÿ“… 1997 ๐Ÿ› Elsevier Science ๐ŸŒ English โš– 105 KB

An IBM-developed algorithm was adopted in 1977 as a national standard: the Data Encryption Standard (DES). Although the entire algorithm was made available to the public, the design considerations were not published. Many people speculated that the lack of disclosure was due to some 'trap door' or hi

Preservation of fungi in water (Castella
โœ Claudia Hartung Capriles; Sofia Mata; Marianne Middelveen ๐Ÿ“‚ Article ๐Ÿ“… 1989 ๐Ÿ› Springer Netherlands ๐ŸŒ English โš– 389 KB

Five hundred ninety-four strains of fungi were studied. They had been preserved by Castellani's method in distilled water for 1 to 20 years. 62% of the strains (n = 368) grew when subcultured and maintained their main morphological features. 90% of the 20-year-old strains of di