Moore’s law in context with computing

Moore’s law is a rule of thumb in computing hardware which states that the number of transistors that can be placed inexpensively on an integrated circuit doubles approximately every two years. The observation was made by Intel co-founder Gordon E. Moore, who described the trend in his 1965 paper. The period was later revised to 18 months by Intel executive David House.

Moore stated that the number of transistors incorporated on an integrated circuit would rise exponentially, such that the performance of the circuit would double every two years. Nearly every piece of digital electronic equipment has seemed to obey Moore’s law in terms of memory capacity, speed of operation, or feature integration. In fact, it almost seems that the smaller a device is, the more features are packed into it. The capabilities of many digital electronic devices are strongly linked to Moore’s law: processing speed, memory capacity, sensors, and even the number and size of pixels in digital cameras. This exponential improvement has dramatically enhanced the impact of digital electronics in nearly every segment of the world economy. The industry expects Moore’s law to continue to hold until around 2020.
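The doubling described above can be written as a simple exponential formula. Here is a minimal sketch (an illustration, not from the original article) that projects transistor counts under an assumed two-year doubling period:

```python
def transistors(n0, years, doubling_period=2.0):
    """Projected transistor count after `years`, assuming the count
    doubles every `doubling_period` years (Moore's law)."""
    return n0 * 2 ** (years / doubling_period)

# Example: Intel's 4004 (1971) had roughly 2,300 transistors.
# Projecting 40 years forward at a two-year doubling gives a count
# on the order of billions, broadly in line with chips of the early 2010s.
print(round(transistors(2300, 40)))
```

Note how sensitive the projection is to the doubling period: using David House's 18-month figure instead of two years yields a far larger number over the same span, which is why the quoted period matters.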

Why Moore’s Law is in News?

Moore’s law is frequently in the news and is often cited in relation to developments in hardware. For more than 50 years, the semiconductor industry has upheld Moore’s law.

Recently, researchers in Australia fabricated, with pinpoint accuracy, a working transistor consisting of a single atom, marking a major stride towards next-generation computing. The device comprises a single phosphorus atom etched into a silicon bed, with “gates” to control electrical flow and metallic contacts that are also at the atomic scale. It was also claimed that, despite this astonishing run of success in nanotechnology, Moore’s law could hit a wall by the end of this decade without a breakthrough in miniaturizing transistors.
