Microcontrollers

Building Blocks of Electronic Devices

How Did They Start?

Like many digital technologies, the initial breakthrough that brought microcontrollers to market was quickly followed by further innovations that made these devices more efficient and more affordable.

The first microcontroller was developed in 1971. It would take three years for the device to go commercial, however, which happened in 1974 when the TMS 1000 hit the market. This chip had all the basics required to make a computer, including a processor, memory, and a clock, all on one tiny integrated circuit.

By 1977, the Intel 8048 was on the market, and it went on to become a huge commercial success for Intel. At this point, however, microcontrollers were not as cost-effective as they would eventually become. To erase the program memory on these devices, for instance, the memory had to be exposed to ultraviolet light. This required a quartz window on the package, which drove up the cost considerably. The alternative was a microcontroller that could only be programmed once, which was also undesirable from a cost standpoint. It wouldn't be until the early 1990s, with the arrival of electrically erasable memory, that this problem was solved.

Today, microcontrollers are very inexpensive. This allows them to be used in many different industrial applications, medical applications and more. It also lets those who simply have an interest in computer science use these devices for experimentation and education. Since modern microcontrollers can be erased electronically, they can be reprogrammed over and over again. Simulators have also been developed in recent years, which allow programmers to try out their code on a simulated microcontroller before they install it on the real thing.
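To give a concrete sense of what programming one of these devices involves, below is a minimal LED-blink sketch for an AVR-style microcontroller, written in C against the avr-libc headers. The specific part (an ATmega328P), the pin, and the 16 MHz clock are assumptions chosen for illustration; a program like this can be exercised in a simulator such as simavr before being flashed to real hardware.

    /* Minimal blink sketch; assumes an ATmega328P running at 16 MHz. */
    #define F_CPU 16000000UL       /* assumed clock frequency; used by _delay_ms() */

    #include <avr/io.h>
    #include <util/delay.h>

    int main(void)
    {
        DDRB |= (1 << DDB5);           /* configure pin PB5 as an output */
        for (;;) {
            PORTB ^= (1 << PORTB5);    /* toggle the LED pin */
            _delay_ms(500);            /* wait half a second */
        }
    }

Even this tiny program exercises the pieces the history above describes: it lives in the chip's on-board program memory, runs on the on-chip processor, and paces itself against the device's clock.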