
How to be a leader in technology without losing anonymity

Apr 10, 2018

It is well known that, ever since they first appeared in the 1980s, PCs have been built around CPUs with Intel architecture. That many systems are based on PCs is also common knowledge. However, the general public may be less aware of the embedded systems all around us (in cell phones, tablets, vehicles, appliances…).


A great number of these systems use an architecture created by an atypical company: ARM. It was born in the 1980s out of partners who were close to bankruptcy (Apple, strange as it may seem now, and Acorn Computers, a name now unknown or almost forgotten) and a chip manufacturer (VLSI Technology) whose interests were often far from aligned with theirs. Given its origins (computers sponsored by the BBC), few would have bet on ARM becoming the undisputed leader in microcontrollers of the 21st century. The company’s history is detailed in a book by Daniel Nenni and Don Dingee, “Mobile Unleashed: The Origin and Evolution of ARM Processors in Our Devices” [1].

The history of ARM: from anonymity to leadership

In the late 1980s, a battle was still being waged between two giants: Intel, which thanks to IBM had conquered the world of PCs, and Motorola, which, despite having lost that battle, was still active and relatively influential in workstations (where Unix was the most successful operating system). Most 16/32-bit CPUs were either Motorola Semiconductor’s 68000 (the semiconductor division was later spun off as Freescale) or Intel’s 80286.

However, the most popular personal computers of the time remained 8-bit machines. The now-forgotten Commodore and Atari models and the Apple II (and, later on, the Apple III) were based on a microprocessor, the 6502, that derived from Motorola’s 6800 architecture and was manufactured by MOS Technology (a company founded by Motorola defectors). The 6502, together with its peripheral components, used around 25,000 transistors, a very reasonable amount compared to the roughly 68,000 that gave the MC68000 its name.

At this point, RISC (Reduced Instruction Set Computer) architectures started to appear. Their goal was to avoid the complex instruction decoding that a complex instruction set computer (CISC) must perform at all times, thus reducing power consumption, simplifying chip design and preventing the system from slowing down. In 1985, a team at Acorn and VLSI Technology came up with the ARM1 microprocessor, inspired by the 6502 and built on RISC principles. They also managed to provide graphics, icons and sound (which, until then, were only available on XEROX or ICL workstations).
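To make the contrast concrete, here is a minimal sketch in C of the load/store philosophy behind RISC. It is purely illustrative: the mnemonics in the comments are simplified stand-ins, not real ARM or x86 opcodes.

#include <stdio.h>

int main(void) {
    int mem[2] = {40, 2};   /* stand-in for main memory */

    /* CISC style: a single complex instruction may read both operands
       from memory and write the result back, e.g. "ADD [m0], [m1]". */
    mem[0] = mem[0] + mem[1];        /* mem[0] is now 42 */

    /* RISC (load/store) style: memory is touched only by explicit loads
       and stores; arithmetic happens between registers, so every
       instruction stays simple, uniform and cheap to decode. */
    int r0 = mem[1];                 /* LDR r0, [m1]   */
    int r1 = mem[0];                 /* LDR r1, [m0]   */
    r1 = r1 + r0;                    /* ADD r1, r1, r0 */
    mem[0] = r1;                     /* STR r1, [m0]   */

    printf("%d\n", mem[0]);          /* prints 44: the add ran once per style */
    return 0;
}

That uniformity of simple instructions is what allowed a design as small as the ARM1 to compete with far larger CISC chips.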

To further complicate matters, Olivetti took over Acorn Computers with the aim of selling only PC-compatible products. However, Olivetti had AT&T (owner of the UNIX operating system) as a US partner and eventually began to see the potential of combining RISC and UNIX in the professional sector (i.e., workstations and CAD systems).

Ever since they emerged at the beginning of the 1990s, PDAs (Personal Digital Assistants) and mobile phones have required low power consumption, graphic capabilities, efficient operating systems, touch screens and handwriting recognition.
Motorola, on the one hand, and Apple, on the other, contributed to ARM’s success. Motorola went through the shift from analog to digital phones and, to the detriment of its own chips, looked elsewhere (to Texas Instruments) for DSPs and low-power RISC cores. Nokia and Ericsson, leaders in the field of mobile devices at the time, followed the same path.

ARM’s meteoric rise to success

At the time, companies thought users would need PDAs. However, beyond some modest successes (like Palm), the idea never truly caught on. Despite efforts to build reduced, compatible systems (e.g., Windows CE), PDAs were never a big hit. In 1993, Apple launched a very expensive assistant, the Newton, equipped with the ARM 610 RISC processor. Even though it bombed, it served as a reference for future devices that blended telephone and PDA: the smartphone.

The focus of the main chip manufacturers on microprocessors for workstations (HP with its PA-RISC architecture, IBM) and the development of free versions of UNIX (Linux) have helped the ARM architecture become a staple of the embedded and educational systems (for instance, the Raspberry Pi) that are now commonly used.

With cores ranging from small 32-bit microcontrollers to 64-bit application processors, ARM has sold more than 800 licences. It is estimated that as many as 60 billion microcontrollers with ARM technology have been manufactured to date. 95% of cell phones have an ARM processor, or one derived from ARM technology (such as Apple’s A-series). A licensed core (ARM or Cortex, one of ARM’s registered trademarks) is typically combined with peripheral components to build a system on a chip (SoC).

Thus far, we have not mentioned Japan, whose industry towards the end of the last century was key to the development of electronics and largely responsible for the spread of microcontrollers (after the almighty MITI sponsored a multi-million project called TRON to design optimum products). After its undeniable success in fields such as vehicle electronics, where Japan remains a global player, the country’s industry has followed the same route. In 2016, Masayoshi Son, CEO and main shareholder of SoftBank, purchased ARM Holdings for an exorbitant sum in a transaction that was widely discussed within the financial sector.

ARM is also at the heart of many of the SoCs used in access and telecommunications equipment, albeit not as dominantly as in mobile telephony.

References:
[1] Daniel Nenni and Don Dingee, “Mobile Unleashed: The Origin and Evolution of ARM Processors in Our Devices”, SemiWiki LLC, Danville, CA, 2017.
[2] Yukiko Fujisawa, “Introduction to H8 Microcontroller”, p. 14, Ohmsha Ltd, Tokyo, Japan, 2003.
[3] EETimes: www.eetimes.com/document.asp?doc_id=1330152
