
The Ever So Unlikely Tale of How ARM Came To Rule the World

samzenpus posted about 7 months ago | from the top-of-the-class dept.


pacopico writes "About 24 years ago, a tiny chip company came to life in a Cambridge, England barn. It was called ARM, and it looked quite unlike any other chip company that had come before it. Businessweek has just published something of an oral history on the weird things that took place to let ARM end up dominating the mobile revolution and rivaling Coke and McDonald's as the most prolific consumer product company on the planet. The story also looks at what ARM's new CEO needs to do not to mess things up."


FIRST POST (-1)

Anonymous Coward | about 7 months ago | (#46326721)

Wow I got a first post

Re:FIRST POST (0)

ArcadeMan (2766669) | about 7 months ago | (#46327111)

Posted from your cellphone, I presume. You just made ARM proud.

The future could be all in the fabs (2, Interesting)

mrspoonsi (2955715) | about 7 months ago | (#46326731)

And Intel have the advantage there.

Re:The future could be all in the fabs (2)

BasilBrush (643681) | about 7 months ago | (#46326853)

And some of Intel's fabs are manufacturing ARM chips. Intel doesn't have to lose for ARM to win.

Re:The future could be all in the fabs (2)

MachineShedFred (621896) | about 7 months ago | (#46326895)

More than that, doesn't Intel have an ARM license from when they acquired the detritus left over from the Compaq / DEC merger? They made the StrongARM series and XScale CPUs for some time. I don't know if they sold the license to Marvell, or if it was just the XScale designs.

Re:The future could be all in the fabs (1)

BasilBrush (643681) | about 7 months ago | (#46326965)

Anyone who wants an ARM license can have one. Just pay the subscription and the royalties. It's open to all. (FRAND)

Re:The future could be all in the fabs (2)

alen (225700) | about 7 months ago | (#46327009)

That ARM license just means you can make the same chips ARM designs.

Apple and Qualcomm have architecture licenses, under which they can design their own ARM chips. This is why a 1 GHz A7 CPU performs just as well in real life as a 2 GHz Samsung CPU. That, and designing the software for the instruction set helps too.

Re:The future could be all in the fabs (1)

BasilBrush (643681) | about 7 months ago | (#46327155)

Right, there's a range of licenses at a range of prices. But surely they are all available to all, including the architectural license.

Re:The future could be all in the fabs (5, Insightful)

NapalmV (1934294) | about 7 months ago | (#46327609)

You mean using a C compiler instead of a Java interpreter helps with speed and power consumption? Who would have thought?

Re:The future could be all in the fabs (0)

Anonymous Coward | about 7 months ago | (#46327901)

Not having to interpret bytecode might help speed things up, but that would actually require using your brain to understand the differences. We are on Slashdot, after all.

The new Android Runtime (ART), which precompiles things at install time, should speed things up to almost-if-not-equal to bare metal.

Re:The future could be all in the fabs (1)

K. S. Kyosuke (729550) | about 7 months ago | (#46332577)

Actually, the major performance benefit of Obj-C over Java probably lies in the fact that it doesn't force you to make virtual method calls all over the place, and that it supports value semantics for structs (being effectively a superset of C), reducing pointer dereferences and memory moves. Or so I understood it. And if ART helps, it's not because AOT is magically better for Java than JIT (it isn't).
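A minimal C sketch of that value-semantics point (toy code, assuming nothing about either runtime's internals): an array of structs lives in one contiguous block, while a Java-style array of object references adds a pointer dereference per element.

<ecode>
#include <stdio.h>
#include <stdlib.h>

typedef struct { float x, y; } Point;

/* Value semantics: points stored inline, so the loop walks memory
   sequentially with no extra loads. */
static float sum_inline(const Point *pts, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; i++)
        s += pts[i].x + pts[i].y;
    return s;
}

/* Reference semantics (what a Java object array forces): each element
   points to a separately allocated object, so every access costs an
   extra dereference and a likely cache miss. */
static float sum_boxed(Point *const *pts, size_t n) {
    float s = 0.0f;
    for (size_t i = 0; i < n; i++)
        s += pts[i]->x + pts[i]->y;
    return s;
}

int main(void) {
    enum { N = 4 };
    Point inline_pts[N] = {{1,2},{3,4},{5,6},{7,8}};
    Point *boxed_pts[N];
    for (size_t i = 0; i < N; i++) {
        boxed_pts[i] = malloc(sizeof(Point));
        *boxed_pts[i] = inline_pts[i];
    }
    printf("%.1f %.1f\n", sum_inline(inline_pts, N), sum_boxed(boxed_pts, N));
    for (size_t i = 0; i < N; i++)
        free(boxed_pts[i]);
    return 0;
}
</ecode>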

Re:The future could be all in the fabs (1)

Megane (129182) | about 7 months ago | (#46327895)

I think they sold StrongARM to Marvell.

Re:The future could be all in the fabs (1)

drinkypoo (153816) | about 7 months ago | (#46331329)

It doesn't actually matter because XScale, ironically, didn't scale. Oh, it scaled up, but it wouldn't scale down. You could use it in non-mobile embedded applications where you needed a lot of horsepower at a low cost and they did; you'd see XScale in places like casino gaming entertainment modules. Now you're as likely to see Linux on x86 because frankly, there was never any need to miniaturize. Casino games are massive, and onboard graphics are more than adequate so there's no need for expansion cards which are the only thing which really have the potential to make a PC take up a lot of space. But since XScale couldn't achieve the very low power consumption of the competition you couldn't make a device with a modern profile and adequate battery life by the changing standards of the day, and we waved bye-bye to XScale.

I have an iPaq H2215 here. I have run Linux on it on occasion. It was fairly peppy while doing it, especially for having just one core. But without the expanded battery the life was best measured in minutes if you were doing anything of interest, say involving streaming video.

Given that Marvell is continuing to design its own... (1)

hattig (47930) | about 7 months ago | (#46332287)

...it seems likely that the ARM architecture license that Intel acquired in the Digital takeover/litigation mess also transferred to Marvell.

Re:The future could be all in the fabs (2)

Tough Love (215404) | about 7 months ago | (#46327121)

And Intel have the advantage there.

Physics has the advantage. Clock scaling already ended, and now feature-size shrinking is grinding to a halt. The action has shifted to ballooning core counts and power-management strategy, where Intel has no compelling advantage. Intel's feature-size advantage has not got long to live.

Re:The future could be all in the fabs (2)

blackraven14250 (902843) | about 7 months ago | (#46328791)

Considering how Intel managed to go from NetBurst (massively power-hungry) to Haswell (very power efficient), I wouldn't doubt their ability to out-engineer the companies currently designing ARM chips.

You can get there from here (2)

dbIII (701233) | about 7 months ago | (#46331821)

Considering how Intel managed to go from NetBurst (massively power-hungry) to Haswell

They didn't. It grew out of a different project. NetBurst was a dead end.

Re:The future could be all in the fabs (3, Insightful)

rev0lt (1950662) | about 7 months ago | (#46328953)

I'm reading this and laughing. I've read the same kind of statement when they were using 350nm tech, 90nm tech, 65nm tech, and so on and so forth. Their public roadmap has 5nm tech around 2019-2022 (http://en.wikipedia.org/wiki/5_nanometer). And as the x86 inheritance slowly fades away, they can actually produce far smaller chips without backwards compatibility if the market demands it (very few applications run 1978-vintage instructions nowadays; the same goes for all that 16-bit protected mode wazoo).

Re:The future could be all in the fabs (2)

evilviper (135110) | about 7 months ago | (#46330499)

Your claim is utterly ridiculous. CPU cache OVERWHELMINGLY dominates the x86/x64 die. Intel/AMD could eliminate ALL processor functions, entirely, and the die STILL wouldn't be "way smaller" (or "way cheaper" for that matter).

Re:The future could be all in the fabs (1)

rev0lt (1950662) | about 7 months ago | (#46330821)

CPU cache OVERWHELMINGLY dominates the x86/x64 die.

Yeah, so? Are you saying that 4MB of on-die cache is somehow smaller in an ARM chip? And if you look at it closely, you don't see a huge increase in cache sizes in the past 10 years, do you? The tendency is actually to shrink slightly in the future, because of latency issues. But hey, let's focus on what we know today: 4MB of cache at 32nm (common today for ARM) is not the same area as at 22nm (used *today* on Haswell). Or at 5nm. And if you can free an additional 20% of real estate in the chip by cleaning out seldom-used parts, you do get a smaller die.
Regarding pricing, memory manufacturing and testing is a somewhat cheap process compared to testing the functions of the CPU. Cutting parts and shrinking the design would produce fewer defects, and it would probably have a direct impact on price. I know, utterly ridiculous.
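For a rough sense of the scaling argument, here is the back-of-the-envelope arithmetic in C (idealized: density goes with the square of the feature size; real SRAM cells scale worse than this, so treat the output as an upper bound):

<ecode>
#include <stdio.h>

int main(void) {
    /* Process nodes mentioned in the thread, plus the roadmap's 5nm. */
    const double nodes_nm[] = { 32.0, 22.0, 14.0, 5.0 };
    const double base = nodes_nm[0];
    for (int i = 0; i < 4; i++) {
        /* Ideal area shrink for the same 4MB cache block: (old/new)^2. */
        double shrink = (base / nodes_nm[i]) * (base / nodes_nm[i]);
        printf("4MB cache at %4.0fnm: %.1fx denser than at 32nm\n",
               nodes_nm[i], shrink);
    }
    return 0;
}
</ecode>

At 22nm that works out to roughly a 2.1x ideal density gain over 32nm, which is the gap the parent is pointing at.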

Re:The future could be all in the fabs (2)

evilviper (135110) | about 7 months ago | (#46330925)

if you can free an additional 20% of real estate in the chip

...which you can't.

Cutting parts and shrinking the design would produce fewer defects, and it would probably have a direct impact on price. I know, utterly ridiculous.

Not "ridiculous" so much as a minuscule difference that will not have a visible affect on price or performance.

Re:The future could be all in the fabs (1)

K. S. Kyosuke (729550) | about 7 months ago | (#46332681)

So you're saying that because caches are the overwhelming issue, we should throw away x86 and design a modern ISA to improve on code density and to minimize memory transfers? ;-)

Re:The future could be all in the fabs (1)

evilviper (135110) | about 7 months ago | (#46332983)

That is what would be required to meet GP's silly and imaginary Intel roadmap. Except, of course, that wouldn't be "x86" any longer, so it doesn't quite fit.

And *I* certainly never said "we should" do anything of the sort.

Re:The future could be all in the fabs (4, Insightful)

timeOday (582209) | about 7 months ago | (#46327249)

Another possibility is that there is no real future - nobody will reap the profits Intel did for the last 30 years.

Intel's earnings last quarter were $2,630M, compared to $156M for ARM Holdings. So if ARM is "ruling the world" like this story claims, then ruling the world just ain't what it used to be. And I guess that is likely, if semiconductors stagnate as they seem to be.

Fabs cost gazillions... (1)

Anonymous Coward | about 7 months ago | (#46327399)

2,630M vs 156M: how much does Intel *have* to invest in fabs to continue being competitive? How much does ARM? That's where the difference is...

Re:Fabs cost gazillions... (4, Insightful)

MouseTheLuckyDog (2752443) | about 7 months ago | (#46329109)

Except when you include the profits from making ARM chips at Qualcomm, Apple (if Apple had actually separated out their chip-making division), Samsung, Allwinner etc., that number changes drastically.

The only company making money off Intel chips is Intel; there are many companies making ARM chips, and you have to count all of them.

Re:Fabs cost gazillions... (0)

Anonymous Coward | about 7 months ago | (#46329733)

Cost of doing business, and don't count Intel out yet.

Re:Fabs cost gazillions... (1)

Darinbob (1142669) | about 7 months ago | (#46331171)

ARM doesn't have to make their own chips; they just have to license the designs to others. Thus it can potentially have a larger impact on the economy than Intel, i.e., add in all the earnings from everyone who makes ARM chips.

Re:The future could be all in the fabs (0)

Anonymous Coward | about 7 months ago | (#46329527)

WhatsApp's last/current quarter earnings will be $16 billion... is it ruling the world? I think not. Earnings is a nice metric, but it shouldn't be the only one.

Re:The future could be all in the fabs (3, Informative)

leathered (780018) | about 7 months ago | (#46327269)

Not really; what matters most is cost, and at that ARM wins hands down. Most ARM chips cost less than $5, with some selling for pennies. Intel enjoys 60%+ margins on everything it sells, and they will experience a lot of pain giving them up.

The only way Intel can compete is if they sell their mobile chips at or below cost. Oh wait, they already are. [slashdot.org]

Re:The future could be all in the fabs (1)

mcrbids (148650) | about 7 months ago | (#46327559)

Most ARM chips cost less than $5, with some selling for pennies.

Not to nitpick, but it's likely that *most* ARM chips made actually sell for pennies, given that they are turning up in some very unlikely places. [bunniestudios.com] The question isn't whether or not Intel will sink ARM - that's very unlikely. The question is only how much and to what degree. There's an *astronomical* market for low-speed chips that cost $0.03 for embedded/microcontroller use.

Re:The future could be all in the fabs (2)

jrumney (197329) | about 7 months ago | (#46329673)

Not to nitpick, but it's likely that *most* ARM chips made actually sell for pennies

Not to nitpick, but the cheapest chips with ARM cores inside are a couple of dollars. Your link mentions 8051 microcontrollers being found inside SD cards, but 8051 microcontrollers are not ARM chips. It also mentions a Samsung eMMC chip with an ARM instruction set, but that eMMC chip is likely to cost a few dollars.

Re:The future could be all in the fabs (1)

farble1670 (803356) | about 7 months ago | (#46328137)

The only way Intel can compete is if they sell their mobile chips at or below cost.

Yes, and that's how they crushed AMD. They are good at this.

Re:The future could be all in the fabs (2)

Alioth (221270) | about 7 months ago | (#46332623)

That won't work with ARM, though. The reason so many people build ARM is that you can license the core and add whatever peripherals you want onto the die. Or you can license the whole architecture and design your own complete ASIC around it, as Apple does.

With Intel, you get what Intel makes. You can't make your own custom Atom. You can't license the Atom architecture and make it part of your ASIC. You get only what Intel decides it wants to put on the die. So they can't compete even way below cost price, because they aren't offering the hardware manufacturer the same thing.

Re:The future could be all in the fabs (2)

tlhIngan (30335) | about 7 months ago | (#46328745)

Not really; what matters most is cost, and at that ARM wins hands down. Most ARM chips cost less than $5, with some selling for pennies. Intel enjoys 60%+ margins on everything it sells, and they will experience a lot of pain giving them up.

The only way Intel can compete is if they sell their mobile chips at or below cost. Oh wait, they already are.

And not too terribly fast either - given how the Apple A7 is running rings around Intel's chips.

Granted, different architectures and different OSes, but the benchmarks that run on both Android and iOS generally show the A7 being faster (and not by a little bit - by a lot).

And when 64-bit ARMs come out on Android, expect the Intel offerings to fall further behind. The A7 is fast because of the optimizations that ARMv8 and the 64-bit architecture allow.

Re:The future could be all in the fabs (1)

rev0lt (1950662) | about 7 months ago | (#46329051)

Intel enjoys 60%+ margins on everything it sells

Following that line of thought, that is probably the margin they have when manufacturing some of those $5-a-pop chips, right? Top-of-the-line processors are expensive because of market pressures (aka "people will pay for them"), but also because of defect ratios and the ROI of the manufacturing process. Building/upgrading factories is expensive. Traditionally, specialized processors and old lines are way cheaper - so cheap that they prefer to license the core design to someone else instead of building it themselves. Go and see how many variations of the 8031/32 and 8051/52 you have today - probably more than one hundred. I would bet you have more 8051 microcontrollers running *today* than the whole sum of their desktop chips, including the low power, embedded/hardened lines.

Re:The future could be all in the fabs (1)

Klivian (850755) | about 7 months ago | (#46329793)

I would bet you have more 8051 microcontrollers running *today* than the whole sum of their desktop chips, including the low power, embedded/hardened lines.

Perhaps, depending on the age distribution of the equipment. For anything designed in the last 5 years, it's more than likely that those pesky old 8051s have been replaced by ARMs: Cortex M0s, M3s and M4s. So a more accurate statement would be "I would bet you have more ARM microcontrollers running *today* than the whole sum of their desktop chips, including the low power, embedded/hardened lines."

Re:The future could be all in the fabs (1)

rev0lt (1950662) | about 7 months ago | (#46330525)

For anything designed in the last 5 years, it's more than likely that those pesky old 8051s have been replaced by ARMs: Cortex M0s, M3s and M4s

Actually, it's not. For many applications, this would require rewriting the software stack, for a chip with roughly the same die size and possibly less functionality. The 8051 is a microcontroller, not a microprocessor - and it is 8-bit. It does not distinguish between I/O and memory access. However, it does have a semi-complete serial interface and 8 lines of digital I/O - and this is the base version. A ton of variations exist, with extra ports, A/D and D/A functionality, extra timers, assorted in-chip controllers, and multiple RAM and ROM configurations. It would never be replaced by a powerful 16-bit or 32-bit processor, because it's not even in the same league.
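For flavour, here's what a bare-bones 8051 program looks like in C - a sketch assuming the Keil C51 dialect, whose reg51.h header declares the standard special function registers such as P1:

<ecode>
#include <reg51.h>      /* Keil C51 header declaring the 8051's SFRs */

/* Crude busy-wait; the actual delay depends on clock and compiler. */
static void delay(unsigned int n)
{
    while (n--)
        ;               /* burn cycles */
}

void main(void)         /* C51 convention: no OS, main never returns */
{
    while (1) {
        P1 ^= 0x01;     /* toggle line 0 of the 8-bit digital I/O port */
        delay(10000);
    }
}
</ecode>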

Re:The future could be all in the fabs (1)

leathered (780018) | about 7 months ago | (#46330075)

I think that's a poor example, as Intel no longer makes 8031s and 8051s, and they can't earn money by selling licenses for them because the patents expired long ago.

Re:The future could be all in the fabs (1)

rev0lt (1950662) | about 7 months ago | (#46330711)

You are right. I wasn't aware that Intel stopped manufacturing them in 2007 - but I haven't read anything about expired licensing. The VHDL for the cores is somewhat widely available, but I haven't found any info on whether they are "free". I would doubt they have expired, but they may have opened it to everyone - after all, the early Beatles catalog is not public domain, is it?

Re:The future could be all in the fabs (1)

Kohath (38547) | about 7 months ago | (#46329391)

But it's a 2-3 year advantage. And the future has more years than that.

In the past and present, battery life, software compatibility, and customer-friendly licensing terms are what matter most.

Also, Intel needs to make chips designed to help customers, not chips designed to help Intel. "We need you to buy these chips so Intel is no longer frozen out of the mobile device market" isn't what customers are interested in hearing.

Acorn Risc Machine (4, Informative)

Grindalf (1089511) | about 7 months ago | (#46326815)

The Acorn RISC processor was designed for the British "BBC Microcomputer", to be attached via the "Tube" second-processor system as a software development system for schools and colleges. This experimental machine was so successful and fast that it became the new Acorn Archimedes computer, which was used by British schools to teach kids how to write computer programmes.

Re: Acorn Risc Machine (-1)

Anonymous Coward | about 7 months ago | (#46326897)

How fast can it process risk? I don't like risk... I want security sooner rather than later.

Re:Acorn Risc Machine (3, Informative)

BasilBrush (643681) | about 7 months ago | (#46326919)

Well, I think it's fair to say that the ARM was designed for use in a new computer, which turned out to be the Archimedes. It was available first as a second processor for the BBC Micro, but that was really just a step in development, not its original goal.

Creating the ARM simply as a second processor wouldn't have been economically viable. Few people/organisations bought second processors.

Re:Acorn Risc Machine (0)

Anonymous Coward | about 7 months ago | (#46327031)

Without the Newton the chip designs would have ended with the Acorn. Plus, they needed the cash infusion Apple brought at the time.

Remember the old joke, why don't the Brits make computers? Because they haven't figured out a way to make them leak oil yet.

Re:Acorn Risc Machine (2)

AlterEager (1803124) | about 7 months ago | (#46332553)

Remember the old joke, why don't the Brits make computers? Because they haven't figured out a way to make them leak oil yet.

How could that be an old joke? Please name me a time when the Brits didn't make computers.

Re:Acorn Risc Machine (2)

OneAhead (1495535) | about 7 months ago | (#46327087)

The Acorn RISC processor was designed for the British "BBC Microcomputer", to be attached via the "Tube" second-processor system

Oooh, then we can stream BBC classics such as Monty Python's Flying Circus over you"Tube"!

...golly, look at the time - I'll get my coat.

Re:Acorn Risc Machine (4, Interesting)

newcastlejon (1483695) | about 7 months ago | (#46328131)

...it became the new Acorn Archimedes computer, which was used by British schools to teach kids how to write computer programmes.

Speaking as someone who was brought up with BBC Micros, pointy little A3000s and a single majestic "don't you dare touch that" RiscPC, this turned out not to be the case in many schools. Certainly there were often computers aplenty, some running quite good educational programmes, but most didn't have anything in the way of programming tools, especially the ones with RiscOS. The Micros were much better, as anything you wanted to do on them started with a command line (only a kick in the backside away from learning BASIC), but the later models didn't include any development tools whatsoever, unless you count the hidden command line...

...a command line that was so rarely needed they hid it. Acorn were ahead of their time in so many ways; it's a shame they didn't manage to do better outside the UK.

Re:Acorn Risc Machine (1)

Eythian (552130) | about 7 months ago | (#46329221)

You could jump in and start writing applications on RiscOS directly in BASIC if you wanted. I dabbled a bit with it, but didn't have any access to documentation, so I only got so far by reverse-engineering (by which I mean reading the source once you figure out that shift-double-click shows you inside the !Application bundle).

It was a very well done OS though. In some ways, it feels like systems now are only just starting to catch up, and in other ways are still a fair bit behind.

Re:Acorn Risc Machine (1)

sr180 (700526) | about 7 months ago | (#46331001)

They did well in Australia. Many private schools had them - mainly for teaching Logo.

Re:Acorn Risc Machine (1)

Alioth (221270) | about 7 months ago | (#46332639)

RiscOS still had the BBC BASIC interpreter (which included a built-in ARM assembler). You could use BASIC in a RiscOS window pretty easily (it wasn't hidden), and there's a key combination to drop you out of the GUI straight into the * prompt (and you could just type BASIC, and you'd have a full-screen BASIC interpreter, just like on the older BBC Micro).

Re:Acorn Risc Machine (2)

Alioth (221270) | about 7 months ago | (#46332659)

Not quite accurate.

Acorn needed to move off the 6502, and they explored several different architectures for the new computer they wanted to build. They wanted very low interrupt latency, and they wanted a chip that could use all of the memory bandwidth - back in 1986, the processors for a personal computer were generally much slower than memory and had no cache; for example, the MC68000 often takes 8 or more clock cycles per instruction and has no cache, and the 6502 takes 3 to 4 clock cycles per instruction at 1MHz, etc. In particular, Steve Furber (designer of the original ARM chip) was horrified by the Nat Semi 32016 (if I remember right), which had some instructions that took over 100 clock cycles to execute - and instructions cannot be interrupted while they are executing, so this would result in unpredictable and possibly very poor interrupt latency.

So they decided that no one was making the chip they wanted at the price they wanted and, inspired by the simple design of the 6502, they set out to design the ARM. They wanted it to be cheap, so it had to go in a plastic package, so it had to dissipate less than 1 watt. They had no tools to estimate power, so everything they did in the chip design was low power to make sure they hit their 1-watt target. When they got the first chips back from the fab (VLSI, if I remember right), they found they had massively overachieved, the ARM1 prototype dissipating only 0.1 watts.

The early processors were tested in a BBC Micro second processor box, but the real goal was to put them in their new computer, the Archimedes.
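To put numbers on the latency point: worst-case interrupt latency is bounded below by the longest uninterruptible instruction. A quick C calculation using the rough figures from the comment above (the clock speeds are assumptions for illustration, not datasheet values):

<ecode>
#include <stdio.h>

int main(void) {
    const double mhz_32016 = 6.0;     /* assumed 32016-era clock */
    const double mhz_arm   = 8.0;     /* early ARM clock, roughly */
    const double worst_32016 = 100.0; /* >100-cycle multiply, per the story */
    const double worst_arm   = 4.0;   /* a typical longer ARM1 instruction */

    /* latency floor in microseconds = cycles / MHz */
    printf("32016: >= %.1f us before an IRQ can be taken\n",
           worst_32016 / mhz_32016);
    printf("ARM:   >= %.1f us\n", worst_arm / mhz_arm);
    return 0;
}
</ecode>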

Re:Acorn Risc Machine (1)

Simon Brooke (45012) | about 7 months ago | (#46332805)

I had one of the very first Archimedes boxes, back before it even had a proper operating system (it had a monitor called 'Arthur', which was really very primitive). But it was a really good feeling sitting in my university bedroom with a computer which, in terms of raw processing power, was faster than the two fastest machines the university then owned put together. Those original ARM boxes were, by the standards of their time, very remarkable: much faster than contemporary DEC VAX, Motorola 68000, or Intel 80286 machines. The DEC Alphas which came along a few years later were faster, but they were also hugely more expensive!

More like 34 years (2)

doghouse41 (140537) | about 7 months ago | (#46326867)

Strange - I can recall discussions of the ARM chip at university back in the early 80s. Either that makes me ten years younger than I thought :-) - or someone has their dates wrong.

As I recall (and correct me if I'm wrong), there was a company called Acorn Computers which produced the BBC Micro - a 6502-based machine.
The 6502 was long in the tooth even in those days (dating back at least to the Commodore PET ca. 1976).
RISC was flavour of the month in those days, so they set out to create their own RISC-based architecture for the next generation of BBC Micro (the Archimedes).

No doubt they had a little help from the local technical college (aka the Cambridge University computing department).

no wrong dates (0)

Anonymous Coward | about 7 months ago | (#46326937)

None of the dates are wrong. You're right that ARM was around in the 80s, but they weren't designing chips at the time. The stuff that happened in the 90s as described in the article relates to the birth of the Acorn RISC Machine.

Re:no wrong dates (2)

Tough Love (215404) | about 7 months ago | (#46327247)

None of the dates are wrong. You're right that ARM was around in the 80s, but they weren't designing chips at the time.

Nonsense. ARM started designing chips in 1983 [wikipedia.org]

Re:no wrong dates (1)

osu-neko (2604) | about 7 months ago | (#46327447)

Nonsense. ARM started designing chips in 1983 [wikipedia.org]

Nonsense. ARM didn't design anything in 1983, as ARM didn't exist at the time. ARM was founded in 1990 [wikipedia.org] to continue development of the already existing design.

Re:no wrong dates (1)

Tough Love (215404) | about 7 months ago | (#46328885)

You trifle. ARM Ltd is the original ARM engineering team spun off from Acorn.

Re:no wrong dates (1)

Alioth (221270) | about 7 months ago | (#46332677)

They most certainly were designing chips in the 80s. The ARM was designed by Steve Furber (hardware) and Sophie Wilson (instruction set), both founding employees of Acorn.

Re:More like 34 years (3, Informative)

mrbester (200927) | about 7 months ago | (#46326951)

ARM themselves incorporated in 1990 (hence the 24 years). However, you are correct that Acorn chips predated the company.

Going back a little further... (4, Informative)

flightmaker (1844046) | about 7 months ago | (#46327065)

A couple of years ago I donated my Acorn System 1 to the Museum of Computing in Swindon. It was on their Most Wanted list! I learned rather a lot with that machine, hand assembling machine code.

Re:More like 34 years (4, Interesting)

BasilBrush (643681) | about 7 months ago | (#46327075)

The 6502 was long in the tooth even in those days (dating back at least to the Commodore PET ca. 1976).
RISC was flavour of the month in those days, so they set out to create their own RISC-based architecture for the next generation of BBC Micro (the Archimedes).

It was still an odd decision to design their own CPU for the successor to the BBC Micro. A more obvious and less risky move would have been to use a 68000 series CPU as a successor to the 6502.

I think it's because there were so many Cambridge academics at Acorn. They made a RISC processor because it was an interesting project which was then at the cutting edge of computer science.

Re:More like 34 years (2)

johnw (3725) | about 7 months ago | (#46327397)

It was still an odd decision to design their own CPU for the successor to the BBC Micro. A more obvious and less risky move would have been to use a 68000 series CPU as a successor to the 6502.

IIRC, they experimented with a chip called the 32016 (or 16032) as a possible successor to the 6502, before deciding to start again from scratch and design their own.

All the 2nd processors for the Beeb - Z80, 6502, 32016 or ARM - looked exactly the same from the outside, although when you opened them up the Z80 and 6502 were mostly air, whilst the ARM prototype was stuffed to the gunwales. It didn't even have go-faster stripes or a front air dam.

The odd thing was, early ARMs seemed to manage to produce much more bang for your MHz than x86 chips. An 8 MHz ARM2 ran rings around a 25 MHz 80386. What let them down then was the lack of a floating-point co-processor. Later on the relationship seemed to reverse.

Re:More like 34 years (1)

sconeu (64226) | about 7 months ago | (#46327611)

The 16032 (original numbering, later renumbered to 32016) was a National Semiconductor chip.

Wikipedia on the 32016 [wikipedia.org]

Re:More like 34 years (4, Informative)

Anonymous Coward | about 7 months ago | (#46328091)

They considered and rejected the 68000 option. The Atari ST and Commodore Amiga were already dominating the market. A 68000-based Acorn system would have no advantages over those while being "late to the game". They figured that they basically needed to leapfrog the 16-bit systems in order to survive.

Unfortunately, by the time the Archimedes came out, the computing world was standardising around the IBM-compatible PC, and even the Archimedes' superior performance compared to PCs of that era (about the time the first 386 systems appeared) couldn't save it (Atari and Commodore didn't fare much better).

The irony is that the (seemingly-harebrained) decision to design their own CPU (thus ensuring incompatibility with everything else on the planet) on a shoe-string budget ended up hitting the jackpot. The "sane" approach of using a popular chip (680x0, 80x86) would have relegated them to the history books, alongside 5.25" floppies and dBase III.

68000 not more obvious (0)

Anonymous Coward | about 7 months ago | (#46330707)

Nope, Acorn wanted their new CPU to have similar traits to the 6502: simple architecture yet fast, memory-mapped I/O and fast interrupt response times. Neither Intel's x86 nor Motorola's 68K architecture was up to the job.

Re:More like 34 years (2)

Alioth (221270) | about 7 months ago | (#46332693)

They did it because they couldn't find a satisfactory chip for the machine they wanted to make. They wanted low and reasonably predictable interrupt latency - and remember, these were the days when CPUs for personal computers took between 3 and 20 clock cycles per instruction, with no cache - and they wanted to maximise the use of memory bandwidth. The 6502, for instance, was typically in a computer with 150ns memory, but at full speed in a BBC Micro would only fetch something from memory every 1000ns (fastest instruction: 2 cycles). They looked at the National Semi 32016, but that had a multiply instruction that took a staggering number of clock cycles (over 100!) and couldn't be interrupted, which would lead to very bad memory bandwidth utilization and unpredictable, poor interrupt latency. Steve Furber said in one of his talks that the interrupt latency was so poor with the 32016 that they would only have been able to support single-density floppy discs without additional hardware (i.e. added cost) to buffer the data.

They were inspired by the simple design of the 6502 and decided they could do something similar but 32-bit. Hence the ARM. When the ARM came out, most of its instructions would execute in a single CPU cycle (versus the 68000, which would often take 8 cycles or more), and it had very low interrupt latency. I remember using a PC emulator on an original ARM system (an Archimedes) running at something like 8MHz. The emulator would run faster than the original IBM PC despite being on a computer with less than double the clock speed and having to emulate not just the CPU but the screen and other peripherals.

I'm glad I RTFA (1)

Anonymous Coward | about 7 months ago | (#46326933)

I RTFA, and now I know:

-ARM designs "parts of chips—such as graphics and communication bits—and they design entire chips."
-"A great many people have not heard of ARM."
-"RISC stands for 'reduced instruction set computing.'
-"Much of Apple’s success has come from the snappy performance of its products and its long-lasting batteries."
-"Intel designs its own chips, which are widely regarded as the most advanced in the world."

Slashdot is so fucking great.

It's Business Insider. (0)

Anonymous Coward | about 7 months ago | (#46326987)

You're reading a publication intended for wannabe CEOs and pointy-haired managers. It's not Engineering Weekly, so give it a rest.

Re:It's Business Insider. (1)

Guy Harris (3803) | about 7 months ago | (#46330295)

Actually, no, it's Businessweek, but

You're reading a publication intended for wannabe CEOs and pointy-haired managers. It's not Engineering Weekly, so give it a rest.

might still be the case.

Re:I'm glad I RTFA (0)

Anonymous Coward | about 7 months ago | (#46328047)

You had to know that it was a bogus article going in. The summary even calls ARM a consumer products company. But they don't sell licenses to consumers. They sell licenses to companies like Apple, Qualcomm, etc. How could that make them a consumer products company? They sell chip designs, not phones. When the summary is that clear that the article is bunk it isn't worth going to read it.

Re:I'm glad I RTFA (1)

QQBoss (2527196) | about 7 months ago | (#46330795)

I RTFA, and now I know:
[...]
-"RISC stands for 'reduced instruction set computing.'

[...]

It is a pity that no one could have strong-arm'ed (does that count as a pun?) in a superior expansion of RISC.

Either
(1) Reduced Instruction Set Complexity
or
(2) Reduced Instruction Set Cycle-time

would be more meaningful.

Very few people designing RISC CPUs in the '80s cared about how many instructions (cue argument over what defines an 'instruction') their CPUs had (certainly not the Motorola 88000 architects I worked with); they cared about (1) whether the instructions were logically organized to get rid of the requirement for multiple-length instructions (I was a Thumb hater, I admit it) or (2) that as many frequently executed instructions as possible would take a fixed time (ideally one clock) to execute (not including multiply/divide, if they existed). Though using the former would have really screwed up the backronym of CISC (which would then have been interpreted to mean "Complex Instruction Set Complexity").
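The fixed-length point is easy to see in code: with every instruction exactly 32 bits, decode is fixed-position bit-field extraction, with no length-finding pass as on x86. A small C sketch of the classic ARM data-processing format (illustrative, not a full decoder):

<ecode>
#include <stdint.h>
#include <stdio.h>

typedef struct {
    unsigned cond, opcode, set_flags, rn, rd, operand2;
} DataProc;

static DataProc decode_dp(uint32_t insn) {
    DataProc d;
    d.cond      = (insn >> 28) & 0xF;   /* every ARM insn is conditional */
    d.opcode    = (insn >> 21) & 0xF;   /* ADD, SUB, MOV, CMP, ... */
    d.set_flags = (insn >> 20) & 0x1;   /* the 'S' bit */
    d.rn        = (insn >> 16) & 0xF;   /* first operand register */
    d.rd        = (insn >> 12) & 0xF;   /* destination register */
    d.operand2  =  insn        & 0xFFF; /* immediate or (shifted) register */
    return d;
}

int main(void) {
    /* 0xE0811002 encodes "ADD r1, r1, r2" (cond=AL, opcode=0100). */
    DataProc d = decode_dp(0xE0811002u);
    printf("cond=%X opcode=%X rd=r%u rn=r%u op2=%03X\n",
           d.cond, d.opcode, d.rd, d.rn, d.operand2);
    return 0;
}
</ecode>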

Re:I'm glad I RTFA (0)

Anonymous Coward | about 7 months ago | (#46332573)

It was also interesting to learn that Apple once again invented everything; apparently we can just skip the part where Acorn designed & implemented the ARM themselves and did a good enough job that Apple became interested, and skip ahead to Apple white-knighting in and showing these silly computer builders how it should be done.

Fuck everything about that article.

Let's hope they don't get acquired (0)

Anonymous Coward | about 7 months ago | (#46327007)

A single company like Apple could throw a wrench in the works of the entire industry by buying ARM. That would be one way to mess up a good thing.

The Little Chip That Could (5, Interesting)

spaceyhackerlady (462530) | about 7 months ago | (#46327043)

I've always thought ARM was a cool design. Simple, minimalist, sort of a latter-day PDP-11, one of those canonical architectures that just works. Simple chip, not many transistors, low power, good chip for mobile devices. It seems so obvious in retrospect. Especially since that's not what the designers had in mind. They were designing a simple chip because they only had a couple of people and that was all they could afford.

In one of the later scenes in Micro Men [imdb.com] there is a whiteboard in the background with the original ARM requirements, right down to the barrel shifter.

...laura

Re:The Little Chip That Could (4, Interesting)

cold fjord (826450) | about 7 months ago | (#46327259)

"Perfection is finally attained not when there is no longer anything to add but when there is no longer anything to take away..." -- Antoine de Saint Exupéry

A similar point was made about the tight resource constraints of the early Macintosh, and how they created a strong incentive to make use of the toolbox, doing things "the Macintosh way." That paid many dividends over the years.

Re:The Little Chip That Could (1)

Darinbob (1142669) | about 7 months ago | (#46331219)

But it was a lot like other RISC chips of the era. Not significantly better or worse, nothing really special. What really helped it out was that it wasn't aiming for higher and higher performance and kept with an 80s-era design; plus the introduction of 16-bit Thumb mode; plus the licensing of its IP. Then it took off: lower power because it hadn't added in all the performance features, smaller code size for smaller systems, and it was easy to integrate as part of an ASIC.

Lesson (2)

Tablizer (95088) | about 7 months ago | (#46327067)

The lesson is to be a light-handed source of standards and a building-block supplier instead of an All-Encompassing Conglomerate that tries to rule standards from head to foot.

Heed the warning, Google, Oracle, and Apple. (Too late for MS.)

ARM also helped Apple survive (5, Informative)

mveloso (325617) | about 7 months ago | (#46327193)

As a note, back in the day Apple stayed afloat by selling its stake in ARM.

Re:ARM also helped Apple survive (1)

Plumpaquatsch (2701653) | about 7 months ago | (#46329295)

As a note, back in the day Apple stayed afloat by selling its stake in ARM.

Funny how they didn't sell ARM when they were losing money, but when they were back in the black. Coincidentally, after Jobs killed the Newton, which was the reason ARM was founded.

Re:ARM also helped Apple survive (2)

johnw (3725) | about 7 months ago | (#46332137)

which was the reason ARM was founded.

Hardly - the ARM had gone through several generations and been used in a number of other products before the Newton came along.

Re:ARM also helped Apple survive (1)

Alioth (221270) | about 7 months ago | (#46332705)

However, Steve Furber (hardware designer of the ARM) remarked that it was the Newton that started making people take the ARM seriously. The Newton may have been a flop, but it did make the right people notice ARM.

Re:ARM also helped Apple survive (1)

Plumpaquatsch (2701653) | about 7 months ago | (#46332827)

which was the reason ARM was founded.

Hardly - the ARM had gone through several generations and been used in a number of other products before the Newton came along.

ARM. The company. Do I really have to spell it out for those who confuse "founding" with "foundry"?

Good that someone's competing with Intel (4, Insightful)

Rising Ape (1620461) | about 7 months ago | (#46327677)

As someone who had a BBC Micro as his first computer (lovely machine for tinkering), it's nice to see the descendants of Acorn survive the juggernaut of the PC and x86. And long may it continue; the last thing we need is a vertically integrated colossus like Intel dominating everything, no matter how good their PC processors are.

Consumer Product (1)

dohzer (867770) | about 7 months ago | (#46327765)

Do hardware manufacturers count as "consumers"?

Boycott McDonalds (0)

Anonymous Coward | about 7 months ago | (#46327787)

Besides telling their employees not to eat the food they serve, McDonalds is taking the US dollars it rakes in and converting them heavily into the yuan (China), abandoning America's inflationary dollar. "Thanks for the loyalty to the country that made you wealthy, you unhealthy giant." Call me a glutton for punishment, whatever, but if I had a dollar for every case of indigestion I got from that food, I would invest in TUMS stock and be a godzillionaire by now.

You have been served.

No mention of the Archimedes or RISC PC? (4, Interesting)

wonkey_monkey (2592601) | about 7 months ago | (#46327833)

ARM architectures were already in use before ARM the company came into being and went into making mobile processors. They were the CPUs for the Acorn Archimedes [wikipedia.org] and Risc PC [wikipedia.org].

Ah, I still remember that heady day at Acorn World in 1996 (I think it was), riding the train back clutching my precious StrongARM (not made by ARM themselves, apparently) upgrade. The unimaginable pow-ah!

Later upgrades put RAM on the CPU's daughterboard because the bus had become the bottleneck.

Somewhat sadly neglected, my Risc PC now gathers dust in a damp garage, but it made me the aspiring-to-efficiency programmer I am today.

Re:No mention of the Archimedes or RISC PC? (2)

newcastlejon (1483695) | about 7 months ago | (#46328213)

Somewhat sadly neglected, my Risc PC now gathers dust in a damp garage, but it made me the aspiring-to-efficiency programmer I am today.

And mine turned me into the vitriolic why-the-fuck-doesn't-this-modern-crap-work-as-well-as-that-twenty-year-old-Acorn-in-the-loft?! bastard I am today.

Horses for courses, eh?

Re:No mention of the Archimedes or RISC PC? (1)

ChunderDownunder (709234) | about 7 months ago | (#46329287)

Never fear, Mozilla are downsizing Firefox OS to run on $25 handsets with 128MB of RAM.

That's about the amount of RAM your university's *server* had when you accessed the web via an Xterm in NCSA Mosaic back in 1994.

Inevitable... (2)

sootman (158191) | about 7 months ago | (#46328045)

... once they became "powerful enough" and portability mattered. Same way that Intel won on the desktop, really -- compared to mainframes, they were small enough to fit into a useful spot (literally and figuratively) and became powerful enough to be REALLY useful, not just occasionally handy.

But chips themselves don't sell devices -- Intel desktops sold more and more as the OSs and apps got better and better, and it's the same thing with the iPhone and similar devices. Would a 160x160 monochrome Palm Pilot (if it still existed today) sell in iPhone-esque numbers if it had a multicore, gigahertz-plus CPU? The chip makes the product possible, and better products make people want more chips.

And so where is the archimedes (4, Informative)

sugar and acid (88555) | about 7 months ago | (#46328475)

The article skipped over the whole development of the ARM processor. It wasn't developed for the Newton; the original architecture was for the Acorn Archimedes RISC-based computers, launched in 1987.

The key difference that set Acorn apart from every desktop-PC-type computer manufacturer at the time is that they went down the road of actually designing their own processor for the PC market, instead of using one from Motorola or IBM.

I think what set the ARM apart going forward was that they used modern (for the time) CPU design principles but aimed for the lower-end consumer-grade market instead of the higher-end mainframe/server/workstation/supercomputer market. Because of this, they were all about getting the most performance from cheaper, slightly older chip-fab technologies. All of this ultimately meant that the design constraints imposed early on translated well to mobile applications.

theregister does it better (4, Informative)

another_gopher (1934682) | about 7 months ago | (#46328603)

Tony Smith's articles on the history of micro-computers do this far better:
http://www.theregister.co.uk/2012/06/01/acorn_archimedes_is_25_years_old/

Most important part... MIPS didn't compete. (5, Interesting)

evilviper (135110) | about 7 months ago | (#46328829)

There's one simple reason ARM has a stranglehold on smartphones and tablets: for years, while such devices were being developed, MIPS Technologies was in a shambles. They were reeling from losing SGI, going IPO, and going through the process of being acquired by a string of different companies. They've basically been AWOL this whole time, handing the upstart new market to ARM on a plate.

MIPS is still competitive. They've got extremely low-power processors, multi-core 1GHz+ processors, and they've always been more efficient (higher DMIPS/MHz) than ARM. Despite their virtual absence, they're still used extensively in embedded systems: your printer, WiFi AP/router, many set-top boxes, etc. They used to have a dominant lead over ARM, selling something like 2/3rds of all embedded CPUs, but they simply fell apart and ceded the market to the competition. They're even the cheaper option: the first $100 Android ICS tablet found in China was MIPS (not ARM) based, and China's ministry of science keeps developing faster MIPS processors for domestic use, including supercomputers.

If they had competed, it might be MIPS in every smartphone. Even now, if they get back on course, they could pose a real challenge to ARM, driving prices down and dividing the market, as Intel is trying to do with little success.

No story that claims to tell how ARM came to dominate is even remotely complete without a good paragraph about how MIPS, their biggest competitor, stopped competing and nearly GAVE them the market.
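On the DMIPS/MHz claim, the arithmetic is simple: throughput is efficiency times clock. The per-MHz figures below are made-up placeholders to show the calculation, not measured vendor numbers:

<ecode>
#include <stdio.h>

int main(void) {
    const double mips_dmips_per_mhz = 1.6;  /* hypothetical */
    const double arm_dmips_per_mhz  = 1.25; /* hypothetical */
    const double clock_mhz          = 500.0;

    /* DMIPS = (DMIPS per MHz) * clock */
    printf("MIPS core: %.0f DMIPS at %.0f MHz\n",
           mips_dmips_per_mhz * clock_mhz, clock_mhz);
    printf("ARM core:  %.0f DMIPS at %.0f MHz\n",
           arm_dmips_per_mhz * clock_mhz, clock_mhz);
    return 0;
}
</ecode>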

Re:Most important part... MIPS didn't compete. (1)

Anonymous Coward | about 7 months ago | (#46330551)

Sorry for you, but the time for MIPS is over.

It is an inferior architecture, interesting from a historical perspective, a bit like SPARC. Besides, MIPS destroyed itself by suing Lexra, which had more or less invented the CPU IP business.

For smallish devices, it is very difficult to compete against ARM's R and M cores.
For complex CPUs, the kind you find in phones, it is too late; too much has been invested in ARM, its compilers, tools...
Even Intel, pouring in billions, hardly manages to compete against ARM.

MIPS didn't give away a market they never owned. Around 2000, the "market" was very much divided among all sorts of 16- and 32-bit architectures: ColdFire, PowerPC, embedded x86, several Japanese cores, Infineon...
MIPS had some presence in networking and set-top boxes, but that is ending.

Re:Most important part... MIPS didn't compete. (1)

QQBoss (2527196) | about 7 months ago | (#46331065)

[...]They used to have a dominant lead over ARM, selling something like 2/3rds of all embedded CPUs, but they simply fell apart and ceded the market to the competition. [...]

Through 2013, Cypress shipped over 1.7 billion cumulative units of its PSoC 1 programmable system-on-chip, which I am fairly certain dwarfs anything MIPS has ever done. I don't have good numbers, but I am quite certain the Motorola 8-bitters shipped on the order of those numbers as well (or will, if you count the ARM variants in the Kinetis catalog as true 6800 descendants). If your intent is to talk about embedded CPUs, not MCUs, Motorola's 68K (and its embedded derivatives) has still far surpassed MIPS's numbers.

If that doesn't impress you, Microchip claims to have sold more than 7 Billion units of the PIC16 MCU series. [edn.com]

MIPS, while an interesting architecture that I have admired from afar, and which has had solid design wins in the past and will have more in the future, is at best an honorable mention in the embedded systems world for either volume or sales figures.

Did you perhaps mean that 2/3rds of the devices using MIPS architecture were embedded?

Re:Most important part... MIPS didn't compete. (1)

evilviper (135110) | about 7 months ago | (#46331561)

Did you perhaps mean that 2/3rds of the devices using MIPS architecture were embedded?

No.

It will take me quite some time to find the 2/3rds source. But a quick visit to WP finds a reasonably similar one:

"49% of total RISC CPU market share in 1997"

https://en.wikipedia.org/wiki/... [wikipedia.org]

Re:Most important part... MIPS didn't compete. (1)

QQBoss (2527196) | about 7 months ago | (#46332259)

Ah, that is a huge distinction and makes more sense. In 1997, RISC was still a very small subset of the entire embedded marketplace. IIRC, since the '80s when these things started being tracked, I don't think any one company has ever held more than ~30% of the entire embedded market for a year (across multiple products, probably calculated by dollar volume, not by total shipments), and that wasn't MIPS, for sure.

Re:Most important part... MIPS didn't compete. (2)

Darinbob (1142669) | about 7 months ago | (#46331351)

MIPS wasn't just failing to compete; it was in very rough shape after its largest customer left. For a long time the MIPS product lines were about faster and faster chips, and you cannot just take a high-performance chip and scale it down easily without a redesign. MIPS did scale down, but it took some time.

Nokia went with ARM because it was tiny, cheap, and low power; perfect for a UI on a handheld, and that's what saved ARM more than the Newton did.

ARM and MIPS both cover low- and high-end embedded systems, but ARM started at the low end and grew up, whereas MIPS started at the higher end and grew down.
