The Ever So Unlikely Tale of How ARM Came To Rule the World

pacopico writes "About 24 years ago, a tiny chip company came to life in a Cambridge, England barn. It was called ARM, and it looked quite unlike any other chip company that had come before it. Businessweek has just published something of an oral history on the weird things that took place to let ARM end up dominating the mobile revolution and rivaling Coke and McDonald's as the most prolific consumer product company on the planet. The story also looks at what ARM's new CEO needs to do not to mess things up."
  • by mrspoonsi ( 2955715 ) on Monday February 24, 2014 @04:24PM (#46326731)
    And Intel have the advantage there.
    • And some of Intel's fabs are manufacturing ARM chips. Intel doesn't have to lose for ARM to win.

• More than that, doesn't Intel have an ARM license from when they acquired the detritus left over from the Compaq/DEC merger? They made the StrongARM series, and XScale CPUs for some time. I don't know if they sold the license to Marvell, or if it was just the XScale designs.

        • Anyone who wants an ARM license can have one. Just pay the subscription and the royalties. It's open to all. (FRAND)

          • by alen ( 225700 )

            that ARM license just means you can make the same chips ARM designs

Apple and Qualcomm have architecture licenses that let them design their own ARM chips. This is why a 1 GHz A7 CPU performs just as well in real life as a 2 GHz Samsung CPU. That, and designing the software for the instruction set helps too.
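            As a rough sketch of that arithmetic (Python; the IPC figures below are invented for illustration, not measured values for the A7 or any Samsung part), real-world throughput is roughly instructions-per-cycle times clock rate, so a wider core at half the clock can match a narrower, higher-clocked one:

                # Hypothetical perf-per-clock comparison; the IPC numbers are made up
                # purely to illustrate why clock speed alone is misleading.
                def throughput_gips(ipc: float, clock_ghz: float) -> float:
                    """Approximate work per second, in billions of instructions."""
                    return ipc * clock_ghz

                wide_custom_core = throughput_gips(ipc=4.0, clock_ghz=1.0)   # wide design, low clock
                narrow_fast_core = throughput_gips(ipc=2.0, clock_ghz=2.0)   # narrow design, high clock
                print(wide_custom_core, narrow_fast_core)  # 4.0 vs 4.0 -- a wash in practice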

            • Right, there's a range of licenses at a range of prices. But surely they are all available to all, including the architectural license.

            • by NapalmV ( 1934294 ) on Monday February 24, 2014 @05:32PM (#46327609)
              You mean using a C compiler instead of a Java interpreter helps with speed and power consumption? Who could have thought?
        • by Megane ( 129182 )
          I think they sold StrongARM to Marvell.
• It doesn't actually matter because XScale, ironically, didn't scale. Oh, it scaled up, but it wouldn't scale down. You could use it in non-mobile embedded applications where you needed a lot of horsepower at a low cost, and they did; you'd see XScale in places like casino gaming entertainment modules. Now you're as likely to see Linux on x86 because, frankly, there was never any need to miniaturize. Casino games are massive, and onboard graphics are more than adequate, so there's no need for expansion cards...

• ...it seems likely that the ARM architecture license that Intel acquired in the Digital takeover/litigation mess also transferred to Marvell.

        • I thought Basil was talking about the licenses Intel gets from ARM licensees who use Intel's Custom Fab Services. Nothing to do w/ StrongARM/XScale
    • And Intel have the advantage there.

      Physics has the advantage. Clock scaling already ended, now feature size shrinking is grinding to a halt. The action has now shifted to ballooning core counts and power management strategy, where Intel has no compelling advantage. Intel's feature size advantage has not got long to live.

      • Considering how Intel managed to go from NetBurst (massively power-hungry) to Haswell (very power efficient), I wouldn't doubt their ability to out-engineer the companies currently designing ARM chips.
      • by rev0lt ( 1950662 ) on Monday February 24, 2014 @07:25PM (#46328953)
I'm reading this and laughing. I've read the same kind of statement when they were using 350nm tech, 90nm tech, 65nm tech, and so on and so forth. Their public roadmap has 5nm tech around 2019-2022 (http://en.wikipedia.org/wiki/5_nanometer). And as the x86 inheritance slowly fades away, they can actually produce way smaller chips without backwards compatibility if the market demands it (very few applications run 1978 instructions nowadays, and the same goes for all that 16-bit protected mode wazoo).
• Your claim is utterly ridiculous. CPU cache OVERWHELMINGLY dominates the x86/x64 die. Intel/AMD could eliminate ALL processor functions, entirely, and the die STILL wouldn't be "way smaller" (or "way cheaper" for that matter).

          • by rev0lt ( 1950662 )

            CPU cache OVERWHELMINGLY dominates the x86/x64 die.

Yeah, so? Are you saying that 4MB of on-die cache is somehow smaller in an ARM chip? And if you look at it closely, you don't see a huge increase in cache sizes in the past 10 years, do you? The tendency is actually to shrink slightly in the future, because of latency issues. But hey, let's focus on what we know today - 4MB of cache at 32nm (common today for ARM) is not the same area as at 22nm (used *today* on Haswell). Or at 5nm. And if you can free an additional 20% of real estate in the chip by cleaning out legacy circuitry...
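            A back-of-envelope sketch of that area argument (Python; it assumes the idealized case where area scales with the square of the feature size, which real SRAM cells only approximate):

                # Relative die area of the same 4MB cache at different process nodes,
                # under ideal area scaling (area proportional to feature size squared).
                def relative_area(node_nm: float, baseline_nm: float = 32.0) -> float:
                    """Area at node_nm relative to the same circuit at baseline_nm."""
                    return (node_nm / baseline_nm) ** 2

                for node in (32, 22, 14, 5):
                    print(f"{node:>2}nm: {relative_area(node):.2f}x the 32nm area")
                # prints 1.00x at 32nm, 0.47x at 22nm, 0.19x at 14nm, 0.02x at 5nm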

            • if you can free an additional 20% of real estate in the chip

              ...which you can't.

Cutting parts and shrinking the design would produce fewer defects, and it would probably have a direct impact on price. I know, utterly ridiculous.

              Not "ridiculous" so much as a minuscule difference that will not have a visible affect on price or performance.

          • So you're saying that because caches are the overwhelming issue, we should throw away x86 and design a modern ISA to improve on code density and to minimize memory transfers? ;-)
            • That is what would be required to meet GP's silly and imaginary Intel roadmap. Except, of course, that wouldn't be "x86" any longer, so it doesn't quite fit.

              And *I* certainly never said "we should" do anything of the sort.

              • by rev0lt ( 1950662 )
Most of what is "modern x86" today is akin to RISC processors - CISC instructions are decomposed into micro-ops (similar to RISC instructions) and fed into the execution pipelines. Some of those micro-ops are executed by following a set of rules defined by the CPU's own firmware (microcode), others are not. Some of the ones that aren't have additional complex circuitry, and frequently these are legacy instructions. You may not be aware of this, but Intel has been doing (rather unsuccessfully, I must add) RISC processors at least since the beginning of the 90's.
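                Purely as a toy illustration of that decode step (Python; the micro-op names and the exact three-way split are made up for illustration, not Intel's actual internal encoding):

                    # A CISC-style instruction with a memory operand decomposed into
                    # RISC-like load/operate/store micro-ops, as described above.
                    def decode(instruction: str) -> list[str]:
                        if instruction == "add [mem], reg":
                            return [
                                "uop_load   tmp <- [mem]",      # fetch the memory operand
                                "uop_add    tmp <- tmp + reg",  # plain register-to-register add
                                "uop_store  [mem] <- tmp",      # write the result back
                            ]
                        return [instruction]  # simple instructions map to a single micro-op

                    print(decode("add [mem], reg"))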
You may not be aware of this, but Intel has been doing (rather unsuccessfully, I must add) RISC processors at least since the beginning of the 90's

Everybody and their grandmother knows that... though the NexGen Nx586 beat them to it. NexGen was later acquired by AMD.

    • by timeOday ( 582209 ) on Monday February 24, 2014 @05:07PM (#46327249)
Another possibility is that there is no real future here - nobody will reap the profits Intel did for the last 30 years.

Intel's earnings last quarter were $2,630M compared to $156M for ARM Holdings. So if ARM is "ruling the world" like this story claims, then ruling the world just ain't what it used to be. And I guess that is likely, if semiconductors stagnate as they seem to be doing.

      • by Anonymous Coward

2,630M vs 156M: how much does Intel *have* to invest in fabs to continue being competitive? How much does ARM? That's where the difference is...

        • by MouseTheLuckyDog ( 2752443 ) on Monday February 24, 2014 @07:40PM (#46329109)

Except when you include the profits for making ARM chips from Qualcomm, Apple (if Apple had actually separated out their chip-making division), Samsung, Allwinner, etc., that number changes drastically.

The only company making money off Intel chips is Intel; there are many companies making ARM chips, and you have to include all of them.

• What you are describing is a commodity market. This is a future in which the margins (and R&D) for CPUs are more in line with what we are used to for RAM and other chips, not CPUs.
• ARM doesn't have to make their own chips, they just have to license the designs to others. Thus ARM can potentially have a larger impact on the economy than Intel - i.e., add in the earnings from everyone who makes ARM chips.

    • by leathered ( 780018 ) on Monday February 24, 2014 @05:08PM (#46327269)

      Not really, what matters most is cost, and at that ARM wins hands down. Most ARM chips cost less than $5, with some selling for pennies. Intel enjoys 60%+ margins on everything it sells and they will experience a lot of pain giving them up.

      The only way Intel can compete is if they sell their mobile chips at or below cost. Oh wait, they already are. [slashdot.org]

      • by mcrbids ( 148650 )

        Most ARM chips cost less than $5, with some selling for pennies.

Not to nitpick, but it's likely that *most* ARM chips made actually sell for pennies, given that they are turning up in some very unlikely places. [bunniestudios.com] The question isn't whether Intel will sink ARM - that's very unlikely - the question is only to what degree it can cut in. There's an *astronomical* market for low-speed chips that cost $0.03 for embedded/microcontroller use.

        • by jrumney ( 197329 )

          Not to nitpick, but it's likely that *most* ARM chips made actually sell for pennies

          Not to nitpick, but the cheapest chips with ARM cores inside are a couple of dollars. Your link mentions 8051 microcontrollers being found inside SD cards, but 8051 microcontrollers are not ARM chips. It also mentions a Samsung eMMC chip with an ARM instruction set, but that eMMC chip is likely to cost a few dollars.

      • The only way Intel can compete is if they sell their mobile chips at or below cost.

Yes, and that's how they crushed AMD. They are good at this.

        • by Alioth ( 221270 )

          That won't work with ARM though. The reason why so many people build ARM is that you can license the core and add whatever peripherals you want onto the die. Or you can license the whole architecture and design your own complete ASIC around it, like what Apple does.

With Intel, you get what Intel makes. You can't make your own custom Atom. You can't license the Atom architecture and make it part of your ASIC. You get only what Intel decides it wants to put on the die. So they can't compete even way below cost...

      • by tlhIngan ( 30335 )

        Not really, what matters most is cost, and at that ARM wins hands down. Most ARM chips cost less than $5, with some selling for pennies. Intel enjoys 60%+ margins on everything it sells and they will experience a lot of pain giving them up.

        The only way Intel can compete is if they sell their mobile chips at or below cost. Oh wait, they already are.

And not terribly fast either - given how the Apple A7 is running rings around Intel's chips.

Granted, different architectures and different OSes, but the benchmarks...

      • by rev0lt ( 1950662 )

        Intel enjoys 60%+ margins on everything it sells

Following that line of thought, that is probably the margin they have when they are manufacturing some of those $5-a-pop chips, right? Top-of-the-line processors are expensive because of market pressures (aka "people will pay for them"), but also because of defect ratios and the ROI of the manufacturing process. Building/upgrading factories is expensive. Traditionally, specialized processors and old lines are way cheaper - so cheap that they prefer to license the core design to someone else instead of building...

        • by Klivian ( 850755 )

          I would bet you have more 8051 microcontrollers running *today* than the whole sum of their desktop chips, including the low power, embedded/hardened lines.

Perhaps, depending on the age distribution of the equipment. For anything designed in the last 5 years it's more than likely that those pesky old 8051s have been replaced by ARMs: Cortex-M0s, M3s and M4s. So a more accurate statement would be "I would bet you have more ARM microcontrollers running *today* than the whole sum of their desktop chips, including the low power, embedded/hardened lines."

          • by rev0lt ( 1950662 )

For anything designed in the last 5 years it's more than likely that those pesky old 8051s have been replaced by ARMs: Cortex-M0s, M3s and M4s

Actually, it's not. For many applications, this would require rewriting the software stack, for a chip with roughly the same die size and possibly less functionality. The 8051 is a microcontroller, not a microprocessor - and it is 8-bit. It does not distinguish between IO and memory access. However, it does have a semi-complete serial interface and 8 lines of digital I/O - and this is the base version. A ton of variations exist, with extra ports, A/D and D/A functionality, extra timers, and assorted controllers on-chip...

            • by Klivian ( 850755 )

For anything designed in the last 5 years it's more than likely that those pesky old 8051s have been replaced by ARMs: Cortex-M0s, M3s and M4s

Actually, it's not. For many applications, this would require rewriting the software stack, for a chip with roughly the same die size and possibly less functionality. The 8051 is a microcontroller, not a microprocessor.

And that is exactly what those ARMs are: they are microcontrollers. It's been several years since ARM microcontrollers started to dip below the $1 price tag, becoming a valid competitor in most microcontroller designs. Those cheap ARMs have more or less taken over the market for 16-bit micros, and are making heavy inroads into the typical 8-bit markets. If you have started a microcontroller-based design in the last 5-10 years and not included one or more ARM microcontrollers in the evaluation process, you have not done your job properly...

• I think that's a poor example, as Intel no longer makes 8031s and 8051s, and they can't earn money by selling licenses for them because the patents expired long ago.

          • by rev0lt ( 1950662 )
You are right. I wasn't aware that Intel stopped manufacturing them in 2007 - but I haven't read anything about expired licensing. The VHDL for the cores is somewhat widely available, but I haven't found any info on whether they are "free". I would doubt they have expired, but they may have opened it to everyone - after all, the early Beatles catalog is not public domain, is it?
    • by Kohath ( 38547 )

      But it's a 2-3 year advantage. And the future has more years than that.

In the past and present, battery life, software compatibility, and customer-friendly licensing terms are what matter most.

      Also, Intel needs to make chips designed to help customers, not chips designed to help Intel. "We need you to buy these chips so Intel is no longer frozen out of the mobile device market" isn't what customers are interested in hearing.

  • Comment removed (Score:4, Informative)

    by account_deleted ( 4530225 ) on Monday February 24, 2014 @04:32PM (#46326815)
    Comment removed based on user account deletion
    • by BasilBrush ( 643681 ) on Monday February 24, 2014 @04:41PM (#46326919)

Well, I think it's fair to say that the ARM was designed for use in a new computer, which turned out to be the Archimedes. It was first available as a second processor for the BBC Micro, but that was really just a step in development, not its original goal.

      Creating the ARM simply as a second processor wouldn't have been economically viable. Few people/organisations bought second processors.

    • Comment removed based on user account deletion
    • by newcastlejon ( 1483695 ) on Monday February 24, 2014 @06:16PM (#46328131)

...it became the new Acorn Archimedes computer which was used by British schools to teach kids how to write computer programs.

Speaking as someone who was brought up with BBC Micros, pointy little A3000s and a single majestic "don't you dare touch that" RiscPC, this turned out not to be the case in many schools. Certainly there were often computers aplenty, some running quite good educational programs, but most didn't have anything in the way of programming tools, especially the ones with RiscOS. The Micros were much better, as anything you wanted to do on them started with a command line (only a kick in the backside away from learning BASIC), but the later models didn't include any development tools whatsoever, unless you count the hidden command line...

      ...a command line that was so rarely needed they hid it. Acorn were ahead of their time in so many ways; it's a shame they didn't manage to do better outside the UK.

      • by Eythian ( 552130 )

You could jump in and start writing applications on RiscOS directly in BASIC if you wanted. I dabbled a bit with it, but didn't have any access to documentation, so I only got so far by reverse-engineering (by which I mean reading the source once you figure out that shift-double-click shows you inside the !Application bundle).

It was a very well done OS though. In some ways, it feels like systems now are only just starting to catch up, and in other ways they are still a fair bit behind.

      • by sr180 ( 700526 )

        They did well in Australia. Many private schools had them - mainly for teaching Logo.

      • by Alioth ( 221270 )

RiscOS still had the BBC BASIC interpreter (which included a built-in ARM assembler). You could use BASIC in a RiscOS window pretty easily (it wasn't hidden), and there was a key combination to drop you out of the GUI straight into the * prompt (and you could just type BASIC, and you'd have a full-screen BASIC interpreter, just like on the older BBC Micro).

    • by Alioth ( 221270 )

      Not quite accurate.

Acorn needed to move off the 6502, and they explored several different architectures for the new computer they wanted to build. They wanted very low interrupt latency, and they wanted a chip that could use all of the memory bandwidth - back in 1986, the processors for a personal computer were generally much slower than memory and had no cache; for example, the MC68000 often takes 8 or more clock cycles per instruction and doesn't have a cache, and the 6502 takes 3 to 4 clock cycles per instruction...

• I had one of the very first Archimedes boxes, back before it even had a proper operating system (it had a monitor called 'Arthur', which was really very primitive). But it was a really good feeling sitting in my university bedroom with a computer which in terms of raw processing power was faster than the two fastest machines the university then owned put together. Those original ARM boxes were, by the standards of their time, very remarkable: much faster than contemporary DEC VAX, Motorola 68000, or Intel 80x86 machines...

• Strange - I can recall discussions of the ARM chip at university back in the early 80's. Either that makes me ten years younger than I thought :-) - or someone has their dates wrong.

As I recall (and correct me if I'm wrong) there was a company called Acorn Computers which produced the BBC Micro - a 6502 based machine.
The 6502 was long in the tooth even in those days (dating back at least to the Commodore PET ca. 1976).
RISC was flavour of the month in those days, so they set out to create their own RISC based architecture for the next generation of BBC Micro (the Archimedes).

    • Re: (Score:3, Informative)

      by mrbester ( 200927 )

ARM themselves incorporated in 1990 (hence the 24 years). However, you are correct that Acorn's ARM chips predated the company.

    • by flightmaker ( 1844046 ) on Monday February 24, 2014 @04:53PM (#46327065)
      A couple of years ago I donated my Acorn System 1 to the Museum of Computing in Swindon. It was on their Most Wanted list! I learned rather a lot with that machine, hand assembling machine code.
    • by BasilBrush ( 643681 ) on Monday February 24, 2014 @04:53PM (#46327075)

The 6502 was long in the tooth even in those days (dating back at least to the Commodore PET ca. 1976).
RISC was flavour of the month in those days, so they set out to create their own RISC based architecture for the next generation of BBC Micro (the Archimedes).

      It was still an odd decision to design their own CPU for the successor to the BBC Micro. A more obvious and less risky move would have been to use a 68000 series CPU as a successor to the 6502.

      I think it's because there were so many Cambridge academics at Acorn. They made a RISC processor because it was an interesting project which was then at the cutting edge of computer science.

      • by johnw ( 3725 )

        It was still an odd decision to design their own CPU for the successor to the BBC Micro. A more obvious and less risky move would have been to use a 68000 series CPU as a successor to the 6502.

        IIRC, they experimented with a chip called the 32016 (or 16032) as a possible successor to the 6502, before deciding to start again from scratch and design their own.

All the 2nd processors for the Beeb - Z80, 6502, 32016 or ARM - looked exactly the same from the outside, although when you opened them up the Z80 and 6502 were mostly air, whilst the ARM prototype was stuffed to the gunwales. It didn't even have go-faster stripes or a front air dam.

The odd thing was, early ARMs seemed to manage to produce much more performance than their simplicity would suggest...

      • by Anonymous Coward on Monday February 24, 2014 @06:13PM (#46328091)

        They considered and rejected the 68000 option. The Atari ST and Commodore Amiga were already dominating the market. A 68000-based Acorn system would have no advantages over those while being "late to the game". They figured that they basically needed to leapfrog the 16-bit systems in order to survive.

        Unfortunately, by the time the Archimedes came out, the computing world was standardising around the IBM-compatible PC, and even the Archimedes' superior performance compared to PCs of that era (about the time the first 386 systems appeared) couldn't save it (Atari and Commodore didn't fare much better).

The irony is that the (seemingly harebrained) decision to design their own CPU (thus ensuring incompatibility with everything else on the planet) on a shoestring budget ended up hitting the jackpot. The "sane" approach of using a popular chip (680x0, 80x86) would have relegated them to the history books, alongside 5.25" floppies and dBase III.

        • They considered and rejected the 68000 option. The Atari ST and Commodore Amiga were already dominating the market. A 68000-based Acorn system would have no advantages over those while being "late to the game". They figured that they basically needed to leapfrog the 16-bit systems in order to survive.

Whilst that sounds reasonable, it's not historically possible as a motivation. They started developing the ARM in 1983, so that's when they decided what they were going to use for the BBC Micro's successor. And the Atari ST and Amiga didn't come out until two years later - mid 1985.

The Archimedes came out in 1987, so of course its success was compromised by the ST and Amiga. But had Acorn used an already existing chip in the 68000 series, they could have beaten the ST and Amiga to market with a computer released...

      • by Alioth ( 221270 )

They did it because they couldn't find a satisfactory chip for the machine they wanted to make. They wanted low and reasonably predictable interrupt latency - and remember, these were the days when CPUs for personal computers took between 3 and 20 clock cycles per instruction, with no cache - and they wanted to maximise the use of memory bandwidth. The 6502, for instance, was typically in a computer with 150ns memory, but at full speed in a BBC Micro it would only fetch something from memory every 1000ns (fastest instructions)...
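        Putting the numbers in that comment together (Python; this just restates the figures given above, not independent measurements):

            # 150ns memory can supply an access every 150ns, but a 6502 at full
            # speed in a BBC Micro only fetched roughly every 1000ns, so the CPU
            # left most of the available memory bandwidth unused.
            memory_cycle_ns = 150
            cpu_fetch_ns = 1000
            print(f"CPU uses about {memory_cycle_ns / cpu_fetch_ns:.0%} of memory bandwidth")
            # -> about 15%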

  • by Anonymous Coward
    I RTFA, and now I know:

    -ARM designs "parts of chips—such as graphics and communication bits—and they design entire chips."
    -"A great many people have not heard of ARM."
    -"RISC stands for 'reduced instruction set computing.'
    -"Much of Apple’s success has come from the snappy performance of its products and its long-lasting batteries."
    -"Intel designs its own chips, which are widely regarded as the most advanced in the world."

    Slashdot is so fucking great.
    • by QQBoss ( 2527196 )

      I RTFA, and now I know:
      [...]
      -"RISC stands for 'reduced instruction set computing.'

      [...]

      It is a pity that no one could have strong-arm'ed (does that count as a pun?) in a superior expansion of RISC.

      Either
      (1) Reduced Instruction Set Complexity
      or
      (2) Reduced Instruction Set Cycle-time

      would be more meaningful.

Very few people designing RISC CPUs in the '80s cared about how many instructions (cue argument over what defines an 'instruction') their CPUs had (certainly not the Motorola 88000 architects that I worked with); they cared about (1) whether the instructions were logically organized to get rid of...

  • by spaceyhackerlady ( 462530 ) on Monday February 24, 2014 @04:52PM (#46327043)

    I've always thought ARM was a cool design. Simple, minimalist, sort of a latter-day PDP-11, one of those canonical architectures that just works. Simple chip, not many transistors, low power, good chip for mobile devices. It seems so obvious in retrospect. Especially since that's not what the designers had in mind. They were designing a simple chip because they only had a couple of people and that was all they could afford.

    In one of the later scenes in Micro Men [imdb.com] there is a whiteboard in the background with the original ARM requirements, right down to the barrel shifter.

    ...laura

    • by cold fjord ( 826450 ) on Monday February 24, 2014 @05:07PM (#46327259)

      "Perfection is finally attained not when there is no longer anything to add but when there is no longer anything to take away..." -- Antoine de Saint Exupéry

      A similar point was made about the tight resource constraints of the early Macintosh, and how they created a strong incentive to make use of the toolbox, doing things "the Macintosh way." That paid many dividends over the years.

• But it was a lot like other RISC chips of the era. Not significantly better or worse, nothing really special. What really helped it out was that it wasn't aiming for higher and higher performance and kept with an 80s-era design; plus the introduction of 16-bit Thumb mode; plus the licensing of its IP. Then it took off: lower power because it hadn't added in all the performance features, smaller code size for smaller systems, and it was easy to integrate as part of an ASIC.

• The lesson is to be a light-handed source of standards and a building-block supplier, instead of an all-encompassing conglomerate that tries to rule standards from head to foot.

    Heed the warning, Google, Oracle, and Apple. (Too late for MS.)

  • by mveloso ( 325617 ) on Monday February 24, 2014 @05:04PM (#46327193)

    As a note, back in the day Apple stayed afloat by selling its stake in ARM.

    • As a note, back in the day Apple stayed afloat by selling its stake in ARM.

Funny how they didn't sell ARM when they were losing money, but when they were back in the black. Coincidentally, after Jobs killed the Newton, which was the reason ARM was founded.

      • by johnw ( 3725 )

        which was the reason ARM was founded.

        Hardly - the ARM had gone through several generations and been used in a number of other products before the Newton came along.

        • by Alioth ( 221270 )

          However, Steve Furber (hardware designer of the ARM) remarked that it was the Newton that started making people take the ARM seriously. The Newton may have been a flop, but it did make the right people notice ARM.

          • However, Steve Furber (hardware designer of the ARM) remarked that it was the Newton that started making people take the ARM seriously. The Newton may have been a flop, but it did make the right people notice ARM.

            Well, the Newton wasn't a flop compared to the Archimedes, that's for sure. 100,000 sold in the first year. Did Acorn ship that many ARMs in Co-CPUs, Archimedes and Risc PCs combined even in a decade?

        • which was the reason ARM was founded.

          Hardly - the ARM had gone through several generations and been used in a number of other products before the Newton came along.

          ARM. The company. Do I really have to spell it out for those who confuse "founding" with "foundry"?

  • by Rising Ape ( 1620461 ) on Monday February 24, 2014 @05:37PM (#46327677)

    As someone who had a BBC Micro as his first computer (lovely machine for tinkering), it's nice to see the descendants of Acorn survive the juggernaut of the PC and x86. And long may it continue, the last thing we need is a vertically integrated colossus like Intel dominating everything, no matter how good their PC processors are.

  • Do hardware manufacturers count as "consumers"?
  • by wonkey_monkey ( 2592601 ) on Monday February 24, 2014 @05:51PM (#46327833) Homepage

    ARM architectures were already in use before ARM the company came into being and went into making mobile processors. They were the CPUs for the Acorn Archimedes [wikipedia.org] and Risc PC [wikipedia.org].

    Ah, I still remember that heady day at Acorn World in 1996 (I think it was), riding the train back clutching my precious StrongARM (not made by ARM themselves, apparently) upgrade. The unimaginable pow-ah!

Later upgrades put RAM on the CPU's daughterboard because the bus became the bottleneck.

    Somewhat sadly neglected, my Risc PC now gathers dust in a damp garage, but it made me the aspiring-to-efficiency programmer I am today.

    • Somewhat sadly neglected, my Risc PC now gathers dust in a damp garage, but it made me the aspiring-to-efficiency programmer I am today.

      And mine turned me into the vitriolic why-the-fuck-doesn't-this-modern-crap-work-as-well-as-that-twenty-year-old-Acorn-in-the-loft?! bastard I am today.

      Horses for courses, eh?

      • Never fear, Mozilla are downsizing Firefox OS to run on $25 handsets with 128MB of RAM.

        That's about the amount of RAM your university's *server* had when you accessed the web via an Xterm in NCSA Mosaic back in 1994.

  • by sootman ( 158191 ) on Monday February 24, 2014 @06:10PM (#46328045) Homepage Journal

    ... once they became "powerful enough" and portability mattered. Same way that Intel won on the desktop, really -- compared to mainframes, they were small enough to fit into a useful spot (literally and figuratively) and became powerful enough to be REALLY useful, not just occasionally handy.

    But chips themselves don't sell devices -- Intel desktops sold more and more as the OSs and apps got better and better, and it's the same thing with the iPhone and similar devices. Would a 160x160 monochrome Palm Pilot (if it still existed today) sell in iPhone-esque numbers if it had a multicore, gigahertz-plus CPU? The chip makes the product possible, and better products make people want more chips.

  • by sugar and acid ( 88555 ) on Monday February 24, 2014 @06:44PM (#46328475)

The article skipped over the whole development of the ARM processor. It wasn't developed for the Newton; the original architecture was for the Acorn Archimedes RISC-based computers, launched in 1987.

The key difference that set Acorn apart from every desktop PC-type computer manufacturer at the time is that they went down the road of actually designing their own processor for the PC market, instead of using one from Motorola or IBM.

I think what set the ARM apart going forward was that they used modern-for-the-time CPU design principles, but they aimed at a lower-end consumer-grade market instead of the higher-end mainframe/server/workstation/supercomputer market. Because of this they were all about getting the most performance from cheaper, slightly older chip fab technologies. All of this ultimately meant that the design constraints imposed early on translated well to mobile applications.

  • by another_gopher ( 1934682 ) on Monday February 24, 2014 @06:55PM (#46328603)
Tony Smith's articles on the history of micro-computers do this far better:
    http://www.theregister.co.uk/2012/06/01/acorn_archimedes_is_25_years_old/
  • by evilviper ( 135110 ) on Monday February 24, 2014 @07:15PM (#46328829) Journal

There's one simple reason ARM has a strangle-hold on smart phones and tablets... For years, while such devices were being developed, MIPS Technologies was in a shambles. They were reeling from losing SGI, going IPO, and going through the process of being acquired by a string of different companies. They've basically been AWOL this whole time, handing the upstart new market to ARM on a plate.

MIPS is still competitive. They've got extremely low power processors, multi-core 1GHz+ processors, and they've always been more efficient (higher DMIPS/MHz) than ARM. Despite their virtual absence, they're still used extensively in embedded systems... your printer, WiFi AP/router, many set-top boxes, etc. They used to have a dominant lead over ARM, selling something like 2/3rds of all embedded CPUs, but they simply fell apart and ceded the market to the competition. They're even the cheaper option... The first $100 Android ICS tablet found in China was MIPS (not ARM) based, and China's ministry of science keeps developing faster MIPS processors for domestic use, including supercomputers.

If they had competed, it might be MIPS in every smart phone. Even now, if they get back on course, they could pose a real challenge to ARM, driving prices down and dividing the market, as Intel is trying to do with little success.

    No story that claims to tell how ARM came to dominate is even remotely complete without a good paragraph about how MIPS, their biggest competitor, stopped competing and nearly GAVE them the market.

    • by Anonymous Coward

Sorry for you, but the time for MIPS is over.

It is an inferior architecture, interesting from a historical perspective, a bit like SPARC. Besides, MIPS destroyed itself by suing Lexra, which more or less invented the CPU IP business.

For smallish devices, it is very difficult to compete against ARM's R and M cores.
For complex CPUs, the kind you find in phones, it is too late; too much has been invested in ARM, its compilers, tools...
Even Intel, pouring in billions, hardly manages to compete against ARM.

MIPS didn't make a mark in phones. MIPS had some presence in network and set top boxes, but this is ending.

      • by dkf ( 304284 )

        MIPS had some presence in network and set top boxes, but this is ending.

        It's gone. The hotel I was staying in last week had all of its set-top boxes running Android. (Irritating when you switched the device on and had to wait while it booted, but that's not ARM's fault.)

    • by QQBoss ( 2527196 )

[...]They used to have a dominant lead over ARM, selling something like 2/3rds of all embedded CPUs, but they simply fell apart and ceded the market to the competition. [...]

Through 2013, Cypress has shipped over 1.7 billion cumulative units of its PSoC 1 Programmable System-on-Chip, which I am fairly certain dwarfs anything MIPS has ever done. I don't have good numbers, but I am quite certain the Motorola 8-bitters shipped on the order of those numbers as well (or will, if you count the ARM variants in the Kinetis catalog as true 6800 descendants). If your intent is to talk about embedded CPUs, not MCUs, Motorola's 68K (and embedded derivatives) still far surpasses MIPS...

      • Did you perhaps mean that 2/3rds of the devices using MIPS architecture were embedded?

        No.

        It will take me quite some time to find the 2/3rds source. But a quick visit to WP finds a reasonably similar one:

        "49% of total RISC CPU market share in 1997"

        https://en.wikipedia.org/wiki/... [wikipedia.org]

        • by QQBoss ( 2527196 )

Ah, that is a huge distinction and becomes more logical. In 1997, RISC was still a very small subset of the entire embedded marketplace. IIRC, since the '80s when these things started being tracked, I don't think any one company has ever held more than ~30% of the entire embedded market for a year (across multiple products, probably calculated by dollar volume, not by total shipments), and that wasn't MIPS, for sure.

• MIPS wasn't just failing to compete; it was in very rough shape after its largest customer left. For a long time MIPS product lines were about faster and faster chips. You cannot just take a high-performance chip and scale it down easily without a redesign. MIPS did scale down, but it took some time.

Nokia went with ARM because it was tiny, cheap, and low power; perfect for a UI on a handheld, and that's what saved ARM more than the Newton did.

Both ARM and MIPS cover low and high end embedded systems...
