Apple Silicon: history, performance, and Apple's transformation – AppleInsider

Copyright © 2024 Quiller Media, Inc. All rights reserved.
Apple's M-series chips rejuvenated the Mac range
Decades ago, Apple started down a path that has revolutionized both its own products and the entire technology industry. Here’s where Apple Silicon began, and where it’s going.

From the very beginning of Apple Computer, the company has relied on core parts made by other firms. The Apple I had a MOS Technology 6502 chip at its heart, though co-founder Steve Wozniak designed most of the other original hardware, including the circuit boards.

Apple continued to invent and patent many technologies as it grew, but it was always reliant on other companies to provide the primary chip powering the machines, the CPU. For many years, Apple’s chip supplier was Motorola, and later Intel.

Looking back, it’s fairly clear that this necessary relationship with other firms — often with companies that Apple competed against on various levels — was not what co-founder and CEO Steve Jobs wanted. Apple was at the mercy of Motorola’s or Intel’s timelines, and sometimes shipping delays, which hurt Mac sales.

The prelude to Apple Silicon

Prior to Jobs’ return to Apple in 1997, Apple had turned to Advanced RISC Machines (a joint venture Apple helped found with Acorn and VLSI in 1990, later formalized as Arm Holdings PLC) to design a chip for the Newton handheld. Jobs killed the device soon after taking over as CEO, despite the outcry from fans of the PDA.

Luckily, Arm got a second chance to work with Apple on the iPod, a project that required its lead, Tony Fadell, to find a chip supplier specializing in high-efficiency, low-power silicon, among other challenges. The first iPod, which debuted in late 2001, used a PortalPlayer PP5002 system-on-a-chip (SoC) built around dual Arm processor cores.

A few years later, Apple used Samsung-designed Arm-based SOC chips for the original iPhone. Apple bought shares in Arm and deepened its relationship with the firm.

Apple co-founder Steve Jobs

By the mid-2000s, Apple’s notebook computers were beginning to seriously outpace the iMac as the computer of choice for consumers. Mobile devices like the Palm Pilot had introduced a new level of access to information on the go, and the iPhone’s debut cemented that desire.

It seems obvious now that Apple’s notebooks would eventually need to move to much more power-efficient chips, but at the time the only place you could find such research was deep inside Apple’s secret “skunkworks” labs.

In June of 2005, Jobs announced Apple’s entire Mac lineup would move to Intel-based chips. The Mac had been lagging behind on PowerPC chips from Motorola and IBM, and sales were down again.

Long-time customers were worried about the effect of such a major transition, but Apple’s behind-the-scenes work paid off with translation software called Rosetta. It was a key technology that kept customers’ existing PPC-based programs running smoothly, even on the new machines.

Apart from some early hiccups, the big transition went pretty well, and customers were excited about Apple again.

The move quickly brought more than just speed to Apple’s portables: with the Intel chips being more efficient, it also improved battery life. The new Macs were also able to run Windows more efficiently, a great sales advantage to users who needed or wanted to use both platforms.

Moving towards independence

In hindsight, Jobs and his new executive team had a clear plan to strengthen Apple by reducing dependencies on other companies down to a minimum. The long-term goal was for Apple to design its own chips, and Arm’s architecture became key to that mission.

In 2008, just a year after the iPhone debuted, the first prong of Apple’s plan to make its own chips began to take shape. In April of that year, the company bought a chip design firm called P.A. Semi (Palo Alto Semiconductor) for $278 million.

By June of that year, Jobs publicly declared that Apple would be designing its own iPhone and iPod chips. This was disappointing news to both Samsung and Intel.

Samsung wanted to retain its role as Apple’s chip designer and fabricator. Intel tried hard to woo Apple into using its low-power Atom chips for the iPhone, and was shocked when Apple declined.

The first iPhone models used Arm-based chips designed and fabricated by Samsung. This changed in 2010 when Apple introduced its first self-designed mobile chip, the A4. It debuted in the first-generation iPad, and then the iPhone 4.

It was also used in the fourth-generation iPod Touch, and second-generation Apple TV. Samsung was still used to fabricate the chips, but not for much longer.

A few years later, Apple began moving fabrication to Taiwan Semiconductor Manufacturing Company (TSMC). Apple continues to use TSMC as its chip fab to this day, often buying up its entire production capacity of some chips.

The two big factors in Apple’s favor in using its own chips are control and customizability. Together, they give Apple a significant advantage over rival mobile chips from Qualcomm and Samsung.

Using Arm’s RISC template, Apple now has complete control over the design of the chips it uses. This means it can add, change, and optimize chip features — like the Neural Engine — when it needs to, for example, rapidly increase its machine-learning and AI processing capabilities.

Secondly, Apple’s chips — whether for its mobile devices, Macs, or other products like the Apple TV — are in harmony with the software and operating system used. This is a factor that most Android-based device manufacturers can’t compete with.

Google has started designing its own chips for its own Pixel line of smartphones, but Samsung now relies on Qualcomm chips for its mobile devices, which aren’t customizable to Samsung-specific features. The same is true for other Android device makers.

This is why Apple’s chips usually match or beat rival systems even when the rivals have more RAM, more processor cores, and other seeming advantages. Apple can integrate software innovations with hardware features — such as optimizing battery life — on a level almost nobody else can.

On to the Apple Silicon Mac

As early as 2012, rumors began circulating that Apple was planning to eventually introduce Arm-based, Apple-designed chips into its Mac notebook line. It seemed like a logical extension, given the success of the iPhone, but it would likely take years of dedicated engineering work.

It started to become obvious, though unannounced, that such a transition was already in the works.

Apple’s developer transition kit for Apple Silicon

As AppleInsider noted in 2018, virtually all smartphones — and even Microsoft — had embraced the Arm architecture by that point. The rumored transition away from Intel would, ironically, prove as smooth as the migration to Intel had been in 2005.

Sure enough, in 2019 Intel finally got the official word that its chips would soon no longer be required for the Mac. By this point, Apple was shipping the A12 chip in the iPhone, and its benchmarks suggested it was desktop-class in terms of performance.

On June 22, 2020, Apple CEO Tim Cook announced at WWDC that the Mac would move to Apple Silicon chips, designed in-house, over the course of the next two years. Cook stressed that the machines would be able to run Intel-based Mac software using “Rosetta 2” to ease the transition.

Apple CEO Tim Cook in front of a MacBook Pro

The first M1 Macs — the Mac mini, 13-inch MacBook Pro, and 13-inch MacBook Air — arrived in November of that year. The Mac Pro was the last machine to make the leap, skipping the M1 generation entirely and finally debuting in June 2023 with a new M2 Ultra chip on board.

Apart from the delayed Mac Pro, the rest of the Mac line had moved to M-series chips within the promised two years. The “growing pains” of moving to Arm-based technology were far milder than in transitions past.

Users responded very positively to the M1 Apple Silicon Macs, praising the huge boost in speed and power the line brought compared to the Intel-based machines of 2020. Intel and AMD have worked hard to catch up, but Apple Silicon is now the benchmark by which other consumer machines are measured.

How Apple Silicon keeps its competitive edge

Since the M1, Apple fans have gotten used to routine double-digit improvements roughly every year: comparing each base M-series chip to its successor, comparing each variant to the one below it, and again when a new generation arrives.

It can’t last forever, but Apple’s ability to design chips perfectly customized to its own hardware has proven to be a major advantage.

This means Apple’s chip designers can give more emphasis to areas that will need more attention, and we’ll be hearing much more about that this coming June. Apple will undoubtedly be expanding the speed, computing power, and complexity of its existing Neural Engine in its next chips, to accommodate the greater use of machine learning and AI technologies.

As we’ve previously noted, Apple’s MacBooks can maintain the same high speeds regardless of whether the machine is on battery or mains power. Most PC laptops only run at their best speed when plugged in, with noisy fans blazing.

Graphic representation of M3 chip with performance statistics, including CPU and GPU core counts, transistor count, and comparisons to M1 and M2 chips.

YouTube reviewers of the new MacBook Air models have noted, with some astonishment, that even “pro” apps like Photoshop, Lightroom, and Final Cut Pro run just fine for light- and medium-duty jobs — on a machine that doesn’t have a fan. That would have been nigh-inconceivable to those same reviewers even two years ago.

Credit should be given to other chipmakers who have made heroic efforts — albeit not entirely successful — to catch up. Qualcomm, for example, has brought out the Snapdragon 8 Gen 3, which aims to take on the A17 Pro used in the iPhone 15 Pro and Pro Max.

The A17 Pro and M3’s nearest current competition: Qualcomm’s Snapdragon 8 Gen 3

This is fine, except that the iPhone 16 will arrive in the fall with a new chip that will likely continue the now-traditional double-digit increases. In particular, GPU performance in the M3 has increased 50 percent over the M1 in just three years.
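That 50 percent cumulative GPU gain translates into a double-digit improvement per year. A minimal sketch of the arithmetic — the 50 percent figure is from the article, while the per-year rate is derived here, not a published Apple number:

```python
# Convert the cumulative M1 -> M3 GPU gain (+50% over 3 years) into an
# implied annualized compound growth rate. The 50% figure comes from the
# article; the per-year rate below is derived arithmetic, not a spec.

def annualized_rate(total_gain: float, years: int) -> float:
    """Turn a cumulative gain (0.50 means +50%) into a per-year compound rate."""
    return (1.0 + total_gain) ** (1.0 / years) - 1.0

rate = annualized_rate(0.50, 3)
print(f"~{rate * 100:.1f}% per year")  # ~14.5% per year, a double-digit annual gain
```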

Apple’s other big advantage, especially in portable devices, is energy efficiency, which leads to world-beating battery life.

The Snapdragon 8 Gen 3 features eight CPU cores in a mix of performance and efficiency cores, similar to Apple’s approach. Its graphics are handled by the Adreno 750 GPU.

The improvement in the Snapdragon chip from its predecessor is truly impressive. But, in post-release benchmarks, Qualcomm’s so-called Apple-beater doesn’t quite live up to the hype.

Geekbench benchmark results for the iPhone 15 Pro and smartphones using the Snapdragon 8 Gen 3

Geekbench results for new devices like the Nubia Red Magic 9 Pro, the ASUS ROG Phone 8 Pro, and the Samsung Galaxy S24 Ultra come close to what the A17 Pro offers in both single-core and multi-core performance. But “close” is not good enough.
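For readers who want to quantify “close,” here is a minimal sketch of how such a gap is computed. The scores below are illustrative placeholders, not measured Geekbench results for any of these devices:

```python
# Hypothetical illustration of computing how far one chip trails another
# in a benchmark. The scores are placeholder values for demonstration,
# not actual Geekbench results.

def percent_behind(leader: float, challenger: float) -> float:
    """Percentage by which the challenger trails the leader's score."""
    return (leader - challenger) / leader * 100.0

leader_score = 2900      # placeholder for an A17 Pro-class single-core result
challenger_score = 2300  # placeholder for a Snapdragon 8 Gen 3-class result
print(f"challenger trails by {percent_behind(leader_score, challenger_score):.1f}%")
# -> challenger trails by 20.7%
```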

The future’s so bright

It’s easy to throw massive amounts of power at a chip to make it faster. What’s astonishing is that the very best consumer-grade desktop and laptop chips from Intel, AMD, and now Qualcomm struggle to break even with Apple Silicon at the same power-sipping levels.

The M3 typically surpasses them all in single-core tests, and Apple’s current chips already come in a close third behind Samsung’s and Qualcomm’s current chips in AI and machine learning benchmarks.

Geekbench score changes for single-core and multi-core tests of the base M1, M2, and M3 models

And all of this is before the big AI/Machine Learning boost we expect in the A18 Pro and M4 chips. It’s worth repeating that Apple accomplishes all this while offering significantly superior battery life — vital for smartphone and notebook users — and in nearly all cases, the same level of performance on the battery as when plugged in.

To be sure, Apple’s everything-on-one-chip design has some weak spots for some users. RAM and the on-board GPU can’t be upgraded later, and the chips themselves aren’t designed to be upgradeable within a given machine. These are advantages the other computer makers can offer that Apple cannot.

Apple also doesn’t seem interested in supporting eGPUs and other PCIe graphics options, even in its Mac Pro tower. That’s likely not a high priority for the company, given the low sales of dedicated desktops these days, but it hurts those who really need the capability.

Apple has recently added more graphics capabilities for the M3 generation, such as ray tracing and mesh shading. This has helped close the gap — a little — between Apple devices and PCs with dedicated graphics cards.

Introducing the M4

Given the industry shift to what Apple calls “machine learning” and the rest of the industry shortens to “AI,” the forthcoming M4 chip will likely play catch-up in that area. Apple’s next-generation chips will likely be announced at WWDC, and start appearing in new Macs starting in the fall.

Expectations are that the M4 will be more than an incremental improvement on the M3, instead featuring a redesign to accommodate a larger Neural Engine and various other AI enhancements. As with the previous generation, the M4 will come in at least three variants.

The base M4, used in products like the Mac mini and entry-level MacBook Air, is codenamed “Donan.” The M4 Pro is currently codenamed “Brava” and will be used in updated higher-end configurations of the MacBooks and Mac mini.

The next version of the Mac Studio is likely to use an M4 Max, while the Mac Pro will eventually be revised to get what is likely to be called the M4 Ultra, codenamed “Hidra.” Indications are that the M4 family will boost processing capabilities significantly as part of its “AI” upgrades, even though it is expected to still be created using the 3nm process.

The bottom line

This era marks the second hugely successful transition for Apple that grew out of its Newton and iPod days. Steve Jobs, Tony Fadell, and many others at Apple deserve a lot of credit for their early faith in the potential of Arm-based chips.

In the areas that the vast majority of computer users care about, Apple has become the industry standard to try and beat. There are still some areas where both Apple and its chip rivals could stand to improve, but the competition between them is good for everyone.

Now that Apple has had a taste of the high end of consumer-level computing performance, it’s unlikely the company will ever go back. The heads of hardware development at Apple frequently frame discussions of the future in optimistic terms, implying they expect the innovations to keep coming for years.

It has also been a remarkable journey for Arm, which originally leveraged RISC principles to design efficient chips for handheld devices.

Arm designs are now the basis for virtually every mobile chip, powering smartphones and tablets across iOS, Android, and Windows, as well as Apple’s notebook and desktop computers.

The current hardware team at Apple has described itself as “giddy” to be able to design custom chips tailored to the particular needs of each device, an ability Apple never had when it depended on other chipmakers.

Both its mobile devices and its Macs have seen big leaps in capability, right alongside massive improvements in battery life, as a result.


The long-term goal was for Apple to design its own chips

I’m doubtful this was true. I think that in 2006, Jobs had no plan for silicon independence and would have regarded that idea as fanciful. Intel was such a dominant force at that time — it was inconceivable that anyone could beat them.

The M series chips we have today would not exist without a decade of blunders by senior executives at Intel. 

It’s worth mentioning that ARM (not Arm, it is an acronym for Advanced RISC Machines or Acorn RISC Machines) was started in 1990 as a joint venture between Acorn, VLSI and… Apple.
Also, the PPC to Intel transition was not the first heart transplant for Apple; back in 1994 it abandoned Motorola’s 68xxx product line that it started using 10 years prior with the Macintosh and transitioned to PowerPC, and that transition was incredibly smooth and well executed. A 68k emulator was provided and legacy code was supported, and software makers were able to ship “fat binaries” that ran optimally on both architectures.
The transition to OS X/macOS, which is at its core a reskinning of NeXT’s operating system, brought with it NeXT’s own experience in supporting multiple CPUs (it initially shipped on 68k but eventually supported Sun SPARC, HP PA-RISC, and Intel) and made the transition from PPC to Intel painless. Apple’s know-how in this field is unique and spectacular.

Strange how this article spends no thought on the GPU design yet this is where the real computational horsepower and efficiency has shifted. Nothing runs as hot as an NVIDIA graphics card yet the M series GPUs sip power. Apple abandoned their own GPU design and went back to Imagination Technologies. So why no mention here?

y2an said:
Strange how this article spends no thought on the GPU design yet this is where the real computational horsepower and efficiency has shifted. Nothing runs as hot as an NVIDIA graphics card yet the M series GPUs sip power. Apple abandoned their own GPU design and went back to Imagination Technologies. So why no mention here?

https://discussions.apple.com/thread/253690215?sortBy=best

Apple licensed Imagination IP, for how long I don’t know, but the current designs are entirely Apple’s, which makes sense since the GPU cores are fully integrated into the M Series SOC architecture, as well as the A series.

If you have a link to support your statement, this would be a great time to respond.


I miss the IBM/Motorola G3/G4/G5 PowerPC Macs before the Intel Macs in the overview…! 🤔