Ha! OK. That's a relief to hear. I think I readily believed it because of some of the enterprise features companies push for and get.
@philsolis
Research Director / industry analyst @ IDC covering PC, tablet, and smartphone processors and accelerators, including integrated and discrete GPUs and NPUs ... and wireless & mobile connectivity technologies. NYC Metro (Long Island)
Then you can't take notes on your PC or reference documents (slide decks, Excel files, web pages, notes) for needed information. This might make sense for only a small subset of meeting types.
A lot of this depends on how much it will be running in the first place, and then how much power savings are gained (or latency reduced) on an NPU vs a CPU. Today it just doesn't matter so much but it eventually will.
It's good that Microsoft is doing this now. They can see at scale how different applications perform on different platforms with real-world usage. This is similar to how running genAI applications on a smartphone was said to require a 30+ TOPS (Int8) NPU, but Google did it with ~4 TOPS.
The direction Microsoft is headed with CoPilot+ will require NPUs, but they aren't really needed right now. Applications running on newer CPU cores are more efficient because of things like Arm Kleidi and SME2, but still not as efficient as NPUs can be.
As agentic AI applications are used more often or run continuously in the background, a device without an NPU will be at a disadvantage. The GPU is more powerful and will be used as needed, but it consumes more power - except for newer GPU cores designed with smaller AI compute cores inside, like embedded NPUs.
If battery life is important, then leveraging the NPU is key. For applications used occasionally, it will not matter. But for someone doing many video calls and using eye tracking (to make it look like you are looking at the camera rather than somewhere else on the screen, like at other people speaking), the NPU's efficiency matters.
Many third party smartphone applications just run on the CPU because the developers do not have the bandwidth/resources to develop for many different NPUs. CPU is easiest to develop software for, GPU is harder, NPU is the hardest. But for many AI workloads NPU is the most efficient.
I posted on LinkedIn about Panther Lake, aka the Intel Core Ultra series 3 line of mobile PC processors.
Intel on Thursday shared a variety of details about its forthcoming "Panther Lake" PC system-on-chip and "Clearwater Forest" server CPU - two products that represent a critical and long-awaited step in its heavily scrutinized comeback plan. www.crn.com/news/compone...
Analysis: When Qualcomm revealed its upcoming Snapdragon X2 Elite processors for Windows computers last week, the company made its biggest signal yet that it's coming for Intel and AMD in the commercial PC market - a big moment it's been preparing the channel for. www.crn.com/news/compone...
I posted a LinkedIn article about Qualcomm's four new SoCs - two for smartphones and two for PCs - and what it means for them.
AMD said Wednesday that enterprise AI startup Cohere will expand use of the chip designer's Instinct GPUs as part of a new agreement. www.crn.com/news/ai/2025...
My LinkedIn post about MediaTek's latest flagship smartphone SoC, the Dimensity 9500.
Google says more on desktop Android, Qualcomm "incredibly excited"
And to make it more fun, NPU TOPS is less important than the size and speed of memory - especially the bandwidth between the NPU and memory. That's why many newer SoC designs have doubled the interconnect bandwidth or more, and why some RISC-V companies have orders of magnitude more bandwidth.
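To illustrate why bandwidth can matter more than the headline TOPS number, here is a rough roofline-style back-of-envelope sketch. All figures (peak TOPS, bandwidth, ops per byte) are hypothetical, chosen only to show the shape of the trade-off, not measurements of any real NPU:

```python
def effective_tops(peak_tops: float, bandwidth_gbs: float,
                   ops_per_byte: float) -> float:
    """Sustained Int8 TOPS is the lesser of the compute peak and what
    memory bandwidth can feed (roofline model, all numbers hypothetical)."""
    # GB/s * ops/byte = Gops/s; divide by 1000 to get Tops/s
    bandwidth_limited = bandwidth_gbs * ops_per_byte / 1000.0
    return min(peak_tops, bandwidth_limited)

# A hypothetical 45-TOPS NPU on 60 GB/s shared memory, running a
# memory-bound workload that does ~1 operation per byte streamed:
low_bw = effective_tops(45.0, 60.0, 1.0)    # bandwidth-bound, far below peak
# Doubling the NPU-to-memory interconnect doubles the sustained rate:
high_bw = effective_tops(45.0, 120.0, 1.0)
print(low_bw, high_bw)
```

For low-arithmetic-intensity workloads the sustained rate tracks the interconnect, not the peak TOPS, which is consistent with the point above about why newer SoCs widen the NPU-memory path.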
It is widely accepted that always-running agentic AI applications will have to run on the NPU, using the CPU and GPU when they need more performance. And an NPU is now required for a Windows PC to qualify as CoPilot (<40 TOPS Int8) or CoPilot+ (40+ TOPS Int8).
There might just be a little bit of tension at most. It's really up to developers, though SDKs will steer them. Generally, going from CPU to GPU to NPU, workloads go from more general to more specific, from higher power consumption to lower, and from easier to program to harder.
I am assuming this would only be for some portion of the lineup. That might be the higher end only (but not the highest end, which gets paired with a dGPU), or it might be mixed within the higher end.
We wrote an IDC Link on this partnership between Nvidia and Intel. I focused on the PC portion - the opportunity that integrated GPU chiplets provides to Nvidia and what it does for Intel's competitiveness.
Nvidia CEO Jensen Huang said Thursday that the companyβs new deal with Intel will allow the two firms to create a βnew class of integrated graphics laptops,β representing what he called an βunderservedβ market that is βlargely unaddressed by Nvidia today.β www.crn.com/news/compone...
Mario Morales and I wrote an IDC Link (for our clients) based on our attendance at Arm's Client Tech Days 2025 event early September and their subsequent announcements:
my.idc.com/getdoc.jsp?c...
For clients of my research, I have a bit more that I say about Apple's chips including the N1:
my.idc.com/getdoc.jsp?c...
Apple limited the implementation of the Broadcom Wi-Fi chips to 160 MHz channels instead of using 320 MHz. I believe this was to allow leeway in performance when switching to their own Wi-Fi chips. They are still limiting channel sizes to 160 MHz with the N1, but that will probably change with the N2.
The iPhone 17 series all use Apple's new Wi-Fi chip (Wi-Fi 7 / Bluetooth 6 / Thread). This has been the other long-anticipated chip from Apple that replaces Broadcom Wi-Fi chips.
And the iPhone Air uses Apple's improved, second generation cellular modem, C1X. (The 16e uses C1.)
It also has satellite connectivity, which is key for connecting in the countless dead spots and coverage holes that exist.
The new Apple Watch has 5G now, and it is supposed to be more power-efficient. This means that instead of a 4G chip it is now using a 5G RedCap chip. It is too early for it to have an eRedCap chip, which will need at least another year to be commercially available.
The new research service has all of this and will also include content and data around integrated and discrete accelerators (GPU and NPU) for primary client devices (PCs, media tablets, and smartphones).
/3