Will we ever see the iPad Air's processor in a Mac?

CPU fabrication

It's interesting to note that Apple is already taking a GPU-heavy route with the new Mac Pro, which will feature two extremely high-end GPUs but only one CPU. However, while GPU-optimised computing is an appealing ideal for changing the way we think about computers, there are practical problems with introducing it to operating systems with long legacies, and that holds it back as a replacement for the CPU in more general-purpose use.

"The truth is that an awful lot of code cannot be moved over to the GPU," says Kanter. "The point of the CPU is that you spend a lot of resources compensating for poor programming. A lot of the things that architects spend a lot of time creating is just there to tolerate bad code, and the kicker is that if you run bad code on a GPU, it will just run terribly."

Of course, there's the question of which operating system would run on an Apple-chip-based laptop. If it ran OS X, it would have to be a new version adapted for Apple's completely different architecture compared to Intel's, and this would make the current range of OS X apps unavailable on it - Apple would have to supply a way for developers to recompile their apps for the new type of machine, though there's no guarantee that all developers would take advantage. It would also mean, effectively, a third platform for Apple to support: iOS, OS X for Intel, and OS X for Apple chips.

An alternative might be that instead of adapting OS X to run on the Apple chips, Apple could evolve iOS to include features we've come to expect and rely on, such as mouse support, true multitasking and the ability for apps to pass information to each other. But aside from these changes, there's also the problem that iOS apps wouldn't fit the widescreen format of laptops, so it would need either some form of windowing or more flexibility in apps' layout and shape, which again would mean more work for developers.

Cash in your chips

With all of the issues of developer support and technical capabilities, and the fact that Intel will continue to create more powerful chips, you might wonder why Apple would bother doing any of this at all.

There is another factor, though: cost. Intel's laptop chips cost manufacturers nearly $300. Apple's A-series chips are estimated to cost around $30 to produce. Granted, a more complex Apple chip would need to be significantly larger, and costs rise sharply with size. But say Apple were able to create a chip as powerful as the one in the current MacBook Air for around $150 - that would still make it roughly $150 cheaper than the Intel part. That could allow Apple to create a new lower-priced line of MacBooks or an even smaller Mac mini starting at $500/£400.
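
The back-of-envelope sums work out like this - all of these are the rough estimates quoted above, not confirmed component prices:

```python
# Rough component-cost estimates from the discussion above (USD).
intel_chip = 300       # approximate price manufacturers pay for an Intel laptop CPU
apple_a_series = 30    # estimated production cost of a current A-series chip
apple_scaled_up = 150  # hypothetical bigger Apple chip matching MacBook Air performance

saving_now = intel_chip - apple_a_series       # gap with today's A-series chip
saving_scaled = intel_chip - apple_scaled_up   # gap even after a 5x cost increase

print(saving_now, saving_scaled)   # 270 150
```

Even if scaling the chip up quintupled its cost, the bill-of-materials saving per machine would still be around $150 on these round numbers - enough headroom for a cheaper model or extra features at the same price.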

That said, Apple doesn't tend to introduce lower-priced options without strong reason, so perhaps it's more likely that Apple would keep the MacBook Airs at their current price and include features such as Retina screens or 4G as standard, giving it a huge feature lead on the competition for the price.

All of the above, though, assumes that an Apple-powered Mac would work roughly the same as current computers do. Things may change by then. Across the industry, 'the cloud' is starting to be used for actual computations, rather than just for storage. Apple's iCloud version of iWork does a lot of work on the server side, YouTube offers video editing through your web browser, and Autodesk already offers cloud rendering for some 3D modelling tasks.

We might see the return of the 'thin client', where your computer only needs a processor powerful enough to act as the interface for these apps, with all the hard work done elsewhere. In that case, even the current Apple chips might be suitable - you wouldn't need a fast computer, you'd just need fast internet. But even that still assumes a fairly traditional form factor for the Mac.

What if the very concept of what a computer is made up of changes over the next few years? Apple has been working on technologies that make wireless connections utterly configuration-free, and that make wireless video smooth and fast. Its iBeacons technology uses Bluetooth to let devices detect when they're close to each other and pass information back and forth appropriately, while Wi-Fi Direct is being used to establish AirPlay Wi-Fi video connections without a router.
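
That proximity sensing is usually done by comparing the Bluetooth signal strength (RSSI) a device hears against a calibrated power level the beacon broadcasts, using a log-distance path-loss model. A minimal sketch, assuming illustrative calibration values rather than any real beacon's figures:

```python
def estimate_distance(rssi_dbm, measured_power_dbm=-59, path_loss_n=2.0):
    """Rough beacon range from signal strength (log-distance path-loss model).

    measured_power_dbm: calibrated RSSI at 1 metre, which beacons broadcast
                        (-59 here is an assumed, illustrative value).
    path_loss_n: environment exponent, ~2.0 in free space; higher indoors.
    """
    return 10 ** ((measured_power_dbm - rssi_dbm) / (10 * path_loss_n))

print(estimate_distance(-59))   # at the calibrated power: 1.0 metre
print(estimate_distance(-79))   # 20 dB weaker reads as ~10x farther: 10.0
```

Estimates like this are noisy, which is why beacon APIs tend to report coarse bands such as 'immediate', 'near' and 'far' rather than exact distances - but coarse is enough to know you've sat down at your desk.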

These technologies could form the basis of a system of flexible computing - your desktop computer could simply be a large-screen display with Wi-Fi capabilities, plus a wireless mouse and keyboard (or whatever we use to control PCs in the future).

When you sit at your desk, the iPhone in your pocket detects the setup and puts a custom desktop display on the screen using wireless video, letting you control it with the mouse and keyboard - the iPhone becomes your computer hardware, capable of performing light tasks itself, with the heavy lifting done by servers in the cloud. In that case, there certainly would be an Apple chip in your future Apple computer, but the Mac may be long gone.

Matt Bolton
Managing Editor, Entertainment