
The iPhone X’s new neural engine exemplifies Apple’s approach to AI

Apple’s new iPhone X is billed as “the future of the smartphone,” with new facial recognition and augmented reality features presented as the credentials to back up this claim. But these features wouldn’t be half as slick without a little bit of hidden futurism tucked away in the phone’s new A11 Bionic chip: Apple’s new “neural engine.”

The neural engine is actually a pair of processing cores dedicated to handling “specific machine learning algorithms.” These algorithms are what power various advanced features in the iPhone, including Face ID, Animoji, and augmented reality apps. According to Apple’s press materials, the neural engine performs “up to 600 billion operations per second” to help speed AI tasks (although this stat is hard to put in proper context; operations-per-second is never the sole indicator of performance).

What’s clear about the neural engine is that it’s typical of Apple’s approach to artificial intelligence. AI has become increasingly central to smartphones, powering everything from speech recognition to tiny software tweaks. But to date, AI features on mobile devices have mostly been powered by the cloud. This saves your phone’s battery by not taxing its processor, but it’s less convenient (you need an internet connection for it to work) and less secure (your personal data is sent off to far-away servers).

Image: Apple unveils its new neural engine on stage at the iPhone X event.

Apple’s approach is typical of the company’s ethos: it’s focused on doing AI on your device instead. We saw this back in June 2016, when the company introduced “differential privacy” (using statistical methods to mask users’ identity when collecting their data), and at WWDC this year when it unveiled its new Core ML API. The “neural engine” is just a continuation of the same theme. By having hardware on the phone itself that’s dedicated to AI processing, Apple sends less data off-device and better protects users’ privacy.
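To make the on-device idea concrete, here is a minimal Core ML sketch in Swift. The model file ("FaceClassifier.mlmodelc") and the feature names ("image" and "label") are hypothetical placeholders, not anything Apple actually ships; the point is simply that loading the model and running the prediction happen entirely on the phone.

```swift
import Foundation
import CoreML
import CoreVideo

// Minimal sketch of on-device inference with Core ML, assuming a hypothetical
// compiled model ("FaceClassifier.mlmodelc") bundled with the app.
func classify(_ pixelBuffer: CVPixelBuffer) {
    guard let modelURL = Bundle.main.url(forResource: "FaceClassifier",
                                         withExtension: "mlmodelc") else {
        print("Model not found in app bundle")
        return
    }

    do {
        // Everything below runs locally: the image never leaves the device.
        let model = try MLModel(contentsOf: modelURL)

        // Wrap the input image in a feature provider keyed by the model's
        // (hypothetical) input name.
        let input = try MLDictionaryFeatureProvider(dictionary: [
            "image": MLFeatureValue(pixelBuffer: pixelBuffer)
        ])

        // Run the prediction; on newer chips, eligible work can be offloaded
        // from the CPU to dedicated silicon.
        let output = try model.prediction(from: input)
        print(output.featureValue(for: "label") ?? "no label produced")
    } catch {
        print("On-device inference failed: \(error)")
    }
}
```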

The iPhone-maker isn’t the only company to pursue this approach. Chinese tech giant Huawei put a similar “Neural Processing Unit” in its Kirin 970 system-on-chip, saying it can handle tasks like image recognition 20 times faster than a regular CPU. Google has developed its own method of on-device AI called “federated learning,” and has hinted that it too is working on mobile chips for machine learning. ARM has reconfigured its chip designs to favor artificial intelligence, and chipmaker Qualcomm says it’s only a matter of time before it, too, launches its own mobile AI chips.
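Federated learning, roughly speaking, keeps training on each device and only sends the resulting weight updates to a server, which averages them into a shared model. The Swift sketch below illustrates that averaging step conceptually; it is not Google's actual implementation or API, and the numbers are made up.

```swift
// Conceptual sketch of federated averaging: the server sees only weight
// vectors, never the raw data each client trained on.
struct ClientUpdate {
    let weights: [Double]   // locally trained model weights
    let sampleCount: Int    // how many examples the client trained on
}

func federatedAverage(_ updates: [ClientUpdate]) -> [Double] {
    guard let dimension = updates.first?.weights.count else { return [] }
    let totalSamples = Double(updates.reduce(0) { $0 + $1.sampleCount })
    var averaged = [Double](repeating: 0, count: dimension)

    // Weight each client's contribution by its share of the training data.
    for update in updates {
        let share = Double(update.sampleCount) / totalSamples
        for i in 0..<dimension {
            averaged[i] += update.weights[i] * share
        }
    }
    return averaged
}

// Two simulated clients contribute updates; the second has more data,
// so it pulls the global model further toward its weights.
let global = federatedAverage([
    ClientUpdate(weights: [0.2, 0.8], sampleCount: 100),
    ClientUpdate(weights: [0.4, 0.6], sampleCount: 300),
])
print(global)  // [0.35, 0.65]
```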

Because although the iPhone X’s neural engine is typical of Apple’s approach to AI, it’s not just the company’s particular quirk. It’s the future of the whole mobile industry.
