Gurman: iOS 18 AI features to be powered by ‘entirely on-device’ LLM, offering privacy and speed benefits

As indicated by much of the research material Apple has been publishing in recent months, the company is investing heavily in all sorts of artificial intelligence technologies. Apple will announce its AI strategy in June at WWDC, as part of iOS 18 and its other new OS versions.

In the latest Power On newsletter, Mark Gurman says to expect the new iPhone AI features to be powered entirely by an offline, on-device, large language model developed by Apple. You can expect Apple will tout the privacy and speed benefits of this approach.

9to5Mac previously found code references in iOS 17.4 to an on-device model called “Ajax”. Apple is also working on server-hosted versions of Ajax.

The downside of on-device LLMs is that they can’t match the power of models running on huge server farms, which are backed by tens of billions of parameters and continually updated data.

However, Apple engineers can take advantage of the full stack, with software tuned to the Apple silicon chips inside its devices, to make the most of an on-device approach. On-device models typically respond much faster than requests routed through a cloud service, and they also have the advantage of working offline in places with little or no connectivity.

While on-device LLMs may not have the rich embedded knowledge of something like ChatGPT for answering all sorts of random trivia questions, they can be tuned to be very capable at many tasks. An on-device LLM could, for instance, generate sophisticated auto-replies in Messages or improve Siri’s interpretation of many common requests.
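The tradeoff described above can be sketched as a simple routing decision. This is purely illustrative and not Apple’s implementation: the latency figures, the router heuristic, and all names here are hypothetical.

```python
# Illustrative sketch (hypothetical, not Apple's design): why an
# on-device-first approach helps with latency and offline availability.

ON_DEVICE_LATENCY_MS = 50    # assumed: local inference, no network hop
CLOUD_ROUND_TRIP_MS = 300    # assumed: request + server inference + response

def answer(prompt: str, online: bool) -> tuple[str, int]:
    """Route a request: prefer the local model, and fall back to a
    cloud model only when the task seems to need broad world knowledge
    and the network is actually available."""
    needs_world_knowledge = "trivia" in prompt  # crude stand-in for a real router
    if needs_world_knowledge and online:
        return ("cloud model answer", CLOUD_ROUND_TRIP_MS)
    return ("on-device model answer", ON_DEVICE_LATENCY_MS)

# The on-device path still responds with no connectivity at all:
reply, latency = answer("suggest a reply to this message", online=False)
```

The point of the sketch is the fallback shape: tasks like message auto-replies never leave the device, while knowledge-heavy queries can be escalated when a connection exists.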



