Apple makes its own AI for its devices


For now, Apple has stayed almost completely out of the big artificial intelligence race. Microsoft, Meta and Google roll out AI-related developments, small and large, on a monthly or even weekly basis, and according to recent reports Apple is also working on a solution tailored to its own needs, one that will differ from the usual approach.

It is common for Apple to try to set its developments apart from its rivals' solutions in some way, looking for an angle where it can shine even while trailing behind its competitors. The same may hold for artificial intelligence, and Bloomberg analyst Mark Gurman recently revealed what the special feature of Apple's AI effort will be.

The special generative AI, and the language model it is built on, will run entirely on smartphones and tablets.

Last year, Google began strongly emphasizing the importance of artificial intelligence with its Pixel smartphones and introduced several AI-linked developments. Then Samsung came along and took the idea even further, arguing that in the Galaxy S24 series it was not so much the hardware upgrades as the Galaxy AI that carried the show.

The Pixels and Samsung's Galaxy AI package already include smaller and larger functions that use generative AI running locally. Apple, on the other hand, may be able to claim that its own artificial intelligence handles every single function locally, and the company will likely emphasize that this is how the security of user data can be guaranteed. Apple always tries to position itself as security-oriented, and a common argument for on-device AI solutions is that sensitive data does not have to be shared with third parties.


In addition to better data protection, locally run AI operations have the advantage of working faster and more reliably. Functions that rely on cloud services are not accessible offline, may fail on a slow connection, and can in some cases generate unexpected data traffic, which can hit users hard depending on their subscription plan. Waiting times will also be significantly shorter if the iPhone and iPad can produce the results locally.

Creating a large language model of your own may seem like a huge task at first, but it is far from impossible: a working, compact LLM system can be put together in a few months even with modest resources. Over the past year and a half, several such solutions have appeared that were built in a matter of months; Elon Musk, for example, proudly pointed out last year that Grok-1 was created in roughly four months, and it is not the only such "lightning project". And Apple has an almost unlimited amount of resources (money) at its disposal.


The company is also said to be in talks with several major AI players, and Gurman currently sees Google as the most likely partner on that front. Several functions could be served by Gemini or another Google language model; for generative AI functions that require greater computing capacity, the partner's solution could come into play.

