iOS 18 Improves iPhone’s Neural Engine Performance by 25 Percent, Geekbench Score Suggests

iOS 18 – Apple’s latest operating system for the iPhone – appears to boost the handset’s Neural Engine performance by as much as 25 percent, according to claims online. The update, announced at the Worldwide Developers Conference (WWDC) 2024 on June 10, is expected to bring several artificial intelligence (AI) features to the iPhone, in addition to other quality-of-life improvements. As part of this AI initiative, the update has allegedly also given a boost to the handset’s Neural Engine.

iOS 18 brings better NPU performance

According to a post by user @lafaiel on X (formerly Twitter), an iPhone 15 Pro Max running iOS 18 scored 7,816 points in the Geekbench ML Score test compared to a score of 6,249 on iOS 17.5.1. This translates into an approximately 25 percent better performance in the handset’s machine learning capabilities.
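The claimed uplift checks out arithmetically. A quick calculation with the two reported Geekbench ML scores:

```python
# Reported Geekbench ML scores from the post (iPhone 15 Pro Max)
score_ios_17 = 6249  # iOS 17.5.1
score_ios_18 = 7816  # iOS 18 Developer Beta

# Percentage improvement of the newer score over the older one
improvement = (score_ios_18 - score_ios_17) / score_ios_17 * 100
print(f"{improvement:.1f}% faster")  # prints 25.1% faster
```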

The benchmark tested Core ML Neural Engine inference, which covers the machine learning tasks the handset offloads to dedicated hardware, in this case the NPU (or, in Apple’s terms, the Neural Engine) rather than the CPU. An NPU works along similar lines to a Graphics Processing Unit (GPU), but instead of accelerating graphics, it accelerates neural network operations.

iOS 18 is set to bring several notable upgrades to the iPhone, at the centre of which are the new AI capabilities that the company calls “Apple Intelligence”. Thanks to these, the update will bring system-wide text-generation and summarisation capabilities. The iPhone will also support image generation with the help of a new app called Image Playground. Furthermore, Siri – Apple’s voice assistant – is also getting smarter with On-Screen Awareness.

Compatibility

Notably, only the iPhone 15 Pro and iPhone 15 Pro Max, powered by the A17 Pro SoC, will receive the new AI features. Although Apple has seeded the iOS 18 Developer Beta 1 update to users around the world, Apple Intelligence features will be rolled out later this year and initially only in English (US).

The iOS 18 compatibility list includes the iPhone XR and later models, up to the iPhone 15 Pro Max.


Affiliate links may be automatically generated – see our ethics statement for details.

Check out our Latest News and Follow us at Facebook

Original Source

Apple Details Its Private Cloud Compute System, Promises Stateless Computation and Verifiable Transparency

Apple Intelligence took centre stage at this year’s Worldwide Developers Conference (WWDC) 2024, highlighting new artificial intelligence (AI) features that will debut with the upcoming iOS 18, iPadOS 18, and macOS Sequoia. During the event, the tech giant revealed that some of the processing for the AI features will be done on-device, while the more complex tasks will be handled by its Private Cloud Compute (PCC) system. Apple has also shared details of its PCC architecture and claimed that there is a heavy focus on data privacy and safety.

Apple shares Private Cloud Compute details

Craig Federighi, the Senior Vice President of Software Engineering at Apple, said during the event, “Your data is never stored or made accessible to Apple”. While Apple Intelligence has created a sense of curiosity among many users, some have also appeared sceptical of the company’s ability to fulfil these claims. Among them was Tesla CEO Elon Musk who posted on X (formerly known as Twitter), “It’s patently absurd that Apple isn’t smart enough to make their own AI, yet is somehow capable of ensuring that OpenAI will protect your security & privacy!”. Notably, Apple has stated that it is using its in-house AI models for both on-device and server-based computing.

Now, Apple has shed more light on how its Private Cloud Compute will function in a blog post. Explaining the data security issues with traditional cloud servers, the tech giant claimed that it is building custom infrastructure with key changes to keep user data secure. There are three important pillars — Stateless computation, Non-targetability, and Verifiable transparency.

Private Cloud Compute’s Stateless computation

Traditionally, cloud servers have a straightforward workflow. Data is sent to the servers, where it is first logged against the user’s credentials; this lets the servers return the result to the right user after running the task. Cloud servers also store some or all of the data to offer it back to the user as a backup, in case the information is requested again (for instance, after file corruption or accidental deletion). This also helps with cost optimisation, as the servers do not have to compute the same data again.
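The retention-for-reuse behaviour described above can be sketched as a toy cache in front of an expensive computation. This is an illustration of conventional cloud design in general, not any specific provider’s code:

```python
# Toy stateful server: results are cached per (user, request) so a
# repeated request is served from storage instead of being recomputed.
cache: dict = {}

def expensive_task(request: str) -> str:
    # Stand-in for a costly job such as model inference.
    return request.upper()

def handle(user: str, request: str) -> str:
    key = (user, request)
    if key not in cache:
        cache[key] = expensive_task(request)  # user data is retained here
    return cache[key]

handle("alice", "summarise this")
print(cache)  # the user's data persists on the server after the response
```

The cost saving comes precisely from the retention: the second identical request never touches `expensive_task`, but the user’s data now lives on the server indefinitely.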

In contrast, Apple said that its Private Cloud Compute runs “stateless data processing”, where the user’s device sends data to PCC for the sole purpose of fulfilling the user’s inference request. It also claimed that the user data remains on the server only until the response is returned to the device, and that “no user data is retained in any form after the response is returned.” The company added that user data is not retained even via logging or for debugging.

It also claimed that even Apple staff with privileged runtime access cannot bypass the stateless computation guarantee.
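A stateless handler, in the sense Apple describes, keeps nothing once the response is returned. A minimal toy sketch of the idea (not PCC’s actual code):

```python
def stateless_handle(request: str) -> str:
    # All user data lives only in this function's local scope: nothing is
    # written to logs, caches, or module-level state, so the data is gone
    # as soon as the function returns the response.
    response = request.upper()  # stand-in for the inference step
    return response

print(stateless_handle("summarise this"))  # prints SUMMARISE THIS
```

The same request arriving twice is simply computed twice; the privacy guarantee is bought at the cost of the caching optimisation described earlier.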

Private Cloud Compute’s Non-targetability

Cloud servers also face threats externally from hackers and bad actors who try to find vulnerabilities to breach the system. Apple said it has developed two measures to defend user data from attackers.

First, the tech giant is relying on the protections built into Apple silicon and the connected hardware to make hardware attacks rare. Drawing on its experience in running cloud operations, Apple says it has designed hardware that narrows the window for cyberattacks. Further, it adds that any hardware attack at scale would be both “prohibitively expensive and likely to be discovered.”

For small-scale attacks, Apple claims that its extensive revalidation at data centres (once data arrives and before it reaches cloud computers for processing) ensures that hackers cannot target the data of a specific user.

“To guard against smaller, more sophisticated attacks that might otherwise avoid detection, Private Cloud Compute uses an approach we call target diffusion to ensure requests cannot be routed to specific nodes based on the user or their content,” the tech giant added.
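One way to read “target diffusion” is that routing depends on fresh per-request values rather than on the user or the content of the request. The sketch below is a speculative illustration of that property, not Apple’s actual scheme:

```python
import hashlib
import secrets

NODES = ["node-a", "node-b", "node-c"]

def route(request_nonce: bytes) -> str:
    # Routing is keyed on a random per-request nonce, never on user
    # identity or request content, so an attacker cannot steer one
    # user's traffic toward a chosen (e.g. compromised) node.
    digest = hashlib.sha256(request_nonce).digest()
    return NODES[int.from_bytes(digest, "big") % len(NODES)]

# Two requests from the same user land on nodes chosen independently
# of who sent them or what they asked:
print(route(secrets.token_bytes(16)))
print(route(secrets.token_bytes(16)))
```

Because nothing user-identifiable enters the routing decision, compromising any single node yields only a diffuse, unpredictable slice of traffic rather than a chosen victim’s requests.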

Private Cloud Compute’s Verifiable transparency

Finally, Apple is inviting security researchers to verify the end-to-end security and privacy measures of the Private Cloud Compute system. It claimed that once PCC is launched, it will make software images of every production build of the cloud system publicly available for security research.

To further aid research, Apple will publish every production Private Cloud Compute software image for binary inspection, spanning the OS, applications, and all other executables running on the nodes. Researchers will be able to verify the images against the measurements in the transparency log, and will be offered rewards for finding flaws in the system.
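Verification against a transparency log boils down to hashing the published image and comparing the result with the logged measurement. A simplified illustration with made-up build names and data:

```python
import hashlib

# Hypothetical transparency log: build name -> expected SHA-256 measurement
transparency_log = {
    "pcc-build-1.0": hashlib.sha256(b"released software image").hexdigest(),
}

def verify_image(build: str, image: bytes) -> bool:
    # A researcher recomputes the hash of the downloadable image and checks
    # that it matches the measurement published for that production build.
    return hashlib.sha256(image).hexdigest() == transparency_log.get(build)

print(verify_image("pcc-build-1.0", b"released software image"))  # prints True
print(verify_image("pcc-build-1.0", b"tampered software image"))  # prints False
```

The point of the public log is that Apple cannot quietly ship a modified image: any binary whose hash is absent from the log is immediately detectable.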



Apple Planning AI Integrations With Google Gemini and Other AI Models in the Future: Report

Apple surprised many when it announced the integration of OpenAI’s artificial intelligence (AI) chatbot ChatGPT across its upcoming operating systems during the Worldwide Developers Conference (WWDC) 2024 keynote session. This is the first time the tech giant has given a third party system-wide access to its platform. However, this is just the beginning, if a new report is to be believed. The company is reportedly ready to open its platform to further AI integrations with Google’s Gemini and other leading AI models.

Apple reportedly open for AI integration with Google Gemini

According to a report by 9to5Mac, Apple shared its future AI plans in a post-WWDC keynote discussion. During the session, which was moderated by YouTuber iJustine, Craig Federighi, Senior Vice President of Software Engineering at Apple, hinted that Gemini could also be integrated with Apple platforms in the future.

“And so we may look forward to doing integrations with different models like Google Gemini in the future. I mean, nothing to announce right now, but that’s our direction,” Federighi was quoted as saying.

Reportedly, Federighi’s mention of Google Gemini was in the context of Apple’s intention to bring multiple AI models to iOS, iPadOS, macOS, and its other operating systems. He highlighted that the company wanted to let users pick an AI model of their choice, as different large language models (LLMs) can be adept at different tasks.

He also added that the decision to onboard ChatGPT first was taken as the tech giant wanted to “start with the best,” as per the publication.

Apple integrates ChatGPT across operating systems

During the WWDC 2024 keynote session, Apple announced that it was integrating ChatGPT access within its upcoming operating systems for iPhone, iPad, and Mac devices. The chatbot will bring various AI features across text and images. Siri will also be able to tap into ChatGPT’s knowledge base to bring answers to various user queries. However, users will have to opt-in before Siri can share the query and any photo or document with the chatbot.

ChatGPT’s capabilities will also be available for users via the systemwide Writing Tools. The Compose feature will also allow users to generate images using the chatbot. Notably, Apple is leveraging GPT-4o-powered ChatGPT for its AI features.

