Apple will pay you up to $1 million for hacking into its private AI cloud
Apple is offering a bounty of up to $1 million to anyone who can successfully hack its AI servers and identify vulnerabilities in Private Cloud Compute, the private AI cloud that will soon power AI features across its ecosystem. The news coincided with Apple CEO Tim Cook's announcement that Apple Intelligence was arriving on iPhones with the new iOS 18.1 update.
In a blog post titled “Security research on Private Cloud Compute,” Apple detailed that it would reward up to $1 million for exploits capable of remotely executing malicious code on its Private Cloud Compute servers. Additionally, researchers could earn up to $250,000 for privately disclosing exploits that can access sensitive user data or the prompts customers enter into Apple’s private cloud, the iPhone giant stated.
Apple’s Private Cloud Compute runs on its custom silicon servers, which it describes as “the most advanced security architecture ever deployed for cloud AI compute at scale.”
“In the weeks after we announced Apple Intelligence and PCC, we provided third-party auditors and select security researchers early access to the resources we created to enable this inspection, including the PCC Virtual Research Environment,” Apple said in the blog.
Beyond the top-tier rewards, Apple will also pay up to $150,000 for exploits that can access sensitive user data from a privileged network position.
Apple highlighted its commitment to protecting user privacy and security, noting, “We award maximum amounts for vulnerabilities that compromise user data and inference request data outside the [Private Cloud Compute] trust boundary.” The top $1 million reward falls under the “remote attack on request data” category, while the $250,000 tier covers exploits that expose sensitive data or user requests beyond that boundary.
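To make the idea of a trust boundary concrete, here is a minimal conceptual sketch in Swift using CryptoKit's HPKE API. It assumes a simplified model in which a device encrypts an inference request to the public key of a single attested server node; all names here are illustrative, and this is not Apple's actual PCC protocol.

```swift
import CryptoKit
import Foundation

// Conceptual sketch only (requires macOS 14 / iOS 17 for CryptoKit's HPKE).
// It models the general pattern of sealing an inference request so that only
// a key holder inside the trust boundary can read it. All names are
// illustrative assumptions, not Apple's real PCC protocol.
do {
    // Stand-in for a PCC node's key pair. In the real system, the device
    // would only accept a public key delivered inside a verified hardware
    // attestation of the node.
    let nodePrivateKey = Curve25519.KeyAgreement.PrivateKey()
    let attestedNodeKey = nodePrivateKey.publicKey

    let suite = HPKE.Ciphersuite.Curve25519_SHA256_ChachaPoly
    let context = Data("pcc-inference-request".utf8) // illustrative domain separator

    // Device side: seal the user's prompt to the node's public key (HPKE base mode).
    var sender = try HPKE.Sender(recipientKey: attestedNodeKey,
                                 ciphersuite: suite,
                                 info: context)
    let sealedPrompt = try sender.seal(Data("summarize my notes".utf8))

    // Server side: only the private-key holder, inside the trust boundary,
    // can recover the request. A top-tier exploit in Apple's terms would
    // mean recovering request contents without holding this key.
    var recipient = try HPKE.Recipient(privateKey: nodePrivateKey,
                                       ciphersuite: suite,
                                       info: context,
                                       encapsulatedKey: sender.encapsulatedKey)
    let prompt = try recipient.open(sealedPrompt)
    print(String(decoding: prompt, as: UTF8.self)) // "summarize my notes"
} catch {
    print("HPKE error: \(error)")
}
```

The design point the sketch illustrates is that request data is readable only by keys held inside the boundary, which is why Apple's maximum rewards target exploits that pull user data or inference requests out of it.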
“Because we care deeply about any compromise to user privacy or security, we will consider any security issue that has a significant impact to PCC for an Apple Security Bounty reward, even if it doesn’t match a published category,” Apple further stated.
This program marks the latest expansion of Apple’s bug bounty initiative, which incentivizes hackers and security researchers to responsibly disclose flaws that could endanger user privacy. Apple has previously supported this kind of research by releasing special researcher-only iPhones for controlled vulnerability testing.
In its blog, Apple shared further details on Private Cloud Compute’s security architecture, open-source code, and documentation, describing the service as an extension of users’ on-device AI, branded Apple Intelligence, that can handle more demanding AI tasks while safeguarding user privacy.
Apple explained it will “evaluate every report according to the quality of what’s presented, the proof of what can be exploited and the impact to users.”
Those interested in participating in the bounty program can find more details and submit their research through the Apple Security Bounty page.