Apple just made an announcement that shows it means business when it comes to keeping Apple Intelligence secure. The company is offering a bug bounty of up to $1 million to anyone who can hack its AI cloud, known as Private Cloud Compute (PCC). These servers take over Apple Intelligence tasks when on-device AI capabilities aren't up to the job — but that handoff carries risks, which is why Apple's bug-squashing mission seems like a good idea.

According to a recent Apple Security blog post, Apple has created a virtual research environment and opened the doors to the public, letting anyone examine the code and judge its security. PCC was initially available only to a select group of security researchers and auditors, but now anyone can take a crack at hacking Apple's AI cloud.

Many Apple Intelligence tasks are said to run on-device, but for more complex requests, PCC steps in. Apple uses end-to-end encryption and makes the data available only to the user, ensuring that your private requests remain just that — private. Still, given the sensitive data AI might handle, whether on Macs or iPhones, users are right to be concerned about that data leaving their device and ending up in the wrong hands.

That's presumably part of why Apple is now inviting anyone who's interested and offering up to $1 million for hacking Private Cloud Compute. The company provides access to the source code for some of the most important parts of PCC, making it possible for researchers to dig in and hunt for flaws.
