Apple offers $1 million for anyone who can uncover vulnerabilities in its new AI system (Unsplash)
Apple has launched a security challenge for its new artificial intelligence system, promising up to $1 million to anyone who can find vulnerabilities in Private Cloud Compute.
The technology was built to process artificial intelligence requests privately, with the promise that no personal information is retained or accessible to anyone, including Apple itself.
Private Cloud Compute is a cloud infrastructure with end-to-end encryption and hardened security, designed to handle complex AI requests privately and to delete all data immediately after each request is fulfilled.
Apple describes this system as the most advanced security technology for AI cloud computing and encourages the cybersecurity community to test this claim by attempting to breach the system.
To facilitate testing, Apple is giving participants access to Private Cloud Compute source code and a security guide, with rewards ranging from $100,000 to $1 million for those who uncover significant vulnerabilities. In doing so, Apple aims not only to validate its privacy and security claims but also to strengthen the system through contributions from the security research community.
Source: LADbible | Photo: Unsplash | This content was created with the help of AI and reviewed by the editorial team.