Mythos is reportedly being used by a party without permission, raising questions about whether Anthropic can control the model's distribution.
An unknown group of 'unauthorized' users claims to have access to Mythos, Anthropic's much-discussed AI cybersecurity model. The group reportedly reached the model through a third-party vendor, without Anthropic's permission or knowledge. As proof, Bloomberg was shown screenshots and a live demonstration of Mythos.
Since announcing Mythos, Anthropic has kept the model behind closed doors. Mythos is said to be so effective at detecting software vulnerabilities that Anthropic wants to prevent it from falling into the wrong hands at all costs. The company says it is investigating the matter.
One of the unauthorized users is reportedly employed by a third-party vendor that evaluates Anthropic's models, which gives them access to those models before launch. Even so, to reach Mythos they still had to track down the model's exact location, which they apparently succeeded in doing. The group claims to have no malicious intent.
Curse or blessing?
Even if that claim of benign intent is true, the news raises questions about whether the distribution of Mythos is truly watertight. Through 'Project Glasswing,' only parties authorized by Anthropic can gain access to Mythos, including Apple, Cisco, and Amazon. As Anthropic's preferred cloud partner, AWS is allowed to offer Mythos via Bedrock, but only to a limited number of approved organizations.
The launch of Mythos has caused quite a stir in the security world. The model appears to be remarkably good at detecting software vulnerabilities, including flaws that have gone unnoticed by human reviewers for years. That makes Mythos a powerful tool for securing software, but also a potential catastrophe if it falls into the wrong hands. Even US government agencies want access to Mythos, despite their own embargo.
As such, Mythos promises to be a crucial test for Anthropic: can the company keep the model on a leash?
