Anthropic revokes OpenAI's access to Claude


Anthropic revoked OpenAI's API access to its models on Tuesday, according to multiple sources familiar with the matter who spoke to Wired. OpenAI was informed that its access was cut off for violating the terms of service.

Anthropic spokesperson Christopher Nulty said in a statement: "Claude Code has become the go-to choice for coders everywhere, so it was no surprise to learn OpenAI's own technical staff were also using our coding tools ahead of the launch of GPT-5. Unfortunately, this is a direct violation of our terms of service."

Under Anthropic's commercial terms of service, customers are barred from using the service to build a competing product or service, including training competing AI models, or to "reverse engineer or duplicate" the services. The change to OpenAI's Claude access comes as the ChatGPT maker prepares to release a new AI model, GPT-5.

According to sources, OpenAI was connecting Claude to its own internal tools via the API rather than using the regular chat interface. This allowed the company to run tests evaluating Claude's capabilities against its own AI models in areas such as coding and creative writing, and to check how Claude responds to safety prompts involving categories such as CSAM, self-harm, and defamation. The results help OpenAI compare its own models' behavior under the same conditions and make adjustments where necessary.

"It's industry standard to evaluate other AI systems in order to benchmark progress and drive improvements," said Hannah Wong, OpenAI's chief communications officer.

Anthropic says it will continue to ensure OpenAI has API access "for the purposes of benchmarking and safety evaluations, as is standard practice across the industry." The company did not respond to Wired's request to clarify whether OpenAI's current restriction on the Claude API affects this.

Cutting off competitors' API access has been a tactic in the tech industry for years. Facebook did the same to Twitter's Vine (which led to allegations of anticompetitive behavior), and last month Salesforce restricted competitors' access to certain data via the Slack API. Nor is this a first for Anthropic: last month the company restricted Windsurf's direct access to its models amid rumors that OpenAI was set to acquire the startup. (That deal ultimately fell through.)

Jared Kaplan, Anthropic's chief science officer, spoke to TechCrunch at the time about cutting off Windsurf's access to Claude, saying, "I think it would be odd for us to be selling Claude to OpenAI."

The day before disconnecting OpenAI from the Claude API, Anthropic announced new rate limits on Claude Code, its AI coding tool, citing explosive usage and, in some cases, violations of its terms of service.
