
OpenAI launches a red teaming network to make its models more robust

by WeeklyAINews

In its ongoing effort to make its AI systems more robust, OpenAI today launched the OpenAI Red Teaming Network, a contracted group of experts to help inform the company's AI model risk assessment and mitigation strategies.

Red teaming is becoming an increasingly key step in the AI model development process as AI technologies, particularly generative technologies, enter the mainstream. Red teaming can catch (albeit not necessarily fix) biases in models like OpenAI's DALL-E 2, which has been found to amplify stereotypes around race and sex, as well as prompts that can cause text-generating models, including ChatGPT and GPT-4, to ignore safety filters.

OpenAI notes that it has worked with outside experts to benchmark and test its models before, including people participating in its bug bounty program and researcher access program. However, the Red Teaming Network formalizes those efforts, with the goal of "deepening" and "broadening" OpenAI's work with scientists, research institutions and civil society organizations, the company says in a blog post.

"We see this work as a complement to externally specified governance practices, such as third-party audits," OpenAI writes. "Members of the network will be called upon based on their expertise to help red team at various stages of the model and product development lifecycle."

Outside of red teaming campaigns commissioned by OpenAI, the company says that Red Teaming Network members will have the opportunity to engage with one another on general red teaming practices and findings. Not every member will be involved with every new OpenAI model or product, and time contributions, which could be as few as 5 to 10 hours a year, will be determined with members individually, OpenAI says.


OpenAI is calling on a broad range of domain experts to participate, including those with backgrounds in linguistics, biometrics, finance and healthcare. It isn't requiring prior experience with AI systems or language models for eligibility. But the company warns that Red Teaming Network opportunities may be subject to non-disclosure and confidentiality agreements that could impact other research.

"What we value most is your willingness to engage and bring your perspective to how we assess the impacts of AI systems," OpenAI writes. "We invite applications from experts from around the world and are prioritizing geographic as well as domain diversity in our selection process."

The question is, is red teaming enough? Some argue that it isn't.

In a recent piece, Wired contributor Aviv Ovadya, an affiliate with Harvard's Berkman Klein Center and the Centre for the Governance of AI, makes the case for "violet teaming": identifying how a system (e.g. GPT-4) might harm an institution or public good, and then supporting the development of tools that use that same system to defend the institution and public good. It's a sensible idea. But, as Ovadya points out in his column, there are few incentives to do violet teaming, let alone to slow down AI releases enough to give it sufficient time to work.

Red teaming networks like OpenAI's appear to be the best we'll get, at least for now.

