
Here’s how ChatGPT-maker OpenAI says it tackles biases – Times of India


ChatGPT was launched last year. While some have praised the AI chatbot’s ability to deliver human-like responses, others have accused both OpenAI and ChatGPT of bias. The company has now addressed the issue, explaining how ChatGPT’s behaviour is shaped and how it plans to improve the chatbot’s default behaviour.
“Since our launch of ChatGPT, users have shared outputs that they consider politically biased, offensive, or otherwise objectionable. In many cases, we think that the concerns raised have been valid and have uncovered real limitations of our systems which we want to address,” the company said in a blog post.
OpenAI also said that it has seen “a few misconceptions about how our systems and policies work together to shape the outputs you get from ChatGPT.”

“Biases are bugs”
In the blog, OpenAI acknowledged that many are rightly worried about biases in the design and impact of AI systems. It added that the AI model is trained on the data available and on input from the public who use or are affected by systems like ChatGPT.
“Our guidelines are explicit that reviewers should not favour any political group. Biases that nevertheless may emerge from the process described above are bugs, not features,” the startup said. It further said it believes technology companies must be accountable for producing policies that stand up to scrutiny.
“We are committed to robustly addressing this issue and being transparent about both our intentions and our progress,” it noted.
OpenAI said that it is working to improve the clarity of these guidelines and, based on the learnings from the ChatGPT launch, it will provide clearer instructions to reviewers about potential pitfalls and challenges tied to bias, as well as controversial figures and themes.
As part of its transparency initiatives, OpenAI is also working to share aggregated demographic information about the reviewers “in a way that doesn’t violate privacy rules and norms,” because this is an additional source of potential bias in system outputs.
The company is also researching how to make the fine-tuning process more understandable and controllable.
