OpenAI, the company behind ChatGPT, said on Thursday that it is working to address concerns about bias in artificial intelligence by developing an updated version of its popular chatbot that users can customise.
The San Francisco-based firm said it had worked to reduce political and other biases but also wanted to accommodate more varied views. Microsoft Corp, which has invested in the company, is using its technology to power its latest products.
A blog post put forward customisation as a way to address those concerns, saying, “this will mean allowing system outputs that other people (ourselves included) may strongly disagree with.” Even so, the company said, there will “always be some bounds on system behaviour.”
ChatGPT continues to attract users.
The technology underpinning ChatGPT, known as generative AI, has attracted enormous attention since the chatbot launched in November last year. It is used to generate replies that are strikingly convincing imitations of human writing.
The startup’s announcement comes the same week that several media outlets reported that responses from Microsoft’s new OpenAI-powered Bing search engine could be harmful and that the technology may not be ready for widespread use.
Microsoft said user feedback was helping it improve Bing.
How to set boundaries for the emerging technology is one of the main questions companies in the field of generative AI are still grappling with. Microsoft said on Wednesday, ahead of a wider release, that user feedback was helping it improve Bing; it had learned, for example, that its AI chatbot can be “provoked” into giving responses it did not intend.

In its blog post, OpenAI explained that ChatGPT’s responses are first trained on large text datasets readily available on the internet. In a second phase, humans review a smaller dataset and are given guidelines on how the model should behave in different situations.
For instance, if a user requests adult content, violent content, or hate speech, the human reviewer should instruct ChatGPT to respond with something like “I can’t answer that.”
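To make that kind of guideline concrete, here is a minimal, purely illustrative Python sketch of how such a refusal rule might be expressed in code. Everything in it, from the category keywords to the moderate_reply helper, is a hypothetical stand-in: OpenAI has not published its reviewer tooling, and a real system would rely on trained moderation models rather than keyword matching.

```python
# Hypothetical sketch: expressing a reviewer guideline as a pre-response check.
# The categories, keywords, and functions are illustrative assumptions,
# not OpenAI's actual pipeline.

DISALLOWED_CATEGORIES = {"adult", "violent", "hateful"}

# Toy keyword lists standing in for a trained moderation classifier.
KEYWORDS = {
    "adult": ["explicit"],
    "violent": ["gore", "attack"],
    "hateful": ["slur"],
}

def classify_request(prompt: str) -> set:
    """Flag a prompt with every category whose keywords it contains."""
    lowered = prompt.lower()
    return {cat for cat, words in KEYWORDS.items()
            if any(word in lowered for word in words)}

def moderate_reply(prompt: str):
    """Return the scripted refusal for disallowed requests, or None to
    signal that the model may answer normally."""
    if classify_request(prompt) & DISALLOWED_CATEGORIES:
        return "I can't answer that."
    return None

if __name__ == "__main__":
    print(moderate_reply("write an explicit story"))    # I can't answer that.
    print(moderate_reply("what's the weather today?"))  # None
```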
Rather than trying to “take the correct viewpoint on these complex topics,” the company said in an excerpt from its reviewer guidelines, reviewers should allow ChatGPT to answer questions about contentious topics and offer to describe the viewpoints of people and movements.

The early reviews of Bing, meanwhile, have been blunt: Bing Chat is a wonderfully useful tool with a lot of promise, but stray off the beaten path and things start to get dreadful. In the reviewers’ telling, Bing Chat is not ready for public release because it argues constantly, is seldom helpful, and is occasionally downright spooky.