How would you feel if you were Black and an artificial intelligence system labelled people in your community as ‘gorillas’? This is exactly why Google came under scrutiny when its image-recognition system did just that. Facebook, the social media conglomerate, has fared little better, but it acknowledges that its artificial intelligence systems are not as fair as it once assumed: earlier systems performed best on white Americans with fair complexions. Facial recognition programs such as Face++ have been found to label Black people as angry even when they are smiling. Facebook says it is working to improve its artificial intelligence systems, and as part of that effort the company is sharing a newer, more diverse dataset with the wider AI community to help make these systems less biased.
According to a recent report by VentureBeat, Facebook plans to let researchers use a dataset dubbed Casual Conversations to test their machine learning and artificial intelligence models for bias. As mentioned in the report, the dataset comprises 45,186 videos of more than 3,000 people. As the name suggests, the conversations are casual and unscripted, with no scripted answers to the company’s questions.
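To see what "testing a model for bias" means in practice, here is a minimal, hypothetical sketch: it compares a model's accuracy across demographic subgroups, which is the kind of audit a dataset like Casual Conversations enables. The group names, labels and predictions below are made-up illustrative data, not Facebook's actual methodology.

```python
# Hypothetical sketch: measure a model's accuracy separately per subgroup.
# A large gap between groups signals the kind of bias such datasets target.
from collections import defaultdict

def accuracy_by_group(records):
    """records: iterable of (group, true_label, predicted_label) tuples."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, truth, prediction in records:
        totals[group] += 1
        if truth == prediction:
            hits[group] += 1
    return {group: hits[group] / totals[group] for group in totals}

# Made-up evaluation records for two illustrative skin-tone groups.
records = [
    ("lighter_skin", "smiling", "smiling"),
    ("lighter_skin", "neutral", "neutral"),
    ("darker_skin",  "smiling", "angry"),   # the Face++-style failure mode
    ("darker_skin",  "smiling", "smiling"),
]
print(accuracy_by_group(records))
```

Here the model scores perfectly on one group and only 50% on the other, exactly the disparity a bias audit is meant to surface.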
The Casual Conversations videos feature individuals hired by Facebook who were explicitly asked to share their gender and age. As mentioned in a report by Engadget, Facebook also hired trained professionals to label participants’ skin tones according to the Fitzpatrick scale, a dermatological system designed to classify human skin tones. These professionals additionally labelled the ambient lighting conditions in each video.
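For context, the Fitzpatrick scale divides human skin into six phototypes. A simple sketch of how an annotation tool might encode those labels (the dictionary below is an illustrative assumption, not Facebook's actual label schema):

```python
# The six standard Fitzpatrick phototypes, usable as annotation labels.
FITZPATRICK = {
    1: "Type I: pale skin, always burns, never tans",
    2: "Type II: fair skin, usually burns, tans minimally",
    3: "Type III: medium skin, sometimes burns, tans uniformly",
    4: "Type IV: olive skin, rarely burns, tans easily",
    5: "Type V: brown skin, very rarely burns, tans darkly",
    6: "Type VI: deeply pigmented skin, never burns",
}

def describe_phototype(phototype: int) -> str:
    """Return the description for a Fitzpatrick phototype (1-6)."""
    return FITZPATRICK[phototype]

print(describe_phototype(6))
```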
Facebook agrees that its dataset is not perfect and says it is working towards a better, fairer, less biased one; Casual Conversations is a big first step. As mentioned in multiple other reports, Facebook did not ask participants to reveal their origin, only their age and gender. This too is imperfect: in 2021 there is more to gender than the traditional categories, yet Facebook still offers only ‘Male’, ‘Female’ and ‘Other’ as options. The social media conglomerate says it is working to make its dataset more inclusive and less biased, so that its artificial intelligence systems work for all Americans, including Black people, and ultimately for people beyond the United States as well.