AI-generated images have experts worried


US: People have recently been feeding cutting-edge AI systems, such as OpenAI’s DALL-E 2 and Google Research’s Imagen, text descriptions that the algorithms use to create extraordinarily detailed, realistic-looking images.

The resulting images can be comical or even reminiscent of great art, and they are being shared widely on social media, including by prominent figures in the tech industry. DALL-E 2, a newer and more capable version of a comparable AI system OpenAI released last year, can also add objects to an image or remove them.

In the future, such on-demand image generation could serve as a powerful tool for all sorts of creative work, including art and advertising; DALL-E 2 and a similar system, Midjourney, have already been used to help create magazine covers. OpenAI and Google have suggested that the technology could be used for image editing or stock-photo creation.

There is no public release date for DALL-E 2 or Imagen at this time. Like many systems already in use, they can produce results that reflect the gender and cultural biases of the data on which they were trained, data that includes millions of images scraped from the internet.


Because these systems can generate a wide variety of images from text and can be run automatically, there is a fear that they could be used to spread harmful stereotypes and automate bias on a vast scale. They could also be used for malicious purposes, such as spreading misinformation.

The public has only recently become aware of how prevalent artificial intelligence (AI) is in daily life, and of its potential for bias along lines of gender, race, and other factors. Concerns about the accuracy and potential racial bias of facial-recognition systems, for example, have risen sharply in recent years.

It’s no secret that gender and racial biases abound in today’s AI systems; both OpenAI and Google Research have acknowledged as much in their published research and documentation.

Researchers are still working out how to measure AI bias, according to Lama Ahmad, a policy research program manager at OpenAI, and the company can use what it learns to fine-tune its systems over time. Earlier this year, Ahmad led OpenAI’s collaboration with a group of independent experts brought in to better understand DALL-E 2 and provide feedback.
