
AI: How ‘freaked out’ should we be?


By Anthony Zurcher

(BBC) – Artificial intelligence has the awesome power to change the way we live our lives, in both good and dangerous ways. Experts have little confidence that those in power are prepared for what’s coming.

Back in 2019, a non-profit research group called OpenAI created a software program that could generate paragraphs of coherent text and perform rudimentary reading comprehension and analysis without specific instruction.

OpenAI initially decided not to make its creation, called GPT-2, fully available to the public out of fear that people with malicious intent could use it to generate massive amounts of disinformation and propaganda. In a press release announcing its decision, the group called the program “too dangerous”.

Fast forward three years, and artificial intelligence capabilities have increased exponentially.

In contrast to that earlier limited release, the next offering, GPT-3, was made readily available in November. The ChatGPT interface derived from that programming was the service that launched a thousand news articles and social media posts, as reporters and experts tested its capabilities – often with eye-popping results.

ChatGPT scripted stand-up routines about the Silicon Valley Bank failure in the style of the late comedian George Carlin. It opined on Christian theology. It wrote poetry. It explained quantum physics to a child as though it were rapper Snoop Dogg. Other AI models, like DALL-E, generated visuals so compelling that they sparked controversy over their inclusion on art websites.

Machines, at least to the naked eye, have achieved creativity.

