White House tells tech executives they are responsible for the public’s safety
The White House gathered tech executives on Thursday and told them they had to defend the public from the risks posed by artificial intelligence (AI).
It was claimed that Sam Altman of OpenAI, Satya Nadella of Microsoft, and Sundar Pichai of Google had an “ethical” obligation to protect society.
Recently introduced AI products like Google’s Bard and OpenAI’s ChatGPT have captured the public’s interest.
At the Thursday meeting, technology leaders were warned that the government was open to new legislation on artificial intelligence and that it was up to companies to “ensure the security and safety of their technologies.”
The new technology could endanger safety, privacy, and civil rights, but it also has the potential to improve lives, US Vice President Kamala Harris said in a statement released after the meeting.
The National Science Foundation will invest $140 million (£111 million) in seven new AI research centres, the White House stated.
Geoffrey Hinton, a pioneer of modern AI often described as one of its “godfathers,” resigned from his position at Google earlier this week, saying he now regretted his work.
Elon Musk and Apple co-founder Steve Wozniak called for a pause in the development of powerful AI systems in a letter published in March.
Aside from concerns that chatbots like ChatGPT and Bard may be unreliable and spread false information, there are also fears that AI could rapidly displace people from their jobs.
A further worry is that generative AI might infringe copyright. Voice-cloning AI could fuel fraud, and AI-generated videos could spread disinformation.
Nevertheless, proponents like Bill Gates have rejected calls for an AI “pause,” arguing that such a step would not “solve the difficulties” ahead.
The best use of AI advancements, according to Mr Gates, should be the main focus.
Others, however, warn of a risk of over-regulation, which they argue would give Chinese tech firms a strategic advantage.