Musk, Harari, Wozniak and 1,000 more experts call for suspension of advanced AI development

29 March 2023, 22:09 | Technologies 
Photo: Zerkalo Nedeli

More than a thousand experts, including SpaceX and Tesla founder Elon Musk, Apple co-founder Steve Wozniak, Israeli historian and futurist Yuval Noah Harari, and others have signed an open letter calling for a pause in the development of advanced artificial intelligence. According to the letter, they believe that unchecked AI development poses a danger to society and to humanity as a whole.

In particular, the letter notes that advanced AI could represent a profound change in the history of life on Earth and should therefore be developed and managed with extreme caution.

“Unfortunately, this level of planning and management does not exist, even though in recent months AI laboratories have been locked in an uncontrolled race to develop and deploy increasingly powerful digital minds that no one - not even their creators - can understand, predict, or reliably control,” the letter states.

The experts also noted that modern artificial intelligence systems can already compete with humans in common tasks, which raises a number of questions. For example, should we allow machines to spread disinformation, should we strive for the full automation of jobs, and should we develop non-human intelligence that could in the future surpass, outsmart and replace humans? According to the signatories, powerful artificial intelligence systems should be developed only when we are confident that their impact will be positive and that the risks they may bring are manageable.

“Therefore, we call on all AI labs to immediately pause the training of AI systems more powerful than GPT-4 for at least six months. This pause must be public and verifiable, and must involve all key stakeholders. If such a pause cannot be introduced quickly, governments should step in and impose a moratorium,” the letter says.

The pause is needed so that AI labs and independent experts can develop and implement shared safety protocols, monitored by outside experts, that would ensure AI development remains safe.

The experts stress that the pause does not mean halting AI development in general, but rather stepping back from the "dangerous race to ever-larger unpredictable black-box models with emergent capabilities."

Previously, a study by Goldman Sachs showed that a breakthrough in the development of AI could lead to the automation of a quarter of the work done in the US and the Eurozone. If the technology lives up to expectations, it could trigger a "productivity boom."

Source: Zerkalo Nedeli