Should AI development be paused until there are safety rules?

Polls

5th April 2023

ARTIFICIAL intelligence (AI) experts and bosses of tech companies have called for a pause in the training of powerful AI systems, due to the possible risks to society and humanity.

More than 5,500 people, including Apple co-founder Steve Wozniak and Twitter boss Elon Musk, have signed a letter warning that AI could present huge risks to society. They reckon that it is becoming so clever it could cause political and economic disruption.

The letter says: “Powerful AI systems should be developed only once we are confident that their effects will be positive and their risks will be manageable.” The people who’ve signed the letter want a six-month pause in the development of AI systems more powerful than the recent GPT-4 update used by ChatGPT.

They say that if this doesn’t take place, governments should step in and force it to happen. The experts say the pause should be used by AI labs and independent experts to develop a set of safety rules for advanced AI design and development. They say these rules should be overseen by independent, outside experts.

The letter was published soon after the law enforcement agency of the European Union, Europol, warned about the possible misuse of advanced AI like GPT-4.

They said it could be used for crime and spreading dangerous fake news. For example, some people might use AI to create false videos or information about a political candidate, to try to influence how people vote.

Should powerful AI development be paused until safety rules are put in place?

10 Comments


svarghese · 1 year ago

It could lead to many deaths or severe injuries. Better to be safe than sorry!

epicstar12 · 1 year ago

It’s not good if some companies get like too much power using AI. And already schools are complaining about ChatGPT usage to write essays.

mrrickroll · 1 year ago

AI is nowhere near as far forward as some people think. Most AI is really really good at 1 thing only, and sucks at pretty much everything else. Safety rules may be needed in two or three decades’ time but for now... Nah

sing · 1 year ago

Safety first guys. Better safe than sorry!! Stay safe everyone.

mimi365 · 1 year ago

Powerful AI development could lead to fatal injuries for many people…it should be paused until we are absolutely sure it is safe.

duolingo · 1 year ago

computers and technology are becoming really smart. pausing development would only make sure everything is safe.

coola · 1 year ago

As AI is new, we don't know about everything it can do so it may not be a good idea to use it if it is very powerful. It could cause serious damage and harm so it should not be used. As well as this, advanced AI may take jobs from people. Personally, I do not want robots doing everything!

anyu · 12 months ago

It could lead to development and potentially fill up many empty jobs! It's worth taking the risk as the rewards are high

sailor31 · 11 months ago

I don't like the idea of AI whether there are safety rules or not

jimmym · 11 months ago

No, we’re nowhere near any major AI developments and so we don’t need to make any safety measures yet.