Existential risk from artificial general intelligence
Existential risk from artificial general intelligence is the hypothetical threat that dramatic progress in artificial intelligence (AI) could someday result in human extinction (or some other unrecoverable global catastrophe). The human race currently dominates other species because the human brain has some distinctive capabilities that the brains of other animals lack. If AI surpasses humanity in general intelligence and becomes "superintelligent", then this new superintelligence could become powerful and difficult to control. Just as the fate of the mountain gorilla depends on human goodwill, so might the fate of humanity depend on the actions of a future machine superintelligence.