Humanity is as close to destroying itself as ever, and advancements in technology are partially to blame.
That's according to The Bulletin of the Atomic Scientists, a group of scientists with the unenviable task of informing people exactly how we are bringing about the end of the world. Founded by scientists who helped build the atomic bomb as part of the Manhattan Project, The Bulletin keeps a "Doomsday Clock," which remains set at three minutes to midnight as a symbol of our impending self-destruction.
While the possibility of nuclear war and the effects of climate change are largely to blame for our perilous situation, not paying enough attention to "emerging technological threats" is another danger to our existence.
"The fast pace of technological change makes it incumbent on world leaders to pay attention to the control of emerging sciences that could become a major threat to humanity," the group wrote.
Advances in biotech, artificial intelligence (especially when it comes to robotic weaponry) and developments in the cyber realm (which we take to mean more sophisticated cyber threats) all have the potential to wreak havoc and need regulation, the group continued.
"The international community needs to strengthen existing institutions that regulate emergent technologies and to create new forums for exploring potential risks and proposing potential controls on those areas of scientific and technological advance that have so far been subject to little if any societal oversight," The Bulletin recommended, calling for all sectors of society to address the "potential devastating consequences of these technologies."
It all may sound a little tin-foil hat, but others have similarly cautioned against letting technology become so advanced that we can no longer regulate or control it.
Elon Musk recently helped form a non-profit company with the aim of advancing digital intelligence for the common good. Called OpenAI, the group is equally wary of artificial intelligence's potential to do harm.
And Musk, along with Stephen Hawking, Steve Wozniak and Noam Chomsky, all signed an open letter in July 2015 calling for a ban on autonomous military weapons that could trigger an AI arms race. Hawking also co-authored an op-ed warning against allowing AI to run amok.
Will these warnings and today's Doomsday Clock pronouncement prevent an ultra-frightening future from unfolding? We can only hope.