

It is profoundly difficult to grapple with risks whose stakes may include the global collapse of civilisation, or even the extinction of humanity. The pandemic has shattered our illusions of safety and reminded us that despite all the progress made in science and technology, we remain vulnerable to catastrophes that can overturn our entire way of life. These are live possibilities, not mere hypotheses, and our governments will have to confront them.

As Britain emerges from Covid-19, it could find itself at the forefront of the response to future disasters. The government’s recent integrated review, Britain’s G7 presidency and the Cop26 climate conference, which will be hosted in Glasgow later this year, are all occasions to address global crises. But to ensure that the UK really is prepared, we first need to identify the biggest risks we face in the coming decades.

Technological progress since the Industrial Revolution has ultimately increased the risk of the most extreme events, putting humanity’s future at stake through nuclear war or climate breakdown. One technology that may pose the greatest threat this century is artificial intelligence (AI) – not the current crop of narrowly intelligent networks, but more mature systems with a general intelligence that surpasses our own. AI pioneers from Alan Turing to Stuart Russell have argued that unless we develop the means to control such systems or to align them with our values, we will find ourselves at their mercy.

By my estimation, the chances of such risks causing an existential catastrophe in the next century are about one in six: like Russian roulette. If I’m even roughly right about the scale of these threats, then this is an unsustainable level of risk. We cannot survive many centuries without transforming our resilience.

The government’s recent integrated review highlighted the importance of these “catastrophic-impact threats”, paying attention to four of the most extreme risks: the threats from AI, global pandemics, the climate crisis and nuclear annihilation. It rightly noted the crucial role that AI systems will play in modern warfare, but was silent about the need to ensure that the AI systems we deploy are developed safely and aligned with human values. It underscored the likelihood of a successful biological attack in the coming years, but could have said more about the role science and technology can play in protecting us. And although it mentioned the threat of other countries increasing and diversifying their nuclear capabilities, the decision to expand the UK’s own nuclear arsenal is both disappointing and counterproductive.

To really transform our resilience to extreme risks, we need to go further. First, we must urgently address biosecurity. As well as the possibility of a new pandemic spilling over from animals, there is the even worse prospect of an engineered pandemic, designed by foreign states or non-state actors, with a combination of lethality, transmissibility, and vaccine resistance beyond any natural pathogen. With the rapid improvements in biotechnology, the number of parties who could create such a weapon is only growing.

To meet this risk, the UK should launch a new national centre for biosecurity, as has been recommended by the joint committee on the National Security Strategy and my own institute at Oxford University. This centre would counter the threat of biological weapons and laboratory escapes, develop effective defences against biological threats, and foster talent and collaboration across the UK biosecurity community. There is a real danger that the legacy of Covid-19 will not go beyond preparing for the next naturally occurring pandemic, neglecting the possibility of a human-made pandemic that keeps experts up at night.

Second, the UK needs to transform its resilience to the full range of extreme risks we face. We don’t know what the next crisis on the scale of Covid-19 will be, so we need to be prepared for all such threats. The UK’s existing risk management system, within the Cabinet Office’s civil contingencies secretariat, is strong in many ways, but it only addresses risks that pose a clear danger in the next two years – making it impossible to adequately evaluate dangers that would take more than two years to prepare for, such as those posed by advanced AI. We also lack a chief risk officer, or equivalent position, who could take sole responsibility for the full range of extreme threats across government.

Third, we need to put extreme risks on the international agenda. These are global problems that require global solutions. The legal scholar Guglielmo Verdirame argues that while the climate emergency and nuclear weapons are covered by at least some international law, there is no global legal regime in force that grasps the gravity of other extreme risks, or that has the necessary breadth to deal with the changing landscape of such risks. The G7 presidency is the perfect opportunity to remedy this. Rather than settle for a treaty on pandemic preparedness, as is being proposed by the prime minister, the UK could set its ambitions higher, and lead the call for a new treaty on risks to the future of humanity, with a series of UN security council resolutions to place this new framework on the strongest possible legal footing.

There is an understandable tendency for even the most senior people in government to see extreme risks as too daunting to take on. But there are concrete steps that the UK can take to transform its resilience to these threats, and there is no better time to do so than now. Covid-19 has given us the chance to make decades’ worth of progress in a matter of months. We must seize this opportunity.


