Friday, 20 November 2015

The universe's resolution limit: why we may never have a perfect view of distant galaxies

James Geach, School of Physics, Astronomy and Mathematics
Can you make out the dot at the bottom of this question mark? What if you stand a few metres away? The finest detail the average human eye can distinguish is about the size of a full stop seen at a distance of a metre. This is called “resolution”. The best resolution for an optical system – like the eye – is roughly given by the ratio of the wavelength of the light you’re viewing in and the size of the aperture that light is passing through.

In astronomy, resolution works just the same. This explains why we build increasingly large telescopes: not only can big telescopes collect more light and therefore see further, but the bigger the aperture of the telescope, the better, in principle, the image.
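For a circular aperture this rule of thumb is the Rayleigh criterion, θ ≈ 1.22 λ/D. A quick sketch makes the numbers concrete (the 3 mm pupil diameter is an illustrative daytime value, not a figure from the article):

```python
def rayleigh_limit(wavelength_m, aperture_m):
    """Angular resolution (radians) of a circular aperture: theta = 1.22 * lambda / D."""
    return 1.22 * wavelength_m / aperture_m

GREEN_LIGHT = 550e-9  # metres, middle of the visible band

# Human eye: pupil ~3 mm (illustrative value)
eye = rayleigh_limit(GREEN_LIGHT, 3e-3)
print(f"Eye: {eye:.1e} rad, i.e. {eye * 1000:.2f} mm at 1 m")  # roughly a full stop

# A 10 m Keck-class mirror versus a 39 m Extremely Large Telescope
for d in (10, 39):
    print(f"{d} m aperture: {rayleigh_limit(GREEN_LIGHT, d):.1e} rad")
```

Quadrupling the aperture quadruples the resolution, which is the whole case for Extremely Large Telescopes in one line.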

But now a new study has suggested that the universe actually has a fundamental resolution limit, meaning no matter how big we build our telescopes we won’t see the most distant galaxies as clearly as we would like.

The trouble with telescopes

The largest visible-light telescopes on Earth, such as the Very Large Telescope and the Keck telescopes, have mirrors about ten metres in diameter, and there are now plans to build telescopes with diameters of 30m to 40m (so-called Extremely Large Telescopes). But there’s a problem: if light from an object (be it a candle, streetlight or star) is perturbed on its journey from source to detection, then we will never be able to produce an image as sharp as the theoretical maximum, no matter how big we make the aperture.

The huge primary mirror of the James Webb Space Telescope.

We know light can play tricks on us. Ever looked at the bottom of a swimming pool and seen the tiles appear to ripple and dance? Or put a straw into a glass of water and seen it seemingly “break” between the air and the liquid? Light travelling to our telescopes from space has to pass through a turbulent atmosphere, and this causes problems for astronomers.

Like a perfect parallel set of ocean waves encountering a submerged reef, the atmosphere disturbs the waves’ propagation. For electromagnetic waves – light – this has the effect of blurring images. Unless we compensate for it, it means we never reach the theoretical maximum resolution for a telescope. Putting telescopes in space, above the atmosphere, is one solution, but is costly. “Adaptive optics” is another, but is technically challenging.

Quantum foam

The new study, presented at the International Astronomical Union General Assembly this year, makes a prediction about the nature of space using the strange world of quantum physics. It argues that the nature of space-time on the quantum level might give rise to a kind of “fundamental resolution limit” of the cosmos, meaning there might be a cause to be concerned about how clearly future telescopes will be able to see the most distant galaxies.

The idea is as follows. According to quantum mechanics, on the smallest of scales, known as the Planck scale, some 10⁻³⁵ m (yes, that’s a decimal point with 34 zeros after it before you get to the one), space is described as “foamy”. On those small scales, quantum physics predicts that the universe is seething with so-called “virtual particles” which pop into existence and then quickly annihilate each other – something seen constantly in particle physics experiments. However, for the briefest of moments those particles have energy and therefore – according to the famous equation E=mc² – mass.
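The Planck scale itself falls out of three fundamental constants, l_P = √(ħG/c³), and a one-liner recovers the figure quoted above:

```python
import math

hbar = 1.054571817e-34  # reduced Planck constant, J s
G = 6.67430e-11         # gravitational constant, m^3 kg^-1 s^-2
c = 2.99792458e8        # speed of light, m/s

planck_length = math.sqrt(hbar * G / c**3)
print(f"Planck length ~ {planck_length:.2e} m")  # ~1.6e-35 m
```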

Any mass, no matter how small, is predicted to warp space-time. This is Einstein’s description of gravity. The most dramatic example of this phenomenon in nature is in the gravitational lensing of distant galaxies by massive clusters. Photons – particles of light – travelling through such foaming space-time would be affected by such fluctuations in a similar manner to light passing through our thick and turbulent atmosphere.

Of course, the effect is tiny – almost negligible. But a photon emitted from a distant galaxy making the journey across the universe has to travel a long way. On this journey, the countless “phase perturbations” caused by the foamy nature of space-time might add up. Now, the prediction is that this effect is smaller than even the finest images we can currently make with the best telescopes. But – if the theory is correct – then this cosmic blurring might be apparent in images of distant galaxies made by next-generation telescopes. These include Hubble’s successor, the James Webb Space Telescope, due for launch in 2018.
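The “adding up” works like a random walk: N uncorrelated phase kicks, each of RMS size σ, accumulate to an RMS total of σ√N, so even minuscule per-step perturbations can grow over a cosmological path length. A toy Monte Carlo illustrates the scaling (all numbers here are illustrative, not taken from the study):

```python
import math
import random

def rms_accumulated_phase(sigma, n_steps, n_trials=200, seed=1):
    """Monte Carlo estimate of the RMS total phase after n_steps
    independent Gaussian phase kicks of standard deviation sigma."""
    rng = random.Random(seed)
    total_sq = 0.0
    for _ in range(n_trials):
        phase = sum(rng.gauss(0.0, sigma) for _ in range(n_steps))
        total_sq += phase * phase
    return math.sqrt(total_sq / n_trials)

sigma = 1e-6  # tiny per-step phase kick, in radians (purely illustrative)
for n in (100, 10_000):
    print(f"N={n}: simulated RMS {rms_accumulated_phase(sigma, n):.2e}, "
          f"theory sigma*sqrt(N) = {sigma * math.sqrt(n):.2e}")
```

The simulated RMS tracks σ√N, which is why a journey of billions of light years can amplify an effect that is invisible over laboratory distances.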

However, there is so far no accepted theory uniting Einstein’s view of gravity with quantum mechanics – that is one of the key goals of modern physics – so we should take this prediction with a pinch of salt. Even if it is correct, its effects will only really be frustrating to the group of astrophysicists studying the detailed structure of the most distant galaxies.

What’s fascinating is the implication that no matter how big we make our telescopes here on Earth or in space, there is a fundamental natural resolution limit to the universe beyond which we cannot probe, born of quantum processes but manifested on cosmological scales. Like a cosmic conspiracy, some of nature’s secrets may be forever concealed.

The Conversation
James Geach, Royal Society University Research Fellow, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Friday, 6 November 2015

Six reasons why China's economy is weaker than you think

Geoffrey M. Hodgson, Hertfordshire Business School
The UK has rolled out the red carpet for Chinese president Xi Jinping on his five-day official visit. He is being given the royal treatment, including a stay at Buckingham Palace, a ride in a state carriage along The Mall and several banquets. The trip will also include plenty of time with the British prime minister, David Cameron, who is keen to discuss the trade and investment that the UK hopes to secure from the visit.
Britain’s pivot to China is largely based on its economic strength. And yet there is cause for concern. Having been the locomotive for global growth following the financial crisis in 2008, Chinese growth has now slowed and its economy is looking increasingly fragile. The latest GDP figures came in at just under 7%, significantly down from the astounding average of more than 9% per year between 1990 and 2010.

Exports from China have declined, and exports to China must battle against the depreciating yuan. China’s slowdown has depressed global commodity prices, adversely affecting big exporting countries such as Brazil and Russia.

Some leading economists have been very optimistic about China. Nobel Laureate Robert Fogel published an article in 2010 that predicted that China’s GDP would grow at an average annual rate of more than 8% until 2040, by which time its GDP per capita would be twice that projected for Europe and similar to that in the United States. Fogel used a textbook method of analysis to predict an unrelenting upward path.

But as countries grow, their service sectors tend to increase as a proportion of output and employment. Rates of growth of productivity in services tend to be much lower than in manufacturing or agriculture. Hence, in any economy, growth rates are likely to slow down through changes in economic structure. There are several other reasons why China’s economic growth is set to stall.

1. Demographic shifts

China will experience an adverse demographic shift in the coming decades. Three decades of the one-child policy have reduced the number of adults of working age. The recent and ongoing relaxation of that policy, plus a big decline in infant mortality, increases the number of children. Older people are living longer, due to improved healthcare and reduced poverty. Hence the number of children and old people that each person in work must support is set to increase dramatically.

China’s population is ageing. Hung Chung Chih

2. Chinese GDP per capita is still low

China’s GDP per capita is way below that of the US and other developed countries. World Bank figures for 2014 put it at about 24% of the US level. In the 20th century, only five countries managed to grow from 24% or less of US GDP per capita to 60% or more: Japan, Taiwan, South Korea, Singapore and Hong Kong. China still has a long way to go.
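The 24% figure is a simple ratio. Using round numbers in the ballpark of the World Bank’s 2014 purchasing-power-parity estimates (the dollar values below are approximations, not the exact published figures):

```python
# Approximate 2014 GDP per capita, PPP, international dollars
# (round numbers near World Bank estimates; illustrative, not exact)
china = 13_000
usa = 55_000

ratio = china / usa
print(f"China's GDP per capita is ~{ratio:.0%} of the US level")
```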

3. Lack of democracy

While there is some evidence that autocratic governments can help economic development at lower stages of development, particularly by promoting basic industry and infrastructure, there is strong evidence that democratic institutions are much more suited to higher levels of development. Notably, when Japan, Taiwan and South Korea reached about 45% of US GDP per capita, they were established or emerging democracies. A transition to a more democratic government may be necessary as China develops, but this would be very difficult to achieve – and could be highly disruptive.

4. Lack of openness

A democratic government is but one part of a constellation of vital institutions. As Nobel Laureate Douglass North and his colleagues have argued, dynamic modern economies need checks, balances and countervailing power to minimise arbitrary confiscation by the state. Legal systems have to develop significant autonomy from the political elite. In my book Conceptualizing Capitalism I show that absolute GDP per capita in a sample of 97 countries is strongly correlated with absence of corruption and openness of government. China is not an outlier in this test.

5. Problems with land and property rights

Unrest in the Chinese village of Shangpu was triggered over an unpopular land deal. REUTERS/James Pomfret

China’s population is divided into two classes. Chinese citizens are registered with either an urban or rural classification, depending on where they are born. Urban registrants have better education and health services.

Many rural registrants, meanwhile, have rights to the use of land. But these are often annulled when local party officials, bribed by business speculators, sell the land for profit. Frequent local protests result, and the whole system of land use is in dire need of radical reform. Currently it fosters corruption and inhibits the skill development of half of the Chinese population.

6. Lack of homegrown talent

Although there are many small firms in China, there are still few mainland-registered large firms. Barry Naughton has noted that of the top ten firms in China exporting high-tech products, nine were foreign. Offshore registration is understandable, because fear of state sequestration persists in a country that did not recognise private property rights in its constitution until 2007. China’s financial system is very heavily concentrated in state hands, with punitive penalties on private lending.

Thus, there are weighty institutional and demographic drags on further rapid growth in China, especially as it enters intermediate levels of economic development that are ill-suited to the continuance of a one-party state. China can succeed, but only through massive and potentially destabilising reform of its political and economic institutions. We should not be surprised by even lower growth rates in the future.

The Conversation
Geoffrey M. Hodgson, Research Professor, Hertfordshire Business School, University of Hertfordshire
This article was originally published on The Conversation. Read the original article.

Wednesday, 4 November 2015

Unheeded cybersecurity threat leaves nuclear power stations open to attack

Nasser Abouzakhar, School of Computer Science
There has been a rising number of security breaches at nuclear power plants over the past few years, according to a new Chatham House report which highlights how important systems at plants were not properly secured or isolated from the internet.

As critical infrastructure and facilities such as power plants become increasingly complex they are, directly or indirectly, linked to the internet. This opens up a channel through which malicious hackers can launch attacks – potentially with extremely serious consequences. For example, a poorly secured steel mill in Germany was seriously damaged by hackers, who caused substantial harm to a blast furnace when the compromised computer controls failed to shut it down. And the notorious Stuxnet worm was malware specifically developed to target nuclear facilities.

The report also found that power plants rarely employ an “air gap” (where critical systems are entirely disconnected from networks) as the commercial and practical benefits of using the internet too often trump security.

In one case in 2003, an engineer at the Davis-Besse plant in Ohio used a virtual private network connection to access the plant from his home. While the connection was encrypted, his home computer was infected with the Slammer worm which infected the nuclear plant’s computers, causing a key safety control system to fail. In another incident in 2006 at the Browns Ferry plant in Alabama the controllers for one of the reactors’ cooling pumps failed in reaction to a flood of network traffic on the plant’s internal computer network, which required the reactor to be shut down manually to avoid a meltdown. Although this wasn’t due to any sort of cyber-attack, it shows the susceptibility of industrial control systems to the sorts of events that could be triggered by malicious actors or by malware like Slammer.

The report also found that there is a general lack of knowledge of cybersecurity on the part of management who have generally shown a poor understanding of good “IT hygiene” and how it relates to security. It was quite common, the report said, for factory default passwords to be left unaltered and off-the-shelf software to be used despite known issues that were left unaddressed.

The problem is that the industrial communication protocols and mechanisms still commonly used in nuclear power plants were designed in an era before the internet and cyber-threats were a consideration. These are often insecure and not designed to deal with such challenges. Most of the legacy communication protocols such as Profibus, DNP3 and OPC are still vulnerable to various attacks as they lack any proper authentication techniques.
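The absence of authentication is visible right in the wire format of such protocols. As an illustration, here is a request frame for Modbus/TCP – another widely deployed legacy industrial protocol, used here as a stand-in for those named in the report. Every field is addressing or payload; there is simply nowhere for a credential to go:

```python
import struct

def modbus_read_holding_registers(transaction_id, unit_id, start_addr, count):
    """Build a Modbus/TCP 'read holding registers' (function 0x03) request.
    Fields: transaction id, protocol id (always 0), remaining byte count,
    unit id, function code, start address, register count.
    Note what is missing: no username, password, token or signature."""
    return struct.pack(">HHHBBHH",
                       transaction_id,
                       0,          # protocol id: always 0 for Modbus
                       6,          # bytes that follow this length field
                       unit_id,
                       0x03,       # function: read holding registers
                       start_addr,
                       count)

frame = modbus_read_holding_registers(1, 1, 0, 2)
print(frame.hex())  # a complete, valid request in 12 bytes
```

Any device on the network that can reach the controller’s port can send this frame and, on an unprotected network, expect an answer – which is exactly why the report argues these systems must never be casually exposed.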

This means that all a malicious hacker might need to get inside a nuclear power station’s network is Google. Using search terms relevant to the software in use in the plant, Google can turn up direct links to websites leading into its network – with little or no security in the way.

An example of this is the technique used to hack internet-connected webcams. Searching for text used in the webcam login page, Google will turn up links to cameras all over the world. Many users fail to change the default username and password (which are easily found online), meaning that the cameras can be accessed and controlled with ease.

The same sort of techniques can be used to locate, not webcams, but web-connected industrial devices potentially providing access to important facilities. Search engines such as Shodan can identify these sorts of devices and even use geo-location to pin down their physical location.

There’s a lot of civil infrastructure, and a lot of it is vulnerable. Bill Ebbesen, CC BY

Mind the gaps

Unauthorised access by hackers to important systems in a power plant is a serious matter: anything that damages or disturbs the balance of operations within the plant could lead to a shutdown or even dangerous situations when shutdown routines fail, while power surges within the plant could affect transmission infrastructure outside. Whether we are talking about a nuclear power plant or not, the end result is likely to be production failures or financial losses – or even injury and death. Of course, with a nuclear power plant the risks are that much greater because of the radioactive fuel in use.

Managing cybersecurity risks is challenging – and the Chatham House report makes several recommendations: integrated risk assessments to ensure security measures are properly implemented, and penetration testing where experts attempt to pry into and circumvent security measures, to ensure that the plant’s staff find any security holes before hackers do.

Organisations need to be far more aware of the potential effects of attacks: what could happen if various control systems were used incorrectly. This way it will become more apparent where resources should be dedicated to protecting them. Only rigorous research and testing will develop the security approaches and technologies needed to respond to this quickly-evolving cybersecurity threat, keeping the power stations running and the lights on.

The Conversation
Nasser Abouzakhar, Senior Lecturer, University of Hertfordshire

This article was originally published on The Conversation. Read the original article.