The 335 million weather observations the Met Office stores every day require huge computational capability. The new supercomputer provides the processing power needed to manipulate these data in a timely and effective way. The complex numerical models developed by its scientists and meteorologists in turn create enormous data outputs, used for climate and weather prediction and by data users throughout the world to make weather outlooks more accurate than ever before.
The Met Office recognises that increases in observational and forecast data volumes have implications across both the public and private weather sectors. To understand this better, the Met Office recently partnered with the Open Data Institute to carry out a review of 'The state of weather data infrastructure'.
The review is encouraging discussion on how the global weather data infrastructure can be made sustainable and continue to deliver value to society, as well as examining the need for continued investment in technical infrastructure and supercomputing resources. It also considers the role of global, regional and national meteorological services in collecting observations and generating forecasts.
In addition, the review highlights how new technology is creating new data and, in turn, generating new Big Data challenges. Supercomputing is enabling new and improved weather models that harness a variety of weather observations from ground-, air-, sea- and space-based monitoring and sensors. These trends exist within a wider landscape of innovation and changing consumer expectations, where instant, real-time access to data is increasingly essential.
The Met Office is striving to make its data more openly accessible and useful in order to realise the social and economic benefits of the new supercomputer, which delivers ever-increasing accuracy but also exponential growth in data volumes, making this goal more challenging.