Back to Table of contents

Primeur weekly 2014-10-20

Special

High Performance Computing Centre in Stuttgart to focus on Cloudification ...

Cash flow calculations using Monte Carlo simulations ...

The Cloud

IBM and SAP partner to accelerate enterprise Cloud adoption ...

EuroFlash

European Commission to launch survey on EU-Brazil co-operation in the area of ICT - Work Programme 2016-2017 ...

Big Data Value Association and European Commission sign Big Data Public Private Partnership ...

New forecasting method: Predicting extreme floods in the Andes mountains ...

Symposium on HPC and Data-Intensive Applications in Earth Sciences: Challenges and Opportunities @ ICTP ...

Jülich renews co-operation with Oak Ridge National Laboratory ...

Johannes Gutenberg University Mainz joins Germany's Gauss-Allianz as a full member ...

PRACE supports HPC for Health ...

HP and VMware dramatically simplify and accelerate the delivery of software-defined infrastructure services with EVO: RAIL ...

Supermicro highlights VMware EVO: RAIL, FatTwin Virtual SAN Ready nodes and NVIDIA GRID vGPU SuperServer at VMworld Barcelona ...

Calling on universities to submit their HPCAC-ISC 2015 Student Cluster Competition application ...

PRACE SHAPE and NSilico pool HPC knowledge to develop faster sequencing methods ...

A novel platform for future spintronic technologies ...

Future computers could be built from magnetic 'tornadoes' ...

USFlash

New SGI UV for SAP HANA enables real-time business for large enterprises ...

Cray adds new advanced analytics solution to its Big Data product portfolio ...

Australian teams set new records for silicon quantum computing ...

IBM announces first commercial application of IBM Watson in Africa ...

Among DOE supercomputing facilities, NERSC is at the forefront of data management and storage innovations ...

SC14 announces ACM/IEEE-CS George Michael HPC Fellowships ...

Fujitsu partners with Singapore to set up Centre of Excellence for sustainable urbanisation ...

UC Santa Cruz leads $11 million Center for Big Data in Translational Genomics ...

Australian volcanic mystery explained ...

Supermicro enhances VMware EVO: RAIL and Virtual SAN offerings with Nexenta ...

Dispelling a misconception about Mg-ion batteries ...

Cash flow calculations using Monte Carlo simulations


2 Oct 2014 Heidelberg - At ISC Big Data'14 in Heidelberg, Primeur Magazine had the opportunity to interview Kurt Stockinger from Zurich University of Applied Sciences (ZHAW) on analysing financial data on a massive scale.

Currently, Kurt Stockinger's team is working on a project called "Large Scale Financial Data Modelling". The project takes financial contracts from around the world and reduces them to about 30 different types. The team then performs stress tests: a company or bank holds so many loans, so many shares, so many options; what is the risk of all of these together?

The idea is based on a project called Actus, which builds on a financial theory developed by two of Kurt Stockinger's colleagues. If you know the terms of a loan, you can calculate its cash flow to see how much money it will generate. You may have more than one financial instrument: when you want to do these cash flow calculations for a hundred of them, or for a whole company, bank or insurance company, or perhaps an entire country, you want to be able to simulate different scenarios.

One scenario would be that the European Central Bank decreases the interest rate from 0.15 percent to 0.10 percent. What is the impact on the portfolios of all the banks? To calculate these cash flows, you basically use Monte Carlo simulations, which are normally applied in weather simulations or physics modelling. We use exactly the same ideas, but for a different type of problem: how much money would a financial product actually generate over a specific period of time?

This is a nice problem from the high performance computing perspective because all of these calculations can be done nicely in parallel, Kurt Stockinger explained. You can calculate different scenarios: some interest rates go up, some go down. If you calculate ten, a hundred, or even a thousand scenarios, you just change one parameter, but the result may be 100 terabytes of cash flows, or maybe a petabyte. You can basically simulate as much data as you can store.
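The scenario approach described above can be sketched in a few lines of Python. This is a minimal illustration, not the project's actual model: the bond, the notional, the base rate, and the normally distributed rate shocks are all hypothetical numbers chosen for the example, and the sequential loop stands in for what would in practice be distributed across processes or cluster nodes.

```python
import random

def bond_cash_flows(notional, rate, years):
    """Yearly coupon payments for a simple fixed-rate bond,
    with the notional repaid in the final year."""
    flows = [notional * rate for _ in range(years)]
    flows[-1] += notional
    return flows

def simulate_scenario(base_rate, shock, notional=1_000_000, years=5):
    """One Monte Carlo scenario: shift the interest rate by a random
    shock and recompute the contract's cash flows."""
    return bond_cash_flows(notional, base_rate + shock, years)

def run_scenarios(n, base_rate=0.0015, seed=42):
    """Each scenario is independent of the others, so this loop is
    embarrassingly parallel: it could be farmed out to separate
    processes or cluster nodes without any communication."""
    rng = random.Random(seed)
    results = []
    for _ in range(n):
        shock = rng.gauss(0.0, 0.0005)  # random shock around the base rate
        results.append(simulate_scenario(base_rate, shock))
    return results

scenarios = run_scenarios(1000)
print(len(scenarios))  # 1000 independent cash-flow streams
```

Because each scenario only reads its own parameters, scaling to thousands of scenarios is a matter of adding workers; the output volume, as noted above, grows just as fast.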

Primeur Magazine wanted to know whether deterministic methods would not work in this case.

In this case, we are simulating financial contracts with a start and end time. If you have a bond, it pays a 1 percent interest rate over 1 year, so all of this is very deterministic: the information is found in the contract and you can calculate it directly. The different scenarios can be interest rates or yield curves, and for these we use Monte Carlo simulations.
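The deterministic part mentioned here can be made concrete with a short sketch. The notional of 10,000 is a made-up figure for illustration; the point is simply that once the contract terms (rate, maturity) are known, the cash flows follow with no randomness at all.

```python
def bond_cash_flows(notional, annual_rate, years):
    """Deterministic cash flows of a fixed-rate bond: one coupon per
    year, plus the notional returned at maturity. Everything here is
    read straight from the contract terms; nothing is simulated."""
    flows = [notional * annual_rate for _ in range(years)]
    flows[-1] += notional
    return flows

# The interview's example: a bond paying 1 percent interest over 1 year.
print(bond_cash_flows(10_000, 0.01, 1))  # [10100.0]
```

Randomness enters only one level up, when Monte Carlo scenarios vary the rate or the yield curve that is fed into a calculation like this one.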

The simulations are embarrassingly parallel, but it is not yet possible to state the exact running time of a parallel series of jobs because the team is still working on smaller data sets and is currently building the system. The team first has to take some of the algorithms that were written in Java and see whether it can parallelize them more efficiently. Only after this exercise can Kurt Stockinger provide a theoretical estimate of the run time. This is still future work.

It is not only a high performance computing problem. Since the team wants to standardize the model, it is also a data integration problem. In a physics experiment you have only one data source; in the banking world you have tens, hundreds, or thousands of different kinds of sources, such as trading systems from Zurich or from New York, so you really need to integrate the data to make sure the meaning is the same everywhere. The challenge is to automate this data integration process.

Ad Emmen
