The CloudFlow project combines Cloud computing with workflow technology. CloudFlow is driven by end users in the area of water turbine maintenance, repair and overhaul. The Cloud services support design; flow, system and machining simulation; quality assurance; and data management, André Stork explained.
There is a desire to use this Cloud-based functionality for faster time-to-market, better products and more cost-efficient development. HPC resources are leveraged for more complex physics-based simulation and higher spatio-temporal resolution.
The aim is to develop an infrastructure that is open to new Cloud-based engineering services supporting engineering workflows in manufacturing and engineering companies.
CloudFlow has been set up to ease access to HPC-based simulation services and to increase their affordability by creating new business models, André Stork said.
The project is running six internal experiments and has scheduled two Open Calls.
André Stork gave a few examples of the activities within the project. One of them is CAD on the Cloud: a specific and expensive application installed as local CAD software can be replaced by a Cloud service. This service can cover the full process, from the definition of requirements down to manufacturing. Users can also initiate the development of special services.
A second example is CFD on the Cloud. CFD Cloud simulation enables access to computational resources that otherwise cannot be made available within the company. As such, it allows casual CFD users to perform their work more efficiently.
Systems simulation on the Cloud is a third example cited by André Stork. Typically, 1D multi-physics modelling is performed with Modelica. With CloudFlow, users can run batch jobs and perform parameter studies and optimizations on an HPC infrastructure.
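To illustrate the kind of workload this enables, the sketch below shows a parameter study dispatched as independent batch jobs. It is a hypothetical, simplified illustration, not CloudFlow's actual API: `simulate` stands in for one compiled 1D model evaluation (e.g. a Modelica model), and local processes stand in for HPC nodes.

```python
# Hypothetical sketch of a parameter study run as independent batch jobs.
# simulate() is a placeholder for one 1D multi-physics model evaluation;
# all names and the toy objective are illustrative assumptions.
from concurrent.futures import ProcessPoolExecutor
from itertools import product

def simulate(params):
    # Placeholder for a single simulation run with one parameter combination.
    stiffness, damping = params
    # Toy objective standing in for a real simulation result.
    return {"stiffness": stiffness, "damping": damping,
            "score": damping / stiffness}

def parameter_study(stiffness_values, damping_values):
    # Each parameter combination is an independent job, so the sweep
    # parallelizes naturally across compute nodes (approximated here
    # with local worker processes).
    grid = list(product(stiffness_values, damping_values))
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(simulate, grid))
    # Pick the best-scoring combination, as a simple optimization step.
    return max(results, key=lambda r: r["score"])

if __name__ == "__main__":
    best = parameter_study([10.0, 20.0, 40.0], [0.5, 1.0])
    print(best)
```

Because each job is independent, the same pattern scales from a workstation to an HPC back end simply by changing where the jobs are executed.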
PLM on the Cloud is a fourth example: product lifecycle management can be executed on the Cloud. Traditionally, users work with point clouds, but now they can use CAD models.
Other partners in CloudFlow are the University of Nottingham; CARSA, responsible for business models; ARCTUR, an HPC provider; and SINTEF, which works on the CloudFlow infrastructure.
The first Open Call, which is to collect seven external experiments for wave 2, opens on June 30, 2014.
CloudFlow is soliciting seven experiments rooted in computational technology for manufacturing and engineering studies. The experiments have to address workflows along value chains in and across companies. The priority is set on innovative product development and products, André Stork explained.
The consortia usually consist of one to four partners and ideally include end users, software vendors, an HPC provider and research institutions, complemented by existing CloudFlow partners.
The Open Call ends on September 30, 2014. The experiments start on January 1, 2015. The duration is one year.
More information is available at www.eu-cloudflow.eu/open-calls .