Primeur weekly 2017-02-20

Focus

HPC expert Genias Benelux to show its skillful expertise on brand-new website ...

Are billion Euro Flagships the right way to finance innovative areas like graphene, human brain research and quantum computing? ...

Exascale supercomputing

Advanced fusion code led by PPPL selected to participate in Early Science Programmes on three new DOE Office of Science pre-exascale supercomputers ...

Focus on Europe

From robotics to particle physics: Data analytics gets the spotlight in Distinguished Talk series at ISC 2017 ...

A new spin on electronics ...

Data mining tools for personalized cancer treatment ...

Why host HPC in Iceland to tackle Big Data for life sciences at Earlham Institute ...

Biological experiments become transparent - anywhere, any time ...

Middleware

IBM delivers new platform to help clients address storage challenges at massive scale ...

Hewlett Packard Enterprise unveils most significant 3PAR Flash storage innovations to date ...

Hardware

Tokyo Institute of Technology partners with DDN on Tsubame3.0 to build forward-looking AI and Big Data computing infrastructure ...

Mellanox demonstrates four times improvement in crypto performance with Innova IPsec 40G Ethernet network adapter ...

Supermicro launches BigTwin - the industry's highest performing Twin multi-node system supporting the full range of CPUs, maximum memory and all-flash NVMe ...

Applications

Researchers catch extreme waves with higher-resolution modelling ...

Researchers are creating software to 'clean' large datasets, making it easier for scientists and the public to use Big Data ...

Designing new materials from 'small' data ...

Success by deception ...

DNA computer brings 'intelligent drugs' a step closer ...

'Lossless' metamaterial could boost efficiency of lasers and other light-based devices ...

Perimeter Institute researchers apply machine learning to condensed matter physics ...

When treating brain aneurysms, two isn't always better than one ...

Real-time MRI analysis powered by supercomputers ...

Analyzing data for transportation systems using TACC's Rustler, XSEDE ECSS support ...

NCSA facilitates performance comparisons with China's No. 1 supercomputer ...

IBM delivers Watson for cyber security to power cognitive security operations centres ...

The Cloud

Optimizing data centre placement and network design to strengthen Cloud computing ...

Dutch start-up solution impacts data centres ...

OpenFog Consortium releases landmark reference architecture for Fog computing ...

IBM brings machine learning to the private Cloud ...

IBM accelerates hybrid Cloud adoption by enabling channel partners to offer VMware solutions ...

Oracle launches Cloud service to help organisations integrate disparate data and drive real-time analytics ...

OpenFog Consortium releases landmark reference architecture for Fog computing

8 Feb 2017 Fremont - The OpenFog Consortium has released the OpenFog Reference Architecture, a universal technical framework designed to meet the data-intensive requirements of the Internet of Things (IoT), 5G and artificial intelligence (AI) applications. The reference architecture marks a significant first step toward creating the standards necessary to enable high performance, interoperability and security in complex digital transactions.

Fog computing is the system-level architecture that brings computing, storage, control, and networking functions closer to the data-producing sources along the cloud-to-thing continuum. Applicable across industry sectors, fog computing effectively addresses issues related to security, cognition, agility, latency and efficiency. The OpenFog Consortium was founded over one year ago to accelerate adoption of fog computing through an open, interoperable architecture.
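
To make that cloud-to-thing continuum concrete, below is a minimal Python sketch - hypothetical, not drawn from the OpenFog documents - in which a fog node ingests raw sensor readings close to their source and forwards only a compact summary to the cloud tier. The class and function names are invented for this illustration.

    # Hypothetical sketch of the cloud-to-thing continuum: a fog node
    # absorbs high-rate "thing" data near its source and ships only a
    # summary upstream. Names are invented; this is not an OpenFog API.
    from dataclasses import dataclass, field
    from statistics import mean

    @dataclass
    class FogNode:
        name: str
        buffer: list = field(default_factory=list)

        def ingest(self, reading: float) -> None:
            """Keep raw samples local instead of sending each one to the cloud."""
            self.buffer.append(reading)

        def summarize(self) -> dict:
            """Reduce the buffered readings to one compact record."""
            return {"node": self.name, "count": len(self.buffer),
                    "avg": round(mean(self.buffer), 2)}

    def cloud_store(summary: dict) -> None:
        print(f"cloud received: {summary}")

    # Usage: four sensor samples stay at the edge; the cloud sees one record.
    node = FogNode("edge-gateway-1")
    for sample in (21.0, 21.4, 20.9, 21.2):
        node.ingest(sample)
    cloud_store(node.summarize())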

"Just as TCP/IP became the standard and universal framework that enabled the Internet to take off, members of OpenFog have created a standard and universal framework to enable interoperability for 5G, IoT and AI applications", stated Helder Antunes, chairman of the OpenFog Consortium and senior director for the Corporate Strategic Innovation Group at Cisco. "While fog computing is starting to be rolled out in smart cities, connected cars, drones and more, it needs a common, interoperable platform to turbocharge the tremendous opportunity in digital transformation. The new OpenFog Reference Architecture is an important giant step in that direction."

"The OpenFog Reference Architecture is the culmination of a year-long effort from industry and university research members to ensure we address all the appropriate communications, software, infrastructure and security components of fog computing", stated Jeff Fedders, president of the OpenFog Consortium. "Our goal is to help and support both the business leader and the technologist to create new applications and business models through fog computing. By developing this common framework, we’re addressing the hardware, software and system elements necessary for an OpenFog architecture and a vibrant, supplier ecosystem."

The OpenFog Reference Architecture is a high-level framework that will lead to industry standards for fog computing. The OpenFog Consortium is collaborating with standards development organisations such as IEEE to generate rigorous user, functional and architectural requirements, plus detailed application programming interfaces (APIs) and performance metrics to guide the implementation of interoperable designs.

The massive and growing amounts of data produced, transported, analyzed and acted upon within industries such as transportation, health care, manufacturing and energy - collectively measured in zettabytes - are exposing challenges both in Cloud-only architectures and in operations that reside only at the edge of the network. Fog computing works in conjunction with the Cloud and across siloed operations to effectively enable end-to-end IoT, 5G and AI scenarios.

For example, with an autonomous vehicle system, a smart car will generate terabytes of data per trip while it connects and communicates in motion with traffic control, municipal infrastructure and other vehicles. Current IoT system architectures cannot address the mission-critical nature of this data, where latency is measured in sub-milliseconds and reliable network availability and bandwidth are crucial. The OpenFog architecture - which can feature multiple layers of fog nodes acting upon the data closer to its source and managing the fog-to-thing, fog-to-fog and fog-to-Cloud interfaces - successfully addresses these requirements.
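
The latency reasoning in that scenario can be made concrete with a small, hypothetical Python sketch: each tier of a layered fog hierarchy is assigned an assumed round-trip latency, and a workload is placed at the most centralized tier that still meets its deadline, reserving scarce edge capacity for the tightest deadlines. The tier names and latency figures are illustrative assumptions, not values taken from the OpenFog Reference Architecture.

    # Hypothetical placement of workloads across fog tiers by deadline.
    # Latencies are assumed round-trip figures, purely for illustration.
    TIERS = [
        ("on-vehicle fog node", 0.0005),  # sub-millisecond: collision avoidance
        ("roadside fog node", 0.010),     # ~10 ms: local traffic coordination
        ("regional cloud", 0.100),        # ~100 ms: fleet-wide analytics
    ]

    def place(workload: str, deadline_s: float) -> str:
        """Pick the most centralized tier that still meets the deadline,
        keeping edge capacity free for the most latency-critical work."""
        for tier, latency in reversed(TIERS):
            if latency <= deadline_s:
                return f"{workload} -> {tier} ({latency * 1000:.1f} ms)"
        return f"{workload} -> no tier meets {deadline_s * 1000:.1f} ms"

    print(place("emergency braking", 0.001))     # only the on-vehicle node fits
    print(place("intersection routing", 0.020))  # roadside node suffices
    print(place("trip-log analytics", 0.500))    # can run in the regional cloud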

The OpenFog Reference Architecture contains a medium- to high-level view of system architectures for fog nodes (smart, connected devices) and networks, deployment and hierarchy models, and use cases. It is part of a suite of technical documents under development by the OpenFog Consortium. Future documents will provide updated requirements and lower-level details, including formal, enumerated requirements that will form the basis of quantitative testbeds, certifications and the specified interoperability of fog elements.

The OpenFog Reference Architecture is based on eight core technical principles, termed pillars, which represent the key attributes that a system needs to encompass to be defined as "OpenFog". These pillars include security, scalability, openness, autonomy, RAS (reliability, availability, and serviceability), agility, hierarchy and programmability.
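
Those eight pillars read naturally as a design checklist. The toy Python sketch below - purely illustrative, since OpenFog defines no such API - encodes the pillars as an enumeration and reports which ones a candidate design has not yet addressed.

    # Toy checklist of the eight OpenFog pillars named in the article.
    # The enumeration and gap-report function are invented for illustration.
    from enum import Enum

    class Pillar(Enum):
        SECURITY = "security"
        SCALABILITY = "scalability"
        OPENNESS = "openness"
        AUTONOMY = "autonomy"
        RAS = "reliability, availability and serviceability"
        AGILITY = "agility"
        HIERARCHY = "hierarchy"
        PROGRAMMABILITY = "programmability"

    def gaps(satisfied: set) -> list:
        """List the pillars a candidate 'OpenFog' design still has to address."""
        return [p.value for p in Pillar if p not in satisfied]

    # Usage: a design that so far covers only two of the eight pillars.
    print(gaps({Pillar.SECURITY, Pillar.SCALABILITY}))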

To view the OpenFog Reference Architecture, you can visit the OpenFog website at http://www.openfogconsortium.org/ra.
Source: OpenFog
