
Primeur weekly 2015-04-13

Special

CloudSigma engages in having services built for end users to exploit data available in the public Cloud ...

Companies can benefit from four generic Cloud strategies to enhance their competitive advantage ...

The end of the Netherlands as supercomputer nation? ...

Exascale supercomputing

U.S. Department of Energy awards $200 million for next-generation supercomputer at its Argonne National Laboratory ...

The Cloud

NASA and IBM rally global developers around space exploration on the IBM Cloud ...

Oracle adds new data integration technologies to enterprise Big Data portfolio ...

IBM helps ecosystem of partners and clients build 'Internet of Things' solutions ...

EuroFlash

Codasip selects SuperTest for validation of application-specific C compilers ...

New CRC Press book provides global overview of high performance computing's industrial impact ...

RSC is the leading Russian supercomputer vendor in the local Top50 list ...

ISC 2015 is now open for advance registration ...

SHAPE Second Call awards 11 innovative projects ...

IBM and Barcelona Supercomputing Center - Centro Nacional de Supercomputación celebrate a decade of collaboration ...

A glass fiber that brings light to a standstill ...

Research Organization of Information and Systems signed MoUs with CSC, DSI, and EUDAT ...

New HPC cluster benefiting University of Oxford ...

USFlash

Chancellor helps OSC dedicate Ruby Cluster ...

Celebrating Blue Waters ...

Blue Waters to help researchers tackle ebola ...

Neuromorphic: Rensselaer Researchers To Design the Next-Generation Supercomputer ...

Cosmic debris: Study looks inside the universe's most powerful explosions ...

U.S. scientists celebrate the restart of the Large Hadron Collider ...

UMass Amherst Launches Center for Data Science in major expansion of Big Data research, education and collaboration with industry ...

IBM Research sets new record for tape storage ...

CloudSigma engages in having services built for end users to exploit data available in the public Cloud


13 Apr 2015 Brussels - At CloudScape VII, held in March 2015 in Brussels, Primeur Magazine had the opportunity to interview Robert Jenkins, CEO and co-founder of CloudSigma, on Cloud computing and Big Data. CloudSigma was founded in 2009 because its founders wanted to offer the same experience customers were used to from private Clouds, but in a public Cloud setting. CloudSigma is a public Infrastructure-as-a-Service provider.

CloudSigma provides a level of flexibility and control over the Cloud: the environment people deploy within CloudSigma's infrastructure is akin to having a dedicated or private infrastructure. As a result of that approach, CloudSigma has many very large customers and service providers. Such customers need a very dependable, customizable infrastructure as a bedrock from which they can build out services to their own customers and users.

At CloudScape VII, Robert Jenkins talked about CloudSigma's vision and activities around bringing Big Data and big public datasets into the public Cloud. Essentially, the idea is that the data is getting bigger and bigger, so CloudSigma needs to bring the applications to the data, rather than the data to the applications, which was the old way of doing things.

The public Cloud can be the environment where CloudSigma curates and hosts multiple large-scale public datasets and exposes a single copy of each dataset to multiple users. You no longer need to have your own copy of the data: you can compute against petabytes of storage without actually having to pay for any of that storage. The public Cloud is a really unique environment for doing that.
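
The "compute against a shared copy" idea can be made concrete with a small sketch. The Python snippet below, purely illustrative, streams a provider-hosted CSV over HTTPS and computes an aggregate on the fly, so no local copy of the dataset is ever stored; the URL and column name are hypothetical, not an actual CloudSigma endpoint.

```python
import csv
import io
import urllib.request

# Hypothetical URL of a provider-hosted copy of a large public dataset
URL = "https://datasets.example-cloud.net/land-movement/2015.csv"

with urllib.request.urlopen(URL) as response:
    # Wrap the HTTP stream so the CSV is parsed as it arrives,
    # without first writing the dataset to local disk
    rows = csv.DictReader(io.TextIOWrapper(response, encoding="utf-8"))
    max_shift_mm = max(float(row["displacement_mm"]) for row in rows)

print(f"Largest land movement in the shared dataset: {max_shift_mm} mm")
```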

As part of that, CloudSigma is also a member of the Helix Nebula consortium. Helix Nebula runs a Cloud marketplace that allows customers to come in and do their computing across multiple Clouds. CloudSigma is aiming to marry that marketplace with the back-end so that it can federate these datasets across multiple Cloud providers.

As an end user, you can come in and say that you want to run this computation or create this service on top of these datasets, set a filter, get a short list of providers, and then decide to deploy with CloudSigma or T-Systems or whoever it is, based on the cost and service offering of that particular provider.
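
As a rough illustration of that selection step, the sketch below filters a list of marketplace offers against a user's requirements and ranks the shortlist by price. The provider names, fields and prices are invented placeholders, not real Helix Nebula offers.

```python
from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    hosts_dataset: bool       # is the required dataset hosted locally at this provider?
    eu_hosted: bool           # does the data stay inside the EU?
    eur_per_core_hour: float  # placeholder price

offers = [
    Offer("provider-a", hosts_dataset=True, eu_hosted=True, eur_per_core_hour=0.018),
    Offer("provider-b", hosts_dataset=True, eu_hosted=True, eur_per_core_hour=0.015),
    Offer("provider-c", hosts_dataset=False, eu_hosted=False, eur_per_core_hour=0.012),
]

# Keep only the providers that pass the filter, then rank the shortlist by cost
shortlist = sorted(
    (o for o in offers if o.hosts_dataset and o.eu_hosted),
    key=lambda o: o.eur_per_core_hour,
)
for offer in shortlist:
    print(f"{offer.provider}: {offer.eur_per_core_hour} EUR per core-hour")
```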

CloudSigma is also working very closely with the European Space Agency. It already has datasets hosted in its Cloud that are available to end users, and the company is building out use cases with key marquee customers in the commercial sector.

One of the examples Robert Jenkins gave in his presentation was that of a local authority in Rome working with the local university. They took the data and married it to a digital model of the buildings, so they could map land movement, overlay it on the actual buildings and profile risk. If they saw that a building or an area had moved more than a certain amount, then, knowing the age and type of that building, they could infer a risk of structural damage.

By doing that, they were able to filter across an extremely large area instantly and immediately highlight to the local authority which buildings could be at risk. As a result, they did find buildings with actual structural damage. They were able to pinpoint, down to an exact joint in the corner of a particular building in a particular street, that the building had structural damage and was at risk, so they could ask for it to be inspected.
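
A minimal sketch of that kind of threshold-based risk inference is given below; the thresholds, weights and building attributes are invented placeholders, not the values used in the Rome project.

```python
from dataclasses import dataclass

@dataclass
class Building:
    address: str
    year_built: int
    construction: str            # e.g. "masonry" or "reinforced_concrete"
    movement_mm_per_year: float  # land movement taken from the satellite-derived dataset

def risk_class(b: Building, threshold_mm: float = 5.0) -> str:
    """Classify a building as low/medium/high risk from ground movement, age and type."""
    if b.movement_mm_per_year <= threshold_mm:
        return "low"              # ground essentially stable: no inspection needed
    score = b.movement_mm_per_year / threshold_mm
    if b.construction == "masonry":
        score *= 1.5              # unreinforced structures tolerate less movement
    if b.year_built < 1950:
        score *= 1.3              # older buildings are weighted as more vulnerable
    return "high" if score > 2.0 else "medium"

buildings = [
    Building("Via Esempio 1", 1932, "masonry", 9.2),
    Building("Via Esempio 2", 1998, "reinforced_concrete", 1.1),
]
for b in buildings:
    if (risk := risk_class(b)) != "low":
        print(f"Inspect {b.address}: {risk} risk")
```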

That may seem like a small thing, but consider the massive impact it has for a local authority building inspector roaming around the city trying to find faults: he no longer has to waste his time on the 99 percent of buildings where the ground has not moved and there is no problem. For the local council, this is a massive optimization.

Using similar data, you can do things like proactive pipeline monitoring for an oil company that has thousands of kilometres of pipeline running up into the Arctic and wants to know whether the permafrost or the ground has shifted. How do you monitor that? Well, the data is there in the datasets. You can extract that data, create a risk profile and say: if this land moves more than X, you react proactively and send the engineers out to adjust the pipeline, so that additional stress is avoided.
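
The same "react if the land moves more than X" rule can be sketched for the pipeline case: scan the measured ground displacement per pipeline segment and flag the kilometre marks that exceed the chosen threshold. The numbers below are placeholders for illustration only.

```python
THRESHOLD_MM = 20.0   # hypothetical "land moves more than X" trigger

# (kilometre mark, measured ground displacement in mm), as it might be
# extracted from the hosted dataset covering the pipeline corridor
segments = [(0.0, 3.2), (250.0, 4.1), (500.0, 27.5), (750.0, 6.0), (1000.0, 31.8)]

for km, shift_mm in segments:
    if shift_mm > THRESHOLD_MM:
        print(f"Send engineers to km {km}: ground moved {shift_mm} mm (> {THRESHOLD_MM} mm)")
```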

This has a positive environmental impact, because it avoids very damaging pipeline leaks, and it also has a positive commercial impact for the oil company, which saves money while avoiding those leaks.

There are really a lot of different services that can be built on top of the data. CloudSigma is democratizing access to this data and wants engagement from developers and from the commercial sector to come in and build services for end users that sit on top of it. A consumer can go to a mobile app, enter a building and ask: please tell me the risk profile of this building. Maybe the consumer is buying a house and it would be nice to know this information.

These services can be built with the data that is available today in these Clouds. This is what CloudSigma is engaging in in 2015: the exploitation of the data, getting it out there and building an ecosystem around that.

Ad Emmen
