CloudSigma provides a level of flexibility and control over the Cloud: the environment that people deploy within CloudSigma's infrastructure is akin to having a dedicated or private infrastructure. As a result of that approach, CloudSigma has many very large customers and service providers, people who need a very dependable, customizable infrastructure as a bedrock from which they can build out a service to their own customers and users.
At CloudScape VII, Robert Jenkins talked about CloudSigma's vision and activities around bringing Big Data and big public datasets into the public Cloud. Essentially, the idea is that the data is getting bigger and bigger, so CloudSigma needs to bring the applications to the data rather than the data to the applications, which was the old way of doing things.
The public Cloud can be the environment where CloudSigma curates and hosts multiple large-scale public datasets and then exposes those single copies of the data to multiple users. You no longer need your own copy of the data: you can compute against petabytes of storage without actually having to pay for any of that storage. The public Cloud is a really unique environment for doing that.
As part of that, CloudSigma is also a member of the Helix Nebula consortium. Helix Nebula has a Cloud marketplace that allows customers to come in and do their computing across multiple Clouds. CloudSigma is aiming to marry that with the back end so that it can federate these datasets across multiple Cloud providers.
You as an end user can come in and say that you want to do some computing or create a service on top of these datasets, apply a filter to get a shortlist of providers, and then decide to deploy with CloudSigma or T-Systems or whoever it is, based on the cost and service offering of that particular provider.
CloudSigma is also working very closely with the European Space Agency. It already has datasets hosted in its Cloud that are available to end users, and the company is building out use cases with key marquee customers in the commercial sector.
One of the examples Robert Jenkins gave in his presentation was that of a local authority in Rome working with a local university. They took satellite data and married it to a digital map of the buildings. They were able to map land movement, overlay it with the actual buildings, and thereby profile risk: if they saw that a building or an area had moved more than a certain amount, then, knowing the age and type of that building, they could infer a risk of structural damage.
By doing that, they were able to filter across an extremely large area instantly and immediately highlight to the local authority which buildings could be at risk. As a result, they did in fact find buildings with actual structural damage. They were able to pinpoint an exact joint in the corner of a particular building on a particular street where the building was damaged and at risk, so they could ask for it to be checked.
That can seem like a small thing, but imagine the massive impact it has on a local authority building inspector roaming around the city trying to find faults: he no longer has to waste his time on the 99 percent of buildings where the ground hasn't moved and there is no problem. For the local council, it is a massive optimization.
Using similar data, you can do things like proactive pipeline monitoring for an oil company that has a pipeline running up into the Arctic and wants to know whether the permafrost has shifted, across thousands of kilometers of pipeline. How do you monitor that? Well, the data is there in the datasets. You can extract that data, create a risk profile and say: if this land moves more than X, react proactively and send the engineers out to adjust the pipeline so that you avoid additional stress.
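The threshold-based risk filtering described for both the buildings and the pipeline could be sketched roughly as follows. This is a minimal illustrative example, not CloudSigma's actual implementation: the asset structure, the displacement readings, and the age-adjusted threshold rule are all invented for the sake of the sketch.

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str               # a building or pipeline segment (hypothetical)
    displacement_mm: float  # measured ground movement at this location
    age_years: int

def at_risk(asset: Asset, base_threshold_mm: float = 20.0) -> bool:
    """Flag an asset whose ground displacement exceeds an age-adjusted threshold."""
    # Illustrative rule: halve the tolerance for structures older than 50 years.
    threshold = base_threshold_mm / 2 if asset.age_years > 50 else base_threshold_mm
    return asset.displacement_mm > threshold

# Hypothetical readings extracted from a land-movement dataset.
assets = [
    Asset("building A", displacement_mm=5.0, age_years=80),
    Asset("building B", displacement_mm=25.0, age_years=10),
    Asset("pipeline segment 7", displacement_mm=12.0, age_years=60),
]

# Filter the whole area at once, keeping only the assets that need inspection.
flagged = [a.name for a in assets if at_risk(a)]
print(flagged)  # → ['building B', 'pipeline segment 7']
```

The point of the sketch is the filtering step: instead of inspecting every asset, the inspector or engineer only visits the handful the threshold flags.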
This has a positive environmental impact, since leaks from pipelines are very damaging, and it is also commercially positive for the oil company: it saves money and avoids leaks at the same time.
There are really a lot of different services that can be built on top of the data. CloudSigma is democratizing access to this data, and it wants engagement from developers and from the commercial sector to come in and build services for end users on top of it. A consumer could open a mobile app, enter their building and ask: please tell me the risk profile of this building. Maybe the consumer is buying a house, and it would be nice to know this information.
These services can be built with the data that is available today in these Clouds. This is what CloudSigma is engaging in during 2015: the exploitation of the data, getting it out there and building an ecosystem around it.