EU tenders for Open Data platform
The EU has just issued a tender document for what is likely to be the world's biggest Open Data project.
The European Commission has launched a formal tender for SMART 2014/1072 - Deployment of an EU Open Data core platform: implementation of the pan-European Open Data Portal and related services. This promises to be the world's biggest open data project to date, eclipsing the UK's Data.Gov.UK and the US Data.Gov, which currently deliver approximately 15,000 and 111,000 data sets respectively.
This is not just about making data available for third party developers and those with the knowledge and skills to extract, sort, curate and import data into their own systems. The EU wants visualisation tools delivered as part of the project. This will make it possible for users of the European Open Data Portal to do their own analytics and create visualisations of the data on the platform itself.
This is likely to be a serious challenge for any vendor because of the underlying processing power required. The successful bidder will almost certainly come from one of the very large cloud providers who will be able to deliver the scalability of processing and storage capability required.
A challenging tender specification
The tender specification runs to 70 pages and provides some interesting insights into what the European Commission wants. As is to be expected, there is a big emphasis on the use of Open Source Software in order to prevent any risk of data lock-in by a particular vendor and to remove the issue of licence fees.
This will need to be carefully considered. Even in the open source community, getting enterprise-grade support means paying companies who are willing to keep the applications stable and provide a roadmap of new features. Given what is being asked for here, it is not unreasonable to expect that the volume of data, the visualisations and the challenge of a multi-lingual solution will exceed the capabilities of current open source software.
The tender specification makes it clear that any software chosen must have a minimum 10-year life, which means the Open Source solutions must have longevity. The advantage of this is that a lot of potential software candidates can be eliminated because there is not enough support for them or expectation of a long-term future. From a commercial perspective, whatever is chosen will change what is taught in universities across Europe, as businesses will want staff with the right skills to exploit the software.
Another challenge that is overlooked in the tender is that of data curation. The plan is for a pan-European solution through a portal which will not hold data but instead point to where that data is stored. An implementation challenge here is making sure that curation by governments is accurately reflected in the indexing and, more importantly, doesn't break any previous searches or analytics carried out through the portal.
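To make the curation problem concrete, here is a minimal sketch of the harvesting model the tender describes: the portal holds only metadata records pointing at data that stays on national portals. All names, fields and URLs below are hypothetical, loosely modelled on catalogue metadata of the kind the Commission already maintains; the sketch simply flags when a re-curated record silently changes fields that earlier searches may have relied on.

```python
def harvest(catalogue, record):
    """Add or update a metadata record, keyed by a stable identifier.

    The catalogue stores metadata only; the data itself remains at the
    accessURL on the national portal. Returns warnings when an update
    changes fields that previous searches or analytics may depend on.
    """
    warnings = []
    old = catalogue.get(record["identifier"])
    if old:
        for field in ("title", "accessURL"):
            if old[field] != record[field]:
                warnings.append(f"{record['identifier']}: {field} changed")
    catalogue[record["identifier"]] = record
    return warnings


catalogue = {}

# First harvest from a (hypothetical) national portal.
harvest(catalogue, {
    "identifier": "uk-road-safety-2014",                # hypothetical ID
    "title": "Road Safety Data 2014",
    "accessURL": "https://data.gov.uk/road-safety",     # data stays at source
})

# A later harvest after the national portal has re-curated the dataset:
# the title change is detected rather than silently overwritten.
warnings = harvest(catalogue, {
    "identifier": "uk-road-safety-2014",
    "title": "Road Safety (Recurated) 2014",
    "accessURL": "https://data.gov.uk/road-safety",
})
```

In a real system the warning list would feed a review step rather than block the update, so national curation decisions are reflected in the index without breaking existing queries unannounced.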
Visualisation of the data is central to this project and the tender document makes it clear that this is something to be done in the portal. This will require a lot of expertise in planning bandwidth and service levels as data will need to be brought into the visualisation engine from across the continent and across a wide range of communication networks.
While there are a lot of other areas where potential bidders are going to have concerns this is not a completely new project. The European Commission has been bringing data sets together for decades and has a well established metadata repository that will help to resolve a large number of the data challenges. There is still more to be done and this is outlined in the tender specification.
Security, scalability and reliability are high on the list of demands in the tender specification. These will, by their nature, mean that many of the smaller European cloud providers are unable to meet the requirements of this document. However, they will also stress the larger bidders: the risk of cyber attack, such as the denial of service attacks explicitly called out in the tender specification, will be high, as will the risk of deliberate data corruption.
Why is the European Commission doing this?
Getting government-owned data into the hands of companies who can create new businesses has been a growing movement for some time. The US and the UK have led the way by providing access to large numbers of data sets through their own Open Data initiatives.
One of the benefits of making this data available is that it has created a huge surge in new technology start-ups that are using advanced analytics to develop new applications and services around the data.
It is not just start-ups that are interested. Academics and researchers who previously would have struggled to get access to this data now have more data at their fingertips than ever. To make even more data available, the UK government recently clarified copyright law in order to place greater emphasis on non-commercial access to data.
There is also a growing market for large commercial companies such as IBM wanting access to the data. IBM has announced that it is currently negotiating access to Data.Gov.UK in order to add all the datasets to its Watson Content Store where it will be available to SoftLayer customers on demand. It is widely expected that IBM will pay a licence fee for access to the data in order to compensate the UK taxpayer who has already paid to collect the data.
The European Commission wants to emulate the success of the US and UK governments. While many European countries and the European Commission itself make some data available through open data mechanisms, there has not been the same surge in use as seen in the UK and US.
The European Commission believes that this is because much of the data gathered is only used within the country where it is generated. What it wants to do is to encourage companies to look at the data from a pan-European perspective and emulate the start-up success of the US and UK in the tech markets.
This is an exciting move by the European Commission, although it could be argued it comes a few years too late. Part of the reason for this is the inward focus around the fiscal challenges the EU has been struggling with, but part is also around politics. There is currently no EU regulation that forces countries to participate in this project and it will be interesting to see which countries are willing to take the lead during the development phase.