Tag Archives: data

The Shakespeare Review of Public Sector Information


Time to reflect on an interesting morning, the morning of 15 May at Policy Exchange in London where Stephan Shakespeare, Chair of the Data Strategy Board and CEO of YouGov, launched The Shakespeare Review of Public Sector Information, an independent review into how the public sector can open up and make better use of data.

Connected Liverpool was present, and witnessed the first ‘big step’ towards a nationwide big data strategy. Besides Shakespeare, Rohan Silva (Senior Adviser to the Prime Minister), Professor Sir Allan Bradley (Founder & Chief Scientific Officer, Kymab), Joe Cohen (Founder and Chairman, Seatwave), John Gibson (Senior Advisor, Number 10), Jonathan Raper (Founder and Director, Placr) and Mary Turner (CEO, AlertMe.com) joined a panel discussion on how the UK can win the next phase of digital evolution. Overall, an impressive bunch we would not mind having lunch with.

Picking up on ‘the next phase of digital evolution’, what do they really mean? In short, Phase 1 of the Digital Evolution was about connectivity: bringing together people, organisations and businesses in new ways that increased communication, the channels to information, and the efficiency of operations. Clearly, this refers to the market that America dominated with its innovative and entrepreneurial culture. Google, eBay, Facebook, Amazon, PayPal, Yahoo, Microsoft, Twitter and of course Apple are all examples of big winners that dominated the first phase of this evolution: companies that ‘shaped our lives’ (to some extent), nearly all of them based on the West Coast of America.

But, as Stephan Shakespeare described it, it is now time for the UK to step up and lead Phase 2 of the Digital Evolution, a phase whose potential lies in the capacity to process and learn from data. Data allows us to adapt and improve public services and businesses and enhance our whole way of life, bringing economic growth, wide-ranging social benefits and improvements in how government operates and makes judgements.

Public Sector Information (PSI) provides the very foundation of this. Britain enjoys significant advantages in becoming a winner in this space because of the size and coherence of our public sector (think of the size and data of the NHS), combined with the government’s strong commitment to developing a visionary open data policy.

So why bother? The bottom line of all of this is economic growth. Mastering this digital phase will launch Britain from a low-growth economy to a high-growth nation. Public Sector Information is key to achieving this, but its potential success is interlinked with the important role of our government in creating the infrastructure that enables it. As the Independent Review states, “Consider the role of government: it exists to decide the rules by which people can act, and to administer them: how much, by what method, and from whom to take resources; and how to re-allocate them. Doing it well enables national success; doing it badly means national failure. Ensuring that the process of government is optimised for progress, and does not corrupt into an obstacle to progress, requires continuous data and the continuous analysis of data.”

As Sir Terry Leahy once stated: “to run an enterprise without data is like driving by night with no headlights”. Additionally, Rohan Silva (Senior Adviser to the Prime Minister) explained that this is what government often does: “It has a strong institutional tendency to proceed by hunch, or prejudice, or by the easy option.” In other words, the new world of data is good for government, good for business, and above all good for citizens, as we can use data on education and health, tax and spending, and work and productivity to make informed decisions and, consequently, optimise our quality of life and economic growth.

So what did Stephan Shakespeare actually recommend? Here we go:

  1. Recognise in all we do that PSI, and the raw data that creates it, was derived from citizens, by their own authority, was paid for by them, and is therefore owned by them. It is not owned by employees of the government.
  2. Have a clear, visible, auditable plan for publishing data as quickly as possible, defined both by bottom-up market demand and by top-down strategic thinking, overcoming institutional and technical obstacles with a twin-track process which combines speed to market with improvement of quality: 1) an ‘early even if imperfect’ track that is very broad and very aggressively driven, and 2) a ‘National Core Reference Data’ high-quality track which begins immediately but narrowly.
  3. Drive the implementation of the plan through a single channel more clearly-defined than the current multiplicity of boards, committees and organisations that are distributed both within and beyond departments and wider public sector bodies. It should be highly visible and accessible to influence from the data-community through open feedback mechanisms.
  4. Invest in building capability for this new infrastructure. It is not enough to gather and publish data; it must be made useful. We lack data scientists both within and outside of government, and not enough is being done in our education system at school and undergraduate level to foster statistical competence.
  5. Ensure public trust in the confidentiality of individual case data without slowing the pace of maximising its economic and social value. Privacy is of the utmost importance, and so is citizen benefit.

These principles ought to be adopted by the government when it starts to create a detailed nationwide Data Strategy. Even though no time frame was given for producing this strategy, Stephan Shakespeare’s recommended principles seem both straightforward and essential. It is now up to the government to push this agenda forward, and to do it quickly, so that Britain’s opportunity to become a ‘first-mover’ is not wasted…


Japan radiation monitoring goes crowd, open source

A new open, crowd-sourced initiative to deploy more Geiger counters all over Japan looks to be happening. Safecast, formerly RDTN.org, recently met and exceeded its $33,000 Kickstarter fund-raising goal, which should help Safecast send between 100 and 600 Geiger counters to the catastrophe-struck country.

The data captured from the Geiger counters will be fed into Safecast.org, which aggregates radiation readings from government, nonprofit and other sources, as well as into Pachube, a global open-source network of sensors. Safecast is one of the larger crowd-sourced monitoring efforts, not unlike a similar effort in the United States that predated the Japanese disaster.
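To make the data flow concrete, here is a minimal, hypothetical sketch of how a single Geiger-counter reading might be shaped before uploading it to an aggregator such as Safecast or Pachube. The field names and the CPM-to-dose conversion factor are assumptions for illustration, not the real services’ APIs.

```python
import json
from datetime import datetime, timezone

# Assumed tube-specific conversion from counts-per-minute to microsieverts/hour
# (a figure in this ballpark is often quoted for SBM-20 tubes; hypothetical here).
CPM_TO_USV_H = 0.0057

def make_reading(counts_per_minute, lat, lon, captured_at=None):
    """Turn a raw CPM count into a geolocated, timestamped JSON record."""
    captured_at = captured_at or datetime.now(timezone.utc).isoformat()
    return json.dumps({
        "unit": "uSv/h",
        "value": round(counts_per_minute * CPM_TO_USV_H, 4),
        "latitude": lat,
        "longitude": lon,
        "captured_at": captured_at,
    })

# Example: 120 CPM measured near Tokyo
print(make_reading(120, 35.68, 139.69, "2011-06-01T00:00:00+00:00"))
```

The point of the sketch is that aggregation only works if every volunteer device reports in a common unit with location and time attached; the aggregator can then map and compare readings from any source.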

For the last month, the Safecast crew and volunteers have been collaborating with universities in Japan, driving their Geiger counters around the country and taking measurements. Safecast’s early monitoring trips north of Tokyo returned some disturbing findings, including elevated radiation levels in a kindergarten classroom.

Safecast link: http://blog.safecast.org/

Government surveillance using Google data

Google’s Transparency Report showed that government surveillance of online lives is rising rapidly. In the first six months of this year, governments from around the world made 21,000 requests for access to Google data.
Google has been publishing a “Transparency Report” twice a year since 2009 and has observed a sharp rise in government demands for data. Top of the list was the US government, which demanded data 7,969 times in the first six months of this year. As the table below shows, India came second and Brazil third.

REQUESTS FOR USERS’ DATA

(January to June 2012)

  • United States – 7,969
  • India – 2,319
  • Brazil – 1,566
  • France – 1,546
  • Germany – 1,533
  • UK – 1,425

Turkey, meanwhile, topped the list for requests to remove content. According to Google, these requests often reflect the laws on the ground. In Brazil, for instance, there were many requests to remove content during elections, as a law there bans parodies of candidates. The top three reasons for content removal were defamation, privacy and security.

REQUESTS FOR TAKE-DOWNS

(January to June 2012)

  • Turkey – 501
  • United States – 273
  • Germany – 247
  • Brazil – 191
  • UK – 97

Worldwide, authorities made 1,789 requests for Google to remove content, up from 1,048 requests in the last six months of 2011. Turkey submitted 148 requests for the removal of data related to the first president of the country, the current government, and national identity and values. Other requests involved claims of pornography, hate speech and copyright.
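The scale of that jump is worth checking with quick arithmetic, using only the two figures quoted above:

```python
# Quick check of the reported rise in removal requests
# (figures as quoted from Google's Transparency Report above).
prev_half, this_half = 1048, 1789        # H2 2011 vs H1 2012
increase = this_half - prev_half
pct_rise = 100 * increase / prev_half
print(f"{increase} more requests, a {pct_rise:.0f}% rise")
```

In other words, removal requests rose by roughly 70% in a single six-month period.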

Google has its own criteria for whether it will remove content – the request must be specific, relate to a specific web address and have come from a relevant authority.

With its Transparency Report, Google aims to share how governments interact with online services and how laws are reflected in online behaviour.


DataKind in New York

Do you love data? So much that it could keep you busy day and night? Then this might interest you…

From 7 – 9 September, the School of Visual Arts in Manhattan, New York City, will host an exciting event for data lovers: “DataDive”. DataDive will allow civic hackers to work directly with NYC government agencies and open data to create innovative and exciting new projects. There will also be an opportunity to speak with representatives from City government at an evening kick-off event. Meals will be provided throughout the weekend and the whole event is free!

Wanna join? (We think you should.) To register and check out a full schedule of the event(s) visit: http://www.eventbrite.com/event/4028174378


Smart technologies in smart infrastructure create smart buildings

To become a smart city, you need smart technologies. Sensors and other meters are needed to collect and analyze data. Smart cities need to be able to check buildings, bridges, sea defenses and road and railway cuttings at the touch of a button. Engineers at Cambridge University are now developing technologies that will allow the condition of this infrastructure to be monitored in unprecedented detail.

All infrastructure, old and new, needs to be kept under constant surveillance, and this can be done with new technologies using wireless sensors and fiber optics. Strain, temperature, displacement, humidity or even a crack in a wall can be monitored. Researchers at the University are developing these so-called smart technologies and hope to bring them to market by 2016. The University of Cambridge is working with industry and technology companies on this project.

There are already a lot of sensor technologies, but they aren’t yet used routinely in infrastructure. Cost is also an issue at this point: constantly monitoring and maintaining all of a city’s infrastructure costs billions of pounds every year, so even a small improvement in efficiency can result in major savings.

Professor Robert Mair is the principal investigator of the Centre for Smart Infrastructure and Construction (CSIC), an innovation and knowledge centre involving the University’s Department of Engineering, Department of Architecture, Computer Laboratory and Judge Business School. Mair said that the smart technologies project they are now working on is hugely exciting and important.

Because most of the UK’s infrastructure is more than 100 years old, infrastructure owners feel the need to be involved in the emerging technologies in sensors and data management. They can use these technologies to quantify and define the extent of ageing and the consequent remaining design life of their infrastructure.

So these new technologies have a lot of advantages for old infrastructure, but they are also of use for new builds, since they can make construction more efficient and economical. Engineers will be able to better understand how infrastructure performs during and after construction, leading to more informed decision-making and an improved performance-based design and construction process.

One of the key objectives of the research at Cambridge is to remove the need for batteries in sensors. One CSIC project is looking at micro-electro-mechanical systems, in which miniature devices and circuitry can be etched onto a silicon chip as part of the sensor. A very small turbine could then harness the wind power produced by passing trains in a tunnel, making the system entirely self-sufficient. The same technology could be used on bridges, for example by harvesting the vibrations of passing vehicles.

Another key research area for the Centre is optical-fiber monitoring. Cambridge engineers installed optical fibers around the inside of the old brick tunnel when a new tunnel was built beneath the century-old Thameslink tunnel in London. These fibers continuously measure the changing strain and temperature at every single point along the fiber. Previously, engineers had to use conventional survey techniques to analyze the impact of the new tunnel; now they can use this new optical fiber technology to measure strain directly and continuously.
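What continuous, point-by-point strain data enables is simple automated checks that survey teams could never do by hand. The following is a toy sketch of that idea: comparing the latest readings along a fiber against a construction-time baseline and flagging points that have drifted too far. All names, values and the threshold are invented for illustration, not taken from CSIC’s actual systems.

```python
# Toy sketch: flag points along a distributed optical fiber whose strain
# has drifted from a construction-time baseline beyond a set threshold.
# Values are in microstrain and are invented for illustration.

def flag_strain_anomalies(baseline, current, threshold_microstrain=50.0):
    """Return (point_index, drift) for each fiber point over the threshold."""
    anomalies = []
    for i, (base, now) in enumerate(zip(baseline, current)):
        drift = now - base
        if abs(drift) > threshold_microstrain:
            anomalies.append((i, drift))
    return anomalies

baseline = [10.0, 12.0, 11.0, 9.0, 10.5]    # strain recorded at survey time
current  = [12.0, 80.0, 11.5, 9.2, -55.0]   # latest continuous reading
print(flag_strain_anomalies(baseline, current))
# points 1 and 4 have drifted by +68.0 and -65.5 microstrain
```

A real deployment would run this kind of comparison continuously over thousands of points and feed flagged locations to engineers, which is exactly the “touch of a button” condition-checking the article describes.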

In the future, incorporating optical fibers and sensors during the construction process will enable an unprecedented level of ‘cradle to grave’ analysis of how our infrastructure actually performs. Many components in buildings and structures are currently over-engineered to guarantee safety; better monitoring would allow construction firms to make more accurate judgments about how much material to use. It should also become easier for firms to insert sensors and optical fibers into walls, facades and beams by adding them to components in the factory before they reach the building site. This is what can be called ‘smart’ building.

£17m has been granted to CSIC to conduct the research: £10m from the Engineering and Physical Sciences Research Council and the Technology Strategy Board, and another £7m from industry collaborators. Professor Mair hopes that the Centre will have advanced the technology and the business cases sufficiently by 2016, and that it will then be able to support itself through industry collaboration alone.