Government 2.0 Taskforce » Open Access (design by Ben Crothers of Catch Media)

And the Mashie Goes To… [drum roll]
Mon, 14 Dec 2009 02:24:47 +0000 | Mia Garlick

It gives us great pleasure to announce the winners of the MashupAustralia contest.

In case you have missed it, here is some background about the contest – from launch and initial response to a final wrap-up.

We’ve also tried to follow the conversation that you have been having elsewhere about the contest. Most recently, we came across this interesting four-part discussion on the All Things Spatial blog about some of the contest entries (see Part 1, Part 2, Part 3 and Part 4).

All that remains is to reveal who will go home with the coveted Mashie (pictured; yes, the trophy is a potato masher) and, of course, the prize money. Our esteemed judging panel has deliberated and considered all of the entries against the rules. As we indicated might happen, more than one prize per category has been awarded because there were so many high-quality entries.

For Excellence in Mashing, the Mashies go to:

  • Suburban Trends, a mashup of different types of crime and census data that allows you to compare and contrast suburbs by a range of economic, education, safety and socio-economic indicators. The judges thought the ability to compare suburbs visually, combined with the selective choice of statistics, was excellent, especially in a field dominated by many entries using similar datasets.
  • Know Where You Live, which bills itself as a prototype mashup of a range of open access government data, organised by postcode so that you can truly know where you live. The judges loved the very citizen-centric “common questions” user experience of this app and the groovy, and again selective, repackaging of what could otherwise be considered (we’ll be honest here) slightly boring data. The integration of publicly-held historical photographs and rental price data was a nice touch, as was the use of Google’s satellite images in the header. Judges were disappointed that some of the data for states other than NSW wasn’t available for inclusion. The app’s support for only the most modern, standards-compliant browsers was not seen as detrimental to this mashup.

The Highly Commendable Mashups were:

  • geo2gov, which serves as an excellent example of what is possible with open government data. This entry provided an online service that takes a location description in a wide range of formats and maps that location to the relevant parts of government. Its utility was demonstrated by the fact that several other entries used geo2gov. Contest judge Mark Pesce said that this app was such an impressive prototype of what was possible with government data that it made his geeky pants wet.
  • Firemash, a timely entry that analyses notices from the New South Wales Rural Fire Service and sends you a tweet if you are at risk. The judges were particularly impressed with this entry’s use of different services and its real-time web goodness. The ability for citizens to submit fire information and be notified of nearby fires was unique. The dual-purpose possibilities of this site – citizen to Government and Government to citizen – made it one of the few submitted mashups to explore data in this way.

And the Notable Mashing Achievements were:

  • In Their Honour, which brings together service records, maps and photographs for each of the service men and women who have died for Australia. The judging panel felt that although the Australian War Memorial already provides a similar service, this entry explored the data in a noticeably different way, creating the opportunity for a different kind of engagement with the same datasets.
  • LobbyLens, which shows connections relevant to government business (e.g., government suppliers, government agencies, politician responsibilities, lobbyists, etc.). For the judging panel, the relationship visualizations that this entry provided aligned well with much of the purpose of Government 2.0, even if its usability needs a lot of work.
  • FlipExplorer, which combines an interactive online search interface, a 3D tag cloud and a timeline widget, allowing you to browse through the Powerhouse Museum’s collection as you would a physical book. Although not truly a mashup of more than one data source, the judges felt that this was an impressive use of a visual interface.

For the People’s Choice Award, once we adjusted for the malicious voting up and voting down (shame on you who partook), the clear winner was In Their Honour — which is consistent with the judges’ thoughts on its usability. As commenter Nerida Deane said: “I just looked up my Great Uncle Al and found the site easy to use and I liked the information it gave me. Maybe one day I’ll have a chance to visit his memorial.”

The Student Prize goes to Suburban Trends (obviously) and to Suburban Matchmaker, which the judges felt was a clever idea (albeit potentially raising some interesting questions for future ethics classes). Because rewarding and encouraging our students can never be a bad thing, the judges also agreed to award both Earth:Australia and Community Rivers each a partial student prize for a commendable effort in student mashing.

Finally, in the Transformation Challenge – for entries that enhance datasets and/or make them available for programmatic re-use – the bonus prizes are awarded to geo2gov (see above), Neogopher (judges’ comment: “this provides a pretty comprehensive set of transformations and API access to many of the datasets in one place”) and absxml (judges’ comment: “a nice conversion idea that needs a bit of work to make it more usable”).

And, as part of any awards ceremony, some thank-yous are required. A repeated big thank you to all of those involved in organizing the hackfests, to those who participated in the events, and to everyone who submitted entries or provided comments and feedback. Many thanks are due to our esteemed judging panel for their time and attention to all 82 entries. Thanks also to the Federal, State and Territory government agencies who provided datasets for the contest. And to the teams on the Taskforce Secretariat at the Australian Government Information Management Office and at the Department of Broadband, Communications and the Digital Economy who provided the project support. This has been a collaboration in every sense of the word, and hopefully it has demonstrated its purpose: to show what is possible when agencies liberate their data…

[cue the music to cut the presenter short and get them off the stage]

Australia, You Have Been Mashed
Wed, 18 Nov 2009 05:32:56 +0000 | Mia Garlick
OpenAustralia Hackfest, Halans CC BY-NC-SA


Last Friday, November 13, 2009, saw the close of the entry period for our MashupAustralia contest. While our esteemed judging panel is now hard at work assessing the entries, it’s timely to pause and consider how it has gone so far… in a word: wow! The response has been fantastic.

As you may recall, the contest was designed to provide a practical demonstration of the benefits that open access to public sector information can provide. We asked you – the community – to help us with this. We released some datasets on terms and in formats that enable reuse and asked you to help us show the benefit that can result. And show us you did.

We have had 81 entries — a huge result that positions this contest on par with similar contests held in other jurisdictions (or possibly even ahead, if you pro rata entries per head of population ☺). The entries are diverse in their focus – from Australia’s world heritage listed areas, to a Darwin bus map, to “Know Where You Live” (a visualization of Australian Government data based on your geographic location, with accompanying relevant photos).

Without doubt, considerable momentum for the contest was generated by the hackfests that were organized to get people together, sharing skills and ideas and building things. One of these – GovHack (see this report back) – was supported by the Taskforce but four others – the GoogleHackNights #1 and #2, the Melhack and the OpenAustralia Google Hackfest (see this report back) – were self-organised. All of them were huge successes.

All in all, I think it’s fair to say that you have definitely helped us demonstrate the innovative potential that can be unlocked when government information itself is unlocked, both via the hackfests and via the blog posts explaining how you created your mashups (see, e.g., “Building mashups for the society (Mashup Australia)” and “In Their Honour – Mapping Anzac Graves”). You have also helped us better understand how government data can be improved with your feedback about your experiences in trying to use the data (thanks, for example, to pamelafox and Jo Decker).

A big thank you to all of those involved in organizing the hackfests, to those who participated in the events and everyone who submitted entries or provided comments and feedback.

Don’t forget, we are keeping public voting open until 4pm this Friday, November 20th. And don’t worry, geo2gov: we’re onto those attempts at vote rigging.

Stay tuned to find out who the Mashies go to…

Data data everywhere but not a scrap of sense
Mon, 16 Nov 2009 02:04:21 +0000 | Pip Marlow

It was exhilarating to see the enthusiasm around the GovHack event as hordes of developers enjoyed pulling together datasets in new and innovative ways. The resulting information combinations promise to give enthralled users not only access to data, but also insight from it.
It was also heartening to see Pamela Fox provide some best-practice tips for developers and data owners in her post stressing the value of structure and standardisation where possible. But I was reminded yesterday, in a discussion about social software, how much of our total information is now in an unstructured format, where the value lies in the ability to understand the context and meaning of data and its relationships to other information – relationships that are not captured in a nice neat way.
This became apparent at the Public Sphere event that Pia Waugh championed earlier this year, where everyone struggled to consolidate the extremely valuable – but vast and unmanageable – variety of input in all sorts of different forms: oral, written, blog posts, tweets, videos… and many more.
The team did a great job of pulling together a useful summary and set of recommendations, but I was left thinking that the increasing torrent of data is leading to diminishing returns: first, as individuals try to monitor the real-time fire hose of information, and second, as they pause to reflect, analyse, and try to derive value from a range of inputs.
So, what am I saying here? Basically that the agenda of Gov 2.0, and of the whole project of providing transparency and openness in government data, cannot be met unless we deal with the challenge of finding the “jewels”, the “gems”, in the unstructured data itself. Given that we have a range of companies working with us on the Gov 2.0 project, and that we have recognised that the power of semantic technologies is going to play a big part in allowing us to address this issue, would it not be sensible and timely to integrate some of the processes that are already being developed into the way the Gov 2.0 Taskforce itself operates – the whole mantra of “eating our own dog food”? A radical thought, but perhaps one with some merit.

Whole of Government Information Publication Scheme
Mon, 09 Nov 2009 04:18:38 +0000 | Eric Wainwright

Eric Wainwright of eKnowledge Structures has been commissioned by the Taskforce to undertake Project 7 regarding a Whole of Government Information Publication Scheme.

Not a topic that has inspired much discussion so far! But here at eKnowledge Structures, Dagmar Parer and I have been wrestling with our brief under Taskforce Project 7.

The proposed new Freedom of Information legislation, together with the Bill establishing the Office of the Information Commissioner (OIC), is scheduled to come before Parliament by the end of 2009. If the Bills are passed, the Commissioner will have some fairly wide powers relating to Commonwealth information management. The Information Publication Scheme (IPS) will be mandatory for all Commonwealth Departments and agencies. Queensland has been at the forefront with such Schemes, basing its approach on the UK model. It has clearly influenced not only the new Commonwealth legislation but also the Government Information (Public Access) Act in NSW and the Right to Information Bill in Tasmania.

We are considering how these Schemes might be constructed and implemented in a way that actually assists the Government’s objectives of more pro-active and open disclosure of, and around, information held by Government agencies.

Some questions to kick off discussion are:

  • There is a risk that the IPS will be seen by agencies as just an additional compliance chore to add to their existing list – Annual Reports, Senate lists of files, etc. How can we minimise this risk?
  • Can the IPSs be implemented so that they act as a catalyst for more integrated agency information management planning and practices, and clearer information pathways for the public?
  • The Bill (section 8A) refers to ‘operational information’, which must be published, and allows that ‘the agency may publish other information held’. Is there a right balance between maximum pro-active disclosure under these clauses and the potential extra costs of publishing and maintaining little-used material on agency websites?
  • Can we use IPSs to advance the visibility, availability and utility of government data from a much wider range of agencies?
  • How best can the OIC create initial momentum for a positive roll-out of Schemes across government, and then assist agencies in the on-going plans required by the new Act?

Or any other comments!

GovHack: govt data + hackers + caffeine == good times
Thu, 05 Nov 2009 00:53:24 +0000 | John Allsopp

John Allsopp from Web Directions was an organiser of GovHack, an event sponsored by the Taskforce. It was held on the 30th and 31st of October 2009 to encourage greater use and availability of government data in support of the MashupAustralia contest.


For those who’ve not heard of them, the rather ominous-sounding “hack days” are events that have been gaining popularity with developers around the world. They bring together web-focussed designers, developers and other experts to build web applications and mashups in a 24-hour period.

As far as I’ve been able to determine, no government at any level, anywhere in the world, has been willing not only to open up its data for people to “hack” but to actually host a “hack day” to bring people together to do so.

At least not until last Friday and Saturday, when GovHack, an initiative of the Australian Government 2.0 Taskforce, was held at the ANU, in CSIRO’s ICT Lab and the ANU’s Computer Science department. Around 150 “hackers” (hacking, by the way, is a typically positive term among developers; it’s only in mainstream usage that it tends to have negative connotations) from all over Australia came together and built numerous incredibly sophisticated web applications and mashups – some, like the judges’ overall winner “LobbyClue”, by teams of people who’d never even met before the day.

GovHack kicked off with an hour or so of short, sharp presentations by members of the Government 2.0 Taskforce and the developer community, along with “data owners”, both government and commercial, spruiking their data wares to the assembled hacker community.

Teams then got down to business, exploring the growing number of government data sets available online, “speed dating” to find hackers in search of teams with skills they needed, and planning their hacks.

Throughout the night, teams coded away, fuelled by caffeine (and, it must be said, excellent food, fruit and juices) and camaraderie. Even well after midnight, a couple of dozen remained working, with a palpable buzz in the air, while Taskforce chair Dr Nicholas Gruen was still to be found discussing the merits of various sites and hacks at 2am. A dozen or so hardy souls even managed to hack all night.

Saturday morning saw new teams arrive, and the less hardy return from hotel rooms and home to restart their development. Senator Kate Lundy, now dubbed the “Patron Senator of Geeks”, spent quite some time interviewing various participants, with the video hopefully available soon. As the 4pm deadline loomed, frantic (geek-speak alert) XML-to-JSON conversions, JavaScript debugging and API reverse engineering were occurring throughout the CSIT Building on the ANU campus.
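Those last-minute XML-to-JSON conversions are usually only a few lines of code. Here is a minimal sketch in Python using only the standard library; the element names and sample feed are invented for illustration, not taken from any actual contest dataset:

```python
import json
import xml.etree.ElementTree as ET

# Hypothetical sample in the shape of a typical government XML feed.
SAMPLE_XML = """
<incidents>
  <incident id="1"><suburb>Parkes</suburb><type>Fire</type></incident>
  <incident id="2"><suburb>Acton</suburb><type>Flood</type></incident>
</incidents>
"""

def xml_to_json(xml_text: str) -> str:
    """Flatten each <incident> element into a dict and emit a JSON array."""
    root = ET.fromstring(xml_text)
    records = []
    for el in root.findall("incident"):
        record = dict(el.attrib)   # keep attributes such as id
        for child in el:           # child elements become keys
            record[child.tag] = child.text
        records.append(record)
    return json.dumps(records, indent=2)

print(xml_to_json(SAMPLE_XML))
```

Real feeds are rarely this flat, so a production converter would need to handle nesting and namespaces, but the shape of the job is the same.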

Just what was achieved for all the effort? Before turning to some of the genuinely outstanding projects, a few outcomes from the event illustrate the breadth of the achievements. A few teams found themselves in need of postcode to Local Government Authority conversions, but while the data was sort of available, it was far from easily usable. Stephen Lead from LPMA in NSW took the less than ideal data and transformed it into a far more usable format. Then Mark Mansour from Sensis created a database and API (an Application Programming Interface: a standardized way for applications to talk to one another) for the data, to make it much easier for anyone to use. Within an hour or two, two teams at GovHack were actually using this API. In a similar vein, Rob Manson from MOB created a single JSON API to many of the disparate data sources available for the contest. Meanwhile, the NSW State Government launched their new data catalogue to make sure the data was available for GovHack.
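At its core, the postcode-to-LGA service described above is a keyed lookup behind an API. As a toy illustration only (the mappings below are invented, not the actual Sensis data or API):

```python
# Hypothetical postcode -> Local Government Authority table. A real service
# would sit behind an HTTP API and handle postcodes spanning several LGAs.
POSTCODE_TO_LGA = {
    "2000": "City of Sydney",
    "3000": "City of Melbourne",
    "4000": "Brisbane City Council",
}

def lookup_lga(postcode: str):
    """Return the Local Government Authority for a postcode, or None."""
    return POSTCODE_TO_LGA.get(postcode)

print(lookup_lga("2000"))
```

The value of publishing such a service, as the GovHack teams found, is that every mashup can reuse one clean conversion instead of each wrangling the raw data itself.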

So what exactly did people build? In all, there were around 20 projects presented at the end of the 24 hours, almost all of which were conceived and built at the event itself. Many were geo/mapping focussed, but others focussed on data visualisation and exploration – in many people’s opinion, the next wave of web applications. The level of complexity, sophistication and novelty of many of the projects was extraordinary given the tight time constraints. Projects that you can actually use right now (keep in mind their alpha state) included:

  • The overall winner, LobbyClue, built by a team many of whose members had never met before the event. LobbyClue is an in-depth visualisation of lobbying groups’ relations to government agencies, including tenders awarded, links between the various agencies, and physical office locations.
  • Know where you live, a stylish presentation of ABS data (along with Flickr Geocoded photos), pulling in relevant information for a particular postcode: rental rates, average income, crime rates, and more. Built by a team of developers who work at News Digital Media.
  • What the Federal Government Does, an enormous tag cloud of the different functions of government, combined with visualisations of government functions shared between departments.
  • Rate A Loo demonstrates a community engagement idea, seeded with government provided data. Allows users to locate and then rate the condition of public toilets.
  • It’s buggered, mate, which, in true Australian style, allows you to report buggered toilets, roads, etc., with an easy-to-use graphical interface overlaid on a map. Their idea was to combine this with local government services to fix issues in the community. Built by a team of developers from Lonely Planet.
  • Many more fantastic projects can be found at the GovHack site.

A huge thanks to AGIMO and the Taskforce for enabling it all; to CSIRO’s ICT Lab and the ANU Computer Science Department for providing a venue and network facilities; and to Microsoft, whose Project Fund helped make GovHack a reality. Numerous volunteers from AGIMO and the web developer community helped ensure the success of GovHack, and a big thanks to them as well. And of course the participation of so many developers from all over the country ensured that the event produced lasting value. Hopefully more than a few of the 24-hour hacks will turn into applications we’ll be using for years to come. Above all, thanks to you.

Making Government Data More “Hack”able
Wed, 28 Oct 2009 04:56:18 +0000 | Pamela Fox

At Google, we think it’s pretty awesome that the government is holding a contest to mash government data. As a company with a lot of APIs, we love when people use them to make mashups, and as a company with a mission of making data universally accessible and useful, we love to see governments opening up their data. So we’ve arranged a couple of events in support of the contest. We held a 3-hour “MashupAustralia HackNight” on October 14th, we’re holding another one tonight, and we’re hosting the OpenAustralia HackFest from Nov 7-8.

At our first hack night, we started off with talks on the contest, mashups and APIs, and putting data on maps. Then, since we conveniently had a representative from the government data catalogue at the event, we took the opportunity to search through their database and find useful datasets. We found a couple of really good ones — the NSW Crime set and the Victoria Internet locations set — but we also found a lot of really hard to use sets. Since part of the goal of this contest is to figure out what characteristics define a useful dataset, and to encourage governments to adopt those, I thought I’d take this opportunity to give a few basic tips:

  • Format: It’s generally not a good idea to share data in a binary format; it is more compact, but less accessible to developers. The best format is an API (REST or XML-RPC) or, more simply, an RSS feed with all the entries. The next-best format is a well-structured CSV or spreadsheet, as many database systems can easily import those. If you are going to use a more obscure format, provide tips on how to use it. (This is something that the data catalogue site could also provide.)
  • Size: Some data sources provided zip files that were around 300 megabytes. Most developers aren’t going to download 300 megabytes without knowing what the data looks like and what makes up that size. If you are going to provide a large file, I suggest also providing a small preview file.
  • Geo data: The vast majority of the data sources relate to geographic regions or points, but most didn’t provide enough geographic data. If possible, you should provide the address and the latitude/longitude coordinates. If the data describes a region, provide an array of coordinates. A great example of this is the NSW fire feed – it provides an address, a point, and a polygon.
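To make the geo tip concrete, here is a hedged sketch of a record carrying an address, a point and a boundary polygon in GeoJSON-style structure, in the spirit of the NSW fire feed described above. The coordinates, address and field names are all illustrative, not taken from the actual feed:

```python
import json

# An invented incident published as a GeoJSON-style FeatureCollection:
# one Feature for the point (with the street address), one for the polygon.
fire_incident = {
    "type": "FeatureCollection",
    "features": [
        {
            "type": "Feature",
            "properties": {"address": "123 Example St, Katoomba NSW"},
            "geometry": {
                "type": "Point",
                # GeoJSON coordinate order is [longitude, latitude]
                "coordinates": [150.3120, -33.7125],
            },
        },
        {
            "type": "Feature",
            "properties": {"description": "affected area"},
            "geometry": {
                "type": "Polygon",
                "coordinates": [[
                    [150.30, -33.70], [150.32, -33.70],
                    [150.32, -33.72], [150.30, -33.72],
                    [150.30, -33.70],  # a polygon ring closes on itself
                ]],
            },
        },
    ],
}

print(json.dumps(fire_incident, indent=2))
```

A dataset published in this shape can be dropped straight onto a map by most mashup tools, which is exactly what made the fire feed so usable.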

These are simple suggestions, but they can make a world of difference in terms of making data useful. We hope to see more government agencies opening up their data for developers and evaluating how they’re doing so. But we also hope to see developers using the current data as much as possible, and coming up with more ideas. Please join us at one of our future events!

The Three Laws of Open Data
Tue, 20 Oct 2009 05:40:40 +0000 | David Eaves

David Eaves is a member of the Taskforce’s International Reference Group.

Over the past few years I have become increasingly involved in the movement for open government – and more specifically in advocating for Open Data: freely sharing with citizens the information that government collects and generates, so that they can analyze it, repurpose it and use it themselves. My interest in this space comes out of writing and work I’ve done around how technology, open systems and generational change will transform government. Earlier this year I began advising the Mayor and Council of the City of Vancouver, helping them pass the Open Motion (referred to by staff as Open3) and create Vancouver’s Open Data Portal, the first municipal open data portal in Canada. More recently, Australia’s Government 2.0 Taskforce has asked me to sit on its International Reference Group.

Obviously the open government movement is quite broad, but my recent work has pushed me to try to distill out the essence of the Open Data piece of this movement. What, ultimately, do we need, and what are we asking for? Consequently, while presenting at a panel discussion at the Conference for Parliamentarians: Transparency in the Digital Era, held for Right to Know Week and organized by the Canadian Government’s Office of the Information Commissioner, I shared my best effort to date at this distillation: three laws for Open Government Data.

The Three Laws of Open Government Data:

  1. If it can’t be spidered or indexed, it doesn’t exist
  2. If it isn’t available in open and machine readable format, it can’t engage
  3. If a legal framework doesn’t allow it to be repurposed, it doesn’t empower

To explain, (1) basically means: Can I find it? If Google (and/or other search engines) can’t find it, it essentially doesn’t exist for most citizens. So you’d better ensure that you are optimized to be crawled by all sorts of search engine spiders.

After I’ve found it, (2) notes that, to be useful, I need to be able to use (or play with) the data. Consequently, I need to be able to pull or download it in a useful format (e.g. an API, subscription feed, or a documented file). Citizens need data in a form that lets them mash it up with Google Maps or other data sets, or analyze it in Excel. This is essentially the difference between VanMaps (look, but don’t play) and the Vancouver Data Portal (look, take and play!). Citizens who can’t play with information are citizens who are disengaged/marginalized from the discussion.
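Law (2) in practice: a machine-readable file can be loaded and analysed in a couple of lines, whereas a “look but don’t play” web page cannot. A minimal Python sketch, with invented sample data standing in for a published CSV:

```python
import csv
import io

# Invented sample of the kind of machine-readable data law (2) asks for.
CSV_DATA = """suburb,crimes_2009
Parkes,12
Acton,7
Braddon,19
"""

# csv.DictReader turns each row into a dict keyed by the header row,
# which is all a citizen needs to start analysing or mashing up the data.
rows = list(csv.DictReader(io.StringIO(CSV_DATA)))
total = sum(int(row["crimes_2009"]) for row in rows)
print(total)
```

The same three rows locked inside a PDF table or an image would require scraping or retyping before any of this is possible; that gap is the whole point of the law.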

Finally, even if I can find it and use it, (3) highlights that I need a legal framework that allows me to share what I’ve created, to mobilize other citizens, provide a new service or just point out an interesting fact. This is the difference between Canada’s House of Parliament’s information (which, due to crown copyright, you can take and play with, but don’t you dare share or re-publish) and, say, a US federal government site which notes that “pursuant to federal law, government-produced materials appearing on this site are not copyright protected.”

Find, Use and Share. That’s what we want.

Of course, a brief scan of the internet reveals that others have been thinking about this as well. There is the excellent 8 Principles of Open Government Data, which are more detailed and, admittedly, better, especially for conversations at CIO level and below. But for talking to politicians (or Deputy Ministers or CEOs), like those in attendance at that panel discussion or, later that afternoon, the Speaker of the House, I found the simplicity of three resonated more strongly; it is a simpler list that they can remember and demand.

Mr Gruen goes to Washington
Fri, 02 Oct 2009 15:23:42 +0000 | Nicholas Gruen

Being flat out on other duties has prevented me from providing you with a ‘write up’ of my time in Washington D.C. at the Government 2.0 Summit. To picture yourself there, just picture Kate Lundy’s Sphere Camp and then soup it up: make the speakers the most energetic and powerful movers and shakers in the largest, most tech-savvy country in the world – bring on the CIO and CTO of the United States – don’t give them all that long to talk, so they have to be succinct (we’ve got so much other stuff we need to listen to), have the Twitter feed live on stage, and off you go.

So there you have it: a hall of hundreds of people from all around the world, most massively ‘multi-tasking’ away – looking, listening, emailing, twittering, reading the Twitter feed, meeting, setting up deals – and on it went. Very energising, I must say. I won’t give you a blow-by-blow description, but there were some terrific presentations. On the downside, I think those who have been great at thinking about Web 2.0 – like Clay Shirky and Tim O’Reilly – didn’t really excel in explicating the potential of Govt 2.0, but plenty of others who have been working in the area – like Beth Noveck, Vivek Kundra, Aneesh Chopra and Andrew McNaugten – did. You can watch virtually the whole thing online here.

But the most significant impression I left with was not only about the Summit itself, which was terrific, but also about the US, and us in comparison with it. As an Australian public policy practitioner, I’d say Australia is at the forefront of many if not most areas of public policy. We’ve been major innovators in public policy since Europeans got going on these shores (the First Fleet was a contracting-out job – and a successful one! Trouble set in with the Second Fleet, but I digress…). However, I’ve got bad news in this area. I’m afraid the US has such structural advantages that it’s way ahead of us. And other countries, like the UK, have been at it for longer, and they’re ahead too.

This paper by Ellen Miller of the Sunlight Foundation provides an inventory of some of the initiatives underway in the US, but also places them in historical perspective. What is striking is that virtually all the progress in government openness in the US over the years has not come directly from the government. It has been prompted and fought for by civil society. The insiders are being dragged along by the work of the outsiders.

Here’s a great example Miller gives. The Securities and Exchange Commission (SEC) maintains a database of the financial reports companies are required to file with it. This database (EDGAR) was always available… to the right kind of people… for a fee. The SEC resisted the idea that this information should be available on the internet. In the 90s, however, a public-domain advocate, Carl Malamud (with a little help from benefactors), purchased access to the data and put it online in an accessible format. The SEC was stunned by the site’s popularity and within two years had put EDGAR online itself – which it managed to do quickly with the eager assistance of Malamud.

At the same time he put EDGAR online, Malamud created an online database of US patents. He noticed high traffic to the site from the Patent Office – from patent examiners frustrated by their own database. He wasn’t just making government information more easily available to the public; he was making it available to the government itself! Eventually the Patent Office, like the SEC, relented and created its own database.

Public.Resource.Org, founded by Malamud, is continuing work like this. For instance, FedFlix is a joint venture with the government to make the vast amount of movies and footage owned by the National Technical Information Service available to the public. The NTIS lends Public.Resource.Org the tapes, and Public.Resource.Org puts them on YouTube. The efforts of outsiders have led to a new resource for the public.

Miller provides other examples. For instance, FedSpending is a non-government site that provides all the available data on US Federal Government expenditure and allows users to examine and compare it by department, by state, or even by whether contracts were competitively bid or not.

Congress was then spurred to license the software for its own site, USASpending.

This episode is interesting because the information made available was politically sensitive, particularly to one Ted Stevens. Miller mentions several other projects to promote transparency. These include MapLight and Fundrace, both highlighting political donations, and OpenCongress (run by Sunlight), which allows you to compare donations to politicians against their voting records and legislation.

Of course Australia is not without its equivalents. OpenAustralia stands out for me, but there are plenty of others beavering away. I salute them. But if we’re going to match the world leader in this area (and there’s no reason why we couldn’t have a crack), we’re going to have to get quite a wriggle on.

A League ladder of PSI openness?
Fri, 18 Sep 2009 15:03:54 +0000 | Nicholas Gruen

I attended a function at Parliament House on Wednesday night which was designed to showcase a number of things Google has on the boil, not least the usefulness of its product offerings to governments – such things as Google Apps and Google’s general capacity to deliver the cloud not just to retail users but also to corporates. (Alan Noble attended today’s Taskforce meeting with a Google bumper sticker on the cover of his laptop which read “My other computer is a data centre”.)

I also got a better look at Wave. It looks more intriguing the more I see of it.

Google is making good progress getting hold of data to make its products – particularly Google Maps – even more useful. But it’s also hard for them not to be frustrated by the silly things which mean that data you and I have already paid for governments to collect, data collected with the sole purpose of generating public benefits, is not simply, easily and quickly released into a serendipitous world in which we find out (so often to our own surprise) how useful it can become. Google can’t get good data on toilet maps despite the Federal Government’s having it (we earmarked this as an early win when our Taskforce began, but at the halfway mark I guess its status as an early win is running into ontological trouble; now we’re fairly confident of a ‘better late than never’ win). And Google can’t get good data on the precise location of bicycle paths.

Then I had an idea. Since I conveyed it to Google, it seems only fair that I convey it to you. Why doesn’t Google report on governments’ preparedness to release data? It could produce a methodology and apply it consistently. Since Google Maps is an Australian-originated product, it would make sense to develop the methodology here, where it could be applied in ‘beta’ form to Australia’s state governments.

One thing I’ve observed is that State Premiers like to claim that their state is the best or one of the best at something. State Oppositions also spend their time drawing attention to the ways in which the government they are opposing is sending their state to the dogs, choosing whatever comparative stats demonstrate their government’s relative under-performance. And of course there’s no reason to stop at state governments. National governments could also be compared.

The political obstacles to releasing most public sector information are not ‘hard’ ones. By that I mean that in almost all cases, releasing the kind of information that Google is after is not like raising a new tax or closing a school. The reason the information has not been released is just that it’s a lot easier for a lot of people for it not to be released. Releasing it may involve legal advice and changes to copyright policy. It may involve some cost or inconvenience. And who knows what the information might be used for? Can officials be sure that the information can’t be used to embarrass them or the government? Usually they can’t be sure, and so they decide they’d better be on the safe side. So the reason the data is not being released is not that there are any clear political roadblocks to its release, but that any decision to release it must run the gauntlet of that dark, dank place I call the Hall of a Thousand Cobwebs, where so many worthy proposals die slowly, quietly and anonymously.

When there’s no political ‘story’, no transgression by the government in not releasing the data, it’s just so easy not to – even if it’s no real life or death issue for the government. But given the ‘softness’ of the obstacles to release, the counterweight provided by a league ladder of openness of public sector information could often give the issue sufficient visibility to make a substantial difference.
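
To make the league ladder idea concrete, here is a minimal sketch of how one might be scored. Everything here is invented for illustration – the criteria, the weights and the per-jurisdiction ratings alike; a real methodology would need far more care and consultation.

```python
# Invented criteria and weights for an illustrative PSI-openness score.
CRITERIA = {
    "machine_readable": 0.4,   # data published in open, machine-readable formats
    "open_licence":     0.4,   # permissive default licensing of released data
    "timeliness":       0.2,   # how quickly data is released after collection
}

def openness_score(ratings):
    """Weighted score in [0, 100] from per-criterion ratings in [0, 100]."""
    return sum(CRITERIA[c] * ratings[c] for c in CRITERIA)

# Hypothetical ratings for three jurisdictions (not real assessments).
ratings = {
    "NSW": {"machine_readable": 70, "open_licence": 60, "timeliness": 50},
    "VIC": {"machine_readable": 80, "open_licence": 75, "timeliness": 60},
    "QLD": {"machine_readable": 55, "open_licence": 40, "timeliness": 45},
}

# The "league ladder": jurisdictions ranked by score, best first.
ladder = sorted(ratings, key=lambda j: openness_score(ratings[j]), reverse=True)
for jurisdiction in ladder:
    print(jurisdiction, round(openness_score(ratings[jurisdiction]), 1))
```

The methodology matters far less than its consistency: apply the same weights to every jurisdiction and the ranking itself becomes the political story.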

I expect that Google would probably like someone else to run such a league ladder – like Transparency International.  But in the spirit of releasing early and releasing often, it seems to me that Google (or anyone else who wants to) could get this ball rolling fairly quickly with a view to handing it over to others once they were ready to carry the baton.

Your thoughts?

]]> 21
What data should we be releasing? Mon, 17 Aug 2009 13:10:10 +0000 Nicholas Gruen Andrew Leigh, freakonomist, econometrician and indefatigable crusader for the power of data, has sent us a short and sweet submission (rtf) by email. I was going to ask him to work it up into a guest post, but then realised I could just quote it here.

Government 2.0 Taskforce Emailed Submission

Author: Andrew Leigh

Submission Text:

From the standpoint of researchers, one of the things that the Taskforce should strongly support is more data. A few examples are below.

• State and territory governments should release geocoded crime statistics for all crime reports. See for example this website created by the New York Times:

• FaHCSIA should release all data (including prices and quality ratings) from its annual Child Care Census.

• The Taskforce should encourage projects such as the digitisation of MP interest registers by
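
To illustrate what a geocoded crime-report release enables, here is a minimal sketch. The record layout, suburbs and offence types below are all invented for illustration; they are not drawn from any real dataset.

```python
from collections import Counter

# Hypothetical geocoded crime reports; every field and value is invented.
reports = [
    {"offence": "burglary", "suburb": "Newtown", "lat": -33.897, "lon": 151.179, "date": "2009-07-01"},
    {"offence": "assault",  "suburb": "Newtown", "lat": -33.899, "lon": 151.177, "date": "2009-07-03"},
    {"offence": "burglary", "suburb": "Glebe",   "lat": -33.879, "lon": 151.186, "date": "2009-07-05"},
]

# With individual geocoded reports, anyone can aggregate however they like:
# by suburb, by offence type, or onto a map grid for visualisation.
by_suburb = Counter(r["suburb"] for r in reports)
by_offence = Counter(r["offence"] for r in reports)
print(by_suburb)
print(by_offence)
```

This is the researcher’s case in a nutshell: aggregate statistics answer only the questions the publisher thought of, while individual geocoded records let everyone else ask their own.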

I’d like to open this thread up as a kind of repository to which anyone can add an inventory of data sets that should be made public. Andrew has mentioned data held by state government agencies. Of course our only clear jurisdiction is the Federal arena, but I think we should be prepared both to talk about and to make suggestions/recommendations regarding data held by other agencies – it’s up to them whether they want to accept them. In that spirit I’ll reiterate something I’ve argued previously (pdf), namely that we should publish data on individual companies’ workers compensation premiums where this provides reasonable information about their past safety record.

So please feel free to use this thread as a record of all the data you, the community, think does, should or might exist, and that we should be trying to get freed to enable us to lead slightly more informed, and so slightly better, lives.

]]> 67