Government 2.0 Taskforce » Information Design by Ben Crothers of Catch Media
Tue, 04 May 2010 23:55:29 +0000

Guest Post: The Victorian Department of Justice and Web 2.0
Thu, 31 Dec 2009 01:55:53 +0000 – Nicholas Gruen

Darren Whitelaw is what I call a public sector entrepreneur – which means nothing more nor less than that he's someone who tries to get things done, including new things. He's with the Department of Justice in Victoria and is very active in Government 2.0 in that state. I suggested to him, through Patrick McCormick, who is similarly a public sector entrepreneur and recently moved to Justice, that a guest post on what the Department had been up to would be welcomed. And so here is his post.

A journey of discovery

The Gov 2.0 Taskforce’s final report provides a compelling roadmap for the Australian public sector’s future online journey and contributes new insights and ideas to the global Gov 2.0 conversation. As online service delivery becomes commonplace, and citizen expectations of more efficient and effective public services increase, the role of Web 2.0 in government should not be underestimated.

The challenge for the public sector, much like the private sector, is not only to make use of these emerging technologies, but also to ensure there is the cultural change to support them. Victoria’s Justice Department has been using various Web 2.0 technologies over the past 18 months – to help respond to the Black Saturday bushfires, reduce the impact of problem gambling, tackle excessive drinking, show public support for emergency service volunteers, help people assess their level of fire season readiness, and demonstrate transparency around speed cameras. These efforts have delivered tangible benefits, but it hasn’t always been a smooth journey.

We’ve learnt that adoption of community collaboration takes time. Creating online communities built on credibility and trust is a big job, one that involves tinkering, listening, revising and trying again. It’s more a slow burn than an explosion. And if we are going to fail, it’s best to fail small and fast, so we can adapt and try again. It’s an iterative process.

I can think of three things that have been instrumental to this journey:

1) Provide access to information
2) Enable user-generated content
3) Go where people are

Provide access to information

The horrific bushfires that swept Victoria in February 2009 placed immense pressure on our emergency services. Not only in fighting fires, dispatching equipment and personnel, but in responding to the public’s thirst for information. To help alleviate pressure, we responded by developing a widget (now decommissioned) to provide easy access to latest news, info and pictures about the crisis. This was built using a white-label software solution, spread virally, and used RSS, Twitter, YouTube and Flickr. Only around 130 people installed the widget on their social networking page, but that small base led to more than 80,000 unique views, and more than 26,000 people interacting with it. Not too bad for our first attempt, I reckon.

In August 2008, we launched a new website that mapped the location of all the fixed speed and red light cameras in Victoria. The site also included evidence demonstrating when each camera had last been calibrated and tested, as well as telling motorists with a good driving record how they could apply to have their fine revoked with an official warning.

There’s a major overhaul planned for early 2010, this time using Google Maps to display the camera locations. A lot of effort has gone into busting many of the myths around speed enforcement, driving safety and traffic cameras – recognising that people want credible, authoritative information on topics of interest to them, and that if we don’t fill that void, others will.

Enable user-generated content

Another path we took on our Web 2.0 journey was user-generated content. Our first attempt was earlier this year on a revamped website designed to help problem gamblers. Along with the usual information to help gamblers and their loved ones, the site gave people the chance to share their stories. Despite a slow start, there have been some really positive and emotional stories.

User-generated content was key to our campaign to give people the chance to show their support for Victoria’s emergency services volunteers. A modern-day twist on an old-fashioned letter writing campaign: instead of dumping a mail bag full of correspondence on a desk, we got people to stick a virtual post-it note on a wall of thanks. This campaign leveraged the benefits of microblogging (contributors were limited to 250 characters), making it quick and easy for people to say thanks and also to learn about the kind of person it takes to be an emergency services volunteer. Visitors to the site also had the chance to create a blog, post longer messages, and upload photos.

To date, there have been 556 messages of support, and nearly 20,000 people have visited. I encourage you to check out the site and scroll through the message wall – the posts are inspiring and really show the level of heartfelt appreciation in the community.

Go where people are

The volunteer campaign showed us how important it was to go where the crowds gather. We had a healthy interest in the microblogging message site, and the biggest success was on Facebook, where more than 9,000 people have shown their support by joining the fan page. Hundreds have also taken part in a conversation about how valued our volunteers are by leaving messages on the wall. Twitter users were also quick to show their support, and stay up-to-date with emergency volunteer news, with 1,206 followers to date.

Facebook is also being used to help spread the fire ready message in preparation for this summer. An app has been developed, as a quick test for homeowners and others in fire-prone areas to gauge their level of preparedness. The idea is to raise awareness, then get people to go to the CFA website to complete the detailed self-assessment.

The success of Facebook and Twitter has shown us how important it is for public services to move out from behind our websites and to go to where the people are.

So where to from here?

So what have we learnt? New paths through unfamiliar territory are unlikely to be smooth and trouble-free. That’s why it is vital to be agile and flexible, so failures are both small and short-lived. It’s also important to tinker first, to always keep listening, to continually revise, and when you’re done, to go back and try again.

Perhaps the first step is tackling the biggest barrier: cultural change. Accepting Web 2.0 within government relies on a cultural change within the public service itself, rather than a change in technology.

Government’s traditional role-based authority can only get us so far. The input of communities, peers, and others through an authentic and meaningful conversation is vital, and Web 2.0 technologies allow this to happen on a scale never seen before. This two-way interaction is vital for policymakers because of the persuasive authority that comes from fostering this conversation. People like it because it’s not just big government telling them what to think, feel and do – it’s their family, friends, neighbours and peers as well. It’s not Government vs Citizens, but Government AND Citizens.

This kind of engagement isn’t free. Sometimes it comes at a significant cost – not just a financial cost, but a cost in other valuable resources such as time and people. There’s also a cost to reputation if the risks aren’t minimised. But the bigger thing to calculate is the cost of not doing it.

As the taskforce wraps up its work, how can we in the Australian public sector use this as a catalyst for our own conversations? How do we mobilise those exploring the Web 2.0 space and continue to share the experiences of our journeys? Not just the good, but the bad and the ugly as well – to learn from our fellow travellers, collectively find our way in this new space and seize opportunities as they arise. By doing so, not only will we be able to deliver more tailored, effective and efficient public services, but be able to foster stronger community engagement and social innovation as well.

Darren Whitelaw (@DarrenWhitelaw) is the General Manager of Corporate Communication at Victoria’s Department of Justice. The views expressed in this post are those of the individual and do not represent those of his employer.
Innovation and Government 2.0
Sat, 19 Dec 2009 14:28:22 +0000 – Nicholas Gruen

Government 2.0 is integral to delivering on several agendas that the Government has running at present. It’s central to delivering on Innovation in Government – and that’s the subject of a review with which I have been involved, being conducted within the Department of Innovation under the auspices of the Management Advisory Committee, a forum of Agency Heads established under the Public Service Act to advise Government on managing the Australian Public Service.

As part of our own exercise I asked the Australian Public Service Commission (APSC) to have a look at the data it compiled for its State of the Service Report this year.  It has only come out in the last few weeks, so there was no time for them to do the analysis and for us to get it into our draft report.  In fact we’ve not included this in our final report for reasons I’ll explain.  But it’s interesting and deserves to be on the record.

The APSC were somewhat anxious about cross tabulating the two surveys because cross tabulation gives much looser correlations. To understand why, consider that social media is likely to be used in just some parts of the public service. My guess is that, of the 26 agencies that reported using social media, most used it in only small pockets within their operations – their marketing and/or communications units would be candidates, for instance. So many, perhaps most, perhaps almost all employees working in some of these agencies might well have no access to these tools, and may not even know about them, yet would come up in the survey as employees with access to social media. We’ve spoken to the APSC about bringing social media issues into their employee survey, which we hope they will do.
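The cross tabulation itself is mechanically simple; the looseness comes from attributing an agency-level answer to every one of that agency's employees. A minimal sketch of the operation, using invented records joined from the two surveys:

```python
from collections import Counter

# Hypothetical joined records: each tuple pairs an employee's agency-level
# social-media flag (from the agency survey) with that employee's own
# response (from the employee survey). All values are made up.
records = [
    ("Social networking", "Agree"),
    ("Social networking", "Neither Agree nor Disagree"),
    ("Social networking", "Agree"),
    ("No social networking", "Agree"),
    ("No social networking", "Disagree"),
]

# Crosstabulation: count each (agency flag, employee response) pair.
crosstab = Counter(records)

print(crosstab[("Social networking", "Agree")])        # 2
print(crosstab[("No social networking", "Disagree")])  # 1
```

Note that every employee of a "Social networking" agency gets the same flag regardless of whether they personally use or even know about the tools, which is exactly why agency-level flags correlate only loosely with individual responses.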

Another concern I have is that the question asked tends to emphasise social media platforms rather than the interactivity of use. The question in the survey of agencies was this:

“Does your agency officially use any of the following social media and networking tools in engaging with external stakeholders?” (Multiple response.) Then there was this list of possibilities:

  • Facebook
  • My Space (sic)
  • You Tube (sic)
  • Twitter
  • Other

Now these are definitely Web 2.0 tools, but (and this isn’t a criticism of the APSC, as they were just dipping their toe in the water here) they don’t, to me, demonstrate Web 2.0. All are often used as Web 1.0 broadcast tools, so a Department using the capabilities of any of these tools to broadcast isn’t of much interest to us. I’d be more interested to know if the agency or any of its staff maintained a blog with professional content about matters within the purview of the agency. That would signal something more interactive going on (although even here, one really needs to look closely to see whether there’s real interaction going on and to judge its quality).

Anyway, given my reservations I expected the data might not be much use, but thought it was worth seeing what the numbers suggested, however tentatively.

I asked them how the agency answers correlated with perceptions in answers to the employee survey around four issues.

  1. The quality of management
  2. The culture of innovation within agencies
  3. The culture of collaboration with other agencies
  4. Engagement with outsiders.

In short the answers came back as follows.

  1. The quality of management (no result)
  2. The culture of innovation within agencies (strongest result of positive correlation – see table below)
  3. The culture of collaboration with other agencies and/or outsiders (no result)
  4. Job satisfaction (a negative correlation – see table below)

So the results were probably pretty unreliable in any case, but they confirmed my priors in one case and were inconsistent with them in the other. Here are the two relevant tables.

[Crosstabulation: “Does your agency use Facebook, MySpace, YouTube or Twitter (social networking) in engaging with external stakeholders?” by q18g “My current agency encourages innovation and the development of new ideas”. Rows: No social networking / Social networking; response columns included “Neither Agree nor Disagree” and “Not Sure”. Cell values were not preserved in this copy.]
[Crosstabulation: “Does your agency use Facebook, MySpace, YouTube or Twitter (social networking) in engaging with external stakeholders?” by q17a “I enjoy the work in my current job”. Rows: No social networking / Social networking; response columns included “Neither Agree nor Disagree” and “Not Sure”. Cell values were not preserved in this copy.]
The latter negative correlation surprised me, and I don’t believe it. I asked the APSC to do some digging around to find out whether the answers were different in different-sized agencies, which it seemed to me might be driving the results. Sure enough, the closer you looked at the results, the less sure you were that there was anything much going on at all, other than random differences between agencies. I didn’t do the same with the earlier (positive) correlation, as we’d tested the patience of the APSC enough and they were flat out. In any event, it will be interesting to see the results next year when, with any luck, the APSC will include some social networking questions in their employee survey. I’m also hoping some questions will be slanted towards seeking out how much online interaction there is, and not just whether certain platforms that can be used for online interaction are being used.

Making Government Data More “Hack”able
Wed, 28 Oct 2009 04:56:18 +0000 – Pamela Fox

At Google, we think it’s pretty awesome that the government is holding a contest to mash government data. As a company with a lot of APIs, we love it when people use them to make mashups, and as a company with a mission of making data universally accessible and useful, we love to see governments opening up their data. So we’ve arranged a couple of events in support of the contest. We held a 3-hour “MashupAustralia HackNight” on October 14th, we’re holding another one tonight, and we’re hosting the OpenAustralia HackFest from Nov 7-8.

At our first hack night, we started off with talks on the contest, mashups and APIs, and putting data on maps. Then, since we conveniently had a representative at the event, we took the opportunity to search through their database and find useful datasets. We found a couple of really good ones – the NSW Crime set and the Victoria Internet locations set – but we also found a lot of really hard-to-use sets. Since part of the goal of this contest is to figure out what characteristics define a useful dataset, and to encourage governments to adopt those, I thought I’d take this opportunity to give a few basic tips:

  • Format: It is generally not a good idea to share data in a binary format – it is more compact, but less accessible to developers. The best format is an API (REST or XML-RPC) or, more simply, an RSS feed with all the entries. The next-best format is a well-structured CSV or spreadsheet, as many database systems can easily import those. If you are going to use a more obscure format, provide tips on how to use it. (This is something that the site could also provide.)
  • Size: Some data sources provided zip files that were around 300 megabytes. Most developers aren’t going to download 300 megabytes without knowing what the data looks like and what makes up that size. If you are going to provide a large file, I suggest also providing a preview file.
  • Geo data: The vast majority of the data sources relate to geographic regions or points, but most also didn’t provide enough geographic data. If possible, you should provide the address and the latitude/longitude coordinates. If the data describes a region, provide an array of coordinates. A great example of this is the NSW fire feed – it provides an address, a point, and a polygon.
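As a sketch of what “enough geographic data” can look like, here is a hypothetical record expressed as GeoJSON (all field names and coordinates are invented for illustration), carrying an address, a point and a polygon in the way the NSW fire feed does:

```python
import json

# A made-up dataset record with all three kinds of geographic data:
# a human-readable address, a point, and a polygon for the region.
feature = {
    "type": "Feature",
    "geometry": {
        "type": "Point",
        "coordinates": [151.2093, -33.8688],  # longitude, latitude
    },
    "properties": {
        "address": "1 Example St, Sydney NSW",  # invented
        "affected_area": {  # region expressed as an array of coordinates
            "type": "Polygon",
            "coordinates": [[
                [151.20, -33.86], [151.22, -33.86],
                [151.22, -33.88], [151.20, -33.88],
                [151.20, -33.86],  # ring closes on its first point
            ]],
        },
    },
}

# Serialised, this is something any mashup developer can consume directly.
as_json = json.dumps(feature, indent=2)
```

A record like this can be dropped straight onto Google Maps or any other mapping tool, whereas an address alone forces every developer to geocode the data themselves.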

These are simple suggestions, but they can make a world of difference in terms of making data useful. We hope to see more government agencies opening up their data for developers and evaluating how they’re doing so. But we also hope to see developers using the current data as much as possible, and coming up with more ideas. Please join us at one of our future events!

Canberra one day, London the next
Sun, 25 Oct 2009 23:52:21 +0000 – Nicholas Gruen

Yes folks, a few hours after I was seen in Canberra at the CeBit Gov2 Conference last week I was seen in London talking about PSI.

Having cancelled my trip to London to focus on report writing, I made a video with a few hours (and anxious moments) to spare and uploaded 53 Megs of PSI so that it could be downloaded in London, and it apparently got there and was played with nary a technical hitch.

I’m not a big fan of watching videos – because it’s so much quicker to read transcripts. Alas at this stage there is no transcript. So if you want to watch it, you can view it below.

The ideas I developed were:

  • That opening up PSI is an extension of the principles of competition policy, which were about giving the market access to important infrastructure at marginal cost (OK, at average cost, but if the asset has already been built by the public sector, the optimal price is marginal cost).
  • The importance of serendipity, and the corresponding importance of licensing PSI in a way analogous to APIs for software – which is to say, Creative Commons.
  • A possible transition from where we are now with PSI 1.0, if I might call it that, in which governments supply their data for further value adding, towards a model in which the public provide more of that data (quoting the paper I quoted in the Inquiries 2.0 blog post).
The Three Laws of Open Data
Tue, 20 Oct 2009 05:40:40 +0000 – David Eaves

David Eaves is a member of the Taskforce’s International Reference Group.

Over the past few years I have become increasingly involved in the movement for open government – and more specifically in advocating for Open Data: the free sharing with citizens of the information government collects and generates, such that they can analyze it, repurpose it and use it themselves. My interest in this space comes out of writing and work I’ve done around how technology, open systems and generational change will transform government. Earlier this year I began advising the Mayor and Council of the City of Vancouver, helping them pass the Open Motion (referred to by staff as Open3) and create Vancouver’s Open Data Portal, the first municipal open data portal in Canada. More recently, Australia’s Government 2.0 Taskforce has asked me to sit on its International Reference Group.

Obviously the open government movement is quite broad, but my recent work has pushed me to try to distill out the essence of the Open Data piece of this movement. What, ultimately, do we need and what are we asking for? Consequently, while presenting at a panel discussion at the Conference for Parliamentarians: Transparency in the Digital Era, held for Right to Know Week and organized by the Canadian Government’s Office of the Information Commissioner, I shared my best effort to date at this distillation: three laws for Open Government Data.

The Three Laws of Open Government Data:

  1. If it can’t be spidered or indexed, it doesn’t exist
  2. If it isn’t available in open and machine readable format, it can’t engage
  3. If a legal framework doesn’t allow it to be repurposed, it doesn’t empower

To explain, (1) basically means: can I find it? If Google (and/or other search engines) can’t find it, it essentially doesn’t exist for most citizens. So you’d better ensure that your site is optimized to be crawled by all sorts of search engine spiders.

After I’ve found it, (2) notes that, to be useful, I need to be able to use (or play with) the data. Consequently, I need to be able to pull or download it in a useful format (e.g. via an API, a subscription feed, or a documented file). Citizens need data in a form that lets them mash it up with Google Maps or other data sets, or analyze it in Excel. This is essentially the difference between VanMaps (look, but don’t play) and the Vancouver Data Portal (look, take and play!). Citizens who can’t play with information are citizens who are disengaged/marginalized from the discussion.
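As a sketch of the “take and play” step, here is a documented CSV download (the column names and figures are invented for illustration) being aggregated with nothing but Python’s standard library – the kind of five-minute analysis a machine-readable format makes possible:

```python
import csv
import io

# A made-up extract of the kind of well-structured CSV a data portal
# might publish: one row per (suburb, offence) with a count.
raw = """suburb,offence,count
Newtown,Theft,12
Newtown,Assault,5
Parramatta,Theft,9
"""

# Total offences per suburb - the sort of derived view a citizen might
# then map, chart, or mash up with other data sets.
totals = {}
for row in csv.DictReader(io.StringIO(raw)):
    totals[row["suburb"]] = totals.get(row["suburb"], 0) + int(row["count"])

print(totals)  # {'Newtown': 17, 'Parramatta': 9}
```

The same exercise against a PDF or a scanned image would require hours of transcription before any analysis could begin.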

Finally, even if I can find it and use it, (3) highlights that I need a legal framework that allows me to share what I’ve created, to mobilize other citizens, provide a new service or just point out an interesting fact. This is the difference between the information of Canada’s House of Parliament (which, due to crown copyright, you can take and play with, but don’t you dare share or re-publish) and, say, a site which declares that “pursuant to federal law, government-produced materials appearing on this site are not copyright protected.”

Find, Use and Share. That’s what we want.

Of course, a brief scan of the internet reveals that others have been thinking about this as well. There is an excellent set of 8 Principles of Open Government Data that is more detailed and, admittedly, better, especially for conversations at CIO level and below. But for talking to politicians (or Deputy Ministers or CEOs), like those in attendance at that panel discussion or, later that afternoon, the Speaker of the House, I found the simplicity of three resonated more strongly; it is a simpler list that they can remember and demand.

and lessons from the open-source world
Tue, 25 Aug 2009 23:10:13 +0000 – Alan Noble

A previous blog post talked about what data government departments should be releasing. In this post I’d like to talk about how to release it.

One approach is to centralise things. For example, the US Government has established a web site with the stated purpose of “increasing public access to high value, machine readable datasets generated by the Executive Branch of the Federal Government [of the US]”. The UK Government is currently considering a similar approach. The goals are commendable but, in a sense, this adopts a traditional “Web 1.0” approach to the challenge of increasing access to public sector information (PSI). To use an analogy drawn from Eric Raymond’s “The Cathedral and the Bazaar”, such a portal can be thought of as a data “cathedral”, which is to say a huge, ambitious, centralised undertaking.

Another approach is decentralised, modeled on a “bazaar”. In this approach, government web sites scattered around the Internet would use Web 2.0 technologies to provide data and metadata in both human- and machine-readable formats.

In “The Cathedral and the Bazaar” Eric Raymond was of course describing software development. However, if you replace “code” with “data” and “developer” with “author”, the same principles apply, namely:

  • Users should be treated as co-authors, because having more co-authors increases the rate at which the data evolves and improves, i.e., user-generated content (UGC) plays a key role.
  • Release as early as possible, because this increases one’s chances of finding co-authors early and stimulates innovative uses of the data.
  • New data should be incorporated frequently, because, as above, this maximises the rate of innovation and avoids the cost of “big bang” style integration.
  • There should be multiple versions of data sets: a newer version that evolves quickly but may be of lower quality, and an older version that is stable and of higher quality.
  • Data sets should be highly modular, because this allows for parallel and incremental development.

The bazaar approach is flexible and economical and supports evolutionary change. It enables different government agencies to move at different speeds to open up public sector information, one data set at a time.

What about data discoverability, you may say? Doesn’t a data portal make it easier? Well, users don’t expect to find all their news and entertainment at a single web site, so why would they expect to find all of their data at one? The trick is to ensure that government web sites are discoverable and searchable, both technically (through open robots.txt files, site maps, etc.) and legally (through friendly copyright provisions such as Creative Commons).
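The technical side of that discoverability can be checked mechanically. A minimal sketch using Python’s standard-library robots.txt parser, run against a hypothetical government site and an invented set of rules:

```python
from urllib.robotparser import RobotFileParser

# An invented robots.txt: spiders may crawl everything except /admin/.
robots_txt = """User-agent: *
Disallow: /admin/
Allow: /
"""

rfp = RobotFileParser()
rfp.parse(robots_txt.splitlines())

# Can a search engine spider reach the data pages? (URLs are hypothetical.)
print(rfp.can_fetch("*", "https://data.example.gov.au/datasets/crime.csv"))
print(rfp.can_fetch("*", "https://data.example.gov.au/admin/"))
```

An agency that instead writes `Disallow: /` locks its data out of every search index at a stroke, which is Eaves’ first law in action: if it can’t be spidered or indexed, it doesn’t exist.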

Of course it’s not an either/or scenario. Data cathedrals can coexist with data bazaars, and perhaps different data sets are best served in different ways. A related question, though, is whether PSI platforms should be government operated at all, or instead left to the private sector or non-profit organizations.

What do you think? Should government departments embrace some of the principles of the open-source world in order to liberate public sector information?

Making more government data and information available
Fri, 21 Aug 2009 07:06:21 +0000 – Ann Steward

How much support does Government need to provide when it releases government data?

This is one of the important areas for the Taskforce to consider and we would like to hear your views and ideas on this.

Metadata plays an important role in understanding the meaning of data, its use and management. But are there other expectations from those who would like to see more data made available, such as:

  • retention specifications that the agency will need to provide at the time of release of data, for example, formats;
  • details of where and when the data will be archived;
  • how long the data is likely to be captured;
  • how complete the datasets are;
  • would it be helpful to have a general policy, covering all government data releases that sets out what support would be provided – for example, contact points for clarification on the data and its sources;
  • role of disclaimers when releasing data and what should they cover;
  • and so on.

In looking beyond just text data, are support regimes considered to be prerequisites – for example, when images are released? And are they the same regimes, or is something new needed?

Are there issues that you have encountered, either with data or images that the Taskforce should take into account as we form our recommendations to Government?

What data should we be releasing?
Mon, 17 Aug 2009 13:10:10 +0000 – Nicholas Gruen

Andrew Leigh – freakonomist, econometrician and indefatigable crusader for the power of data – has sent us a short and sweet submission (rtf) by email. I was going to ask him to work it up into a guest post, but I can just as easily quote it here.

Government 2.0 Taskforce Emailed Submission

Author: Andrew Leigh

Submission Text:

From the standpoint of researchers, one of the things that the Taskforce should strongly support is more data. A few examples are below.

• State and territory governments should release geocoded crime statistics of all crime reports. See for example this website created by the New York Times:

• FaHCSIA should release all data (including prices and quality ratings) from its annual Child Care Census.

• The Taskforce should encourage projects such as the digitisation of MP interest registers by

I’d like to open this thread up as a kind of repository to which anyone can add: an inventory of data sets that should be made public. Andrew has mentioned data held by state government agencies. Of course our only clear jurisdiction is the Federal arena, but I think we should be prepared both to talk about and to make suggestions/recommendations regarding data held by other agencies – it’s up to them whether they want to accept them. In that spirit I’ll reiterate something I’ve argued previously, namely that we should publish (pdf) data on individual companies’ workers compensation premiums where this provides reasonable information about their past safety record.

So please feel free to use this thread as a record of all the data that you, the community, think does, should or might exist, and that we should be trying to get freed to enable us to lead slightly more informed, and so slightly better, lives.

Government 2.0 – It’s the Community, Stupid
Tue, 11 Aug 2009 02:25:43 +0000 – Tim Watts

One of the primary tasks assigned to the Government 2.0 Taskforce is to find ways for Government to use Web 2.0 tools to consult and collaborate with the public. At first blush, this sounds simple. It’s easy to point to dozens of innovative, cheap and practical Web 2.0 apps that could be employed to improve interaction between Government and citizens. However, while Web 2.0 has lowered the technical barriers to communication, a series of just as significant social barriers remain.

As we’ve learnt from previous Australian online consultation trials, Web 2.0 is great at aggregating an enormous number of individual contributions. It enables thousands of people to publicly have their say. However, Web 2.0 can’t turn Ministers into omniscient beings able to conduct thousands of simultaneous conversations. Further, there are both physical and professional limits to the capacity of public servants or political staffers to engage in these conversations as representatives of the Minister. While Web 2.0 has provided the technology for the public to have its say, social limitations remain that prevent it from being heard.

Government needs to learn new skills to be able to effectively listen to the public via Web 2.0. In particular, it needs to learn that the key to listening in the Web 2.0 world is to focus on the community, not individuals. If there’s no functioning community to sort and moderate the contributions aggregated by Web 2.0 technologies, it will be impossible for Government to digest this information.

The good news is that if Government creates an environment that allows them to emerge, there are likely to be viable communities of interest across the gamut of the Commonwealth’s activities. To quote NYU academic and Web 2.0 darling Clay Shirky:

“Every webpage is a latent community. Each page collects the attention of people interested in its contents, and those people might well be interested in conversing with one another, too. In almost all cases the community will remain latent, either because the potential ties are too weak… or because the people looking at the page are separated by too wide a gulf of time, and so on. But things like the comments section on Flickr allow those people who do want to activate otherwise-latent groups to at least try it.”

People who interact with government via the web often have strong interests in the subject matter of the government website that they share with others using the site. Someone applying for a licence from the government may rely on this interaction for their livelihood. Citizens looking at a niche area of government regulation may share an in-depth knowledge of the subject. If they had the opportunity to talk to each other over a period of time, these common interests might sustain an on-going community.

Where an environment has been provided in which they can form, communities of this kind have emerged in the most unlikely of places. For instance, one of the most successful examples of Government online community building was the US Transportation Security Administration blog, which successfully activated latent communities revolving around travel security requirements and the administration of different airports. Once established, these communities can be extremely valuable sources of User Innovation – Eric von Hippel’s brilliant and increasingly important model of bottom-up innovation.

However, these latent communities won’t become active unless Government actively creates an environment that encourages their emergence. Unfortunately, as Steven Clift (the Founder of E-Democracy.Org) points out in his contribution to the Personal Democracy Forum’s fantastic Rebooting Democracy series, Government is currently not very good at this:

Government websites don’t have sidewalks, newspaper racks, public hearing rooms, hallways or grand assemblies. There are no public forums or meeting places in the heart of representative democracy online… The typical e-government experience is like walking into a barren room with a small glass window, a singular experience to the exclusion of other community members. There is no human face, just a one-way process of paying your taxes, registering for services, browsing the information that the government chooses to share, or leaving a private complaint that is never publicly aired. You have no ability to speak with a person next to you much less address your fellow citizen browsers as a group.

If the Government wants to effectively listen to the public via Web 2.0, it needs to find ways to allow these ‘fellow citizen browsers’ to collaborate with each other and form on-going communities of interest. Luckily, it’s not in uncharted waters here. We already know a lot about how these communities can be nurtured from the experiences of the business and community sectors. In fact, the recently released First Report of the Smart Services CRC “Social Media: tools for User-Generated Content” project (partly supported by my employer, Telstra) would provide an excellent starting point to inform Government efforts at community building.

It’s all too easy to get caught up in the ‘cool’ factor of Web 2.0. The potential of the technology is so amazing that sometimes we can forget that, at the end of the day, it’s still people on either end of the tubes. It’s important to remember that Web 2.0 is all about people. As Michael Wesch has said, “The Machine is Us”. The Government 2.0 Taskforce could do worse than to follow the lead of one of the great political campaigners of our time and hang a sign in the group’s (virtual) war room constantly bringing it back to this fundamental theme. It could read: It’s the Community, Stupid!

Accessibility and Government 2.0 Sat, 11 Jul 2009 21:56:48 +0000 Lisa Harvey Guest Blogger Post from Scott Hollier: Media Access Australia

Universal design is the design of products and environments to be usable by all people, to the greatest extent possible, without the need for adaptation or specialized design.
Ron Mace

One major issue facing the Government 2.0 Taskforce will be how to meet the needs of people with disabilities. In my role as Project Manager for Media Access Australia, and as a person with a vision impairment, I’ve been fortunate to have both a professional and a personal perspective on access issues, so I thought I’d share my thoughts on the topic.

The challenge here will revolve around the use of universal design in the delivery of accessible government information via Web 2.0 technologies. While the principle sounds good, two questions need to be asked: what is universal design, and can it realistically be achieved?

One of the easiest mistakes to make is assuming that universal design means everything has to be approached using a ‘one size fits all’ model. This would be very difficult, not to mention impractical, to actually do. For example, if the government decides that Facebook is a good medium for communication, should that popular website launch a text-only interface to ensure complete access? Would this meet universal design requirements? How would current Facebook users feel about that?

An alternative is not to see universal design as an impossible dream, but to use the concept in practical ways that make mainstream products reach the largest possible audience. The Center for Universal Design looked across a variety of disciplines and focused on principles such as equitable use, flexibility in use, simplicity and intuitiveness, perceptible information, and tolerance for error. When we think about the Facebook example, can all these concepts apply without making a text-only site? I’d argue yes. Will this make Facebook accessible to every Australian with one or more disabilities? Probably not, but it will get close enough that specialist solutions are only needed on a small scale, and these can be provided to the remainder of the population at minimal cost.

The third option is to put it all in the ‘too hard’ basket, which is what has previously happened in Australia. Other federal governments around the world, such as that of the United States, have legislation, Section 508, that requires products produced for or sold to the government to meet accessibility criteria. Our equivalent legislation, the Disability Discrimination Act, has no comparable requirement.

So what do you think? Should the government find a one-size-fits-all solution to access? Should the focus be on making government resources as accessible as possible using mainstream technologies, or is it all just too hard? Add your thoughts.

About Dr Scott Hollier: Scott Hollier is the Project Manager, New Media for Media Access Australia (MAA), a not-for-profit, public benevolent institution. Scott’s work focuses on making computers and Internet-related technologies accessible to people with disabilities. Scott also represents MAA on the Advisory Committee of the World Wide Web Consortium (W3C), the organisation primarily responsible for developing and promoting access standards to media through technology for people with disabilities. Scott has completed a PhD titled ‘The Disability Divide: an examination into the impact of computing and Internet-related technologies on people who are blind or vision impaired’. Scott is legally blind and as such understands the importance of access at a personal level.
