
Thoughts from the North Carolina GIS Conference

Published in Conference, Neogeography  |  5 Comments


Last week I attended and presented at the North Carolina GIS Conference in Raleigh, NC. It was a different conference from the ones I typically attend: much more regional and GIS-focused than Where2.0, State of the Map, or Location Intelligence. The attendees are primarily county, regional, or state GIS coordinators, users, and managers, along with some federal GIS experts from Fish & Wildlife or USGS.

What was interesting for me was the perspective of very local government users who are working at the street and block level. They are working under constrained budgets and with varying levels of mandates coming down from above. The DC GIS department is a great model to follow, but it doesn’t have the same levels of management above it that a county GIS department in western North Carolina would. It’s this hierarchy that is both onerous and potentially empowering.

Individual GIS departments are seeing an increase in repeated data requests, often for the same data from the same receiving organizations. One way to address this is to offer the data easily through a web site or services; however, this often goes against the grain of politics and feelings of ownership in organizations that choose to require manual approval of data requests.
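To give a sense of how low the technical barrier actually is, here is a minimal sketch of publishing a layer as a static GeoJSON file that any web server can host read-only. The parcel records and attribute names are made up for illustration; a real county office would pull these from its own database or shapefiles:

```python
import json

# Hypothetical parcel centroids a county GIS office might already maintain.
parcels = [
    {"parcel_id": "042-117", "lon": -82.55, "lat": 35.59},
    {"parcel_id": "042-118", "lon": -82.54, "lat": 35.60},
]

def to_geojson(records):
    """Convert simple point records into a GeoJSON FeatureCollection."""
    features = [
        {
            "type": "Feature",
            "geometry": {"type": "Point", "coordinates": [r["lon"], r["lat"]]},
            "properties": {"parcel_id": r["parcel_id"]},
        }
        for r in records
    ]
    return {"type": "FeatureCollection", "features": features}

# Write a static file; serving it requires no approval workflow at all.
with open("parcels.geojson", "w") as f:
    json.dump(to_geojson(parcels), f)
```

The point of the sketch is that the hard part is organizational, not technical: once the data is exported like this, every subsequent "request" is just an HTTP GET.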

In fact, it was notable during presentations and discussions that GIS departments referred to their users as “data producers” and “data requestors”. The concept of a simple “data consumer” was not understood or welcomed, despite the fact that the data is public and free of license.

One solution is the development of several state-level initiatives that seek to provide central repositories of data, tools, or collaboration. Through these state portals, regional offices are encouraged (or mandated) to upload their information on a regular basis, and subsequent data requests go through this clearinghouse. One example is NC Street Map. However, you’ll quickly realize (after following that link) that these portals are not nearly as encouraging as the name would imply, despite sounding like the increasingly beloved OpenStreetMap.

In order to get an account, you must still fill out a ‘request’, and then request data. This may make it easier for government agents to identify data sources, but not necessarily to get or use the data.

Open Initiatives

Attending the conference were four authors of what is known as “NSDI Proposal #2“, or, as they jokingly call it, opeNSDI. The objective is to open up these, and other, data clearinghouses, and to focus on sharing data and interoperability instead of merely vendor-specific solutions or tools.

There are varying viewpoints on how federal money, say via a stimulus package, should be requested and parceled out across the state. County GIS employees are limited, or non-existent, in many departments, so there are no resources for building out a local infrastructure to share and manipulate data. The first request is a pledge to fund a new employee in each county.

The response to this is that any new employee would quickly be repurposed and pulled off of something as inconsequential as data sharing. In addition, hoping for long-term funding rarely works out; if this job were created, it would probably be eliminated in the first round of cutbacks. The alternative idea is one-time funding for each county to implement a data-sharing system that serves local needs and can be aggregated up to the state and federal levels.

Either way, there is no easy solution, but fortunately there are some really innovative and ingenious pioneers who are forging the path and providing best practices. Counties such as Mecklenburg have built an entirely open-stack portal (read about it on Tobin’s Fuzzy Tolerance blog). This effort was definitely the outcome of very hard work and foresight, but fortunately it is being recognized as such (including winning the award at the conference for best GIS website in the state) and will hopefully encourage others to follow along.

River roots

One of the most exciting applications of modern tools and local efforts was from Wansoo Im, famous for his public toilets mashup and GIS4Kids.

He’s been working with RiverKeepers, a non-profit organization that monitors river pollution, on collaborative mapping through IMRivers. Beyond simple placemarking sites, each of these individual instances is aggregated to the national River Network. Unfortunately, none of the sites syndicate their data via a feed or API, so the information is effectively locked into these portals.
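This is frustrating because a feed would cost so little to add. As a sketch (with made-up site names and coordinates), a GeoRSS-Simple feed for placemarks like these can be emitted with nothing but the Python standard library:

```python
from xml.sax.saxutils import escape

# Hypothetical monitoring sites; in practice these would come from the portal's database.
sites = [
    {"title": "French Broad River - Mile 12", "lat": 35.61, "lon": -82.57},
    {"title": "Neuse River - Falls Dam", "lat": 35.94, "lon": -78.58},
]

def georss_items(records):
    """Render records as RSS <item> elements carrying GeoRSS-Simple points (lat lon)."""
    items = []
    for r in records:
        items.append(
            "<item>"
            f"<title>{escape(r['title'])}</title>"
            f"<georss:point>{r['lat']} {r['lon']}</georss:point>"
            "</item>"
        )
    return "".join(items)

# A complete, minimal RSS 2.0 document with the GeoRSS namespace declared.
feed = (
    '<?xml version="1.0"?>'
    '<rss version="2.0" xmlns:georss="http://www.georss.org/georss">'
    "<channel><title>River monitoring sites</title>"
    f"{georss_items(sites)}"
    "</channel></rss>"
)
```

Any mapping client that understands GeoRSS could then subscribe, and the aggregation up to River Network would happen automatically instead of by hand.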

Other interesting talks included Michael Waltuch, an ESRI veteran, on Books as a Paradigm for User Interface Design, and Rob Trickel from the Division of Forest Resources on Digital Aerial Sketchmapping. There are still plenty of issues facing the application and utility of geospatial tools, especially for small organizations coping with increasing consumption of public data, decreasing budgets, and rapidly advancing technology. Overall, the conference was well put together and provided valuable insights into local and regional GIS issues and future paths.

  1. Epicanis says:

    February 24th, 2009 at 9:27 am

    “The concept of a simple “data consumer” was not understood or welcomed”
    As it should be, in my personal opinion (for what it’s worth).
    The trend of calling everyone a “consumer” has got to be one of the most insidiously destructive social trends in modern history, even if it hasn’t been especially dramatic.
    People are passive enough as it is without vendors and politicians alike convincing them that their role is simply to wait like a baby bird for the Mama-Bird “Supplier” to come along and vomit up some “product” for them to use up (“consume”).
    I say to the Hell of their choice with “consumers”; it’s about time we started giving some love to participants. (OpenStreetMap is a good example of a service that targets and encourages “participants” well, I think…)

  2. Andrew says:

    February 24th, 2009 at 9:47 am

    Interesting thought @epicanis. I do favor the term ‘participant’, but recognize that there are still official sources of qualified data. And the important point is that when this data is gathered by the government, it should be freely, and easily, shared.

    OpenStreetMap is a great example of crowd-sourced gathering of data. Subsequently, official sources should accept feedback, alterations, and additions as appropriate.

    Check out Sean’s post about QA for CrowdSourced Geodata for some of the thoughts on how we want to incorporate this concept into GeoCommons.

  3. Jason Birch says:

    February 24th, 2009 at 1:36 pm

    I can think of a few datasets that we’d want to accept feedback on, especially stale features, but given the level of surveying/engineering rigour that is applied to most of our base datasets, public alterations aren’t really on the plate.

    Did Dr. Im do the IM Rivers thing? That was one of the best parts of GeoWeb for me last year 🙂

  4. Dave Smith says:

    February 25th, 2009 at 4:18 pm

    There is definitely value in bidirectional flows. Some of the federal paradigms have been in throwing money at building nodes for data which then feeds into central systems. Essentially, “contributors” throw their data over the fence to the feds, where it vanishes into some black hole. But do the contributors really care about the quality of what they submit? Timeliness? Completeness? Maybe, maybe not. If it validates and the light’s green, all’s good.

    Meanwhile, the data received may actually be full of problems, only to get recontaminated with every refresh cycle.

    States and others will, as such, “participate” as long as the money’s flowing, and as long as it’s mandated that they participate. But what happens if the money dries up? What real stake or care do participants have beyond that? They don’t use the data they submitted to the feds for their own purposes, they just use their own operational data stores.

    BUT – what if the feds, instead of just throwing money at stakeholders to build nodes, build value-add capabilities to make it worth participants’ while – modeling, analysis, data services, reporting, et cetera – and demonstrate the value – on a cross-borders level, on that macro scale, et cetera – to the participants? Watersheds don’t honor political boundaries. Road networks do not come to a screeching halt at the state lines. Who else to do it at those levels but the feds?

  5. Andrew says:

    February 27th, 2009 at 10:27 am

    @dave correct, if contributors only have a mechanism for throwing over a fence then of course they won’t feel a need to maintain responsibility. Analysis, as you point out, is one way to do this. But that serves a minority of stakeholders (albeit very valuable work). Foremost there should be a two-way interchange. Local data flows up to regional, to federal, and then is disseminated in a timely fashion, maintaining ownership and links to all regions. So TIGER could in fact start with national data, be augmented by local data and then redistribute this improved version.

    This model is exactly what OpenStreetMap is following and why organizations like the UNJLC are keen to work with the project. The UNJLC can gather and maintain data they care about in a way they trust, but also offer it to OSM. OSM will build it into their dataset and offer it *back* to UNJLC and partners – effectively being another maintainer, integrator and supporter of these valuable datasets.

    This lack of bi-directionality is a problem in both government and commercial sectors. NAVTEQ and Google are both very good at asking local municipalities for their data, cleaning it up, and then building it into products. But they then *charge* the providers for this value-add instead of giving it back to them in the original formats.