Date: Wed, 16 Nov 2005 11:48:11 -0500
From: Danielle Donkersloot
Who has a great example of a way to present and report visual or habitat assessment data? Specifically, looking for examples of graphs and other outputs. In the world of biological monitoring, it’s easy, but visual/habitat data is difficult for us to get a handle on right now. PLEASE HELP.
Thanks in advance…
609-633-9241 (direct line)
PO Box 418
Trenton, NJ 08625
Responses to Question 1
Date: Wed, 16 Nov 2005 15:46:26 -0500
From: “Wills, Sherrill”
When the EPA began to promote the RBP sampling protocol to the state (PA), I was relatively new to the program. What I did notice was that the substrate composition and surrounding land use were usually noted, as were field readings (temp, DO, pH, conductivity, sometimes alkalinity and/or hardness). Many of the 12 habitat parameters in the first RBP rendition (which most of us still prefer to use) were observed in one way or another and noted.

The state biologists have met annually, and a final part of the meeting was a description of the assessment method, a look at the sheet, and a field trip. At the stream, the EPA biologist explained what he looked for when he assessed each parameter. We all then scored the site. The final part of the exercise was for each individual to state their score for each parameter. It was very interesting to see how the “bug” biologists and the “fish” biologists looked at the same parameter at the same site. I believe that another EPA biologist may have recorded each response by parameter, or else our individual score sheets were kept. The most interesting result was that when the totals were added up, the end result was usually within the same category (optimal, suboptimal, marginal, poor). This was repeated at the next two or three annual meetings, as well as being a workshop at the annual Region 3 Biologists Workshop at Cacapon State Park, WV.

The PA Central Office personnel now travel the state to the regional offices for occasional updates on collection techniques, as well as to “recalibrate” the staff and to train new biologists. The regional biologists also train coworkers, interns, and occasionally county staff and private consultants involved in remedial follow-up reporting. Some local watershed groups, with their consultants and some of us from the state as well as EPA, have held habitat assessment workshops, with the result of a human “calibration” of sorts.
I feel more comfortable with data submitted by people from these training classes. While we DO NOT use citizen information for regulatory purposes, we do use it as watchdog information. It is impossible to know where every problem is in 11 counties, or whether there is a chronic discharge problem that doesn’t show up during plant inspections.
While this training is time consuming and gets to be kinda boring after a while, it is good to discuss scoring methods and thoughts among the people that use the information in order that everyone starts from a common point. It’s important to calibrate your instruments; it’s important to “calibrate” the humans, too.
Date: Wed, 16 Nov 2005 17:32:38 -0500
From: Geoff Dates
Subject: Re: [volmonitor] data reporting
Good point! I led a habitat assessment training once with about 30 people. I divided them into 10 groups of 3. The assessment was done in 3 shifts at 3 sites. So, each group got to do 3 sites. In other words, each site was done by all the groups.
At the end, we compared results, and at each site they were all within a very few points of each other.
I think what happened in each group was a process of scoring each characteristic as a group. Sometimes the individuals in the group gave very different scores, say for bottom composition. In the process of discussing it, though, each person explained their rationale. Then a “negotiating” process happened where they attempted to agree on a number. Sometimes they averaged their 3 scores for each characteristic. Out of a possible 120, all groups were within 10 points of each other. Most groups were within 5.
From this experience, I decided that no one should do these kinds of visual, subjective surveys alone. Two is better, three is ideal. The negotiating process got rid of the “outliers.” In some cases, the “outliers” were people who just did not follow instructions.
I think that these are very important activities. They can provide useful information that helps interpret bio-assessments. They can be an “early warning” and screen for problems. Maybe most important, they change the way volunteers look at streams.
River Watch Program Director
231 24D Heritage Condos
Woodstock, VT 05091
802-457-9808 w & h
River Network Web Site: www.rivernetwork.org
From: Anne Lewis (firstname.lastname@example.org)
Date: May 18, 2012
Has anybody looked at using Google Fusion Tables for managing their data? The functionality looks impressive. You can even create maps with it.
Responses to Question 2
From: Thorpe, Anthony Paul email@example.com
Date: May 18, 2012
I just started messing with this, prompted by your email, Anne. It looks very cool! It has been a long-standing wish of mine to allow volunteers to create graphs of their data, or to view existing data without having to download PDF files. This might be the ticket.
I was able to import my 2011 data and make a Chlorophyll/Phosphorus graph. In just a few clicks you can aggregate by site, and display average values.
There seem to be numerous limitations when it comes to tweaking the graphs, though. I can’t figure out how to add axis labels or display certain information, for example. I’ll be looking for work-arounds to get this done. One bonus: by uploading more data to the table, the graph will update dynamically.
I like it and hope to be experimenting with this a lot this summer.
Thanks for the tip!
Coordinator, Lakes of Missouri Volunteer Program
302 ABNR University of Missouri
Columbia, MO 65211
From: Howard Webb (firstname.lastname@example.org)
Date: May 18, 2012
I volunteer with LMVP (Lakes of Missouri Volunteer Program), and while we have not formally used Fusion Tables or published anything with them, I have played around with them and found them useful for several things:
They are good for geographically dispersed water sampling. Last year there was a one-day sampling of the Niangua watershed, and I used Fusion Tables to map a float trip down a stretch of the Little Niangua. I just loaded the table with the data, added some HTML formatting, and let it create the view as a map.
It can be a bit kludgey (uneven display quality) and is not quite ready for prime time, but it is definitely moving in a good direction and getting better. Here is the basic workflow:
1) Create a Google Docs spreadsheet and make an input form for it
2) Create a Fusion Table from the spreadsheet
3) Make a script to update the table from the spreadsheet (the instructions are cut-and-paste)
4) Create a view on the table, filtered to a single location, with a separate view for each location
5) Visualize the view as a chart.
The form does a nice job of collecting the data and moving it to the table. I had some trouble embedding the chart, but just got it working. I may need to set the chart to refresh itself periodically.
Feel free to enter data into the form and try it out. This is ‘play’ and I occasionally delete the data, but you cannot hurt it.
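For readers who want to prototype steps 3 and 4 of the workflow above without Fusion Tables itself, here is a minimal Python sketch of the same idea: pull rows out of a spreadsheet export and build per-location “views” as filtered row lists. The site names and the secchi_m column are invented for illustration; the real workflow runs against the Google services described in the email.

```python
import csv
import io

# Hypothetical spreadsheet export; columns and values are made up.
SPREADSHEET_CSV = """site,date,secchi_m
Little Niangua A,2011-06-04,1.2
Little Niangua B,2011-06-04,0.8
Little Niangua A,2011-07-09,1.5
"""

def load_table(csv_text):
    """Step 3 stand-in: move rows from the 'spreadsheet' into a table."""
    return list(csv.DictReader(io.StringIO(csv_text)))

def view_for(table, site):
    """Step 4 stand-in: a view is just the table filtered to one location."""
    return [row for row in table if row["site"] == site]

table = load_table(SPREADSHEET_CSV)
views = {site: view_for(table, site) for site in {r["site"] for r in table}}
for site, rows in sorted(views.items()):
    print(site, [r["secchi_m"] for r in rows])
```

Each per-site view could then be handed to any charting tool (step 5).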
Date: Thu, 02 Aug 2007 16:12:59 -0400
From: Kim Cressman
Subject: [volmonitor] Providing data to volunteers
The volunteer program I help coordinate is a monthly sampling program, and we send out a quarterly newsletter that contains all of their data. We include relevant water quality standards for comparison, but it seems like that’s rather meaningless to our volunteers.
Do any of you include data in newsletters? How do you make it meaningful?
I’ve been thinking about including averages, so the volunteers can compare their data to everyone else’s. My concern is oversimplification – every water body is different, and if one person’s baseline is different from someone else’s, I don’t want them to think their water is necessarily bad. We’re in south Florida, so there’s also huge variation between the dry season (when the water is very clear) and the rainy season (lots of turbidity) – but this is natural, and again, not necessarily bad.
The challenge here is to give the data context without dumbing it down. Any feedback you can offer will be helpful – what’s worked for you?
Environmental Resources Division
City of Cape Coral
P.O. Box 150027
Cape Coral, FL 33915
Responses to Question 3
Date: Thu, 02 Aug 2007 17:36:59 -0400
From: David Kirschtel
I think that this would be a good case for the use of sparklines to present the data.
from the entry at wikipedia (http://en.wikipedia.org/wiki/Sparkline):
Sparkline is a name proposed by Edward Tufte for “small, high resolution graphics embedded in a context of words, numbers, images”.
Tufte describes sparklines as “data-intense, design-simple, word-sized graphics”. Whereas the typical chart is designed to show as much data as possible, and is set off from the flow of text, sparklines are intended to be succinct, memorable, and located where they are discussed. Their use inline usually means that they are about the same height as the surrounding text.
He gives some really nice examples in his book Beautiful Evidence. If you follow the link in the wiki article, “Edward Tufte’s explanation of sparklines,” you’ll get to an online version of that section of the book. The advantage to you is that you would be able to create context by showing the annual (dry/monsoon) and longer-term trends in the data, as well as presenting their most current data point in a relatively simple graphical format. If you then stack up all the sparklines from each of your individual volunteers, you and they(!) will be able to get a really good sense of overall trends in the data — without having to do a lot of complex analysis.
A nifty example that Tufte gives is that he plots the sparklines from several dozen mutual funds. All the lines fluctuate in a virtually identical manner, suggesting that despite all the marketing hype, most funds are holding very similar portfolios. A huge amount of data is plotted very concisely, in an easy-to-digest format, to show some very interesting large-scale patterns, patterns that you would never see by looking at row upon row of numbers.
It may take a little bit of effort to get this set up, but I really think it will help in enhancing meaning and community for your citizen scientists.
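As a tiny proof of concept, a text-only sparkline can be built from Unicode block characters, which keeps the graphic literally word-sized, in the spirit of Tufte’s definition. The turbidity series below is invented for illustration.

```python
BARS = "▁▂▃▄▅▆▇█"

def sparkline(values):
    """Map each value onto one of eight block characters, scaled min..max."""
    lo, hi = min(values), max(values)
    span = (hi - lo) or 1  # avoid divide-by-zero on a flat series
    return "".join(BARS[int((v - lo) / span * (len(BARS) - 1))] for v in values)

# e.g. monthly turbidity: clear dry season, murky rainy season
print(sparkline([2, 2, 3, 5, 9, 14, 15, 12, 6, 3, 2, 2]))
```

The result drops inline next to a site name in a newsletter, one line per volunteer.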
David Kirschtel, Ph.D.
Sr. Program Manager
Washington, DC, 20009
Date: Fri, 03 Aug 2007 09:19:08 -0400
From: “Sullivan, Chris”
What is the typical age of your volunteers? That could help with determining a good way to distribute the data. I think it is great that you get the results back to the monitors at all, let alone worry about making it relevant and meaningful to them. A good comparison would be to use the results from a water body that most people are familiar with and that is known to be impaired, then also include data from a familiar water body that is known to have excellent water quality. With this information setting up your range, I think the monitors would be able to get a better picture of the water quality at their sites/locations.
500 Hawthorne Ave
Derby, CT 06418
Date: Fri, 03 Aug 2007 09:57:52 -0400
From: ginger north
Subject: RE: [volmonitor] Providing data to volunteers
I, too, think your newsletter full of data is quite an achievement; we only send published data out every 5 years! Because of the large time span, data trends and summaries are the focus, so volunteers can see the trends and compare them to others in their watershed. We do not compare everyone’s data to each other, though — just within a watershed and at their own sites over time. Hopefully this makes it more meaningful to them than tables or graphs of all raw data. I do think that a yearly analysis might be a better time frame, but we haven’t managed to get there yet. This is a good way to show long-term trends, but it does raise the possibility of oversimplification. So there are pluses and minuses to this method as well.
I guess the first question may be to ask them if this is a meaningful way to see their data. If they are happy then no worries.
Stream Watch Coordinator
Delaware Nature Society
Date: Fri, 03 Aug 2007 11:22:11 -0400
From: Kim Cressman
To answer Chris’s question, our volunteers are mostly retired, so we’re working mainly in the age range of 60+. Ginger, we asked for feedback in our most recent newsletter, and only heard back from one person! She mentioned that she just wants to know what it means – is it good, bad, or neutral? And why are some areas higher in “stuff” than others? I think some of the basics are worthy of a newsletter article and inclusion in training materials – what are we testing, why are we testing it, why do things change in summer, why is my canal different from that other one. Mostly people just want to know if it’s good or bad, which is an oversimplification that I want to avoid. But it would probably be a good idea to at least highlight data that exceeds standards. And I think Chris’s idea of comparing it to water bodies that people are familiar with is a good one.
Thanks to everyone who’s replied so far!
Environmental Resources Division
City of Cape Coral
P.O. Box 150027
Cape Coral, FL 33915
Date: Fri, 03 Aug 2007 12:09:54 -0400
From: Carolyn Sibner
We also monitor monthly (but only from spring into fall), and have also struggled for years with how best to present our data. I also feared the oversimplification that comes from averaging, especially considering how big a factor weather is in our bacteria results.
The more data I have (we’ve been monitoring some areas for 6 years now), however, the more comfortable I am that I know which sites are more likely to have a problem than others.
In an effort to look at our data over more than one year at a time, we recently went through our data and compared each result to the state Water Quality standard for each parameter. So, for instance, at each site we calculated how many times the samples from that site met the state standard for safe bacteria levels for primary contact (swimming). We did this year by year, so we could see whether it seemed to be getting better from year to year, or worse, or whatever.
We then set up a scale similar to the grades we received in school, since we figure most people are familiar with that old A – F grading scale. And then we color coded each “grade”, with blue being the best water quality (it met or exceeded its water quality standard at least 95% of the times we sampled), and red meaning it did not meet its standard even 60% of the time.
We are still tweaking it, and trying to figure out what is the best way to display this info. Excel tables are the easiest to manage and update, but they can still be kind of dry and hard to interpret. We set up Word documents that are easier to read and understand, but they are a bit of a pain in the patootie to update (i.e. it takes too much time to format and reformat the lines and spacing each time we add a new set of numbers).
We can now look at a summary sheet for a site and tell right away, by the colors, which parameters are doing well, and which ones are having problems, by individual years and over time.
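The grading approach described above can be sketched in a few lines of Python. Note that only the 95% (best, blue) and 60% (worst, red) cutoffs come from this email; the middle thresholds, the bacteria counts, and the 235 CFU/100 mL standard are invented for illustration.

```python
def compliance_rate(samples, standard):
    """Fraction of samples meeting the standard (here: at or below it,
    as for bacteria counts)."""
    met = sum(1 for s in samples if s <= standard)
    return met / len(samples)

def grade(rate):
    """Map a compliance rate to a school-style grade and display color.
    Only the 95% and 60% cutoffs are from the email; the middle
    thresholds are made up for this sketch."""
    if rate >= 0.95:
        return "A", "blue"
    if rate >= 0.85:
        return "B", "green"
    if rate >= 0.75:
        return "C", "yellow"
    if rate >= 0.60:
        return "D", "orange"
    return "F", "red"

# Hypothetical E. coli counts (CFU/100 mL) vs. a primary-contact standard
site_2006 = [80, 120, 960, 150, 90, 200, 110, 70]
rate = compliance_rate(site_2006, 235)
print(f"{rate:.0%} of samples met the standard -> grade {grade(rate)}")
```

Run per site and per year, the output feeds directly into the color-coded summary sheets.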
I gave each volunteer the summary for their site at the end of last year, but think that Chris made a good point that being able to compare their data to a site that they are already familiar with would help them better understand how their data compares to other sites.
If you’d like, I can email a sample site summary to you so you can see what it looks like… and then you can give me your feedback on how to make it even better!
Water Quality Manager
Housatonic Valley Association
So. Lee, MA
Date: Fri, 03 Aug 2007 12:34:59 -0500
From: KRISTINE F STEPENUCK
Like others have mentioned, you’re ahead of the curve by presenting annual reports to your groups. In Wisconsin with volunteer streams data, we also don’t do annual reports, but the Citizen Lakes Monitoring Network does. Those are summary reports with some key information about water quality in the lakes and a trophic status index score, as well as other information to help people to understand their results. They’re sent to the volunteers about their individual lakes. They’re also available online (http://dnr.wi.gov/lakes/CLMN/).
For streams, our approach hasn’t been as consistent, but we’ve done things similar to what others have suggested. One way I liked, in terms of getting information out to citizens in a usable format for them, was to create brochures with simplified data summaries about a localized area – usually a watershed or several small watersheds – so data could be compared. For macroinvertebrate data results, we coded a map with red, yellow, and green labels for how the water quality scores came out for various sites (red for poor scores, green for good scores). For other parameters we included short descriptions of why the parameter was important to monitor and a summary of scores for that area. We generally report medians, since that tends to be a good representation for smaller data sets (though now, with over 4000 data points, means and medians are often equal to one another).
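The point about medians can be seen with a toy example: in a small data set, a single storm-driven spike drags the mean far from the typical value, while the median stays put. The turbidity numbers below are invented.

```python
from statistics import mean, median

# A small, skewed sample: one storm-driven spike dominates the mean
turbidity_ntu = [4, 5, 6, 5, 90]
print(mean(turbidity_ntu), median(turbidity_ntu))
```

With thousands of data points, as noted above, the spike washes out and the two statistics converge.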
We did a longer data results report for a 10-year data summary, again including short descriptions of why it’s important to monitor what we do, comparing statewide medians/means, lows, and highs with the local medians/means, lows, and highs, and discussing areas where it may be important to do follow-up monitoring.
The brochures and 10 year summary are posted online at: http://watermonitoring.uwex.edu/wav/monitoring/databaseResults.html
With our online database, citizens also have an ongoing opportunity to graph data results over time for a single site or to compare sites. Here’s a link to that database if you want to see how we’ve presented data in graphs and tables (http://www.uwex.edu/ces/erc/watervol/). A weakness of those graphs and tables is that we have no supporting information about what the data mean; people viewing them need a base knowledge of what’s presented.
Volunteer Stream Monitoring Coordinator
University of Wisconsin Extension and
Wisconsin Department of Natural Resources
Phone: (608) 264-8948 or (608) 265-3887
E-mail: email@example.com
Date: Fri, 03 Aug 2007 14:00:18 -0400
From: Carolyn Sibner
Kim and everyone,
The folks at UMass in Amherst have a lot of info on how to display and manage data. Below is a link to their site (I hope it works! – if not, try copying and pasting it into your browser). There’s all kinds of info on their website about monitoring both rivers and lakes!
Date: Mon, 06 Aug 2007 10:53:48 -0400
From: Tony Williams
You are correct: for water quality and monitoring, most people just want to know if it is good or bad. So the challenge is how to answer their question without too much oversimplification, while at the same time taking the opportunity to educate them on the complexities of these water bodies, why one may have “higher stuff” than another, and whether that is a good thing or a bad thing.
We use a health index, a 0-100 point grade and color system, to try to explain in a simple manner whether the water is good, fair, or poor. I am always explaining why high nutrient levels are not so good and yet high oxygen levels are good; it often confuses people and even volunteers at first. The oversimplification then works to show them “how” the water is doing, but the follow-up to explain and educate them is what really matters.
This is our volunteer program
and this is our new attempt to show the data results for each site…so the volunteer can “see all their data” …but it is a lot of work when you have lots of data and lots of volunteers.
…and in response to Eric’s – “in the digital age, the data gathered by volunteer monitors should be available in real time, all the time.”
For our volunteer program, we go through all the data with a certain level of quality assurance checks before we make it available. This takes a bit of time, and at least for us, our volunteers know this; we feel confident in presenting the data when we finally do. So for us, having real-time, all-the-time volunteer data just out there isn’t as important as having a quality package of data to present, with our interpretation of what is happening. After that, the data are available for all.
Director of Monitoring Programs
The Coalition for Buzzards Bay
Nashawena Mills – 620 Belleville Avenue
New Bedford, Massachusetts 02745
Tel. 508-999-6363 x.203
Date: Mon, 06 Aug 2007 11:37:34 -0400
From: Linda Green
Tony and others,
Thanks for replying with your fine website, and also your reply to Eric Eckl’s comment, “In the digital age, the data gathered by volunteer monitors should be available in real time, all the time.” It takes time, often A LOT of time, to make sure that the data being presented have been correctly analyzed, with necessary quality assurance procedures. It is important to remember that most volunteer monitoring programs are not analytical labs with the infrastructure to process samples speedily, and not everyone is using the latest electronic gizmos to upload results. It also depends on how many volunteers and locations you have in your program! There is also a vast difference between presenting numerical data and transforming data to results, and then ultimately to information that folks can understand and use.
The volunteers in the URI Watershed Watch program know — well, we hope they know — that it does take time to make sure everything is correct before it is shown to the world. We post all our results on-line. We do know that folks are keenly interested in bacteria results, so within a week of water sampling we post those results to our website, with a simple coding of red font for results over water quality standards and black font for those below them, which makes a nice at-a-glance visual. It takes a lot longer for the rest of the data and results to get posted. We link all results to explanatory factsheets, send customized reports, and attend watershed meetings upon request.
URI Watershed Watch Program Director
URI Cooperative Extension Water Quality
College of the Environment and Life Sciences
CIK, 1 Greenhouse Road
Kingston, RI 02881-0804
Date: Mon, 06 Aug 2007 22:00:33 -0400
From: Eric Eckl
Fascinating, so when I wrote: “In the digital age, the data gathered by volunteer monitors should be available in real time, all the time,” it evoked the expectation of instantaneous upload, which raised concerns about quality control. Thanks for that. I understand the objection.
But what I’m really driving at is something a little different — the inherent shortcoming of having the data housed in thousands of different websites, Excel spreadsheets, newsletters, etc… It seems right now that if you want to access monitoring data that you didn’t gather yourself, you have to hunt and peck for it: post requests to listservs, ask around, call your peers, etc…
Shouldn’t it all be housed in the same website, just a mouse click away for everybody who might need it?
There are several organizations competing to make that possible. One is Google (of course), with Google spreadsheets:
I’m curious: is the prospect of a combined master database of water quality monitoring data attractive or scary?
Water Words That Work
P.O. Box 2182
Falls Church, VA 22042-2182
Cell: (703) 635-4380
Date: Tue, 07 Aug 2007 10:02:00 -0400
From: Kami Watson
I’d like to know if anyone has further information about EPA’s proposed WQX database. My understanding is that this is supposed to be a “user friendly” and updated version of Storet.
I have the 2007 Storet/WQX User’s Conference in Austin, TX during Nov 27-29 on my calendar and hope to be able to attend to get more insight and perhaps share some ideas. However, knowing that Storet simply is not user friendly for watershed groups and volunteers, does anyone know if WQX truly will be, or will you still need significant IT knowledge to utilize it? The complexity of Storet is the reason our Save Our Streams programs do not currently use this database to store their data.
Also, other than the requirement of general metadata, will there be a request for the protocol used and whether or not it has a QAPP with EPA? Will there be a control measure in place, such as an administrator who posts volunteer-input data after review, or will it be directly uploaded by the watershed group, state, tribal, or local officials and volunteers?
With Save Our Streams being a national monitoring program that molds itself to the needs of states and local communities, we also see the development of individual databases for these programs. Where SOS projects and programs do not have the strengths in financial and technical partnerships to develop these databases and websites, volunteers are often still keeping their data in spreadsheets on personal computers and in filing cabinets. The volunteers know their data are important, and they would like to make them accessible to the general public, but as others have mentioned, the data are often kept in a local database, and one must do a thorough search to find them.

The request for a free national database is one that is expressed to me quite often at training workshops and by our Save Our Streams volunteers. I’m hoping that WQX will fill this need. Does anyone have any insight on this? It would make sense to me to collect water quality data in one house and then allow those who want to utilize the data to extract it and put it into mapping programs, etc., depending on their individual needs. With the decrease in funding for state and local governments to maintain volunteer monitoring programs, and the increase in watershed and conservation groups, local schools, and universities taking an active role in gathering water quality data, whether for education or grassroots advocacy purposes, the data are out there. While the validity of the data can be argued, that is where the metadata come into play, along with the EPA QAPP and the quality assurance and quality control measures put into place.
Any insight into this would be helpful.
Save Our Streams Program Coordinator
Izaak Walton League of America
707 Conservation Lane
Gaithersburg, MD 20878
(301) 548-0150 ext. 220
Date: Tue, 07 Aug 2007 12:07:58 -0400
I’m extremely pleased to hear you’re planning to attend the STORET/WQX user’s conference in November. I hope other volunteer monitoring program coordinators can also attend, and lend their voices together to offer ideas, ask questions, and seek support from those who manage data at the state and federal level.
Our STORET/WQX team is developing a tool to allow small users (such as volunteer monitoring organizations and Tribes) to transfer data from local databases (e.g. spreadsheets, Access) to our national water quality data warehouse via the Web. (The national data warehouse currently stores, and makes accessible, data from many state water quality agencies, other agencies, universities, a few volunteer monitoring programs, etc.) The goal is to make this small users’ tool as user-friendly as possible, something someone who understands a bit about data and the Web could master. I don’t believe it will require that you have an approved QAPP, but it will require data of documented quality, so there will still be a need to include metadata re: how, when, why, where and what the data are all about (and yes, which protocols were used). But since having that metadata is critical to any well-thought-out program, we hope it won’t be a concern.
I believe the plan is for data to be uploaded by the watershed/volunteer group’s data manager after it has gone through your own internal QA process, rather than having it go through some sort of outside administrator. Large organizations such as the IWLA may want to provide this function for their subgroups. It seems unlikely (and possibly unwise) to me that individual volunteers would upload data to the national warehouse unless that was their role in their organization.
The STORET/WQX team could use your input as it develops its tool for small users, which should be complete by the middle of next year. If you manage data for your program and have questions, concerns or suggestions, please feel free to contact the team directly, at: STORET@epa.gov (put “small users’ tool” in the subject line).
Find out more about the November STORET/WQX User’s Conference in Austin online. The STORET/WQX team hopes to be able to demonstrate the beginning of the small users’ tool at the conference and, again, could use your input. We will also be talking about WQX and the small users’ tool at the upcoming Regional volunteer monitoring conference for region 3 states (VA, PA, DC, WV, MD, DE) in Virginia in October (check out
Also, you may have missed our Watershed Academy webcast on WQX on June 21 (“Using STORET to Characterize your Watershed”). You can listen to the audio broadcast and see the slides by going to
Date: Thu, 23 Aug 2007 21:03:05 +0000
From: Jo Latimore
Subject: [volmonitor] Request for report examples!
Hello! I’m part of the team who administers Michigan’s volunteer stream and lake monitoring program (the Michigan Clean Water Corps, or MiCorps). We’re putting together our annual conference, and I’m leading a session on “Presenting Volunteer Data Effectively”. I enjoyed the recent thread on the topic, and would like to be able to provide session attendees with a number of examples of reports or presentations put together by volunteer monitoring groups to communicate their data/analyses/conclusions to various audiences. These audiences might be the volunteers themselves, local decisionmakers, the media, etc. – one point I’m hoping to make is that the best way to present your results can vary from one audience to another.
I’d appreciate links to websites, attached files, or anything else you can provide... thanks in advance!
Responses to Question 4
Date: Tue, 22 Jan 2008 11:40:10 -0500
From: Jo Latimore
Subject: RE: [volmonitor] Request for report examples!
Back in August I posted a request to this list for examples of volunteer monitoring data reporting. I’ve compiled a listing of many of the examples that I found or were sent to me, and include them below, if you are interested.
Thanks to all who contributed!
Michigan Clean Water Corps
Examples of Volunteer Monitoring Data Presentation and Reporting
Available Online (Winter 2007)
1. Buzzard’s Bay Baywatchers Program (Massachusetts) http://savebuzzardsbay.org/baywatchers/ (Interactive mapping and data reporting site.)
http://www.savebuzzardsbay.org/GetConnected/Publications? (publications of their volunteers’ data; includes 127 signs they made for their sites, a map with charts, poster, and a synthesis report.)
2. Lakes of Missouri Volunteer Program
http://www.lmvp.org/Data/2006/index.htm (online annual data report,
with charts and a paragraph about each lake)
3. Loudoun Watershed Watch (Loudoun County, Virginia)
www.loudounwatershedwatch.org (includes downloadable Excel file with
data – bugs, chemistry, bacteria, and charts for all sites)
4. University of Delaware Citizen Monitoring Program
http://citizen-monitoring.udel.edu/index.shtml (Data reports, maps of
5. Charles River Watershed Association (Massachusetts)
http://www.crwa.org/water_quality/monthly/monthly.html (In ‘monthly
maps’ section, data reported with color-coded maps, site descriptions,
parameter explanations, annual summary reports)
6. Minnesota Pollution Control Agency’s Citizen Stream-Monitoring
Program (summaries written for each site, with volunteer’s name, info
about land use in the watershed, and results)
7. Friends of the Rouge (Rouge River, MI)
http://www.therouge.org (Click on the stonefly; scroll down to links to
8. Huron River Watershed Council (Michigan)
http://www.hrwc.org/our-work/programs/adopt/ (scroll down to Adopt-A-Stream for
a collection of monitoring reports through the years – most recent are
organized by subwatershed with maps)
9. Michigan Clean Water Corps
http://www.micorps.net (“View Data” lets user explore data sets for
volunteer lake and stream monitoring across the state that follow
statewide standard protocols; summary Annual Reports for the lakes
monitoring program are also available)
Data Reporting and Presentation Guidance
1. Massachusetts Water Watch Program (Ready, Set, Present!)
2. Eleanor Ely’s “Writing to Be Read” workshop:
http://writingtoberead.wordpress.com (Aims to give useful advice to
people at environmental agencies and organizations who need to write
about environmental topics for different audiences.)
3. Water Words That Work
4. Volunteer Monitor newsletter
5. Michigan Clean Water Corps – 2007 Conference Proceedings
http://www.micorps.net/conference2007_proceedings (“Presenting Your
Monitoring Data” slide show)
Date: Wed, 23 Jan 2008 12:09:50 -0800 (PST)
From: Revital Katznelson
Subject: RE: [volmonitor] another report example
From the Clean Water Team, the citizen monitoring program of the State Water Resources Control Board: This report may be an addition to the list you have started (although it was written by the technical support coordinator at the time, yours truly). It is a summary report on stormwater quality data collected by over 70 volunteers, in 26 locations within the Russian River watershed, during the first few hours of the first storm event of the 2002/03 California winter. I believe the part about logistics will be of special interest to others contemplating a ‘first-flush’ monitoring event with multiple teams. Please go to http://www.waterboards.ca.gov/northcoast/publications_and_forms/available_documents/ and scroll down to Russian River First Flush Summary Report. Additional graphics are shown in Appendix D on the same page.
510 406 8514
Date: Wed, 26 Mar 2008 16:10:27 -0500
From: Alabama Waterwatch
Thanks for the accidental message. It brought to my attention that our
website is not mentioned on the list below. Below are some of the basic
stats of our program. Our website features interactive maps, graphs and
access to water data collected since 1993.
Total Water Quality Records: 51,967
Total Water Chemistry Records: 42,613
Total Bacteriological Records: 9,077
Total Bioassessment Records: 277
Total Monitors Certified: 4,689
Total Sampling Sites: 1,879
Total Training Sessions: 1,171
Total Waterbodies Monitored: 742
Total Citizen Groups: 248
From: firstname.lastname@example.org [mailto:email@example.com] On Behalf Of Linda Green
Sent: Monday, January 14, 2008 7:56 AM
To: CSREESVolMon@lists.uwex.edu; ‘Volunteer water monitoring’
Subject: [CSREESVolMon] Using Data to tell a story
Apologies for cross-postings…
At an upcoming conference we will be presenting a workshop called “Putting it All Together – Using Data to Tell a Story”. On a planning call for the workshop, we got to sharing stories about different experiences we’ve had in preparing reports, presentations, etc. We thought that it’d be useful to ask all of you in the volunteer monitoring realm for your favorite tips and techniques for using data to tell a story.
What has worked best for you? Do you have examples of materials that have worked really well to turn data into a compelling water quality success (or failure) story? Do you have such materials available online (if so, where)? Or do you have tips for other volunteer monitoring coordinators about what to expect when preparing reports or data analysis presentations? We all agreed that you should plan to take at least twice (and probably three times) as long as you expect to develop compelling data stories.
What’s been your worst experience with data presentations or reports? You can come clean about your own mistakes, or of course let us know about someone else’s debacle.
We’ll compile your suggestions into a fact sheet handout for the workshop, and then will post it on-line afterwards, so please let us know if you’d like to be quoted or anonymous when you reply.
Thanks so much everyone!
Linda Green & Elizabeth Herron
USDA-CSREES Volunteer Water Quality National Facilitation Project
URI Watershed Watch Program
URI Cooperative Extension Water Quality
College of the Environment and Life Sciences
CIK, 1 Greenhouse Road
Kingston, RI 02881-0804
Responses to Question 5
From: Streamkeepers [mailto:Streamkeepers@co.clallam.wa.us]
Sent: Monday, January 14, 2008 1:13 PM
To: Linda Green
Subject: RE: [CSREESVolMon] Using Data to tell a story
Hi Linda & Elizabeth,
Interesting questions! Clearly posed by people who have long grappled with them.
Our data was used (along with other sources of information) for a comprehensive 2004 report on the state of our county’s streams. Here’s the link: http://www.clallam.net/streamkeepers/html/state_of_the_waters.htm. The basic formula was to try to present information about every stream system, with a combination of summary text in 3 categories (overview, implications for people, and implications for fish), a health-rating table (with ratings of healthy, compromised, impaired, highly impaired, and critically impaired) in 3 categories (WQ, biology, and habitat), and a “particular concerns” and “recommendations” box. Plus maps, photos, and a host of explanatory material in introductory and appendix chapters. It took LOTS of work, both by staff and a paid writer/editor, and was about 2 years late in publication. Of course, now we have a template that we can follow and improve on in future years.
This past year, we took on the challenge of creating a display for fair booths “telling the story” of our data. We decided on the following:
–Use the B-IBI as our summary-statistic of stream health (luckily, we have a pretty rich set of B-IBI data).
–Present the B-IBI in map form with colored dots (see “bug ratings” (5.4 MB jpg file), attached).
–Have a poster explaining what the B-IBI is all about and why it’s important (see “BIBI” (678 KB jpg file), attached).
–Present 3 stream case-studies in which data is presented in a way that not only tells about the health of the stream, but also gets across important watershed-process concepts, so that viewers can gain a deeper understanding of how watersheds work and how they get disturbed (see “Peabody” (579 KB jpg file), “Bell” (562 KB jpg file), and “Salt” (617 KB jpg file) posters in that order). We wanted to concentrate on concepts that people were less likely to know, such as:
—–the connection between development, hydrology, stormwater, and sediment delivery and movement;
—–the importance of LWD and LWD recruitment;
—–the role of riparian areas.
Again, it took a LOT longer than we thought it would for these 5 posters, probably 60 hours of my time and 350 hours of a contractor’s. These posters will soon be on our website, once our volunteer webmaster gets them loaded.
Some things we’ve decided/found out:
–Think of it as a story. Start out by saying, “What’s the story of this creek? What’s the story we’re trying to tell?” So we started with watershed-assessment documents, plus what we collectively knew about the creeks. Then we looked at the data and whether it supported the story. Then we decided which data to focus on and how to present it. In the process, we certainly became familiar with our data gaps!
–It’s okay to present something that’s not conclusive and say that it’s not conclusive. That’s science.
–Colored dots on maps are good, but too many maps can be overwhelming in a display. We’re working on a slideshow now which has lots of maps, but that’s okay when you have a presenter describing what people are seeing on the maps.
–You can probably present one or two other concepts along with a basic dot on a map. So for instance, we’ve put question-marks in dots that only represent one year of B-IBI data. In another case, we show our monitoring sites along with whether they’re sponsored by an outside client and whether they’re monitoring a restoration project (see “ccwr sk client sites” (472 KB jpg file) , attached).
–Figure out when you need to be comprehensive, and when you just want to focus down on a few salient data findings.
–Multiple presentational graphics are good: we’ve tried integrating text, maps, photos, charts, tables, and graphs.
–Photos in particular are important, so that people can see what the landscape impacts look like, then look at what the data tells about the results of those impacts.
–Headings and subheadings are critical: get across whatever basic message you want to convey in the big letters, so that someone just passing by the booth will at least see those important points (and hopefully be drawn in enough to want to take a look!).
–Callout text of various types really helps make graphs meaningful.
–A good report-production team needs to have people with the following skills/knowledge: watershed ecology, the available data, statistical analysis, graphing, GIS, pedagogy, page layout, and wordsmithing. If you’re lucky, some people will have several of these! We needed a basic team of 3 people.
–The review process is critical. We showed drafts to our advisory board, our volunteer data-analysis team, and our education/outreach team. We got lots of feedback and went through many, many drafts. As frustrating as it often was, the posters just kept getting better.
–For graphs and maps, you’ve got to check the color-production of the printers and projectors you’ll be using. We found, for example, that with our projector, our orange and yellow dots were indistinguishable, and with our plotter, one of our color orthophotos didn’t show the land-cover features we were trying to show, so we had to take our poster file somewhere else to get printed.
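The colored-dot map idea above boils down to binning a summary score into rating categories, each with its own dot color. A minimal sketch in Python, using the common 10-metric B-IBI scale (10–50); the cutoffs, site scores, and colors here are illustrative only, since programs calibrate their own breakpoints:

```python
# Bin B-IBI scores into rating categories with map-dot colors.
# Cutoffs and colors are illustrative, not Streamkeepers' actual scheme.
RATINGS = [
    (46, "excellent", "blue"),
    (38, "good", "green"),
    (28, "fair", "yellow"),
    (18, "poor", "orange"),
    (10, "very poor", "red"),
]

def bibi_rating(score):
    """Return (rating, dot_color) for a B-IBI score on the 10-50 scale."""
    for cutoff, rating, color in RATINGS:
        if score >= cutoff:
            return rating, color
    raise ValueError(f"score {score} is below the B-IBI minimum of 10")

# Sites with only one year of data get a '?' flag, as in the display
# described above. Scores and year counts here are made up.
for site, score, years in [("Peabody Cr", 24, 3), ("Salt Cr", 44, 1)]:
    rating, color = bibi_rating(score)
    flag = "?" if years < 2 else ""
    print(f"{site}: {score} -> {rating}{flag} ({color} dot)")
```

Once each site has a (rating, color) pair, the dots can be dropped onto a base map with any GIS or plotting tool.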
That’s what comes to mind right now. I’m sending this on to some of our volunteers/colleagues to see if they have anything to add.
Ed Chadd & Adar Feller
Streamkeepers of Clallam County
Clallam County Department of Community Development
223 E. 4 St., Suite 5
Port Angeles, WA 98362
360-417-2281; FAX 360-417-2443
Date: Wed, 23 Mar 2005 10:36:59 -0500
From: Caitlin Cusack
Subject: [cem] Note from Andy Whitman
See below for helpful info. from Andy Whitman.
Caitlin and others,
If you have ever asked yourself why a volunteer monitoring program is measuring so many variables, then you may find the following documents of interest:
The first document describes a rigorous, straightforward approach to creating simple indices from large data sets (http://www.manometmaine.org/documents/FMSN2004-3LSIndex.pdf; http://www.manometmaine.org/documents/NHIndex.pdf and http://www.manometmaine.org/documents/SFIndex.pdf are examples of the LSI described in the first document). The resulting index is very easy to teach (4 hours max for the index versus about 40 hours to train on the full set of variables), quick to apply (2 variables in <30 minutes; the index takes about 8% of the time it takes to measure the full set of variables), cheaper to equip ($150 for a DBH tape and two 100-yd tapes versus ~$600 per set of equipment for the full protocol), easy to explain (most volunteers could easily understand the ecological basis of the index by the end of the 4-hour training), and scientifically efficient (the index captures about 85% of the information gained from the full set of variables).
If you struggle with getting volunteers to attend lengthy trainings, equipment expenses, the challenges of managing unwieldy databases, collecting large numbers of samples, QA/QC, or volunteer retention in the face of demanding program commitments, this example may give you some ideas for simplifying your volunteer monitoring program with little to no loss of information from your data stream. Caitlin, you may want to include this on your CEM CD.
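The core of the approach above is collapsing a few quick field measurements into one easy-to-teach score. A minimal sketch of what such an index might look like; the variable names, reference ranges, and equal weighting are hypothetical illustrations of the idea, not Manomet’s actual LSI formula:

```python
# Sketch: combine two quick field variables into a single 0-100 site index.
# Reference ranges and equal weighting are hypothetical, for illustration only.

def normalize(value, lo, hi):
    """Rescale a raw measurement to 0-1 within its reference range, clamped."""
    if hi == lo:
        raise ValueError("reference range must be non-degenerate")
    return min(1.0, max(0.0, (value - lo) / (hi - lo)))

def simple_index(mean_dbh_cm, basal_area_m2ha,
                 dbh_range=(10.0, 60.0), basal_range=(5.0, 50.0)):
    """Average the two normalized variables and scale to 0-100."""
    parts = [
        normalize(mean_dbh_cm, *dbh_range),
        normalize(basal_area_m2ha, *basal_range),
    ]
    return 100.0 * sum(parts) / len(parts)

print(simple_index(35.0, 27.5))  # mid-range on both variables -> 50.0
```

With only two measurements and a hand calculation, a volunteer can score a site in the field, which is exactly where the training-time and equipment savings described above come from.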
Manomet – Maine
Editor's note: We archived these three files on our site in the event the links change. They are available here:
FMSN2004-3LSIndex.pdf (229 KB pdf file)
NHIndex.pdf (339 KB pdf file)
SFIndex.pdf (313 KB pdf file)