Uncomplicated Data: Start with Reporting?
How much do we love a set of data visualizations that look like this? Of course, so much. In so many ways, our lives are now conditioned to expect a boatload of data, telling a robust story, neatly arrayed in a colorful visual...extra points if there's a gradient, because that makes it also very pretty.
But most people don't realize that it takes huge batches of pretty sophisticated data, meticulously cleaned and prepared, to get a visualization like this to really do its job.
The fact is, this is complicated data right here.
And my firm belief is that we need to strive to uncomplicate data as much as possible so we can collect it, analyze it, and actually use it.
So, let's break down one of the most common questions I hear when nonprofits are getting on the ol' data train.
"What should we even be measuring?"
I both love and hate this question. Love it because it IS the be-all and end-all: pick your metrics wisely and your data will serve you from day one (assuming you're collecting it well, which is another post altogether). Hate it because figuring out the answer is a bigger challenge than you might expect, and you're gonna get mad at me because I'll have to break out logic models and theories of change, and that's before we even get to literature reviews and choosing appropriate metrics.
Or, do I?
While all of that will probably have to happen down the road, I think there's a much easier way to get started right now on figuring out what you want to measure and maybe even should be measuring. I have an idea. It might be radical. Hear me out.
Start with Reporting.
Wait, what now? That's right, I said it. Typically, we don't think about reporting until a year from now, when the grant report is due and we figure everything out then. But if we started reporting today, in about six weeks there would be a clear, emergent path to a theory about what your program or intervention is actually accomplishing...with clues toward appropriate metrics. Sound too good to be true? I promise it's not. But it does involve some work.
Daily Journaling or End-of-Day Summaries = Reporting
I'll wait while you catch your breath from the shock that you probably already have a good source of data if you do any reflective jotting down or summarizing of activities at the end of each day. This can take any form and is primarily for you. If you write down what was planned and intended, what actually happened, and a few notes about why you think the day turned out that way, that's all you really need. The key is doing this consistently, every day.
If we're able to look at the full view with our eyes open, over time, our reporting will reveal the kind of impact that the activity, program, or organization is actually having. This will lead you to a clearer picture of how that intervention is working in a given population AND it'll give you some clues about the metrics you could consider adopting to start evaluating that program toward its logical outcomes.
Done. You've just inverted the process of setting up a theoretical measurement framework and you're already at least one step ahead.
Brace Yourself for an Unanticipated Story
The reason I really love this more inductive, qualitative approach right at the beginning is that we often tell ourselves tales about how this "thing" we're doing will work. With that in mind, every day we look for that to happen. If it does, we count it a win, and if not, that's a big L. The bummer is that those "Ls" can be big teachers for us, and we often just chalk them up to failure because a program or activity isn't doing what we want it to. The reality, though, is that it could be doing something much more profound.
Real-Life Example: Little Libraries and Student Achievement Outcomes
This is one of my treasured examples of how inductive research can really teach you something if you're willing to see it.
I was working at an organization that wanted to improve 3rd-grade reading scores for primarily black and brown students in the public school system. In a move that defied logic for me, one of the "programs" they undertook was placing little libraries in public parks throughout the city, thinking that this would directly deliver books to the kids who were in the parks without the hassle of going to the excellent public library available in this town. Now, technically, this was a response to national studies showing that limited access to books often contributed to low achievement scores in low-income communities of color, so, in some way, I guess they thought it was data-informed. Every week for a whole summer, volunteers would load those little libraries up with books aimed at elementary and middle school kids and then wait to see what happened. What they found is that in all the little libraries except one, the books were largely untouched. The re-stocking efforts became very light, and come October, the 3rd-grade reading scores were worse than ever. Their conclusion: access wasn't the problem and something else might be the issue. Activity failed.
But, hang on just a minute. Let's talk about the one library that was touched. It was the one located in the middle of the playground of a park in the historically black neighborhood, now more inclusive of Latinos and Middle Eastern and African immigrants. That little library was virtually destroyed over the course of the summer: door broken repeatedly, paint chipped, muddy, the books in there living a life. This was shrugged off as disregard or disrespect, and no further attention was paid to it...except by me.
This library was on the way to my parked car, and every afternoon, rain or shine, I would note what was happening with it. What did I see? It was used. Books would turn over rapidly. Some you'd find in the yard ripped to shreds; I'm guessing those weren't liked much. But the community was also restocking the little library itself. Consistently, they would replace middle-school-aged books with early childhood board books or picture books. Those would turn over on their own. I would regularly find library books in there. One day, I saw that the plexiglass window, which had been cracked, had been repaired with duct tape. And as the summer waned and August turned to September, the books started drying up. By November, it was empty and deserted.
So, was this a fail?
Deductive Fails Are Often Inductive Wins, if We Accept Them
No, this was a treasure trove of cultural clues. It told me that moms were likely the ones interacting with the library the most, that they did have access to books in their homes and liked the idea of sharing the wealth or getting a new cycle of books to use at home, and that the little library was more convenient or comfortable than the actual real library downtown, which was also really fancy. Who, I learned, was not interested? Elementary and middle schoolers.
An inductive approach, reporting first, opens our field of vision to what could be happening instead of moving too quickly to what should be happening. It can create some really valuable, timely considerations for either adjusting what you're doing to meet an intended outcome or looking toward a different outcome entirely.
If you're wondering what to measure, just start jotting down what is happening right now. The measurement will come to you.