Grantmakers starting with evaluation should begin with a simple framework
By Jen Riley, director, Social Impact Measurement Network Australia (SIMNA)
If you’re a grantmaker, chances are you’ve wrestled with evaluation. There are many questions, a great deal of data, and often no clear place to begin.
You might be asking: What matters? Who should I be listening to? How do I avoid building a system that looks good on paper but never gets used?
You’re not alone. Evaluation can quickly become overwhelming. My advice: start simple. Start with a table.
Why start simple?
The best evaluation frameworks aren’t the ones that collect the most data or produce the glossiest reports. They’re the ones that are useful – for you, your board, your grantees and the communities you ultimately serve.
Michael Quinn Patton, in his book Utilization-Focused Evaluation, puts it plainly: if evaluation isn’t used, or isn’t useful, then what’s the point?
A simple starting point ensures that your evaluation efforts are anchored in what matters most: who needs to know what, and why?
The four questions that matter
I recommend starting by building a simple table. It doesn’t require a strategy retreat, a six-month project or a consultant’s report. Just a blank page, some sticky notes or a spreadsheet will do.
Your column headings should be:
- Stakeholder
- What do they want to know?
- When do they want to know it?
- In what format?
This structure is simple but powerful:
- Stakeholder: Who are you trying to serve with this information? Your board? Funders? Grantees? Communities? Internal staff? List them here.
- What do they want to know? Be specific. Do they want to see if funds were spent as intended? Understand community outcomes? Hear stories of change? Identify one or two things each stakeholder wants to know about your programme and put each in a new row.
- When do they want to know it? Timing is everything. Does your board want quarterly dashboards? Do grantees want quick feedback after reporting? Do community members want annual open forums? Attach a timeframe to each “what”.
- In what format? People use information differently. A board might prefer charts or commentary (or a combination of both), while grantees may want a short call. Community members might need a plain-language flyer, a PowerPoint presentation, or a verbal account delivered during a webinar.
Once you’ve filled in this table, you’ll see a roadmap emerging.
An example in practice
Let’s say you’re funding a youth mental health programme. Drawing on the examples above, your table might look something like this at the start:

| Stakeholder | What do they want to know? | When do they want to know it? | In what format? |
|---|---|---|---|
| Board | Were funds spent as intended, and what outcomes are emerging? | Quarterly | One-page dashboard with charts and brief commentary |
| Grantees | Feedback on their reports and what’s working across the programme | Shortly after each reporting round | Short call or email summary |
| Community members | What the programme is achieving for local young people | Annually | Plain-language flyer and an open forum |
This table captures what really matters: who needs to know, what they need to know, when, and in what form. From there, you can see where to start collecting data, how to present it, and where the biggest gaps are. Most importantly, it keeps you honest about the purpose of evaluation: not to tick boxes, but to provide value to real people.
Adding detail to your evaluation framework
Once you’ve mapped your stakeholders and their needs, you can begin to layer in more detail:
- Indicators and measures: What data do you need to collect to answer each stakeholder’s questions?
- Systems: Where will you store this data? A grants management system such as SmartyGrants, a shared drive or a dashboard tool?
- Processes: Who will collect the data? How often? How will you ensure quality and consistency?
Resist the temptation to overcomplicate things. A strong foundation, built from your simple table, will save you from collecting irrelevant data.
Common pitfalls to avoid
- Collecting data because you can, not because it’s needed. Every data point should tie back to a real stakeholder question.
- Missing deadlines. Delivering a report three months late often means it won’t be used.
- Using the wrong format. A dense 40-page report is useless if your board only has time to read one page.
- Ignoring the end user. If evaluation isn’t useful to the people it’s meant to serve, it won’t be used.

Final thoughts
Grantmakers don’t need perfect evaluation systems; they need useful ones. Starting with a simple stakeholder table brings clarity to the chaos. It ensures your evaluation is grounded in real-world use and allows you to add complexity later without losing sight of the basics.
If you’re sitting at your desk staring at a mountain of reporting requirements and data possibilities, take a breath. Pull out a piece of paper. Write down the four column headings. Then start with the people who matter most.
That’s evaluation worth doing.
Author: Jen Riley is the co-chair of SIMNA and previously served as chief impact officer at SmartyGrants.
This help sheet was first published in Making Change Visible: A Practical Guide to Impact Measurement for Funders, a joint publication of SmartyGrants and Philanthropy Ireland.