Transparency Talk

Category: "Reports" (6 posts)

How "Going Public" Improves Evaluations
October 17, 2017

Edward Pauly is director of research and evaluation at The Wallace Foundation. This post is part of the Glasspockets #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

As foundations strive to be #OpenForGood and share key lessons from their grantees' work, a frequent question that arises is how foundations can balance the value of openness with concerns about potential risks.

Concerns about risk are particularly charged when it comes to evaluations. Those concerns include: possible reputational damage to grantees from a critical or less-than-positive evaluation; internal foundation staff disagreements with evaluators about the accomplishments and challenges of grantees they know well; and evaluators’ delays and complicated interpretations.

It therefore may seem counterintuitive to embrace – as The Wallace Foundation has – the idea of making evaluations public and distributing them widely. And one of the key reasons may be surprising: To get better and more useful evaluations.

The Wallace Foundation has found that high-quality evaluations – by which we mean independent, commissioned research that tackles questions that are important to the field – are often a powerful tool for improving policy and practice. We have also found that evaluations are notably improved in quality and utility by being publicly distributed.

Incentives for High Quality

A key reason is that public reports align the author's incentives with quality in several ways:

  • Evaluation research teams know that when their reports are public and widely distributed, they will be closely scrutinized and their reputation is on the line. Therefore, they do their highest quality work when it’s public.  In our experience, non-public reports are more likely than public reports to be weak in data use, loose in their analysis, and even a bit sloppy in their writing.  It is also noteworthy that some of the best evaluation teams insist on publishing their reports.
  • Evaluators also recognize that they benefit from the visibility of their public reports because visibility brings them more research opportunities – but only if their work is excellent, accessible and useful.
  • We see evaluators perk up when they focus on the audience their reports will reach. Gathering data and writing for a broad audience of practitioners and policymakers incentivizes evaluators to seek out and carefully consider the concerns of the audience: What information does the audience need in order to judge the value of the project being evaluated? What evidence will the intended audience find useful? How should the evaluation report be written so it will be accessible to the audience?

Making evaluations public is a classic case of a virtuous circle: public scrutiny creates incentives for high quality, accessibility and utility; high-quality reports lead to expanded, engaged audiences; and the circle turns again, as large audiences use evaluation lessons to strengthen their own work and demand more high-quality evaluations. To achieve these benefits, it's obviously essential for grantmakers to communicate upfront and thoroughly with grantees about the goals of a public evaluation report: sharing lessons that can benefit the entire field, presented in a way that avoids any hint of punitive or harsh messaging.

“What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work?”

Asking the Right Questions

A key difference between evaluations commissioned for internal use and evaluations designed to produce public reports for a broad audience lies in the questions they ask. Of course, for any evaluation or applied research project, a crucial precursor to success is getting the questions right. In many cases, internally-focused evaluations quite reasonably ask questions about the lessons for the foundation as a grantmaker. Evaluations for a broad audience of practitioners and policymakers, including the grantees themselves, typically ask a broader set of questions, often emphasizing lessons for the field on how an innovative program can be successfully implemented, what outcomes are likely, and what policies are likely to be supportive.

In shaping these efforts at Wallace as part of the overall design of initiatives, we have found that one of the most valuable initial steps is to ask field leaders: What is it that you don’t know, that if you knew it, would enable you to make important progress in your own work? This kind of listening can help a foundation get the questions right for an evaluation whose findings will be valued, and used, by field leaders and practitioners.

Knowledge at Work

For example, school district leaders interested in Wallace-supported "principal pipelines," which could help ensure a reliable supply of effective principals, wanted to know the costs of starting such pipelines and maintaining them over time. The result was a widely used RAND report that we commissioned, "What It Takes to Operate and Maintain Principal Pipelines: Costs and Other Resources." RAND found that costs are less than one half of 1% of districts' expenditures; the report also explained what drives costs and provided a very practical checklist of pipeline components that readers can customize and adapt to meet their local needs.

The RAND study is just one of many examples of how high-quality public evaluations can help grantees and the field.

Being #OpenForGood does not happen overnight, and managing an evaluation planned for wide public distribution isn't easy. The challenges start with getting the question right – and then selecting a high-performing evaluation team; allocating adequate resources for the evaluation; connecting the evaluators with grantees and obtaining relevant data; managing the inevitable and unpredictable bumps in the road; reviewing the draft report for accuracy and tone; allowing time for grantees to fact-check it; and preparing with grantees and the research team for the public release. Difficulties, like rocks on a path, crop up at each stage of the journey. Wallace has encountered all of these difficulties, and we don't always navigate them successfully. (Delays are a persistent issue for us.)

Since we believe that the knowledge we produce is a public good, publishing useful evaluation reports is worth the effort. Interest from the field is evidenced by 750,000 downloads last year from www.wallacefoundation.org and by a highly engaged public discourse about what works, what doesn't, why, and how – rather than the silence that often greets many internally-focused evaluations.

--Edward Pauly

No Moat Philanthropy Part 4: Beyond the Transactional
October 5, 2017

Jen Ford Reedy is President of the Bush Foundation. On the occasion of her fifth anniversary leading the foundation, she reflects on efforts undertaken to make the Bush Foundation more permeable. Because the strategies and tactics she shares can be inspiring and helpful for any grantmaker exploring ways to open up their grantmaking, we are devoting our blog space all week to the series. This is the fourth post in the five-part series.

We have a grantmaking model that is based on the belief that, if we do it right, we will create more good by what we inspire than by what we directly fund. Principles #4 and #5 of No Moat Philanthropy relate directly to this belief: they are about how connecting and sharing with others can advance your foundation's mission.

Principle #4: Value every interaction as an opportunity to advance your mission

Our tagline and our strategy are one and the same: We invest in great ideas and the people who power them. We know that the only way anything happens is through people. Any place or field, therefore, is limited by the ambitions and the skills of the people in it.

The Bush Fellowship has been a flagship program of the Foundation for decades. We hear repeatedly from Bush Fellows that the experience changed what they thought was possible in their life and career. With the Bush Fellows program as our source code, we’ve been working for the past five years to ensure that all of our programs have the same effect. How can we encourage people to think bigger and think differently? How can we be a force for optimism?

This notion of a foundation being a force for optimism is not an obvious one. After all, we mostly tell people no. Last year, 95 percent of people who applied for the Bush Fellowship did not receive one. We’ve worked diligently to make sure all applicant interactions with us are helpful and encouraging, regardless of grant or fellowship outcome. And our surveys suggest the work is paying off. For example, 79 percent of declined Bush Fellowship applicants said the process increased their beliefs that they can accomplish “a lot.”

“If we do grantmaking right, we will create more good by what we inspire than by what we directly fund.”

To have this impact with each applicant, we:

Operate hotlines to speak with Bush staff. For our open programs, we have established hotlines for potential applicants. We will speak with people as many times as they desire to provide coaching on their idea or proposal. For applicants, this is a way to clearly understand what we are looking for and to vet ideas with us. For Bush staff, this is a way to provide coaching and encouragement to strengthen proposals and to influence activities beyond those we fund.

Give feedback about declined applications. We offer feedback to declined applicants for our major grant and fellowship programs because we see this as another valuable opportunity to provide coaching and encouragement. We have also witnessed applicants using the feedback to improve their plans and proposals, which benefits both them and us. This two-way dialogue also allows applicants to share how we can improve the process for them.

Find ways to support declined applicants. In the course of our processes, we learn about far more amazing people and organizations than we can actually fund. Therefore, we try to find ways to be useful to more than just the limited number of accepted applicants. For example, we consider declined Bush Fellowship finalists to be part of our “Bush Network” and invite them to bushCONNECT. We also provide declined Bush Prize finalists with a $10,000 grant. In our hiring process, we offer unsuccessful finalists the chance to meet with our hiring consultant for an individual coaching session. In addition, across all our programs and operations, we try to craft our applications and our processes so that the experience of applying adds value to an applicant’s thinking and planning.

Every interaction is an opportunity to influence and be influenced.  Every interaction is an opportunity for shared learning. And that brings me to our fifth and final principle…

Principle #5: Share as you go.

In the past five years, we’ve been working to get more of what we are thinking — and learning — out to the community. This has required adjusting our standards and prioritizing just getting something out, even if it is not glossy and beautiful. It has required a new, shared understanding with grantees and Fellows that their reports and reflections will be public, so as many people as possible can benefit from their experience. It has required designing our internal work — like strategy documents for the Board — with external audiences in mind so they are ready to share.

We believe that if we do it right, we can have as much and potentially more impact from sharing the stories and spreading the lessons from our grantees and Fellows as from the investments themselves. This belief is at the heart of all our communications (see learning paper: “Communications as Program”) and is also reinforced with specific tactics such as:

“We potentially have more impact from sharing the stories and spreading the lessons from our grantees and Fellows.”

Post grantee reports on our website. We introduced "Learning Logs" to make grant reports public and, we hope, to give them life and utility beyond our walls. We refer prospective applicants to relevant Learning Logs as they craft their proposals, and we hear from applicants that they have indeed learned from them. Grantees and Fellows also share that they read one another's Learning Logs as a way to get new ideas for overcoming barriers.

Share lessons along the way. We publish learning papers (like this one) whenever we believe we have something useful to share. We intend this to lower the bar for who, when, and how we share. Our learning papers are not beautiful. Most of them are not based on statistically significant evaluation methodologies. They simply document a staff effort to process something we are working on and to share our reflections.

Tie evaluation to audience analysis. We invest heavily in external evaluations of our work, but in doing so we have found that the end-product is often only useful to our staff and key stakeholders. Consequently, we introduced a different approach to thinking about evaluation with a sharing mindset. We use a framework to identify the audiences who might care about or benefit from the lessons of an evaluation, what questions are relevant to each group, and what form or output would be most useful to them.

Webinar to the max. Webinars are not a particularly novel activity; however, we view them as a core tool of permeability. We host a webinar at the beginning of every application period for our Grant and Fellowship programs to explain the process and what we are looking for. We also host them when we have a job opening to discuss the role and what it is like to work here. We host them annually for our Foundation initiatives to explain what we are up to and where we are headed. Most webinars feature a staff presentation followed by an open Q&A, with videos archived on our website for anyone who missed them.

If you’ve been reading this series all week, you might be wondering when I’m going to get to the downsides of No Moat Philanthropy. All new approaches have their pain points.  So, come back tomorrow and I’ll share our pain and why we believe it is worth it.

--Jen Ford Reedy

How To Keep Me Scrolling Through What You Are Sharing
August 10, 2017

Tom Kelly is Vice President of Knowledge, Evaluation & Learning at the Hawai‘i Community Foundation. He has been learning and evaluating in philanthropy since the beginning of the century. @TomEval  TomEval.com

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new research and tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Hello, my name is Tom and I am a Subscriber. And a Tweeter, Follower, Forwarder (FYI!), Google Searcher, and DropBox Hoarder. I subscribe to blogs, feeds, e-newsletters, and email updates. My professional title includes the word "Knowledge," so I feel compelled to make sure I am keeping track of the high volume of data, information, reports, and ideas flowing throughout the nonprofit and foundation worlds (yes, it is a bit of a compulsion…and I am not even including my favorite travel, shopping and coupon alerts).

It is a lot and I confess I do not read all of it. It is a form of meditation for me to scroll through emails and Twitter feeds while waiting in line at Aloha Salads. I skim, I save, I forward, I retweet – I copy and save for later reading (later when?). In fact, no one can be expected to keep up, so how does anyone make sense of it all, or even find what we need when we need it? Everyone being #OpenForGood and sharing everything is great, but who is reading it all? And how do we make what we are opening for good actually good?

Making Knowledge Usable

We have all experienced at some point "drowning in information, starving for knowledge" (John Naisbitt's Megatrends…though I prefer E.O. Wilson's "starving for wisdom" formulation). The information may be out there, but rarely in a form that is easily found, read, understood, and, most importantly, used. Foundation Center and IssueLab have made it easier for people in the sector to know what is being funded, where new ideas are being tested, and what evidence and lessons are available. But nonprofits and foundations still have to upload and share many more of their documents than they do now. And we need to make sure that the information we share is readable, usable, and ready to be applied.


DataViz guru Stephanie Evergreen recently taught me a new hashtag: #TLDR – “Too Long, Didn’t Read.”

She now proposes that every published report be available in three formats – a one-page handout with key messages, a 3-page executive summary, and a 25-page report (plus appendices). In this way the “scanners,” “skimmers” and “deep divers” can access the information in the form they prefer and in the time they have. It also requires writing (and formatting) differently for each of these sets of eyes. (By the way, do you know which one you are?)

From Information to Influence

But it is not enough to make your reports accessible, searchable, and easily readable in short and long forms; you also need to include the information people need to make decisions and act. That means deciding in advance whom you want to inform and influence and what you want people to do with the information. You need to be clear about your purpose for sharing information, and you need to give people the right kinds of information if you expect them to read it, learn from it, and apply it.

“Give people the right kinds of information if you expect them to read it, learn from it, and apply it.”

Too many times I have read reports with promising findings or interesting lessons, then raced through all the footnotes and appendices at the back of the report looking for resources that could point me to the details of the evidence and data or to implementation guidance. I usually wind up trying to track down the authors by email or phone to follow up.

A 2005 study of more than 1,000 evaluations published in human services found only 22 well-designed and well-documented reports that shared any analysis of implementation factors – what lessons people learned about how best to put the program or services in place. We cannot expect other people and organizations to share knowledge and learn if they cannot access information from others that helps them use the knowledge and apply it in their own programs and organizations. YES, I want to hear about your lessons and “a-ha’s,” but I also want to see data and analysis of the common challenges that all nonprofits and foundations face:

  • How to apply and adapt program and practice models in different contexts
  • How to sustain effective practices
  • How to scale successful efforts to more people and communities

This means making sure that your evaluations and your reports include opening up the challenges of implementation – the same challenges others are likely to face. It also means placing your findings in the context of existing learning while also using similar definitions so that we can build on each other's knowledge. For example, in our recent middle school connectedness initiative, our evaluator, Learning for Action, reviewed the literature first to determine specific components and best practices of youth mentoring so that we could build the evaluation on what had come before, and then report clearly about what we learned about in-school mentoring and open up useful and comparable knowledge to the field.

So please plan ahead and define your knowledge sharing and influence agenda up front and consider the following guidelines:

  • Who needs to read your report?
  • What information does your report need to share to be useful and used?
  • Read and review similar studies and reports and determine in advance what additional knowledge is needed and what you will document and evaluate.
  • Use common definitions and program model frameworks so we are able to continually build on field knowledge and not create anew each time.
  • Pay attention to and evaluate implementation, replication and the management challenges (staffing, training, communication, adaptation) that others will face.
  • And disseminate widely and share at conferences, in journals, in your sector networks, and in IssueLab’s open repository.

And I will be very happy to read through your implementation lessons in your report’s footnotes and appendices next time I am in line for a salad.

--Tom Kelly

How to Make Grantee Reports #OpenForGood
July 20, 2017

Mandy Ellerton and Molly Matheson Gruen joined the [Archibald] Bush Foundation in 2011, where they created and now direct the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community challenges, using approaches that are collaborative and inclusive of people who are most directly affected by the problem. This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.

Mandy Ellerton

When we started working at the Bush Foundation in 2011, we encountered a machine we'd never seen before: the Lektriever. It's a giant machine that moves files around, kind of like a dry cleaner's clothes rack, and allows you to seriously pack in the paper. It's how the Bush Foundation, as a responsible grantmaker, had meticulously tracked and stored its files for posterity - in particular, grantee reports - for decades.

In 2013, the Bush Foundation had the privilege of moving to a new office. Mere days before we were to move into the new space, we got a frantic call from the new building’s management. It turned out that the Lektrievers (we actually had multiple giant filing machines!) were too heavy for the floor of the new building, which had to be reinforced with a number of steel plates to sustain their weight.

Molly Matheson Gruen

Even with all this extra engineering, we would still have to say goodbye to one of the machines altogether for the entire system to be structurally sound. We had decades of grantee stories, experiences and learning trapped in a huge machine in the inner sanctum of our office, up on the 25th floor.

The Lektrievers symbolized our opportunity to become more transparent and move beyond simply preserving our records, instead seeing them as relevant learning tools for current audiences. It was time to lighten the load and share this valuable information with the world.

Learning Logs Emerge

We developed our grantee learning log concept in the Community Innovation programs as one way to increase the Foundation's transparency. At the heart of it, our learning logs are a very simple concept: they are grantee reports, shared online. But, like many things that appear simple, once you pull on the string of change, the complexity reveals itself.

“Every Community Innovation project is an opportunity for others to learn and the learning logs are a platform to share this learning.”

Before we could save the reports from a life of oblivion in the Lektriever, build out the technology and slap the reports online, we needed to entirely rethink our approach to grantee reporting to create a process that was more mutually beneficial. First, we streamlined our grant accountability measures (assessing whether the grantees did what they said they’d do) by structuring them into a conversation with grantees, rather than as a part of the written reports. We’ve found that conducting these assessments in a conversation takes the pressure off and creates a space where grantees can be more candid, leading to increased trust and a stronger partnership.

Second, our grantee reports now focus on what grantees are learning in their grant-funded project. What’s working? What’s not? What would you do differently if you had it to do all over again? This new process resulted in reports that were more concise and to the point.

Finally, we redesigned our website to create a searchable mechanism for sharing these reports online. This involved linking our grant management system directly with our website so that when a grantee submits a report, we do a quick review and then the report automatically populates our website. We've also designed a way for grantees to designate select answers as private when they want to share sensitive information with us but not make it entirely public. We leave it up to grantee discretion, and those selected answers do not appear on the website. Grantees designate their answers to be private for a number of reasons, most often because they discuss sensitive situations having to do with specific people or partners – like when someone drops out of the project or when a disagreement with a partner holds up progress. And while we've been pleased at the candor of most of our grantees, some are still understandably reluctant to be publicly candid about failures or mistakes.
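
For readers curious about the mechanics, the sketch below shows one way such a privacy filter could work. This is a hypothetical illustration, not the Bush Foundation's actual system; the class names, field names, and sample data are all invented.

    # Hypothetical sketch: filter a grantee report before publishing it online.
    # Answers the grantee flags as private are withheld from the public view.
    from dataclasses import dataclass, field

    @dataclass
    class ReportAnswer:
        question: str
        answer: str
        private: bool = False  # set by the grantee for sensitive answers

    @dataclass
    class GranteeReport:
        grantee: str
        answers: list = field(default_factory=list)

    def publishable_view(report):
        """Return only the question/answer pairs the grantee left public."""
        return {
            "grantee": report.grantee,
            "answers": [
                {"question": a.question, "answer": a.answer}
                for a in report.answers
                if not a.private
            ],
        }

    # One flagged answer stays between grantee and funder; the rest goes online.
    report = GranteeReport(
        grantee="Example Collaborative",
        answers=[
            ReportAnswer("What's working?", "Community listening sessions."),
            ReportAnswer("What's not?", "A partner withdrew mid-project.", private=True),
        ],
    )
    print(publishable_view(report))

In a real integration, a filter like this would sit between the grant management system and the website's publishing step, but the rule is the same: anything designated private never leaves the funder's files.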

But why does this new approach to grantee reporting matter, besides making sure the floor doesn’t collapse beneath our Lektrievers?

The Bush Foundation's Lektriever, which stored decades of grantee reports. Credit: Bush Foundation

Learning Sees the Light of Day

Learning logs help bring grantee learning into the light of day, instead of hiding in the Lektrievers, so that more people can learn about what it really takes to solve problems. Our Community Innovation programs at the Bush Foundation fund and reward the process of innovation–the process of solving problems. Our grantees are addressing wildly different issues: from water quality to historical trauma, from economic development to prison reform. But, when you talk to our grantees, you see that they actually have a lot in common and a lot to learn from one another about effective problem-solving. And beyond our grantee pool, there are countless other organizations that want to engage their communities and work collaboratively to solve problems.  Every Community Innovation project is an opportunity for others to learn and the learning logs are a platform to share this learning, making it #OpenForGood.

We also want to honor our grantees’ time. Grantees spend a lot of time preparing grant reports for funders. And, in a best case scenario, a program officer reads the report and sends the grantee a response of some kind before the report is filed away. But, let’s be honest – sometimes even that doesn’t happen. The report process can be a burden on nonprofits and the only party to benefit is the funder. We hope that the learning logs help affirm to our grantees that they’re part of something bigger than themselves - that what they share matters to others who are doing similar work.

We also hear from our grantees that the reports provide a helpful, reflective process, especially when they fill them out together with collaborating partners. One grantee even said she'd like to fill out the report more often than we require, to have regular reflection moments with her team!

Learning from the Learning Logs

We only launched the learning logs last year, but we've already received some positive feedback. We've heard from both funded and non-funded organizations that the learning logs provide inspiration and practical advice for pursuing similar projects. A grantee recently shared a current challenge in their work that directly connected to work we knew another grantee had done and written about in their learning log. Because that knowledge was now out in the open, we were able to point them to the relevant learning log, extending one grantee's impact beyond their local community and using it to help advance another grantee's work.

Take, for example, the following quotes from our grantee reports:

  • The Minnesota Brain Injury Alliance's project worked on finding ways to better serve homeless people with brain injuries. They reflected that, "Taking the opportunity for reflection at various points in the process was very important in working toward innovation. Without reflection, we might not have been open to revising our plan and implementing new possibilities."
  • GROW South Dakota addressed a number of challenges facing rural South Dakota communities. They shared that, “Getting to conversations that matter requires careful preparation in terms of finding good questions and setting good ground rules for how the conversations will take place—making sure all voices are heard, and that people are listening for understanding and not involved in a debate.”
  • The People's Press Project engaged communities of color and disenfranchised communities to create a non-commercial, community-owned, low-powered radio station serving the Fargo-Moorhead area of North Dakota and Minnesota. They learned "quickly that simply inviting community members to a meeting or a training was not a type of outreach that was effective."

Like many foundations, we decline far more applications than we fund, and our limited funding can only help communities tackle so many problems. Our learning logs are one way to try to squeeze more impact out of those direct investments. By reading grantee learning logs, hopefully more people will be inspired to effectively solve problems in their communities.

We’re not planning to get rid of the Lektrievers anytime soon – they’re pretty retro cool and efficient. They contain important historical records and are incredibly useful for other kinds of record keeping, beyond grantee documentation. Plus, the floor hasn’t fallen in yet. But, as Bush Foundation Communications Director Dominick Washington put it, now we’re unleashing the knowledge, “getting it out of those cabinets, and to people who can use it.”

--Mandy Ellerton and Molly Matheson Gruen

Blind Spots No More: Introducing Transparency Trends
April 13, 2016

(Janet Camarena is director of transparency initiatives at Foundation Center.)

Janet Camarena

There are some lessons you learn that you never forget. "Mirror, signal, blind spot," is thankfully one of those lessons for me, dating all the way back to driver's ed when I was equal parts excited and horrified that someone was handing me the keys to a moving vehicle. I still recall the teacher emphasizing how important it is when changing lanes to first check the mirror for what is behind you; signal to let others know you are entering/exiting a lane; and then to check your blind spot, assuming there is someone invisible to you that only looking over your shoulder and out the window will reveal.

"The new Transparency Trends tool helps foundations benchmark openness."

So, is our new Glasspockets Transparency Trends tool a mirror, a signal, or a viewer for revealing blind spots a foundation may be creating? It actually serves all of these purposes. Transparency Trends, created with support from the Barr Foundation, aggregates the data we have collected from all foundations that have taken and publicly shared their "Who Has Glass Pockets?" self-assessment transparency profiles, and allows the user to interact with and display the data in a variety of ways.

The default view displays data about all 77 participating foundations, and users can perform a number of helpful transparency benchmarking activities with the tool, including:

  • Learn which transparency elements are most and least commonly shared online;
  • Access lists of which participating foundations share each transparency indicator;
  • Access statistics about the sharing frequency of each transparency element;
  • Compare a specific foundation to a select peer group by region/asset/foundation type; and
  • Download a customized report detailing suggested improvements for a particular foundation.

Some interesting facts quickly reveal both strengths and blind spots:

  • Nearly two-thirds of participating foundations provide searchable grants via their websites;
  • 87% of participating foundations provide key staff biographies;
  • Fewer than half of participating foundations post a Code of Conduct online;
  • Despite all of the talk about impact, only 22% of participating foundations share foundation performance assessments via their websites; and
  • Only 31% of participating foundations use their websites to collect grantee feedback.

The more I explore Transparency Trends, the more excited I become about the "Mirror, signal, blind spot" rule of the road as a metaphor for the importance of philanthropic transparency. After all, when you are handed the keys to a foundation, it's great if someone also hands you some institutional memory so you can have a view of the road travelled so far and what has been learned, so you can actually get somewhere rather than driving in circles.

And since there are likely others who are travelling a similar path, the notion of signaling to the world what direction you are going resonates as well, since you might get there faster (and more efficiently) via a pooled or shared ride approach, or by at least sharing your road maps and shortcuts.

And finally, are you and the others on the road actually creating blind spots that prevent those around you from knowing you exist and building on your shared efforts? From Transparency Trends, you can see that fewer than half of participating foundations have a Knowledge Center that shares the lessons they are learning, and only 12% have open licensing policies that make it clear how to build on the knowledge the foundation funds and produces.


As fun as it is to explore the data on the pinwheel display, don't miss the opportunity to download a customized report. The reports are particularly helpful for surfacing both the transparency blind spots and strengths a particular foundation might have, and Transparency Trends is accessible to any foundation, whether or not it has previously participated in Glasspockets.

So, if you have not submitted a profile to Glasspockets, you can still explore and extract helpful information from the tool by completing a short questionnaire about your existing transparency practices. The questionnaire will not be shared without your permission, but it will allow you to view your foundation as compared to others in our database.

A customized report from Transparency Trends

Our hope is that these reports will encourage greater foundation transparency by quickly surfacing data that identifies areas in which a foundation is behind its peers on specific transparency indicators. And foundations that have already participated get a shortcut: they skip the questionnaire and go directly to a report revealing their strengths and weaknesses, or areas where they may inadvertently be creating blind spots.

And speaking of blind spots, I have been thankful for the "Mirror, signal, blind spot" mantra many times when it has literally saved my life. I can recall several occasions when I ritually checked the blind spot, convinced it was empty, and only because I did the over-the-shoulder check did I avoid a collision. I'm reminded of this particular lesson at the launch of Transparency Trends because perhaps philanthropy needs a way to do the over-the-shoulder check as well. By visualizing both philanthropy's strengths and weaknesses when it comes to greater openness, we can collectively work toward a future with fewer blind spots, more awareness of those around us, and a clear view of what we have learned from the road travelled so far.

Explore Transparency Trends and let me know what you think.

-- Janet Camarena

Glasspockets Find: GEO Funders is using grantee feedback to reshape philanthropy
March 30, 2015

(Eliza Smith is the Special Projects Associate for Glasspockets at the Foundation Center-San Francisco.)

At the end of 2014, GEO Funders, a network of grantmakers that works to reshape the way philanthropy operates, released a report, Strengthening Relationships with Grantees. In the report, GEO Funders examines how grantmaking has improved over the last few years, and how it can continue to improve in the years to come.

GEO Funders gathered data from 637 nonprofits in order to determine best practices in building more effective relationships with grantees. According to the report, clear feedback mechanisms and listening to grantee feedback are core practices that set the stage for effective collaboration between grantmakers and grantees to ensure a better future for philanthropy.

Here are some of the report’s key findings:

  • 53% of funders now ask grantees for feedback. This number is up from 36% six years ago. An increasing number of grantmakers now regularly seek feedback from grantees and are creating more opportunities for grantee input to inform grantmaker strategy and practice.
  • A median of 25% of grant dollars now goes to general operating support. This number is up from 20% six years ago. Grantmakers increased the types of support most commonly associated with boosting nonprofit success, including general operating, multiyear and capacity-building support.
  • 76% of funders evaluate their work. Grantmakers evaluate their work, but most are not getting all that they could out of these efforts because the focus remains on internal uses.
  • 80% of funders say collaboration is important. Although grantmakers believe it is important to coordinate resources and actions with other funders to achieve greater impact, they are unlikely to support grantees to do the same.

You can read the report in its entirety here. But one of the best ways to dive in is the summary, which GEO Funders has published as an infographic, making the data-heavy report much easier to digest.

With this data in mind, how has your foundation helped to reimagine and reshape philanthropy? Have you enlisted feedback from your grantees to better your grantmaking process? Share with us in the space below.

--Eliza Smith


About Transparency Talk

Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations, illuminating the importance of having "glass pockets."

The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

Questions and comments may be directed to:

Janet Camarena
Director, Transparency Initiatives
Foundation Center

If you are interested in being a guest contributor, contact: glasspockets@foundationcenter.org
