Transparency Talk

Category: "Data" (135 posts)

Candid Announces Inaugural #OpenForGood Award Winners
May 30, 2019

Janet Camarena is director of transparency initiatives at Candid.

This post is part of the GlassPockets #OpenForGood series, produced in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up what they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

#OpenForGood Awardees and Committee Members. Left to Right: Meg Long, President, Equal Measure (#OpenForGood selection committee); Janet Camarena, Director, Transparency Initiatives, Candid; Awardee Savi Mull, Senior Evaluation Manager, C&A Foundation; Awardee Veronica Olazabal, Director, Measurement, Evaluation & Organizational Performance, The Rockefeller Foundation; Clare Nolan, Co-Founder, Engage R + D (#OpenForGood selection committee).

Yesterday, as part of the Grantmakers for Effective Organizations Learning Conference, Candid announced the inaugural recipients of the #OpenForGood Award, which is designed to recognize and encourage foundations to openly share what they learn so we can all get collectively smarter. The award is part of a larger #OpenForGood campaign, started in 2017, that includes a set of tools to help funders work more transparently: a GrantCraft guide on how to operationalize knowledge sharing, a growing collection of foundation evaluations on IssueLab, and advice from peers in a curated blog series.

The three winning foundations each demonstrate an active commitment to open knowledge and share their evaluations through IssueLab, an open repository that is free, searchable, and accessible to all. Winners were selected by an external committee through a globally sourced nomination process; the committee reviewed the contenders looking for evidence of an active commitment to open knowledge, creative approaches to making knowledge shareable, field leadership, and the incorporation of community insights into knowledge-sharing work.

And the Winners Are…

Here are some highlights from the award presentation remarks:

C&A Foundation
Award Summary: Creativity, Demonstrated Field Leadership, and Willingness to Openly Share Struggles

The C&A Foundation is a multi-national, corporate foundation working to fundamentally transform the fashion industry. C&A Foundation gives its partners financial support, expertise, and networks so they can make the fashion industry work better for every person it touches. Lessons learned and impact for each of its programs are clearly available on its website, and helpful top-level summaries are provided for every impact evaluation, making lengthy narrative evaluations accessible to peers, grantees, and other stakeholders. C&A Foundation even provides such summaries for efforts that didn’t go as planned, packaging them in an easy-to-read, graphic format that it shares via its Results & Learning blog, rather than hiding them away and quietly moving on as is more often the case in the field.

The Ian Potter Foundation
Award Summary: Creativity, Field Leadership, and Lifting Up Community Insights

This foundation routinely publishes collective summaries from all of its grantee reports for each portfolio as a way to support shared learning among its existing and future grantees. It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than questions that satisfy the typical grant report ritual: submit, enter data, file away never to be seen, and repeat.

Beyond being transparent with its grantee learning and reports, the Ian Potter Foundation also recently helped lift the measurement and outcomes burden on its grantees. Instead of asking overworked charities to invent a unique set of metrics just for their grant process, foundation evaluation staff took it upon themselves to mine the Sustainable Development Goals targets framework to provide grantees with optional, ready-made outcomes templates that would work across the field for many funders. You can read more about that effort in a recent blog post here.

The Rockefeller Foundation
Award Summary: Field Leadership, Consistent Knowledge Sharing, and Commitment to Working Transparently

The Rockefeller Foundation can boast early-adopter status when it comes to transparency and openness. It has a longstanding commitment to creating a culture of learning and, as such, was one of the very first foundations to join the GlassPockets transparency movement and to commit to #OpenForGood principles by sharing its published evaluations widely. The Rockefeller Foundation also took the unusual step of upping the ante on the #OpenForGood pledge, aiming for both a culture of learning and accountability, with its monitoring and evaluation team stating: “To ensure that we hold ourselves to a high bar, our foundation pre-commits itself to publicly sharing the results of its evaluations - well before the results are even known.” This ensures that even if an evaluation reports unfavorable findings, the intent is to share it all.

In an earlier GlassPockets blog post, Rockefeller’s monitoring and evaluation team shows a unique understanding of how sharing knowledge can advance the funder’s goals: “Through the documentation of what works, for who, and how/under what conditions, there is potential to amplify our impact, by crowding-in other funders to promising solutions, and diverting resources from being wasted on approaches that prove ineffectual.” Rockefeller’s use of IssueLab’s open knowledge platform is living up to this promise, as anyone can currently query and find more than 400 knowledge documents funded, published, or co-published by the Rockefeller Foundation.

Winners will receive technical support to create a custom Knowledge Center for their foundation or for a grantee organization, as well as promotional support in knowledge dissemination. Knowledge Centers are a service of IssueLab that provides organizations with a simple way to manage and share knowledge on their own websites. By leveraging this tool, you can showcase your insights, promote analysis produced by your grantees, and feature learning from network members. All documents uploaded to an IssueLab Knowledge Center are also made searchable and discoverable via systems like WorldCat, which serves more than 2,000 libraries worldwide, ensuring your knowledge can be found by researchers regardless of their familiarity with your organization.

Why Choose Openness?

The #OpenForGood award is focused on inspiring foundations to use existing and emerging technologies to collectively improve the sector. Today, we live in a time when most people expect to find the information they need on the go, via tablets, laptops, and mobile phones, just a swipe or click away. Despite this digital-era reality, today only 13 percent of foundations have websites, and even fewer share their reports publicly, indicating that the field has a long way to go toward creating a culture of shared learning. With this award, we hope to change these practices. Rather than reinvent the wheel, this award and campaign encourage the sector to make it a priority to learn from one another and share content with a global audience, so that we can build smartly on one another’s work and accelerate the change we want to see in the world. The more you share your foundation’s work, the greater the opportunities to make all our efforts more effective and farther reaching.

Congratulations to our inaugural class of #OpenForGood Award Winners! What will you #OpenForGood?

--Janet Camarena

Transparency: One Small Step for Funders, One Giant Leap for Equity
May 9, 2019

Genevieve Boutilier is a Program Associate at the Peace and Security Funders Group.

This post also appears in the Alliance blog.

Genevieve Boutilier

In order to solve a problem, one must first identify its parameters. This applies, too, to the philanthropic sector; to that end, many of us are pushing for greater transparency in our field. For example, Candid teamed up with a hundred foundations to make public their grants data, assets, policies, and procedures through the GlassPockets initiative, while our funder affinity group colleagues at PEAK Grantmaking and the Transparency and Accountability Initiative advocate for greater transparency with their members. At the Peace and Security Funders Group, we push for transparency through our Peace and Security Funding Index.

For the past five years, the Index has chronicled thousands of grants awarded by hundreds of peace and security funders to get a better sense of who and what gets funded in this sector. This data is useful for understanding the landscape of peace and security funding, including by identifying funding gaps and new funders; however, it has its limits. In the hot-off-the-press 2019 Index, we make the case for how improving this data benefits funders. But beyond benefitting funders, improving the data greatly benefits grantees and the communities they serve, which – in a virtuous cycle – increases funder effectiveness.

On the most basic level, better data gives grantseekers insight into a foundation’s priorities. This allows grantees to more easily identify foundations with similar missions, making space for grantees to spend less time fundraising and more time focusing on their missions – be it fighting for indigenous rights, preventing nuclear war, or helping child soldiers reintegrate into their communities. This opens the door for more open, honest, and equitable relationships between foundations and the grantees they support, which is essential for impactful grantmaking.

But simply understanding who and what gets funded is only the start of the conversation. It’s time to take the conversation to the next level.

By definition, peace and security funders decide who gets a chance at peace by how they award grants. They are the guardians of crucial resources and enormous wealth, and they get to decide how much, how, and when it’s allocated. This is an incredible amount of power. With this power comes the responsibility to engage in the work in ways that center the needs of communities on the frontlines of some of the globe’s greatest challenges.

With timely, more detailed data, this sector can start to answer the tough questions that experts like Edgar Villanueva and Vu Le have been asking: Why are certain regions, issues, and strategies underfunded? Why are certain populations prioritized over others? Why isn't awarding general operating support increasing, especially given the ample evidence that suggests that it’s a best practice? Why are certain kinds of grantees passed over for funding?

“We aren’t collecting data for data’s sake—we’re hoping to transform this sector for the better.”

For our part, we aren’t collecting data for data’s sake—we’re hoping to transform this sector for the better.

To this end, we encourage all funders to start asking the tough questions about their grantmaking, and to increase their knowledge and understanding of equity in the philanthropic sector. Funders can begin to do this in three straightforward ways. First, submit detailed data about your grantmaking to Candid. We at the Peace and Security Funders Group (PSFG) are encouraging our 59 members – who represent a vast majority of the funding in the peace and security field – to submit their detailed 2018 grants data by June 30, 2019, so that we can improve the utility of the Peace and Security Funding Index. Second, funders can join their peers – including a handful of PSFG members – in becoming members of the Justice Funders network; here, they can listen and learn from each other and experts. Finally, funders should assess their own grantmaking practices. Ask yourself, ‘How could I change grantmaking practices to become more transparent and more equitable?’

There are countless other resources to help funders engage, so if you’re stuck and not sure where to go, we at PSFG can try and point you in the right direction.

--Genevieve Boutilier

Don’t “Ghost” Declined Applicants: The Ins and Outs of Giving Applicant Feedback
April 4, 2019

Mandy Ellerton joined the [Archibald] Bush Foundation in 2011, where she created and now directs the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community problems, using approaches that are collaborative and inclusive of people who are most directly affected by the problem.

GlassPockets Road to 100

This post is part of our “Road to 100 & Beyond” series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the “Who Has GlassPockets?” self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations over time, promising practices in transparency, helpful examples, and lessons learned.

I’ve often thought that fundraising can be as bad as dating. (Kudos to you lucky few who have had great experiences dating!) Lots of dates, lots of dead ends, lots of frustrating encounters before you (maybe) find a match. All along the way you look for even the smallest sign to indicate that someone likes you. “They laughed at my joke!” or, in the case of fundraising, “they seemed really excited about page five of last year’s impact report!” Not to mention the endless time spent doing online searches for shreds of information that might be useful. This reality is part of the reason why Bush Foundation was proud to be among the first 100 foundations to participate in GlassPockets. We believe that transparency and opening lines of communication are critical to good grantmaking, because both in dating and in fundraising, it can be heartbreaking and crazymaking to try to sort out whether you have a connection or if someone’s “just not that into you.” If only there were a way to just “swipe left” or “swipe right” and make everything a little simpler.

“We believe that transparency and opening lines of communication are critical to good grantmaking.”

I’m not proposing a Tinder for grantmaking (nor should anyone, probably, although hat tip to Vu Le for messing with all of us and floating the idea on April Fool’s Day). But over the past several years, Bush Foundation’s Community Innovation program staff has used a system to provide feedback calls for declined applicants, in the hopes of making foundation fundraising a little less opaque and crazymaking. We use the calls to be transparent and explain why we made our funding decisions. The calls also help us live out our “Spread Optimism” value because they allow us to help and encourage applicants and potentially point them to other resources. This is all part of our larger engagement strategy, described in “No Moat Philanthropy.”

 

Mandy Ellerton

How Feedback Calls Work

We use a systematic approach for feedback calls:

  • We proactively offer the opportunity to sign up for feedback calls in the email we send to declined applicants.
  • We use a scheduling tool (after trying a couple of different options, we’ve landed on Slotted, which is relatively cheap and easy to use) and offer a variety of times for feedback calls every week. Collectively, five Community Innovation Team members hold about an hour a week for feedback calls, and the calls typically last about 20 minutes. We’ve found this is about the right amount of time so that we can offer feedback calls to most of the declined applicants who want them.
  • We prepare for our feedback calls. We re-read the application and develop an outline for the call ahead of time.
  • During the call we offer a couple of reasons why we declined the application. We often discuss what an applicant could work on to strengthen their project and whether they ought to apply again.
  • We also spend a lot of time listening; sometimes these calls can understandably be emotional. Grant applications are a representation of someone’s hopes and dreams and sometimes your decline might feel like the end of the road for the applicant. But hang with them. Don’t get defensive. However hard it might feel for you, it’s a lot harder for the declined applicant. And ultimately, hard conversations can be transformative for everyone involved. I will say, however, that most of our feedback calls are really positive exchanges.
  • We use anonymous surveys to evaluate what people think of the feedback calls, and during each call we ask whether the applicant has any feedback for us on improving our programs and grantmaking process.
  • We train new staff on how to do feedback calls. We have a staff instruction manual on how to do feedback calls, but we also have new team members shadow more seasoned team members for a while before they do a feedback call alone.

 

What’s Going Well

The feedback calls appear to be useful for both declined applicants and for us:

  • In our 2018 surveys, respondents (n=38) rated the feedback calls highly. They gave the calls an average rating of 6.1 (out of 7) for overall helpfulness, 95% said the calls added some value or a lot of value, and 81.2% said they had a somewhat better or much better understanding of the programs after the feedback call.
  • We’ve seen the number of applications for our Community Innovation Grant and Bush Prize for Community Innovation programs go down over time and we’ve seen the overall quality go up. We think that’s due, in part, to feedback calls that help applicants decide whether to apply again and that help applicants improve their projects to become a better fit for funding in the future.
  • I’d also like to think that doing feedback calls has made us better grantmakers. First, it shows up in our selection meetings. When you might have to talk to someone about why you made the funding decision you did, you’re going to be even more thoughtful in making the decision in the first place. You’re going to hew even closer to your stated criteria and treat the decision with care. We regularly discuss what feedback we plan to give to declined applicants in the actual selection meeting. Second, in a system that has inherently huge power differentials (foundations have all of it and applicants have virtually none of it), doing feedback calls forces you to come face to face with that reality. Never confronting the fact that your funding decisions impact real people with hopes and dreams is a part of what corrupts philanthropy. Feedback calls keep you a little more humble.

 

What We’re Working On

We still have room to improve our feedback calls:

  • We’ve heard from declined applicants that they sometimes get conflicting feedback from different team members when they apply (and get declined) multiple times; 15% of survey respondents said their feedback was inconsistent with prior feedback from us. Cringe. That definitely makes fundraising more crazymaking. We’re working on how to have more staff continuity with applicants who have applied multiple times.
  • We sometimes struggle to determine how long to keep encouraging a declined applicant to improve their project for future applications versus saying more definitively that the project is not a fit. Yes, we want to “Spread Optimism,” but although it never feels good for anyone involved, sometimes the best course of action is to encourage an applicant to seek funding elsewhere.

I’m under no illusions that feedback calls are going to fix the structural issues with philanthropy and fundraising. I welcome that larger conversation, driven in large part by brave critiques of philanthropy emerging lately like Decolonizing Wealth, Just Giving and Winners Take All. In the meantime, fundraising, as with dating, is still going to have moments of heartache and uncertainty. When you apply for a grant, you have to be brave and vulnerable; you’re putting your hopes and dreams out into a really confusing and opaque system that’s going to judge them, perhaps support them, or perhaps dash them, and maybe even “ghost” them by never responding. Feedback calls are one way to treat those hopes and dreams with a bit more care.

--Mandy Ellerton

GlassPockets Announces New Transparency Levels: Leveling Up Your Practices
March 28, 2019

Janet Camarena is director of transparency initiatives at Candid.

Janet Camarena

It's an exciting moment for us here at GlassPockets, and for the field of philanthropy, as we’ve just reached the milestone of 100 foundations committing to work more transparently by participating and publicly sharing their “Who Has GlassPockets?” transparency self-assessment profiles on our website. Yesterday, the Walton Family Foundation (WFF) officially became our 100th participant. What you are seeing today is the result of a diligent process that started last summer, as WFF continually worked to improve the openness of its website. With clear pathways to connect directly with staff members, a knowledge center containing lessons learned as well as packaged “flashcards” containing easily shareable bits of information, and a new searchable grants database spanning its 31-year history, WFF is not starting small when it comes to openness. Transparency can be tricky territory for family foundation donors who may be more accustomed to privacy and anonymity when it comes to their giving, so it’s particularly exciting for us to reach the milestone of 100 published profiles thanks to a family foundation enthusiastically embracing a more transparent approach.

When we started with a handful of foundations and fewer than two dozen transparency indicators, it was more experiment than movement. Now that we’ve aggregated data on transparency trends among 100 participating foundations, it’s a good opportunity to pause and reflect on what we are learning from this data that could inform the way forward to a more transparent future for philanthropy.

Transparency Indicators Evolve

GlassPockets Road to 100

Earlier this year I observed that a promising trend in the field is that more foundations are developing sections of their websites devoted to explaining how they work, what values they hold dear, and in some cases, how these values inform their work and operations. Among the 100 foundations that have taken and publicly shared their transparency assessments, 42 percent are now using their websites as a means to communicate values or policies that demonstrate an intentional commitment to transparency. As a result, we recently added transparency values/policies as a formal indicator to our GlassPockets assessment. But once a foundation has developed such a values or policy statement, how does it live up to it?

That’s where we hope our “Who Has GlassPockets?” assessment will continue to help foundations create a roadmap to transparency. The assessment is not static and has evolved with the field. When we started in 2010, there were 23 transparency indicators based on an inventory of thousands of foundation websites. As we continue to observe website transparency trends, the assessment has now grown to 27 indicators. Aside from the newest indicator for transparency values/policies, other indicators added since inception, based on the kinds of information foundations are now starting to share, include strategic plans, open licensing policies, and use of the Sustainable Development Goals (SDGs) framework. And we expect that as the field continues to evolve, this list of indicators will grow as well.

As the list has grown longer, foundations frequently ask us which indicators are the right ones to start with. Some also tell us that they want to participate, but not until they have at least half or even three-quarters of the indicators on the list. Though we applaud striving to be more transparent, the intent of GlassPockets was never that it be a “one-size-fits-all” approach, nor did we expect a majority of the indicators to be in place in order to participate. Rather, the intent is that the GlassPockets exercise surface transparency as a priority, help the foundation evolve its openness over time, and ideally become a process the institution revisits on a regular basis, updating its GlassPockets profile with more and more indicators as transparency improves.

New Transparency Levels and Badges

So to help foundations better understand how to get started and how to grow transparency practices over time, we analyzed the data we have been collecting, and some patterns about how transparency evolves in philanthropy are now becoming clearer. We also conducted advisor interviews with a number of GlassPockets participants to better understand what would be most motivational and helpful in this regard. After reviewing everything we’ve learned so far, we have identified three levels through which foundations pass as they chart their course to greater transparency – these represent core, advanced, and champion-level transparency practices that you can view on this chart.

Explore how the Transparency Indicators relate to each level

Core-level transparency practices represent data most commonly shared by participating foundations and are the best place for new participants to begin. Advanced-level transparency practices open up the way you work to the world and represent information shared by about 50 to 70 percent of participating foundations. Champion-level transparency practices, in place at fewer than half of participating foundations, represent information-sharing that is pushing existing boundaries of foundation transparency.

These new levels represent an optional guide that can be helpful to follow, but they are not intended to be viewed as a formal set of requirements. As has always been the case, any foundation at any stage of its transparency journey is welcome to participate and chart its own course. However, to motivate participation and progress, GlassPockets will begin awarding Transparency Badges based on the transparency level attained. These badges will appear on the GlassPockets profile and will also be made available for use on the foundation’s website. Since it is not a one-size-fits-all system, all participating foundations will automatically receive the Core GlassPockets transparency badge, and those that attain the Advanced (10-18 indicators) or Champion (19 or more indicators) level will receive a badge denoting the appropriate designation.
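To make the badge thresholds concrete, here is a minimal sketch in Python of how a badge level could be derived from the number of indicators a participating foundation shares. The function name and standalone script are illustrative assumptions rather than any official GlassPockets tool; the cutoffs simply mirror the levels described above.

```python
# Illustrative sketch only: maps an indicator count to the badge levels
# described above (Core for all participants, Advanced at 10-18 indicators,
# Champion at 19 or more). Not an official GlassPockets tool.

def transparency_badge(indicator_count: int) -> str:
    """Return the badge level for a participating foundation."""
    if indicator_count >= 19:
        return "Champion"
    if indicator_count >= 10:
        return "Advanced"
    return "Core"  # every participating foundation receives at least Core

# Example: a foundation sharing 14 of the 27 current indicators
print(transparency_badge(14))  # -> "Advanced"
```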

Learn About the Transparency Badges

On the Level

Based on the new levels described above, GlassPockets will soon be adding the new Transparency Badges to each profile. So, if it’s been a while since you reviewed your “Who Has GlassPockets?” profile, or if you’re looking for motivation to improve your transparency, now’s the time to review your existing profile, or submit a new one to see how your foundation stacks up. For existing GlassPockets participants, May 28th is the deadline to review your profile and get any updates or changes in to us before we start making the transparency levels and badges visible on the GlassPockets website the week of June 3rd. To update your profile, you can fill out any new links or corrections on this submission form, or simply email me your changes. As always, new profiles can be added at any time and you can learn more about that process here.

And last, but certainly not least, big thanks and cheers to our existing GlassPockets participants for helping us reach this milestone, and a big welcome to those who will help us reach the next one!

-- Janet Camarena

A New Year, a New Transparency Indicator: Coming Soon—Transparency Values & Policies
January 3, 2019

Janet Camarena is director of transparency initiatives at Foundation Center.

When GlassPockets started nine years ago, it was rare to find any reference to transparency in relation to philanthropy or foundations. Most references to transparency at the time concerned nonprofits or governments, seldom philanthropy. When we set out to create a framework to assess foundation transparency, the “Who Has GlassPockets?” criteria were based on an inventory of current foundation practices, meaning there were no indicators on the list that were not being shared somewhere by at least a few foundations. Not surprisingly, given the lack of emphasis on foundation transparency, there were few mentions of it as a policy or even as a value in the websites we reviewed, so it didn’t make sense at the time to include it as a formal indicator.

GlassPockets Road to 100

A lot has changed in nine years, and it’s clear now from reviewing philanthropy journals, conferences, and, yes, even foundation websites that awareness of the importance of philanthropic transparency is on the rise. Among the nearly 100 foundations that have taken and publicly shared “Who Has GlassPockets?” transparency assessments, more than 40 percent are now using their websites as a means to communicate values or policies that aim to demonstrate an intentional commitment to transparency. Another encouraging signal, demonstrating that how the work is done is as important as what is done, is that many of these foundations now publish “How We Work” pages outlining not just what they do, but how they aim to go about it. These statements can be found among funders of all types, including large, small, family, and independent foundations.

We want to encourage this intentionality around transparency, so in 2019 we are adding a new transparency indicator asking whether participating foundations have publicly shared values or policies committing themselves to working openly and transparently. In late January, the “Who Has GlassPockets?” self-assessment and profiles will be updated to reflect the new addition. Does your foundation’s website have stated values or policies about its commitment to transparency? If not, below are some samples we have found that may serve as inspiration for others:

  • The Barr Foundation’s “How We Work” page leads with an ethos stating “We strive to be transparent, foster open communication, and build constructive relationships.” It elaborates further on its field-building potential: “We aim to be open and transparent about our work and to contribute to broader efforts that promote and advance the field of philanthropy.”

  • The Samuel N. and Mary Castle Foundation’s Mission and Core Values page articulates a long list of values that “emerge from the Foundation’s long history,” including a commitment to forming strategic alliances, working honestly, “showing compassion and mutual respect among grantmakers and grantees,” and ties its focus on transparency to a commitment to high standards and quality: “The Foundation strives for high quality in everything it does so that the Foundation is synonymous with quality, transparency and responsiveness.”

  • The Ford Foundation’s statement connects its transparency focus to culture, values around debate and collaboration, and a commitment to accountability: “Our culture is driven by trust, constructive debate, and leadership that empowers innovation and excellence. We strive to listen and learn and to model openness and transparency. We are accountable to each other at the foundation, to our charter, to our sector, to the organizations we support, and to society at large—as well as to the laws that govern our nonprofit status.”

  • An excerpt from the Bill and Melinda Gates Foundation’s “Information Sharing Approach” page emphasizes collaboration and peer learning, and offers an appropriately global view: “Around the world, institutions are maximizing their impact by becoming increasingly transparent. This follows a fundamental truth: that access to information and data fosters effective collaboration. At the foundation, we are embracing this reality through a continued commitment to search for opportunities that will help others understand our priorities better and what supports our decision making. The foundation is also committed to helping the philanthropic sector develop the tools that will increase confidence in our collective ability to address tough challenges around the world… We will continually refine our approach to information sharing by regularly exploring how we increase access to important information within the foundation, while studying other institutional efforts at transparency to learn lessons from our partners and peers.”

  • The Walter and Elise Haas Fund connects its transparency focus to its mission statement, and its transparency-related activities to greater effectiveness: “Our ongoing commitment to transparency is a reflection of our mission — to build a healthy, just, and vibrant society in which people feel connected to and responsible for their community. The Walter & Elise Haas Fund shares real-time grants data and champions cross-sector work and community cooperation. Our grantmaking leverages partnerships and collaborations to produce results that no single actor could accomplish alone.”

  • The William and Flora Hewlett Foundation’s statement emphasizes the importance of transparency in creating a culture of learning: “The foundation is committed to openness, transparency and learning. While individually important, our commitments to openness, transparency, and learning jointly express values that are vital to our work. Because our operations—both internal and external—are situated in complex institutional and cultural environments, we cannot achieve our goals without being an adaptive, learning organization. And we cannot be such an organization unless we are open and transparent: willing to encourage debate and dissent, both within and without the foundation; ready to share what we learn with the field and broader public; eager to hear from and listen to others. These qualities of openness to learning and willingness to adjust are equally important for both external grantmaking and internal administration.”

These are just a few of the examples GlassPockets will have available when the new indicator is added later this month. Keep an eye on our Twitter feed for updates.

Happy New Year, Happy New Transparency Indicator!

--Janet Camarena

Living Our Values: Gauging a Foundation’s Commitment to Diversity, Equity, and Inclusion
November 29, 2018

Mona Jhawar serves as learning and evaluation manager for The California Endowment.

The California Endowment (TCE) recently wrapped up our 2016 Diversity, Equity, and Inclusion (DEI) Audit, our fourth since 2008. The audit was initially developed at a time when community advocates were pushing the foundation to address issues of structural racism and inequity. As TCE’s grantmaking responded, staff and our CEO were also interested in promoting DEI values across the entire foundation beyond programmatic spaces. Over time, these values became increasingly engrained in TCE’s ethos and the foundation committed to conducting a regular audit as a vehicle with which to determine if and how our DEI values were guiding organizational practice.

Sharing information about our DEI Audit often raises questions about how to launch such an effort. Some colleagues are in the early stages of considering whether they want to carry out an audit of their own. Are we ready? What do we need to have in place to even begin to broach this possibility? Others are interested to hear about how we use the findings from such an assessment. To help answer these questions, this is the first of a two-part blog series to share the lessons we’re learning by using a DEI audit to hold ourselves accountable to our values.

While the audit provides a frame to identify if our DEI values are being expressed throughout the foundation, it also fosters learning. Findings are reviewed and discussed with executive leadership, board, and staff. Reviews provide venues to involve both programmatic and non-programmatic staff in DEI discussions. An audit workgroup typically considers how to take action on findings so that the foundation can continuously improve and also considers how to revise audit goals to ensure forward movement. By sharing findings publicly, we hope our experience and lessons can help to support the field more broadly.

It is, however, no small feat. The audit is a comprehensive process that includes a demographic survey of staff and board, a staff and board survey of DEI attitudes and beliefs, interviews with key foundation leaders, an examination of available demographic data from grantee partners, and a review of DEI-related documents gathered between audits. Having dedicated resources to engage a neutral outsider to carry out the audit in partnership with the foundation is also important to this process. We’ve found it particularly helpful to engage with a consistent, trusted partner, Social Policy Research Associates, over each of our audits to capture and candidly reflect where we’re making progress and where we need to work harder to create change.

As your foundation considers its own readiness to engage in such an audit process, we offer the following factors that have facilitated a productive and learning-oriented DEI audit effort at TCE:

1. Clarity about the fundamental importance of Diversity, Equity, and Inclusion to the Foundation

The expression of our DEI values has evolved over time. When the audit started, several program staff members who focused on DEI and cultural competency developed a guiding statement on Diversity and Inclusiveness. Located within our audit report, it focused heavily on diversity although tweaks were made to the statement over time. A significant shift occurred several years ago when our executive team articulated a comprehensive set of core values that undergirds all our work and leads with a commitment to diversity, equity, and inclusion.

2. Interest in reflection and adaptation

The audit is a tool for organizational learning that facilitates continuous improvement. The process relies on having both a growth mindset and clear goals for what we hope to accomplish. Our 13 goals range from board engagement to utilizing accessibility best practices. In addition to examining our own goals, the audit shares how we’re doing with respect to a framework of institutional supports required to build a culture of equity. By comparing the foundation to itself over time we can determine if and where change is occurring. It also allows us to revise goals so that we can continue to push ourselves forward as we improve, or to course correct if we’re not on track. We anticipate updating our goals before our next audit to reflect where we are currently in our DEI journey.

3. Engagement of key leaders, including staff

Our CEO is vocal and clear about the importance of DEI internally and externally, as well as about the significance of conducting the audit itself. Our executive team, board, and CEO all contribute to the audit process and are actively interested in reviewing and discussing its findings.

Staff engagement is critical throughout audit implementation, reflection on findings, and action planning as well. It’s notable that the vast majority of staff at all levels feel comfortable pushing the foundation to stay accountable to DEI internally. However, there is a small but growing percentage of staff (23%) who report feeling uncomfortable raising DEI concerns in the workplace, suggesting an area for greater attention.

4. Capacity to respond to any findings

Findings are not always going to be comfortable. Identifying areas for improvement may put the organization and our leaders in tough places. TCE has historically convened a cross departmental workgroup to consider audit findings and tackle action planning. We considered co-locating the audit workgroup within our executive leadership team to increase the group’s capacity to address audit findings. However, now we are considering whether it would be best situated and aligned within an emerging body that will be specifically focused on bringing racial equity to the center of all our work.

5. Courage and will to repeat

In a sector with limited accountability, choosing to voluntarily and publicly examine foundation practices takes real commitment and courage. It’s always great to hear where we’re doing well, but committing to a process that also raises multiple areas where we need to put more attention requires deep will to repeat on a regular basis. And we do so in recognition that this is long-term, ongoing work that, in the absence of a real finish line, requires us to continuously adapt as our communities evolve.

Conducting our DEI audit regularly has strengthened our sense of where our practice excels—for example, in our grantmaking, our strong vision and authorizing environment, and the diversity of our staff and board. It has also strengthened our sense of the ways we want to improve, such as developing a more widely shared DEI analysis and trainings for all staff, as well as continuing to strengthen data collection among our partners. The value of our DEI audit lies equally in considering findings and in serving as a springboard for prioritizing action. TCE has been on this road a long time and we’ll keep at it for the foreseeable future. As our understanding of what it takes to pursue diversity, equity, and inclusion internally and externally sharpens, so will the demands on our practice. Our DEI audit will continue to ensure that we hold ourselves to these demands. In my next post, we’ll take a closer look at what we’re learning about operationalizing equity within the foundation.

--Mona Jhawar

What Does It Take to Shift to a Learning Culture in Philanthropy?
November 20, 2018

Janet Camarena is director of transparency initiatives at Foundation Center.

This post also appears in the Center for Effective Philanthropy blog.

If there were ever any doubt that greater openness and transparency could benefit organized philanthropy, a new report from the Center for Effective Philanthropy (CEP) about knowledge-sharing practices puts it to rest. Besides making a case for the need for greater transparency in the field, the report also provides some hopeful signs that, among foundation leaders, there is growing recognition of the value of shifting to a culture of learning to improve foundations’ efforts.

Understanding & Sharing What Works: The State of Foundation Practice reveals how well foundation leaders understand what is and isn’t working in their foundation’s programs, how they figure this out, and what, if anything, they share with others about what they’ve learned. These trends are explored through 119 survey responses from, and 41 in-depth interviews with, foundation CEOs. A companion series of profiles tells the story of these practices in the context of four foundations that have committed to working more openly.

Since Foundation Center’s launch of GlassPockets in 2010, we have tracked transparency around planning and performance measurement within the “Who Has GlassPockets?” self-assessment. Currently, of the nearly 100 foundations that have participated in GlassPockets, only 27 percent publicly share any information about how they measure their progress toward institutional goals. Given this lack of knowledge sharing, we undertook a new #OpenForGood campaign to encourage foundations to publicly share published evaluations through the IssueLab open archive.

As someone who has spent the last decade examining foundation transparency practices (or the lack thereof) and championing greater openness, I read CEP’s findings with an eye for elements that might help us better understand the barriers and catalysts to this kind of culture shift in the field. Here’s what I took away from the report.

Performance Anxiety

While two-thirds of foundation CEOs in CEP’s study report having a strong sense of what is working programmatically within their foundations, nearly 60 percent report having a weaker grasp on what is not working. This raises the question: if you don’t know something is broken, then how do you fix it? Since we know foundations have a tendency to be success-oriented, this by itself wasn’t surprising. But it’s a helpful metric that proves the point that investing in evaluation, learning, and sharing can only lead to wiser use of precious resources for the field as a whole.

The report also reveals that many CEOs who have learned what is not working well at their foundations are unlikely to share that knowledge, as more than one-third of respondents cite hesitancy around disclosing missteps and failures. The interviews and profiles point to what can best be described as performance anxiety. CEOs cite the need for professionals to show what went well, fear of losing the trust of stakeholders, and a desire to impress their boards as motivations for concealing struggles. Of these motivations, board leadership seems particularly influential for setting the culture when it comes to transparency and failure.

In the profiles, Rockefeller Brothers Fund (RBF) President Stephen Heintz discusses both the importance of his board and his background in government as factors that have informed RBF’s willingness to share the kinds of information many foundations won’t. RBF was an early participant in GlassPockets, and now is an early adopter of the #OpenForGood movement to openly share knowledge. As a result, RBF has been one of the examples we often point to for the more challenging aspects of transparency such as frameworks for diversity data, knowledge sharing, and investment practices.

An important takeaway of the RBF profile is the Fund’s emphasis on the way in which a board can help ease performance anxiety by simply giving leadership permission to talk about pain points and missteps. Yet one-third of CEOs specifically mention that their foundation faces pressure from its board to withhold information about failures. This sparks my interest in seeing a similar survey asking foundation trustees about their perspectives in this area.

Utility or Futility?

Anyone who works inside a foundation — or anyone who has ever applied for a grant from a foundation — will tell you they are buried in the kind of paperwork load that often feels futile (which actually spawned a whole other worthy movement led by PEAK Grantmaking called Project Streamline). In the CEP study, the majority of foundation CEOs report finding most of the standard sources of knowledge that they require not very useful to them. Site visits were most consistently ranked highly, with the majority of CEOs (56 percent) pointing to them as one of the most useful sources for learning about what is and isn’t working. Grantee focus groups and convenings came in a distant second, with only 38 percent of CEOs reporting these as a most useful source. And despite the labor involved on both sides of the table, final grant reports were ranked as a most useful source for learning by only 31 percent of CEOs.

“Thanks to CEP’s research, we have evidence of real demand for a greater supply of programmatic knowledge.”

If most foundations find greater value in higher touch methods of learning, such as meeting face-to-face or hosting grantee gatherings, then perhaps this is a reminder that if foundations reduce the burdens of their own bureaucracies and streamline application and reporting processes, there will be more time for learning from community and stakeholder engagement.

The companion profile of the Weingart Foundation, another longtime GlassPockets participant, shows the benefits of funders making more time for grantee engagement, and provides a number of methods for doing so. Weingart co-creates its learning and assessment frameworks with grantees, routinely shares all the grantee feedback it receives from its Grantee Perception Report (GPR), regularly makes time to convene grantees for shared learning, and also pays grantees for their time in helping to inform Weingart’s trustees about the problems it seeks to solve.

Supply and Demand

One of the questions we get the most about #OpenForGood’s efforts to build an open, collective knowledge base for the field is whether anyone will actually use this content. This concern also surfaces in CEP’s interviews, with a number of CEOs citing the difficulty of knowing what is useful to share as an impediment to openness. A big source of optimism here is learning that a majority of CEOs report that their decisions are often informed by what other foundations are learning, meaning foundations can rest assured that if they supply knowledge about what is and isn’t working, the demand is there for that knowledge to make a larger impact beyond their own foundation. Think of all that untapped potential!

Of course, given the current state of knowledge sharing in the field, only 19 percent of CEOs surveyed report having quite a bit of knowledge about what’s working at peer foundations, and just 6 percent report having quite a bit of knowledge about what’s not working among their programmatic peers. Despite this dearth of knowledge, fully three-quarters of foundation CEOs still report that they use what they do have access to from peers to inform strategy and direction within their own foundations.

Thanks to CEP’s research, we have evidence of real demand for a greater supply of programmatic knowledge. Now there is every reason for knowledge sharing to become the norm rather than the exception.

--Janet Camarena

Data Fix: Do's & Don'ts for Reporting Geographic Area Served
November 1, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This is the second post in a series intended to improve the data available for and about philanthropy.

The first post in our Data Fix series focused on areas that may seem straightforward but often cause confusion, including recipient location data. But don’t confuse recipient location (where the check was sent) with Geographic Area Served (the area meant to benefit from the funding). Data on recipient location, one of our required fields, allows us to match data to the correct organization in our database, ensuring accuracy for analyses or data visualizations. In contrast, Geographic Area Served, one of our highest priority fields, helps us tell the real story about where your funding is making an impact.

How to Report Geographic Area Served

We recognize that providing data on Geographic Area Served can be challenging. Many funders may not track this information, and those who do may depend on grantees or program staff to provide the details. It’s important to keep in mind that sharing some information is better than no information, as funders are currently the only source of this data.

Do:

  • Include details for locations beyond the country level. For example, for U.S. locations, specify a state along with providing geo area served at the city or county level. For non-U.S. locations, include the country name when funding a specific city, province, state, or region.
  • Use commas to indicate hierarchy and semi-colons to separate multiple areas served (a short formatting sketch follows these lists). For example:
      • Topeka, Kansas (comma used to indicate hierarchy)
      • Hitchcock County, Nebraska; Lisbon, Portugal; Asia (semi-colons used to list and separate multiple locations)
  • Define regions. If you are reporting on geo area served at the regional level (e.g., East Africa), please provide a list of the countries included in your organization’s definition of that region. Your definition of a region may differ from that of Foundation Center. Similarly, if your foundation defines its own regions (e.g., Southwestern Ohio), consider including the counties comprising that region.

Don’t:

  • Be too broad in scope. “Global Programs” may not be accurate if your work is focused on specific countries. Similarly, listing the geo area served as “Canada” is misleading if the work is serving the province of “Quebec, Canada” rather than the entire country.
  • Use negatives or catch-all terms. “Not California,” “Other,” “Statewide,” or “International” may be meaningful within your organization, but these terms cannot be interpreted for mapping. Instead of “Statewide,” use the name of the state. Instead of “International,” use “Global Programs” or list the countries, regions, or continent being served.
  • Forget to include the term “County” when reporting on U.S. counties. This will ensure your grant to an entire county isn’t assigned to the same-named city (e.g., Los Angeles County, California, rather than Los Angeles, California).
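For eReporters who assemble the Geographic Area Served field programmatically, here is a minimal sketch in Python of the comma-and-semicolon conventions above. The helper names and the catch-all list are illustrative assumptions, not part of any official eReporting template or Foundation Center tool.

```python
# Illustrative helper (not an official eReporting tool): builds a Geographic
# Area Served string in which commas express hierarchy within a single area
# and semicolons separate multiple areas, and flags catch-all terms that
# cannot be interpreted for mapping.

CATCH_ALL_TERMS = {"other", "statewide", "international", "not california"}  # examples only

def format_area(hierarchy):
    """Join one area's parts from most specific to least specific,
    e.g. ["Topeka", "Kansas"] -> "Topeka, Kansas"."""
    return ", ".join(part.strip() for part in hierarchy if part.strip())

def format_areas_served(areas):
    """Join several areas with semicolons, rejecting catch-all terms."""
    formatted = []
    for hierarchy in areas:
        area = format_area(hierarchy)
        if area.lower() in CATCH_ALL_TERMS:
            raise ValueError(
                f'"{area}" cannot be mapped; name the state, countries, or region instead.'
            )
        formatted.append(area)
    return "; ".join(formatted)

# Example:
print(format_areas_served([
    ["Hitchcock County", "Nebraska"],
    ["Lisbon", "Portugal"],
    ["Asia"],
]))  # -> "Hitchcock County, Nebraska; Lisbon, Portugal; Asia"
```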

Geographic Area Served in Foundation Center Platforms

Data provided (in a loadable format) will appear in “Grant Details” in Foundation Directory Online (FDO) and in Foundation Maps. Foundation Maps, including the complimentary eReporter map showing your own foundation’s data, also displays an Area Served mapping view.


If data is not provided, Foundation Center will do one of the following:

  • Default to the location of the recipient organization
  • Add geo area served based on text in the grant description
  • Add geo area served based on where the recipient organization works, as listed on their website or in their mission statement, if this information is available in our database

Responsibly Sharing Geographic Area Served


Although our mission is to encourage transparency through the sharing of grants data, we acknowledge there are contexts in which sharing this data may be cause for concern. If the publishing of this data increases risks to the population meant to benefit from the funding, the grantee/recipient, or your own organization, you can either omit Geographic Area Served information entirely or report it at a higher, less sensitive level (e.g. country vs. province or city). For more information on this topic, please see Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts and Sharing Data Responsibly: A Conversation Guide for Funders.

More Tips to Come!

I hope you have a better understanding of how to report Geographic Area Served through eReporting. Without this data, valuable information about where funding is making a difference may be lost! Moving forward, we’ll explore the required fields of Recipient Name and Grant Description. If you have any questions, please feel free to contact me.

-- Kati Neiheisel

New Guide Helps Human Rights Funders Balance Tension between Risk & Transparency
October 25, 2018

Julie Broome is the Director of Ariadne, a network of European donors that support social change and human rights.  

Tom Walker is the Research Manager at The Engine Room, an international organisation that helps activists and organisations use data and technology effectively and responsibly.

Julie Broome

Foundations find themselves in a challenging situation when it comes to making decisions about how much data to share about their grantmaking. On the one hand, in recognition of the public benefit function of philanthropy, there is a demand for greater transparency on the part of funders and a push to be open about how much they are giving and who they are giving it to. These demands sometimes come from states, increasingly from philanthropy professionals themselves, and also from critics who believe that philanthropy has been too opaque for too long and raise questions about fairness and access. 

At the same time, donors who work in human rights and on politically charged issues are increasingly aware of the risks to grantees if sensitive information ends up in the public domain. As a result, some funders have moved towards sharing little to no information. However, this can have negative consequences in terms of our collective ability to map different fields, making it harder for us all to develop a sense of the funding landscape in different areas. It can also serve to keep certain groups “underground,” when in reality they might benefit from the credibility that foundation funding can bestow.

Tom Walker

As the European partners in the Advancing Human Rights project, led by the Human Rights Funders Network and Foundation Center, Ariadne collects grantmaking data from our members that feeds into this larger effort to understand where human rights funding is going and how it is shifting over time. Unlike in the United States, where the IRS 990-PF form eventually provides transparency about grantee transactions, there is no equivalent data source in Europe. Yet many donors find grant activity information useful in finding peer funders and identifying potential gaps in the funding landscape where their own funds could make a difference. We frequently receive requests from donors who want to use these datasets to drill down into specific areas of interest and map out different funding fields. But these sources of data will become less valuable over time if donors move away from voluntarily sharing information about their grantmaking.

Nonetheless, the risks to grantees if donors share information irresponsibly are very real, especially at a time when civil society is increasingly under threat from both state and non-state actors. The desire to balance these two aims – maintaining sufficient data to be able to analyse trends in philanthropy while protecting grantees – led Ariadne to partner with The Engine Room to create a guide to help funders navigate these tricky questions.

After looking at why and how funders share data and the challenges of doing so responsibly, The Engine Room interviewed 8 people and surveyed 32 others working in foundations that fund human rights organisations, asking how they shared data about their grants and highlighting any risks they might see.

Funders told us that they felt treating data responsibly was important, but that putting it into practice in their day-to-day work was often difficult. It involved balancing competing priorities: between transparency and data protection legislation; between protecting grantees’ data and meeting reporting requirements; and between protecting grantees from unwanted attention and publicising stories that highlight the benefits of their work.

The funders we heard from said they found it particularly difficult to predict how risks might change over time, and how to manage data that had already been shared and published. The most common concerns were:

  • ensuring that data that had already been published remained up to date;
  • de-identifying data before it was published (see the sketch after this list); and
  • working with third parties – such as donors who fund through intermediaries and may request information about the intermediaries’ grantees – to share data about grantees responsibly.
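
One way to act on the de-identification concern above, before a dataset ever leaves the foundation, is to strip or pseudonymise identifying fields. The sketch below is purely illustrative and assumes hypothetical column names and a placeholder salt; which fields count as "identifying" will differ by context, and pseudonymisation is not a substitute for a conversation with the grantee.

```python
import csv
import hashlib

SALT = "replace-with-a-secret-value"  # placeholder; store securely, never publish
IDENTIFYING_COLUMNS = ["Grantee Name", "Contact Email"]  # hypothetical column names

def pseudonymise(value):
    """Replace an identifying value with a salted hash: records stay linkable but unreadable."""
    return hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()[:12]

with open("grants_internal.csv", newline="", encoding="utf-8") as infile, \
     open("grants_public.csv", "w", newline="", encoding="utf-8") as outfile:
    reader = csv.DictReader(infile)
    writer = csv.DictWriter(outfile, fieldnames=reader.fieldnames)
    writer.writeheader()
    for row in reader:
        for column in IDENTIFYING_COLUMNS:
            if row.get(column):
                row[column] = pseudonymise(row[column])
        writer.writerow(row)
```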

Although the funders we interviewed differed in their mission, size, geographical spread and focus area, they all stressed the importance of respecting the autonomy of their grantees. Practically, this meant that additional security or privacy measures were often introduced only when the grantee raised a concern. The people we spoke with were often aware that this reactive approach puts the burden of assessing data-related risks onto grantees, and suggested that they most needed support when it came to talking with grantees and other funders in an open, informed way about the opportunities and risks associated with sharing grantee data.

These conversations can be difficult ones to have. So, we tried a new approach: a guide to help funders have better conversations about responsible data.

It’s aimed at funders or grantmakers who want to treat their grantees’ data responsibly, but don’t always know how. It lists common questions that grantees and funders might ask, combined with advice and resources to help answer them, and tips for structuring a proactive conversation with grantees.

“There are no shortcuts to handling data responsibly, but we believe this guide can facilitate a better process.”

There are no shortcuts to handling data responsibly, but we believe this guide can facilitate a better process. It offers prompts that are designed to help you talk more openly with grantees or other funders about data-related risks and ways of dealing with them. The guide is organised around three elements of the grantmaking lifecycle: data collection, data storage, and data sharing.

Because contexts and grantmaking systems vary dramatically and change constantly, a one-size-fits-all solution is impossible. Instead, we decided to offer guidance on processes and questions that many funders share – from deciding whether to publish a case study to having conversations about security with grantees. For example, one tip that would benefit many grantmakers is to ensure that grant agreements include specifics about how the funder will use any data collected as a result of the grant, based on a discussion that helps the grantee to understand how their data will be managed and make decisions accordingly.

This guide aims to give practical advice that helps funders strengthen their relationships with grantees - thereby leading to more effective grantmaking. Download the guide, and let us know what you think!

--Julie Broome and Tom Walker

Philanthropy and Democracy: Bringing Data to the Debate
October 18, 2018

Anna Koob is a manager of knowledge services for Foundation Center.

As money and politics become increasingly intertwined, the enduring debate around the role of philanthropy in a democratic society has taken on new life in recent months (see here, here, here, and here for prominent examples).

One side of the debate sees the flexibility of foundation dollars as a part of the solution to strengthen struggling democratic institutions. Others contend that foundations are profoundly undemocratic and increasingly powerful institutions that bypass government channels to shape the country--and world--to their will. Regardless of where you stand, a practical starting point is to learn more about what grantmakers are actually doing to affect democracy in these United States.

While foundations are required by law to avoid partisan and candidate campaigning, these limitations still leave plenty of room for foundations to engage with democracy in other ways.

Which funders are working on voter access issues? How much money is dedicated to civic engagement on key issues like health or the environment? Which organizations are receiving grants to increase transparency in government? Foundation Funding for U.S. Democracy offers a free public resource for answering such questions.

Browse More Than 55k Democracy Grants

Launched in 2014 by Foundation Center and updated regularly, Foundation Funding for U.S. Democracy’s data tool currently includes over 57,000 grants awarded by more than 6,000 funders, totaling $5.1 billion across four major categories: campaigns and elections, civic participation, government strengthening, and media.

The tool offers a look at the big picture through dashboards on each of these categories, and also allows you to browse granular grants-level information.  Interested in understanding:

  • The largest funders of campaigns and elections work?
  • Grantmaking in support of civic participation, broken down by population type?
  • The strategies used to affect democracy work?

To paraphrase the slogan of Apple, there’s a dashboard (and underlying data tool) for that!

The site also features a collection of research on U.S. democracy, powered by IssueLab, links to a number of relevant blog posts, and hosts infographics we’ve developed using data from the tool.

What Does the Data Tell Us About Philanthropic Support for Democracy?

Less than two percent of all philanthropic funding in the United States meets our criteria for democracy funding, which includes efforts by foundations to foster an engaged and informed public and support government accountability and integrity, as well as funding for policy research and advocacy. It’s a modest amount considering that this subset captures a wide range of topics, including money in politics, civic leadership development, civil rights litigation, and journalism training. Some findings from the data rise to the top:

  1. Funding for campaigns and elections is the smallest of the four major funding categories tracked. While most people might think of elections as the basic mechanism of democracy, this category constitutes only about 12 percent of democracy funding represented in the tool. Civic participation and government strengthening vie for the largest share, each accounting for about 38 percent of total democracy funding, and relevant media funding accounts for 28 percent. (Note that grants can be counted in multiple categories, so totals exceed 100 percent.)
  2. Less than a quarter of funding supports policy and advocacy work. While work to affect policy is often considered front and center when discussing philanthropy’s impact on democracy, the data tool reveals that many funders are working to strengthen democracy in other ways. Supporting civics education for youth, bolstering election administration, strengthening platforms for government accountability, and funding investigative journalism are examples of grantmaking areas that strengthen democracy but have less direct implications for public policy.
  3. Funder interest in the census and the role of media in democracy is increasing. Given the turbulence of the last couple of years in the U.S. political system, and amid calls for greater philanthropic involvement in strengthening democracy, what changes have we seen in giving patterns? With the caveat that there is a lag between when grants are awarded and when we receive that data (from 990 tax forms or direct reporting by foundations), reports added to IssueLab and news items posted on Philanthropy News Digest show funders rallying around causes that strengthen democratic institutions, including efforts to ensure representativeness in the 2020 census and support for research on media consumption and digital disinformation.

Why Should Funders be Transparent about Their Democracy Work?

Appeals for data sharing in philanthropy often center around the common good -- detailed data helps to inform authentic conversations among grantmakers, nonprofits, and other stakeholders about who’s funding what, where. But in a field that’s focused on shaping the nature of our democracy and represents funding from both sides of the ideological divide -- including, for example, grantmaking in support of the American Legislative Exchange Council (“dedicated to the principles of limited government, free markets and federalism”) alongside grants awarded to organizations like the Center for American Progress (“dedicated to improving the lives of all Americans, through bold, progressive ideas”) -- democracy funders tend to be especially cautious about publicizing their work and opening themselves up to increased scrutiny and criticism.

But the reality is that foundation opacity undermines credibility and public trust. Precisely because of criticism about the lack of democracy in philanthropy, foundations should demonstrate intentional transparency and show that they are living their values as democracy funders. Foundations also find that, particularly in a space that’s rife with speculation, there’s a benefit to shaping your own narrative and describing what you do in your own words. It may not make you immune to criticism, but it shows that you have nothing to hide.

How Funders Can Actively Engage: Submitting Grants Data

Grants data in the platform is either reported directly to Foundation Center via our eReporter program or sourced via publicly available 990 tax forms. While we’re able to get our data-eager hands on foundation grants either way, we prefer sourcing them directly from funders as it lends itself to more recent data -- particularly valuable in the current, fast-paced ‘democracy in crisis’ era -- and more detailed grant descriptions.

To submit your most recent grants (we’re currently collecting grants awarded in 2017), become an eReporter! Export a list of your most recent grants data in a spreadsheet (all grants - not limited to those relevant to democracy), review the data to make sure there’s no sensitive information and everything is as you’d like it to appear, and email your report to egrants@foundationcenter.org. Submit data as often as you’d like, but at least on an annual basis.
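
As a rough aid to that review step, a quick screen over the exported spreadsheet can flag rows worth a second look before the email goes out. This Python sketch is only illustrative: the file name, the "Grant Description" column header, and the keyword list are assumptions, and no script replaces a human review for sensitive information.

```python
import csv

# Hypothetical watch-list of terms that might signal a description needs review
# before the spreadsheet is submitted.
SENSITIVE_KEYWORDS = ["confidential", "anonymous", "at-risk", "security"]

def flag_sensitive_rows(path):
    """Print the spreadsheet rows whose grant description mentions a watch-list keyword."""
    with open(path, newline="", encoding="utf-8") as infile:
        reader = csv.DictReader(infile)
        for line_number, row in enumerate(reader, start=2):  # row 1 is the header
            description = (row.get("Grant Description") or "").lower()
            if any(keyword in description for keyword in SENSITIVE_KEYWORDS):
                print(f"Row {line_number}: review before submitting -> {description[:60]}")

flag_sensitive_rows("2017_grants_export.csv")
```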

Bringing Tangible Details to Abstract Discussions

At Foundation Center, we often tout data’s ability to help guide decision making about funding and general resource allocation. And that’s a great practical use case for the philanthropic data that we collect -- whether for human rights, ocean conservation funding, the Sustainable Development Goals, or democracy. At a time of increased foundation scrutiny, this publicly-available platform can also provide some transparency and concrete details to broaden discussions. What have foundations done to strengthen democracy? And how might they best contribute in these politically uncertain times? For examples, look to the data.

Have questions about this resource? Contact us at democracy@foundationcenter.org.

--Anna Koob
