Transparency Talk

Category: "Open Data" (48 posts)

Candid Announces Inaugural #OpenForGood Award Winners
May 30, 2019

Janet Camarena is director of transparency initiatives at Candid.

This post is part of the Glasspockets’ #OpenForGood series, produced in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up what they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

#OpenForGood Awardees and Committee Members, Left to Right: Meg Long, President, Equal Measure (#OpenForGood selection committee); Janet Camarena, Director, Transparency Initiatives, Candid; Awardee Savi Mull, Senior Evaluation Manager, C&A Foundation; Awardee Veronica Olazabal, Director, Measurement, Evaluation & Organizational Performance, The Rockefeller Foundation; Clare Nolan, Co-Founder, Engage R + D (#OpenForGood selection committee).

Yesterday as part of the Grantmakers for Effective Organizations Learning Conference, Candid announced the inaugural recipients of the #OpenForGood Award, which is designed to recognize and encourage foundations to openly share what they learn so we can all get collectively smarter. The award, part of a larger #OpenForGood campaign started in 2017, includes a set of tools to help funders work more transparently including a GrantCraft Guide about how to operationalize knowledge sharing, a growing collection of foundation evaluations on IssueLab, and advice from peers in a curated blog series.

The three winning foundations each demonstrate an active commitment to open knowledge and share their evaluations through IssueLab, an open repository that is free, searchable, and accessible to all. The winners were chosen by an external committee from a globally sourced nomination process; the committee reviewed the contenders looking for evidence of an active commitment to open knowledge, creative approaches to making knowledge shareable, field leadership, and the incorporation of community insights into knowledge sharing work.

And the Winners Are…

Here are some highlights from the award presentation remarks:

C&A Foundation
Award Summary: Creativity, Demonstrated Field Leadership, and Willingness to Openly Share Struggles

The C&A Foundation is a multi-national, corporate foundation working to fundamentally transform the fashion industry. C&A Foundation gives its partners financial support, expertise, and networks so they can make the fashion industry work better for every person it touches. Lessons learned and impact for each of its programs are clearly available on its website, and helpful top-level summaries are provided for every impact evaluation, making lengthy narrative evaluations accessible to peers, grantees, and other stakeholders. C&A Foundation even provides such summaries for efforts that didn’t go as planned, packaging them in an easy-to-read, graphic format that it shares via its Results & Learning blog, rather than hiding them away and quietly moving on, as is more often the case in the field.

The Ian Potter Foundation
Award Summary: Creativity, Field Leadership, and Lifting Up Community Insights

This foundation routinely publishes collective summaries from all of its grantee reports for each portfolio as a way to support shared learning among its existing and future grantees. It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than questions that satisfy the typical ritual of a grant report: submit, enter the data, file away never to be seen, and repeat.

Beyond being transparent with its grantee learning and reports, the Ian Potter Foundation also recently helped lift the burden on its grantees when it comes to measurement and outcomes. Instead of asking overworked charities to invent a unique set of metrics just for their grant process, foundation evaluation staff took it upon themselves to mine the Sustainable Development Goals targets framework to provide grantees with optional and ready-made outcomes templates that would work across the field for many funders. You can read more about that effort underway in a recent blog post here.

The Rockefeller Foundation
Award Summary: Field Leadership, Consistent Knowledge Sharing, and Commitment to Working Transparently

The Rockefeller Foundation can boast early adopter status when it comes to transparency and openness: it has a longstanding commitment to creating a culture of learning, and as such was one of the very first foundations to join the GlassPockets transparency movement and to commit to #OpenForGood principles by sharing its published evaluations widely. The Rockefeller Foundation also took the unusual step of upping the ante on the #OpenForGood pledge, aiming to create a culture of both learning and accountability, with its monitoring and evaluation team stating: “To ensure that we hold ourselves to a high bar, our foundation pre-commits itself to publicly sharing the results of its evaluations - well before the results are even known.” This ensures that even if an evaluation reports unfavorable findings, the intent is to share it all.

In an earlier GlassPockets blog post, Rockefeller’s monitoring and evaluation team shows a unique understanding of how sharing knowledge can advance the funder’s goals: “Through the documentation of what works, for who, and how/under what conditions, there is potential to amplify our impact, by crowding-in other funders to promising solutions, and diverting resources from being wasted on approaches that prove ineffectual.”  Rockefeller’s use of IssueLab’s open knowledge platform is living up to this promise as anyone can currently query and find more than 400 knowledge documents funded, published, or co-published by the Rockefeller Foundation.

Winners will receive technical support to create a custom Knowledge Center for their foundation or for a grantee organization, as well as promotional support in knowledge dissemination. Knowledge Centers are a service of IssueLab that provides organizations with a simple way to manage and share knowledge on their own websites. By leveraging this tool, you can showcase your insights, promote analysis of your grantees’ work, and feature learning from network members. All documents uploaded to an IssueLab Knowledge Center are also made searchable and discoverable via systems like WorldCat, which serves more than 2,000 libraries worldwide, ensuring your knowledge can be found by researchers regardless of their familiarity with your organization.

Why Choose Openness?

The #OpenForGood award is focused on inspiring foundations to use existing and emerging technologies to collectively improve the sector. Today, we live in a time when most people expect to find the information they need on the go, via tablets, laptops, and mobile phones, just a swipe or click away. Despite this digital-era reality, today only 13 percent of foundations have websites, and even fewer share their reports publicly, indicating that the field has a long way to go toward creating a culture of shared learning. With this award, we hope to change these practices. Rather than reinvent the wheel, this award and campaign encourage the sector to make it a priority to learn from one another and to share content with a global audience, so that we can build smartly on one another’s work and accelerate the change we want to see in the world. The more you share your foundation’s work, the greater the opportunities to make all our efforts more effective and farther reaching.

Congratulations to our inaugural class of #OpenForGood Award Winners! What will you #OpenForGood?

--Janet Camarena

Opening Up Emerging Knowledge: New Shared Learning from IssueLab
May 23, 2019

Janet Camarena is the director of transparency initiatives at Candid.

This post is part of the Glasspockets’ #OpenForGood series in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up the knowledge that they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood.


Though it’s hard to believe, we are already almost halfway through 2019! Given that midpoints are often a time to reflect and take stock, it seemed like good timing to mine the knowledge that the field has shared in IssueLab and highlight a few of the reports and lessons learned that our GlassPockets foundations have shared over the last six months. Scanning the recent titles, some themes immediately jumped out at me as a focus of research across the field, such as racial and gender equity, global trends, and impact measurement.

This is also a good reminder that IssueLab helps make your knowledge discoverable. Though I’m highlighting seven recent publications here, I only had to visit one website to find and freely download them. Acting as a “collective brain” for the field, IssueLab organizes the social sector’s knowledge so we can all have a virtual filing cabinet that makes this knowledge readily available. If it’s been a while since you uploaded your knowledge to IssueLab, you can add any of your publications to our growing library here. It’s a great way to make your knowledge discoverable, mitigate the knowledge fragmentation in the field, and make your foundation live up to being #OpenForGood.

And, speaking of #OpenForGood, our inaugural awards designed to encourage more knowledge sharing across the field will be announced at the upcoming GEO Learning Conference during lunch on May 29th. If you will be at GEO, join us to learn who the #OpenForGood knowledge sharing champions will be! And remember, if you’ve learned something, share something!

Opening Up Evaluations & Grantee Reports

“It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than just the data that the funder might need.”

Foundations pilot initiatives all the time, but do they share what they learned from them once the evaluation is all said and done? And what about all the potentially helpful data filed away in grantee reports? This first cluster of new reports opens up this kind of knowledge:

  • Creative City (Published by Animating Democracy, Funded by the Barr and Boston Foundations, April 2019) The Creative City pilot program, created by the New England Foundation for the Arts in partnership with the Barr Foundation, supported artists of all disciplines to create art in Boston that would drive public imagination and community engagement. Artists, funders, and administrators alike will find much to learn from this report about how to rethink arts in the context of people and place. One compelling example is the Lemonade Stand installation, created by artists Elisa H. Hamilton and Silvia Lopez Chavez, which made the rounds of many Boston neighborhoods and attracted many people with its bright yellow kiosk glow. Though it looked on the surface like a lemonade stand, it was actually an art installation inviting the community to connect by exchanging stories about how they turned lemons into lemonade.
  • Giving Refugees A Voice: Independent Evaluation (MacroScope London, Funded by the C&A Foundation, March 2018-February 2019) The C&A Foundation supported the Giving Refugees a Voice initiative, designed to improve working conditions for Syrian and other refugees in the Turkish apparel sector using social media monitoring technology. The pilot initiative used social media monitoring technology to analyze the public Facebook posts of millions of refugees associated with the apparel sector in Turkey. The purpose of this analysis was to galvanize brands, employers, and others to take actions and make changes that would directly improve the working conditions for Syrian people in Turkey. This impact report forthrightly reveals that though the social media efforts were an innovative way to document the scale of the Syrians working informally in the Turkish apparel industry, the pilot fell short of its goals as there was no evidence that the social media analysis led to improved working conditions. Rather than keep such a negative outcome quiet, the C&A Foundation publicly released its findings and also created a blog summary about them earlier this year outlining the results, what they learned from them, and what would be helpful for stakeholders and partners to know in an easy-to-read outline.
  • Grantee Learnings: Disability (Published by the Ian Potter Foundation, December 2018) The information documented in this publication has been taken from the final reports of disability-serving grantees, submitted to The Ian Potter Foundation following the completion of their projects. The Ian Potter Foundation routinely shares grantee learnings for each of its portfolios as a way to support shared learning among its existing and future grantees, and this is the most recent of these. The report is clearly arranged so that other disability service providers can benefit from the hard-won lessons of their peers in likely areas of shared challenge, such as staffing, program planning, working with parents and partners, scaling, evaluation and measurement, and technology use. It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than just the data that the funder might need.

Lessons Learned from Scholarship & Fellowship Funding

Donors looking to make a difference using scholarships and student aid to improve diversity, equity, and inclusion have two new excellent sources of knowledge available to them:

  • Delivering on the Promise: An Impact Evaluation of the Gates Millennium Scholars Program (Published by American Institutes for Research, Funded by the Bill & Melinda Gates Foundation, May 2019) This report shares findings from an impact evaluation of the Gates Millennium Scholars (GMS) program and reflects on findings from implementation evaluations conducted on the program since its inaugural year. The GMS program is an effort designed to improve higher education access and opportunity for high achieving low-income students of color by reducing the cost of entry. The program also seeks to develop a new and diverse generation of leaders to serve America by encouraging leadership participation, civic engagement, and the pursuit of graduate education and careers in seven fields in which minorities are underrepresented—computer science, engineering, mathematics, science, education, library science, and public health. It discusses the extent to which the program has made an impact, and offers concluding thoughts on how the Foundation can maximize its investment in the higher education arena. A central argument of this report is that philanthropic activities like the GMS program can indeed play a crucial role in improving academic outcomes for high-achieving, disadvantaged students.
  • Promoting Gender Equity: Lessons From Ford’s International Fellows Program (Published by IIE Center for Academic Mobility Research & Impact, Funded by Ford Foundation, January 2019) As part of its mission to provide higher education access to marginalized communities, the Ford Foundation International Fellowships Program (IFP) sought to address gender inequality by providing graduate fellowships to nearly 2,150 women—50% of the IFP fellow population—from 22 countries in the developing world. This brief explores how international fellowship programs like IFP can advance educational, social, and economic equity for women. In addition to discussing the approach the program took in providing educational access and opportunity to women, the brief looks at two stories of alumnae who have not only benefitted from the fellowship themselves, but who are working to advance gender equity in their home communities and countries. Activists, advocates, and practitioners can draw upon the strategies and stories that follow to better understand the meaning of gender equity and advance their own efforts to achieve social justice for women and girls worldwide.

Sharing Knowledge about the Social Sector

Foundations invest in knowledge creation to better understand the ecosystem of the social sector, as well as to address critical knowledge gaps they see in the fields in which they work. Thanks to these titles being added to IssueLab, we can all learn from them too! Here are a couple of recent titles added to IssueLab that shed new and needed light on the fields of philanthropy and nonprofits:

  • Philanthropy in China (Published by Asian Venture Philanthropy Network, Funded by The Rockefeller Foundation, April 2019) Philanthropy is now a global growth industry, but philanthropic transparency norms in other parts of the world are often lacking, so knowledge can be scarce. Philanthropy in China today is expanding and evolving rapidly, so filling in these knowledge gaps is even more pressing. This report presents an overview of the philanthropy ecosystem in China by reviewing existing knowledge and drawing insights from influential practitioners. It also provides an analysis of key trends and opportunities, as well as a set of recommendations for funders and resource providers who are inspired to catalyze a more vibrant and impactful philanthropy ecosystem in China.
  • Race to Lead: Women of Color in the Nonprofit Sector (Published by the Building Movement Project, Funded by New York Community Trust, Robert Sterling Clark Foundation, Community Resource Exchange, New York Foundation, Meyer Memorial Trust, Center for Nonprofit Excellence at the United Way of Central New Mexico, North Carolina Center for Nonprofits, Russ Finkelstein, February 2019) This report is part of the Race to Lead series by the Building Movement Project, seeking to understand why there are still relatively so few leaders of color in the nonprofit sector. Using data taken from a national survey of more than 4,000 people, and supplemented by numerous focus groups around the country, this latest report reveals that women of color encounter systemic obstacles to their advancement over and above the barriers faced by white women and men of color. Another key finding in the report is that education and training are not enough to correct systemic inequities—women of color with high levels of education are more likely to be in administrative roles and are more likely to report frustrations about inadequate and inequitable salaries. Building Movement Project’s call to action focuses on systems change, organizational change, and individual support for women of color in the sector.

Is this reminding you that you have new knowledge to share? Great—I can’t wait to see what you will #OpenForGood!

--Janet Camarena

Don’t “Ghost” Declined Applicants: The Ins and Outs of Giving Applicant Feedback
April 4, 2019

Mandy Ellerton joined the [Archibald] Bush Foundation in 2011, where she created and now directs the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community problems, using approaches that are collaborative and inclusive of people who are most directly affected by the problem.


This post is part of our “Road to 100 & Beyond” series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the “Who Has GlassPockets?” self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations over time, promising practices in transparency, helpful examples, and lessons learned.

I’ve often thought that fundraising can be as bad as dating. (Kudos to you lucky few who have had great experiences dating!) Lots of dates, lots of dead ends, lots of frustrating encounters before you (maybe) find a match. All along the way you look for even the smallest sign to indicate that someone likes you. “They laughed at my joke!” or, in the case of fundraising, “they seemed really excited about page five of last year’s impact report!” Not to mention the endless time spent doing online searches for shreds of information that might be useful. This reality is part of the reason why Bush Foundation was proud to be among the first 100 foundations to participate in GlassPockets. We believe that transparency and opening lines of communication are critical to good grantmaking, because both in dating and in fundraising, it can be heartbreaking and crazymaking to try and sort out whether you have a connection or if someone’s “just not that into you.” If only there was a way to just “swipe left” or “swipe right” and make everything a little simpler.

“We believe that transparency and opening lines of communication are critical to good grantmaking.”

I’m not proposing a Tinder for grantmaking (nor should anyone, probably, although hat tip to Vu Le for messing with all of us and floating the idea on April Fool’s Day). But over the past several years, Bush Foundation’s Community Innovation program staff has used a system to provide feedback calls for declined applicants, in the hopes of making foundation fundraising a little less opaque and crazymaking. We use the calls to be transparent and explain why we made our funding decisions. The calls also help us live out our “Spread Optimism” value because they allow us to help and encourage applicants and potentially point them to other resources. This is all part of our larger engagement strategy, described in “No Moat Philanthropy.”


Mandy Ellerton

How Feedback Calls Work

We use a systematic approach for feedback calls:

  • We proactively offer the opportunity to sign up for feedback calls in the email we send to declined applicants.
  • We use a scheduling tool (after trying a couple different options we’ve landed on Slotted, which is relatively cheap and easy to use) and offer a variety of times for feedback calls every week. Collectively five Community Innovation Team members hold about an hour a week for feedback calls. The calls typically last about 20 minutes. We’ve found this is about the right amount of time so that we can offer feedback calls to most of the declined applicants who want them.
  • We prepare for our feedback calls. We re-read the application and develop an outline for the call ahead of time.
  • During the call we offer a couple of reasons why we declined the application. We often discuss what an applicant could work on to strengthen their project and whether they ought to apply again.
  • We also spend a lot of time listening; sometimes these calls can understandably be emotional. Grant applications are a representation of someone’s hopes and dreams and sometimes your decline might feel like the end of the road for the applicant. But hang with them. Don’t get defensive. However hard it might feel for you, it’s a lot harder for the declined applicant. And ultimately, hard conversations can be transformative for everyone involved. I will say, however, that most of our feedback calls are really positive exchanges.
  • We use anonymous surveys to evaluate what people think of the feedback calls and during the feedback call we ask whether the applicant has any feedback for us to improve our programs/grantmaking process.
  • We train new staff on how to do feedback calls. We have a staff instruction manual on how to do feedback calls, but we also have new team members shadow more seasoned team members for a while before they do a feedback call alone.


What’s Going Well

The feedback calls appear to be useful for both declined applicants and for us:

  • In our 2018 surveys, respondents (n=38) rated the feedback calls highly. They gave the calls an average rating of 6.1 (out of 7) for overall helpfulness, 95% said the calls added some value or a lot of value, and 81.2% said they had a somewhat better or much better understanding of the programs after the feedback call.
  • We’ve seen the number of applications for our Community Innovation Grant and Bush Prize for Community Innovation programs go down over time and we’ve seen the overall quality go up. We think that’s due, in part, to feedback calls that help applicants decide whether to apply again and that help applicants improve their projects to become a better fit for funding in the future.
  • I’d also like to think that doing feedback calls has made us better grantmakers. First, it shows up in our selection meetings. When you might have to talk to someone about why you made the funding decision you did, you’re going to be even more thoughtful in making the decision in the first place. You’re going to hew even closer to your stated criteria and treat the decision with care. We regularly discuss what feedback we plan to give to declined applicants in the actual selection meeting. Second, in a system that has inherently huge power differentials (foundations have all of it and applicants have virtually none of it), doing feedback calls forces you to come face to face with that reality. Never confronting the fact that your funding decisions impact real people with hopes and dreams is a part of what corrupts philanthropy. Feedback calls keep you a little more humble.


What We’re Working On

We still have room to improve our feedback calls:

  • We’ve heard from declined applicants that they sometimes get conflicting feedback from different team members when they apply (and get declined) multiple times; 15% of survey respondents said their feedback was inconsistent with prior feedback from us. Cringe. That definitely makes fundraising more crazymaking. We’re working on how to have more staff continuity with applicants who have applied multiple times.
  • We sometimes struggle to determine how long to keep encouraging a declined applicant to improve their project for future applications versus saying more definitively that the project is not a fit. Yes, we want to “Spread Optimism,” but although it never feels good for anyone involved, sometimes the best course of action is to encourage an applicant to seek funding elsewhere.

I’m under no illusions that feedback calls are going to fix the structural issues with philanthropy and fundraising. I welcome that larger conversation, driven in large part by brave critiques of philanthropy emerging lately like Decolonizing Wealth, Just Giving and Winners Take All. In the meantime, fundraising, as with dating, is still going to have moments of heartache and uncertainty. When you apply for a grant, you have to be brave and vulnerable; you’re putting your hopes and dreams out into a really confusing and opaque system that’s going to judge them, perhaps support them, or perhaps dash them, and maybe even “ghost” them by never responding. Feedback calls are one way to treat those hopes and dreams with a bit more care.

--Mandy Ellerton

GlassPockets Announces New Transparency Levels: Leveling Up Your Practices
March 28, 2019

Janet Camarena is director of transparency initiatives at Candid.

Janet Camarena

It's an exciting moment for us here at GlassPockets, and for the field of philanthropy, as we’ve just reached the milestone of 100 foundations committing to work more transparently by participating and publicly sharing their “Who Has GlassPockets?” transparency self-assessment profiles on our website. Yesterday, the Walton Family Foundation (WFF) officially became our 100th participant. What you are seeing today is the result of a diligent process that started last summer, as WFF continually worked to improve the openness of its website. With clear pathways to connect directly with staff members, a knowledge center containing lessons learned as well as packaged “flashcards” containing easily shareable bits of information, and a new searchable grants database spanning its 31-year history, WFF is not starting small when it comes to openness. Transparency can be tricky territory for family foundation donors who may be more accustomed to privacy and anonymity when it comes to their giving, so it’s particularly exciting for us to reach the milestone of 100 published profiles thanks to a family foundation enthusiastically embracing a more transparent approach.

When we started with a handful of foundations and fewer than two dozen transparency indicators, it was more experiment than movement. Now that we’ve aggregated data on transparency trends among 100 participating foundations, it’s a good opportunity to pause and reflect on what we are learning from this data that could inform the way forward to a more transparent future for philanthropy.

Transparency Indicators Evolve


Earlier this year I observed that a promising trend in the field is that more foundations are developing sections of their websites devoted to explaining how they work, what values they hold dear, and in some cases, how these values inform their work and operations. Among the 100 foundations that have taken and publicly shared their transparency assessments, 42 percent are now using their websites as a means to communicate values or policies that demonstrate an intentional commitment to transparency. As a result, we recently added transparency values/policies as a formal indicator in our GlassPockets assessment. But once a foundation has developed such a values or policy statement, how does it live up to it?

That’s where we hope our “Who Has GlassPockets?” assessment will continue to help foundations create a roadmap to transparency. The assessment is not static and has evolved with the field. When we started in 2010, there were 23 transparency indicators based on an inventory of thousands of foundation websites. As we continue to observe website transparency trends, the assessment has now grown to 27 indicators. Aside from the newest indicator for transparency values/policies, some other indicators we have added since inception, based on the kinds of information that foundations are now starting to share, are strategic plans, open licensing policies, and use of the Sustainable Development Goals (SDGs) framework. And we expect that as the field continues to evolve, this list of indicators will grow as well.

As the list has grown longer, foundations frequently ask us which indicators are the right ones to start with. Some also tell us that they want to participate, but not until they have at least half or even three-quarters of the indicators on the list in place. Though we applaud striving to be more transparent, GlassPockets was never intended as a “one-size-fits-all” approach, nor did we ever expect a majority of the indicators to be in place before a foundation participates. Rather, the intent was that the GlassPockets exercise would surface transparency as a priority, help the foundation evolve its transparency over time, and ideally become a process the institution revisits on a regular basis, updating its GlassPockets profile with more and more indicators as transparency improves.

New Transparency Levels and Badges

So to help foundations better understand how to get started and how to grow transparency practices over time, we analyzed the data we have been collecting, and some patterns about how transparency evolves in philanthropy are now becoming clearer. We also conducted advisor interviews with a number of GlassPockets participants to better understand what would be most motivational and helpful in this regard. After reviewing everything we’ve learned so far, we have identified three levels through which foundations pass as they chart their course to greater transparency – these represent core, advanced, and champion-level transparency practices that you can view on this chart.

Explore how the Transparency Indicators relate to each level

Core-level transparency practices represent data most commonly shared by participating foundations and are the best place for new participants to begin. Advanced-level transparency practices open up the way you work to the world and represent information shared by about 50 to 70 percent of participating foundations. Champion-level transparency practices, in place at fewer than half of participating foundations, represent information-sharing that is pushing existing boundaries of foundation transparency.

These new levels represent an optional guide that can be helpful to follow, but they are not intended as a formal set of requirements. As has always been the case, any foundation at any stage of its transparency journey is welcome to participate and chart its own course. However, to motivate participation and progress, GlassPockets will begin awarding Transparency Badges based on the transparency level attained. These badges will appear on the GlassPockets profile and will also be made available for use on the foundation’s website. Since this is not a one-size-fits-all approach, all participating foundations will automatically receive the Core GlassPockets transparency badge, and those who attain the Advanced (10-18 indicators) or Champion (19 or more indicators) level will receive a badge denoting the appropriate designation.

Learn About the Transparency Badges

On the Level

Based on the new levels described above, GlassPockets will soon be adding the new Transparency Badges to each profile. So, if it’s been a while since you reviewed your “Who Has GlassPockets?” profile, or if you’re looking for motivation to improve your transparency, now’s the time to review your existing profile, or submit a new one to see how your foundation stacks up. For existing GlassPockets participants, May 28th is the deadline to review your profile and get any updates or changes in to us before we start making the transparency levels and badges visible on the GlassPockets website the week of June 3rd. To update your profile, you can fill out any new links or corrections on this submission form, or simply email me your changes. As always, new profiles can be added at any time and you can learn more about that process here.

And last, but certainly not least, big thanks and cheers to our existing GlassPockets participants for helping us reach this milestone, and a big welcome to those who will help us reach the next one!

-- Janet Camarena

Putting a Stop to Recreating the Wheel: Strengthening the Field of Philanthropic Evaluation
December 13, 2018

Clare Nolan is Co-Founder of Engage R+D, which works with nonprofits, foundations, and public agencies to measure their impact, bring together stakeholders, and foster learning and innovation.

Meg Long is President of Equal Measure, a Philadelphia-based professional services nonprofit focused on helping its clients—foundations, nonprofit organizations, and public entities—deepen and accelerate social change.

Clare Nolan

In 2017, Engage R+D and Equal Measure, with support from the Gordon and Betty Moore Foundation, launched an exploratory dialogue of funders and evaluators to discuss the current state of evaluation and learning in philanthropy, explore barriers to greater collaboration and impact, and identify approaches and strategies to build the collective capacity of small and mid-sized evaluation firms. Our goal was to test whether there was interest in our sector in building an affinity network of evaluation leaders working with and within philanthropy. Since our initial meeting with a few dozen colleagues in 2017, our affinity network has grown to 250 individuals nationally, and there is growing momentum for finding ways funders and evaluators can work together differently to deepen the impact of evaluation and learning on philanthropic practice.

At the recent 2018 American Evaluation Association (AEA) conference in Cleveland, Ohio, nearly 100 funders and evaluators gathered to discuss four action areas that have generated the most “buzz” during our previous network convening at the Grantmakers for Effective Organizations (GEO) conference and from our subsequent network survey:

1. Improving the application of evaluation in philanthropic strategy and practice.

2. Supporting the sharing and adaptation of evaluation learning for multiple users.

3. Supporting formal partnerships and collaborations across evaluators and evaluation firms.

4. Strengthening and diversifying the pipeline of evaluators working with and within philanthropy.

Meg Long

We asked participants to choose one of these action areas and join the corresponding large table discussion to reflect on what they have learned about the topic and identify how the affinity network can contribute to advancing the field. Through crowd-sourcing, participants identified some key ways in which action teams that will be launched in early 2019 can provide a value-add to the field.

1. What will it take to more tightly connect evaluation with strategy and decision-making? Provide more guidance on what evaluation should look like in philanthropy.

Are there common principles, trainings, articles, case studies, guides, etc. that an action team could identify and develop? Could the affinity network be a space to convene funders and evaluators that work in similar fields to share evaluation results and lessons learned?

2. What will it take to broaden the audience for evaluations beyond individual organizations? Create a “market place” for knowledge sharing and incentivize participation.

As readers of this blog will know from Foundation Center’s #OpenForGood efforts, there is general agreement around the need to do better at sharing knowledge, building evidence, and being willing to share what foundations are learning – both successes and failures. How can an action team support the creation of a culture of knowledge sharing through existing venues and mechanisms (e.g., IssueLab, Evaluation Roundtable)? How could incentives be built in to support transparency and accountability?

3. How can the field create spaces that support greater collaboration and knowledge sharing among funders and evaluators? Identify promising evaluator partnership models that resulted in collaboration and not competition.

Partnerships have worked well where there are established relationships and trust and when power dynamics are minimized. How can an action team identify promising models and practices for successful collaborations where competition is not the main dynamic? How can they establish shared values, goals, etc. to further collaboration?

4. What will it take to create the conditions necessary to attract, support, and retain new talent? Build upon existing models to support emerging evaluators of color and identify practices for ongoing guidance and mentorship.

Recruiting, hiring, and retaining talent to fit evaluation and learning needs in philanthropy is challenging due to education and training programs as well as changing expectations in the field. How can we leverage and build on existing programs (e.g., AEA Graduate Education Diversity Internship, Leaders in Equitable Evaluation and Diversity, etc.) to increase the pipeline, and support ongoing retention and professional development?

Overall, we are delighted to see that there is much enthusiasm in our field to do more work on these issues. We look forward to launching action teams in early 2019 to further flesh out the ideas shared above in addition to others generated over the past year.

If you are interested in learning more about this effort, please contact Pilar Mendoza. If you would like to join the network and receive updates about this work, please contact Christine Kemler.

--Clare Nolan and Meg Long

Data Fix: Do's & Don'ts for Reporting Geographic Area Served
November 1, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This is the second post in a series intended to improve the data available for and about philanthropy.

The first post in our Data Fix series focused on areas that may seem straightforward but often cause confusion, including recipient location data. But don’t confuse recipient location (where the check was sent) with Geographic Area Served (the area meant to benefit from the funding). Data on recipient location, one of our required fields, allows us to match data to the correct organization in our database, ensuring accuracy for analyses or data visualizations. In contrast, Geographic Area Served, one of our highest priority fields, helps us tell the real story about where your funding is making an impact.

How to Report Geographic Area Served

We recognize that providing data on Geographic Area Served can be challenging. Many funders may not track this information, and those who do may depend on grantees or program staff to provide the details. It’s important to keep in mind that sharing some information is better than no information, as funders are currently the only source of this data.

Do include details for locations beyond the country level. For example, for U.S. locations, specify a state along with providing geo area served at the city or county level. For non-U.S. locations, include the country name when funding a specific city, province, state or region.

Don’t be too broad in scope. “Global Programs” may not be accurate if your work is focused on specific countries. Similarly, listing the geo area served as “Canada” is misleading if the work is serving the province of “Quebec, Canada” rather than the entire country.

Do use commas to indicate hierarchy and semi-colons to separate multiple areas served. For example:

  • Topeka, Kansas (comma used to indicate hierarchy)
  • Hitchcock County, Nebraska; Lisbon, Portugal; Asia (semi-colons used to list and separate multiple locations)
Don’t use negatives or catch-all terms. “Not California,” “Other,” “Statewide” or “International” may be meaningful within your organization, but these terms cannot be interpreted for mapping. Instead of “Statewide,” use the name of the state. Instead of “International,” use “Global Programs” or list the countries, regions, or continent being served.

Do define regions. If you are reporting on geo area served at the regional level (e.g. East Africa), please provide a list of the countries included in your organization’s definition of that region. Your definition of a region may differ from that of Foundation Center. Similarly, if your foundation defines its own regions (Southwestern Ohio), consider including the counties comprising that region.

Don’t forget to include the term “County” when reporting on U.S. counties. This will ensure your grant to an entire county isn’t assigned to the same named city (e.g. Los Angeles County, California, rather than Los Angeles, California).
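The comma-and-semicolon convention above is straightforward to handle programmatically. As a minimal illustrative sketch (not an official Foundation Center tool), a few lines of Python can split a Geographic Area Served field into its individual locations and place hierarchies:

```python
def parse_geo_area_served(field: str):
    """Split a Geographic Area Served field into locations.

    Semi-colons separate multiple areas served; commas indicate
    the place hierarchy within a single area (e.g. city, state).
    """
    locations = []
    for area in field.split(";"):
        # Each area is a comma-separated hierarchy, e.g. "Topeka, Kansas"
        parts = [p.strip() for p in area.split(",") if p.strip()]
        if parts:
            locations.append(parts)
    return locations

# "Hitchcock County, Nebraska; Lisbon, Portugal; Asia" yields three
# locations: two two-level hierarchies and one standalone region.
areas = parse_geo_area_served("Hitchcock County, Nebraska; Lisbon, Portugal; Asia")
```

A parser like this is also a handy self-check: if a field comes back as one giant single-element list, the commas and semi-colons were probably swapped.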

Geographic Area Served in Foundation Center Platforms

Data provided (in a loadable format) will appear in “Grant Details” in Foundation Directory Online (FDO) and Foundation Maps. Foundation Maps, including the complimentary eReporter map showing your own foundation’s data, also displays an Area Served mapping view.


If data is not provided, Foundation Center will do one of the following:

  • Default to the location of the recipient organization
  • Add geo area served based on text in the grant description
  • Add geo area served based on where the recipient organization works, as listed on their website or in their mission statement, if this information is available in our database

Responsibly Sharing Geographic Area Served


Although our mission is to encourage transparency through the sharing of grants data, we acknowledge there are contexts in which sharing this data may be cause for concern. If the publishing of this data increases risks to the population meant to benefit from the funding, the grantee/recipient, or your own organization, you can either omit Geographic Area Served information entirely or report it at a higher, less sensitive level (e.g. country vs. province or city). For more information on this topic, please see Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts and Sharing Data Responsibly: A Conversation Guide for Funders.

More Tips to Come!

I hope you have a better understanding of how to report Geographic Area Served through eReporting. Without this data, valuable information about where funding is making a difference may be lost! Moving forward, we’ll explore the required fields of Recipient Name and Grant Description. If you have any questions, please feel free to contact me.

-- Kati Neiheisel

New Guide Helps Human Rights Funders Balance Tension between Risk & Transparency
October 25, 2018

Julie Broome is the Director of Ariadne, a network of European donors that support social change and human rights.  

Tom Walker is the Research Manager at The Engine Room, an international organisation that helps activists and organisations use data and technology effectively and responsibly.

Julie Broome

Foundations find themselves in a challenging situation when it comes to making decisions about how much data to share about their grantmaking. On the one hand, in recognition of the public benefit function of philanthropy, there is a demand for greater transparency on the part of funders and a push to be open about how much they are giving and who they are giving it to. These demands sometimes come from states, increasingly from philanthropy professionals themselves, and also from critics who believe that philanthropy has been too opaque for too long and raise questions about fairness and access. 

At the same time, donors who work in human rights and on politically charged issues are increasingly aware of the risks to grantees if sensitive information ends up in the public domain. As a result, some funders have moved towards sharing little to no information. However, this can have negative consequences in terms of our collective ability to map different fields, making it harder for us all to develop a sense of the funding landscape in different areas. It can also serve to keep certain groups “underground,” when in reality they might benefit from the credibility that foundation funding can bestow.

Tom Walker

As the European partners in the Advancing Human Rights project, led by the Human Rights Funders Network and Foundation Center, Ariadne collects grantmaking data from our members that feeds into this larger effort to understand where human rights funding is going and how it is shifting over time. Unlike in the United States, where the IRS 990-PF form eventually provides transparency about grantee transactions, there is no equivalent data source in Europe. Yet many donors find grant activity information useful in finding peer funders and identifying potential gaps in the funding landscape where their own funds could make a difference. We frequently receive requests from donors who want to use these datasets to drill down into specific areas of interest and map out different funding fields. But these data sources will become less valuable over time if donors move away from voluntarily sharing information about their grantmaking.

Nonetheless, the risks to grantees if donors share information irresponsibly are very real, especially at a time when civil society is increasingly under threat from both state and non-state actors.  It was in the interest of trying to balance these two aims – maintaining sufficient data to be able to analyse trends in philanthropy while protecting grantees – that led Ariadne to partner with The Engine Room to create a guide to help funders navigate these tricky questions.

After looking at why and how funders share data and the challenges of doing so responsibly, The Engine Room interviewed 8 people and surveyed 32 others working in foundations that fund human rights organisations, asking how they shared data about their grants and highlighting any risks they might see.

Funders told us that they felt treating data responsibly was important, but that implementing it in their day-to-day work was often difficult. It involved balancing competing priorities: between transparency and data protection legislation; between protecting grantees’ data and reporting requirements; and between protecting grantees from unwanted attention, and publicising stories to highlight the benefits of the grantee’s work.

The funders we heard from said they found it particularly difficult to predict how risks might change over time, and how to manage data that had already been shared and published. The most common concerns were:

  • ensuring that data that had already been published remained up to date;
  • de-identifying data before it was published; and
  • working with third parties, such as donors who fund through intermediaries and may request information about the intermediaries’ grantees, to share data about grantees responsibly.

Although the funders we interviewed differed in their mission, size, geographical spread and focus area, they all stressed the importance of respecting the autonomy of their grantees. Practically, this meant that additional security or privacy measures were often introduced only when the grantee raised a concern. The people we spoke with were often aware that this reactive approach puts the burden of assessing data-related risks onto grantees, and suggested that they most needed support when it came to talking with grantees and other funders in an open, informed way about the opportunities and risks associated with sharing grantee data.

These conversations can be difficult ones to have. So, we tried a new approach: a guide to help funders have better conversations about responsible data.

It’s aimed at funders or grantmakers who want to treat their grantees’ data responsibly, but don’t always know how. It lists common questions that grantees and funders might ask, combined with advice and resources to help answer them, and tips for structuring a proactive conversation with grantees.

“There are no shortcuts to handling data responsibly, but we believe this guide can facilitate a better process.”

There are no shortcuts to handling data responsibly, but we believe this guide can facilitate a better process. It offers prompts that are designed to help you talk more openly with grantees or other funders about data-related risks and ways of dealing with them. The guide is organised around three elements of the grantmaking lifecycle: data collection, data storage, and data sharing.

Because contexts and grantmaking systems vary dramatically and change constantly, a one-size-fits-all solution is impossible. Instead, we decided to offer guidance on processes and questions that many funders share – from deciding whether to publish a case study to having conversations about security with grantees. For example, one tip that would benefit many grantmakers is to ensure that grant agreements include specifics about how the funder will use any data collected as a result of the grant, based on a discussion that helps the grantee to understand how their data will be managed and make decisions accordingly.

This guide aims to give practical advice that helps funders strengthen their relationships with grantees - thereby leading to more effective grantmaking. Download the guide, and let us know what you think!

--Julie Broome and Tom Walker

Philanthropy and Democracy: Bringing Data to the Debate
October 18, 2018

Anna Koob is a manager of knowledge services for Foundation Center.

As money and politics become increasingly intertwined, the enduring debate around the role of philanthropy in a democratic society has taken on new life in recent months (see here, here, here, and here for prominent examples).

One side of the debate sees the flexibility of foundation dollars as a part of the solution to strengthen struggling democratic institutions. Others contend that foundations are profoundly undemocratic and increasingly powerful institutions that bypass government channels to shape the country--and world--to their will. Regardless of where you stand, a practical starting point is to learn more about what grantmakers are actually doing to affect democracy in these United States.

While foundations are required by law to avoid partisan and candidate campaigning, these limitations still leave plenty of room for foundations to engage with democracy in other ways.

Which funders are working on voter access issues? How much money is dedicated to civic engagement on key issues like health or the environment? Which organizations are receiving grants to increase transparency in government? Foundation Funding for U.S. Democracy offers a free public resource to get at the answers to such questions.

Browse More Than 57,000 Democracy Grants

Launched in 2014 by Foundation Center and updated regularly, Foundation Funding for U.S. Democracy’s data tool currently includes over 57,000 grants awarded by more than 6,000 funders totaling $5.1 billion across four major categories: campaigns and elections, civic participation, government strengthening, and media.

The tool offers a look at the big picture through dashboards on each of these categories, and also allows you to browse granular grants-level information.  Interested in understanding:

  • The largest funders of campaigns and elections work?
  • Grantmaking in support of civic participation, broken down by population type?
  • The strategies used to affect democracy work?

To paraphrase the slogan of Apple, there’s a dashboard (and underlying data tool) for that!

The site also features a collection of research on U.S. democracy, powered by IssueLab, links to a number of relevant blog posts, and hosts infographics we’ve developed using data from the tool.

What Does the Data Tell Us About Philanthropic Support for Democracy?

Less than two percent of all philanthropic funding in the United States meets our criteria for democracy funding, which includes efforts by foundations to foster an engaged and informed public and support government accountability and integrity, as well as funding for policy research and advocacy. It’s a modest amount considering that this subset captures a wide range of topics, including money in politics, civic leadership development, civil rights litigation, and journalism training. Some findings from the data rise to the top:

  1. Funding for campaigns and elections is the smallest of the four major funding categories tracked. While most people might think of elections as the basic mechanism of democracy, this category only constitutes about 12 percent of democracy funding represented in the tool. Civic participation and government strengthening vie for being the largest category, each accounting for about 38 percent of total democracy funding, and relevant media funding accounts for 28 percent. (Note that grants can be counted in multiple categories, so totals exceed 100 percent.)
  2. Less than a quarter of funding supports policy and advocacy work. While work to affect policy is often considered front and center when discussing philanthropy’s impact on democracy, the data tool reveals that many funders are working to strengthen democracy in other ways. Supporting civics education for youth, bolstering election administration, strengthening platforms for government accountability, and funding investigative journalism appear as examples of grantmaking areas that strengthen democracy, but have less direct implications for public policy.
  3. Funder interest in the census and the role of media in democracy is increasing. Given the turbulence of the last couple of years in the U.S. political system and amid calls for greater philanthropic involvement in strengthening democracy, what changes have we seen in giving patterns? Well, with the caveat that there is a lag between the time when grants are awarded and when we receive that data (from 990 tax forms or direct reporting by foundations), based on reports added to IssueLab and news items posted on Philanthropy News Digest, we are seeing evidence that funders are rallying around some causes to strengthen democratic institutions, including efforts to ensure representativeness in the 2020 census and support for research on media consumption and digital disinformation.

Why Should Funders be Transparent about Their Democracy Work?

Appeals for data sharing in philanthropy often center around the common good -- detailed data helps to inform authentic conversations among grantmakers, nonprofits, and other stakeholders around who’s funding what, and where. But this field is focused on shaping the nature of our democracy and represents funding from both sides of the ideological divide -- including, for example, grantmaking in support of the American Legislative Exchange Council (“dedicated to the principles of limited government, free markets and federalism”) alongside grants awarded to organizations like the Center for American Progress (“dedicated to improving the lives of all Americans, through bold, progressive ideas”) -- so democracy funders tend to be especially cautious about publicizing their work and opening themselves up to increased scrutiny and criticism.

But the reality is that foundation opacity undermines credibility and public trust. Precisely because of criticism about the lack of democracy in philanthropy, foundations should demonstrate intentional transparency and show that they are living their values as democracy funders. Foundations also find that, particularly in a space that’s rife with speculation, there’s a benefit to shaping your own narrative and describing what you do in your own words. It may not make you immune to criticism, but it shows that you have nothing to hide.

How Funders Can Actively Engage: Submitting Grants Data

Grants data in the platform is either reported directly to Foundation Center via our eReporter program or sourced via publicly available 990 tax forms. While we’re able to get our data-eager hands on foundation grants either way, we prefer sourcing them directly from funders as it lends itself to more recent data -- particularly valuable in the current, fast-paced ‘democracy in crisis’ era -- and more detailed grant descriptions.

To submit your most recent grants (we’re currently collecting grants awarded in 2017), become an eReporter! Export a list of your most recent grants data in a spreadsheet (all grants - not limited to those relevant to democracy), review the data to make sure there’s no sensitive information and everything is as you’d like it to appear, and email your report to egrants@foundationcenter.org. Submit data as often as you’d like, but at least on an annual basis.
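Reviewing an export before emailing it can catch the most common gaps. The sketch below is a hypothetical pre-flight check, not part of the eReporter program; the column names in `REQUIRED` are illustrative assumptions and should be matched to the actual reporting template your foundation uses:

```python
import csv

# Illustrative column names only -- align these with the real template.
REQUIRED = {"Recipient Name", "Recipient City", "Grant Description",
            "Amount", "Fiscal Year"}

def check_ereport(path: str):
    """Return a list of problems found in a grants CSV export."""
    problems = []
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        missing = REQUIRED - set(reader.fieldnames or [])
        if missing:
            problems.append(f"missing columns: {sorted(missing)}")
        for i, row in enumerate(reader, start=2):  # row 1 is the header
            for col in REQUIRED - missing:
                if not (row.get(col) or "").strip():
                    problems.append(f"row {i}: empty {col}")
    return problems
```

Running a check like this before each submission also helps with the sensitive-information review mentioned above, since it forces a pass over every row of the spreadsheet.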

Bringing Tangible Details to Abstract Discussions

At Foundation Center, we often tout data’s ability to help guide decision making about funding and general resource allocation. And that’s a great practical use case for the philanthropic data that we collect -- whether for human rights, ocean conservation funding, the Sustainable Development Goals, or democracy. At a time of increased foundation scrutiny, this publicly-available platform can also provide some transparency and concrete details to broaden discussions. What have foundations done to strengthen democracy? And how might they best contribute in these politically uncertain times? For examples, look to the data.

Have questions about this resource? Contact us at democracy@foundationcenter.org.

--Anna Koob

Data Fix: Do's and Don'ts for Data Mapping & More!
October 3, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This post is part of a series intended to improve the data available for and about philanthropy.

As many of you know, Foundation Center was established to provide transparency for the field of philanthropy. A key part of this mission is collecting, indexing, and aggregating millions of grants each year. In recent years this laborious process has become more streamlined thanks to technology, auto-coding, and to those of you who directly report your grants data to us. Your participation also increases the timeliness and accuracy of the data.

Today, over 1,300 funders worldwide share grants data directly with Foundation Center. Over the 20 years we've been collecting this data, we've encountered some issues concerning the basic fields required. To make sharing data even quicker and easier, we've put together some do's and don'ts focusing on three areas that may seem straightforward, but often cause confusion.

Location Data for Accurate Mapping and Matching

Quite simply, to map your grants data we need location information! And we need location information for more than mapping. We also use this information to ensure we are matching data to the correct organizations in our database. To help us do this even more accurately, we encourage you to provide as much location data as possible. This also helps you by increasing the usability of your own data when running your own analyses or data visualizations.

Do supply Recipient City for U.S. and non-U.S. Recipients.

Don't forget to supply Recipient Address and Recipient Postal Code, if possible.

Do supply Recipient State for U.S. Recipients.

Don't supply a post office box in place of a street address for Recipient Address, if possible.

Do supply Recipient Country for non-U.S. Recipients.

Don't confuse Recipient location (where the check was sent) with Geographic Area Served (where the service will be provided). 

What's Your Type? Authorized or Paid?

Two types of grant amounts can be reported: Authorized amounts (new grants authorized in a given fiscal year, including the full amount of grants that may be paid over multiple years) or Paid amounts (as grants would appear in your IRS tax form). You can report on either one of these types of amounts – we just need to know which one you are using: Authorized or Paid.

Do indicate if you are reporting on Authorized or Paid amounts.

Don't send more than one column of Amounts in your report – either Authorized or Paid for the entire list.

Do remain consistent from year to year with sending either Authorized amounts or Paid amounts to prevent duplication of grants.

Don't forget to include Grant Duration (in months) or Grant Start Date and Grant End Date, if possible.

Do report the type of Currency of the amount listed, if not US Dollars.

Don't include more than one amount per grant.

The Essential Fiscal Year

An accurate Fiscal Year is essential since we publish grants data by fiscal year in our data-driven tools and content-rich platforms such as those developed by Foundation Landscapes, including Funding the Ocean, SDG Funders, Equal Footing and Youth Giving. Fiscal Year can be reported with a year (2018) or date range (07/01/2017-06/30/2018), but both formats will appear in published products as YEAR AWARDED: 2018.

Do include the Fiscal Year in which the grants were either Authorized or Paid by you, the funder.

Don't provide the Fiscal Year of the Recipient organization.

Do format your Fiscal Year as a year (2018) or a date range (07/01/2017-06/30/2018).

Don't forget, for off-calendar fiscal years, the last year of the date range is the Fiscal Year: 07/01/2017-06/30/2018 = 2018.
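The off-calendar rule above is easy to encode. As a minimal sketch (again, an illustration rather than an official tool), this helper normalizes either accepted format to the published YEAR AWARDED value:

```python
import re

def fiscal_year(value: str) -> int:
    """Normalize a reported Fiscal Year to the published YEAR AWARDED.

    Accepts either a bare year ("2018") or a MM/DD/YYYY date range
    ("07/01/2017-06/30/2018"); for a range, the year of the end date
    is the fiscal year.
    """
    value = value.strip()
    if re.fullmatch(r"\d{4}", value):
        return int(value)
    # Date range: the fiscal year is the year of the end date.
    start, end = value.split("-")
    return int(end.strip().split("/")[-1])

fiscal_year("2018")                   # → 2018
fiscal_year("07/01/2017-06/30/2018")  # → 2018 (off-calendar year)
```

Either way the same year comes out, which is exactly why both formats can appear identically as YEAR AWARDED: 2018.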

More Tips to Come!

I hope you have a better understanding of these three areas of data to be shared through Foundation Center eReporting. Moving forward, we'll explore the required fields of Recipient Name and Grant Description, as well as high priority fields such as Geographic Area Served. If you have any questions, please feel free to contact me. Thank you! And don't forget, the data you share IS making a difference!

-- Kati Neiheisel

“Because It’s Hard” Is Not an Excuse – Challenges in Collecting and Using Demographic Data for Grantmaking
August 30, 2018

Melissa Sines is the Effective Practices Program Manager at PEAK Grantmaking. In this role, she works with internal teams, external consultants, volunteer advisory groups, and partner organizations to articulate and highlight the best ways to make grants – Effective Practices. A version of this post also appears in the PEAK Grantmaking blog.

For philanthropy to advance equity in all communities, especially low-income communities and communities of color, it needs to be able to understand the demographics of the organizations being funded (and declined), the people being served, and the communities impacted. That data should be used to assess practices and drive decision making.

PEAK Grantmaking is working to better understand and build the capacity of grantmakers for collecting and utilizing demographic data as part of their grantmaking. Our work is focused on answering four key questions:

  • What demographic data are grantmakers collecting and why?
  • How are they collecting these demographic data?
  • How is demographic data being used and interpreted?
  • How can funders use demographic data to inform their work?

In the process of undertaking this research, we surfaced a lot of myths and challenges around this topic that prevent our field from reaching the goal of being accountable to our communities and collecting this data for responsible and effective use.

Generally, about half of all grantmakers are collecting demographic data either about the communities they are serving or about the leaders of the nonprofits they have supported. For those who reported that they found the collection and use of this data to be challenging, our researcher dug a little deeper and asked about the challenges they were seeing.

Some of the challenges that were brought to the forefront by our research were:

Challenge 1: Fidelity and Accuracy in Self-Reported Data
Data, and self-reported data in particular, will always be limited in its ability to tell the entire story and capture the nuance necessary for understanding. Many nonprofits, especially small grassroots organizations, lack the capability or capacity to collect and track data about their communities. In addition, white-led nonprofits may fear that a lack of diversity at the board or senior staff level will be judged harshly by grantmakers.

Challenge 2: Broad Variations in Taxonomy
Detailed and flexible identity data can give a more complete picture of the community, but this flexibility works against data standardization. Varying taxonomies, across sectors or organizations, can make it difficult to compare and contrast data. It can also be a real burden if the nonprofit applying for a grant does not collect demographic data in the categories that a grantmaker is using. This can lead to confusion about how to report this data to a funder.

Challenge 3: Varying Data Needs Across Programs
Even inside a single organization, different programs may be collecting and tracking different data, as program officers respond to needs in their community and directives from senior leadership. Different strategies or approaches to a problem demand different data. For instance, an arts advocacy program may be more concerned with constituent demographics and impact, while an artist’s program will want to know about demographics of individual artists.

Challenge 4: Aggregating Data for Coalitions and Collaborations
This becomes even more complex when coalitions and collaborative efforts bring together numerous organizations, or programs inside different organizations, to accomplish a single task. The aforementioned challenges are compounded as more organizations, different databases, and various taxonomies try to aggregate consistent demographic data to track impact on specific populations.

These are all very real challenges, but they are not insurmountable. Philanthropy, if it puts itself to the task, can tackle these challenges.

Some suggestions from our report to get the field started include:

  • Don’t let the perfect be the enemy of the good. Pilot systems for data collection, then revisit them to ensure that they are working correctly, meeting the need for good data, and serving the ultimate goal of tracking impact.
  • Fund the capacity of nonprofits to collect good data and to engage in their own diversity, equity, and inclusion efforts.
  • Engage in a conversation – internally and externally – about how this data will be collected and how it will be used. If foundation staff and the nonprofits they work with understand the need for this data, they will more willingly seek and provide this information.
  • For coalitions and collaborative efforts, it may make sense to fund a backbone organization that takes on this task (among other administrative or evaluation efforts) in support of the collective effort.
  • Work with your funding peers – in an issue area or in a community – to approach this challenge in a way that will decrease the burden on nonprofits and utilize experts that may exist at larger grantmaking operations.
  • Support field-wide data aggregators, like GuideStar or the Foundation Center, and work alongside them as they try to collect and disseminate demographic data about the staff and boards at nonprofits and the demographics of communities that are being supported by grantmaking funds.

Grantmakers have the resources and the expertise to begin solving this issue and to share their learning with the entire field. To read more about how grantmakers are collecting and using demographic data, download the full report.

--Melissa Sines

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be
    directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a
    guest contributor, contact:
    glasspockets@foundationcenter.org
