Transparency Talk


Is the Environmental Movement Still #SoWhite? Learning from the 2019 Green 2.0 Transparency Report Card
March 12, 2020

Whitney Tome

Whitney Tome is the executive director of Green 2.0, which advocates for improved diversity, equity, and inclusion in the mainstream environmental movement.  

As environmental disasters intensify, from the recent wildfires in Australia and California to increasingly powerful tropical storms, environmental work takes on heightened urgency. We know that crises such as wildfires, rising sea levels, and poor water and air quality disproportionately impact people of color and vulnerable communities, so it’s important that the movement for improving our environment be accessible, welcoming, and open to all.

Since its inception in 2014, Green 2.0 has pioneered accountability measures for the #DiversifyGreen movement writ large. Through our annual Transparency Report Cards, we’ve exposed some of the worst actors within the top 40 environmental NGOs and foundations while praising those who’ve demonstrated true commitments to diversity with their hiring practices. Our work has been instrumental in putting the spotlight on the glaring diversity issues within the environmental movement, and as a consequence, we’ve seen folks make substantive progress.

Though the diversity statistics for 2019 are encouraging, it is far too early to declare victory. Some of the top foundations and organizations in this space that claim to be major, influential players perpetuate a double standard—asking their grantees for their data and equity efforts while not providing their own.

This kind of hypocrisy is not just a glaring weakness; it is an obstacle to the kind of progress and impact these organizations seek to make.

"Opportunity, accountability, and intentionality are three pillars that funders and nonprofits alike must stand on."

Let us be clear—opportunity, accountability, and intentionality are three pillars that funders and nonprofits alike must stand on. Environmental leaders cannot afford to lose sight of the significance of diversity at a time when this movement needs greater unity and coordination of resources than ever before. There is too much at stake. Especially for our most vulnerable communities.

Inaction is inexcusable. And data can move people to action. This is why we publish these diversity statistics each year. With the critical support, leadership, and thought-partnership of GuideStar by Candid and Dr. Stefanie K. Johnson, our report cards and data analysis are produced with great care and rigor, because these organizations, like every organization, must be held accountable.

Based on our 2019 findings, we urge leaders in the environmental movement to adopt the following recommendations:

  1. More organizations in the funder sector of the movement need to report their data. As it stands, so few foundations have reported that Dr. Stefanie K. Johnson simply could not make an apples-to-apples comparison of which sector is excelling more rapidly. It is clear that NGOs excel in reporting data and are making strides, and while we assume foundations are making less progress due to lack of commitment to even report data, we simply cannot know for sure. What is clear is that data reporting signals external commitment and reinforces internal resolve to remove barriers to diversity that exist in persistently white organizations.
  2. Leaders must be thoughtful about how the opportunity to diversify manifests differently at different levels of their organizations. For example, while senior staff numbers have increased slightly in this year’s report, leaders have to consider whether that is sustainable if C-Suite professionals stay longer and their organizations are not expanding the number of senior staff positions. When senior positions do open, pushing search professionals to deliver truly diverse slates is an urgent necessity, and underscores the importance of having good data to back up the need. Evidence for the importance of tracking demographic data and using it to advocate for greater inclusion can be seen in the growing diversity of boards noted in this year’s report.
  3. Listen to young people. As we’ve seen, despite their lack of representation in the public sphere, young people are already building separate lanes of influence on climate change. Their leadership, messaging, and organizing strategies are noticeably more inclusive and racially diverse than the institutions that comprise the wider movement. They are nimble and rapidly responsive, in part, because they are the communities they are trying to save.

    "Inaction is inexcusable. And data can move people to action."

While we have faith that the longstanding, mainstream environmental movement will challenge itself to push the envelope on inclusivity, we implore the recalcitrant organizations to step forward and pledge to do better today. Not tomorrow. Not next year. Because many brown and Black communities just don’t have the time.

When Numbers Fall Short: The Challenge of Measuring Diversity in a Global Context
January 16, 2020

Bama Athreya

Bama Athreya is the Gender and Social Inclusion Advisor at the C&A Foundation, a corporate foundation committed to making fashion a force for good and transforming the industry to be more sustainable and provide decent livelihoods.  

At C&A Foundation we believe many of the challenges we seek to tackle are rooted in social exclusion. We are on a journey to deepen our approach to gender justice, diversity, equity, and inclusion. As part of our own effort to learn, we recently undertook a demographic survey of our 60+ employees worldwide to find out how “diverse” we are as an organization and what it might imply for our efforts to create an equitable organization. It was a first for us and we learned far more than the numbers alone revealed.

The process itself was both eye-opening and humbling. It forced us to reflect on what really matters for our global organization when it comes to diversity and it revealed some of our own implicit biases.

"We believe many of the challenges we seek to tackle are rooted in social exclusion."

We worked with US-based consultants to prepare the survey—covering age, sexual orientation, gender identity, nationality, disability, race, religion, and educational status. Unknowingly, the very act of selecting these categories imposed a US-centric world view, particularly with respect to our understanding of race and ethnicity.

For example, the category “Latinx” was used in the initial survey; this category is very relevant in the US, but reductive in Latin America, confusing in Europe, and irrelevant in South Asia. An important category for Europe—Roma—was not available for selection.

So we tried again, re-surveying our country offices in an attempt to create meaningful country-specific data. This proved far more useful in revealing what we should be considering as we seek to foster an inclusive workplace culture.

In Brazil, for example, race is a very salient concept and we are developing a much stronger understanding of why power dynamics around race may be the single most important thing we can address in that context. Less than half the Brazilian population is white, yet political and economic structures are predominantly controlled by whites.

In Mexico, we need to consider the significant proportion of indigenous people and “mestizos” (mixed ethnicity). Although Mexicans of European descent are the minority there, they too remain a dominant political and economic class. In India, race itself is a problematic construct. Instead, caste discrimination has played a powerful role in reinforcing social group dominance and oppression for centuries. A dizzying array of ethno-linguistic groups suggests diversity but masks the real and sometimes violent social exclusion based on caste and religion. While historically disadvantaged “scheduled” castes and tribes make up around 25 percent of India’s population, they are significantly under-represented in the country’s economic life.

And throughout South Asia, religion is a political and social flashpoint. This applies to Bangladesh, a majority Muslim country where Hindus and Christians face increasing sectarian violence, as well as India, where, as recent events show, laws and policies excluding Muslims reflect rising Hindu nationalism.

Since C&A Foundation always aims to be open and transparent, it is our practice to openly share what we learn from our research, and this exercise was no exception. However, in the end, due to the importance of country and cultural context, the only demographic categories we felt were appropriate to include in our annual report were gender, disability, and migration status. Age is another context-neutral category we might report globally in the future. But for our 60 staff people spread across the world, we realized that inclusive hiring, promotion and retention policies needed to do more than just look at the numbers, even for these categories.

So what did we learn, and what do we suggest to other foundations undertaking similar surveys?

First, generic global surveys aren’t the best way to tackle region-specific diversity and inclusion challenges. Instead, start with a social inclusion assessment that looks at the local context. Who has power? Who is marginalized? From there you can craft context-specific demographic questions for your employees or your partners.

Lesson two: don’t just play the numbers game. With, at most, a dozen staff in any given country office, we found there is limited value in trying to add them all up to some global statistic on diversity. However, it is important to look at who’s not present in your workplace. For example, in Brazil, we’ve taken affirmative steps to recruit more Afro-Brazilians by hiring a consultancy specialized in searching for Afro-Brazilian professionals. And we are looking carefully at how to create more inclusive workplaces for people with disabilities across all of our country offices. For us, this kind of targeting does more to address diversity than a broad-brush effort.

"It is important to look at who’s not present in your workplace."

Finally, another value of this approach is that you lead by example for your grantees, since you likely ask them to provide you with their own demographic data. Just as we recognize the limitations of what we do with this data, we can understand and respect the variety of approaches that our grantees may take to tackle their own specific diversity, equity and inclusion challenges. At C&A Foundation we see our efforts to address inequality as another means to encourage our local grantees to prioritize and embrace their own equity and inclusion agendas. This is where our broader influence may lie—and offers a further compelling reason to continue our own internal journey.

 

In 2020, C&A Foundation’s work in fashion will become part of Laudes Foundation, a new, independent foundation designed to support brave initiatives that inspire and challenge industry to harness its power for good. The organization will work both to influence capital so that investment encourages good business practices and through industry to tackle its deep and systemic challenges.

Laudes Foundation is a part of the Brenninkmeijer family enterprise, next to the COFRA businesses and the family’s other private philanthropic activities, including Porticus, Good Energies Foundation, and Argidius Foundation.

Candid Announces Inaugural #OpenForGood Award Winners
May 30, 2019

Janet Camarena is director of transparency initiatives at Candid.

This post is part of the GlassPockets #OpenForGood series, done in partnership with the Fund for Shared Insight. The series explores new tools, promising practices, and inspiring examples showing how some foundations are opening up what they are learning for the benefit of the larger philanthropic sector. Contribute your comments on each post and share the series using #OpenForGood. View more posts in the series.

#OpenForGood awardees and committee members, left to right: Meg Long, President, Equal Measure (#OpenForGood selection committee); Janet Camarena, Director, Transparency Initiatives, Candid; Awardee Savi Mull, Senior Evaluation Manager, C&A Foundation; Awardee Veronica Olazabal, Director, Measurement, Evaluation & Organizational Performance, The Rockefeller Foundation; Clare Nolan, Co-Founder, Engage R + D (#OpenForGood selection committee).

Yesterday as part of the Grantmakers for Effective Organizations Learning Conference, Candid announced the inaugural recipients of the #OpenForGood Award, which is designed to recognize and encourage foundations to openly share what they learn so we can all get collectively smarter. The award, part of a larger #OpenForGood campaign started in 2017, includes a set of tools to help funders work more transparently including a GrantCraft Guide about how to operationalize knowledge sharing, a growing collection of foundation evaluations on IssueLab, and advice from peers in a curated blog series.

The three winning foundations each demonstrate an active commitment to open knowledge and share their evaluations through IssueLab, an open repository that is free, searchable, and accessible to all. Winners were selected by an external committee from a globally sourced nomination process; the committee reviewed the contenders looking for evidence of an active commitment to open knowledge, creative approaches to making knowledge shareable, field leadership, and the incorporation of community insights into knowledge-sharing work.

And the Winners Are…

Here are some highlights from the award presentation remarks:

C&A Foundation
Award Summary: Creativity, Demonstrated Field Leadership, and Willingness to Openly Share Struggles

The C&A Foundation is a multi-national, corporate foundation working to fundamentally transform the fashion industry. C&A Foundation gives its partners financial support, expertise, and networks so they can make the fashion industry work better for every person it touches. Lessons learned and impact for each of its programs are clearly available on its website, and helpful top-level summaries are provided for every impact evaluation, making lengthy narrative evaluations accessible to peers, grantees, and other stakeholders. C&A Foundation even provides such summaries for efforts that didn’t go as planned, packaging them in an easy-to-read, graphic format that it shares via its Results & Learning blog, rather than hiding them away and quietly moving on, as is more often the case in the field.

The Ian Potter Foundation
Award Summary: Creativity, Field Leadership, and Lifting Up Community Insights

This foundation routinely publishes collective summaries from all of its grantee reports for each portfolio as a way to support shared learning among its existing and future grantees. It’s a refreshing reinvention of the traditional grantee report, placing priority on collecting and sharing the kinds of information that will be helpful to other practitioners, rather than on questions that satisfy the typical grant-report ritual: submit, enter the data, file away never to be seen, and repeat.

Beyond being transparent with its grantee learning and reports, the Ian Potter Foundation also recently helped lift the burden on its grantees when it comes to measurement and outcomes. Instead of asking overworked charities to invent a unique set of metrics just for their grant process, foundation evaluation staff took it upon themselves to mine the Sustainable Development Goals targets framework to provide grantees with optional and ready-made outcomes templates that would work across the field for many funders. You can read more about that effort underway in a recent blog post here.

The Rockefeller Foundation
Award Summary: Field Leadership, Consistent Knowledge Sharing, and Commitment to Working Transparently

The Rockefeller Foundation can boast early-adopter status when it comes to transparency and openness—it has had a longstanding commitment to creating a culture of learning and as such was one of the very first foundations to join the GlassPockets transparency movement and to commit to #OpenForGood principles by sharing its published evaluations widely. The Rockefeller Foundation also took the unusual step of upping the ante on the #OpenForGood pledge, aiming to create a culture of both learning and accountability, with its monitoring and evaluation team stating: “To ensure that we hold ourselves to a high bar, our foundation pre-commits itself to publicly sharing the results of its evaluations - well before the results are even known.” This ensures that even if an evaluation reports unfavorable findings, the intent is to share it all.

In an earlier GlassPockets blog post, Rockefeller’s monitoring and evaluation team shows a unique understanding of how sharing knowledge can advance the funder’s goals: “Through the documentation of what works, for who, and how/under what conditions, there is potential to amplify our impact, by crowding-in other funders to promising solutions, and diverting resources from being wasted on approaches that prove ineffectual.”  Rockefeller’s use of IssueLab’s open knowledge platform is living up to this promise as anyone can currently query and find more than 400 knowledge documents funded, published, or co-published by the Rockefeller Foundation.

Winners will receive technical support to create a custom Knowledge Center for their foundation or for a grantee organization, as well as promotional support in knowledge dissemination. Knowledge Centers are a service of IssueLab that provides organizations with a simple way to manage and share knowledge on their own websites. By leveraging this tool, you can showcase your insight, promote analysis on your grantees, and feature learnings from network members. All documents that are uploaded to an IssueLab Knowledge Center are also made searchable and discoverable via systems like WorldCat, which serves more than 2,000 libraries worldwide, ensuring your knowledge can be found by researchers, regardless of their familiarity with your organization.

Why Choose Openness?

The #OpenForGood award is focused on inspiring foundations to use existing and emerging technologies to collectively improve the sector. Today, we live in a time when most people expect to find the information they need on the go, via tablets, laptops, and mobile phones, just a swipe or click away. Despite this digital-era reality, today only 13 percent of foundations have websites, and even fewer share their reports publicly, indicating that the field has a long way to go toward creating a culture of shared learning. With this award, we hope to change these practices. Rather than reinvent the wheel, this award and campaign encourage the sector to make it a priority to learn from one another and share content with a global audience, so that we can build smartly on one another’s work and accelerate the change we want to see in the world. The more you share your foundation's work, the greater the opportunities to make all our efforts more effective and farther reaching.

Congratulations to our inaugural class of #OpenForGood Award Winners! What will you #OpenForGood?

--Janet Camarena

Transparency: One Small Step for Funders, One Giant Leap for Equity
May 9, 2019

Genevieve Boutilier is a Program Associate at the Peace and Security Funders Group.

This post also appears in the Alliance blog.

Genevieve Boutilier

In order to solve a problem, one must first identify its parameters. This applies, too, to the philanthropic sector; to that end, many of us are pushing for greater transparency in our field. For example, Candid teamed up with a hundred foundations to make public their grants data, assets, policies, and procedures through the GlassPockets initiative, while our funder affinity group colleagues at PEAK Grantmaking and the Transparency and Accountability Initiative advocate for greater transparency with their members. At the Peace and Security Funders Group, we push for transparency through our Peace and Security Funding Index.

For the past five years, the Index has chronicled thousands of grants awarded by hundreds of peace and security funders to get a better sense of who and what gets funded in this sector. This data is useful for understanding the landscape of peace and security funding, including by identifying funding gaps and new funders; however, it has its limits. In the hot-off-the-press 2019 Index, we make the case for how improving this data benefits funders. But beyond benefitting funders, improving the data greatly benefits grantees and the communities they serve, which – in a virtuous cycle – increases funder effectiveness.

On the most basic level, better data gives grantseekers insight into a foundation’s priorities. This allows grantees to more easily identify foundations with similar missions, making space for grantees to spend less time fundraising and more time focusing on their missions – be it fighting for indigenous rights, preventing nuclear war, or helping child soldiers reintegrate into their communities. This opens the door for more open, honest, and equitable relationships between foundations and the grantees they support, which is essential for impactful grantmaking.

But simply understanding who and what gets funded is only the start of the conversation. It’s time to take the conversation to the next level.

By definition, peace and security funders decide who gets a chance at peace by how they award grants. They are the guardians of crucial resources and enormous wealth, and they get to decide how much, how, and when it’s allocated. This is an incredible amount of power. With this power comes the responsibility to engage in the work in ways that center the needs of communities on the frontlines of some of the globe’s greatest challenges.

With timely, more detailed data, this sector can start to answer the tough questions that experts like Edgar Villanueva and Vu Le have been asking: Why are certain regions, issues, and strategies underfunded? Why are certain populations prioritized over others? Why isn't awarding general operating support increasing, especially given the ample evidence that suggests that it’s a best practice? Why are certain kinds of grantees passed over for funding?

“We aren’t collecting data for data’s sake—we’re hoping to transform this sector for the better.”

For our part, we aren’t collecting data for data’s sake—we’re hoping to transform this sector for the better.

To this end, we encourage all funders to start asking the tough questions about their grantmaking, and to increase their knowledge and understanding of equity in the philanthropic sector. Funders can begin to do this in three straightforward ways. First, submit detailed data about your grantmaking to Candid. We at the Peace and Security Funders Group (PSFG) are encouraging our 59 members – who represent a vast majority of the funding in the peace and security field – to submit their detailed 2018 grants data by June 30, 2019, so that we can improve the utility of the Peace and Security Funding Index. Second, funders can join their peers – including a handful of PSFG members – in becoming members of the Justice Funders network; here, they can listen and learn from each other and experts. Finally, funders should assess their own grantmaking practices. Ask yourself, ‘How could I change grantmaking practices to become more transparent and more equitable?’

There are countless other resources to help funders engage, so if you’re stuck and not sure where to go, we at PSFG can try to point you in the right direction.

--Genevieve Boutilier

Don’t “Ghost” Declined Applicants: The Ins and Outs of Giving Applicant Feedback
April 4, 2019

Mandy Ellerton joined the [Archibald] Bush Foundation in 2011, where she created and now directs the Foundation's Community Innovation programs. The programs allow communities to develop and test new solutions to community problems, using approaches that are collaborative and inclusive of people who are most directly affected by the problem.

GlassPockets Road to 100

This post is part of our “Road to 100 & Beyond” series, in which we are featuring the foundations that have helped GlassPockets reach the milestone of 100 published profiles by publicly participating in the “Who Has GlassPockets?” self-assessment. This blog series highlights reflections on why transparency is important, how openness evolves inside foundations over time, promising practices in transparency, helpful examples, and lessons learned.

I’ve often thought that fundraising can be as bad as dating. (Kudos to you lucky few who have had great experiences dating!) Lots of dates, lots of dead ends, lots of frustrating encounters before you (maybe) find a match. All along the way you look for even the smallest sign to indicate that someone likes you. “They laughed at my joke!” or, in the case of fundraising, “they seemed really excited about page five of last year’s impact report!” Not to mention the endless time spent doing online searches for shreds of information that might be useful. This reality is part of the reason why Bush Foundation was proud to be among the first 100 foundations to participate in GlassPockets. We believe that transparency and opening lines of communication are critical to good grantmaking, because both in dating and in fundraising, it can be heartbreaking and crazymaking to try and sort out whether you have a connection or if someone’s “just not that into you.” If only there were a way to just “swipe left” or “swipe right” and make everything a little simpler.

“We believe that transparency and opening lines of communication are critical to good grantmaking.”

I’m not proposing a Tinder for grantmaking (nor should anyone, probably, although hat tip to Vu Le for messing with all of us and floating the idea on April Fool’s Day). But over the past several years, Bush Foundation’s Community Innovation program staff has used a system to provide feedback calls for declined applicants, in the hopes of making foundation fundraising a little less opaque and crazymaking. We use the calls to be transparent and explain why we made our funding decisions. The calls also help us live out our “Spread Optimism” value because they allow us to help and encourage applicants and potentially point them to other resources. This is all part of our larger engagement strategy, described in “No Moat Philanthropy.”

 

Mandy Ellerton

How Feedback Calls Work

We use a systematic approach for feedback calls:

  • We proactively offer the opportunity to sign up for feedback calls in the email we send to declined applicants.
  • We use a scheduling tool (after trying a couple of different options, we’ve landed on Slotted, which is relatively cheap and easy to use) and offer a variety of times for feedback calls every week. Collectively, five Community Innovation Team members hold about an hour a week for feedback calls. The calls typically last about 20 minutes. We’ve found this is about the right amount of time so that we can offer feedback calls to most of the declined applicants who want them.
  • We prepare for our feedback calls. We re-read the application and develop an outline for the call ahead of time.
  • During the call we offer a couple of reasons why we declined the application. We often discuss what an applicant could work on to strengthen their project and whether they ought to apply again.
  • We also spend a lot of time listening; sometimes these calls can understandably be emotional. Grant applications are a representation of someone’s hopes and dreams and sometimes your decline might feel like the end of the road for the applicant. But hang with them. Don’t get defensive. However hard it might feel for you, it’s a lot harder for the declined applicant. And ultimately, hard conversations can be transformative for everyone involved. I will say, however, that most of our feedback calls are really positive exchanges.
  • We use anonymous surveys to evaluate what people think of the feedback calls, and during each call we ask whether the applicant has any feedback for us on improving our programs and grantmaking process.
  • We train new staff on how to do feedback calls. We have a staff instruction manual on how to do feedback calls, but we also have new team members shadow more seasoned team members for a while before they do a feedback call alone.

 

What’s Going Well

The feedback calls appear to be useful for both declined applicants and for us:

  • In our 2018 surveys, respondents (n=38) rated the feedback calls highly. They gave the calls an average rating of 6.1 (out of 7) for overall helpfulness, 95% said the calls added some value or a lot of value, and 81.2% said they had a somewhat better or much better understanding of the programs after the feedback call.
  • We’ve seen the number of applications for our Community Innovation Grant and Bush Prize for Community Innovation programs go down over time and we’ve seen the overall quality go up. We think that’s due, in part, to feedback calls that help applicants decide whether to apply again and that help applicants improve their projects to become a better fit for funding in the future.
  • I’d also like to think that doing feedback calls has made us better grantmakers. First, it shows up in our selection meetings. When you might have to talk to someone about why you made the funding decision you did, you’re going to be even more thoughtful in making the decision in the first place. You’re going to hew even closer to your stated criteria and treat the decision with care. We regularly discuss what feedback we plan to give to declined applicants in the actual selection meeting. Second, in a system that has inherently huge power differentials (foundations have all of it and applicants have virtually none of it), doing feedback calls forces you to come face to face with that reality. Never confronting the fact that your funding decisions impact real people with hopes and dreams is a part of what corrupts philanthropy. Feedback calls keep you a little more humble.

 

What We’re Working On

We still have room to improve our feedback calls:

  • We’ve heard from declined applicants that they sometimes get conflicting feedback from different team members when they apply (and get declined) multiple times; 15% of survey respondents said their feedback was inconsistent with prior feedback from us. Cringe. That definitely makes fundraising more crazymaking. We’re working on how to have more staff continuity with applicants who have applied multiple times.
  • We sometimes struggle to determine how long to keep encouraging a declined applicant to improve their project for future applications versus saying more definitively that the project is not a fit. Yes, we want to “Spread Optimism,” but although it never feels good for anyone involved, sometimes the best course of action is to encourage an applicant to seek funding elsewhere.

I’m under no illusions that feedback calls are going to fix the structural issues with philanthropy and fundraising. I welcome that larger conversation, driven in large part by brave critiques of philanthropy emerging lately like Decolonizing Wealth, Just Giving and Winners Take All. In the meantime, fundraising, as with dating, is still going to have moments of heartache and uncertainty. When you apply for a grant, you have to be brave and vulnerable; you’re putting your hopes and dreams out into a really confusing and opaque system that’s going to judge them, perhaps support them, or perhaps dash them, and maybe even “ghost” them by never responding. Feedback calls are one way to treat those hopes and dreams with a bit more care.

--Mandy Ellerton

GlassPockets Announces New Transparency Levels: Leveling Up Your Practices
March 28, 2019

Janet Camarena is director of transparency initiatives at Candid.

Janet Camarena

It's an exciting moment for us here at GlassPockets, and for the field of philanthropy, as we’ve just reached the milestone of 100 foundations committing to work more transparently by participating and publicly sharing their “Who Has GlassPockets?” transparency self-assessment profiles on our website. Yesterday, the Walton Family Foundation (WFF) officially became our 100th participant. What you are seeing today is the result of a diligent process that started last summer, as WFF continually worked to improve the openness of its website. With clear pathways to connect directly with staff members, a knowledge center containing lessons learned as well as packaged “flashcards” containing easily shareable bits of information, and a new searchable grants database spanning its 31-year history, WFF is not starting small when it comes to openness. Transparency can be tricky territory for family foundation donors who may be more accustomed to privacy and anonymity when it comes to their giving, so it’s particularly exciting for us to reach the milestone of 100 published profiles thanks to a family foundation enthusiastically embracing a more transparent approach.

When we started with a handful of foundations and fewer than two dozen transparency indicators, it was more experiment than movement. Now that we’ve aggregated data on transparency trends among 100 participating foundations, it’s a good opportunity to pause and reflect on what we are learning from this data that could inform the way forward to a more transparent future for philanthropy.

Transparency Indicators Evolve


Earlier this year I observed a promising trend in the field: more foundations are developing sections of their websites devoted to explaining how they work, what values they hold dear, and in some cases, how these values inform their work and operations. Among the 100 foundations that have taken and publicly shared their transparency assessments, 42 percent are now using their websites as a means to communicate values or policies that demonstrate an intentional commitment to transparency. As a result, we recently added transparency values/policies as a formal indicator to our GlassPockets assessment. But once a foundation has developed such a values or policy statement, how does it live up to it?

That’s where we hope our “Who Has GlassPockets?” assessment will continue to help foundations create a roadmap to transparency. The assessment is not static and has evolved with the field. When we started in 2010, there were 23 transparency indicators based on an inventory of thousands of foundation websites. As we continue to observe website transparency trends, the assessment has now grown to 27 indicators. Aside from the newest indicator for transparency values/policies, based on the kinds of information that foundations are now starting to share, some other new indicators we added since inception are strategic plans, open licensing policies, and use of the Sustainable Development Goals (SDGs) framework. And we expect that as the field continues to evolve, this list of indicators will grow as well.

As the list has grown longer, foundations frequently ask us which indicators are the right ones to start with. Some also tell us that they want to participate, but not until they have at least half or even three-quarters of the indicators on the list. Though we applaud striving to be more transparent, GlassPockets was never intended as a “one-size-fits-all” approach, nor did we ever expect a majority of the indicators to be in place before a foundation participates. Rather, the intent is that the GlassPockets exercise surface transparency as a priority, help the foundation evolve its transparency over time, and ideally become a process the institution revisits on a regular basis, updating its GlassPockets profile with more and more indicators as transparency improves.

New Transparency Levels and Badges

So to help foundations better understand how to get started and how to grow transparency practices over time, we analyzed the data we have been collecting, and some patterns about how transparency evolves in philanthropy are now becoming clearer. We also conducted advisor interviews with a number of GlassPockets participants to better understand what would be most motivational and helpful in this regard. After reviewing everything we’ve learned so far, we have identified three levels through which foundations pass as they chart their course to greater transparency – these represent core, advanced, and champion-level transparency practices that you can view on this chart.

Explore how the Transparency Indicators relate to each level

Core-level transparency practices represent data most commonly shared by participating foundations and are the best place for new participants to begin. Advanced-level transparency practices open up the way you work to the world and represent information shared by about 50 to 70 percent of participating foundations. Champion-level transparency practices, in place at fewer than half of participating foundations, represent information-sharing that is pushing existing boundaries of foundation transparency.

These new levels are an optional guide that can be helpful to follow, but they are not intended to be viewed as a formal set of requirements. As has always been the case, any foundation at any stage of its transparency journey is welcome to participate and chart its own course. However, to motivate participation and progress, GlassPockets will begin awarding Transparency Badges based on the transparency level attained. These badges will appear on the GlassPockets profile, and will also be made available for use on the foundation’s website. Since this is not a one-size-fits-all system, all participating foundations will automatically receive the Core GlassPockets transparency badge, and those who attain the Advanced (10-18 indicators) or Champion level (19 or more indicators) will receive a badge denoting the appropriate designation.
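
To make these thresholds concrete, here is a minimal, hypothetical sketch in Python of how a count of publicly shared indicators (out of the current 27) might map to a badge level, using the tiers described above; the function and its names are illustrative only and not part of any GlassPockets tool.

    def badge_level(indicator_count: int, is_participant: bool = True) -> str:
        """Illustrative mapping of a foundation's count of publicly shared
        transparency indicators (out of 27) to a GlassPockets badge tier."""
        if not is_participant:
            return "none"          # badges apply only to participating foundations
        if indicator_count >= 19:  # Champion level: 19 or more indicators
            return "Champion"
        if indicator_count >= 10:  # Advanced level: 10-18 indicators
            return "Advanced"
        return "Core"              # every participant receives at least the Core badge

    # Example: a foundation sharing 14 of the 27 indicators earns the Advanced badge.
    print(badge_level(14))  # -> Advanced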

Learn About the Transparency Badges

On the Level

Based on the new levels described above, GlassPockets will soon be adding the new Transparency Badges to each profile. So, if it’s been a while since you reviewed your “Who Has GlassPockets?” profile, or if you’re looking for motivation to improve your transparency, now’s the time to review your existing profile, or submit a new one to see how your foundation stacks up. For existing GlassPockets participants, May 28th is the deadline to review your profile and get any updates or changes in to us before we start making the transparency levels and badges visible on the GlassPockets website the week of June 3rd. To update your profile, you can fill out any new links or corrections on this submission form, or simply email me your changes. As always, new profiles can be added at any time and you can learn more about that process here.

And last, but certainly not least, big thanks and cheers to our existing GlassPockets participants for helping us reach this milestone, and a big welcome to those who will help us reach the next one!

-- Janet Camarena

A New Year, a New Transparency Indicator: Coming Soon—Transparency Values & Policies
January 3, 2019

Janet Camarena is director of transparency initiatives at Foundation Center.

When GlassPockets started nine years ago, it was rare to find any reference to transparency in relation to philanthropy or foundations. Most references to transparency at the time concerned nonprofits or governments, seldom philanthropy. When we set out to create a framework to assess foundation transparency, the “Who Has GlassPockets?” criteria were based on an inventory of current foundation practices, meaning there were no indicators on the list that were not being shared somewhere by at least a few foundations. Not surprisingly, given the lack of emphasis on foundation transparency, there were few mentions of it as a policy or even as a value in the websites we reviewed, so it didn’t make sense at the time to include it as a formal indicator.

A lot has changed in nine years, and it’s clear now from reviewing philanthropy journals, conferences, and yes, even foundation websites that awareness about the importance of philanthropic transparency is on the rise. Among the nearly 100 foundations that have taken and publicly shared “Who Has GlassPockets?” transparency assessments, more than 40 percent are now using their websites as a means to communicate values or policies that aim to demonstrate an intentional commitment to transparency. Another encouraging signal, demonstrating that how the work is done is as important as what is done, is that many foundations now publish statements on new “How We Work” pages outlining not just what they do, but how they aim to go about it. These statements can be found among funders of all types, including large, small, family, and independent foundations.

We want to encourage this intentionality around transparency, so in 2019 we are adding a new transparency indicator asking whether participating foundations have publicly shared values or policies committing themselves to working openly and transparently. In late January the “Who Has GlassPockets?” self-assessment and profiles will be updated reflecting the new addition. Does your foundation’s website have stated values or policies about its commitment to transparency? If not, below are some samples we have found that may serve as inspiration for others:

  • The Barr Foundation’s “How We Work” page leads with an ethos stating “We strive to be transparent, foster open communication, and build constructive relationships.” It elaborates further on the field-building potential of this stance: “We aim to be open and transparent about our work and to contribute to broader efforts that promote and advance the field of philanthropy.”

  • The Samuel N. and Mary Castle Foundation’s Mission and Core Values page articulates a long list of values that “emerge from the Foundation’s long history,” including a commitment to forming strategic alliances, working honestly, “showing compassion and mutual respect among grantmakers and grantees,” and ties its focus on transparency to a commitment to high standards and quality: “The Foundation strives for high quality in everything it does so that the Foundation is synonymous with quality, transparency and responsiveness.”

  • The Ford Foundation’s statement connects its transparency focus to culture, values around debate and collaboration, and a commitment to accountability: “Our culture is driven by trust, constructive debate, and leadership that empowers innovation and excellence. We strive to listen and learn and to model openness and transparency. We are accountable to each other at the foundation, to our charter, to our sector, to the organizations we support, and to society at large—as well as to the laws that govern our nonprofit status.”

  • An excerpt from the Bill and Melinda Gates Foundation’s “Information Sharing Approach” page emphasizes collaboration, peer learning, and offers an appropriately global view: “Around the world, institutions are maximizing their impact by becoming increasingly transparent. This follows a fundamental truth: that access to information and data fosters effective collaboration. At the foundation, we are embracing this reality through a continued commitment to search for opportunities that will help others understand our priorities better and what supports our decision making. The foundation is also committed to helping the philanthropic sector develop the tools that will increase confidence in our collective ability to address tough challenges around the world…..We will continually refine our approach to information sharing by regularly exploring how we increase access to important information within the foundation, while studying other institutional efforts at transparency to learn lessons from our partners and peers.”

  • The Walter and Elise Haas Fund connects its transparency focus to its mission statement, and its transparency-related activities to greater effectiveness: “Our ongoing commitment to transparency is a reflection of our mission — to build a healthy, just, and vibrant society in which people feel connected to and responsible for their community. The Walter & Elise Haas Fund shares real-time grants data and champions cross-sector work and community cooperation. Our grantmaking leverages partnerships and collaborations to produce results that no single actor could accomplish alone.”

  • The William and Flora Hewlett Foundation’s statement emphasizes the importance of transparency in creating a culture of learning: “The foundation is committed to openness, transparency and learning. While individually important, our commitments to openness, transparency, and learning jointly express values that are vital to our work. Because our operations—both internal and external—are situated in complex institutional and cultural environments, we cannot achieve our goals without being an adaptive, learning organization. And we cannot be such an organization unless we are open and transparent: willing to encourage debate and dissent, both within and without the foundation; ready to share what we learn with the field and broader public; eager to hear from and listen to others. These qualities of openness to learning and willingness to adjust are equally important for both external grantmaking and internal administration.”

These are just a few of the examples GlassPockets will have available when the new indicator is added later this month. Keep an eye on our Twitter feed for updates.

Happy New Year, Happy New Transparency Indicator!

--Janet Camarena

Living Our Values: Gauging a Foundation’s Commitment to Diversity, Equity, and Inclusion
November 29, 2018

Mona Jhawar serves as learning and evaluation manager for The California Endowment.

The California Endowment (TCE) recently wrapped up our 2016 Diversity, Equity, and Inclusion (DEI) Audit, our fourth since 2008. The audit was initially developed at a time when community advocates were pushing the foundation to address issues of structural racism and inequity. As TCE’s grantmaking responded, staff and our CEO were also interested in promoting DEI values across the entire foundation beyond programmatic spaces. Over time, these values became increasingly engrained in TCE’s ethos and the foundation committed to conducting a regular audit as a vehicle with which to determine if and how our DEI values were guiding organizational practice.

Sharing information about our DEI Audit often raises questions about how to launch such an effort. Some colleagues are in the early stages of considering whether they want to carry out an audit of their own. Are we ready? What do we need to have in place to even begin to broach this possibility? Others are interested to hear about how we use the findings from such an assessment. To help answer these questions, this is the first of a two-part blog series to share the lessons we’re learning by using a DEI audit to hold ourselves accountable to our values.

While the audit provides a frame to identify if our DEI values are being expressed throughout the foundation, it also fosters learning. Findings are reviewed and discussed with executive leadership, board, and staff. Reviews provide venues to involve both programmatic and non-programmatic staff in DEI discussions. An audit workgroup typically considers how to take action on findings so that the foundation can continuously improve and also considers how to revise audit goals to ensure forward movement. By sharing findings publicly, we hope our experience and lessons can help to support the field more broadly.

It is, however, no small feat. The audit is a comprehensive process that includes a demographic survey of staff and board, a staff and board survey of DEI attitudes and beliefs, interviews with key foundation leaders, an examination of available demographic data from grantee partners, and a review of DEI-related documents gathered between audits. Having dedicated resources to engage a neutral outsider to carry out the audit in partnership with the foundation is also important to this process. We’ve found it particularly helpful to engage a consistent, trusted partner, Social Policy Research Associates, over each of our audits to capture and candidly reflect where we’re making progress and where we need to work harder to create change.

As your foundation considers its own readiness to engage in such an audit process, we offer the following factors that have facilitated a productive and learning-oriented DEI audit effort at TCE:

1. Clarity about the fundamental importance of Diversity, Equity, and Inclusion to the Foundation

The expression of our DEI values has evolved over time. When the audit started, several program staff members who focused on DEI and cultural competency developed a guiding statement on Diversity and Inclusiveness. Located within our audit report, it focused heavily on diversity although tweaks were made to the statement over time. A significant shift occurred several years ago when our executive team articulated a comprehensive set of core values that undergirds all our work and leads with a commitment to diversity, equity, and inclusion.

2. Interest in reflection and adaptation

The audit is a tool for organizational learning that facilitates continuous improvement. The process relies on having both a growth mindset and clear goals for what we hope to accomplish. Our 13 goals range from board engagement to utilizing accessibility best practices. In addition to examining our own goals, the audit shares how we’re doing with respect to a framework of institutional supports required to build a culture of equity. By comparing the foundation to itself over time we can determine if and where change is occurring. It also allows us to revise goals so that we can continue to push ourselves forward as we improve, or to course correct if we’re not on track. We anticipate updating our goals before our next audit to reflect where we are currently in our DEI journey.

3. Engagement of key leaders, including staff

Our CEO is vocal and clear about the importance of DEI internally and externally, as well as about the significance of conducting the audit itself. Our executive team, board, and CEO all contribute to the audit process and are actively interested in reviewing and discussing its findings.

Staff engagement is critical throughout audit implementation, reflection on findings, and action planning as well. It’s notable that the vast majority of staff at all levels feel comfortable pushing the foundation to stay accountable to DEI internally. However, a small but growing percentage (23%) of staff report feeling uncomfortable raising DEI concerns in the workplace, suggesting an area for greater attention.

4. Capacity to respond to any findings

Findings are not always going to be comfortable. Identifying areas for improvement may put the organization and our leaders in tough places. TCE has historically convened a cross departmental workgroup to consider audit findings and tackle action planning. We considered co-locating the audit workgroup within our executive leadership team to increase the group’s capacity to address audit findings. However, now we are considering whether it would be best situated and aligned within an emerging body that will be specifically focused on bringing racial equity to the center of all our work.

5. Courage and will to repeat

In a sector with limited accountability, choosing to voluntarily and publicly examine foundation practices takes real commitment and courage. It’s always great to hear where we’re doing well, but committing to a process that also raises multiple areas where we need to put more attention requires deep will to repeat on a regular basis. And we do so in recognition that this is long-term, ongoing work that, in the absence of a real finish line, requires us to continuously adapt as our communities evolve.

Conducting our DEI audit regularly has strengthened our sense of where our practice excels—for example, in our grantmaking, our strong vision and authorizing environment, and the diversity of our staff and board. It’s also strengthened our sense of the ways we want to improve, such as developing a more widely shared DEI analysis and trainings for all staff, as well as continuing to strengthen data collection among our partners. The value of our DEI audit lies equally in considering findings and in serving as a springboard for prioritizing action. TCE has been on this road a long time and we’ll keep at it for the foreseeable future. As our understanding of what it takes to pursue diversity, equity, and inclusion internally and externally sharpens, so will the demands on our practice. Our DEI audit will continue to ensure that we hold ourselves to these demands. In my next post, we’ll take a closer look at what we’re learning about operationalizing equity within the foundation.

--Mona Jhawar

What Does It Take to Shift to a Learning Culture in Philanthropy?
November 20, 2018

Janet Camarena is director of transparency initiatives at Foundation Center.

This post also appears in the Center for Effective Philanthropy blog.

If there was ever any doubt that greater openness and transparency could benefit organized philanthropy, a new report from the Center for Effective Philanthropy (CEP) about knowledge-sharing practices puts it to rest. Besides making a case for the need for greater transparency in the field, the report also provides some hopeful signs that, among foundation leaders, there is growing recognition of the value of shifting to a culture of learning to improve foundations’ efforts.

Understanding & Sharing What Works: The State of Foundation Practice reveals how well foundation leaders understand what is and isn’t working in their foundation’s programs, how they figure this out, and what, if anything, they share with others about what they’ve learned. These trends are explored through 119 survey responses from, and 41 in-depth interviews with, foundation CEOs. A companion series of profiles tells the story of these practices in the context of four foundations that have committed to working more openly.

Since Foundation Center’s launch of GlassPockets in 2010, we have tracked transparency around planning and performance measurement within the “Who Has GlassPockets?” self-assessment. Currently, of the nearly 100 foundations that have participated in GlassPockets, only 27 percent publicly share any information about how they measure their progress toward institutional goals. Given this lack of knowledge sharing, we undertook a new #OpenForGood campaign to encourage foundations to publicly share published evaluations through the IssueLab open archive.

As someone who has spent the last decade examining foundation transparency practices (or the lack thereof) and championing greater openness, I read CEP’s findings with an eye for elements that might help us better understand the barriers and catalysts to this kind of culture shift in the field. Here’s what I took away from the report.

Performance Anxiety

While two-thirds of foundation CEOs in CEP’s study report having a strong sense of what is working programmatically within their foundations, nearly 60 percent report having a weaker grasp on what is not working. This raises the question: If you don’t know something is broken, then how do you fix it? Since we know foundations have a tendency to be success-oriented, this by itself wasn’t surprising. But it’s a helpful metric that proves the point that investing in evaluation, learning, and sharing can only lead to wiser use of precious resources for the field as a whole.

The report also reveals that many CEOs who have learned what is not working well at their foundations are unlikely to share that knowledge, as more than one-third of respondents cite hesitancy around disclosing missteps and failures. The interviews and profiles point to what can best be described as performance anxiety. CEOs cite the need for professionals to show what went well, fear of losing the trust of stakeholders, and a desire to impress their boards as motivations for concealing struggles. Of these motivations, board leadership seems particularly influential for setting the culture when it comes to transparency and failure.

In the profiles, Rockefeller Brothers Fund (RBF) President Stephen Heintz discusses both the importance of his board and his background in government as factors that have informed RBF’s willingness to share the kinds of information many foundations won’t. RBF was an early participant in GlassPockets, and now is an early adopter of the #OpenForGood movement to openly share knowledge. As a result, RBF has been one of the examples we often point to for the more challenging aspects of transparency such as frameworks for diversity data, knowledge sharing, and investment practices.

An important takeaway of the RBF profile is the Fund’s emphasis on the way in which a board can help ease performance anxiety by simply giving leadership permission to talk about pain points and missteps. Yet one-third of CEOs specifically mention that their foundation faces pressure from its board to withhold information about failures. This sparks my interest in seeing a similar survey asking foundation trustees about their perspectives in this area.

Utility or Futility?

Anyone who works inside a foundation, or anyone who has ever applied for a grant from one, will tell you they are buried in a paperwork load that often feels futile (which actually spawned a whole other worthy movement led by PEAK Grantmaking called Project Streamline). In the CEP study, the majority of foundation CEOs report that most of the standard sources of knowledge they require are not very useful to them. Site visits were most consistently ranked highly, with the majority of CEOs (56 percent) pointing to them as one of the most useful sources for learning about what is and isn’t working. Grantee focus groups and convenings came in a distant second, with only 38 percent of CEOs reporting these as a most useful source. And despite the labor involved on both sides of the table, final grant reports were ranked as a most useful source for learning by only 31 percent of CEOs.

“Thanks to CEP’s research, we have evidence of real demand for a greater supply of programmatic knowledge.”

If most foundations find greater value in higher-touch methods of learning, such as meeting face-to-face or hosting grantee gatherings, then perhaps this is a reminder that reducing the burdens of their own bureaucracies and streamlining application and reporting processes would leave more time for learning from community and stakeholder engagement.

The companion profile of the Weingart Foundation, another longtime GlassPockets participant, shows the benefits of funders making more time for grantee engagement, and provides a number of methods for doing so. Weingart co-creates its learning and assessment frameworks with grantees, routinely shares all the grantee feedback it receives from its Grantee Perception Report (GPR), regularly makes time to convene grantees for shared learning, and also pays grantees for their time in helping to inform Weingart’s trustees about the problems it seeks to solve.

Supply and Demand

One of the questions we hear most often about #OpenForGood’s efforts to build an open, collective knowledge base for the field is whether anyone will actually use this content. This concern also surfaces in CEP’s interviews, with a number of CEOs citing the difficulty of knowing what is useful to share as an impediment to openness. A big source of optimism here is learning that a majority of CEOs report their decisions are often informed by what other foundations are learning. In other words, foundations can rest assured that if they supply knowledge about what is and isn’t working, the demand is there for that knowledge to make a larger impact beyond their own foundation. Think of all that untapped potential!

Of course, given the current state of knowledge sharing in the field, only 19 percent of CEOs surveyed report having quite a bit of knowledge about what’s working at peer foundations, and just 6 percent report having quite a bit of knowledge about what’s not working among their programmatic peers. Despite this dearth of knowledge, fully three-quarters of foundation CEOs still report that they use what they can access from peers to inform strategy and direction within their own foundations.

Thanks to CEP’s research, we have evidence of real demand for a greater supply of programmatic knowledge. Now there is every reason for knowledge sharing to become the norm rather than the exception.

--Janet Camarena

Data Fix: Do's & Don'ts for Reporting Geographic Area Served
November 1, 2018

Kati Neiheisel is the eReporting liaison at Foundation Center. eReporting allows funders to quickly and easily tell their stories and improve philanthropy by sharing grants data.

This is the second post in a series intended to improve the data available for and about philanthropy.

The first post in our Data Fix series focused on areas that may seem straightforward but often cause confusion, including recipient location data. But don’t confuse recipient location (where the check was sent) with Geographic Area Served (the area meant to benefit from the funding). Data on recipient location, one of our required fields, allows us to match data to the correct organization in our database, ensuring accuracy for analyses or data visualizations. In contrast, Geographic Area Served, one of our highest priority fields, helps us tell the real story about where your funding is making an impact.

How to Report Geographic Area Served

We recognize that providing data on Geographic Area Served can be challenging. Many funders may not track this information, and those who do may depend on grantees or program staff to provide the details. It’s important to keep in mind that sharing some information is better than no information, as funders are currently the only source of this data.

Do include details for locations beyond the country level. For example, for U.S. locations, specify the state along with the city or county served. For non-U.S. locations, include the country name when funding a specific city, province, state, or region.

Don’t be too broad in scope. “Global Programs” may not be accurate if your work is focused on specific countries. Similarly, listing the geographic area served as “Canada” is misleading if the work serves the province of Quebec rather than the entire country.

Do use commas to indicate hierarchy and semicolons to separate multiple areas served (a short illustrative sketch follows these guidelines). For example:

  • Topeka, Kansas (comma used to indicate hierarchy)
  • Hitchcock County, Nebraska; Lisbon, Portugal; Asia (semicolons used to list and separate multiple locations)

Don’t use negatives or catch-all terms. “Not California,” “Other,” “Statewide,” or “International” may be meaningful within your organization, but these terms cannot be interpreted for mapping. Instead of “Statewide,” use the name of the state. Instead of “International,” use “Global Programs” or list the countries, regions, or continents being served.

Do define regions. If you report geographic area served at the regional level (e.g., East Africa), provide a list of the countries included in your organization’s definition of that region; your definition may differ from Foundation Center’s. Similarly, if your foundation defines its own regions (e.g., Southwestern Ohio), consider including the counties comprising that region.

Don’t forget to include the term “County” when reporting on U.S. counties. This will ensure your grant to an entire county isn’t assigned to the same-named city (e.g., Los Angeles County, California, rather than Los Angeles, California).
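For funders who assemble their eReporting files programmatically, the comma and semicolon conventions above are easy to encode. The sketch below is purely illustrative and is not part of Foundation Center’s eReporting tooling; the function names and the list of catch-all terms are hypothetical.

```python
# Minimal, illustrative sketch (not Foundation Center tooling): building a
# Geographic Area Served value with commas for hierarchy and semicolons
# between multiple areas served.

# Catch-all terms that cannot be interpreted for mapping (hypothetical list).
CATCH_ALL_TERMS = {"other", "statewide", "international", "not california"}

def format_area(*levels: str) -> str:
    """Join hierarchy levels from most specific to least specific,
    e.g. format_area("Topeka", "Kansas") -> "Topeka, Kansas"."""
    return ", ".join(level.strip() for level in levels if level.strip())

def format_areas_served(areas: list) -> str:
    """Separate multiple areas served with semicolons, rejecting catch-all terms."""
    cleaned = []
    for area in areas:
        if area.strip().lower() in CATCH_ALL_TERMS:
            raise ValueError(
                f"'{area}' cannot be mapped; name the state, country, or region instead."
            )
        cleaned.append(area.strip())
    return "; ".join(cleaned)

# A grant benefiting a U.S. county, a non-U.S. city, and a continent:
areas = [
    format_area("Hitchcock County", "Nebraska"),
    format_area("Lisbon", "Portugal"),
    "Asia",
]
print(format_areas_served(areas))
# -> Hitchcock County, Nebraska; Lisbon, Portugal; Asia
```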

Geographic Area Served in Foundation Center Platforms

Data provided (in a loadable format) will appear in “Grant Details” in Foundation Directory Online (FDO) and Foundation Maps. Foundation Maps, including the complimentary eReporter map showing your own foundation’s data, also display an Area Served mapping view.

If data is not provided, Foundation Center will do one of the following:

  • Default to the location of the recipient organization
  • Add geo area served based on text in the grant description
  • Add geo area served based on where the recipient organization works, as listed on their website or in their mission statement, if this information is available in our database
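Purely as an illustration of how such a fallback could be expressed in code (the bullets above do not specify an order of precedence, and this sketch does not describe Foundation Center’s actual systems), a hypothetical resolver might try the reported value first and then each alternative in turn:

```python
# Illustrative sketch only; the Grant fields and lookup helpers are hypothetical
# and do not describe Foundation Center's actual data model or process.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Grant:
    recipient_location: str                # where the check was sent
    geo_area_served: Optional[str] = None  # area served, if the funder reported it
    description: str = ""

def area_from_description(text: str) -> Optional[str]:
    """Placeholder for deriving an area served from the grant description text."""
    return None  # hypothetical text analysis would go here

def area_from_recipient_profile(grant: Grant) -> Optional[str]:
    """Placeholder for looking up where the recipient works, per its website
    or mission statement, when that information is on file."""
    return None

def resolve_area_served(grant: Grant) -> str:
    """Use reported data when available; otherwise fall back. The order here is
    one plausible arrangement, not a documented sequence."""
    return (
        grant.geo_area_served
        or area_from_description(grant.description)
        or area_from_recipient_profile(grant)
        or grant.recipient_location
    )

print(resolve_area_served(Grant(recipient_location="Topeka, Kansas")))
# -> Topeka, Kansas (falls back to recipient location when nothing else is known)
```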

Responsibly Sharing Geographic Area Served

Although our mission is to encourage transparency through the sharing of grants data, we acknowledge there are contexts in which sharing this data may be cause for concern. If the publishing of this data increases risks to the population meant to benefit from the funding, the grantee/recipient, or your own organization, you can either omit Geographic Area Served information entirely or report it at a higher, less sensitive level (e.g. country vs. province or city). For more information on this topic, please see Are You Over or Under-Protecting Your Grants Data? 5 Ways to Balance Transparency and Data Protection in Sensitive Contexts and Sharing Data Responsibly: A Conversation Guide for Funders.
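Because the comma convention places the least specific level last, reporting at a higher level can be done mechanically. As a hedged illustration only (the helper below is hypothetical, not an eReporting feature), a sensitive value can be generalized by keeping just the last element of each area:

```python
# Illustrative sketch: generalize a Geographic Area Served value to its least
# specific level (the last comma-separated element of each semicolon-separated
# area) when more precise data could put beneficiaries or grantees at risk.
def generalize(areas_served: str) -> str:
    areas = [a.strip() for a in areas_served.split(";") if a.strip()]
    return "; ".join(a.split(",")[-1].strip() for a in areas)

print(generalize("Quebec, Canada; Lisbon, Portugal"))
# -> Canada; Portugal
```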

More Tips to Come!

I hope you have a better understanding of how to report Geographic Area Served through eReporting. Without this data, valuable information about where funding is making a difference may be lost! Moving forward, we’ll explore the required fields of Recipient Name and Grant Description. If you have any questions, please feel free to contact me.

-- Kati Neiheisel

About Transparency Talk

  • Transparency Talk, the Glasspockets blog, is a platform for candid and constructive conversation about foundation transparency and accountability. In this space, Foundation Center highlights strategies, findings, and best practices on the web and in foundations–illuminating the importance of having "glass pockets."

    The views expressed in this blog do not necessarily reflect the views of the Foundation Center.

    Questions and comments may be directed to:

    Janet Camarena
    Director, Transparency Initiatives
    Foundation Center

    If you are interested in being a guest contributor, contact: glasspockets@foundationcenter.org

Categories