Research and Resources

Survey – Help reform Australia’s outdated Fundraising Regulation

Fundraising reform and relief is one of the first issues that was raised with the Charities Crisis Cabinet (CCC) as Australian charities responded to COVID-19 and bushfire recovery.

Charities told us Australia’s out-of-date, not-fit-for-purpose system of fundraising regulation hindered them when they needed to be nimble, and when they most needed support.  A crisis is no time to work through seven different sets of rules and regulations just to place a ‘donate here’ button on a website.

Both the Australian Treasurer and the Royal Commission into National Natural Disaster Arrangements engaged with this issue and have sought to advance the harmonisation of fundraising regulation to establish a single national regulatory scheme.  The CCC commends the Treasurer for elevating this issue into his discussions with state and territory colleagues.

Progress against the Royal Commission’s recommendations is being tracked here, including Recommendation 21.2 – Reform fundraising laws: Australian, state and territory governments should create a single national scheme for the regulation of charitable fundraising.

The CCC and the Fixfundraising coalition will continue to push for the full implementation of the Royal Commission’s recommendation.
 
To do this we commissioned independent research to understand the speed and complexity of registration and compliance regimes that charities and NFPs are required to comply with when they seek to fundraise to support their work and their communities.  Our thanks to all charities and NFPs that participated in the research.
 
The results were released on 27 May 2021 and are available here: Fundraising-Survey-Report-Final-052021.pdf
 

Impact Investing – Making it happen

Impact investing is growing in Australia and around the world. For some charities, impact investing offers new ways of achieving their purpose with a different form of investment. There is a lot of money available for impact investing, but very few charities have investment-ready proposals.

In 2018, CCA and Life Without Barriers ran a series of CEO forums to look at what can be done to bridge the gap between investor interest and investment readiness.  We brought together expert impact investing intermediaries with leaders from across the charities sector in a frank sharing of experience, wisdom and learning.

Impact Investing – Making it happen shares the learning and recommendations from the forums.  It provides 12 recommendations for not-for-profit organisations and identifies seven areas where government should act to support capacity development and growth in the impact investment ecosystem.

We hope charity leaders and Boards find it a useful resource as they turn their mind to the potential of impact investing, and we hope that governments act to realise the promise and the potential of a robust and vibrant impact investing ecosystem.

Our thanks to the charity and not-for-profit leaders who participated in the forums, and to our partners and expert panellists: Life Without Barriers, Social Ventures Australia, Social Outcomes, Koda Capital, NAB and PwC Australia.

Is this the Australia we want? Media Release, 7 May 2019

A new national report has revealed that in many fundamental areas of life, Australia is heading in the wrong direction. 

Australia we want - Second Report

According to David Crosbie (CEO of CCA): ‘We need to look beyond economic indicators and start focusing on the values that make Australia a great place to live.  We are all much more than passengers in an economy.  We are part of families, workplaces and communities.  Within our communities we want to live lives that are worthwhile and enact values we believe in.  This report highlights how far Australia is slipping in achieving some very important values.  It should be a wake-up call for all of us.’

The Australia We Want is the first ever benchmark of how Australia and each State and Territory is performing against values and goals prioritised by leaders from across the charities sector.  These values include: just, fair, safe, inclusive, equality of opportunity, united, authentic, creative, confident, courageous, optimistic, generous, kind, and compassionate.  Drawing on key statistics from the OECD, the ABS and the AIHW to evaluate these values, this second report reveals that:

  • Australian incarceration rates are very high – twice those of most European countries, and rising (how just are we?)
  • Australian suicide rates are higher than our road toll and increasing (how inclusive are we?)
  • Australia is slipping down both the international corruption scale and the scale of international generosity (how authentic and compassionate are we?)
  • We continue to increase our CO2 emissions (how sustainable are we?)
  • The gap between how safe women and men feel walking alone at night is one of the highest in the world (how safe are we?).  
  • NSW and the NT are the worst performing jurisdictions, and Tasmania the best, in achieving communities that reflect the prioritised values.  The ACT has slipped down the rankings and WA has improved since the 2016 report.

The positive news: volunteering has increased, and Australia is above average compared with other OECD countries on equality of access to employment, education levels, and business and consumer confidence.  Housing affordability and income distribution need improvement.

David Crosbie (CEO of CCA) said: ‘The findings of this report are damning – they tell us that if we are going to live in the kind of Australia we want for ourselves and our children, we need to work at changing some of these fundamental issues.’  

#AusWeWant Solutions Forum – Education – Equal Opportunity

Building the Australia we want starts with education

CCA, with the support of partners Origin Foundation and PwC Australia, brought together 30 leaders in the first Australia We Want Solutions Forum in Melbourne on 2 August 2017. This exceptional gathering of change-makers and thought-leaders focused on the #AusWeWant measure of educational attainment, an indicator for the #AusWeWant value of equal opportunity (with relevance, of course, across other values).

The Solutions Forum begins our work in developing a policy and advocacy platform to create the Australia We Want. We have started an important conversation. A report will follow.  We look forward to sharing news and resources and together, creating the momentum for change.

Key News and Resources:

A word from our partner

Sean Barrett, Head of Origin Foundation

Last year, the Community Council for Australia launched the Australia We Want campaign.  It moved the debate on from economic rationalism to paint a picture of a compassionate, fair, inclusive, generous and innovative Australia.  The campaign touched a nerve and provoked widespread media coverage and public discussion.

In the next phase of the campaign, the CCA is taking on the challenge of translating the vision into tangible actions to make Australia a better place.

It is starting by looking at the catalytic role of education in achieving the Australia We Want.

Values

The work of the Productivity Commission in its report Deep and Persistent Disadvantage in Australia (July 2013) makes clear the pivotal role of education:

‘Education is a foundation capability. It improves a person’s employment prospects and earning capacity, and the evidence points to a relationship between education and better health and raised civic and social engagement.’

‘Civic and social engagement’ summarises many of the values enshrined in the Australia We Want.

The Productivity Commission report cited the critical areas of educational underachievement as being among children in low-SES communities, those living in regional and rural areas, and Indigenous children.

Measurement

A measure of education suggested in Social Inclusion in Australia: How Australia is Faring (Commonwealth of Australia, 2010) is:

‘Participating in schooling and completing a Year 12 or Certificate II assists people to find employment, participate in community activities and improve their wellbeing.  Therefore, it is an important indicator of social inclusion.’

Urgency

There is no time to lose. 

The future of work is changing rapidly.  Gone are the days of education and training for a single career and retirement at 65.  It is possible that some of today’s school children will live to be 100 and their working lives will span more than 65 years.  They will have six or more different careers.  More than 60% of today’s schoolchildren will eventually be employed in jobs that have yet to be created.  In these scenarios it is widely accepted that people will need at least 14 years of education.  Only the educated and adaptable will be able to survive in this jobs market.

Educational underperformance among Indigenous children is well documented, and now a demographic time bomb is emerging.  ABS projections show that soon 33% of the Indigenous population will be below the age of 14 years, compared with 18% of the non-Indigenous population.  If another generation is lost to low educational outcomes, it will create problems in the welfare, health and justice systems later on.

Second-class citizens are being created in rural and regional Australia.  Educational attainment decreases the further you move from metropolitan centres.  Children outside the cities are not getting the same educational and life opportunities as their urban counterparts.

We have entered a period in which facts and science can be ignored and replaced by the outlandish.

In such circumstances we must rethink education to help address the things that ail our society.

Call to Action

The discourse around education is largely negative.  It is recognised among social marketers that achieving change requires raising awareness of the problems – the negatives – but this must then be followed by presentation of the solutions.  Continued focus on the negatives leads people to ‘turn off’; to regard the problem as intractable.

The CCA is now challenging you, the leaders in education policy development and delivery, to move beyond the current negative discourse on what is wrong with the education system and to build on its successes.  What can we learn from the initiatives and programs that are re-engaging children in learning and helping them to fulfil their potential?  What are the two or three critical levers that will create an education agenda delivering educational advantage to all, and thereby lay the foundations for achieving the Australia We Want?

Conversely, if educational attainment is not improved the goals of the Australia We Want will not be achievable and disadvantage will worsen.  As Prof Tony Vinson explained in his landmark research Dropping Off the Edge: ‘Profiling of Australia’s most disadvantaged communities using social, health and economic indicators highlights the central importance of limited schooling in triggering and sustaining concentrated local disadvantage’.

The Australia We Want, First Report. October 2016

The Australia We Want, First Report provides the first ever benchmark of how Australia, and each State and Territory, is performing against values prioritised by leaders in the charities and not-for-profit sector.  It was released on 27 October at the National Press Club, with CCA Chair Rev Tim Costello asking us all to accept the challenge of achieving the Australia We Want.

A MESSAGE FROM CCA CHAIR, REV TIM COSTELLO

Is the Australia of today and the future we are creating, the Australia we want?

The Australia We Want, First Report asks us all to answer this question, and to act on the answers we draw from its data and findings. 

The Community Council for Australia and charity leaders from across the sector began this work at a roundtable in 2015. We were determined that debates about Australia’s future move beyond a discussion of the type of economy to be achieved, and to talk fundamentally about the society we want to live in. Australians are more than individual tax paying economic units.  Our productivity, innovation, skills and achievements are actually grounded in flourishing communities within our schools, workplaces, families and local neighbourhoods.

The Australia We Want is the first ever benchmark of how Australia and each State and Territory is performing against values and goals prioritised by leaders from across the charities sector.  These values include: just, fair, safe, inclusive, equality of opportunity, united, authentic, creative, confident, courageous, optimistic, generous, kind, and compassionate.  Drawing on key statistics from the OECD, the ABS and the AIHW to evaluate these values, the report reveals that:

  • Australian incarceration rates are high – 3 times that of Ireland and rising (how just are we?)
  • Australian suicide rates are higher than our road toll and increasing (how inclusive are we?)
  • Inequality in income distribution is higher than most other OECD countries – and growing (how fair are we?) 
  • Australians volunteer less and give less as a percentage of income than five years ago (how kind and generous are we?)
  • Australia is slipping down both the international corruption scale and the scale of international generosity (how authentic and compassionate are we?) 

The values and indicators outlined in this report provide a framework for exploring, debating and evaluating the strengths, weaknesses, opportunities and threats inherent in the way all of us act – charities, businesses, governments, communities. 

I hope that you will join me in a movement for change, and use this information in your work and your sphere of influence to help create a better Australia. We can all do more to achieve the Australia we want. 

Please also stay in touch with CCA’s work.  Our next step is to bring together sector leaders in AusWeWant roundtables later this year and next year.  If you would like details as these are scheduled, please contact CEO David Crosbie and the team at CCA: info@communitycouncil.com.au

Rev Tim Costello

Chair, Community Council for Australia

CCA acknowledges the generous support of Equity Trustees and the Centre for Social Impact in this project.

Owning Our Future – Mergers and Collaborations

In October and December 2015, CCA staged a major campaign to promote more robust discussion of mergers and collaborations. The campaign included a series of national forums and the report is a record of the discussion and feedback.

The focus of the forums was the #GoodSave case study presentation by Jayne Meyer Tucker who outlined the ideas and practicalities that drove the merger of Good Beginnings Australia and Save the Children Australia.

Views of the attendees varied, but the general mood could be summed up in the words of one: “Merger may be a dirty word for many, but that should not stop consideration of possible consortia, partnerships, and collectives, all versions of collaboration that need to be higher on our agendas”.

Owning Our Future: Better Using Our Assets Report - 7 April 2015

For over a decade the not-for-profit sector has experienced a wave of growth that outstrips any other Australian industry. Turnover has risen by 40 per cent in the last six years to above $107 billion per annum.  Assets now top $175 billion and the sector employs over one million Australians with a further five million involved as volunteers.  

‘Growth is beginning to slow,’ says CCA CEO David Crosbie. ‘Government funding and philanthropy are stalling, and there is a thick fog of uncertainty largely due to federal policy and funding indecision.’ 

Uncertainty was the key issue raised in both ProBono Australia’s annual survey of the not-for-profit sector and the PwC-CSI Community Index, with many in the sector concerned about the future of their organisation over the coming 12 months.

There is a shift towards the sector taking control of its own future by better leveraging its $175 billion in assets. CCA, PwC, Community Sector Banking, Equity Trustees, Social Ventures Australia and Origin Foundation together rolled out the first round of forums to discuss this in the CCA Owning Our Future Series. Over 100 sector leaders joined experts to work through barriers and opportunities to unlock the potential of sector assets, at both organisational and broader policy levels.

Crosbie says the report from the Better Using Our Assets series carries 14 recommendations for not-for-profits and five recommendations for government that would better position organisations to achieve their purpose and access new avenues of funding outside of government.

‘Big issues include risk management; strategy focused on organisational purpose and mission today and in five to ten years’ time; collaboration and mergers; and the need to communicate value and diversify income streams,’ said Crosbie.

‘Government has a key role in providing regulatory certainty and the policy environment to support the growth of social investment. We also need a shift in relationship with governments and bureaucracy. There are great gains to be made by cutting red-tape and working with the sector to streamline processes and achieve contracting arrangements that are performance oriented and work from a partnership base.’

‘It’s a report that we hope will stimulate thinking and discussion within organisations, within the sector and at a national policy level,’ says Crosbie.

The report will be formally released by Assistant Treasurer Josh Frydenberg at CCA’s AGM and National Roundtable on 8 April 2015. To receive a hard copy or to find out more about the CCA Owning Our Future Series, email deborahs@communitycouncil.com.au

State of the Not for Profit Sector Survey – September 2014

Providing a Voice, Performance now and over the past year

Pro Bono Australia’s State of the Sector Survey 2014 undertaken in partnership with Community Council for Australia and Netbalance Research Institute seeks to provide a voice for all those with an interest in the Not for Profit sector – especially Pro Bono Australia’s subscribers and other stakeholders that contribute to the performance of the sector.

It is not an academic study; other surveys serve that purpose. However, we have used questions that appeared in the 2013 and 2010 Election Manifesto Surveys, and the 1200+ respondents in the 2014 survey are drawn from the same pool of sector stakeholders – sector leaders and senior managers, volunteers, advisors, and clients. The survey engaged respondents from all industries and sizes of Not for Profit organisations.

The survey findings are designed to prompt discussion and offer opinions from the sector. In addition to the many numbers, we focus on what respondents have said – their concerns, their hopes and most importantly their views on how to help the sector thrive.

See CCA’s media release here: http://communitycouncil.com.au/node/187

At the launch of the survey findings: CCA Chair Tim Costello, Pro Bono Australia CEO Karen Mahlab and CCA CEO David Crosbie.

Executive Office of the President, Memorandum, 26th July 2013

EXECUTIVE OFFICE OF THE PRESIDENT
OFFICE OF MANAGEMENT AND BUDGET
WASHINGTON, D.C. 20503
THE DIRECTOR July 26, 2013

MEMORANDUM TO THE HEADS OF DEPARTMENTS AND AGENCIES

SUBJECT: Next Steps in the Evidence and Innovation Agenda
Executive Summary
The President recently asked his Cabinet to carry out an aggressive management agenda for his second term that delivers a smarter, more innovative, and more accountable government for citizens. An important component of that effort is strengthening agencies’ abilities to continually improve program performance by applying existing evidence about what works, generating new knowledge, and using experimentation and innovation to test new approaches to program delivery. This is especially important given current fiscal challenges, as our nation recovers from a deep recession and agencies face tough choices about how to meet increased demand for services in a constrained resource environment.
To help agencies move forward in harnessing evidence and evaluation, this memo:

• Provides guidance for 2015 agency Budget submissions and describes plans to prioritize Budget requests that strengthen the use of evidence and innovation.

• Invites agencies to participate in a series of workshops and interagency collaborations organized by the Executive Office of the President to help agencies develop and strengthen proposals that catalyze innovation and learning. While much of the focus will be on proposals that can be implemented without additional resources, there will be limited funding available in the President’s 2015 Budget for strong proposals that require some new funding.
Using Evidence and Innovation to Improve Government Performance
2015 Agency Budget and Performance Submissions
Agencies are encouraged to both: (1) draw on existing credible evidence in formulating their budget proposals and performance plans and (2) propose new strategies to develop additional evidence relevant to addressing important policy challenges. Agency requests are more likely to be fully funded if they show a widespread commitment to evidence and innovation.
Evidence in agency budget submissions and performance plans.
Agencies are encouraged to allocate resources to programs and practices backed by strong evidence of effectiveness while trimming activities that evidence shows are not effective. In addition, major new policy proposals, and agency performance plans, should be accompanied by a thorough discussion of existing evidence, both positive and negative, on the effectiveness of those proposals in achieving the policy objective or agency priority goal. Such evidence includes evaluation results, performance measures, and other relevant data analytics and research studies, with a preference for high quality experimental and quasi-experimental studies. (Please include citations for evidence discussed.) Moreover, evidence should be regularly considered during agencies’ data-driven reviews led by their Chief Operating Officers and in annual strategic review processes.
New proposals for developing evidence
In their budget requests, agencies are also encouraged to include new proposals for developing evidence that can be used to improve existing programs or to inform decisions about new programs. (This includes proposals that build on and enhance current efforts.) Recognizing the current budgetary pressures on agencies, OMB encourages agencies to focus their energies on a small number of high-quality proposals that meet one or more of the following tests:
• They address important policy questions and generate evidence that could be actionable. In particular, evaluations should measure the outcomes that are relevant for judging whether a program or intervention is achieving its goals.
• They will yield credible evidence of program or policy impacts, for example by utilizing randomized controlled trials or carefully designed quasi-experimental techniques.
• They will help agencies direct a larger share of resources towards evidence-based practices, for example by modifying grant criteria or better disseminating information.

Agencies are encouraged to consider the following cross-cutting strategies. Specific examples of each strategy are provided in Attachment A.

1. Harnessing data to improve agency results: Proposals should enable agencies and/or researchers to access and utilize relevant data to answer important questions about program outcomes while fully protecting privacy. For example, by linking data on program participants to administrative data on earnings, college-going, health, or other outcomes, agencies may be able to improve their understanding of program performance and ultimately improve results. Projects should build on the recent Executive Order, “Making Open and Machine Readable the New Default for Government Information,” as well as on the Memorandum “Sharing Data While Protecting Privacy” (M-11-02). We especially encourage proposals that use administrative data to track important outcome measures for federal grant programs, and we are open to proposals that substitute higher quality administrative data for existing grantee reporting requirements.

2. High-quality, low-cost evaluations and rapid, iterative experimentation: Proposals should help agencies improve the quality and timeliness of evaluations, for example by building evaluation into ongoing program changes; reducing costs by measuring key outcomes in existing administrative data sets; and drawing on private sector approaches that use frequent, low-cost experimentation to test strategies to improve results and return on investment. Proposals should utilize randomized controlled trials or careful quasi-experimental techniques to measure the effect of interventions on important policy outcomes. We particularly welcome proposals that draw on behavioral insights to improve results and lower costs in direct operations.

3. Using innovative outcome-focused grant designs: Proposals should expand or improve the use of grant program designs that focus Federal dollars on effective practices while also encouraging innovation in service delivery. These include tiered-evidence grants, Pay for Success initiatives and other pay for performance approaches, Performance Partnerships allowing blended funding, waiver demonstrations, incentive prizes, competitive incentive funds that encourage the use of evidence-based practices in formula grants, or other strategies to make grant programs more evidence focused.

4. Strengthening agency capacity to use evidence: Proposals should strengthen agency capacity by promoting knowledge-sharing among government decision-makers and practitioners through clearinghouses that help translate strong research into practice; enhancing the skills of managers, program officers, and review panels to assess and use available evidence; and developing common evidence frameworks to better distinguish strong from weak evidence and measure cost effectiveness.

5. Other agency-specific needs: Agencies may propose other strategies that would significantly improve their capacity to use or build evidence to achieve better results or increase cost-effectiveness in high priority programs. In addition to developing strategies to use evidence to promote continuous, incremental improvement, agencies are also encouraged to submit proposals that would test higher-risk, higher-return innovations with the potential to lead to more dramatic improvements in results or reductions in cost.

While agencies are encouraged to submit proposals that can be implemented within current statutory authorities, legislative changes will also be considered. (Please note where a proposal would require legislative changes.) Agencies may also propose new investments in evidence-building infrastructure for high-priority areas in cases where the benefits substantially outweigh the costs. Agencies may wish to consider new financing approaches; set-asides that designate a small fraction of funding for evaluation and evidence development; and partnerships with other federal agencies, state and local governments, non-profit organizations, and academic institutions. We particularly encourage proposals that cross agency boundaries or other functional silos.
Agencies should work with their OMB contacts to agree on a format within their 2015 budget submissions to: (1) explain agency progress in using evidence and (2) present their plans to build new knowledge of what works and is cost-effective. An example of a template that could be used to provide this information to Resource Management Offices is available at https://max.gov/omb/evidence.
Workshop Series and Interagency Collaborations
To support agencies in developing and refining proposals, this September we will begin an interagency collaboration process with a kickoff briefing or call followed by a series of workshops (see Attachment B for details). An initial list of workshop topics is below; we may schedule additional workshops based on agency demand and continue this series after agency budget submissions are finalized to support implementation. Versions of these workshops may be tailored to agencies at different stages of experience with evidence-based practices.
• Workshop I: How can agencies focus evaluation resources on the most important program and policy questions?
• Workshop II: How can agencies use administrative data sets from multiple programs and levels of government to answer important questions while protecting privacy?
• Workshop III: How can agencies conduct rigorous program evaluations and data analytics on a tight budget?
• Workshop IV: How can agencies use their existing authority to turn a traditional competitive grant program into an innovative, evidence-based one?
• Workshop V: How can agencies harness research findings from the social and behavioral sciences to implement low-cost approaches to improving program results?
The workshops will be designed to build and share knowledge across the Federal government as well as to identify expertise and resources to help agencies implement strong proposals. Beyond the workshops, OMB, DPC, CEA, and OSTP are available to provide other forms of assistance:

• Technical assistance in designing evaluations and improving tools. This may include connecting your agency with Intergovernmental Personnel Act (IPA) assignments or consultation from outside experts to help design and implement your proposals. For example, a number of external organizations, such as the NYU Governance Lab, J-PAL North America, the Pew-MacArthur Results First initiative and the Coalition for Evidence-Based Policy, are seeking Federal partners for evidence and innovation initiatives designed to improve results at the Federal, State, and local levels.
• Guidance and/or technical assistance in meeting government-wide requirements, including the Federal Acquisition Regulation, grants policy circulars, and Paperwork Reduction Act clearance requirements. For example, OMB helped USDA develop a generic clearance package to facilitate review and approval of behavioral insights research covered by the Paperwork Reduction Act.
Additional, up-to-date information on the workshop series, as well as on other available resources, can be found at https://max.gov/omb/evidence.
Next Steps
Agencies should work with senior leadership, including Deputy Secretaries; budget, performance and evaluation officials; program officials; and other relevant staff in order to (1) fulfill the requirements of the memo within your 2015 Budget submission; and (2) ensure participation in the EOP workshops and interagency collaboration.
As follow up, please designate up to two agency leads to work with policy, program, budget, evaluation and management support offices to coordinate agency participation in the workshops, and send these names to Dan Rosenbaum and Andy Feldman of OMB at evidence@omb.eop.gov by August 15th. Agency leads should be well positioned to ensure workshop participants are able to engage with senior agency leadership on potential applications of new tools and approaches. If agencies have suggestions on other topics for workshops, would prefer to have less formal exploratory meetings to discuss preliminary ideas, or are interested in accessing the types of technical assistance mentioned above, please send those suggestions and requests to evidence@omb.eop.gov or to your OMB Resource Management Office points of contact.

Attachment A
Examples of Evidence and Innovation Strategies and Tools
Administrative data collected by Federal, State, or local agencies to run programs can be a valuable resource for program improvement and for helping agencies, consumers, and providers make more informed decisions.
(1) Linking data across programs and levels of government while fully protecting privacy
Linking data across programs can lower evaluation costs and improve their quality, streamline reporting requirements for program providers and participants, and answer important questions about program performance. A number of Federal agencies are currently developing or using protocols and processes to share personally identifiable data to permit such linkages in ways that fully adhere to laws, regulations, and policies designed to protect individual privacy and confidentiality.
• Example: The Department of Housing and Urban Development has partnered with the Department of Health and Human Services to match HUD administrative data with Centers for Medicare & Medicaid Services data. The two agencies recently completed a successful match that will improve understanding of the characteristics of seniors living in publicly subsidized housing and how supportive housing interventions may affect their health care use.
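The mechanics of privacy-preserving linkage can be sketched briefly. One common approach is to replace direct identifiers with a keyed hash before any data leaves an agency, so records can be matched without either party exchanging the raw identifier. The sketch below is illustrative only: the identifiers, field names, and shared-salt arrangement are invented, and a production match would sit under a formal data-use agreement and stronger cryptographic protocols.

```python
import hashlib

def pseudonymize(identifier: str, salt: str) -> str:
    """Replace a direct identifier with a keyed hash so records can be
    matched without either agency exchanging the raw identifier."""
    return hashlib.sha256((salt + identifier).encode()).hexdigest()

# Hypothetical extracts; identifiers and fields are invented for illustration.
SALT = "shared-secret-from-a-data-use-agreement"
housing = [{"id": "111-22-3333", "subsidized": True},
           {"id": "444-55-6666", "subsidized": True}]
health = [{"id": "111-22-3333", "er_visits": 4},
          {"id": "777-88-9999", "er_visits": 1}]

# Each agency hashes locally; only pseudonymous keys cross agency lines.
housing_keyed = {pseudonymize(r["id"], SALT): r for r in housing}
health_keyed = {pseudonymize(r["id"], SALT): r for r in health}

# Link on the shared keys to study housing status alongside health care use.
linked = [{"subsidized": housing_keyed[k]["subsidized"],
           "er_visits": health_keyed[k]["er_visits"]}
          for k in housing_keyed.keys() & health_keyed.keys()]
print(linked)  # → [{'subsidized': True, 'er_visits': 4}]
```

Only the one individual present in both extracts survives the join, and no raw identifier ever appears in the linked file.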
(2) Provider scorecards
Reliable data from government agencies can be used to create provider scorecards that compare how well different service providers perform. Scorecards are a tool for agencies and consumers to make more informed decisions and choices, and for providers to better understand and improve their performance. If data on participant characteristics are available, such as education level or income, scorecards can go a step further by enabling more detailed comparisons of alternative providers that serve people with similar characteristics.
• Example: The College Scorecard, launched earlier this year, highlights key indicators about the cost and value of colleges and universities to help high school students choose a post-secondary school that meets their needs. It is produced by the Department of Education and posted on its website. The Scorecard includes data on costs, graduation rates, loan default rates, and average student debt; average earnings of recent graduates will be added soon.
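The stratified comparison a scorecard enables can be sketched in a few lines. The providers, income bands, and completion outcomes below are entirely hypothetical; the point is only that grouping by participant characteristics before comparing providers avoids penalizing those serving harder-to-serve populations.

```python
from collections import defaultdict

# Hypothetical participant records: (provider, income band, completed program).
records = [
    ("Provider A", "low", True), ("Provider A", "low", False),
    ("Provider A", "high", True), ("Provider B", "low", True),
    ("Provider B", "low", True), ("Provider B", "high", False),
]

# Stratify by income band so each provider is compared on similar populations.
cells = defaultdict(list)
for provider, band, completed in records:
    cells[(provider, band)].append(completed)

# Completion rate per (provider, income band) cell.
scorecard = {cell: sum(outcomes) / len(outcomes)
             for cell, outcomes in cells.items()}
for (provider, band), rate in sorted(scorecard.items()):
    print(f"{provider} ({band}-income participants): {rate:.0%} completion")
```

A real scorecard would add sample sizes and confidence intervals, but even this toy version shows why within-stratum comparisons are fairer than raw averages.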
Many innovative companies use rapidly conducted randomized field trials to identify high-impact innovations and move them quickly into production. In the public sector, low-cost, frequent field tests do not replace longer-term, rigorous evaluations; they supplement them.

They allow innovative administrators to say: “Might this help boost results? Let’s try it and see if it works.”
(1) Applying behavioral insights to improve results and lower costs in direct operations
Human decision making is central to many public policy interventions. Major advances have been made in research regarding the influences that drive people’s decisions and choices, and these new insights can significantly improve policy outcomes at a lower cost.
• Example: Research has revealed the power of “social norms” on behavior, meaning the influence of what others do on our decisions. Building on this insight, the Fiscal Service at the Treasury Department has recently updated the text and format of letters sent to individuals with delinquent debt to the federal government. The new letters, which will be tested against the older version using a randomized control trial, use simplified language, personalization, and a reference to social norms (i.e., the fact that 94% of outstanding debts are paid off on time and that the recipient is in the fraction that has not yet paid) to motivate a higher rate of debt repayment.
(2) Using high-quality evaluation to answer important policy and program questions
Rigorous impact evaluations, especially those using random assignment to program and control groups, can provide strong evidence on key policy or program questions within an agency. They can help determine whether a program works and whether an alternative practice might work better.
• Examples: Current Federal evaluations cover a diverse set of issues, including the Occupational Safety and Health Administration examining the effectiveness of on-site consultation, inspections, and corrective action letters on worker injury/illness rates; the Millennium Challenge Corporation examining the impact of road improvements in El Salvador and commercial training activities in Ghana; and the Department of Energy examining the effects of smart grids and dynamic pricing on household energy use.
(3) High-quality, low-cost evaluations that piggy-back on existing programs and datasets
By drawing on existing data to measure outcomes and on program changes that are being implemented anyway, agencies can conduct high-quality randomized evaluations at low cost. For example, when a program change is being phased in gradually or a program is oversubscribed, participants could in some cases be selected based on random assignment, allowing for rigorous evaluation.
• Example: Hawaii’s Opportunity Probation with Enforcement (HOPE) Program is a supervision program for drug-involved probationers. The program was evaluated using a randomized control trial at a cost of about $150,000 for the evaluation. The low cost for this rigorous evaluation was achieved by measuring outcomes using administrative data (e.g., arrest records) that the state already collected for other purposes, rather than doing costly new data collection. The study found that HOPE group members were 55 percent less likely than control group members to be re-arrested during the first year.
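The low-cost design that the HOPE evaluation illustrates (random assignment plus outcomes read from existing administrative records) can be sketched as a simulation. The re-arrest rates below are assumptions chosen only to mirror the direction of the reported finding; they are not the study's actual data.

```python
import random

random.seed(0)  # reproducible illustration

# Assumed underlying re-arrest rates for the simulation, not HOPE's real data.
CONTROL_RATE, TREATMENT_RATE = 0.47, 0.21

# Randomly assign 1,000 hypothetical probationers to two equal groups.
participants = list(range(1_000))
random.shuffle(participants)
treatment = set(participants[:500])
control = set(participants[500:])

# "Administrative data": outcomes come from records the state already keeps,
# here simulated with the assumed rates above.
rearrested = {p: random.random() < (TREATMENT_RATE if p in treatment
                                    else CONTROL_RATE)
              for p in participants}

t_rate = sum(rearrested[p] for p in treatment) / len(treatment)
c_rate = sum(rearrested[p] for p in control) / len(control)
print(f"treatment: {t_rate:.2f}, control: {c_rate:.2f}, "
      f"relative reduction: {(c_rate - t_rate) / c_rate:.0%}")
```

Because assignment is random, the difference between the two observed rates is an unbiased estimate of the program's effect, and no new data collection was needed to compute it.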
Because many Federal dollars flow to States, localities, and other entities through competitive and formula grants, grant reforms are an important component of strengthening the use of evidence in government. The goals include encouraging a greater share of grant funding to be spent on approaches with strong evidence of effectiveness and building more evaluation into grant-making so we keep learning more about what works.
(1) Pay for Success
Pay for Success offers innovative ways for the government to partner with philanthropic and private investors to fund proven and promising practices and to significantly enhance the return on taxpayer investments. Under this model, investors provide the up-front capital for social services with a strong evidence base that, when successful, achieve measurable outcomes that improve the lives of families and individuals and reduce their need for future services. Government pays when these measurable results are achieved. The PFS model is particularly well-suited to cost-effective interventions that produce government savings, since those savings can be used to pay for results.
• Examples: The Department of Justice is coordinating PFS projects to use more effective prisoner re-entry interventions to reduce recidivism and its associated costs. And the Department of Labor has launched an effort to test new and more effective strategies for delivering workforce development and preventative social services that cut across existing program silos, increase job placement, and improve job retention.
(2) Tiered-evidence grant designs
“Tiered-evidence” or “innovation fund” grant designs focus resources on practices with the strongest evidence, but still allow for new innovation. In a three-tiered grant model, for example, grantees can qualify for 1) the “scale up” tier and receive the most funding; 2) the “validation” tier and receive less funding but evaluation support; or 3) the “proof of concept” tier and receive the least funding, but also support for evaluation. With a tiered-evidence approach, potential grantees know that to be considered for funding, they must provide demonstrated evidence behind their approach and/or be ready to subject their models to evaluation. The goal is that, over time, interventions move up tiers as evidence becomes stronger. So far, five agencies have launched or proposed 13 tiered grant programs in areas such as education, teenage pregnancy prevention, home visitation, workforce development, international assistance, and more.
• Example: The Department of Education’s Investing in Innovation Fund (i3) invests in high-impact, potentially transformative education interventions, ranging from new ideas with significant potential to those with strong evidence of effectiveness that are ready to be scaled up. Based on the success of i3, the Department recently issued proposed regulations that would allow its other competitive grant programs to adopt this three-tiered model.
(3) Performance Partnerships and Waiver Demonstrations
Performance Partnership pilots enable States and localities to demonstrate better ways to use resources, by giving them flexibility to pool discretionary funds across multiple Federal programs serving similar populations and communities in exchange for greater accountability for results. With waiver demonstrations, Federal agencies suspend certain programmatic requirements in discretionary or mandatory programs to support State and local innovations that are then rigorously evaluated to learn what works and what is cost effective.
• Example: The 2014 Budget would authorize up to 13 State or local performance partnership pilots to improve outcomes for disconnected youth. Pilot projects would support innovative, efficient, outcome-focused strategies using blended funding from separate youth-serving programs in the Departments of Education, Labor, Health and Human Services, Housing and Urban Development, Justice, and other agencies.
(4) Using competitive grants to promote use of evidence in formula grants
Formula grant programs are often the largest grant programs in government, so they are a critical area for advancing more results-focused government. Agencies can improve the effectiveness of formula grant programs by using competitive grants to encourage adoption of evidence-based approaches within formula grants. For instance, agency competitions can give preference points to State and local applicants implementing evidence-based practices with their formula funds. And formula grants to States can include set-asides for States to award competitively to promote use of evidence.
• Example: For HHS, the 2014 Budget proposes to require that States use five percent of their mental health block grant allocation for grants that use the most effective evidence-based prevention and treatment approaches. The Senate Appropriations Committee adopted this policy in its recent bill.
(5) Multi-phase grant competitions
The quality of grant-funded projects can be enhanced by conducting a multi-phase selection process. In the first phase, before selection, agencies can share research findings with potential applicants to ensure they are integrated into project designs and implementation strategies. Expert input can also be used to develop program models or variations within models that the grant program could test and evaluate. Moreover, preference points can be given to applicants that implement research-informed models and agree to participate in a rigorous evaluation. Multi-phase designs are particularly useful when there are many applications of varying quality, where a streamlined pre-application process can identify leading proposals.
• Example: The Promoting Readiness of Minors in Supplemental Security Income (PROMISE) program began with coordinated planning by the Departments of Education, HHS, and Labor and the Social Security Administration to review existing research and gather input from experts to develop an integrated service delivery model that was incorporated into the grant solicitation. The next phases are grantee selection and rigorous evaluation of grantees’ approaches.
Evaluation is useful only to the extent that it is being used for decision making. An evaluation plan that focuses evidence-building resources on the most relevant and actionable issues helps generate useful knowledge. Common evidence standards and What Works Clearinghouses, meanwhile, help make existing evidence more useful to decision makers.
(1) Agency-wide evaluation plans
An agency-wide evaluation plan developed with senior policy and program officials can focus evaluation resources on high-priority issues (for example, questions that are most important for improving program results) and on rigorous methodologies that produce actionable insights.
• Example: The Department of Labor has a Chief Evaluation Office (CEO) that works closely with program offices to develop and implement evaluation agendas set by policy officials. It also promotes high standards for data systems; monitors and reviews research and evaluation plans initiated by DOL agencies to ensure they are consistent with departmental goals and the highest standards of empirical rigor; works to institutionalize an evidence-based culture through seminars and forums on evaluation topics and findings; and maintains an active connection with outside experts to ensure that the Department is aware of relevant research and evaluation findings and activities.
(2) Common evidence guidelines for various types of research studies
Common research standards and evidence frameworks across agencies can facilitate evaluation contracting, information collection clearance, and the strengthening or creation of research clearinghouses and repositories about “what works.” They also help agencies use results from different types of high-quality studies to identify effective programs, improve programs, and encourage innovative new approaches.
• Example: Evaluation officials from the Departments of Education, Labor, Health and Human Services, and the National Science Foundation are jointly developing common evidence guidelines for research studies that can be a resource for improving the quality of studies throughout the Federal Government.
(3) Cross-agency learning networks
Inter-agency working groups of evaluation and program officials within the Federal Government can share best practices, including helping spread effective procurement practices, developing common evidence guidelines, and better integrating evaluation and performance measurement efforts. Other cross-agency groups are forming learning networks around specific policy issues in order to share relevant research and develop shared evaluation strategies.
• Example: The Small Business Administration and the Departments of Agriculture and Commerce, with guidance from OMB and CEA, are working together with the Census Bureau to find more robust ways to evaluate the impact of Federal business technical assistance programs. The goal of the working group is to develop a standard methodology for measuring the impact of these types of technical assistance programs across the Federal Government.
(4) What Works Clearinghouses
“What works” clearinghouses are repositories that synthesize evaluation findings in ways that make research useful to decision-makers, researchers, and practitioners. Moreover, as Federal innovation funds and other programs provide financial incentives for using and building evidence, these repositories provide useful tools for understanding what interventions are ready for replication or expansion and disseminating results.
• Examples: Current “what works” clearinghouses include the Department of Justice’s CrimeSolutions.gov, the Department of Education’s What Works Clearinghouse, the Substance Abuse and Mental Health Services Administration’s National Registry of Evidence-based Programs and Practices, and the Department of Labor’s new Clearinghouse for Labor Evaluation and Research.

Attachment B
Details on Overview Briefing and Initial Workshops
Overview briefing: A kickoff briefing or call for agency leads will provide an overview of tools available to help programs strengthen their abilities to generate and use evidence to improve program performance. It will also preview the workshops. (First week of September)
The following is an initial list of workshops. OMB and White House policy councils will organize additional workshops on topics in Attachment A based on agency interest. An up-to-date workshop schedule can be found at https://max.gov/omb/evidence.
Workshop I: How can agencies focus evaluation resources on the most important program and policy questions? (Second week of September)
• Overview: This workshop will engage participants in a focused discussion about the strategies certain agencies use to focus rigorous, independent evaluation on high-priority, actionable research questions. Examples will include the Department of Labor’s use of a Chief Evaluation Officer to coordinate agency-wide evaluation plans, including working with policy, program, evaluation, and performance management officials to create annual learning agendas for each division. Other examples will include the use of an evaluation policy statement by the Administration for Children and Families at the Department of Health and Human Services and the statutory structure of the Education Department’s Institute of Education Sciences, which led to significant improvements in the quality of ED’s evaluations.
• Agency preparation and takeaways: Using a diagnostic checklist to assess the quality, relevance, and independence of their evaluation activities, participants in the workshop will assess the strengths of their own evaluation organizations and identify challenges and potential strategies for overcoming them.
Workshop II: How can agencies use administrative data sets from multiple programs and levels of government to answer important questions while protecting privacy? (Date TBD)
• Overview: This workshop will examine several case studies where Federal agencies have answered compelling programmatic questions by linking data at the Federal level or with a State or local government or other entity. The session will explore:
o How to develop an effective partnership among all the parties involved, including policy officials, research experts, and legal counsel.
o What steps must be taken to ensure compliance with statutes, regulations, and policies governing privacy and confidentiality.
o How to design a data match to ensure it will answer key research questions, including strategies that use aggregated data.
• Agency preparation and takeaways: Participants should come to the workshop with at least one potential data sharing opportunity in mind that would help their agency to answer an important performance or evaluation question. They will fill out a planning template during or after the session to apply the concepts they learn and help their agencies identify clear steps for progress.
Workshop III: How can agencies conduct rigorous program evaluations and data analytics on a tight budget? (Date TBD)
• Overview: What low-cost strategies can agencies use to: (1) conduct strong program evaluations, including experimental and quasi-experimental studies, to identify effective strategies for delivering services and achieving program goals or (2) support data analytics on ways to achieve better results at lower cost? This workshop will review ways that agencies can:
o Embed testing of alternative strategies into their existing grant programs or direct operations.
o Maximize the use of high quality statistical or administrative data currently being collected and reduce the need for costly special purpose surveys.
o Form partnerships with academic experts, including using externally funded Intergovernmental Personnel Act (IPA) assignments, to design and conduct rigorous evaluations and data analyses and reduce evaluation costs.
• Agency preparation and takeaways: Participants should come to the workshop with one or more potential evaluation topics that focus on issues important to their agency. Participants will identify specific options to meet these evaluation needs based on the strategies discussed.
Workshop IV: How can agencies use their existing authority to turn a traditional competitive grant program into an innovative, evidence-based one? (Date TBD)
• Overview: At this workshop, the Department of Education will explain how program and research officials partnered to design and implement the Investing in Innovation (“i3”) program and how the same innovation fund (or “tiered-evidence”) model is now being adopted by other programs across the agency. The Development Innovation Ventures (DIV) program at USAID, the Workforce Innovation Fund (WIF) at the Department of Labor, the Maternal, Infant, and Early Childhood Home Visiting (MIECHV) Program and the Teen Pregnancy Prevention Program (TPP) at the Department of Health and Human Services, and the Social Innovation Fund (SIF) at the Corporation for National and Community Service may describe their variations of the tiered model. The workshop will explore:
o What features make a grant program a good candidate to become an innovation fund?
o What are the perceived legal barriers and how might they be overcome?
o What expertise and resources are needed compared to a traditional grant program?
o What does an innovation fund grant solicitation look like?
o How does the selection process differ from a traditional program?
o How do these grant programs measure success?
• Agency preparation and takeaways: Participants should have at least one potential program candidate in mind when they attend the workshop. They will fill out a planning template during or after the session to apply the concepts they learn and help their agency consider which programs are the best candidates for the tiered-evidence approach.
Workshop V: How can agencies harness research findings from the social and behavioral sciences to implement low-cost approaches to improving program results? (Date TBD)
• Overview: This workshop will review ways in which agencies can apply empirical insights about human judgment and decision-making to federal programs and policies in order to improve outcomes or reduce costs. It will also explore how agencies can:
o Design and evaluate rigorous experiments, using randomized control trials where possible, to test the efficacy of these interventions.
o Form partnerships with academic experts, including using externally funded IPA assignments, in order to receive conceptual advice on cutting-edge research findings that should inform how policies are designed, and technical support on designing, evaluating, and iterating experimental field studies.
• Agency preparation and takeaways: Participants should come to the workshop with one or more potential program areas that could benefit from the application of low-cost behavioral solutions. Materials to help brainstorm about these areas will be provided in advance.

The FBT discussion – what is the CCA position?

By David Crosbie, CEO, Community Council for Australia

In developing policy related to taxation concessions provided to the charities and not-for-profit sector, CCA has consistently adopted a longer-term view of what might best serve the sector’s interests, both now and into the future.

It is worth noting that some very influential people, including Ken Henry and a number of senior government ministers, have argued that what the sector should be striving for is appropriate terms and conditions for all employees.  The not-for-profit sector should not have to rely on tax concessions to recruit or retain staff.

While CCA agrees with this approach in principle, it is clear that at present the sector does not provide employees with the level of financial rewards available in other sectors.  At least for the time being, some form of concession is required, particularly for lower-paid employees across the sector.

Within this context, the questions we might ask about the future of the Fringe Benefits Tax (FBT) concessions relate more to whether they are appropriately targeted and provide a good return on investment for the sector.  We need to ask whether the current cost of the FBT concession could be better directed toward a different approach to providing a wage advantage to those employed in the charities and not-for-profit sector.

Any examination of the FBT concessions reveals that their benefits have both eroded over time and increasingly favoured the highest paid employees in the sector.  Advantages such as uncapped meal allowances and the capacity to claim multiple exemptions enable higher paid employees to gain significant pre-tax income that can be spent in restaurants, function centres and with travel agents.  These provisions clearly favour those with incomes above $150,000 much more than they do those earning less than $60,000.  A Victorian medical specialist working at three hospitals and earning over $300,000 can claim three FBT concessions (over $50,000 tax free) and spend $100,000 of their income tax free on travel, meals and venue hire expenses.  This level of benefit is not available to lower paid employees.

The inequity in the FBT concessions makes them more difficult to justify, especially in comparison with some alternative approaches.

It is for these reasons that CCA argued for further investigation of alternative models, including the modelling of a straight income tax concession for all staff employed in charities.  If FBT were phased out and the personal income tax-free threshold was increased by $15,000 for all charity employees, what would be the real costs and benefits?
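As a rough illustration of the kind of modelling being called for, the sketch below compares income tax under the 2012-13 Australian resident marginal rates (Medicare levy and offsets excluded) with and without a $15,000 uplift. Shifting every bracket boundary by the uplift is one simplistic reading of the proposal, not a costed CCA design.

```python
# 2012-13 Australian resident marginal rates (Medicare levy and offsets
# excluded). Each tuple is (upper boundary of band, rate within band).
BRACKETS = [(18_200, 0.00), (37_000, 0.19), (80_000, 0.325),
            (180_000, 0.37), (float("inf"), 0.45)]

def income_tax(income: float, uplift: float = 0) -> float:
    """Progressive tax with every bracket boundary lifted by `uplift`
    (a simplifying assumption about how the proposal would work)."""
    tax, lower = 0.0, 0.0
    for upper, rate in BRACKETS:
        upper += uplift
        if income > lower:
            tax += (min(income, upper) - lower) * rate
        lower = upper
    return tax

for salary in (60_000, 150_000):
    saving = income_tax(salary) - income_tax(salary, uplift=15_000)
    print(f"${salary:,}: annual saving ${saving:,.0f}")
# → $60,000: annual saving $4,875
# → $150,000: annual saving $5,550
```

Even this crude model shows why the modelling matters: a flat uplift delivers a larger dollar saving at a 37% marginal rate than at 32.5%, so a threshold-based replacement for FBT would itself need careful targeting toward lower-paid staff.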

CCA has supported the proposal to cap the meals allowance and to restrict multiple claiming of FBT exemptions, but it has also argued that all savings from these measures should be redirected to enable 15,000 smaller charities to gain Deductible Gift Recipient status.  This would mean most charities could receive tax deductible gifts from Trusts, Public and Private Ancillary Funds, high wealth individuals and the broader community.

The complexity of these issues has meant that at times the CCA position has been misrepresented in the media and in some associated discussions.  CCA has never argued there should be no tax concessions or that government should be clawing back money from the sector.  The CCA position is essentially that over time we need to better target our existing tax concessions.  It is this position that CCA put forward in our response to the Not-for-Profit Sector Tax Concessions Working Group Discussion Paper in December 2012.

You can access this submission on our website, and if you have any concerns or would like to talk through this issue in more detail, we would welcome your feedback.