Session One – Advancement Data: Metrics and Communication. Your data may be complete and thoroughly clean; your metrics may be perfectly designed to measure what matters; yet without a thoughtful approach to getting that information into the right hands at the right time with the right context, your efforts may still fall short. This session will provide an organizational framework and proven techniques for success.
Advancement Data: Metrics and Communication Lisette Clem '85, '92 MBA, Director of Advancement Services, Bryant University, Smithfield, RI
Agenda • Define “internal constituents” • Why share information? • What advancement data/metrics do we share? • When do we share? • How is the information shared (in what format)?
Internal Constituents • Outside of the Advancement division (be sure to include those folks with whom you share external constituents!): • Controller’s Office • President’s Office • Other organizational divisions to whom donations are being directed
Internal Constituents • Within the Advancement division: • Alumni/Constituent Relations • Development • Marketing/Communications • Within Advancement Services: • aka “the Cool group” • aka “Team Awesome”
TODAY’S FOCUS: Using the Power of Information Sharing in engaging and informing our divisional constituents (and getting them to pay attention!)
Our Goal: To CONSISTENTLY take a proactive rather than reactive approach to information sharing and analysis, providing both detailed reporting (for Annual Fund managers) and high-level summaries (for VPs) to accommodate everyone's needs* *In this case study: without the benefit of a Business Intelligence system or built-in dashboards (it's coming!)…
WHY should we proactively share information/data? (What does it matter?)
Why? • Communicating and analyzing the data demonstrates its usefulness, which in turn justifies renewed investments in technology • Fewer data requests to our Report Writing staff; enables multi-tasking • Enhanced stewardship (for soft credit gifts) • More effective prospect management (air of friendly competition at monthly PM meetings) • Less chaos prior to Trustees' meetings (!)
WHAT do we share? • General Development Performance Reporting: • Gift and Pledge Processing (daily transaction reports) • Fiscal Year Status • Campaign Reporting • Prospect Management • Annual Giving • Fiscal Year performance and Trend data • Alumni (Constituent) Participation Rate • Data Mining model performance • Definition/Results • Strategy Recommendations • Event Management • Budget Reporting • Expenses/Revenue • Return on Investment • Alumni Engagement Tracking
Prospect Management: Key areas of reporting • Prospect Pool • Visit Reports • Tickler Reports (to support Moves Management) • Results Reports • Pledges and Gifts • Planned and Pending Solicitations • Ongoing Program Management
Annual Giving: Fiscal Year performance and Trend data
Fiscal Year Annual Giving: FY14 vs. FY15 New Pledges, Monthly Comparison (Excludes Planned Gifts), as of 5/31/15
Constituent Participation Rate • The Politics of Participation • One size does NOT fit all • CASE, VSE, and US News have standards for their comparisons • These standards may or may not be useful, or even accurate, for your purposes • When submitting data to them, make every attempt to understand their standards, to preserve the value of benchmarking with other institutions
Alumni Participation Rate (at Bryant University) • In both our external and our internal alumni participation rates, senior giving donors are counted as alumni donors, as anyone who attends Bryant for 2+ semesters has the option of calling themselves an alumnus/ae. The entire senior class is included in the denominator. • Our “lost” alumni percentage is approximately 5.5%, down from approximately 10% four years ago. ALL NOT-LOST Alumni are considered “of record” per CASE standards, regardless of giving history. • In our internal donor count, all soft credit donors ARE included; however, approx. half of these are alum/alum spouses, which CAN be counted per CASE. (Counting the remainder of our soft credit donors accounts for approximately +0.4% of our internal participation rate.) • Our internal alumni participation rate excludes those on a “Do Not Solicit (DNS)” code (approx. 650 DNS alums of our 36,500 alumni of record). • Including ALL degree holders vs. only including UG degree holders decreases our alumni participation rate by just 0.2%.
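Stripped of the politics, the internal rate above is a simple ratio once the inclusion rules are applied. Below is a minimal sketch of that arithmetic in Python/pandas; the column names (is_alum_of_record, is_dns, gave_hard_credit_fy, countable_soft_credit_fy) are hypothetical stand-ins, not Bryant's actual database fields.

```python
# Minimal sketch of the internal alumni participation rate described above.
# Column names are hypothetical; adapt them to your own advancement database.
import pandas as pd

def internal_participation_rate(alums: pd.DataFrame) -> float:
    """Donors of record / solicitable alumni of record."""
    # Denominator: all alumni of record (including the senior class),
    # minus anyone coded Do Not Solicit (DNS).
    base = alums[alums["is_alum_of_record"] & ~alums["is_dns"]]

    # Numerator: anyone in the base who gave this fiscal year, counting
    # hard-credit gifts plus countable soft credits (e.g., alum spouses).
    donors = base[base["gave_hard_credit_fy"] | base["countable_soft_credit_fy"]]

    return len(donors) / len(base)

# Toy example: 3 of 4 solicitable alumni of record gave -> 75%.
toy = pd.DataFrame({
    "is_alum_of_record":        [True, True, True, True, True],
    "is_dns":                   [False, False, False, False, True],
    "gave_hard_credit_fy":      [True, False, True, False, False],
    "countable_soft_credit_fy": [False, True, False, False, False],
})
print(f"{internal_participation_rate(toy):.1%}")  # 75.0%
```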
Data Mining: Definitions/Results and Strategy Recommendations
Current Models • Affinity Insight Acquisition Scoring Model: The scores from this model can be used to identify those never donors most likely to be responsive to annual fund appeals. • Predictive Affinity Retention Scoring Model: The scores from this model can be used to identify and retain those alums who have given in the past but have not made a gift in the current fiscal year. • Predictive Affinity Donor Scoring Model: The scores from this model can be used to identify alums most likely to make “leadership-level” annual gifts, as well as new major giving prospects.
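The 1-20 scores referenced in the slides that follow correspond to rank-ordered bands of roughly 5% of the alumni population each. One common way to produce such bands is to cut a model's predicted probabilities into twenty equal-sized groups; a minimal sketch is below, assuming a `probability` column from whatever model generated the predictions (all names here are illustrative).

```python
# Sketch: convert model probabilities into 1-20 score bands of ~5% each.
# Assumes a DataFrame with one row per alum and a 'probability' column
# produced by an affinity/propensity model; names are illustrative only.
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
alums = pd.DataFrame({
    "alum_id": range(1, 1001),
    "probability": rng.uniform(0, 1, size=1000),  # stand-in for model output
})

# qcut rank-orders the probabilities and splits them into 20 equal-sized
# bands, so each score covers roughly 5% of the alumni population and
# score 20 holds the strongest prospects.
alums["score"] = pd.qcut(alums["probability"], q=20, labels=range(1, 21)).astype(int)

# Any four adjacent scores therefore cover ~20% of alumni, which is the
# baseline the FY12 acquisition results are compared against below.
print(alums["score"].value_counts().sort_index())
```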
FY12 Results – Acquisition Model (Recall that each score band represents 5% of our alumni population, so the expected result for any four scores would be 20%.)
Data Mining Strategy: Recommendations • Acquisition Model: • “Acquisition Campaign” – New Donor Drive • Targeted DM to scores 15-20 – “flyer” format (not letter!) • Telefund focus on scores 15-20 • Targeted e-solicit messages to scores 15-20 • Retention Model: • “Retention Campaign” – Save a Donor • Intense Telefund and E-Solicit focus to scores 15-20 • Follow up with personal outreach to all those not yet renewed by 5/31/13 – Firm Goal is 100% renewal for scores 15-20 • Donor Scoring Model: • Leadership Giving Campaign – Raising Sights • Assign scores 17-20 to Bryant Fund Leadership Giving officer • Personal Outreach
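Turning those recommendations into call, mail, and assignment lists is ultimately a segment pull against the score bands. The sketch below is one illustrative way to do it; the column names (acq_score, ret_score, donor_score, ever_gave, renewed_fy) are assumptions, not actual Bryant fields.

```python
# Sketch: pull the campaign segments described above from scored alum data.
# All column names are hypothetical placeholders.
import pandas as pd

def campaign_segments(alums: pd.DataFrame) -> dict[str, pd.DataFrame]:
    def top_band(col: str) -> pd.Series:
        return alums[col].between(15, 20)

    return {
        # Acquisition: never-donors scoring 15-20, for DM flyer / Telefund / e-solicit.
        "acquisition": alums[top_band("acq_score") & ~alums["ever_gave"]],
        # Retention: past donors scoring 15-20 who have not yet renewed this FY.
        "retention": alums[top_band("ret_score") & alums["ever_gave"] & ~alums["renewed_fy"]],
        # Leadership: scores 17-20 assigned to the leadership giving officer.
        "leadership": alums[alums["donor_score"].between(17, 20)],
    }
```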
Data Mining: Coming Soon!! • Predictive Affinity Discovery Scoring Model: This model will focus on those alums who are most likely to accept a Discovery Visit invitation. The model will be built from alums who have been asked to accept a Discovery Visit. The focus of the model will be on alumni database variables that discriminate between those alums who accepted a Discovery Visit and those who didn’t. The result of the modeling process will be a score for each reachable alum in the database.
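That description maps onto a standard supervised-learning setup: train only on alums who have already been asked, use "accepted a Discovery Visit" as the target, then score every reachable alum. The sketch below shows one possible implementation using scikit-learn logistic regression; the feature list and column names are assumptions for illustration, not Bryant's actual model.

```python
# Sketch of the Discovery scoring model described above: train on alums who
# were asked for a Discovery Visit (target = accepted or not), then score
# every reachable alum. Features and column names are illustrative only.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

FEATURES = ["years_since_grad", "event_attendance", "lifetime_giving", "contact_reports"]

def build_discovery_scores(alums: pd.DataFrame) -> pd.Series:
    # Training set: only alums who have been invited to a Discovery Visit.
    asked = alums[alums["asked_for_discovery_visit"]]
    X, y = asked[FEATURES], asked["accepted_discovery_visit"]

    model = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
    model.fit(X, y)

    # Score every reachable alum in the database with the fitted model.
    reachable = alums[alums["is_reachable"]]
    return pd.Series(
        model.predict_proba(reachable[FEATURES])[:, 1],
        index=reachable.index,
        name="discovery_score",
    )
```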
Event Management
Budget Reporting: Expenses/Revenue and Return on Investment (ROI)
WHEN do we share? • "When it happens" • Daily • Weekly • Monthly • Quarterly • Annually
HOW do we share the information? • Paper • Email (Internal) • Web site • Regular Meeting • Common drive • Other? (Soon: Dashboard!)
Other Useful Examples: • Advancement Services Information, Reference and Forms Guide • Advancement Services Information and Resources Web page (live quick links) • Advancement Services Annual Report
Fails • TOO MANY prospect management reports – negative impact on perceived relevance • Don't save copies of everything (maxed out our server allotment!) • Person-by-person edits/customization will be requested – saying no is OK! (unless it's the VP)
Other suggestions/ideas, management tips, best practices -- what works for you to keep colleagues, constituents and staff informed?
Thank you! Have a great day! Lisette Clem, Bryant University lclem@bryant.edu 401-232-6805