Overview of 3F-2015
FASTER FORWARD FUND 2015
Background & Request for 2015 Proposals (detailed 2015 proposal information is also available on other pages of this site—click here).
This fund (3F hereafter) has been established in order to facilitate and accelerate the development of the theory, practice, and profession of evaluation. It is based on the belief that explicit attention to critical study of the methodology and foundations of a discipline and its applications—especially in the case of an emerging or radically changing discipline—can avoid many dead ends in its development, and nurture valuable new perspectives. These improvements are not merely academic refinements, because the whole operation of society depends critically on the careful identification and demonstration of the success or failure of its efforts at improvement and its response to crises, i.e., on ethical professional evaluation. Funding from 3F will be allocated in a way that significantly weights social payoffs from evaluation for those in need, including unconventional needs for which a case can be made.
Evaluation itself had to spend fifty years to achieve a moderate degree of legitimacy after its discard, at the beginning of the twentieth century, into the trash heap of scientifically untouchable topics, a rejection based on a superficial critique of evaluation by the positivists. That indefensible blunder meant that many, perhaps most, social scientists turned their back on full frontal attacks on the great problems faced by our global society, a decision which almost certainly cost us all dearly. 3F is an attempt to reduce the chances of similar mistakes and similar costs. Its modest resources will be devoted to supporting proposals for new approaches to the task of extending the domain of rational scientific efforts at objective analysis of evaluative issues at either the applied or theoretical level, including methodological or foundational issues.
In this effort, 3F aims to foster the development of new perspectives on, and applications of, evaluation, with some preference for ‘out of the box’ or ‘long shot’ proposals that are likely to find funding hard to get from the usual sources because of their departure from current research paradigms and/or their relatively low chances of success. Comments from a number of Nobel laureates, among others, have supported the need for this kind of approach as filling a gap in the current research funding portfolio across all research disciplines, reaching far beyond the sciences, e.g., into jurisprudence, technology, mathematics, history, literature, and the classical arts.
Examples of the topical range of 3F efforts might include: examining the evaluative assumptions and methodology used by historians of warfare in supporting evaluative judgments; the scope and limitations of ‘big data’ methodology for social policy analysis; supporting work by a small international specialist online group focusing on specific technical problems in intercultural cooperation and evaluative comparisons; the application of game theory (or positive psychology; or addiction theory) to the ethics of suicide prevention; the quality of a sample of evaluations done by a state’s legislative analysts (or by a federal inspector-general’s office; or by the World Bank); the utility in certain circumstances of what might be called ‘televal’ by analogy with the now highly active field of telemedicine, e.g., to bring technical or management skills from highly experienced evaluators to Nepal or Namibia or the Northern Sioux; developing new curricula and pedagogy for teaching evaluation methodology in the K-16 realm (or in the professional development area); the validity of quality-adjusted life years (QALYs) as the emerging key metric for national/international social (or medical) interventions. We have already supported a number of other examples, a few of which are mentioned below.
In general, supported proposals will undertake to produce new understanding or evaluative information about, or methods for doing, evaluation, in a form suitable for immediate publication, typically as a paper, chapter, or a set of these. Meta-evaluations are of course eligible, but support can go well beyond them, e.g., to improving the logic of rubric formulation. On the other hand, it is unlikely that funding will cover travel or the purchase of durable equipment, since online video and technology rental are usually adequate; and of course, work that has been previously published, online or in hard copy, will not be eligible. Also ineligible is any evaluation of the usual applied kind that currently occurs in the score of fields where thousands of professional evaluators already work, even if it uses some novel methodology: the research must be on that novel methodology, not merely embody it.
There is no hidden agenda in the sense of a preferred approach other than that outlined above. The judging for awards, and all other strategic management decisions, will be done by the 3F Advisory Committee, a group of experienced evaluation specialists and disciplinary leaders representing many evaluation approaches, including many ex-presidents and editors of evaluation organizations and publications such as leading journals and anthologies; their names and interests can be found here. Jane Davidson will be Chief Operations Officer, but everyone involved, including donors, will have at most one vote.
The financial limitations on 3F support at the moment mean that applications for a few thousand dollars are perhaps slightly more likely to be supported than those requesting or requiring tens of thousands (partly because there can be more of them); and a request for hundreds of thousands puts a proposal beyond the current 3F range. That range may be expanded in later years, since our resources may grow via stock market gains, or by inspiring or developing further contributions. (We may also be useful as co-sponsors for larger projects.)
3F is legally set up as a Donor Advised Fund under the umbrella of the Marin Community Foundation, the third largest community foundation in the US (a nonprofit 501(c)(3) organization, in US legal terminology). That management and funding arrangement is set up to continue in perpetuity, subject to the continued agreement of the Advisory Committee.
The 3F approach is intended to be, and should be seen as, simply complementary to the great efforts made by the American Evaluation Association, which covers a vast range of support for professional evaluation members and their needs, including an excellent system of honorary awards for research and service. Our intent is more narrowly focused, specifically: (i) to generate new research rather than reward completed research; (ii) to focus on a certain sub-area of research (normative meta-research), which is just a small although crucial part of the big field of research on evaluation itself; (iii) to facilitate the emergence of new or massively transformed paradigms, critiques, and practices, rather than the mere refinement of existing ones, even excellent ones; and (iv) to emphasize continued consideration of the potential social benefits of refining evaluation theory, practice, and methodology.
Completed and Current 3F Projects:
Project FFF/1, the Micro Pro Bono Project. Many Advisory Committee members have volunteered to offer some unpaid email, phone, or video micro-services to those who need them. Specifically, if you send a request to FasterForwardFund@gmail.com, headed “Micro Pro Bono Request” and explaining what kind of advice about evaluation you need, why you need it, and who on the Advisory Board you’d like to get it from, then we’ll forward it to that person—or to an alternative person that you or we suggest. Your adviser or our staff will then email you a time window when you can call your adviser on a free service like Skype. Our advisers can typically give advice about specific questions, e.g., how to frame the high-level evaluation questions you’ll need to address, and what to look for in an evaluation team with the skill sets to answer them; understanding the main alternatives you should consider as evaluation approaches; ballparking the kind of time or budget allowance for evaluation services that might be appropriate; what ‘program logic’ means; and what to do when you don’t have baseline data for the time when the intervention began. These volunteers are not offering to be a continuing source of help, just a ‘get-started’ or ‘crisis-helping’ source, so they can’t be assumed to be willing to talk to you more than once, or for more than an hour on that occasion. Our main idea is simply to help people thinking about using evaluation to be realistic about what that involves or might yield. (But doing this may set an example that could expand, e.g., into a Pro Bono Registry, or at least be replicated by other groups.) And don’t assume that specific suggestions from one individual, whether in the content of this document or in advice from any of our pro bono advisers, are guarantees of support from 3F as a whole, which would of course require specific discussion by the whole Advisory Committee.
Project FFF/2A, the FF Fellowship Project at Claremont Graduate University. In a separately funded project, intended partly as a trial of logistics for 3F, a $10,000 prize was offered in 2015 for the best essay on a 3F theme of the applicant’s choice by a graduate student in evaluation at Claremont Graduate University. This offer received nine substantial entries. The judges—the four senior faculty who teach and do research in evaluation via appointments in the psychology department at Claremont—split the award three ways, as follows: Samantha Langan usefully extended the existing work on evaluation anxiety by bringing in related work on anxiety from other fields; Sarah Mason submitted a detailed proposal to apply the substantial work on ‘super-realistic’ training in the armed services to evaluation; and Josh Penman developed the thesis that a ‘grand unified theory’ of evaluation would be an especially worthwhile project for the discipline at this stage of its development.
Project FFF/2B, the Critical Thinking Fellowship Project. This is a twin of FFF/2A, but open to all and with topics focused on the furtherance of critical thinking as a discipline, rather than evaluation. The detailed administration of this, which—like its funding—is separate from 3F, is currently being negotiated with the national critical thinking association. It is mentioned here just to illustrate that 3F is considering generalizing some of its efforts beyond the field of evaluation, at least to closely related fields like critical thinking.
The current requirements are fairly simple: (i) describe what you or your group/organization wants to do, when, where, and how; (ii) provide an argument, and perhaps evidence (e.g., expert opinion or a formal or informal needs assessment), that this effort is consistent with the aims of 3F, including any possible social payoff; (iii) give your estimate of (and possibly evidence for) its feasibility and importance, including evidence of your qualifications for success, with some emphasis on expected tangible results that will be widely available; (iv) give a detailed budget and, if appropriate, evidence for its accuracy and cost-effectiveness (e.g., don’t include travel when video can do the job)—note that 3F will not pay overhead beyond about 10%, preferably less; and (v) keep all this down to a maximum of two or three thousand words (e.g., by severely condensing your CV), to reduce the investment of time by you and us. Each of these five headings must be directly addressed.
In setting up a budget, there are two considerations to keep in mind that distinguish 3F from some—perhaps most—other funding agencies. First, the reviewers, other things being equal (i.e., assuming equal probability of success and equal importance of a successful outcome), will be likely to fund less expensive projects over more expensive ones, since they are trying to maximize the payoff for the discipline and society from a limited budget of their own. So the common practice in proposal budgeting of multiplying your normal salary rate by the number of days or weeks you think the work will take may make you uncompetitive with someone whose project is otherwise equally promising but who sets their payment at the minimum level that will get them to take on the task. Second, on the other hand, reviewers will not penalize long-shot projects in proportion to their lower probability of success: they will weight the potential payoff substantially more heavily than the expectation of success.
Proposals for this round of funding are due by September 15th, 2015, but earlier submission is recommended, since a few smaller projects may be funded before that date, others may be considered more extensively, and early submission spreads the rating load for our readers. While we strongly recommend using colleagues for extra input and critique of your proposal, they should not be identified as co-authors for that kind of input, and a principal investigator (or at most two) must be identified to take responsibility for the project and its funds.
Decision time will depend on the number of entries, but we’ll try to keep it inside a month. Payment to successful applicants is best done via a legally qualified philanthropy, e.g., your university or organization (keeping in mind that 3F will pay at most about 10% overhead; click here for overseas funding requirements). Payment will normally be in three installments: a small sum for start-up costs, if any; a mid-project payment for demonstrated results at that point (i.e., a progress report or draft results); and a final payment for demonstrated further and final results. Reporting 3F results openly, which we think is important, means that a summary of results and support will be posted on the web and perhaps elsewhere.
Evaluators should practice what they preach, and there are good reasons for that practice, so we welcome suggestions for improving this proposal or its presuppositions. Send them to our email: FasterForwardFund@gmail.com with the subject: Suggestions.