Publications catalog - journals



American Journal of Evaluation

Abstract/Description – provided by the publisher

Not available.

Keywords – provided by the publisher

Not available.

Availability
Institution detected: not detected
Period: Mar. 1999 to Dec. 2023
Source: SAGE Journals

Information

Resource type:

journals

Print ISSN

1098-2140

Electronic ISSN

1557-0878

Publisher

SAGE Publishing (SAGE)

Country of publication

United States

Publication date

Table of contents

The Role of Evaluation Theory and Practice in Narrowing the Research-to-Practice Gap

Tiffany Berry; Brittany Hite; Michelle Sloper; Haley Umans

The research-to-practice gap describes the well-documented phenomenon of researchers and practitioners working in silos, embedded in vastly different contexts. This “cultural divide” has several causes, including ineffective collaboration, inadequate understanding of context, and insufficient dissemination and translation of research. Evaluation theory and practice have been largely absent from discussions about solving the research-to-practice gap, despite remarkable alignment between these efforts and the core principles and goals of evaluation. In this article, we describe why the research-to-practice gap exists, current models and frameworks to address the gap, and how evaluation aligns with, and extends, efforts to create a bi-directional bridge between research and practice. This article is a call to action for professional evaluators, who are poised to narrow this gap given that they work at the intersection of research and practice. Through authentic collaboration, researchers, practitioners, and evaluators can leverage our collective strength to address the pressing challenges faced by our communities.

Palabras clave: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

From the Co-Editors: Welcome to Volume 45 and a New Editorial Team

Rodney Hopson; Laura R. Peck

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

And When We Say Our Last Goodbye…: In Memory of and Salute to Stafford Hood

Rodney Hopson

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

Evaluation Capacity Building Through Community-University Partnership

Sarah Suiter; Kathryn Y. Morgan; Sara Eccleston

Evaluation is essential to achieving program outcomes, especially when stakeholders engage with evaluation and make use of the findings. Both of these activities require evaluation capacity that might not be present in community-based organizations. In this paper, we describe how community-university partnership models can support evaluation capacity building (ECB). The basic framework for our ECB initiative was a semester-long, master's-level university course in which 5–6 community partners worked with small groups of 3–4 students to design an evaluation plan. We used mixed methods to assess (1) whether organizations implemented the evaluation plans developed in the course; (2) how organizations used the findings; and (3) what evaluation skills participants continued to use after the course ended. We found that organizations that implemented their evaluation plans achieved the intended outcomes of ECB, such as improving practice and communicating with stakeholders. These results suggest that community-university partnerships for developing ECB can be effective.

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

Program Plan Evaluation: A Participatory Approach to Bridge Plan Evaluation and Program Evaluation

Huey T. Chen; Liliana Morosanu; Victor H. Chen

Most program evaluation efforts concentrate on assessments of program implementation and program outcomes. However, another aspect that has not received sufficient attention in the literature is evaluation of the program plan. Since the quality of the plan and the planning process can influence program implementation and outcomes, there is a need to expand program evaluation efforts to cover program plans, and thus bridge plan evaluation and program evaluation. This paper draws on the program evaluation literature to illustrate two approaches to participatory program plan evaluation, ex-ante (proactive) and ex-post (reactive), including a conceptual framework that identifies the requirements, barriers, and strategies for evaluating program plans. Concrete examples are provided to illustrate the application of these two approaches.

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

Challenges and Adjustments in a Multisite School-Based Randomized Field Trial

Debbie L. Hahs-Vaughn; Christine Depies DeStefano; Christopher D. Charles; Mary Little

Randomized experiments are a strong design for establishing impact evidence because the random assignment mechanism theoretically allows confidence in attributing group differences to the intervention. The growth of randomized experiments within educational studies has been widely documented. However, randomized experiments within education have received criticism for implementation challenges and for ignoring context. Additionally, limited guidance exists for programs that are tasked with both implementation and evaluation within the same funding period. This study draws on a research team's experiences and examines opportunities and challenges in conducting a multisite randomized evaluation of an internship program for teacher candidates. We discuss how problems were collaboratively addressed and how the evaluation was adjusted to align with local realities, and we demonstrate how the research team, in consultation with local stakeholders, addressed methodological and program implementation problems in the field. Recommendations for future research are provided.

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

Mapping Evaluation Use: A Scoping Review of Extant Literature (2005–2022)

Michelle Searle; Amanda Cooper; Paisley Worthington; Jennifer Hughes; Rebecca Gokiert; Cheryl Poth

Factors influencing evaluation use have been a primary concern for evaluators. However, little is known about current conceptualizations of evaluation use, including what counts as use, what efforts encourage use, and how to measure use. This article identifies enablers of and constraints on evaluation use based on a scoping review of literature published since 2009 (n = 47). A comprehensive examination mapping the factors influencing evaluation use identified in the extant literature informs further study and captures its evolution over time. Five factors were identified that influence evaluation use: (1) resources; (2) stakeholder characteristics; (3) evaluation characteristics; (4) social and political environment; and (5) evaluator characteristics. We also synthesize practical and theoretical implications, as well as implications for future research. Importantly, our work builds upon two previous and impactful scoping reviews to provide a contemporary assessment of the factors influencing evaluation use and inform consequential evaluator practice.

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

Reclaiming Logic Modeling for Evaluation: A Theory of Action Framework

Rebecca H. Woodland; Rebecca Mazur

Logic modeling, the process that explicates how programs are constructed and theorized to bring about change, is considered standard evaluation practice. However, logic modeling is often experienced as a transactional, jargon-laden, discrete task undertaken to produce a document that complies with the expectations of an external entity, the consequences of which have minimal or even negative influence on the quality of program evaluation. This article presents the Logic Modeling Theory of Action Framework (LMTAF), which elucidates the needs, resources, and central activities of logic modeling and describes its potential evaluation-related benefits. The LMTAF situates evaluators as the primary intended users of logic modeling, and logic modeling as a fundamental element of each stage of a program evaluation life cycle. We aim to reassert the value of logic modeling for evaluation and provide evaluation practitioners with a useful touchstone for reflective practice and future action.

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

Book Review: Evaluation in Rural Communities by Allyson Kelley

Jeremy Braithwaite

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available

A Protocol for Participatory Data Use

Jane Buckley; Elyse Postlewaite; Thomas Archibald; Miriam R. Linver; Jennifer Brown Urban

The purpose of this paper is to offer both theoretical and practical support to evaluation professionals preparing to facilitate the utilization phase of evaluation with a program or organization team. The Systems Evaluation Protocol for Participatory Data Use (SEPPDU) presented here is rooted in a partnership approach to evaluation and is therefore designed as a way to structure conversations and facilitate thinking around data interpretation and decision making. The SEPPDU is presented in three main parts: (a) summarizing evaluation results, (b) interpreting results, and (c) planning for action. This paper describes specific and practical tips for facilitating each part, based on field experience in a variety of settings.

Keywords: Strategy and Management; Sociology and Political Science; Education; Health (social science); Social Psychology; Business and International Management.

Pp. Not available