Jun 8, 9:00am

Doctoral Student Professional Development Workshop Sponsored by the "Journal of Public Administration Research and Theory" and "Perspectives on Public Management and Governance"

Location: Ward Circle Building, Room T2


ORGANIZERS:

  • Bradley E. Wright, The University of Georgia
  • Rhys Andrews, Cardiff University
  • Sandra Groeneveld, Leiden University
  • Patrick Kenis, Tilburg University
  • Stéphane Lavertu, The Ohio State University
  • Kenneth J. Meier, Texas A&M University
  • Stephanie Moulton, The Ohio State University
  • Dan Smith, University of Delaware
  • David M. Van Slyke, Syracuse University

This workshop will engage students in hands-on learning across a range of topics that may not be covered systematically or consistently across Ph.D. programs. By attending the workshop, students will:

  • Gain insight into the ‘publication journey’ of academic manuscripts by:
    • Writing and getting feedback on an original review of a recently published manuscript;
    • Discussing the revision process the manuscript’s author went through before their work was accepted for publication;
  • Build a network among peers, future colleagues, and potential employers; and
  • Get tips from experienced scholars on selecting journals to target for different types of work.

This workshop is sponsored by the Journal of Public Administration Research and Theory and Perspectives on Public Management and Governance. There is no cost to students to attend, and your conference registration fee will be waived if you are selected to participate in the workshop. A light lunch will be provided.

Jun 8, 9:00am

Experimental Design and Feedback Workshop

Location: Ward Circle Building, Room T3

ORGANIZERS:

  • Sebastian Jilke, Rutgers University
  • Oliver James, University of Exeter
  • Gregg Van Ryzin, Rutgers University

The increased use of experiments in public administration and management research offers much promise for advancing theory, methods, and substantive knowledge in the field. However, scholars need to think carefully about their experimental designs, including manipulations, before implementing the actual experiment. For instance, they need to be clear about what they want to manipulate and how they will do so. There is therefore a strong need to theorize carefully about intended effects in the design phase of an experiment. Other aspects of experimental design, such as the treatment of subjects, statistical power, and anticipated non-compliance, also need careful consideration at this stage.

However, few public administration scholars who do experimental research have opportunities to seek advice and feedback from colleagues before implementing their experimental designs, and public management conferences are not set up to facilitate the presentation and discussion of experimental research proposals. With this workshop we seek to strengthen experimental design capacity within the discipline by addressing this lacuna. A group of public administration scholars will present and review each other’s experimental research designs before they are implemented. The workshop thus provides a unique opportunity to present the experimental designs of specific studies and discuss their particular strengths and weaknesses, with the aim of improving the overall quality of experimental research in the discipline. It is aimed at any public administration scholar (including doctoral students) who plans to implement an experimental design and seeks qualified feedback before doing so.

Jun 8, 9:00am

Implementation Research in RCT Evaluations

Location: Ward Circle Building, Room T6

ORGANIZERS:

  • Carolyn Hill, MDRC
  • Rekha Balu, MDRC
  • David Greenberg, MDRC
  • Michelle Manno, MDRC

The learning objectives for workshop participants are:

  • Participants will learn the core questions typically addressed by implementation research in the context of randomized controlled trial (RCT) evaluations;
  • Participants will learn how technical assistance and monitoring data can help document the quality of service delivery and identify implementation challenges;
  • Participants will learn a framework for understanding program modifications as either intentional or unintentional adaptation of a program model;
  • Participants will learn a framework for articulating and describing service contrast (i.e., the difference in services received by treatment group participants compared with those received by control group participants).

Increasingly, policymakers at the federal, state, and local levels are requiring or encouraging the use of "evidence-based" programs and practices. These developments are of central interest to our community of public management researchers, both for understanding the evidence base of programs being implemented by the organizations we study (and whether and how they adapt evidence-based programs) and for understanding possibilities for further research in these settings.

Jun 8, 9:00am

Methodological Reporting Standards in Public Management Scholarship: An Interplay

Location: Ward Circle Building, Room T4

ORGANIZERS:  

  • Seulki Lee, New York University
  • Sonia Ospina, New York University  
  • Valentina Mele, Bocconi University
  • Marc Esteve, University College London

A description of research methodology in a journal article provides the rationale for the choice of analytic tools and techniques, allowing the reader to understand approaches to the research problem and critically evaluate methodological rigor. Despite the importance of reporting methodological decisions, however, there is a lack of agreement among researchers about what and how much to report, particularly along the qualitative-quantitative continuum. Systematic reviews of research methodology practices have pointed out the great variation in the reporting of methodological details, emphasizing the lack of reporting standards (Brower, Abolafia, and Carr 2000; Scandura and Williams 2000; Lowery and Evans 2004; Stewart 2012; Ospina, Esteve, and Lee 2016). Poor reporting of research methodology precludes full understanding of the research and prevents the assessment of research quality (Ospina, Esteve, and Lee 2016).

This workshop will engage public management researchers from different methodological communities in a discussion around the question: how do we develop agreements around reporting standards for our research that respect the internal logic of inquiry of different methodologies, but at the same time move the public management community toward a common ground?

Scholars who have published in major public management journals using different methods (e.g., case studies, regression analysis, experimental research, narrative analysis, and network analysis) will reflect on the issues discussed above. Based on their own practice and experience, they will offer insights to jump-start a group conversation on the intrinsic challenges of reporting and the solutions they have devised in light of the specific guidelines of public management journals and of their research traditions.

This session has two main goals:

  • To discuss and identify the methodological choices that require appropriate and sufficient reporting, including sampling/selection, data collection procedures, analysis strategies, and methodological limitations.
  • To compare criteria for evaluating the quality of research in different methodological traditions (e.g., for quantitative research: internal/external validity, reliability, transparency, and replicability; for inductive qualitative research: credibility, transferability, confirmability, and trustworthiness), and then to identify opportunities for common ground around the various dimensions of reporting.

Based on these goals, the session will encourage a discussion of the implications and ways forward. On the one hand, we aim to share reporting strategies that raise the bar of current standards while remaining consistent with the editorial policies of public management journals. On the other hand, we aim to discuss how to raise awareness among researchers about methodological reporting and to consider whether we need to change some journal policies in ways that improve methodological rigor and strengthen the field of public management research.

Jun 8, 9:00am

Designing Public Sector Performance

Location: Ward Circle Building, Room T5

ORGANIZER:

  • Don Moynihan, University of Wisconsin-Madison

This workshop examines the state of research on public sector performance. In particular, the session will focus on design issues, that is, deliberate efforts to alter some aspect of the public setting to improve outcomes. These can include governmental reforms, hiring policies, restructured incentives, or training. Historically, such practices have been implemented and then studied ex post using observational designs. But the availability of administrative data offers more opportunities for quasi-experimental designs, and researchers are addressing some of these issues with field experiments.

The workshop is an effort to pull together various trends in public management and public policy scholarship: an ongoing interest in performance, a shift toward evidence-based policy, and the embrace of experiments. Much of the research on performance has focused on cognitive processes, but less on organizational design. The rise of evidence-based policy units in government has embraced field experiments, but has rarely applied them to management questions. Governments continue to introduce major reforms in a fashion that resists good causal design, but the existence of administrative data offers new possibilities to apply quasi-experimental designs. These varying trends suggest that this is an opportune time to discuss the topic, and perhaps to establish a research community.

Each presenter will be asked to briefly present a piece of research, but also to reflect on their experience of the challenges of doing such research. We will feature research from the US and Europe, and include a session that focuses on research in developing countries. In addition, the DC location allows an exchange with policymakers engaged in public service improvement efforts.

The event is sponsored by the European Studies Alliance & La Follette School of Public Affairs, University of Wisconsin-Madison.

9:00 a.m. – 9:15 a.m. Introduction
Donald Moynihan

9:15 a.m. – 10:30 a.m. Session 1: Recruitment, Training and Incentives
Chair: Dan Chenok, IBM Center for The Business of Government

  • Nina Van Loon, Leiden University: Changing Incentive Structures in Performance Regimes
  • Elizabeth Linos, Harvard University & Behavioral Insights Team: Improving Recruitment
  • Lotte Bøgh Andersen, Aarhus University: Can we Train Better Leaders?

Respondent: Dustin Brown, US Office of Management and Budget

(15 minutes per presenter, 10 minutes for respondent, 25 minutes for discussion)

Break 10:30 a.m. – 10:45 a.m.

10:45 a.m. – 12:00 p.m. Session 2: Studying Performance Outcomes in Developing Countries
Chair: Obed Pasha, University of Massachusetts-Amherst

  • Martin Williams, Oxford University: Management of Bureaucrats and Public Service Delivery: A Scientific Replication in Nigeria and Ghana
  • Rikhil Bhavnani, University of Wisconsin-Madison: Research on Local Government Performance in India
  • Dan Honig, Johns Hopkins University: When Reporting Undermines Performance: The Costs of Politically Constrained Organizational Autonomy for Aid Agencies

Respondent: Carolyn Heinrich, Vanderbilt University

(15 minutes per presenter, 10 minutes for respondent, 25 minutes for discussion)

Break 12:00 p.m. – 12:10 p.m.

12:10 p.m. – 1:00 p.m. Session 3: What are Next Steps to Facilitate Progress?
Chair: Donald Moynihan

  • Carolyn Hill, MDRC
  • Brian Humes, National Science Foundation
  • Christopher Mihm, Government Accountability Office
  • Dan Rogger, World Bank

(10 minutes or less per respondent, 10 minutes for discussion)
