Abstract
The goal of this paper is to contribute toward bridging the gap between policy design and implementation by focusing on domains, such as education, healthcare and community services, where policy implementation is largely left to the autonomous decisions of public service providers, which are themselves strategic actors. More specifically, we suggest that two characteristics of the spaces in which policies are designed, i.e., the level of ideational coherence and the prevailing function of the adopted policy instruments, generate systematic patterns of responses in terms of the extent of compliance with policy goals, the presence of strategic gaming and possible defiance. We illustrate our model through a contrastive case study of the introduction of performance-based funding in the higher education sector in four European countries (France, Italy, Norway, and the United Kingdom). Our analysis shows that the policy designs chosen by governments to steer public systems involve different trade-offs in terms of the responses of the public organizations whose cooperation is essential to effectively implement governmental policies. The model we propose therefore provides a framework to understand how these interactions unfold in specific contexts, what their effects on the achievement of policy goals are, and how policymakers could exploit their degrees of freedom in policy design to reduce unwanted effects.
Introduction
While many policy domains are characterized by high ambiguity, conflicts of interests and chaotic processes (Peters and Fontaine, 2022), the public policy literature has also emphasized that, even in these unsettled conditions, policymakers frequently intentionally design policies based on a previous understanding of policy goals, means and outcomes (Howlett & Mukherjee, 2014). In addition, there is a common perception that ‘better designed policies are more likely to correctly identify or solve the problems they are expected to address’ (Howlett and Mukherjee, 2018, p. 4). Peter May pointed out that ‘policy designs provide both the blueprint for carrying out policies and the foci for efforts to shore up or undermine policy implementation’ (May, 2012, p. 279).
Yet, it is also widely acknowledged that this relationship between policy design—i.e., the content of the decisions through which policymakers attempt to solve socially relevant problems—and implementation—i.e., the process through which the solutions to these problems are concretely pursued in practice—is a complex and slippery puzzle (Bardach, 1977; Pressman & Wildavsky, 1973). This so-called ‘implementation gap’ – that is, the fact that many governmental policies, despite having been consciously designed (Shattock, 2014), have failed to achieve their intended goals – has been a major concern of policy scholars (Hupe & Hill, 2016; Ongaro & Valotti, 2008; Saetren, 2014).
In this paper, we address the implementation gap by specifically focusing on contexts where policy implementation has been largely delegated to partially autonomous public service providers (Christensen, Lægreid and Røvik, 2020) as part of the process of decentralizing and autonomising public sector activities promoted by new governance models such as New Public Management (Ferlie et al., 1996; Pollitt & Bouckaert, 2000). While their mandate is largely defined by the State, these organizations (for example: universities, schools, and public hospitals) also behave as strategic actors pursuing their goals (Brunsson & Sahlin-Andersson, 2000) and are exposed to external pressures from society, professional communities, and economic markets. Therefore, they may respond to policy interventions in unexpected and heterogeneous ways (Durand et al., 2019; Oliver, 1991), thereby influencing the effectiveness of policy implementation (Christensen, Lægreid and Røvik, 2020). And, indeed, there is empirical evidence that many policy interventions failed to achieve their designed goals since the organizations targeted as ‘first implementers’ did not adopt the intended behavior and resorted to ‘ceremonial’ responses (McNulty & Ferlie, 2004) or to gaming (Dahler-Larsen, 2014).
We deal with this problem by bridging two streams of literature that rarely intersect despite covering two sides of the same coin. On the one hand, while acknowledging the complexity of policymaking, the policy design literature assumes a hierarchical perspective and emphasizes the relevance of policy instruments as the main drivers of policy implementation, thus overlooking that organizations’ strategic behavior is a powerful driver of the way policies are implemented (Bozeman, 2013). On the other hand, the literature on organizational responses tends to underestimate the role of policies in driving organizational behavior; policy design is considered as one of the many environmental pressures factored in when organizations decide on their strategies (King et al., 2010; Oliver, 1991) within broader organizational fields (Fligstein & McAdam, 2011). We contend that this analytical divide represents a significant obstacle toward a better understanding of how public decisions are made and implemented in core policy fields (Arellano-Gault et al., 2013).
To connect these elements, we mobilize the literature on the concept of policy design space to capture how the content of policy interventions is chosen (Capano & Mukherjee, 2020; Chindarkar et al., 2017). And we resort to the organizational literature to identify those general characteristics of policy design which are expected to affect the way public organizations respond to policy interventions (Oliver, 1991; Scott, 2008).
Thus, we introduce a new conceptualization of policy design spaces in terms of two dimensions, which characterize the broader policy regimes, and represent the drivers of the operational modes of implementation (Bressers & O'Toole, 2005; Salamon, 2002; Spicker, 2006): the level of coherence in the policy frames underlying the policy intervention on the one hand (Surel, 2000), and the functions of policy instrumentation in directing organizational behavior on the other hand (Lascoumes & Le Galès, 2007).
By drawing on the organizational literature, we suggest that these characteristics of the space in which policies are developed lead to systematic patterns in the responses deployed by public organizations, specifically on whether compliance or resistance may be expected (Oliver, 1991) and whether organizations may respond differently depending on their positioning in the field (Durand et al., 2019). These patterns, accordingly, also affect the extent to which intended policy goals are achieved.
We illustrate our model through a (literature-based) contrastive study of the introduction of performance-based funding (PBF; Mauro et al., 2017) for Higher Education Institutions (HEIs) in four European countries: France, Italy, Norway, and the United Kingdom. Our illustration indeed supports the framework and reveals different types of responses associated with the characteristics of the policy space; however, it also points to the potential effects of other contextual factors.
Indeed, we are aware that the drivers of implementation and of a policy’s success and failure in terms of both outputs and outcomes cannot be reduced only to the characteristics of the policy design. Policy implementation can be influenced by factors such as goal ambiguity (Matland, 1995), the autonomy of street-level bureaucracy (Baviskar & Winter, 2017; Meyers & Nielsen, 2012; Winter, 2012), a lack of funding (Dimitrakopoulos and Richardson, 2001), conflictual dynamics and political games (Bardach, 1977), lack of enforcement (May & Winter, 2009), the performance of public organizations involved in delivering the service (Meier and O'Toole Jr, 2006) and the role of partially autonomous implementation agencies (Pollitt et al., 2004).
Yet, in policy domains where the first and most pivotal implementers are public organizations, we believe that a framework that allows identifying systematic patterns of interactions between policy design and organizational responses is an essential step to understanding certain regularities of policy implementation in terms of success and failure. It might also open interesting avenues to systematically analyze the impact of other implementation dimensions on the achievement of policy goals.
We suggest that the theoretical framework proposed here, and the related propositions, are of relevance for policy design studies, as they provide analytical categories to bridge the gap between policy design and the behavioral responses of the public organizations that are the first implementers of the policy, and show how the ideational and instrumental dimensions of policy design should be analyzed as interdependent.
Moreover, policy design is the place where policymakers make conscious choices concerning the instruments to achieve certain policy goals based on perceptions of their appropriateness and effectiveness in a specific context (Capano & Lippi, 2017; Howlett et al., 2015) and on behavioral assumptions about the responses of the treated subjects (Howlett, 2018; Schneider & Ingram, 1990). Accordingly, understanding how the characteristics of the policy design space affect public organizations’ responses is of practical relevance to policymakers, who can exploit the available degrees of freedom to design more effective policies and to counter those organizational responses that might jeopardize the achievement of policy goals.
Policy design spaces and organizational responses
In a context characterized by multiple actors and competing interests, policymakers face a double challenge, i.e., “finding solutions that will be politically acceptable and achieve desired outcomes” (May, 1991). While the concept of policy design spaces allows us to describe the range of politically acceptable solutions (Sect. "Policy design and its spaces"), in the policy fields we are considering the implementation challenge is largely one of generating responses aligned with the policy goals among the ‘first implementers’ (Sect. "Beyond compliance: How can the behavior of first implementers align to policy goals?"). Hence, the main thrust of this paper is to develop an analytical framework that connects these two dimensions (Sect. “A framework to connect policy design spaces and organizational responses”).
Policy design and its spaces
Policy design is a never-ending process that involves not only the formulation of a policy but also the agenda-setting and implementation stages. All policy-making is policy design; this can be more or less deliberate, and it can also be accidental or experimental (Peters, 2018). However, even in this broad and complex landscape of policy design, it can be assumed that policymakers are committed, or at least convinced of their commitment, to finding solutions to problems perceived to be collective. As such, policy design, defined as the intentional component of policymaking that “involves the deliberate and conscious attempt to define policy goals and connect them to instruments or tools expected to realize those objectives” (Howlett et al., 2015, p. 291), is a fundamental task of policymaking. And there is a need to identify the linkages between these intentional policymakers’ efforts and the implementation of policy (Howlett and Mukherjee, 2018).
The adoption of this definition does not exclude the fact that policy design is also the result of a miscellany of ideas, political preferences, interests and technologies, as institutionalized in specific contingences (Kern & Howlett, 2009; Christensen and Lægreid, 2011; Rogge & Reichardt, 2016; Rayner et al., 2017) and embedded in a specific context in which different values, interests and political dynamics delimit its characteristics (Howlett & Mukherjee, 2014). It involves an extensive process of compromising and adaptation in which policymakers inherit complex mixes of instruments and goals of the past and attempt to transform them in order to reach (new) policy goals (Bressers & O'Toole, 2005; Capano, 2018; Capano & Pritoni, 2019; Howlett, 2004; Howlett & Mukherjee, 2014; Streeck & Thelen, 2005). It is through the design process that the various understandings of policy problems, policy goals and policy instruments are eventually integrated into actual policies (Peters and Fontaine, 2022). Policy design is surely a political activity in which the conflict between interest and ideas can be extensive (Turnbull, 2022), and in which the choice of instruments is not driven by rationalistic and neutral logic (Le Galès, 2022).
To account for the fact that the set of choices that policy actors can make in specific circumstances is limited by a whole range of contextual (i.e., political, socio-economic and cultural) factors, the concept of policy design space has been proposed to designate the set of eligible solutions available to policymakers according to the actual political contingency, governance arrangements, and policy legacy (Linder & Peters, 1991). In the literature, policy design spaces have been analyzed, in order to assess the quality of the design itself, in terms of technical or political capacity, technical or political concerns, and governmental capacity/policy anomalies (Capano & Mukherjee, 2020; Chindarkar et al., 2017, 2022).
In this paper, we characterize policy design spaces in terms of two main dimensions of public policies, i.e., their ideational and instrumental content. On the one hand, a popular way of representing ideational content has been through the identification of policy paradigms (Hall, 1993; Capano, 2003; Hogan and Howlett, 2015) or frames (Surel, 2000) that represent coherent combinations of cognitive and normative elements underpinning the policy design process, including elements such as the principles and goals that should govern public policy, norms for policy implementation and preferences for types of instruments (Schneider & Ingram, 1990). On the other hand, the literature has introduced the notion of policy implementation style, that is, general and historically embedded patterns of how chosen policy instruments (Vedung, 1998) are turned into reality and their delivery is organized (Capano & Toth, 2023; Howlett & Ramesh, 1993; Salamon, 2002).
We suggest that these two dimensions are relevant for observing the interaction between policy interventions and the responses of the treated subjects (Howlett & Ramesh, 1993).
Beyond compliance: how can the behavior of first implementers align to policy goals?
Public policy literature has extensively discussed changes in public sector governance toward decentralizing and autonomising public sector activities (Pollitt & Bouckaert, 2000). This involves the devolution of State tasks to partially autonomous agencies (Pollitt et al., 2001; Christensen and Lægreid, 2006) and the provision of greater autonomy to public service providers such as universities, schools and hospitals (Bode et al., 2017; Piening, 2011). While these organizations are still part of the public sector and endowed with an explicit mandate to achieve policy goals, they are situated farther from the public administration and frequently deliver services on a cost basis in market settings (e.g., competing with private providers). In many countries, policy reforms were introduced to transform these organizations into strategic actors (Arellano-Gault et al., 2013; Brunsson & Sahlin-Andersson, 2000) and to grant them more strategic and managerial autonomy (Verhoest et al., 2004), albeit with substantial cross-country variations in terms of the extent and type of autonomy (de Boer et al., 2007; Seeber et al., 2015).
From a public policy perspective, this issue is relevant because in sectors such as education, healthcare and culture, core policy goals are achieved through the service delivery efforts of these providers. Accordingly, most of the reforms introduced in these fields have focused on the behavior of these first implementers. For example, to improve the quality of educational outputs, various interventions have been introduced to increase the accountability of schools (Verger & Skedsmo, 2021). In health policy, the same was done to reduce waiting lists, a point of discontent in many healthcare systems; many of these interventions directly targeted the behavior of public hospitals (Toth, 2021). In higher education, to make universities more accountable and responsive to socio-economic needs, many governments have directly intervened in the institutional governance of universities (Shattock, 2014; Capano et al., 2017; Capano and Jarvis, 2020).
These reforms assumed that the chosen policy instruments would activate suitable behavioral mechanisms in the treated organizations and, therefore, generate compliance with policy goals (Schneider & Ingram, 1990). Yet, conceiving public organizations as strategic actors implies that they are able to engage with external demands (including public interventions) and develop intentional responses (Oliver, 1991; King et al., 2010). In this perspective, behaviors that have been defined as ‘gaming’ in performance management—such as ratchet effects, threshold effects, distortions in outputs (Hood, 2006) or bending the rules (Pollitt, 2013)—and that are considered as causes of misleading or failed implementation (Christensen and Lægreid, 2021; Taylor, 2021), may simply represent instances of strategic responses in which the treated organizations attempt to balance policy pressures against their own characteristics and interests.
The literature highlighted two major drivers of organizational responses to environmental pressures, i.e., institutional pressures and resource dependencies (Oliver, 1991).
On the one hand, organizations strive to maintain their legitimacy by complying with expectations from powerful audiences, such as professionals or the State, even if this requires adopting dysfunctional behavior (Powell & DiMaggio, 1991; Suchman, 1995). Institutional pressures are conveyed to organizations through shared values, social norms of behavior and coercion (Scott, 2008). On the other hand, organizations also seek to secure critical resources for their survival, such as attracting students to universities and patients to hospitals (Pfeffer & Salancik, 1978). Accordingly, shaping the resource environment is a powerful means to direct organizational behavior.
Organizational theory suggests that compliance is expected when institutional pressures are strong and aligned with resources, while defiance is expected when institutional pressures are weak, and compliance also jeopardizes resources (Oliver, 1991). In many organizational fields, more complex situations are encountered, such as the presence of competing institutional systems and misalignment between institutional pressures and resources; in these situations, more complex responses are expected (Greenwood et al., 2011).
For example, when external pressures threaten their core values or activities, organizations might resort to symbolic behaviors without changing their way of working (decoupling; Boxenbaum & Jonsson, 2008), for example by adopting policies only on paper, which potentially jeopardizes the achievement of policy goals (McNulty & Ferlie, 2004). Particularly when exposed to conflicting pressures, organizations may also resort to selective coupling, i.e., complying only selectively with policy interventions, for example in the administration but not in professional work (Pache & Santos, 2013), or to compromising by adopting hybrid practices which embed alternative norms and values (Greenwood et al., 2011). A higher-level strategic response is manipulation, whereby organizations try to shape their environment, for example by intervening in the political process to alter policies in their favor (Edelman et al., 1999).
Organizations also respond differently to environmental pressures depending on their positioning in the field (Durand et al., 2019; Greenwood et al., 2011). Specifically, high-status organizations are subject to stronger scrutiny, but also have more resources to manipulate policies, while low-status organizations might be forced to comply to keep their resource base (Fligstein & McAdam, 2011). Organizations also attribute different levels of saliency to external pressures, interpreting them as aligned, conflicting or unrelated to their identity and goals (Bundy et al., 2013) – for example, a strong professional identity (Leicht & Fennell, 2008) might generate resistance to managerial interventions affecting professional activities (Townley, 2002).
Heterogeneity of responses is relevant to public policies, as it might affect the extent of achievement of policy goals (Cattaneo et al., 2016).
A framework to connect policy design spaces and organizational responses
To develop our theoretical framework, we connect the core characteristics of the policy design space with drivers of organizational responses, by focusing specifically on those characteristics which are expected to lead to different types of responses.
On the one hand, organizational theory suggests that a major determinant of organizational responses is the degree of coherence of the policies’ ideational content. In a policy design space characterized by the presence of a hegemonic frame, most relevant audiences in the policy design process will share the same basic values and norms, and therefore pressures for compliance will be strong. Moreover, since decision-makers design interventions according to their legitimacy and efficiency in pursuing policy goals (Capano & Lippi, 2017), when a frame is hegemonic in a specific context, it is expected that the intervention will be coherently designed, convey coherent signals to the treated subjects and be supported by relevant audiences such as professionals or societal actors. On the contrary, when there are contrasting frames, the design process will be affected by the need to mediate different values and perceptions of the efficiency of instruments as well as the conflicting interests of the stakeholders involved, and thus it will include incoherence and ambiguity in terms of goals and instruments’ design, as well as potential contestation and conflict. We assume that incoherence and ambiguity in the ideational content may depend on the fact that the problem to be solved is perceived as being highly complex, uncertain and wicked; thus, policymakers cannot clearly agree on the goal of the intervention.
On the other hand, organizational theory suggests that the mechanisms that policy instruments can activate to influence organizational behavior also matter, as exerting cognitive, normative or coercive pressures results in distinct behavioral responses (Scott, 2008). This insight has been explored in the sociological literature on public policies as well, which distinguished between a symbolic-normative and a pragmatic function of instruments (Edelman & Suchman, 1997; Lascoumes & Le Galès, 2007). In the former, policy instruments establish cognitive templates and social norms, which might activate compliance by the organizational targets; in the latter, they directly affect organizational behavior through coercion and/or by touching the flow of resources from the state. Symbolic-normative pressures are generally considered the most powerful driver of organizational behavior, as they lead to ‘taken-for-granted’ responses based on social pressures and imitation (DiMaggio & Powell, 1983), while coercive pressures are expected to generate more strategic and self-interested behavior (Scott, 2008).
Accordingly, while other characteristics of policy interventions are relevant, such as intensity, target groups and time horizon, we consider this distinction as the most suitable for the specific goal of the paper, i.e., connecting the characteristics of the selected policy instruments with the behavioral responses of the target organizations (see also Schneider & Ingram, 1990). Two important remarks are in order here: first, unlike the distinction between coercive and voluntary instruments (Doern & Phidd, 1983), these functions characterize the same policy instrument to different degrees depending on its mode of delivery (Salamon, 2002). Second, in line with our focus on policy design, this distinction refers to the intended function of instruments when policies are designed, which might be different from their actual effect, a situation that we will analyze empirically later in the paper.
We therefore represent policy design spaces in terms of two axes, i.e., the coherence of the ideational content (coherent vs. incoherent) and the prevalent function of the chosen instruments (symbolic-normative vs. pragmatic), as in Fig. 1.
In Q1 (coherent and pragmatic policy design), beliefs and goals underlying the governmental choices are dominated by a single frame, and it is expected that the choice of policy instruments is consistent with the dominant frame; for example, incentive tools are prevalent if the dominant policy frame is managerialism. Further, policy interventions leverage rules and economic incentives to directly affect the behavior of the treated subjects and adopt tight monitoring of responses.
Under these conditions, public organizations are expected to acquiesce with policy interventions: tight and direct enforcement leaves limited space for compromises, while defiance would have limited chances against a coherent policy design and implementation.
If the substantive content of policy interventions clashes with organizational identity, an alternative strategy would be (covert) avoidance, i.e., formally complying with policy interventions while keeping existing organizational practices (Townley, 2002); this strategy is, however, potentially dangerous in terms of legitimacy and resources given tight monitoring. We therefore expect limited heterogeneity in responses.
In Q2 (coherent and symbolic/normative policy design) beliefs and goals underlying the governmental choices are dominated by a single frame. However, policy implementation is largely left to the implementers’ decisions; when regulatory, financial and information instruments are adopted, they have mostly the function of setting norms of behavior. Therefore, Q2 shares with Q1 the strength and coherence of institutional pressures; however, policy instruments do not directly affect organizational resources and activities.
In such a context, organizations are expected to respond with compliance as otherwise their legitimacy would be penalized. However, the policy intervention leaves latitude for compromising to avoid internal conflicts, for example by adopting managerial practices in the administration, while keeping professional norms in the conduct of professional activities.
We therefore expect some heterogeneity and local adaptation in responses depending on the positioning and identity of individual organizations; policy goals will be achieved to a large extent, but the process will be gradual and take also into account goals and interests of the treated organizations.
In Q3 (incoherent and pragmatic policy design), the ideational frame is incoherent, for example managerialism is mixed with bureaucratic principles, and there is policy conflict around beliefs and goals. At the same time, implementation is based on constraining design such as detailed regulations and highly impacting incentive instruments; given the incoherence of the ideational frame, policy interventions are expected to be unstable and subject to continuous negotiation with the implementers.
In this space, organizations are confronted with a contradictory environment characterized by contested norms and values of behavior and by direct, but potentially unstable, interventions affecting their operations and resources. Therefore, avoidance is expected, as there are no benefits in complying with a policy intervention which might be modified on short notice. Alternatively, organizations might engage in manipulation to alter policies in their favor. We expect heterogeneity of responses driven by local interests, and achievement of policy goals only when they are aligned with the individual organizations’ goals.
In Q4 (incoherent and symbolic-normative policy design space), treated organizations are confronted with an institutionally complex environment characterized by the simultaneous and lasting presence of competing policy frames; at the same time, policy interventions have a limited impact on resources, but convey diverse norms and values whose application is left to the voluntary decision of the treated organizations. In such an environment, organizations are expected to selectively couple their behavior with multiple policy frames to maximize legitimacy – they might implement bureaucratic norms in the administration, professional norms in research and managerial norms in fund-seeking (Kraatz & Block, 2008). Different identities are expected to generate heterogeneity in responses, as some organizations are more aligned with either policy frames. Very limited achievement of policy goals is therefore expected.
Methods
Performance-based funding
To illustrate the usefulness of the proposed framework, we have applied it to performance-based funding (PBF) in higher education. PBF can be defined as a resource distribution instrument linking state funding to the performance of public organizations, with the aim to provide a financial incentive for improved outcomes and an accountability mechanism (adapted from Hicks, 2012). PBF is associated with New Public Management (NPM) policy rationales (Ferlie et al., 1996) and has been adopted in several policy domains and countries since the 1980s (Mauro et al., 2017).
Variation has been observed in how the delivery of this instrument has been designed and, specifically, on the direct economic impact, on the method adopted for evaluating performance and on the extent of monitoring (Zacharewicz et al., 2018). The literature also provided evidence that HEIs did not always respond as expected to policy interventions (Aagaard, 2015) and of unintended effects, like strategic gaming (Dahler-Larsen, 2014).
Data sources
Our illustration is based on a systematic literature review. We first analyzed a number of review papers and reports to identify general patterns (Boer et al., 2015; Butler, 2010; Hicks, 2012; Teixeira et al., 2022; Zacharewicz et al., 2018). Further, four country cases were identified based on patterns of policy reforms in higher education (Paradeise et al., 2009). The selected countries are France, Italy, Norway and the United Kingdom. A focused literature search on these countries was performed through Google Scholar and by snowballing cited and citing works on the implementation of PBF in higher education. Finally, we undertook a search for works dealing specifically with organizational responses to PBFs (see Sivertsen, 2023 and Kivistö & Mathies, 2023).
Our illustration can be described as a small-n case study (Yin, 1994), in which the cases have been selected as contrastive according to our typology of policy design spaces. The goal is, first, to provide a preliminary illustration of the framework’s components, i.e., the characterization of policy design spaces, the behavioral model of organizations and the expected responses and outcomes, and, second, to contribute to theory development by enriching the model with additional dimensions driving organizational responses.
The timeframe covers the period from the late 1980s, when the prototype of PBFs, i.e., the UK Research Assessment Exercise, was introduced (Barker, 2007).
Analytical dimensions
We compare cases using dimensions as suggested by theory and saturated through descriptors derived from empirical evidence (Glaser & Strauss, 1998).
First, we describe the design space in terms of the characteristics of the ideational content (coherent vs. incoherent) and of the prevalent function of policy interventions (symbolic-normative vs. pragmatic). On the ideational side, we analyze the presence of alternative frames in the policy debate and, specifically, how the performance-based frame inspired by NPM was combined with (other) country-specific frames (Baker, 2022). We also consider the extent to which PBF policies were stable over time, as this is an indicator of coherence and of lack of contestation. Finally, we look at the involvement of actors other than the state in the policy design process, as (multiple) policy frames are usually associated with the presence of actors endorsing them (Sabatier, 2007). On the instrumental side, we analyze the share of funding attributed through PBF as an indicator of the direct impact on organizational resources, and the method adopted to assess performance and to allocate funding (Zacharewicz et al., 2018). We also observe the involvement of actors such as HEIs and professionals in the instruments’ design, as the literature suggests that this impacts implementation (Sivertsen, 2023). Finally, we analyze the extent to which PBF and its effects were systematically evaluated and whether this led to substantial redesign.
Second, the literature suggests that both individual organizational characteristics (Durand et al., 2019) and the field’s structure (Fligstein & McAdam, 2011) affect organizational responses. Accordingly, we consider the extent to which HEIs have been constructed into strategic actors and granted strategic and managerial autonomy (Brunsson & Sahlin-Andersson, 2000; de Boer et al., 2007; Christensen, 2011), and the extent to which academics maintained their power and resisted pressures toward managerialization (Townley, 2002). Further, we analyze the extent of resource dependence on the state and of financial autonomy as a major driver of organizational responses (Pfeffer & Salancik, 1978). Finally, we look at the field’s structure in terms of vertical hierarchization, as shaped by international rankings (Hazelkorn, 2017), and functional specialization of HEI types having different missions (Bleiklie, 2003).
Third, as for impacts and outcomes, we first focus on the organizational responses observed and their heterogeneity, i.e., the extent to which HEIs displayed strategic behavior when confronted with policy pressures (Durand et al., 2019; Oliver, 1991). We then analyze the extent to which policy interventions led to a redistribution of resources, and the implications for organizational governance, as well as for professional work. Further, we examine the achievement of the policy goals, i.e., increasing the volume and excellence of the national research output (Whitley & Glaser, 2007), as well as unintended effects (Diefenbach, 2009), such as limiting academics’ autonomy and promoting more incremental research (Wang et al., 2018).
Empirical illustration
Characterizing policy design spaces
The UK is characterized by an ideationally coherent design space with a direct impact on universities. The context was set by the reforms in the public sector introduced by the conservative Thatcher government, which made the UK a forerunner in NPM (Christensen and Laegreid, 2001; Andresani & Ferlie, 2006). Consistently with this ideational frame, in 1986 a Research Assessment Exercise (RAE) was introduced in which university departments were evaluated by disciplinary panels on the grounds of scientific excellence, and funding was then computed based on the grades received (Barker, 2007; Rebora & Turri, 2013). NPM was therefore combined with a tradition of professional autonomy through the involvement of academic élites in its design and implementation (Baker, 2022).
The RAE was very incisive, since more than 90% of institutional research funding was distributed through this instrument. It was coherently designed to link evaluation to funding and, hence, to incentivize an improvement in the scientific performance of universities (Butler, 2010). The rules of the game, such as the output definition, the grading criteria and the panels’ composition, were published in advance, while assessment results were public; the workings and outcomes of the system were also regularly evaluated (Barker, 2007; Smith et al., 2011).
This design was remarkably stable despite changes in the political leadership (Deem et al., 2007). After the 2001 RAE, a discussion emerged on broadening the evaluation criteria and replacing the peer review system, perceived as very resource consuming, with bibliometric indicators. The proposal was rejected, due to the opposition of academics and of most universities; the new evaluation system, relabelled the Research Excellence Framework (REF), maintained the existing delivery model while adding societal impact as an evaluation criterion, coherently with the dominant ideational frame (Martin, 2011; Smith et al., 2011).
As for Norway, the policy design space can be described as coherent, but mostly relying on the symbolic-normative function of policy interventions. Norway has been considered a slow mover in managerial reforms (Christensen and Lægreid, 1999). In higher education, reforms were introduced stepwise, respecting traditional prerogatives of academic decision-making (Bleiklie & Michelsen, 2013) and leaving discretion to actors in the implementation. The introduction of NPM was therefore moderated by a consensus-based tradition of public policy (Baker, 2022); nevertheless, the ideational content of the reforms was remarkably consistent (Bleiklie, 2009).
The ‘Norwegian model’ of PBF, established in 2004, introduced a publication indicator based on simple weights; a small share of funding (about 4%) was then allocated to HEIs based on their share of publication points (Sivertsen, 2016). The direct impact was therefore limited. However, the publication indicator acquired the role of a standard in evaluating research quality (Sauder & Espeland, 2009) due to its simplicity and transparency. Moreover, its design was originally drafted by the National Association of Universities and addressed issues such as disciplinary balance and the coverage of national-language publications. The evaluation conducted in 2013 showed wide acceptance by universities and scholars and a positive effect on productivity (Frolich, 2011); accordingly, the system was continued with only minor corrections (Sivertsen, 2016).
Since 2010, Italy has adopted a national research assessment exercise, known under the acronym VQR (Geuna & Piolatto, 2016). The underlying policy design can be described as incoherent in its ideational content, but instrumented in a way that potentially has a strong impact on universities’ resources. Indeed, two competing frames can be identified. The first, based on NPM, considered the VQR a tool to increase the quality of university research (Aversano et al., 2018); the second was based on the old state-centric paradigm that assumes all universities should be treated as equal (Capano, 2011). Accordingly, the VQR underwent a long period of gestation that included a first experimental exercise, the establishment of a national evaluation agency, and the institutionalization of a public discourse about the need to assess research quality (Capano, 2010; Capano et al., 2017). Unlike in the UK, in Italy the exercise was designed without any kind of public consultation, but through a process in which the National Conference of Rectors interacts with the ministry and the national agency in preparing the call, thereby allowing universities to manipulate the rules of the game.
The share of performance-based funding in total public funding increased from 16% in 2014 to 30% in 2021. Thus, the policy intervention had a potentially high impact on universities, which was however mitigated by a cap on gains and losses at 5% of the previous allocation. The allocation method is a mix of bibliometrics and peer review, a differentiation which implied grade inflation for some fields, such as the social sciences, in which only peer review was adopted. Moreover, there have been changes both in the distribution of the grades and in the rules for the composition of the panels. While the VQR outcomes have been widely debated (Abramo & D’Angelo, 2021; Checchi et al., 2019), no systematic evaluation was undertaken.
Finally, in France policy reforms have been mostly driven by national issues, such as the weakness of universities, without a clear reference to a broader policy rationale (Musselin & Paradeise, 2009). Managerial ideas became visible from the early 2000s, but never constituted the ideational core of the reforms (Mathisen Nyhagen, 2015). This context led to a train of reforms whose emphasis was on the restructuring of universities (Musselin & Paradeise, 2009), contractualisation (Jongbloed & Vossensteyn, 2001) and the creation of excellent universities through mergers (Cremonini et al., 2013). A core component was the establishment of an independent evaluation agency, which started to evaluate HEIs and research units (Capano & Turri, 2017).
Funding reforms were introduced from 2006 in the wake of a new budgetary law (Barbato et al., 2022): these included a new allocation model, called SYMPA, in which 20% of research funding (representing 30% of total funding) was based on performance, measured through the evaluation of research units and the number of doctoral degrees (Calviac, 2019). The PBF was however only loosely connected with evaluation outcomes and was considered by the ministry more as an aid in negotiations with universities, coherently with the central role of the state in managing French higher education (Bleiklie & Michelsen, 2013). Eventually, in 2018, the system was abandoned.
We summarize our comparisons of policy design in Table 1.
Organizational characteristics and field’s structure
Since the 1980s, the State actively promoted the transformation of UK universities from professional organizations to managerial forms (Ferlie & Andresani, 2009), which implied the reduction of professional autonomy, the managerialization of university leaders (Breakwell & Tytherleigh, 2008) and the establishment of internal control systems (Deem et al., 2007). Accordingly, in responding to policy pressures, universities are expected to display a great deal of strategic behavior, even more so since direct state funding no longer accounts for the majority of funds (Jongbloed & Lepori, 2015). UK higher education is also characterized by a strong vertical hierarchy (Taylor, 2003), in which a small core of highly reputed universities, such as the 24 members of the Russell Group, receives most of the funding (Barbato & Turri, 2019). Accordingly, heterogeneity in responses to policy interventions is expected to be largely driven by status (Cattaneo et al., 2016).
Norwegian HEIs were traditionally part of the public sector, with professional autonomy and limited central power. The so-called Quality Reform of 2001 introduced a new management model, which foresaw the transition to a more centralized model with appointed leaders (Bleiklie, 2009). HEIs were, however, left free to decide whether to adopt the new model, and most of them chose a mixed model combining centralization with the participation of academics (Stensaker, 2006). Norwegian higher education is still mostly funded by the state, with a low share of student fees and of private funding (Jongbloed & Lepori, 2015). While historically horizontal specialization between universities and professional higher education prevailed (Jungblut & Woelert, 2018), the system has become more integrated and hierarchized due to mergers and the upgrading of several colleges to universities (Kyvik & Stensaker, 2016).
University governance in Italy traditionally combined governmental bureaucracy with academic corporation, and a weak institutional level (Reale & Potì, 2009). From the 1990s, the state attempted to modernize universities. While providing more autonomy and strengthening central authority, the reform maintained a tight regulatory role for the state (Capano, 2011), which was reinforced by the dependency on (shrinking) public funding (Civera et al., 2021). Change in internal governance was also incremental, and the collegial principle remains dominant in internal decision-making (Donina et al., 2015). As for the system’s structure, Italian higher education is characterized by little specialization and low quality differentiation (Barbato & Turri, 2019; Rossi, 2010), while most differences remain related to the geographical north-south divide (Mateos-González & Boliver, 2019).
French universities were traditionally considered ‘non-existent’ in a system where power was shared between disciplinary communities and the state bureaucracy, and most research was conducted in large public research organizations (Musselin, 2013). However, remarkable changes took place in the late 1990s, including increasing autonomy, state-university contracts and strategic planning (Musselin & Paradeise, 2009), which led to a strengthening of universities’ central authority and to a decline in academic autonomy (Mignot-Gérard et al., 2023). As for funding, French universities remain highly dependent on public funding, but there has been some diversification with the increase of project funding. Policy reforms attempted to restructure the system, which was perceived as fragmented and lacking excellence, by launching a program to support ‘excellent universities’ (Cremonini et al., 2013) and through HEI mergers (Heller-Schuh et al., 2020).
Accordingly, we characterize the systems as in Table 2.
Organizational responses and outcomes
As for the UK, compliance was the main observed response to the introduction of the RAE: most HEIs acted strategically to cope with the new funding system by hiring productive academics, monitoring scientific production and putting pressure on departments (Barker, 2007; Pinar & Horne, 2022). Responses were however differentiated by position in the field’s hierarchy. Top-ranked universities exploited their status position for hiring (Jappe & Heinze, 2023), while lower-tier universities increasingly focused on student fees as a funding source (Rolfe, 2003). Accordingly, the main impact on academic life was felt in middle-tier pre-1992 universities, which activated dense internal procedures to monitor their academic staff (Elton, 2000; Talib, 2003). Overall, the introduction of the RAE/REF led to stronger polarization between research-oriented and education-oriented HEIs (Barbato & Turri, 2019) and further increased concentration, with 24 out of 174 institutions receiving nearly three-quarters of the funding (Adams & Gurney, 2010). It is widely accepted that the REF brought an increase in the volume of UK scientific production (Butler, 2010; Geuna & Piolatto, 2016); however, the effect on excellence is contested, as there is evidence that the REF pushed academics to prioritize quantity over quality (Barbato & Turri, 2022) and that further concentrating resources in a few places impoverished the overall research base (Adams & Gurney, 2010).
In Norway, universities did not transfer economic incentives to departments, nor did they implement management strategies to enhance their evaluation, as the overall economic impact was very low. However, the publication indicator started to be employed for monitoring research at the departmental level and for assessing the quality of academics, even though it was never designed for that purpose (Aagaard, 2015). As such, it became an important input for universities’ decisions, such as the opening of new chairs and tenure decisions, and shifted the publication behavior of academics toward higher productivity and international journals (Frolich, 2011). Therefore, the impact on the publication volume of Norwegian HEIs was comparable to that of the (much more forceful) REF in the UK (Sivertsen, 2016), but the effect of the PBF was due not to the financial incentive, but to the normative dimension of the design content. This process was less centrally directed by organizational managers and, accordingly, did not affect the balance of power within universities; moreover, the share of publication points of the four historical universities diminished, reflecting a broadening of the system (Aagaard, 2015).
As for Italy, despite the large amount of resources involved, the impact of the VQR on the distribution of funding among universities was limited (Geuna & Piolatto, 2016). Some evidence has been provided that the scientific productivity of Italian universities increased during the considered period and that low-productivity universities from the south improved their performance more than the best universities in the country (Abramo & D’Angelo, 2021; Checchi et al., 2020; Demetrescu et al., 2020; Grisorio & Prota, 2020). While some universities use the results of the VQR also for internal allocation, there is no evidence of systematic monitoring of performance or of the VQR impacting internal decision processes. It has also been suggested that the increasing average productivity of Italian researchers was due less to the research exercise itself than to a national regulation on recruitment establishing that, before applying for a professorial position, it is necessary to obtain a national qualification (Marini, 2017). Overall, the main effect of the VQR was, together with other measures, to establish a minimum standard of quality for professors, thereby fostering convergence from the bottom rather than pushing universities to enhance their research capacity.
As for France, the impacts of the PBF are difficult to ascertain since, on the one hand, it was a minor element in a train of reforms in which organizational restructuring and contractualization were prevalent and, on the other hand, implementation was opaque and changed over time up to the abandonment of the SYMPA model. It has been suggested that the main impact was to anchor within universities the idea that funding should be linked to performance (Boitier et al., 2015), leading to stronger formalization and connection to performance goals in the internal budgeting process (Mignot-Gérard et al., 2023). This decentralized implementation generated large heterogeneity in responses. As for improving the overall system’s performance, impacts have been, at best, mediocre (Mai, 2022). The position of French universities in international rankings hardly improved and, overall, the country’s position in international comparisons worsened (Highman, 2020). The characteristics of organizational responses and outcomes are summarized in Table 3.
Discussion
Our proposal aimed to bridge the gap between policy design and implementation, by analyzing the implications of different characteristics of policy design spaces (in terms of the level of ideational coherence and the prevailing function of the adopted policy instruments) on the types of responses of public organizations as ‘first implementers’ of public policies and strategic actors.
The four country cases have provided preliminary evidence of the effects of the different types of policy design.
First, when comparing the UK and NO with IT and FR (coherent vs. incoherent policy design space), our data suggest that the level of ideational coherence is a pivotal dimension in addressing the targeted public organizations (Hall, 1993; Hogan and Howlett, 2015). Despite differences in the adopted instruments, both in the UK and in Norway a coherent policy frame activated strong responses by the treated organizations, which, however, did not always align with the policymakers’ expectations: UK universities largely resorted to hiring strategies rather than improving researchers’ productivity, while the Norwegian publication indicator was taken up to assess individual academics, a purpose for which it was never intended.
In such a perspective, ideational coherence might be considered a mixed blessing: on the one hand, it allows designing policy interventions based on clear norms, goals and mechanisms of action; on the other hand, it bears the risk that the policy design process is blind to the goals and norms of the treated organizations, which might accordingly come up with unexpected responses that are potentially problematic given the strong effects.
Heterogeneous policy spaces, such as in Italy and France, display different trade-offs, as they bear the risk of partial or even no real implementation, with treated organizations resorting to mostly ritual behavior. At the same time, our data show that the composite ideational content allowed for compromising and selective coupling by the treated organizations and, accordingly, for a flexible implementation fitting the context of each organization and avoiding internal conflicts.
Therefore, in these policy design spaces, the main issue for policymakers is not to avoid goal displacement, but to convey a clear message that policy goals need to be achieved, while delegating implementation to the treated organizations.
Proposition 1
The more coherent the ideational content of a policy design space, the stronger will be the effects of policy interventions.
Proposition 1a
Coherent design spaces imply powerful effects and should therefore be monitored to avoid goal displacement.
Proposition 1b
Incoherent policy design spaces bear the risk of limited implementation and therefore need strong instrumentation and monitoring of compliance.
Second, when comparing countries within these two groups (UK vs. NO and IT vs. FR), we found evidence that our second dimension of policy design spaces, i.e., the prevalent function of the selected instruments (Lascoumes & Le Galès, 2007), also matters, and interacts with ideational coherence.
In countries characterized by a coherent ideational content (UK and NO), our data suggest that an instrumentation intentionally designed to impact directly on resources, such as in the UK, ‘crowds out’ the instruments’ normative function, as the treated organizations responded strategically by maximizing their revenues, but without necessarily complying with the original goals underlying the policy design. This can obviously lead to certain forms of gaming (Hood, 2006; Pollitt, 2013), generating responses that, while efficient for the treated organizations, were not always functional at the system’s level, such as publishing more papers with similar contents or hiring people to improve evaluation scores. On the contrary, in Norway, the normative function activated behaviors in line with the goal of increasing researchers’ productivity, although departments and selection committees were more responsive than universities. In other words, normative pressures generate practical effects through activated behavioral responses, such as compliance with normative standards of productivity. Since compliance refers directly to the underlying policy goals, there is less risk of strategic gaming. Indeed, a recent review of the implementation of PBF in European countries suggested that these systems were implemented more effectively when evaluation was not directly associated with a large share of funding (Sivertsen, 2023).
In countries where the ideational content of the policy design space is heterogeneous (IT and FR), our data suggest that a normative instrumentation, as in France, will have little impact, as normative pressures are powerful only if supported by a coherent ideational content; in such spaces, a pragmatic element, such as linking a large share of resources to performance, is required to make policy norms and goals credible to the treated organizations and to elicit at least partial responses. As shown by the Italian case, when a high proportion of public funding is distributed according to performance, there can be some improvement in HEI performance, thus showing how resource dependence can be a significant trigger of organizational responses despite ideational incoherence.
Proposition 2
The more the ideational content is coherent, the less the chosen policy instrumentation requires a pragmatic function to be effective.
Proposition 2a
In coherent policy design spaces, the instruments’ pragmatic function will crowd out the normative one leading to potential gaming.
Proposition 2b
In incoherent policy design spaces, a pragmatic function is required to generate significant responses.
Third, while in all four cases there is evidence of variation in responses between the treated organizations, heterogeneity took different forms depending on the characteristics of policy design space and of the organizational field (Cattaneo et al., 2016).
When comparing the two countries with homogeneous ideational content (UK and NO), we observed opposite effects. A pragmatic intervention, such as in the UK, reinforced segmentation driven by resourcing opportunities, since for less-reputed HEIs investing in improving their research evaluation was less economically interesting than focusing on education; on the contrary, in Norway, a mostly normative intervention promoted convergence, with ‘new’ universities striving to imitate the most reputed ones (DiMaggio & Powell, 1983).
By contrast, where ideational incoherence prevailed (IT and FR), policy implementation led to more complex forms of heterogeneity in responses. When ideational incoherence is paired with a pragmatic function of instruments, as in Italy, universities with higher status focused on manipulating the rules of the game, with the minimal goal of avoiding a loss of resources. As a result, scientific quality did not improve, as the advantages would have been smaller than the effort required; conversely, universities with low research quality were pushed to improve their performance, as this required less effort. The outcome of this design was an improvement in the system’s average quality without a push toward excellence. When ideational incoherence is instead paired with a normative function of the adopted instruments, as in France, both pragmatic and normative pressures are weak. Accordingly, this design space intrinsically leads to heterogeneous responses driven by local conditions, and the final outcome is far from the expected policy goal.
Proposition 3
Different policy design spaces translate into different forms of heterogeneity in responses.
Proposition 3a
In coherent policy design spaces, pragmatic policy interventions lead to hierarchical segmentation, while normative interventions to isomorphism.
Proposition 3b
In incoherent policy design spaces, pragmatic policy interventions lead to improvements only in the low-status HEIs, while normative policy interventions to local heterogeneity.
Fourth, our analysis suggests that the characteristics of the policy design space were indeed conditioned by the pre-existing field’s structure, as well as by other concurring policies. In the UK, the pragmatic function of the adopted policies contributed to reinforcing the position of a few top-level universities, whose academic élites played a core role in the REF; however, this was also enabled by the fact that the losers could obtain financial resources by accepting more students (Barbato & Turri, 2019; Rolfe, 2003). This way of balancing the policy intervention further strengthened the system’s hierarchical structure, at the price of not fostering an increase in quality in the low-tier HEIs. On the contrary, the small size of the system helped the implementation of the Norwegian policy design, as imitation effects generated by social norms are stronger and adaptation to local conditions is manageable. As for Italy, in a system with few excellent universities, the introduction of PBF was acceptable only by guaranteeing a baseline level of resources to all universities, with the consequence that the main impact was on low-performing universities rather than on excellence. Finally, creating a single playing field where universities would compete for resources based on quality proved unfeasible in a fragmented space such as France, characterized by local specificities and direct negotiations between the ministry and individual universities, leading eventually to the breakdown of the PBF system.
This analysis confirms our assumption that there are structural drivers that push policymakers to design policy interventions in one of the four spaces. For example, we can expect that if the relevant problem is unsettled or particularly complex and ambiguous in terms of goals or values, policymakers will design the policy in one of the two ideationally heterogeneous spaces, while the choice of the function of instruments could depend on existing administrative traditions, on specific political or economic conditions, or on the field’s structure and power relationships.
The theoretical value of our characterization of policy design spaces lies exactly in the ability to summarize this variety of contextual factors affecting policy design and implementation in terms of ideal types, which can be associated with different responses of the treated organizations. While there cannot be causal linearity between the types of policy design and policy implementation, exactly because of the strategic capability of the ‘first implementers’, nevertheless we were able to draw reasonable expectations regarding systematic patterns of behavior.
Our analysis also highlighted that other dimensions affected the behavioral responses foreseen by our model. For instance, in the UK, the large financial autonomy of the analyzed organizations reinforced strategic gaming and thus amplified the unwanted effects of the designed policies. Conversely, in France and in Italy, the short time horizon of policy interventions further weakened their effects. These dimensions are only partially correlated with our typology, as incoherent policy designs might be stable over time if supported by lasting policy coalitions. We have provided some preliminary evidence that their role in implementation differed by type of policy design; for instance, the level of trust did not emerge as a significant factor in a policy design space in which the practical function of instruments prevailed, such as in the UK, while in Norway, trust and social ties between policymakers and universities strengthened the policies’ normative effects. We therefore suggest that our typology provides a framework for future studies to understand the varied effects of other dimensions of implementation – such as intensity, level of trust, and time horizon – on the achievement of policy goals when the basic characteristics of the policy design space are fixed.
Conclusion
In this paper we have tried to connect the literature on policy design with the one on organizational responses to (policy) environments. The focus has been to conceptualize a link between the content of policy design and its effect in terms of organizational response. Thus, our contribution does not have the goal of resolving the various causes and dynamics of the implementation gap. Instead, it aims to shed light on specific linkages that can be considered strategic when the first implementers of a policy are public organizations.
This focus allows us to fill a gap in the understanding of implementation that both policy studies and organization studies still have. Policy studies very often disregard how the strategic behaviors of organizations are pivotal in aligning or de-aligning policy implementation with policymakers’ expectations. Organization studies, in turn, frequently disregard the notion that policy design is a core environmental factor influencing organizational behavior. Thus, linking the types of policy design spaces to specific patterns of organizational responses helps balance these two perspectives: it allows policy studies to gain a deeper understanding of implementation that takes organizations into account, and it enables researchers in organizational studies to understand how different designs of public policies can impact organizations.
This attempt has been driven by the assumption that policies can reach their goals only if the organizations in charge of their implementation behave in the expected way and thus, in one way or another, comply. Drawing on the public policy literature, we have proposed a typology of policy design spaces by dichotomizing the ideational content and the characteristics of the adopted policy instruments. We have then advanced propositions relating each of the four policy design spaces to expected organizational responses, such as compliance, compromise, selective coupling and manipulation. Finally, we have offered some empirical illustrations of these propositions by drawing examples from the field of comparative higher education.
Our data show that this framework is promising for understanding whether and how policy design can achieve the expected outcomes. The four types of policy design should be conceived as structured spaces in which policymakers can make only delimited choices that will lead to specific organizational reactions by the first implementers. We suggest that this more realistic understanding of organizational responses can allow policymakers to adapt public interventions to the specific characteristics of the field in which they take place and of the treated organizations.
Beyond our exemplary case, the framework could be applied to other policy fields in which public organizations, or organizations contracted by governments, are the first implementers. An example is the introduction of national testing systems, which has resulted in various unexpected responses by schools (e.g., reducing attention to non-tested subjects or encouraging poorly performing students to drop out). While the literature has considered these responses in terms of gaming (Heilig & Darling-Hammond, 2008; Jacob & Levitt, 2003; Yiu, 2020), our framework would rather consider them a consequence of a design that does not take into consideration the characteristics of its target organizations. Our framework could also allow a better understanding of the unexpected responses to various policy interventions in health care, such as the introduction of reimbursement systems (Parkinson et al., 2019), the regulation of waiting lists (Breton et al., 2020), or the design of public procurement (García-Altés et al., 2023).
Our study has limitations that open further avenues for research. First, it would be important to understand how different contextual conditions drive policymakers to elaborate their interventions in one of the four policy design spaces, and the extent to which these types show regularities across policy domains and countries. Second, while we are convinced that the strength of our framework lies in its general categories that, with some variation, can be applied to grasp the general response patterns of the treated organizations, several contextual conditions are likely to affect policy implementation, such as the level of trust between the State and public organizations, the role semi-autonomous agencies play in leading implementation processes (Talbot, 2004; Verhoest et al., 2009), and the organizational structure of the implementers (Egeberg & Trondal, 2018). Understanding how these factors could moderate organizational responses and reduce the risk of unwanted effects would be important for designing more robust policies. Third, more systematic analyses of organizational responses to policies and of their drivers would be needed to provide a rigorous test of our theoretical model and propositions.
All in all, every type of policy design involves specific trade-offs of which policymakers should be aware and that policy scholars should take into consideration when analyzing policy design dynamics and assessing the results of implementation. Both policymakers and policy scholars, although from different perspectives, should pay more attention to the responses of those organizations that, being the targets of a policy intervention, are also its first implementers. Many implementation problems and shortcomings stem from treating the role of these organizations as ‘neutral’; yet policy interventions directly affect their identity and positioning, and the organizations may therefore mobilize to defend them.
It is therefore time for policy scholars to recognize that the first goal of policy design is to elicit the expected responses from the organizations involved, and that the ideational and instrumental content of policy design cannot avoid treating these organizations as the real first target of any policy intervention, and thus as the main and inescapable actors of the implementation stage.
References
Aagaard, K. (2015). How incentives trickle down: Local use of a national bibliometric indicator system. Science and Public Policy, 42(5), 725–737.
Abramo, G., & D’Angelo, C. A. (2021). The different responses of universities to introduction of performance-based research funding. Research Evaluation, 30(4), 514–528.
Adams, J., & Gurney, K. (2010). Funding selectivity, concentration and excellence: How good is the UK's research? Higher Education Policy Institute.
Andresani, G., & Ferlie, E. (2006). Studying governance within the British public sector and without: Theoretical and methodological issues. Public Management Review, 8(3), 415–431.
Arellano-Gault, D., Demortain, D., Rouillard, C., & Thoenig, J. (2013). Bringing public organization and organizing back in. Organization Studies, 34(2), 145–167.
Aversano, N., Manes-Rossi, F., & Tartaglia-Polcini, P. (2018). Performance measurement systems in universities: A critical review of the Italian system. In E. Borgonovi, E. Alessi-Pessina, & C. Bianchi (Eds.), Outcome-based performance management in the public sector (pp. 269–287). Springer.
Baker, I. (2022). Institutional logics as a theoretical framework: A comparison of performance based funding policies in the United Kingdom, Germany, and France. Higher Education Policy, 1–16.
Barbato, G., Pin, C., & Turri, M. (2022). A longitudinal analysis of the relationship between central government and universities in France: The role of performance measurement mechanisms. In E. Caperchione & C. Bianchi (Eds.), Governance and performance management in public universities (pp. 69–85). Springer.
Barbato, G., & Turri, M. (2019). What do positioning paths of universities tell about the diversity of higher education systems? An exploratory study. Studies in Higher Education, 45(9), 1919–1932.
Barbato, G., & Turri, M. (2022). An analysis of methodologies, incentives, and effects of performance evaluation in higher education: The English experience. In E. Caperchione & C. Bianchi (Eds.), Governance and performance management in public universities (pp. 49–68). Springer.
Bardach, E. (1977). The implementation game: What happens after a bill becomes a law. Cambridge: MIT Press.
Barker, K. (2007). The UK research assessment exercise: The evolution of a national research evaluation system. Research Evaluation, 16(1), 3–12.
Baviskar, S., & Winter, S. C. (2017). Street-level bureaucrats as individual policymakers: The relationship between attitudes and coping behavior toward vulnerable children and youth. International Public Management Journal, 20(2), 316–353.
Bleiklie, I. (2003). Hierarchy and specialisation: On the institutional integration of higher education systems. European Journal of Education, 38(4), 341–355.
Bleiklie, I. (2009). Norway: From tortoise to eager beaver? In C. Paradeise, I. Bleiklie, E. Reale, & E. Ferlie (Eds.), University governance: Western European comparative perspectives (pp. 127–152). Springer.
Bleiklie, I., & Michelsen, S. (2013). Comparing HE policies in Europe. Higher Education, 65(1), 113–133.
Bode, I., Lange, J., & Märker, M. (2017). Caught in organized ambivalence: Institutional complexity and its implications in the German hospital sector. Public Management Review, 19(4), 501–517.
Boer, H. D., Jongbloed, B., Benneworth, P., Cremonini, L., Kolster, R., Kottmann, A., Lemmens-Krug, K., & Vossensteyn, J. J. (2015). Performance-based funding and performance agreements in fourteen higher education systems. University of Twente, Enschede.
Boitier, M., Chatelain-Ponroy, S., Riviere, A., Mignot-Gerard, S., Musselin, C., Sponem, S. (2015). Le Nouveau Management Public dans les universités françaises, un puzzle doctrinal encore mal articulé en pratiques? Comptabilité, Contrôle et Audit des invisibles, de l'informel et de l'imprévisible. Hal Open Science, hal-01188862.
Boxenbaum, E., & Jonsson, S. (2008). Isomorphism, diffusion and decoupling. In R. Greenwood, C. Oliver, K. Sahlin, & R. Suddaby (Eds.), The SAGE Handbook of Organizational Institutionalism (pp. 78–98). Sage.
Bozeman, B. (2013). What organization theorists and public policy researchers can learn from one another: Publicness theory as a case-in-point. Organization Studies, 34(2), 169–188.
Breakwell, G. M., & Tytherleigh, M. Y. (2008). UK university leaders at the turn of the 21st century: Changing patterns in their socio-demographic characteristics. Higher Education, 56(1), 109–127.
Bressers, H. T. A., & O’Toole, L. J. (2005). Instrument selection and implementation in a networked context. In P. Eliadis, M. M. Hill, & M. Howlett (Eds.), Designing government: From instruments to governance (pp. 132–153). McGill-Queen’s University Press.
Breton, M., Smithman, M. A., Sasseville, M., Kreindler, S. A., Sutherland, J. M., Beauséjour, M., Green, M., Marshall, E. G., Jbilou, J., & Shaw, J. (2020). How the design and implementation of centralized waiting lists influence their use and effect on access to healthcare-A realist review. Health Policy, 124(8), 787–795.
Brunsson, N., & Sahlin-Andersson, K. (2000). Constructing organizations: The example of the public sector reform. Organization Studies, 21(4), 721–746.
Bundy, J., Shropshire, C., & Buchholtz, A. K. (2013). Strategic cognition and issue salience: Toward an explanation of firm responsiveness to stakeholder concerns. Academy of Management Review, 38(3), 352–376.
Butler, L. (2010). Impacts of performance-based research funding systems: A review of the concerns and the evidence. In OECD (Ed.), Performance-based funding for public research in tertiary education institutions (pp. 127–165). Organisation for Economic Cooperation and Development.
Calviac, S. (2019). Le financement des universités: évolutions et enjeux. Revue Française D’administration Publique, 1, 51–58.
Capano, G. (2003). Administrative traditions and policy change: When policy paradigms matter. The case of Italian administrative reform during the 1990s. Public Administration, 81(4), 781–801.
Capano, G. (2010). A Sisyphean task: Evaluation and institutional accountability in Italian higher education. Higher Education Policy, 23(1), 39–62.
Capano, G. (2011). Government continues to do its job. A comparative study of governance shifts in the higher education sector. Public Administration, 89(4), 1622–1642.
Capano, G. (2018). Reconceptualizing layering—From mode of institutional change to mode of institutional design: Types and outputs. Public Administration, 97(3), 590–604.
Capano, G., & Jarvis, D. S. (Eds.). (2020). Convergence and diversity in the governance of higher education: Comparative perspectives. Cambridge: Cambridge University Press.
Capano, G., & Lippi, A. (2017). How policy instruments are chosen: Patterns of decision makers’ choices. Policy Sciences, 50(2), 269–293.
Capano, G., & Mukherjee, I. (2020). Policy design and non-design: Discerning the content of policy packaging, patching, stretching and layering. In G. Capano & M. Howlett (Eds.), A modern guide to public policy (pp. 203–220). Edward Elgar.
Capano, G., & Pritoni, A. (2019). Varieties of hybrid systemic governance in European Higher Education. Higher Education Quarterly, 73(1), 10–28.
Capano, G., Regini, M., & Turri, M. (2017). Changing governance in universities. London: Palgrave Macmillan.
Capano, G., & Toth, F. (2023). Health policy under the microscope: A micro policy design perspective. Frontiers in Public Health, 11(1180836), 1–10. https://doi.org/10.3389/fpubh.2023.1180836
Capano, G., & Turri, M. (2017). Same governance template but different agencies. Higher Education Policy, 30(2), 225–243.
Cattaneo, M., Meoli, M., & Signori, A. (2016). Performance-based funding and university research productivity: The moderating effect of university legitimacy. The Journal of Technology Transfer, 41(1), 85–104.
Checchi, D., Malgarini, M., & Sarlo, S. (2019). Do performance-based research funding systems affect research production and impact? Higher Education Quarterly, 73(1), 45–69.
Checchi, D., Mazzotta, I., Momigliano, S., & Olivanti, F. (2020). Convergence or polarisation? The impact of research assessment exercises in the Italian case. Scientometrics, 124(2), 1439–1455.
Chindarkar, N., Ramesh, M., & Howlett, M. (2022). Designing social policies: Design spaces and capacity challenges. In G. B. Peters & G. Fontaine (Eds.), Research handbook of policy design (pp. 323–337). Edward Elgar.
Chindarkar, N., Howlett, M., & Ramesh, M. (2017). Introduction to the special issue: “Conceptualizing effective social policy design: Design spaces and capacity challenges.” Public Administration and Development, 37(1), 3–14.
Christensen, T., & Laegreid, P. (2001). New Public Management - The transformation of ideas and practice. Ashgate Pub Ltd.
Christensen, T., & Lægreid, P. (Eds.) (2006). Autonomy and regulation: Coping with agencies in the modern state. Edward Elgar Publishing.
Christensen, T. (2011). University governance reforms: Potential problems of more autonomy? Higher Education, 62(4), 503–517.
Christensen, T., & Lægreid, P. (1999). New public management: Design, resistance, or transformation? A study of how modern reforms are received in a civil service system. Public Productivity & Management Review, 23(2), 169–193.
Christensen, T., & Lægreid, P. (2011). Complexity and hybrid public administration—theoretical and empirical challenges. Public Organization Review, 11, 407–423.
Christensen, T., & Lægreid, P. (2021). Performance management: Experiences and challenges. In B. Hildreth, E. Lindquist, & J. Miller (Eds.), The Routledge Handbook of Public Administration (4th ed., pp. 210–222). London: Routledge.
Christensen, T., Lægreid, P., & Røvik, K. A. (2020). Organization theory and the public sector: Instrument, culture and myth. Routledge.
Civera, A., Meoli, M., & Paleari, S. (2021). When austerity means inequality: The case of the Italian university compensation system in the period 2010–2020. Studies in Higher Education, 46(5), 926–937.
Cremonini, L., Benneworth, P., Dauncey, H., & Westerheijden, D. F. (2013). Reconciling republican ‘Egalité’ and global excellence values in French higher education. In J. C. Shin & B. Kehm (Eds.), Institutionalization of World-Class University in global competition (pp. 99–123). Springer.
Dahler-Larsen, P. (2014). Constitutive effects of performance indicators: Getting beyond unintended consequences. Public Management Review, 16(7), 969–986.
de Boer, H., Enders, J., & Leisyte, L. (2007). Public sector reform in Dutch higher education: The organizational transformation of the University. Public Administration, 85(1), 27–46.
Deem, R., Hillyard, S., Reed, M., & Reed, M. (2007). Knowledge, higher education, and the New Managerialism: The changing management of UK universities. Oxford: Oxford University Press.
Demetrescu, C., Ribichini, A., & Schaerf, M. (2020). Are Italian research assessment exercises size-biased? Scientometrics, 125(1), 533–549.
Diefenbach, T. (2009). New public management in the public sector: The dark side of managerialistic “enlightenment.” Public Administration, 87(4), 892–909.
DiMaggio, P. J., & Powell, W. W. (1983). The iron cage revisited: Institutional isomorphism and collective rationality in organizational fields. American Sociological Review, 48(1), 147–160.
Dimitrakopoulos, D. G., & Richardson, J. (Eds.). (2001). Implementing EU public policy. London: Routledge.
Doern, G. B., & Phidd, R. W. (1983). Canadian public policy: Ideas, Structure, process. London: Methuen.
Donina, D., Meoli, M., & Paleari, S. (2015). Higher education reform in Italy: Tightening regulation instead of steering at a distance. Higher Education Policy, 28(2), 215–234.
Durand, R., Hawn, O., & Ioannou, I. (2019). Willing and able: A general model of organizational responses to normative pressures. Academy of Management Review, 44(2), 299–320.
Edelman, L. B., & Suchman, M. C. (1997). The legal environments of organizations. Annual Review of Sociology, 23(1), 479–515.
Edelman, L. B., Uggen, C., & Erlanger, H. S. (1999). The endogeneity of legal regulation: Grievance procedures as rational myth. American Journal of Sociology, 105(2), 406–454.
Egeberg, M., & Trondal, J. (2018). An Organizational Approach to Public Governance. Oxford University Press.
Elton, L. (2000). The UK research assessment exercise: Unintended consequences. Higher Education Quarterly, 54(3), 274–283.
Ferlie, E., & Andresani, G. (2009). United Kingdom from bureau professionalism to new public management? In C. Paradeise, I. Bleiklie, E. Reale, & E. Ferlie (Eds.), University governance: Western European comparative perspectives (pp. 177–195). Springer.
Ferlie, E., Ashburner, L., Fitzgerald, L., & Pettigrew, A. (1996). The New Public Management in Action. Oxford University Press.
Fligstein, N., & McAdam, D. (2011). Toward a general theory of strategic action fields. Sociological Theory, 29(1), 1–26.
Frolich, N. (2011). Multi-layered accountability: Performance-based funding of universities. Public Administration, 89(4), 840–859.
García-Altés, A., McKee, M., Siciliani, L., Barros, P. P., Lehtonen, L., Rogers, H., Kringos, D., Zaletel, J., & De Maeseneer, J. (2023). Understanding public procurement within the health sector: A priority in a post-COVID-19 world. Health Economics, Policy and Law, 18(2), 172–185.
Geuna, A., & Piolatto, M. (2016). Research assessment in the UK and Italy: Costly and difficult, but probably worth it (at least for a while). Research Policy, 45(1), 260–271.
Glaser, B. G., & Strauss, A. L. (1998). Grounded theory. Hans Huber.
Greenwood, R., Raynard, M., Kodeih, F., Micelotta, E., & Lounsbury, M. (2011). Institutional complexity and organizational responses. The Academy of Management Annals, 5(1), 317–371.
Grisorio, M. J., & Prota, F. (2020). Italy’s national research assessment: Some unpleasant effects. Studies in Higher Education, 45(4), 736–754.
Hall, P. A. (1993). Policy paradigms, social learning, and the state: The case of economic policymaking in Britain. Comparative Politics, 25(3), 275–296.
Hazelkorn, E. (Ed.). (2017). Global rankings and the geopolitics of higher education. London: Routledge.
Heilig, J. V., & Darling-Hammond, L. (2008). Accountability Texas-style: The progress and learning of urban minority students in a high-stakes testing context. Educational Evaluation and Policy Analysis, 30(1), 75–110.
Heller-Schuh, B., Lepori, B., & Neuländtner, M. (2020). Mergers and acquisitions in the public research sector: Toward a comprehensive typology. Research Evaluation, 29(4), 366–376.
Hicks, D. (2012). Performance-based university research funding systems. Research Policy, 41(2), 251–261.
Highman, L. (2020). Remapping French higher education: Towards a multi-tiered higher education system? Tertiary Education and Management, 26, 199–214.
Hogan, J., & Howlett, M. (Eds.). (2015). Policy paradigms in theory and practice: Discourses, ideas and anomalies in public policy dynamics. Cham: Springer.
Hood, C. (2006). Gaming in target world: The targets approach to managing British public services. Public Administration Review, 66(4), 515–521.
Howlett, M. (2004). Beyond good and evil in policy implementation: Instrument mixes, implementation styles, and second generation theories of policy instrument choice. Policy and Society, 23(2), 1–17.
Howlett, M. (2018). Matching policy tools and their targets: Beyond nudges and utility maximisation in policy design. Policy & Politics, 46(1), 101–124.
Howlett, M., & Mukherjee, I. (2014). Policy design and non-design: Towards a spectrum of policy formulation types. Politics and Governance, 2(2), 57.
Howlett, M., & Mukherjee, I. (Eds.). (2018). Routledge handbook of policy design. London: Routledge.
Howlett, M., Mukherjee, I., & Woo, J. J. (2015). From tools to toolkits in policy design studies: The new design orientation towards policy formulation research. Policy & Politics, 43(2), 291–311.
Howlett, M., & Ramesh, M. (1993). Patterns of policy instrument choice: Policy styles, policy learning and the privatization experience. Review of Policy Research, 12(1), 3–24.
Hupe, P. L., & Hill, M. J. (2016). ‘And the rest is implementation.’ Comparing approaches to what happens in policy processes beyond Great Expectations. Public Policy and Administration, 31(2), 103–121.
Jacob, B. A., & Levitt, S. D. (2003). Rotten apples: An investigation of the prevalence and predictors of teacher cheating. The Quarterly Journal of Economics, 118(3), 843–877.
Jappe, A., & Heinze, T. (2023). Research funding in the context of high institutional stratification: Policy scenarios for Europe based on insights from the United States. In B. Lepori, B. Jongbloed, & D. Hicks (Eds.), Handbook of public research funding (pp. 203–220). Edward Elgar.
Jongbloed, B., & Lepori, B. (2015). The funding of research in higher education: Mixed models and mixed results. In M. Souto-Otero, J. Huisman, D. D. Dill, H. de Boer, A. S. Oberai, & L. Williams (Eds.), Handbook of higher education policy and governance (pp. 439–461). Palgrave.
Jongbloed, B., & Vossensteyn, H. (2001). Keeping up Performances: An international survey of performance-based funding in higher education. Journal of Higher Education Policy and Management, 23(2), 127–145.
Jungblut, J., & Woelert, P. (2018). The changing fortunes of intermediary agencies: Reconfiguring higher education policy in Norway and Australia. In M. Nerland & L. Yates (Eds.), Reconfiguring knowledge in higher education (pp. 25–48). Dordrecht: Springer.
Kern, F., & Howlett, M. (2009). Implementing transition management as policy reforms: A case study of the Dutch energy sector. Policy Sciences, 42(4), 391–408.
King, B., Felin, T., & Whetten, D. (2010). Finding the organization in organizational theory: A meta-theory of the organization as a social actor. Organization Science, 21(1), 290–305.
Kivistö, J., & Mathies, C. (2023). Incentives, rationales, and expected impact. Linking performance-based research funding to internal funding distributions of universities. In B. Lepori, B. Jongbloed, & D. Hicks (Eds.), Handbook of public research funding (pp. 186–202). Edward Elgar.
Kraatz, M. S., & Block, E. S. (2008). Organizational Implications of Institutional Pluralism. In R. Greenwood, C. Oliver, K. Sahlin, & R. Suddaby (Eds.), The SAGE handbook of organizational institutionalism (pp. 243–275). Sage.
Kyvik, S., & Stensaker, B. (2016). Mergers in Norwegian higher education. In R. Pinheiro, L. Geschwind, & T. Aarrevaara (Eds.), Mergers in higher education (pp. 29–42). Springer.
Lascoumes, P., & Le Galès, P. (2007). Introduction: Understanding public policy through its instruments? From the nature of instruments to the sociology of public policy instrumentation. Governance, 20(1), 1–21.
Le Galès, P. (2022). Policy instrumentation with or without policy design. In G. B. Peters & G. Fontaine (Eds.), Research Handbook of Policy Design (pp. 88–103). Edward Elgar.
Leicht, K. T., & Fennell, M. L. (2008). Institutionalism and the professions. In R. Greenwood, C. Oliver, K. Sahlin, & R. Suddaby (Eds.), The Sage handbook of organizational institutionalism (pp. 431–448). Sage.
Linder, S., & Peters, B. G. (1991). The logic of public policy design: Linking policy actors and plausible instruments. Knowledge and Policy, 4, 125–151.
Mai, A. N. (2022). The effect of autonomy on university rankings in Germany, France and China. Higher Education for the Future, 9(1), 75–92.
Marini, G. (2017). New promotion patterns in Italian universities: Less seniority and more productivity? Data from ASN. Higher Education, 73(2), 189–205.
Martin, B. R. (2011). The research excellence framework and the ‘impact agenda’: Are we creating a Frankenstein monster? Research Evaluation, 20(2), 247–254.
Mateos-González, J. L., & Boliver, V. (2019). Performance-based university funding and the drive towards ‘institutional meritocracy’ in Italy. British Journal of Sociology of Education, 40(1), 145–158.
Mathisen Nyhagen, G. (2015). Between slow and comprehensive reformers: Comparing government’s funding policies of universities in three European countries. International Journal of Public Administration, 38(8), 533–543.
Matland, R. E. (1995). Synthesizing the implementation literature: The ambiguity-conflict model of policy implementation. Journal of Public Administration Research and Theory, 5(1), 145–174.
Mauro, S. G., Cinquini, L., & Grossi, G. (2017). Insights into performance-based budgeting in the public sector: A literature review and a research agenda. Public Management Review, 19(7), 911–931.
May, P. J. (1991). Reconsidering policy design: Policies and publics. Journal of Public Policy, 11(2), 187–206.
May, P. J. (2012). Policy design and implementation. In B. G. Peters & J. Pierre (Eds.), The Sage handbook of public administration (2nd ed., pp. 279–291). Sage.
May, P. J., & Winter, S. C. (2009). Politicians, managers, and street-level bureaucrats: Influences on policy implementation. Journal of Public Administration Research and Theory, 19(3), 453–476.
McNulty, T., & Ferlie, E. (2004). Process transformation: Limitations to radical organizational change within public service organizations. Organization Studies, 25(8), 1389–1412.
Meier, K. J., & O’Toole, L. J., Jr. (2006). Political control versus bureaucratic values: Reframing the debate. Public Administration Review, 66(2), 177–192.
Meyers, M. K., & Nielsen, V. L. (2012). Street-level bureaucrats and the implementation of public policy. In B. G. Peters & J. Pierre (Eds.), The Sage handbook of public administration (2nd ed., pp. 305–318). Sage.
Mignot-Gérard, S., Sponem, S., Chatelain-Ponroy, S., & Musselin, C. (2023). Kaleidoscopic collegiality and the use of performance research metrics The case of French universities. Higher Education, 85(7), 887–918.
Musselin, C. (2013). The long march of French universities. Routledge.
Musselin, C., & Paradeise, C. (2009). France: From incremental transitions to institutional change. In C. Paradeise, E. Reale, I. Bleiklie, & E. Ferlie (Eds.), University governance: Western European comparative perspectives (pp. 21–49). Springer.
Oliver, C. (1991). Strategic responses to institutional processes. The Academy of Management Review, 16(1), 145–179.
Ongaro, E., & Valotti, G. (2008). Public Management Reform in Italy: Explaining the Implementation Gap. International Journal of Public Sector Management, 21(2), 174–204.
Pache, A., & Santos, F. (2013). Inside the hybrid organization: Selective coupling as a response to competing institutional logics. Academy of Management Journal, 56(4), 972–1001.
Paradeise, C., Reale, E., Bleiklie, I., & Ferlie, E. (Eds.). (2009). University Governance. Springer.
Parkinson, B., Meacock, R., & Sutton, M. (2019). How do hospitals respond to price changes in emergency departments? Health Economics, 28(7), 830–842.
Peters, B. G. (2018). Policy problems and policy design. Edward Elgar.
Peters, B. G., & Fontaine, G. (2022). Introduction: Operationalizing the policy design framework. In G. B. Peters & G. Fontaine (Eds.), Research handbook of policy design (pp. 1–38). Edward Elgar.
Pfeffer, J., & Salancik, G. R. (1978). The external control of organizations. Harper & Row.
Piening, E. P. (2011). Insights into the process dynamics of innovation implementation: The case of public hospitals in Germany. Public Management Review, 13(1), 127–157.
Pinar, M., & Horne, T. J. (2022). Assessing research excellence: Evaluating the research excellence framework. Research Evaluation, 31(2), 173–187.
Pollitt, C., & Bouckaert, G. (2000). Public management reform: A comparative analysis. Oxford University Press.
Pollitt, C. (2013). The logics of performance management. Evaluation, 19(4), 346–363.
Pollitt, C., Bathgate, K., Caulfield, J., Smullen, A., & Talbot, C. (2001). Agency fever? Analysis of an international policy fashion. Journal of Comparative Policy Analysis, 3, 271–290.
Pollitt, C., Talbot, C., Caulfield, J., & Smullen, A. (2004). Agencies: How governments do things through semi-autonomous organizations. Cham: Springer.
Powell, W. W., & DiMaggio, P. J. (1991). The new institutionalism in organizational analysis. University of Chicago Press.
Pressman, J., & Wildavsky, A. (1973). Implementation: How great expectations in Washington are dashed in Oakland. Berkeley: University of California Press.
Rayner, J., Howlett, M., & Wellstead, A. (2017). Policy mixes and their alignment over time: Patching and stretching in the oil sands reclamation regime in Alberta. Canada. Environmental Policy and Governance, 27(5), 472–483.
Reale, E., & Potì, B. (2009). Italy: Local policy legacy and moving to an ‘in between’ configuration. In C. Paradeise, E. Reale, I. Bleiklie, & E. Ferlie (Eds.), University governance: Western European comparative perspectives (pp. 77–102). Springer.
Rebora, G., & Turri, M. (2013). The UK and Italian research assessment exercises face to face. Research Policy, 42(9), 1657–1666.
Rogge, K. S., & Reichardt, K. (2016). Policy mixes for sustainability transitions: An extended concept and framework for analysis. Research Policy, 45(8), 1620–1635.
Rolfe, H. (2003). University strategy in an age of uncertainty: The effect of higher education funding on old and new universities. Higher Education Quarterly, 57(1), 24–47.
Rossi, F. (2010). Massification, competition and organizational diversity in higher education: Evidence from Italy. Studies in Higher Education, 35(2), 277–300.
Sabatier, P. (Ed.). (2007). Theories of the policy process. Boulder: Westview Press.
Saetren, H. (2014). Implementing the third generation research paradigm in policy implementation research: An empirical assessment. Public Policy and Administration, 29(2), 84–105.
Salamon, L. (2002). The tools of government: A guide to the new governance. Oxford: Oxford University Press.
Sauder, M., & Espeland, W. N. (2009). The discipline of rankings: Tight coupling and organizational change. American Sociological Review, 74(1), 63–82.
Schneider, A., & Ingram, H. (1990). Behavioral assumptions of policy tools. The Journal of Politics, 52(2), 510–529.
Scott, W. R. (2008). Institutions and organizations. Sage.
Seeber, M., Lepori, B., Montauti, M., Enders, J., De Boer, H., Weyer, E., Bleiklie, I., Hope, K., Michelsen, S., & Mathisen, G. N. (2015). European universities as complete organizations? Understanding identity, hierarchy and rationality in public organizations. Public Management Review, 17, 1444–1474.
Shattock, M. (Ed.). (2014). International trends in university governance: Autonomy, self-government and the distribution of authority. London: Routledge.
Sivertsen, G. (2016). Publication-based funding: The Norwegian model. In M. Ochsner, S. Hug, & H. Daniel (Eds.), Research Assessment in the Humanities (pp. 79–90). Springer.
Sivertsen, G. (2023). Performance-based research funding and its impacts on research organizations. In B. Lepori, B. Jongbloed, & D. Hicks (Eds.), Handbook of Public Research Funding (pp. 90–106). Edward Elgar.
Smith, S., Ward, V., & House, A. (2011). “Impact” in the proposals for the UK’s research excellence framework: Shifting the boundaries of academic autonomy. Research Policy, 40(10), 1369–1379.
Spicker, P. (2006). Policy analysis for practice: Applying social policy. Policy Press.
Stensaker, B. (2006). Governmental policy, organisational ideals and institutional adaptation in Norwegian higher education. Studies in Higher Education, 31(1), 43–56.
Streeck, W., & Thelen, K. (2005). Beyond continuity: Institutional change in advanced political economies. Oxford University Press.
Suchman, M. C. (1995). Managing legitimacy: Strategic and institutional approaches. Academy of Management Review, 20(3), 571–610.
Surel, Y. (2000). The role of cognitive and normative frames in policy-making. Journal of European Public Policy, 7(4), 495–512.
Talbot, C. (2004). Executive agencies: Have they improved management in government? Public Money & Management, 24(2), 104–112.
Talib, A. A. (2003). The offspring of new public management in English universities: ‘Accountability’, ‘performance measurement’, ‘goal-setting’ and the prodigal child – the RAE. Public Management Review, 5(4), 573–583.
Taylor, J. (2003). Institutional diversity in UK higher education: Policy and outcomes since the end of the binary divide. Higher Education Quarterly, 57(3), 266–293.
Taylor, J. (2021). Public officials’ gaming of performance measures and targets: The nexus between motivation and opportunity. Public Performance & Management Review, 44(2), 272–293.
Teixeira, P., Biscaia, R., & Rocha, V. (2022). Competition for funding or funding for competition? Analysing the dissemination of performance-based funding in European higher education and its institutional effects. International Journal of Public Administration, 45(2), 94–106.
Toth, F. (2021). How policy tools evolve in the healthcare sector: Five countries compared. Policy Studies, 42(3), 232–251.
Townley, B. (2002). The role of competing rationalities in institutional change. Academy of Management Journal, 45(1), 163–179.
Turnbull, N. (2022). The politics of policy design. In G. B. Peters & G. Fontaine (Eds.), Research Handbook of Policy Design (pp. 40–53). Edward Elgar.
Vedung, E. (1998). Policy instruments: Typologies and theories. In M. Bemelmans-Videc, R. C. Rist, & E. Vedung (Eds.), Carrots, sticks, and sermons: Policy instruments and their evaluation (pp. 21–58). Transaction Books.
Verger, A., & Skedsmo, G. (2021). Enacting accountabilities in education: Exploring new policy contexts and theoretical elaborations. Educational Assessment, Evaluation and Accountability, 33, 391–401.
Verhoest, K., Roness, P. G., Verschuere, B., Rubecksen, K., & MacCarthaigh, M. (2009). Autonomy and control in state agencies. Palgrave Macmillan.
Verhoest, K., Peters, B. G., Bouckaert, G., & Verschuere, B. (2004). The study of organisational autonomy: A conceptual review. Public Administration and Development, 24, 101–118.
Wang, J., Lee, Y., & Walsh, J. P. (2018). Funding model and creativity in science: Competitive versus block funding and status contingency effects. Research Policy, 47, 1070–1083.
Whitley, R., & Gläser, J. (2007). The changing governance of the sciences. Springer.
Winter, S. (2012). Implementation perspectives: Status and reconsideration. In B. G. Peters & J. Pierre (Eds.), The Sage Handbook of Public Administration (2nd ed., pp. 265–278). Sage.
Yin, R. K. (1994). Case study research: Design and methods. Sage.
Yiu, L. (2020). Educational injustice in a high-stakes testing context: A mixed methods study on rural migrant children’s academic experiences in Shanghai public school. Comparative Education Review, 64(3), 498–524.
Zacharewicz, T., Lepori, B., Reale, E., & Jonkers, K. (2018). Performance-based research funding in EU Member States—a comparative assessment. Science and Public Policy, 46, 105–115.
Funding
Open access funding provided by Alma Mater Studiorum - Università di Bologna within the CRUI-CARE Agreement.
Ethics declarations
Conflict of interest
The authors declare no competing interests, whether financial, non-financial, funding-related, or employment-related.
Ethics approval, consent, data, materials and/or Code availability
Not applicable to this paper.
Additional information
Publisher's Note
Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.
Rights and permissions
Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/.
About this article
Cite this article
Capano, G., Lepori, B. Designing policies that could work: understanding the interaction between policy design spaces and organizational responses in public sector. Policy Sci 57, 53–82 (2024). https://doi.org/10.1007/s11077-024-09521-0