The asymptotic safety scenario for quantum gravity - An appraisal

The paper has three main aims: first, to make the asymptotic safety-based approach to quantum gravity better known to the community of researchers in the history and philosophy of modern physics by outlining its motivation, core tenets, and achievements so far; second, to preliminarily elucidate the finding that, according to the asymptotic safety scenario, space-time has fractal dimension 2 at short length scales; and, third, to provide the basis for a methodological appraisal of the asymptotic safety-based approach to quantum gravity in the light of the Kuhnian criteria of theory choice.


1. Introduction
The correct quantum theory of gravity remains undiscovered and is widely regarded as among the holy grails of fundamental physics. With respect to the philosophical foundations of spacetime, quantum gravitational effects are a significant unknown factor with potentially drastic impacts on space-time ontology. With respect to scientific methodology, the ongoing search for a convincing quantum theory of gravity provides illuminating case study material because underdetermination of theory by data is an acute problem in this area and not merely an artificially constructed theoretical possibility.
Our best theories of the fundamental constituents of matter are quantum field theories, but our best theory of gravity, Einstein's general theory of relativity, is entirely non-quantum ("classical", for the rest of this paper). And while it is possible to regard general relativity as an effective field theory and compute the leading quantum corrections to it (Donoghue (1994)), constructing a full theory of quantum gravity that applies to phenomena associated with the Planck scale M_P ∼ 10^19 GeV and beyond is widely regarded as an enormous challenge. With respect to such high energy scales there seems to be a profound conceptual incompatibility between quantum field theories on the one hand and general relativity on the other, an incompatibility that manifests itself in the so-called non-renormalizability of general relativity and is widely believed to necessitate radically novel conceptual moves. The asymptotic safety scenario, originating from a suggestion due to Steven Weinberg (Weinberg (1979)) and first concretely worked out by Martin Reuter (Reuter (1998)), is based on the idea that, contrary to this widespread belief, non-perturbative renormalization techniques may actually reveal that there exists, after all, a straightforward quantum field theory of gravity that is mathematically well-defined and predictive up to arbitrarily high energies.
The present paper provides a comprehensive introduction to the asymptotic safety scenario to make it better known in the foundations of physics community. Moreover, it explores the asymptotic safety scenario's ramifications for space-time foundations by focusing on its consequence that space-time at very short length scales has fractal-like properties. Finally, the paper provides a methodological appraisal of the asymptotic safety scenario in the light of Kuhn's celebrated five criteria of theory choice, highlighting several of its most interesting empirical repercussions along the way.
2. Outline of the asymptotic safety scenario for quantum gravity

2.1. General relativity is not perturbatively renormalizable

A core part of the standard procedure for turning classical field theories described in terms of an action S into quantum theories is "perturbative renormalization". Starting from the classical action S, a quantum theory can be defined by the functional integral

Z[J] = ∫ Dφ exp( −S[φ] + ∫ d^4x J(x) φ(x) ),   (1)

where J and φ are field variables and the integral over φ ranges over all field configurations with appropriately defined boundary conditions. (The variable φ is typically used for scalar fields, but Eq. (1) can be generalized to apply to spinor, vector, and tensor fields.) It is useful to consider not only Z[J] but also the related functional W[J], defined by

W[J] = ln Z[J].   (2)

The expectation values of all observables can be obtained from W[J] by taking appropriate derivatives with respect to J and evaluating them for suitable field configurations. In that sense, W[J] contains all the information about the quantum theory obtained from quantizing the classical action S.
Perturbative renormalization is the standard procedure for solving the problem that some contributions to W[J] typically turn out to diverge if one tries to compute W[J] using a perturbative expansion in terms of some coupling constant λ. In perturbative renormalization, these divergent contributions are in a first step regularized, i.e. kept finite, e.g. by restricting any integrals in momentum space to momenta with absolute values below some cut-off scale Λ. In a second step, these now-finite contributions are absorbed into the definitions of so-called physical parameters, which differ from the bare parameters in terms of which S is formulated. Only the physical, not the bare, parameters are accessible through experiments. A theory is called "perturbatively renormalizable" if only finitely many parameters must be fixed through empirical input in order to complete the renormalization procedure. The theories combined in the Standard Model of elementary particle physics are perturbatively renormalizable.
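To make the two-step logic concrete, here is a deliberately schematic numerical toy (not a gravity calculation): a fictitious one-loop amplitude with a logarithmic cut-off divergence, in which a single empirical input (the coupling measured at a reference scale μ) absorbs the cut-off dependence up to higher orders in the coupling. All names and numerical values below are illustrative assumptions.

```python
import math

def amplitude_bare(p, lam0, cutoff):
    # Schematic one-loop amplitude with a logarithmic cut-off divergence
    # (toy model; not the actual gravity calculation).
    return lam0 - lam0**2 * math.log(cutoff / p)

def amplitude_renormalized(p, lam_phys, mu):
    # The same amplitude re-expressed through the physical coupling measured
    # at the reference scale mu: the cut-off has been absorbed into lam_phys.
    return lam_phys - lam_phys**2 * math.log(mu / p)

lam0, cutoff, mu, p = 0.01, 1e19, 100.0, 5.0
lam_phys = amplitude_bare(mu, lam0, cutoff)   # one empirical input fixes the parameter

direct = amplitude_bare(p, lam0, cutoff)
renorm = amplitude_renormalized(p, lam_phys, mu)
print(direct, renorm)   # agree up to O(lam^3) corrections
```

The point of the sketch: once lam_phys is fixed by one "measurement", predictions at other scales no longer refer to the cut-off; "perturbatively renormalizable" means finitely many such inputs suffice for the whole theory.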
Notoriously, perturbative renormalization cannot successfully be applied to our best classical theory of gravity: Einstein's general theory of relativity, defined by the Einstein-Hilbert action

S_EH = (1/(16πG)) ∫ d^4x √(−g) (R − 2Λ̄),   (3)

with Λ̄ the cosmological constant. In this case, Newton's constant G is a prima facie natural candidate coupling constant in terms of which W[J] might be expanded. But, as it turns out, it is not possible to absorb the appearing infinities into finitely many parameters derived from experiment, so the resulting theory is not perturbatively renormalizable. The theory obtained by means of this procedure can at most be used as an effective, semi-classical theory, with a limited range of validity confined to energies significantly below the Planck scale M_P ≡ (1/G)^{1/2} ∼ 10^19 GeV.
Perturbative renormalizability can be saved by adding terms that contain higher derivatives of the metric to Eq. (3). Unfortunately, the addition of such terms leads to a quantum theory that is no longer unitary (Stelle (1977)). Unlike in ordinary quantum mechanics, total probability is not conserved in the resulting theory, which is therefore not regarded as a consistent quantum theory.

2.2. Non-perturbative renormalization
The lesson that is most widely drawn from the failure of perturbative renormalization as applied to general relativity is that any quantum theory of gravity supposed to be valid at the Planck scale or even beyond will be based on (the quantization of) new degrees of freedom and/or will abandon the framework of quantum field theory altogether.
The most famous research programmes in quantum gravity are based on this diagnosis, notably, string theory and loop quantum gravity. A more conservative response is to revisit the problem of turning general relativity into a quantum theory and consider whether this might be accomplished through some other means than perturbative renormalization. The asymptotic safety-based approach to quantum gravity rests on this idea. Reuter's pioneering work on this approach, starting with his Reuter (1998), relies on the so-called functional renormalization group scheme for the effective average action, which has so far remained the tool of choice for this approach.
The functional renormalization group scheme is an alternative formulation of the relation Eq. (1) between some classical action S and quantities such as Z[J] and W[J] which define a quantum theory in terms of S. It is most conveniently formulated in terms of the so-called effective action Γ[φ], defined as the Legendre transform of W[J] through the equation

Γ[φ] = ∫ d^4x J(x) φ(x) − W[J],   (4)

where J carries an implicit dependence on φ in that φ is defined as the solution to the equation

φ(x) = δW[J]/δJ(x).   (5)

From the effective action Γ[φ] the same physical information can be derived as from W[J], its Legendre transform. (Technically, Γ[φ] is the generating functional of one-particle irreducible vertex functions, from which all expectation values of physical quantities can be derived.) An advantage of using Γ[φ] rather than W[J] is that it relates to the classical action S[φ] in a particularly simple way: there is an in-principle calculable trajectory of functionals Γ_k[φ] which interpolates between

Γ_k[φ]|_{k→Λ} ≈ S[φ]   (6)

for some suitably chosen ultraviolet cut-off Λ and

Γ_k[φ]|_{k→0} = Γ[φ].   (7)

This trajectory of functionals Γ_k[φ] is governed by the Wetterich exact renormalization group equation (Wetterich (1993), Morris (1994)):

∂_k Γ_k[φ] = (1/2) Tr[ (Γ_k^{(2)}[φ] + ℛ_k)^{−1} ∂_k ℛ_k ].   (8)

In this equation, "Tr" denotes the trace operation performed over an arbitrarily chosen complete set of quantum numbers (with an additional minus sign for fermionic fields), Γ_k^{(2)}[φ] is the second functional derivative of Γ_k[φ] with respect to the field(s) φ, and ℛ_k is a matrix-valued regulator function. Unlike the cut-off Λ considered in perturbative renormalization, which is an ultraviolet cut-off, ℛ_k functions as an infrared cut-off, i.e. it suppresses contributions associated with momenta p whose absolute value |p| is smaller than the renormalization scale k.
The regulator function ℛ_k can be chosen freely, provided that it functions as an infrared cut-off. For example, it can be chosen such that it endows contributions associated with momenta |p| < k with a large artificial mass term, such that they are suppressed when performing the trace, leaving modes with |p| ≫ k entirely unaffected. As a consequence, by following the trajectory of Γ_k from k = Λ down to k = 0, i.e. from the classical action S to the quantum effective action Γ, one successively takes into account contributions associated with lower and lower momenta until one obtains the full quantum theory as encoded in the effective action Γ.¹ Following the renormalization flow of Γ_k from S ≈ Γ_k|_{k→Λ} to Γ = Γ_k|_{k=0} can be seen as an alternative strategy for quantizing the classical theory defined through S. Scale-dependent coupling constants can be obtained from Γ_k as appropriate derivatives with respect to the field(s) φ. Typically, however, Eq. (8) cannot be solved exactly. Luckily, various approximation schemes have been developed, based on so-called truncations for the scale-dependent effective action Γ_k, in which Eq. (8) turns out to be (approximately) solvable for physical theories of interest. Computations based on such truncations have served to explore topics as different as the electroweak phase transition, the phase diagram of QCD, and the coexistence of magnetic and superconducting order in models of high-temperature superconductivity (see Berges, Tetradis, and Wetterich (2002), Metzner, Salmhofer, Honerkamp, Meden, and Schönhammer (2012) for useful overviews). Computations based on Eq. (8) are able to go beyond effects detectable in perturbation theory and so re-open the question of the renormalizability of general relativity.
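As a minimal illustration of how a regulator implements the infrared cut-off, the sketch below uses Litim's "optimized" regulator, one standard choice in the functional renormalization group literature (any function with the same qualitative behaviour would do). The scalar "propagator" is a stand-in for the full matrix structure appearing in Eq. (8).

```python
def litim_regulator(p2, k2):
    # Litim's optimized regulator: R_k(p) = (k^2 - p^2) * theta(k^2 - p^2).
    # Non-zero only for infrared modes p^2 < k^2.
    return (k2 - p2) if p2 < k2 else 0.0

def regulated_propagator(p2, k2):
    # 1/(p^2 + R_k(p)): infrared modes (p^2 < k^2) acquire an effective
    # mass of order k^2 and are thereby suppressed in the trace, while
    # ultraviolet modes (p^2 >= k^2) propagate as usual.
    return 1.0 / (p2 + litim_regulator(p2, k2))

k2 = 1.0
print(regulated_propagator(0.01, k2))  # IR mode: capped at 1/k^2
print(regulated_propagator(4.0, k2))   # UV mode: ordinary 1/p^2
```

Note the design choice: for |p| < k the regulated propagator is a constant 1/k², so long-wavelength fluctuations contribute only boundedly until the flow parameter k is lowered past their momentum.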

2.3. The idea of asymptotic safety
One might initially think that subjecting general relativity to the quantization procedure based on the effective average action just outlined means identifying the Einstein-Hilbert action S_EH (Eq. (3)) of general relativity with the effective average action Γ_k evaluated at very large Λ, perhaps even for Λ → ∞, following the flow of k from Λ to 0, and obtaining Γ = Γ_k|_{k=0} as the full quantum effective action. However, the regime in which general relativity is empirically confirmed is that of large length scales corresponding to very low energies. Thus, what is really needed for obtaining a quantum version of general relativity is a trajectory of effective average actions Γ_k which are well-defined at all renormalization scales k and which reproduce the empirical content of general relativity in some regime of coupling constants for very low, not very high, values of k.
In general, the effective average action Γ_k can be expanded in terms of some basis {P_a[·]} of the space of functionals compatible with the symmetries imposed on the theory at issue. Using scale-dependent generalized couplings ḡ_a(k) as expansion coefficients, it can thus be written as

Γ_k[φ] = Σ_a ḡ_a(k) P_a[φ].   (9)

Flow equations for the couplings ḡ_a can be obtained from the Wetterich equation Eq. (8) by taking appropriate field derivatives on both sides of it. The couplings ḡ_a(k→Λ) correspond to the "bare" couplings in conventional quantization treatments and the couplings ḡ_a(k = 0) to the "physical" ones.
The couplings appearing in Eq. (9) are in general dimensionful, with canonical mass dimensions d_a. Writing ḡ_a(k) for these dimensionful couplings, dimensionless couplings are defined as

g_a(k) ≡ k^{−d_a} ḡ_a(k).   (10)

For the theory to be well-defined at all scales k between 0 and ∞, these dimensionless couplings g_a(k) must remain finite at all k. In particular, they must be finite in the ultraviolet (UV) limit k → ∞. One possibility for this to happen is that the renormalization flow of Γ_k converges for k → ∞ to some fixed point Γ_k|_{k→∞} at which all the dimensionless couplings g_a(k) approach finite values.
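The rescaling in Eq. (10) is simple enough to state in code. The helper below and the numerical inputs (a Planck mass of roughly 1.2 × 10^19 GeV and G = 1/M_P² in natural units) are illustrative; the mass dimensions used are the standard ones in four dimensions, where Newton's constant has d = −2 and the cosmological constant d = +2.

```python
def dimensionless(coupling_dimful, k, mass_dim):
    # Eq. (10): g_a(k) = k**(-d_a) * gbar_a(k), with d_a the canonical
    # mass dimension of the dimensionful coupling gbar_a.
    return k ** (-mass_dim) * coupling_dimful

M_P = 1.2e19                 # Planck mass in GeV (illustrative round value)
G_dimful = 1.0 / M_P**2      # Newton's constant in GeV^-2, mass dimension -2

g_planck = dimensionless(G_dimful, M_P, -2)    # = k^2 * G at k = M_P
g_low = dimensionless(G_dimful, 1.0e3, -2)     # same coupling probed at 1 TeV
print(g_planck, g_low)
```

The output makes the familiar point quantitative: the dimensionless Newton coupling is of order one at the Planck scale but utterly negligible at accelerator energies, which is why the UV limit k → ∞ is where the finiteness of the g_a(k) becomes a non-trivial demand.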
The idea of perturbative renormalization, as sketched in the previous section, is based on the assumption that a UV fixed point not only exists but actually coincides with a point in the space of dimensionless couplings where all g_a that describe interactions are zero. A theory for which g_a(k)|_{k→∞} = 0 for all g_a that describe interactions is called asymptotically free, and the corresponding point of parameter space is in this case referred to as a Gaussian fixed point. There is one empirically well-established quantum field theory that is known to be asymptotically free, namely, quantum chromodynamics (QCD), the theory of quarks and gluons which describes the strong nuclear force. QCD's asymptotic freedom means that the particles it describes are effectively non-interacting ("free") at very high energies in that g_a(k)|_{k→∞} = 0 for all coupling constants.
Being asymptotically free is not the only way in which a quantum field theory can be well-defined at arbitrarily high energy scales. There is no reason why the UV fixed point would have to correspond to the point of no interactions, i.e. to g_a(k→∞) = 0 for all a. If all the dimensionless couplings g_a(k) of some theory converge as k → ∞, with at least some approaching non-zero finite values, the corresponding fixed point is referred to as a non-Gaussian UV fixed point. The asymptotic safety scenario for quantum gravity is based on the idea that general relativity is a low-energy limit of some quantum theory that exhibits a non-Gaussian UV fixed point.
If some theory has a (Gaussian or non-Gaussian) fixed point, one may consider the trajectories in the (infinite-dimensional) space of the couplings g_a(k) that emanate from it. These trajectories correspond to the distinct possibilities for the theory to be realized. They approximately agree on the theory's high-energy behaviour for energies corresponding to scales k where the couplings g_a(k) are close to their UV limits, but they differ on the low-energy behaviour. For the overall theory to have predictive force, it is important that it not be compatible with arbitrary low-energy behaviour, which means that the UV-critical surface, consisting of the physically acceptable renormalization trajectories, should be finite-dimensional. If the UV-critical surface is d-dimensional for some finite d, we have to determine d couplings g_a(k) experimentally at some given energy scale k in order to pick out a specific UV-finite renormalization trajectory as the physically realized one. At least in principle, all the other couplings g_a(k) could then be derived from the theory thus obtained and be treated as its genuine empirical predictions.

2.4. An asymptotically safe theory of quantum gravity?
The possibility that there might be an asymptotically safe quantum field theory which contains general relativity as a special case in some specific low-energy regime was pointed out by Steven Weinberg already in 1979 (Weinberg (1979)). In subsequent years, relatively little work was done to follow up on Weinberg's suggestion (but see Smolin (1982), Kawai and Ninomiya (1990) for notable exceptions). Even the advent of the exact renormalization group formalism based on the Wetterich flow equation Eq. (8) did not immediately change this, because its application to quantum gravity is not straightforward. Difficulties result from the fact that the chosen ansatz for the effective average action Γ_k may not depend on any specific prior assumption concerning the space-time metric g_μν, i.e. it should be background-independent. This raises a challenge because performing the trace in Eq. (8) requires integration over all spatio-temporal degrees of freedom and thus seems to presuppose operating with a fixed space-time metric. A further difficulty is that, in virtue of diffeomorphism invariance, g_μν has unphysical gauge degrees of freedom, which means that established methods of evaluating Eq. (8) for field theories in which all degrees of freedom are physical cannot be applied to it. A solution to these problems was developed by Reuter and Wetterich by adapting the so-called background field formalism to the exact renormalization group framework (Reuter and Wetterich (1994)). In this formalism, a second, background metric ḡ_μν is introduced with respect to which all space-time integrals are performed, while it is kept arbitrary throughout the renormalization flow. The application of this formalism to quantum gravity was first undertaken by Martin Reuter in a paper published online in 1996 (Reuter (1998)). In the years following the release of this work, numerical results according to which there indeed seems to exist a non-trivial UV fixed point for quantum gravity began to accumulate.

¹ The idea that underlies this approach goes back to Wilson's approach to renormalization (Wilson and Kogut (1974), Polchinski (1984)). Many similarities notwithstanding, the two approaches differ in that the scale-dependent effective action S^W_Λ used in Wilson's approach corresponds to one and the same physical model for all Λ (the correlation functions for observables are the same for all Λ). The effective average action Γ_k, in contrast, corresponds to different correlation functions for the different energy scales k. Moreover, for contributions with large momenta |p| ≫ k, the effective average action Γ_k at any given k is already almost identical with the full effective action Γ, such that the properties of Γ at large momenta can be read off from Γ_k. This makes the approach based on the effective average action particularly suitable for studying the high-energy properties of quantum gravity. See the review article Berges et al. (2002) for more detailed comparisons between the Wilsonian approach and the one based on the effective average action.
The simplest truncation (i.e. approximation for the effective average action Γ_k) in which numerical hints for such a fixed point were established is the so-called Einstein-Hilbert truncation, modelled on the Einstein-Hilbert classical action Eq. (3) for general relativity. In space-time with Euclidean signature (+,+,+,+), which is typically used in the asymptotic safety approach, and bracketing contributions from gauge-fixing terms as well as those from the unphysical "ghost" fields which are needed to deal with the problem of unphysical gauge degrees of freedom, it is given by

Γ_k[g_μν] = (1/(16πG_k)) ∫ d^4x √g (−R + 2Λ_k).   (11)

Based on this ansatz for the effective average action, a flow diagram for the flowing dimensionless Newton's constant g_k and dimensionless cosmological constant λ_k was derived which exhibits a UV fixed point at positive values of these parameters (Fig. 1).
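The qualitative structure of such a flow can be imitated with a one-coupling caricature. The beta function below is not the actual Einstein-Hilbert result (whose form and fixed-point values are scheme-dependent), but, like it, it has a Gaussian fixed point at g = 0 and a non-Gaussian UV fixed point that trajectories approach as k → ∞; the constant omega is purely illustrative.

```python
# Schematic one-coupling caricature of the Einstein-Hilbert flow:
# dg/dt = 2g - omega * g^2, with t = ln k. The first term is the canonical
# dimension of Newton's coupling; the second mimics quantum corrections.
omega = 2.83   # illustrative constant; its value is regulator/scheme-dependent

def flow(g0, t_max, dt=1e-3):
    # Integrate the flow towards the ultraviolet with a simple Euler scheme.
    g, t = g0, 0.0
    while t < t_max:
        g += dt * (2.0 * g - omega * g * g)
        t += dt
    return g

g_star = 2.0 / omega   # non-Gaussian fixed point of the toy beta function
print(flow(0.01, 20.0), g_star)   # trajectory is driven into the fixed point
```

Starting anywhere on the attractive side, the trajectory is driven into g_star rather than diverging, which is the one-dimensional shadow of the safe UV behaviour shown in the full (g_k, λ_k) flow diagram.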
The details of the flow diagrams change once the truncation is extended and more flowing couplings are taken into account, but so far the picture, including the existence of a non-Gaussian fixed point, holds up in its essential features (see Reuter and Saueressig (2012), Sect. 5 for a survey). Almost all of the couplings taken into account in more extensive truncations seem not to correspond to UV-attractive directions, so the results obtained so far are consistent with the hope that the UV-critical surface may indeed be finite-dimensional (see Sect. 4.2 for some more details).
Thus, according to the best currently available numerical evidence, there are strong hints that the asymptotic safety scenario is a coherent theoretical possibility. In what follows I will outline what it would mean for space-time at very short length scales and high energies if the asymptotic safety scenario were indeed realized.

3. Emergent two-dimensional space-time at high energies?
The most intriguing empirical consequence of the asymptotic safety scenario is that it attributes fractal-like properties to space-time. Notably, for processes associated with length scales shorter than the Planck length and/or energies above the Planck scale, space-time according to the asymptotic safety scenario behaves in some respects as if it were two-dimensional. The present section discusses the meaning and interpretation of this finding, notably with respect to the question of whether aspects of space-time are "emergent" in asymptotic safety.

3.1. Fractal-like space-time structure
In the exact renormalization group framework outlined in Section 2.2, one can define an effective metric with respect to the energy scale k as the solution ⟨g_μν⟩_k to the equation

δΓ_k[g]/δg_μν |_{g = ⟨g⟩_k} = 0.   (12)

Since the effective average action Γ_k at any given k is obtained by integrating out contributions from four-momenta |p| ≥ k, it takes into account only those fluctuations with characteristic length scale smaller than l = 1/k. If the properties of space-time are probed, the momenta of the scattering particles used to do so set lower bounds on the length scales that can thereby be resolved. Effectively, these momenta thus act as infrared cut-offs of the renormalization flow. This means that one can, for heuristic purposes, identify these external scales with the renormalization scale k, in the sense that an observer who probes space-time by means of particles with a certain momentum k_0 will find correlation functions derived from the effective average action Γ_k at k = k_0.
Since the renormalization flow at k ≠ 0 depends on the regulator ℛ_k, it is clear that this identification of k with an external momentum can only be valid at an approximate level. Nevertheless, at the level of heuristics one can conceive of the effective metric ⟨g_μν⟩_k as defined by Eq. (12) as describing space-time structure as accessible to an observer whose observations are restricted to momenta smaller than k, i.e. to length scales larger than l = 1/k. For any specific scale k, the effective metric ⟨g_μν⟩_k is a smooth classical metric. However, since Eq. (12) holds separately at all scales k, the entire collection of metrics ⟨g_μν⟩_k can exhibit highly non-classical, notably fractal-like, features. This does indeed appear to be the case if the asymptotic safety scenario is correct. For each coupling, the dimensionful and dimensionless versions (i.e. ḡ_a and g_a) are related by specific powers of the renormalization scale k, see Eq. (10). For example, for the cosmological constant, the dimensionful quantity Λ_k and the dimensionless quantity λ_k are related by

Λ_k = k^2 λ_k.   (13)

At the non-Gaussian fixed point, in virtue of its being a fixed point, the dimensionless couplings do not change with the renormalization scale k. It follows that the k-dependence of the dimensionful couplings is completely determined by their dimensionality, i.e. by the power with which k appears in the equations that link the dimensionless and the dimensionful couplings, e.g. the power 2 in Eq. (13):

Λ_k ∝ k^2,  G_k ∝ k^{−2}  (k → ∞).   (14)

Using this approximation-independent result one can derive that, for scales k where the renormalization trajectories are very close to the fixed point, the typical radius of curvature r_c(l) associated with some length scale l is proportional to that length scale itself (see Eq. (6.8) in Reuter and Saueressig (2012)):

r_c(l) ∝ l.   (15)

This equation can be taken to indicate that space-time exhibits a conformal structure at high scales k, i.e. one with no characteristic length (or energy) scale.
Put differently, it suggests that if we increase the resolution with which we study the properties of space-time at very high energies beyond the Planck scale, the characteristic curvature of what we study increases simultaneously. At lower scales, in contrast, notably at values of k below the Planck scale, the dimensionful Newton's constant and the dimensionful cosmological constant no longer run. In these regimes, the effective metric ⟨g_μν⟩_k is independent of k, and the same holds for the characteristic radius of curvature r_c.
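The chain of reasoning from the fixed-point scaling to the proportionality of r_c and l can be sketched as follows (a heuristic reconstruction; Reuter and Saueressig (2012), Sect. 6, give the careful version):

```latex
% At the non-Gaussian fixed point the dimensionless couplings freeze,
\lambda_k \to \lambda_* , \qquad g_k \to g_*
\;\Longrightarrow\;
\Lambda_k = \lambda_*\, k^{2}, \qquad G_k = g_*\, k^{-2}.
% A self-consistent solution of the k-dependent field equations carries
% curvature R \sim \Lambda_k, hence a curvature radius
r_c \;\sim\; \Lambda_k^{-1/2} \;=\; \lambda_*^{-1/2}\, k^{-1} \;\propto\; l ,
% since the probed length scale is l = 1/k: no fixed reference length
% survives at the fixed point, which is the conformal behaviour at issue.
```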
The self-similarity of the effective metric ⟨g_μν⟩_k at high energy scales manifests itself in the dynamical properties of gravitons. Notably, at very high energies the graviton propagator (a correlation function which, for any two space-time positions x and y, specifies the joint probability of the graviton appearing at x and y) takes a form that is otherwise characteristic of particles confined to two space-time dimensions (Reuter and Saueressig (2012), Eq. (6.12)).
Another way to illuminate the dynamical aspects of space-time structure is in terms of the diffusion processes that it supports. A useful quantity for characterizing these processes is the so-called spectral dimension. It specifies how the return probability P_g of a particle performing a random walk depends on the duration T of the walk:

d_s = −2 (d ln P_g(T) / d ln T).   (16)

The sign, the pre-factor and the logarithms in this definition are chosen such that, on smooth classical manifolds, the spectral dimension coincides with the topological dimension d.
Lauscher and Reuter suggest a way of evaluating Eq. (16) for the asymptotic safety scenario in four (topological) space-time dimensions according to which the spectral dimension is d_s = 2 for very high energy scales k significantly above the Planck scale (Lauscher and Reuter (2005)). (Lauscher and Reuter derive this result independently of any truncation scheme.) For energy scales k significantly below the Planck scale, in contrast, the spectral dimension has its familiar classical value 4, which coincides with the topological dimension.
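The definition in Eq. (16) is easy to check in the smooth classical case, where d_s must reproduce the topological dimension. The sketch below computes the exact return probability of a random walk built from d independent one-dimensional walks (a discrete stand-in for diffusion on flat space) and extracts d_s by a finite difference; the non-trivial QEG claim is that the analogous quantity computed from the k-dependent effective metric drops to 2 at sub-Planckian scales.

```python
import math

def return_prob_1d(T):
    # 1D simple random walk: probability of being back at the origin
    # after T steps (T even): C(T, T/2) / 2^T  ~  sqrt(2 / (pi T)).
    return math.comb(T, T // 2) / 2**T

def spectral_dimension(d, T=4000):
    # Walk composed of d independent 1D walks, so P(T) = p1(T)^d ~ T^(-d/2).
    # Estimate d_s = -2 * dlnP/dlnT (Eq. (16)) by a finite difference.
    P = lambda t: return_prob_1d(t) ** d
    t1, t2 = T, 2 * T
    return -2.0 * (math.log(P(t2)) - math.log(P(t1))) / (math.log(t2) - math.log(t1))

print(round(spectral_dimension(2), 2))  # smooth 2D case
print(round(spectral_dimension(4), 2))  # smooth 4D case
```

On this smooth discrete geometry the estimate lands on the topological dimension (up to finite-T corrections), confirming that the normalization in Eq. (16) is the right one for the classical regime.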
An important point to stress about the fractal-like features of space-time at very high energy scales k according to the asymptotic safety scenario is that they are by definition tied to particle motion and are thus inherently dynamical. As Reuter and Saueressig point out, referring to the theory of quantum gravity to which the asymptotic safety scenario gives rise as "Quantum Einstein Gravity" (QEG): The smooth manifold underlying QEG has per se no fractal properties whatsoever.
We emphasize that the effective QEG space-times should not be visualized as a kind of sponge. Their fractal-like properties have no simple geometric interpretation; they are not due to a "removing" of space-time points. Rather they are of an entirely dynamical nature, reflecting certain properties of the quantum states the system "space-time metric" can be in. (Reuter and Saueressig (2012), p. 41)

The purely dynamical character of the fractal-like properties of space-time according to the asymptotic safety scenario is reflected in the fact that the Hausdorff dimension d_H = lim_{r→0} ln V(r)/ln r, which is not dynamically defined, does not vary with k and coincides with the topological dimension (4, in an ordinary four-dimensional space-time setting) at all scales.

3.2. Fractal-like space-time structure and emergence
The fact that, according to the asymptotic safety scenario, space-time has a fractal-like structure and, according to Lauscher and Reuter, a reduced spectral dimension for very high energy scales k has been characterized by Rickles as "ripe for philosophical pickings" (Rickles (2008), p. 349). A natural question in response to this finding is whether fractal-like character and reduced spectral dimensionality at high energies should be seen as "emergent" features. Assessing this question requires adopting a workable characterization of emergence, of which there are several in the literature, not all of them mutually compatible. Jeremy Butterfield has recently proposed a richly illustrated characterization of physical behaviour as emergent if it is "novel and robust relative to some comparison class" (Butterfield (2011a), p. 920). Among the examples that Butterfield uses to highlight which interpretation of this characterization he regards as appropriate is the ascription of fractal dimensions to the geometry of physical objects such as coast lines. The upshot of Butterfield's discussion is that fractal dimensions are indeed both "novel" and "robust" in the relevant sense (Butterfield (2011b), Sect. 5) and so qualify as emergent. The factor that most complicates his conclusion is that the ascription of fractal dimensions to the geometry of actual physical objects typically rests on an idealization. For example, when one chooses a sufficiently fine resolution at the molecular or even atomic level, the geometry of a coast line will presumably cease to be fractal. (See Butterfield (2011b), Sect. 5.3 for how Butterfield handles this complication.)
This complication seems to be somewhat attenuated when we try to assess whether fractal-like structure in the asymptotic safety scenario qualifies as "emergent" in Butterfield's sense. While identifying the renormalization scale k with a physical scale is essentially a heuristic move, the calculation of the spectral dimension is carried out in the limit k → 0, where the results become regulator-independent (if we abstract from truncation-dependent inaccuracies). So at least in the latter respect the ascription of fractal-like properties to space-time at very high energy scales according to the asymptotic safety scenario does not seem to involve an idealization, i.e. reduced dimensionality is literally realized if the scenario holds.
Are fractal-like structure and reduced dimensionality at high energy scales k "novel" with respect to some suitable comparison class? Plausibly yes, if we accept the physical behaviour of objects in space-times described by general relativity as our comparison class. Multi-fractality and reduced spectral dimensionality are features of space-time that are absent from general relativistic space-times and, it seems, in no way suggested by them.
Similarly, fractal-like character and reduced spectral dimensionality are "robust" inasmuch as they are approximation-independent consequences of the asymptotic safety scenario. They are also "robust" in that both the graviton propagator and the spectral dimension independently suggest that dynamical processes associated with very high energy scales k are characterized by an effective dimensionality of 2. To conclude, at least by the standards of Butterfield's characterization of emergence, fractal-like character and reduced spectral dimensionality at high energy scales in the asymptotic safety scenario qualify as "emergent." Going beyond Butterfield's characterization of emergence, one may argue that, for physical behaviour to genuinely qualify as "emergent", it must not only be novel and robust with respect to behaviour in some reference class but must also occur at a level that is in some relevant sense "less fundamental" than the level with which the reference-class behaviour is associated. It seems clear that the short space-time scale, high-energy regime with fractal-like structure and reduced spectral dimensionality is not relevantly "less fundamental" than the long space-time scale, low-energy regime where general relativity is empirically adequate. If anything, the opposite seems to be the case, since the procedure of lowering k in the renormalization flow can be seen as a form of coarse-graining, in which the fractal-like sub-Planckian regime precedes the regime where general relativity is empirically adequate. Inasmuch as general relativity appears robust and novel with respect to the fractal-like regime at high scales, it indeed qualifies as emergent in the asymptotic safety scenario.
Space-time itself, in contrast, does not appear as emergent in the asymptotic safety scenario. The scenario is set up in terms of the space-time metric g_μν from the start and is thus formulated using spatio-temporal categories from the outset. In this respect, the asymptotic safety scenario differs notably from those candidate theories of quantum gravity in which space-time itself must notoriously be recovered as emergent. While those theories are confronted with the potentially serious challenge of accounting for how they can possibly be about a world that lends itself to a description in terms of spatio-temporal categories at macroscopic length scales (Lam and Esfeld (2013), Oriti (2014)), the asymptotic safety programme does not face any such difficulty. Conversely, if one regards it as a desideratum for a theory of quantum gravity that it be formulated in more primitive, non-spatio-temporal, terms and recover space-time as emergent, one may find the asymptotic safety scenario disappointing.

Methodological appraisal
The present section undertakes a methodological appraisal of the asymptotic safety approach to quantum gravity in the light of Kuhn's famous five criteria of theory choice: empirical accuracy, consistency, simplicity, fruitfulness, and breadth of scope (Kuhn (1977)). The aim of this appraisal is the modest one of assembling features and consequences of the asymptotic safety scenario which the reader may find relevant to assessing whether it deserves to be regarded as a serious contender in the quest for the correct quantum theory of gravity. The appeal to Kuhn's criteria is purely pragmatic in that they provide a convenient way of structuring the appraisal. Their use here should not be regarded as endorsing them as in any way sacrosanct or privileged.

Empirical accuracy
A minimal requirement for any theory of quantum gravity to be empirically accurate is that it be able to reproduce general relativity's predictions in some suitable low-energy regime. For the asymptotic safety scenario to fulfil this desideratum, the specific renormalization trajectory which is realized in nature must exhibit a "classical" stage, where the dependence of the dimensionful constants on the scale k is only very weak.
Fortunately for the asymptotic safety scenario, the results obtained using the Einstein-Hilbert truncation indicate that there is a candidate renormalization trajectory (to the right of and extremely close to the green line in Fig. 1) which fulfils this criterion and exhibits the measured values of the cosmological constant and Newton's constant in the suitable regime. This trajectory passes very close to the origin of λ-g space at values of k of the order of meV and then turns to the right as k becomes even smaller, exhibiting only a very weak scale dependence in this regime, as required for it to be "classical." Empirically discriminating between theories of quantum gravity which do reproduce the predictions of general relativity in some suitable regime is notoriously difficult because energy scales close to the Planck scale cannot be probed using laboratory experiments. While processes associated with extremely high energies cannot be studied directly in the laboratory, their remote effects can possibly be studied in virtue of the traces they have left in the very early universe, which was characterized by very high energies. As remarked in Section 3.1, close to the UV fixed point the flow of the dimensionful Newton's constant and the dimensionful cosmological constant is determined by the fact that the dimensionless constants do not flow.
According to Eq. (14), the dimensionful cosmological constant diverges as the energy scale k grows indefinitely. Since the cosmological constant describes the repulsive side of gravity, this has the physical consequence that, according to the asymptotic safety scenario, the very early universe, when characteristic energies were very high, must have undergone a rapid and constantly accelerating expansion. As pointed out by Bonanno and Reuter (Bonanno and Reuter (2007)), this finding indicates that the asymptotic safety scenario includes a mechanism for cosmic inflation (Guth (1981)). This is interesting because inflation is widely believed on independent grounds to have occurred, notably because assuming that inflation occurred removes various otherwise puzzling coincidence problems.2 One attractive feature of this mechanism for inflation is that it naturally accounts for why inflation came to an end at a certain point, as it obviously did: namely, as soon as, with decreasing cosmic energy density, Λ_k dropped below a certain critical scale, where its size was no longer sufficient to sustain an accelerated expansion.
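The mechanism can be sketched in a minimal way. Assuming, with Bonanno and Reuter, the fixed-point scaling Λ_k = λ* k² together with a cutoff identification k ∝ 1/t and the vacuum Friedmann equation H² = Λ/3, one obtains power-law expansion a(t) ∝ t^α, which is accelerating precisely when α > 1. The numerical values below are purely illustrative, not the computed fixed-point values.

```python
import math

# Hedged sketch: in the fixed-point regime Lambda_k = lam_star * k**2,
# and with the cutoff identification k ~ 1/t the Friedmann equation
# H**2 = Lambda/3 gives H(t) = sqrt(lam_star/3)/t, hence a(t) ~ t**alpha.
lam_star = 3.6                        # illustrative value only
alpha = math.sqrt(lam_star / 3.0)     # exponent of a(t) ~ t**alpha

def a(t):
    return t ** alpha

# Numerical check that the expansion accelerates (a'' > 0) for alpha > 1.
h, t0 = 1e-4, 1.0
a_dd = (a(t0 + h) - 2.0 * a(t0) + a(t0 - h)) / h**2
print(alpha > 1.0, a_dd > 0.0)
```

Once Λ_k leaves the fixed-point regime and freezes out, α effectively drops below 1 and the accelerated phase ends, which is the sense in which the scenario accounts for the graceful exit from inflation.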
There is one prediction derived on the basis of the asymptotic safety scenario that was indeed vindicated by an empirical discovery, namely, the 2010 prediction by Shaposhnikov and Wetterich that the value of the Higgs boson mass would be m_h = 126 GeV, with only a few GeV uncertainty (Shaposhnikov and Wetterich (2010)). Within the boundaries of prediction and measurement accuracy, this prediction was confirmed by the actual discovery, in 2012, of a Higgs boson with a mass of around 125 GeV. The prediction by Shaposhnikov and Wetterich does not directly follow from the asymptotic safety scenario, however. It is based on the additional assumption that there are no particles (other than gravitons) beyond those described by the Standard Model. Furthermore, it depends on a particular scenario concerning the UV behaviour of the Higgs self-coupling (namely, that it is "UV-irrelevant") that is widely regarded as plausible, though not rigorously established.

2 Earlier work by Bonanno and Reuter suggests an alternative solution to those problems that does not rely on inflation (Bonanno and Reuter (2002)). For philosophers' considerations on whether the supposed coincidence problems are genuine and whether inflation really solves them, see Earman and Mosterín (1999), McCoy (2015).

Consistency
The existence of a non-Gaussian UV fixed point is required for the mathematical consistency of the asymptotic safety scenario. Numerical results obtained in ever more extended truncations so far suggest the existence of such a fixed point, see (Reuter and Saueressig (2012), Sect. 5) for a review of studies that go beyond the Einstein-Hilbert truncation. There had been scepticism about whether the non-Gaussian UV fixed point found in the Einstein-Hilbert truncation would persist in truncations that include a perturbative counterterm, but a positive answer to this question has recently been established for the so-called Goroff-Sagnotti counterterm (Gies, Knorr, Lippoldt, and Saueressig (2016)). Whether the problem of non-unitarity that otherwise plagues higher-derivative quantum gravity can be avoided in the asymptotic safety scenario is not yet known, but see (Benedetti, Machado, and Saueressig (2009), Becker, Ripken, and Saueressig (in press)) for promising results. The hope for a truncation-independent, mathematically rigorous proof of the existence of a non-Gaussian UV fixed point, however, does not seem realistic for the foreseeable future.
As mentioned above, the numerical results obtained so far are consistent with the possibility that the UV-critical surface may be finite-dimensional, which is required for the theory to be predictive. Besides the directions associated with Newton's constant and the cosmological constant there appears to be (at least) one further UV-relevant direction, possibly associated with the coupling constant of an R² term in the effective action (Lauscher and Reuter (2002), Benedetti et al. (2009)). Truncations where the effective average action is of the form f(R) have been shown to have a finite-dimensional UV-critical surface (Benedetti (2013)). Overall, according to the current numerical state of the art, the dimensionality of the UV-critical surface may be as low as 3. Shomer (2007), based on earlier work by Aharony and Banks (1999), voices a worry concerning the consistency of the asymptotic safety scenario with the widely acknowledged Bekenstein-Hawking formula for black hole entropy. Falls and Litim (2014) object to this worry, arguing that the Bekenstein-Hawking result relies on a semi-classical approximation for black hole entropy that one should not expect to be valid in the limit of high k in the first place (see Doboszweski and Linnemann (2017) for further considerations). Another worry concerning the consistency (or at least coherence) of the asymptotic safety scenario found in the literature is that there are problems related to the very definition of a scale-dependent Newton's constant (Anber and Donoghue (2012)). Finally, since almost all calculations done so far use the Euclidean space-time signature (+, +, +, +), there remains the challenge of ascertaining that the results actually carry over to the Lorentzian setting with signature (+, -, -, -) (or, equivalently, (-, +, +, +)). So far, this challenge has been met only for one specific truncation (Manrique, Rechenberger, and Saueressig (2011)).
Finally, one may worry about the consistency (or conceptual coherence) of any quantum theory that aims at being fundamental without at the same time suggesting a clean solution to the quantum measurement problem. The asymptotic safety approach to quantum gravity does not come with any such solution and, so, is committed to the assumption that the quantum measurement problem can be solved independently.

Simplicity
The asymptotic safety approach seems to be "simple" in at least two very different ways. First, it is ontologically parsimonious: unlike other suggested theories of quantum gravity, it does not posit any hypothetical physical objects, such as strings or branes, for which there is no independent empirical evidence. Second, it is methodologically conservative: it does not rely on any concepts or techniques that go beyond those used in our best currently well-established physical theories. Like the theories of particle physics combined in the Standard Model, the asymptotic safety scenario assumes that gravity can be described by a field theory; and the exact renormalization group that it uses to study the non-perturbative features of the scale-dependent coupling constants has been reliably used for calculations beyond particle physics and cosmology (see Sect. 2.2 for references). One sense in which the asymptotic safety approach is not methodologically conservative is that the approximations it currently uses (truncations of the effective average action) are not mathematically under control: there are no analytically derived upper bounds on the errors made by the specific truncation procedures.
As noted in Section 3.2, the asymptotic safety scenario is formulated in spatio-temporal terms from the outset, i.e. space-time is not recovered as in some sense "emergent" from it. If one regards a theory of quantum gravity as sufficiently conceptually basic (and in that sense "simple") only if it accounts for space-time in terms of more fundamental structure, one will regard this as a shortcoming of the asymptotic safety scenario as far as simplicity is concerned.

Fruitfulness
According to Kuhn, for a theory to be fruitful it must either have led to the discovery of hitherto unknown empirical phenomena or at least to the discovery of hitherto unknown relations between known phenomena. So far, the asymptotic safety approach has not led to the discovery of hitherto unknown empirical phenomena. It does, however, suggest previously unidentified relations between empirical phenomena. One would presently hesitate to call these suggestions "discoveries" for the simple reason that the asymptotic safety scenario is at present too speculative and controversial for its empirical ramifications to be regarded as discoveries.
One relation between known phenomena that the asymptotic safety scenario suggests is between the so-called cosmological constant problem (the question of why the cosmological constant is so much smaller than the Planck scale) and the apparently unrelated question of why there is an approximately "classical" regime with approximately scale-independent constants in the first place (see Sect. 4.1). As it turns out, the only "classical" regime exhibited by the flow diagram Fig. 1 is one where the dimensionful cosmological constant is dramatically smaller than the Planck mass (Reuter and Weyer (2004), Sect. 3.3). As a consequence, given the asymptotic safety scenario, if observers find themselves in a world with an approximately "classical" macroscopic regime, they will unavoidably find a cosmological constant in that regime which is dramatically smaller than the Planck scale. Thus the asymptotic safety scenario transforms the question of why the cosmological constant is as small as we find it into the broader question of why there is a long (in renormalization flow terms) classical regime at all.
A way in which the asymptotic safety programme has at least the potential to become fruitful is by permitting us to derive, or at least establish relations between, the measured values of various fundamental constants. In the Standard Model of elementary particle physics the values of the constants (particle masses, couplings) at some specified scale are taken as unrelated primitives. Their scale dependence may, however, be affected if they are coupled to gravity, and it may then become possible to derive dependencies between them. As explained in Section 2.3, the number of dimensions of the UV-critical surface determines the number of independent parameters necessary to identify a physically possible renormalization trajectory. The smaller this number turns out to be, the more the range of possible trajectories is restricted and the more couplings become predictable at energies that can be accessed experimentally. Harst and Reuter carried out the suitable calculations for gravity coupled to quantum electrodynamics (QED), and they found that there is indeed a solution to the flow equations according to which the fine-structure constant α has a non-zero value at experimentally accessible scales that can in principle be predicted (Harst and Reuter (2011)). (The value they derive is of the same order of magnitude as the physically realized one, but, according to them, the approximations used are not sufficient to permit a numerically precise comparison with observation.) A similar result has been obtained by Eichhorn, Held, and Pawlowski (2016) for a Higgs-Yukawa model.
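This predictivity argument can be made concrete with a linearized toy flow around a UV fixed point with one relevant and one irrelevant direction. The critical exponents below are chosen by hand for illustration and do not correspond to computed values in any truncation.

```python
import math

# Linearized toy flow around a UV fixed point, t = ln k:
#   du_rel/dt = -theta_rel * u_rel   (attracted to the fixed point toward the UV)
#   du_irr/dt = +theta_irr * u_irr   (repelled from the fixed point toward the UV)
# A trajectory lies in the UV-critical surface only if u_irr = 0, so the
# irrelevant coupling is fixed ("predicted") at every scale, while u_rel
# remains a free parameter that must be measured.
theta_rel, theta_irr = 2.0, 1.5      # illustrative critical exponents

def flow_up(u_rel, u_irr, t_span):
    """Evolve the linearized couplings from the IR toward the UV over t_span."""
    return (u_rel * math.exp(-theta_rel * t_span),
            u_irr * math.exp(+theta_irr * t_span))

# Even a tiny irrelevant component blows up toward the UV, so the
# trajectory misses the fixed point ...
_, u_irr_uv = flow_up(0.3, 1e-8, 30.0)
print(u_irr_uv)

# ... whereas demanding asymptotic safety forces u_irr = 0 exactly.
print(flow_up(0.3, 0.0, 30.0))
```

The smaller the number of relevant directions, the fewer free parameters survive, which is the sense in which a low-dimensional UV-critical surface makes couplings such as α predictable.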
Yet another way in which the asymptotic safety programme (or, more generally, the non-perturbative approach to quantum gravity) has at one point been suggested to be potentially empirically fruitful is with respect to the astronomical observations that are widely regarded as indicating the existence of dark matter. There is a regime of the renormalization flow associated with very small energy scales k close to zero, i.e. even below the regime where general relativity is effectively valid, which is potentially relevant here (Reuter and Weyer (2004)). For the renormalization trajectory realized in nature, the dimensionful Newton's constant G_k seems to grow strongly in this regime with decreasing energy scale k. In the Einstein-Hilbert truncation it diverges before k = 0, indicating the breakdown of the truncation at this point. But the growth of G_k with decreasing k, as indicated by the Einstein-Hilbert truncation, may nevertheless be physically correct. If it is, this would mean that the strength of gravity increases with decreasing energy scales k, i.e. at very (astronomically) large length scales. From this perspective, the apparent need to invoke dark matter might be the result of unduly neglecting the scale dependence of Newton's constant at very low k. However, there does not seem to have been any further research activity directed at this possibility since the work of Reuter and Weyer (2004), which may indicate that this idea is no longer regarded as promising.

Breadth of scope
The discussion in the previous subsections has highlighted various areas of high energy physics and cosmology with respect to which the asymptotic safety scenario potentially has repercussions. Examples include: the structure of space-time at very high energies and very short length scales; the evolution of the very early universe, including a candidate mechanism for inflation (see, moreover, Kofinas and Zarikas (2016) for an application of the asymptotic safety scenario to the big bang itself and alternatives to it); black hole physics; possible deviations of gravity's strength from the Newtonian law at astronomical length scales; and the derivation of values of fundamental parameters like the fine-structure constant and the Higgs mass.

Conclusion
In the light of the methodological appraisal of the asymptotic safety programme given in the previous section, should we regard the approach as a serious contender in the quest for the correct theory of quantum gravity? One's answer to this question will depend not only on how impressed or unimpressed one is by the points mentioned in the previous section but also on what one regards as the most salient strengths and weaknesses of the other contenders in the quest. It is worth pointing out that identifying the "space of contenders" in this quest may be more complicated than it initially seems, and not only because there may be so far unconceived candidate theories of quantum gravity upon which physicists' imagination simply has not yet hit. A possibility which must be kept in mind is that some of the competitors currently on the scene may turn out to be alternative formulations of one and the same, or at least some closely related, theory. (One may think of how Schrödinger's wave mechanics and Göttingen matrix mechanics turned out to be alternative formulations of one and the same "quantum" theory.) An approach with respect to which the asymptotic safety approach is suspected to be "complementary" in this way is the so-called Causal Dynamical Triangulation approach. Promisingly, this approach has also produced computational results according to which space-time structure is fractal-like at high energy scales (Ambjørn, Jurkiewicz, and Loll (2005)).3 The last word, as far as the present appraisal is concerned, goes to Sabine Hossenfelder, who may have produced the most poignant assessment of the asymptotic safety scenario yet, whether one agrees with it or not: [T]his approach towards quantum gravity has its problems, its friends and its foes, as has every other approach towards quantum gravity. But it is a strong competitor.
What makes this approach so appealing is its minimalism: Maybe quantum gravity makes sense as a quantum field theory after all! Depending on your attitude though you might find exactly this minimalism unappealing. It's like at the end of a crime novel [where] the murder victim comes back from vacation and everybody feels stupid for their conspiracy theories. (Hossenfelder (2014))