Classical conformality (CC) is one of the possible candidates for explaining the gauge hierarchy of the Standard Model (SM). We show that it is naturally obtained from Coleman's theory of baby universes.
Motivated by natural inflation, we propose a relaxation mechanism consistent with inflationary cosmology that explains the hierarchy between the electroweak scale and the Planck scale. This scenario is based on a selection mechanism that identifies the low-scale dynamics as the one that is screened from UV physics. The scenario also predicts the near-criticality and metastability of the Standard Model (SM) vacuum state, explaining the Higgs boson mass observed at the Large Hadron Collider (LHC). Once Majorana right-handed neutrinos are introduced to provide a viable reheating channel, our framework yields a corresponding mass scale that allows for the seesaw mechanism as well as for standard thermal leptogenesis. We argue that singlet scalar dark matter extensions of the proposed scenario could solve the vacuum stability problem and discuss how the cosmological constant problem is possibly addressed.
I give a brief review of theoretical high energy physics as we know it today in the general picture of quantum physics. Prospects for future developments are outlined.
A mechanism is discussed to obtain light scalar fields from a spontaneously broken continuous symmetry without explicitly breaking it. If there is a continuous manifold of classical vacua in orbit space, its tangent directions describe classically massless fields that may acquire mass from perturbations of the potential that do not break the symmetry. We consider the simplest possible example, involving a scalar field in the adjoint representation of SU(N). We study the scalar mass spectrum and its renormalization group running at one-loop level including scalar and pseudoscalar Yukawa couplings to a massive Dirac fermion.
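For orientation, a minimal sketch of such a setup (the normalization of the couplings and the invariants retained here are illustrative, not necessarily those of the paper): the most general renormalizable SU(N)-invariant potential for a real scalar Φ in the adjoint representation reads

\begin{equation}
V(\Phi) \;=\; \frac{m^2}{2}\,\mathrm{Tr}\,\Phi^2 \;+\; \mu\,\mathrm{Tr}\,\Phi^3 \;+\; \frac{\lambda_1}{4}\left(\mathrm{Tr}\,\Phi^2\right)^2 \;+\; \frac{\lambda_2}{4}\,\mathrm{Tr}\,\Phi^4 .
\end{equation}

When the minimum of V is degenerate along a direction in orbit space, the corresponding tangent mode is classically massless even though no symmetry is explicitly broken, and symmetry-preserving perturbations of the couplings can lift it to a small mass.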
We study the implications of a criterion of naturalness for a simple two Higgs doublet model in the context of the discovery of a Higgs-like particle with a mass of 125 GeV. This condition, which measures the amount of fine-tuning, further limits the parameter space of this particular model and, together with other phenomenological constraints, leads to an allowed range of masses for the other neutral and charged Higgs bosons: H, A0, and H±.
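For context, a widely used quantitative naturalness criterion of this kind is the Barbieri–Giudice sensitivity measure (the paper's exact definition may differ), which bounds the logarithmic dependence of the Higgs (or Z) mass on the input parameters p_i of the model:

\begin{equation}
\Delta \;=\; \max_i \left|\frac{\partial \ln m_h^2}{\partial \ln p_i}\right| , \qquad \Delta \lesssim \mathcal{O}(10\text{-}100) \ \text{for an acceptably tuned parameter point.}
\end{equation}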
After the last missing piece, the Higgs particle, has probably been identified, the Standard Model of subatomic particles appears to be a quite robust structure that can survive on its own for a long time to come. Most researchers expect considerable modifications and improvements to come in the near future, but it could also be that the Model will stay essentially as it is. This, however, would also require a change in our thinking, and the question remains whether and how it can be reconciled with our desire for our theories to be “natural”.
I define a naturalness criterion formalizing the intuitive notion of naturalness discussed in the literature. After that, using ϕ4 as an example, I demonstrate that a theory may be natural in the MS-scheme and, at the same time, unnatural in the Gell-Mann–Low scheme. Finally, I discuss the prospects of using a version of the Gell-Mann–Low scheme in the Standard Model.
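The scheme dependence claimed here can be illustrated with the textbook one-loop mass correction in ϕ4 theory: with a hard momentum cutoff (a Gell-Mann–Low-type setup) the correction is quadratically sensitive to the cutoff, whereas in a minimal-subtraction scheme only a logarithmic dependence on the renormalization scale survives; schematically (numerical prefactors depend on conventions),

\begin{equation}
\delta m^2 \;\sim\; \frac{\lambda}{16\pi^2}\,\Lambda^2 \quad \text{(cutoff)}, \qquad\qquad \delta m^2 \;\sim\; \frac{\lambda}{16\pi^2}\, m^2 \ln\frac{\mu^2}{m^2} \quad \text{(minimal subtraction)}.
\end{equation}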
We perform a comprehensive study of the Higgs potential of the two Higgs doublet model extended by a real triplet scalar field Δ. This model, dubbed 2HDM+T, has a rich Higgs spectrum consisting of three CP-even Higgs bosons h1,2,3, one CP-odd Higgs A0, and two pairs of charged Higgs bosons H±1,2. First, we determine the perturbative unitarity constraints and a set of nontrivial conditions for boundedness from below (BFB). Then we derive the Veltman conditions by considering the quadratic divergences of the Higgs boson self-energies in 2HDM+T. We find that the parameter space is severely delimited by these theoretical constraints, as well as by experimental exclusion limits and Higgs signal rate measurements at LEP and the LHC. Using the public codes HiggsBounds-5.3.2beta and HiggsSignals-2.2.3beta, an exclusion test at 2σ is then performed on the physical scalars of 2HDM+T. Our analysis provides clear insight into the nonstandard scalar masses, showing that the allowed ranges are strongly sensitive to the sign of the mixing angle α1, particularly when naturalness is involved. For the α1<0 scenario, our results place stronger limits on all the scalar masses and show that the pairs (h2,H±1) and (h3,H±2) are nearly mass degenerate, varying within the intervals [130,246] GeV and [160,335] GeV, respectively. When α1 turns positive, we show that consistency with the theoretical constraints and current LHC data, particularly in the diphoton decay channel, favors Higgs masses varying within wide allowed ranges: [153,973] GeV for mA0; [151,928] GeV for (mh2, mH±1); and [186,979] GeV for (mh3, mH±2). Finally, we find that the γγ and Zγ Higgs decay modes are generally correlated if tanβ lies within the reduced interval 17≤tanβ≤25 and the λb parameter is frozen around 1.3 (1.1) for sinα1>0 (sinα1<0).
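For comparison, in the Standard Model the analogous Veltman condition follows from demanding that the quadratically divergent one-loop correction to the Higgs mass vanishes; up to the overall loop factor it reads

\begin{equation}
\delta m_h^2 \;\propto\; \frac{\Lambda^2}{16\pi^2\, v^2}\left(m_h^2 + 2 m_W^2 + m_Z^2 - 4 m_t^2\right) \;\stackrel{!}{=}\; 0 ,
\end{equation}

and the 2HDM+T conditions quoted in the abstract generalize this cancellation to the enlarged scalar sector.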
Neurules are a kind of hybrid rule integrating neurocomputing and production rules. Each neurule is represented as an adaline unit. Thus, the corresponding neurule base consists of a number of autonomous adaline units (neurules). Due to this fact, a modular and natural knowledge base is constructed, in contrast to existing connectionist knowledge bases. In this paper, we present a method for generating neurules from empirical data. To overcome the inability of the adaline unit to classify non-separable training examples, the notion of 'closeness' between training examples is introduced. In the case of a training failure, two subsets of 'close' examples are produced from the initial training set and a copy of the neurule is trained on each subset. Failure to train any copy leads to the production of further subsets until success is achieved.
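As a rough illustration of the building block involved (a minimal sketch under generic assumptions; the function name and failure handling are ours, not the authors' implementation), an adaline unit trained with the LMS (delta) rule on bipolar targets looks like this:

```python
import numpy as np

def train_adaline(X, y, lr=0.05, epochs=200, tol=1e-3, seed=0):
    """Train a single adaline unit with the LMS (delta) rule.

    X : (n_samples, n_features) input patterns
    y : (n_samples,) targets in {-1, +1}
    Returns (weights, bias, converged), where `converged` is True when the
    thresholded unit classifies every training example correctly.
    """
    rng = np.random.default_rng(seed)
    w = rng.normal(scale=0.01, size=X.shape[1])
    b = 0.0
    for _ in range(epochs):
        out = X @ w + b                 # linear activation, no threshold during training
        err = y - out
        w += lr * X.T @ err / len(X)    # gradient step on the mean squared error
        b += lr * err.mean()
        if np.mean(err ** 2) < tol:
            break
    converged = bool(np.all(np.sign(X @ w + b) == y))
    return w, b, converged
```

When `converged` comes back False (the examples are not linearly separable), the method described above would split the training set into two subsets of 'close' examples and train a copy of the neurule on each, repeating the splitting until every copy trains successfully.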
We study the effects of temperature in dense hadronic matter within a generalized relativistic mean field approach based on the naturalness of the various coupling constants of the theory. The Lagrangian density of our formulation contains the fundamental baryon octet and nonlinear self-couplings of the σ and δ meson fields coupled to the baryons and to the ω and ρ meson fields. By adjusting the model parameters, after consistently including chemical equilibrium, baryon number conservation, and electric charge conservation, our model describes static bulk properties of ordinary nuclear matter and neutron stars. In the framework of the Sommerfeld approximation, we extend our approach to the T≠0 domain. The Sommerfeld approximation allows a drastic simplification of the computational work while improving the capability of the theoretical analysis of the role of temperature in the static properties of protoneutron stars. We perform the calculations by using our nonlinear model, which we extend by considering trapped neutrinos, introduced into the formalism by fixing the lepton fraction. Integrating the Tolman–Oppenheimer–Volkoff equations, we obtain standard plots of the mass and radius of protoneutron stars as functions of the central density and temperature. Our predictions include the determination of an absolute value for the protoneutron star limiting mass in the low and intermediate temperature regimes.
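For reference, the Tolman–Oppenheimer–Volkoff equations mentioned here determine the stellar structure once the equation of state P(ε) of the model is supplied; in units G = c = 1, with m(r) the gravitational mass enclosed within radius r,

\begin{align}
\frac{dP}{dr} &= -\,\frac{\left[\varepsilon(r)+P(r)\right]\left[m(r)+4\pi r^{3}P(r)\right]}{r\left[r-2m(r)\right]}, \\
\frac{dm}{dr} &= 4\pi r^{2}\,\varepsilon(r),
\end{align}

integrated outward from a chosen central density until P vanishes, which yields the mass–radius relations quoted in the abstract.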
In investigating the role of naturalness in effective theory, we focus on dense hadronic matter within a generalized relativistic multi-baryon mean field approach whose Lagrangian density contains nonlinear self-couplings of the σ and δ meson fields interacting with the fundamental baryon octet and with the ω and ϱ meson fields. We compare its predictions with estimates obtained within a phenomenological naive dimensional analysis based on the naturalness of the coupling constants of the Lagrangian model; our investigation is, however, limited to the scalar sector of the theory. Upon adjusting the model parameters to describe bulk static properties of ordinary nuclear matter, we show that our approach represents a natural modelling of nuclear matter under the extreme conditions of density found in the interior of neutron stars.
High-density hadronic matter is studied in a generalized relativistic multi-baryon mean field approach whose Lagrangian density contains nonlinear couplings of the σ, ω, and ϱ fields. We compare the predictions of our model with estimates obtained within a phenomenological naive dimensional analysis based on the naturalness of the coefficients of the theory. Upon adjusting the model parameters to describe bulk static properties of ordinary nuclear matter, we show that our approach represents a natural modelling of nuclear matter under the extreme conditions of density found in the interior of neutron stars. Moreover, we show that naturalness plays a major role in effective field theory and, in combination with experiment, could represent a relevant criterion for selecting a model among others in the description of global static properties of neutron stars.
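The naive-dimensional-analysis criterion invoked in the last two abstracts is usually stated in the Georgi–Manohar form (the bookkeeping in the papers may differ in detail): a generic operator built from the baryon and meson fields is written with the appropriate powers of the pion decay constant f_π and the heavy scale Λ, and naturalness is the requirement that the remaining dimensionless coefficient be of order one,

\begin{equation}
\mathcal{L} \;\supset\; c_{lmn}\left(\frac{\bar{\psi}\,\Gamma\,\psi}{f_\pi^{2}\Lambda}\right)^{\!l}\left(\frac{\phi}{f_\pi}\right)^{\!m}\left(\frac{\partial}{\Lambda}\right)^{\!n} f_\pi^{2}\,\Lambda^{2}, \qquad |c_{lmn}| \sim \mathcal{O}(1).
\end{equation}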
We study the effects of temperature in dense hadronic matter within a generalized relativistic mean field approach based on the naturalness of the coupling constants of the theory. The Lagrangian density of our formulation contains nonlinear self-couplings of the σ meson field coupled to the baryons and to the ω and ϱ meson fields. Moreover, we use the Sommerfeld and Hartle approximations to extend our approach to the finite temperature domain and to the slow-rotation scenario. Both the Sommerfeld and Hartle approximations allow a drastic simplification of the computational work while improving the capability of the theoretical analysis of the role of temperature and rotation in the properties of protoneutron stars. Our predictions indicate that, in the slow-rotation regime, the density profiles of neutron stars as well as the maximum mass and the moment of inertia of these stellar objects are well approximated by the zero-temperature approximation.
We study the consequences of the presence of a negatively charged condensate of antikaons in neutron stars using an effective model with derivative couplings. In our formalism, nucleons interact through the exchange of σ, ω and ϱ mesons, in the presence of electrons and muons, so as to accomplish electric charge neutrality and beta equilibrium. The phase transition to the antikaon condensate is implemented through the Gibbs conditions combined with the mean-field approximation, giving rise to a mixed phase in which nucleonic matter coexists with the antikaon condensate. Assuming neutrino-free matter, we observe a rapid decrease of the electron chemical potential produced by the gradual substitution of electrons by kaons to maintain electric charge neutrality. The exotic composition of matter in neutron stars, including antikaon condensation and nucleons, can yield a maximum mass of about Mns ~ 1.76 M⊙.
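The Gibbs construction referred to here imposes mechanical and chemical equilibrium between the two coexisting phases while enforcing electric charge neutrality only globally; schematically, for the nucleonic (N) and antikaon-condensed (K) phases occupying volume fractions 1−χ and χ, with a uniform lepton background,

\begin{equation}
P^{N} = P^{K}, \qquad \mu_B^{N} = \mu_B^{K}, \qquad \mu_e^{N} = \mu_e^{K}, \qquad (1-\chi)\,\rho_c^{N} + \chi\,\rho_c^{K} - \rho_e - \rho_\mu = 0 .
\end{equation}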
A recently developed relativistic theory for nuclear matter, which shows consistency with the concept of naturalness and with the conservation of chiral symmetry, is applied to the description of the Okamoto–Nolen–Schiffer anomaly. The results of our study show significant improvements in the description of the anomaly when compared with the corresponding results of other authors. In particular, our predictions reproduce the known anomalous behavior, namely that the anomaly does not necessarily grow with the nuclear mass: it is lower, for instance, for the 39Ca and 39K nuclei than for the 17F and 17O nuclei. As far as we know, this trend has not been described by any other model in the literature.
We argue that when a theory of gravity and matter is endowed with (classical) conformal symmetry, the fine tuning required to obtain the cosmological constant at its observed value can be significantly reduced. Once tuned, the cosmological constant is stable under a change of the scale at which it is measured.
This document summarises recent ATLAS results of searches for third generation squarks using 36.1 fb−1 of LHC proton-proton collision data collected at √s=13 TeV. Despite the absence of experimental evidence, weak scale supersymmetry remains one of the best motivated and studied Standard Model extensions. Supersymmetry can naturally solve the Standard Model hierarchy problem by preventing a large fine-tuning in the Higgs sector: a typical natural SUSY spectrum contains light third generation squarks (stops and sbottoms). Both R-Parity conserving and R-Parity violating scenarios are considered. The searches involve final states including jets, missing transverse momentum, electrons or muons. Simplified models predicting pair production of third generation squarks have been excluded at 95% CL for squark masses up to about 1 TeV in the most favourable scenarios.