[Vision2020] Eugenie Scott: Science, Science, Science!
Tbertruss at aol.com
Sat Nov 5 21:58:41 PST 2005
Michael et al.,
In keeping with the goal of focusing on whether or not Intelligent
Design/Creationism should be taught as a "scientific theory" in science
classrooms, it is puzzling that you did not respond to the problem posed in my
statement below, which was central (or so I thought) to the gist of the post
on Eugenie Scott's talk at the U of I that you answered.
Intelligent Design/Creationism can be taught as a philosophical or
theological problem in classes devoted to those disciplines, an approach I am not
addressing in this analysis. Perhaps you are speaking from a
philosophical/theological system that places science under the overarching metaphysics of a
certain theology, weakening science's independence to investigate phenomena
dispassionately and thoroughly according to the methods of science alone,
leaving other sorts of problems to other disciplines.
There are many questions science cannot now, and may never, be able to
answer; such questions are suited to investigation in philosophy and/or theology.
But to expect science to address the subjects those other disciplines focus
on is perhaps like demanding that a Bible class offer all the scientific
evidence that the history the Bible presents, and other Biblical theories, are not
supported by scientific evidence.
Shall we pass a law regulating the study of the Bible in terms of science for
any public institution, such as courses at the U of I that discuss the Bible?
Ted wrote on 10/16/05:
The problem for Intelligent Design/Creationism theory is to find a testable,
reproducible, empirical method of data gathering and/or experimentation that
can be presented in a science classroom as "science," based on a theory that
also passes logical/mathematical analysis for coherence with itself and with
other established theories of science.
Perhaps it's useful to consider the details of a scientific theory about the
origin of our universe that has received empirical scientific validation to
flesh out what is required to offer an answer to the problem.
The Cosmic Microwave Background radiation, which the theory of the Big Bang
predicts, has been found, and is offered by modern science as empirical
evidence for the theory, a discovery which earned Penzias and Wilson the Nobel
Prize in Physics in 1978. More empirical evidence for this theory, predicted by
difficult mathematical calculations, is being sought by the methods of modern
science.
http://www.big-bang-theory.com/
Offer your scientific theory for empirically investigating the Creation
and/or Intelligent Design of our universe by a supreme being, and perhaps a
scientist will explore your path of investigation and earn a Nobel Prize. Such a
published article, investigated by numerous scientists and later found to be
valid, would garner a massive audience, with millions doing cartwheels at having
their cherished beliefs gain major confirmation from modern science.
I doubt that any of the theories you have presented from Plantinga, etc.,
regarding belief-producing mechanisms, however clever, complex, and interesting
to consider in the world of philosophy or theology, could serve as the basis
for a publishable scientific theory (think in terms of the well-known journal
Science) that offers validation of the Intelligent Design/Creationism of the
universe. This is in part because any scientific theory of the creation of the
universe must deal with the complexities of physics, and your defense of
Intelligent Design/Creationism offered nothing on this subject, nor did your
referencing of Plantinga. But perhaps your introduction of Plantinga into the
discussion was not meant to apply his theories specifically to the debate over
the Intelligent Design/Creationism of the universe as science.
To provide more context in which the "problem" with Intelligent Design
and/or Creationism as a scientific theory might be understood more deeply, I
offer below mind-boggling information (very little of which I understand) on the
efforts of modern physics to explore issues that bear on our understanding of
the origins of our universe, including the continuing search for empirical
evidence, predicted by mathematics, to confirm or complicate the Big Bang
theory. These issues are so complex and difficult that they should give pause to
anyone who thinks they can construct a scientific theory, based on empirical
evidence, that supports Intelligent Design and/or Creationism, though I am
focusing on only one version of Intelligent Design/Creationism: that of a supreme
being who created the entire universe, Big Bang and all. Of course, someone
who does not accept the age of the universe since the Big Bang that science
has presented, believing instead that God created the universe much more
recently, may not believe in the Big Bang theory at all.
We can speculate that God and the universe may have always existed, so the
problem of Intelligent Design/Creationism could be narrowed to the scientific
evidence that God created human life, though consider that this does not
automatically rule out a God-"supervised" evolution of the human species. If God
is "all powerful," such a being could create a scenario in which humans
evolved from one-celled organisms. Furthermore, the suggestion that because the
theory of evolution has "gaps," Intelligent Design/Creationism of human life
must therefore have validity as a scientific theory does not address the demands
for empirical verification and logical/mathematical examination that all
scientific theories face. So we are back to the problem I suggested is the crux
of why Intelligent Design/Creationism has serious difficulties as a
scientific theory, though now perhaps focused more on biology than physics:
http://www.washingtonpost.com/wp-dyn/content/article/2005/09/25/AR2005092501177_pf.html
We can date fossils that reveal life forms from previous eras of Earth's
history, we can observe evolution occurring in real time in living organisms,
and we can conduct mathematical analysis of genes to determine whether their
makeup fits evolutionary theory. Various other theories to explain human life
are possible, but to teach them as science requires that they fulfill the
conditions for scientific investigation.
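To give a concrete, if toy, sense of how one of these checks works
quantitatively, here is a small sketch in Python of the standard radiometric
decay-law calculation used in dating fossils. The numbers are hypothetical,
chosen only for illustration, and real potassium-argon dating also requires a
branching-ratio correction omitted here:

    import math

    # Radiometric dating: age from the decay law t = ln(1 + D/P) / lam,
    # where D/P is the measured daughter-to-parent isotope ratio.
    # Potassium-40 has a half-life of about 1.25 billion years.
    HALF_LIFE_K40_YEARS = 1.25e9
    lam = math.log(2) / HALF_LIFE_K40_YEARS   # decay constant per year

    daughter_to_parent = 0.0554               # hypothetical measured ratio
    age_years = math.log(1 + daughter_to_parent) / lam
    print(f"Estimated age: {age_years / 1e6:.0f} million years")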
http://www.europhysicsnews.com/full/27/article4/article4.html
Europhysics News (2004) Vol. 35 No. 3
Particle physics from the Earth and from the sky
Daniel Treille, CERN, Geneva, Switzerland
Recent results in particle physics offer a good balance between the news
"coming from the Earth", namely results from the various colliders, and news
"coming from the sky", concerning solar and atmospheric neutrinos, astroparticle
programmes, searches for dark matter, cosmic microwave background (CMB),
cosmology, etc.
In the light of this information, gathered in particular from the 2003
Summer Conferences (EPS in Aachen, Lepton-Photon in Fermilab), an account of the
status of our field is given. It will appear in two parts, corresponding
approximately to the division between the Earth and the sky. The first one covers
the Electroweak Theory, ideas beyond the Standard Model, Quantum
Chromodynamics (QCD), Beauty and heavy ion physics.
Electroweak Theory
The Electroweak Theory (EWT), together with Quantum Chromodynamics (QCD),
the modern version of the strong interaction, constitutes the Standard Model
(SM) of Particle Physics [1]. The EWT is a fully computable theory. All EW
measurable quantities, called "observables", for instance the properties of the
various Z0 decay modes, can be predicted with great accuracy and compared to
measurements. Each of them allows one, in particular, to determine the Weak
Mixing Angle, i.e. the parameter of the 2×2 unitary matrix which transforms the
two abstract neutral bosons of the EWT into the two neutral physical states,
the photon and the Z0. The internal consistency of the EWT implies that all
values of the Weak Mixing Angle obtained should coincide. In terms of the
standard Big Bang model, the breaking of the EW symmetry, namely the time at which
the known elementary particles got their mass, presumably through the Higgs
mechanism*, occurred at about 10^-11 s after the Big Bang.
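As a minimal sketch of the mixing just described (the matrix form is
standard; the value sin^2(theta_W) ~ 0.231 is the approximate measured one, used
here only for illustration):

    import numpy as np

    # The weak mixing angle parametrizes the 2x2 rotation turning the two
    # abstract neutral bosons (B, W3) into the physical states (photon, Z0).
    sin2_theta_w = 0.231                      # approximate measured value
    th = np.arcsin(np.sqrt(sin2_theta_w))

    # (photon)   ( cos th   sin th) (B )
    # (Z0    ) = (-sin th   cos th) (W3)
    rotation = np.array([[np.cos(th), np.sin(th)],
                         [-np.sin(th), np.cos(th)]])

    # Unitarity of this matrix is why all determinations of the angle
    # must coincide if the theory is internally consistent.
    assert np.allclose(rotation @ rotation.T, np.eye(2))
    print(f"theta_W = {np.degrees(th):.1f} degrees")   # ~28.7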
The e+e- high-energy colliders LEP at CERN and SLC (SLAC Linear Collider)
have delivered their quasi-final results. Their contribution to the validation
of the EWT has been invaluable. However, besides celebrating this great
success, it is worth considering the few areas of obscurity left and discussing how
one can hope to improve the precision measurements in the future.
It is amusing to recall what was expected from LEP, for instance at the
time of the meeting held in Aachen in 1986, in the same place as the 2003 EPS
Conference. In nearly all domains the quality and accuracy of the final results
of Z0 and W± physics* have been much better than foreseen, in particular due
to the progress made during the last decade on detectors (microvertex devices
allowing a clean tag of beauty particles by revealing their long lifetime of
about 1 picosecond, i.e. a flight path of a few mm; luminometers providing a
very accurate absolute normalization of the various processes; etc.), on
methods (such as how to determine the number of neutrinos from the Z0 properties),
and on the mastering of theoretical calculations.
Figure 1 and its legend recall the scenery of e+e- collisions.
Sitting on the huge Z0 resonance, LEP recorded about 18 million Z0 events and
SLC only about half a million, but with the strong bonus of a large polarization
of the incident electrons and better conditions for beauty tagging. From this
large amount of data, many observables were measured, often with an accuracy
of one per mil or better. Later, LEP200 measured e+e- interactions at higher
center-of-mass energies, up to 206 GeV: it recorded about 40K W-pair events
and set quite strong lower mass limits on the Higgs boson and on Supersymmetric
Particles.
If one summarizes the whole set of available EW measurements (LEP/SLC and
others) by performing a global fit [2], one finds that the SM accounts for the
data in a satisfactory but nevertheless imperfect way: the probability of the
fit is only 4.5%.
The measurement lying furthest from the average is that of the weak mixing
angle by the NuTeV experiment in Fermilab [3], which scatters neutrinos and
antineutrinos on target nuclei. Before invoking new physics, the possible
"standard" causes of such a disagreement were carefully investigated: unexpected
features of the quark distribution inside nucleons are the most likely culprits.
If this measurement is excluded from the fit, the probability becomes 27.5%,
a reassuring number.
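The "probability of the fit" quoted here is the standard chi-squared
p-value. A sketch of the computation (the chi-squared and degrees-of-freedom
inputs below are illustrative stand-ins chosen to land near the quoted 4.5%, not
the actual fit inputs):

    from scipy.stats import chi2

    # p-value: the chance of a fit this poor (or worse) arising from
    # statistical fluctuations alone, if the model is correct.
    chisq, ndf = 25.4, 15                  # illustrative values only
    p_value = chi2.sf(chisq, ndf)          # survival function = 1 - CDF
    print(f"chi2/ndf = {chisq}/{ndf} -> probability = {p_value:.1%}")

Removing one outlying measurement lowers the chi-squared and raises this
probability, which is how excluding NuTeV moves 4.5% to 27.5%.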
The other noticeable disagreement concerns the two most precise
electroweak measurements, namely the spin asymmetry ALR at SLC, i.e. the relative
change of the rate of Z0 production in e+e- collisions when one flips the electron
helicity (the component of its spin along the direction of motion), and the
forward-backward asymmetry of beauty production on the Z0 at LEP, AFBb, i.e.
the manifestation of the violation of particle-antiparticle conjugation C (and
of parity P) in e+e- → Z0 → beauty-antibeauty. These give values of the weak
mixing angle differing by 2.7 standard deviations, with no hint of an
explanation, either instrumental or theoretical.
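To make concrete how such an asymmetry determines the mixing angle, here is
a sketch of the tree-level SM inversion (the measured ALR value used is
approximate):

    from scipy.optimize import brentq

    # In the SM, A_LR equals the electron asymmetry parameter
    #   A_e = 2x / (1 + x^2),  with  x = 1 - 4*sin^2(theta_eff),
    # so the measured asymmetry can be inverted for the mixing angle.
    a_lr_measured = 0.1514                 # approximate SLC value

    def residual(sin2_theta):
        x = 1.0 - 4.0 * sin2_theta
        return 2.0 * x / (1.0 + x * x) - a_lr_measured

    sin2_theta_eff = brentq(residual, 0.20, 0.25)
    print(f"sin^2(theta_eff) = {sin2_theta_eff:.5f}")   # ~0.2310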
Fig 1 The scenery of e+e- collisions as a function of energy. LEP1 was
"sitting" on the huge Z0 resonance. Beauty factories exploit the ϒ(4S) resonance
located in the family of ϒ beauty-antibeauty resonances near 10 GeV. The J/ψ is
the lowest charm quark-antiquark bound state near 3 GeV. (Fig. courtesy U.
Amaldi.)
An ambiguity which is not yet removed concerns the theoretical
interpretation of the muon g-2 measurement [4] obtained at Brookhaven with an
experimental accuracy of ~5×10^-7. The slight departure of the muon g factor, relating
the magnetic moment to the spin, from its canonical value of 2 (i.e. the value
given by the Dirac equation describing pointlike relativistic fermions) is due
to the fact that the electromagnetic interaction of a muon and a photon is
perturbed by the exchange of one (or more) additional photon(s) (figure 2a).
After correction of a small error, the theoretical frame is sound. However, the
tiny hadronic contribution (figure 2b) to this quantity expected in the SM,
which reflects the probability that the additional photon fluctuates into a
light hadronic system, differs depending on the way it is estimated. To obtain
its value one has to resort to subsidiary experimental data. Using for this
purpose the hadronic decays of the tau* (plus a set of assumptions) leads to
relatively fair agreement between theory and experiment (the latter larger than
the former by 1.4 σ). However, using hadronic production in low energy e+e-
collisions leads to an excess of experiment over expectation which, according
to the most recent analyses [5], amounts to 2.7 σ. The situation may still
change with the advent of data from the B Factories at SLAC and KEK (Japan) and
from KLOE in Frascati. This residual discrepancy is all the more unfortunate
given that the g-2 observable is potentially a powerful telltale sign of new
physics, in particular Supersymmetry [1], since new particles can contribute to
the perturbation as virtual states (figure 2c). While a significant excess of
the measured value over theory could point to an appetizing window for the
masses of some supersymmetric particles, good agreement could on the contrary
eventually turn into a noticeable constraint on the minimal value of their
masses.
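For a sense of scale, the leading QED loop (figure 2a) already accounts for
almost all of the anomaly a = (g-2)/2; this is Schwinger's classic
alpha/(2*pi) term, computed here as a sketch:

    import math

    # Leading (one-loop QED) contribution to the anomalous magnetic moment.
    alpha = 1.0 / 137.035999               # fine-structure constant
    a_leading = alpha / (2.0 * math.pi)
    print(f"a (leading QED) = {a_leading:.9f}")   # ~0.001161410

The measured muon anomaly is about 0.00116592; higher-order QED terms supply
most of the remainder, while the disputed hadronic piece enters near the
10^-7 level, which is why the interpretation is so sensitive to how that piece
is estimated.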
A low energy measurement which "returned to the ranks" is that of atomic
parity violation (APV). APV [6] occurs because in an atom the electrons and the
nucleus interact not only by photon exchange but also by Z0 exchange (and
that of its possible recurrences at higher mass, Z0'). Alkali atoms, having a
single outer electron, are the only ones that lead to tractable atomic
calculations. Due to recent refinements of some theoretical estimates, there is
presently good agreement between the expectation and the 0.6% accurate measurement
on cesium made in Boulder in 1997. The APV measurement does not weigh much in
the EW fit. However, a remarkable result for such a small-sized experiment is
that the lower mass limit it sets on a potential Z' (600-800 GeV) is quite
competitive with those of LEP or the Tevatron. To remain so in the face of
future LHC data, however, the APV measurement should reach ~1‰ or so. The
possibility of a programme using francium, the next alkali atom, much more sensitive
but radioactive, is sometimes mentioned.
It is worth underlining here the promise of another set of low energy
measurements, concerning Electric Dipole Moments (EDM), in particular of the
neutron. For particles to have a permanent EDM, the forces concerned must violate
invariance under time reversal T (and therefore under CP*), and the SM
expectations are out of reach, far below existing and foreseeable limits. But various
scenarios beyond the SM may lead to strong enhancements. Very sophisticated
methods involving ultracold neutrons are under study and may bring an
improvement of two orders of magnitude on the present neutron EDM upper limit. Limits
on the muon EDM, as a by-product of the g-2 measurement, and on the electron
EDM, through measurements made on various atoms, in particular Hg, are likely
to improve as well. If no positive evidence is found, these limits will in
particular become a major constraint on Supersymmetry.
Let us finally quote a potential problem concerning the unitarity of the
Cabibbo-Kobayashi-Maskawa (CKM) matrix [7], and more precisely its first row.
The CKM matrix gives the relationship between the quarks seen as mass
eigenstates and as flavour eigenstates. Unitarity just means that when one "rotates"
from one basis to the other, probability has to be conserved. The CKM matrix
is a 3×3 unitary matrix, entirely defined in terms of four real parameters. It
gives a concise description of all that we know at present about the weak
interactions of quarks. The first row of the matrix concerns essentially the ud
and the us (Cabibbo angle) transitions, and the fact that their moduli squared
do not add exactly to unity could indicate that the value of the Cabibbo
angle is slightly underestimated. Actually, after including recent results, like
the data of E865, at the Brookhaven Alternating Gradient Synchrotron, on the
decay K → πeν, the remaining deficit relative to unity amounts only to ~1.8 σ
and is not a big worry.
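A sketch of the first-row test itself, using illustrative magnitudes of
roughly the right mid-2000s size (not the precise fitted values):

    # First-row unitarity: |Vud|^2 + |Vus|^2 + |Vub|^2 should equal 1.
    v_ud, v_us, v_ub = 0.9740, 0.2200, 0.0037   # illustrative magnitudes

    row_sum = v_ud**2 + v_us**2 + v_ub**2
    print(f"sum = {row_sum:.4f}, deficit = {1.0 - row_sum:.4f}")

A deficit of a few parts per thousand, weighed against the experimental
uncertainties, is what translates into the ~1.8 σ figure quoted above; a slightly
larger Cabibbo angle (|Vus|) would close the gap.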
Fig 2 Examples of loop diagrams
(a) The lowest order diagram contributing to g-2.
(b) The hadronic contribution to g-2.
(c) SUSY particles in the loops as a possible contribution to g-2.
(d) Penguin diagrams (SM and SUSY) contributing to one B decay mode. The
resemblance to a penguin is a matter of taste.
(e) The loop diagrams responsible for beauty-antibeauty oscillation.
The message from LEP
In spite of the few open questions quoted above, the first message of LEP/SLC
is therefore the quality of the agreement of the SM, or more exactly of its
neutral current (i.e. Z0) sector, with the data. Any theory attempting to go beyond
the SM (see below) must therefore mimic it closely and offer very similar
predictions of the various EW observables. Most interestingly, because of the
extreme accuracy of the measurements, the agreement has been demonstrated at the
quantum loop-level. Before expanding this last point, let us remark that the
situation is less precise for the charged current sector of the SM. As for its
scalar (Higgs or equivalent) sector, still largely untested, it will need the
CERN Large Hadron Collider (LHC) to be explored.
In a given process, particles, even if they are too heavy to be produced
as "real" particles, can nevertheless intervene as "virtual" states and
slightly influence the process. Figure 2 presents a variety of such loop diagrams.
Accurate measurements on a process can thus yield information on these virtual
particles. At LEP the "missing pieces" of the SM were the top quark, too heavy
to be pair produced but whose existence was never in doubt, and the Higgs
boson, not yet observed directly at present. As G. Altarelli put it, LEP physicists
were in the situation of a bush hunter, his ear to the ground, trying to hear
the pace of a tiger (the Higgs) while an elephant (the top) was rampaging
around.
It is well known that Z0 physics at LEP gave a rather accurate "indirect"
estimate of the top quark mass (presently 171.5 +11.9/-9.4 GeV), in very good
agreement with the value that the Tevatron later measured "directly" by
producing the top, presently 174.3 ± 5.1 GeV (figure 3a). Once the "large" effect of
the elephant-top on the relevant electroweak observables was well under
control, one could search for the tiny effect expected from the tiger-Higgs boson,
which in the SM is assumed to be the only missing piece. Ignoring the
disagreements quoted above, essentially that existing between ALR and AFBb, and
considering only the mean values, one can thus deduce, in the strict frame of the
SM, the preferred mass region for the Higgs boson (remembering that the
information concerns the logarithm of its mass):
mH = 91 +58/-37 GeV, and mH < 219 GeV at 95% CL (figure 3b).
Taken alone, the ALR observable would give for the boson mass a range
between about 15 and 80 GeV, while the observable AFBb would give it between about
200 and 700 GeV. The W mass value (the world average is 80.426 ± 0.034 GeV)
also indicates a Higgs mass region on the low side.
Let us remark that the SLC measurement seems to contradict the lower limit
of 114.2 GeV set on the Higgs mass by the direct Higgs search* at LEP200, as
well as the indication of an effect near 115 GeV, which is presently at the
1.7 σ level. However, the problem would be less acute if the top mass were a few
GeV, say one standard deviation, higher than presently stated, a possibility
that a reanalysis by the Tevatron experiment D0 [8] of its Run I data might
suggest. If it were so, the limit on mH would be raised from 219 to ~280 GeV.
For this reason, and many other good ones, a precise determination of the top
mass is "devoutly to be wished". The Tevatron will reduce the uncertainty to
~2.5-3 GeV per experiment with an integrated luminosity of 2 fb^-1 (i.e.
providing 2 events for a process having a cross-section of a femtobarn, i.e. 10^-39
cm^2). The LHC should reach an uncertainty of ~1-2 GeV, while a Linear
Collider will do about ten times better.
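The luminosity bookkeeping in that parenthesis is just N = sigma x
integrated luminosity; a trivial sketch:

    # Expected event count: N = cross_section * integrated_luminosity.
    # 1 barn = 1e-24 cm^2, so 1 fb = 1e-39 cm^2 and 1 fb^-1 = 1e39 cm^-2.
    cross_section_fb = 1.0        # a process with a 1 fb cross-section
    integrated_lumi_fb_inv = 2.0  # the Run II benchmark quoted above

    print(f"expected events: {cross_section_fb * integrated_lumi_fb_inv:.0f}")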
Fig 3 Left: The top mass from indirect LEP measurements (open circles) and
from the direct Tevatron measurements (colour triangles).
Right: The preferred region for the SM Higgs mass (near the bottom of the χ²
curve) deduced from electroweak measurements.
The other key message of LEP/SLC is thus the indication of a light Higgs
boson. Is this the truth, or could it be an illusion? Clearly, if one leaves
the framework of the SM by introducing new physics, it is quite possible to
invent "conspiracies", i.e. interference among the amplitudes of different
processes, by which a heavy Higgs boson has its effect on electroweak observables
compensated by something else, like new particles or extra dimensions of
space. However, these solutions are more or less artificial: it is thus reasonable
to focus on the simplest scenario and to test as a priority the assumption of
a light boson by obtaining direct evidence for it.
Beyond the standard model
Exhaustive reviews of the direct searches for new physics at colliders,
updating the existing limits, have been given. Unfortunately, besides the DsJ
particles* found by the Beauty Factories [9] and the Pentaquarks [10]*, no
discovery has shown up at the high-energy frontier.
Nevertheless, the motivations pushing us to go beyond the SM are still
present, and more compelling than ever. The main one is the Hierarchy Problem,
which can be stated as follows. Gravity exists and defines a very high energy
scale, the Planck scale* (~10^19 GeV), at which the gravitational force becomes
strong. In the SM all other masses, in particular the Higgs mass, should be
irredeemably pulled towards this high scale by the radiative effects already
quoted. Something more is needed to guarantee the stability of low-mass scales.
Traditionally, the routes leading beyond the SM either call for new levels of
structure and/or new forces, as Technicolour (TC) [11] does, or involve more
symmetry among the players of the theory, as in the case of Supersymmetry
(SUSY) [1,12], in which SM particles and their "superpartners", i.e. the new
particles of opposite spin-statistics (a boson as partner of a SM fermion and vice
versa) that SUSY introduces, conspire to solve the Hierarchy Problem.
TC breaks the EW symmetry in an appealing way, very reminiscent of the way
the electromagnetic one is broken by superconductivity (which, crudely
speaking, gives a mass to the photon). However, TC meets serious problems in passing
the tests of the electroweak measurements, because it spoils the predictions
too much. On the other hand SUSY, which has a more discreet effect in this
respect, keeps its eminent merits and remains the most frequented, even crowded,
route. In this context another important result [13] derived from the LEP data
is the quasi-perfect convergence near 10^16 GeV of the electromagnetic, weak
and strong coupling "constants" in the frame of SUSY, the so-called
Supersymmetric Grand Unification (SGU) (figure 4b). This "running" of coupling constants
with the energy scale is another consequence of the quantum nature of the
theory: it is due to the effect of virtual particles appearing in the loop
diagrams. The presence of superpartners explains why the "running speed" is
different in SUSY and in the SM.
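The one-loop running behind figure 4 is simple enough to reproduce
approximately; a sketch, with the usual textbook simplification of putting all
superpartners right at the Z mass (the beta coefficients are standard, the coupling
inputs here are rounded):

    import math

    # One-loop running: 1/alpha_i(mu) = 1/alpha_i(MZ) - b_i/(2*pi)*ln(mu/MZ)
    MZ = 91.19                               # GeV
    inv_alpha_mz = [59.0, 29.6, 8.47]        # 1/alpha_1,2,3 at MZ (rounded)
    b_sm   = [41/10, -19/6, -7]              # SM beta coefficients
    b_mssm = [33/5,   1,    -3]              # MSSM, superpartners at MZ

    def run(inv_a0, b, mu):
        return inv_a0 - b / (2 * math.pi) * math.log(mu / MZ)

    mu = 2e16                                # GeV, near the SGU scale
    for label, bs in (("SM  ", b_sm), ("MSSM", b_mssm)):
        vals = ", ".join(f"{run(a0, b, mu):.1f}"
                         for a0, b in zip(inv_alpha_mz, bs))
        print(label, vals)

The MSSM line comes out with all three inverse couplings near 24, i.e. the
quasi-perfect convergence of figure 4b, while the SM values miss one another
by several units.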
Fig 4 Left: The evolution of the strong coupling constant with the energy
scale. Right: The convergence of the SM coupling constants, approximate in the SM
(upper figure), exact in SUSY (lower figure). One should distinguish this
smooth running of couplings from the evolution of the intensity of the
interaction with the energy scale, depending on the mass of the exchanged boson.
SUSY is certainly a broken symmetry, as no partner of a known particle
with opposite spin-statistics exists at the same mass. These partners are
assumed to be heavy, but not too heavy (a few hundred GeV to a few TeV), as
otherwise SUSY would no longer cure the hierarchy problem. Furthermore, the
convergence of couplings quoted above requires that the superpartners appear at
relatively low mass, say 1 to 10 TeV.
With the diversity of possible SUSY breaking mechanisms*, this theory
presents a complex phenomenology with many different possible mass spectra for
the supersymmetric particles. Its minimal version, however, offers a golden
test: it predicts a very light Higgs boson, i.e. <130 GeV in full generality (for
mtop = 175 GeV), and <126 GeV once SUSY is broken, as it has to be, and in
particular in all versions of Supergravity [14] presently considered as the
reference points for future searches. This is a mass window that LEP, with 80
additional (i.e. 30% more) superconducting accelerating cavities and the
magnificent performance of the accelerating field finally reached, could have
explored, and which remains the first objective of future programmes. If SUSY
represents the truth, the LHC, or maybe, with much luck and considerable
improvements, the Tevatron, will discover it by observing, besides the light Higgs
boson, some supersymmetric particles. But a Linear Collider will be needed to
complete the metrology in the mass domain it gives access to.
However, quite interesting new roads have appeared in recent years.
One, the Little Higgs scenario, leaving aside the Big Hierarchy problem
(the one we introduced above) for the time being, first tackles the Small
Hierarchy one, namely the fact that LEP indicates a light Higgs boson while pushing
beyond several TeV the scale of any new physics (except SUSY, which can still
be "behind the door"): again the Higgs mass should be pulled up to this high
scale, and the fact that it is not calls for efficient cancellation mechanisms
to be at work. Keen to do without SUSY, this model, by an algebraic tour de
force, manages to realize the compensations needed by inventing new particles,
a Z', a W', a new quark, etc., at a mass scale of a few TeV. The existing EW
measurements, however, put the model under severe tension. True or not, this
theory has the merit of reinvigorating LHC phenomenology by introducing new
particles into the game and, in particular, insisting on quantitative tests of
their decay modes.
The other new route postulates the existence, so far uncontradicted, of
extra dimensions of space (ED), large enough to generate visible effects in
future experiments. The general idea of an ED, due to Kaluza and Klein, is rather
old (around 1919). Superstring Theory requires EDs, since it is consistent
only in 9 or 10 spatial dimensions. For a long time, however, these EDs were
thought to be "curled up" (compactified) at the Planck scale, until it was
realized that things could be different. Several versions are presently put forward
[15]. Despite substantial differences between them, they all predict
Kaluza-Klein recurrences of the graviton or of some of the SM particles, i.e. new
states which can be produced if their mass is at the TeV scale or below, or which
may change the rate of SM processes through their effect as virtual particles.
Such an eventuality, which has to be fully explored, would be an
extraordinary chance for the LHC, and its prospective study also contributes an
agreeable diversification of its phenomenology. However, before dreaming too much, it
is important to appreciate correctly the existing limits, drawn either from
accelerators or from astrophysics. For the ADD scenario, one should also
consider the impact of dedicated tests of Newtonian gravity at small scale [16],
which, besides micro-mechanical experiments, use sophisticated methods involving
Ultra Cold Neutrons and maybe, in the future, Bose-Einstein Condensates,
building interesting bridges between particle physics and other sectors of
physics.
Moreover, it is still rather natural to assume that extra dimensions, if
they play a role, would do so at much higher energy scales, for instance that
of Grand Unification (GU). Many studies follow that path and analyse what one
or more extra dimensions bring to the already very successful theories of
Supersymmetric GU. This complements the class of studies which add to the
symmetry group of GU other ones (a new U(1), a new SU(3), etc.) whose role is to deal
in particular with the mystery of the triplication of families (i.e. the
existence of the electron, muon and tau families).
The hope is that these attempts, performed from "bottom to top", i.e. from
low towards high energies, and those, from "top to bottom", of Superstrings
[17] will meet one day and guide each other.
Fig 5 The "d-b" Unitary Triangle. Adding to zero three complex numbers like
VudVub*, etc. naturally lead to draw a triangle. We indicate which B decay
modes give access to its angles and sides. Vij is the element of the CKM matrix
connecting the flavour eigenstate quark i to the mass eigenstate quark j.
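A sketch of the closure itself, in the standard Wolfenstein parametrization
(the parameter values are illustrative, roughly of the size fitted in this
era):

    # "d-b" triangle: V_ud*V_ub^* + V_cd*V_cb^* + V_td*V_tb^* = 0.
    lam, A, rho, eta = 0.2265, 0.80, 0.20, 0.35   # illustrative values

    V_ud = 1 - lam**2 / 2
    V_ub = A * lam**3 * (rho - 1j * eta)
    V_cd = -lam
    V_cb = A * lam**2
    V_td = A * lam**3 * (1 - rho - 1j * eta)
    V_tb = 1.0

    closure = (V_ud * V_ub.conjugate() + V_cd * V_cb.conjugate()
               + V_td * V_tb.conjugate())
    print(f"|sum| = {abs(closure):.1e}")   # ~1e-4: zero up to O(lambda^5)

The residual is of order lambda^5 because the parametrization itself is an
expansion; the exact CKM matrix closes the triangle exactly.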
-------------------
Heavy ions
In the Big Bang model the transition from free quarks to hadrons occurred at
a few microseconds. High energy heavy-ion collisions are under study to find
evidence for the reverse step, i.e. the fusion of nucleons into a quark-gluon
plasma (quagma) [23].
Fresh results are coming from the RHIC collider in Brookhaven, concerning
Au-Au collisions up to 200 A GeV, and have brought a few "surprises" concerning
the properties of the hot and dense medium thus produced. The quotation marks
express the fact that some of them were actually predicted long ago.
The chemical freeze-out (at which the identity of the particles is fixed)
occurs at 175 MeV (the Hagedorn temperature [24]), as at the CERN SPS, but the
medium is now nearly baryon-free. The kinetic freeze-out (at which their
kinematics is fixed) happens near 100 MeV. The medium undergoes an explosive
expansion at a speed of 0.6c, and shows a strong anisotropy of transverse flow,
suggesting a hydrodynamic expansion due to very strong pressure gradients
developing early in the history of the collision. Remarkably, the collision zone is
opaque to fast quarks and gluons, and this has a strong impact on hard
phenomena: suppression of hadrons produced at large pT and jet "quenching", i.e. the
decrease of their rate of production, phenomena which are not observed in
control d-Au collisions. Several questions concerning the Hanbury-Brown-Twiss
(HBT) correlations (a concept borrowed from astronomy), e.g. the size of the
collision zone, or the fate of charm in this opaque medium, etc., have still to
be clarified.
However, the most prominent signatures which could reveal a quark-gluon
plasma are not yet available from RHIC, and it is from the CERN Super Proton
Synchrotron that results are still coming. In particular, the experiment NA45
confirms that the excess of low mass e+e- pairs, m(e+e-) > 0.2 GeV, implies a
modification of the ρ resonance in the dense medium, probably linked to its
baryonic density. The suppression of the production of the J/ψ, the lowest bound
state of charm and anticharm, which could signal its fusion in the quagma [25],
is confirmed by the analyses of NA50 and retains all its interest.
Unfortunately, no unique prediction of this effect exists for RHIC and LHC. Data are
needed: the next ones should come from PHENIX at RHIC and from NA60 at the CERN
SPS.
--------------------------------------------
Vision2020 Post by Ted Moffett