Variability in Type Ia Supernovae Has Implications for Studying Dark Energy


The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of Type Ia supernovae, and these stellar explosions have long been used as “standard candles” for measuring the expansion. But not all Type Ia supernovae are created equal. A new study reveals sources of variability in these supernovae; to accurately probe the nature of dark energy and determine whether it is constant or varies over time, scientists will have to measure cosmic distances with much greater precision than they have in the past.

“As we begin the next generation of cosmology experiments, we will want to use Type Ia supernovae as very sensitive measures of distance,” said Daniel Kasen, lead author of a study published in Nature this week. “We know they are not all the same brightness, and we have ways of correcting for that, but we need to know if there are systematic differences that would bias the distance measurements. So this study explored what causes those differences in brightness.”

Kasen and his coauthors–Fritz Röpke of the Max Planck Institute for Astrophysics in Garching, Germany, and Stan Woosley, professor of astronomy and astrophysics at UC Santa Cruz–used supercomputers to run dozens of simulations of Type Ia supernovae. The results indicate that much of the diversity observed in these supernovae is due to the chaotic nature of the processes involved and the resulting asymmetry of the explosions.

For the most part, this variability would not produce systematic errors in measurement studies as long as researchers use large numbers of observations and apply the standard corrections, Kasen said. The study did find a small but potentially worrisome effect that could result from systematic differences in the chemical compositions of stars at different times in the history of the universe. But researchers can use the computer models to further characterize this effect and develop corrections for it.

A Type Ia supernova occurs when a white dwarf star acquires additional mass by siphoning matter away from a companion star. When it reaches a critical mass–1.4 times the mass of the Sun, packed into an object the size of the Earth–the heat and pressure in the center of the star spark a runaway nuclear fusion reaction, and the white dwarf explodes. Since the initial conditions are about the same in all cases, these supernovae tend to have the same luminosity, and their “light curves” (how the luminosity changes over time) are predictable.

Some are intrinsically brighter than others, but these flare and fade more slowly, and this correlation between the brightness and the width of the light curve allows astronomers to apply a correction to standardize their observations. So astronomers can measure the light curve of a Type Ia supernova, calculate its intrinsic brightness, and then determine how far away it is, since the apparent brightness diminishes with distance (just as a candle appears dimmer at a distance than it does up close).
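The standardization step described above can be illustrated with a toy calculation. The sketch below is a minimal illustration, not a real cosmology pipeline: it assumes a simple linear width-luminosity correction (a crude stand-in for the empirical Phillips relation) and the standard distance-modulus formula; the fiducial absolute magnitude and the coefficient `alpha` are illustrative placeholders, not fitted values.

```python
import math

def standardized_distance_mpc(apparent_mag, stretch,
                              M_fiducial=-19.3, alpha=1.5):
    """Toy Type Ia distance estimate.

    apparent_mag : observed peak magnitude (m)
    stretch      : light-curve width relative to a fiducial template
                   (s = 1 means a 'normal' light curve)
    M_fiducial   : assumed absolute magnitude of a fiducial SN Ia
    alpha        : assumed width-luminosity coefficient (illustrative)

    Broader light curves (s > 1) are intrinsically brighter, so the
    correction makes the inferred absolute magnitude more negative.
    """
    # Width-luminosity correction: a linear stand-in for the
    # empirical Phillips relation.
    M_corrected = M_fiducial - alpha * (stretch - 1.0)
    # Distance modulus: m - M = 5 log10(d / 10 pc)
    mu = apparent_mag - M_corrected
    d_pc = 10 ** (mu / 5.0 + 1.0)
    return d_pc / 1e6  # megaparsecs

# A supernova observed at m = 19.0 with a normal light curve (s = 1)
# lands at roughly 457 Mpc under these toy assumptions.
print(round(standardized_distance_mpc(19.0, 1.0)))
```

The point of the correction is visible in the second argument: feeding in a broader light curve (larger stretch) yields a brighter inferred intrinsic luminosity and therefore a larger distance for the same apparent magnitude.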

The computer models used to simulate these supernovae in the new study are based on current theoretical understanding of how and where the ignition process begins inside the white dwarf and where it makes the transition from slow-burning combustion to explosive detonation.

The simulations showed that the asymmetry of the explosions is a key factor determining the brightness of Type Ia supernovae. “The reason these supernovae are not all the same brightness is closely tied to this breaking of spherical symmetry,” Kasen said.

The dominant source of variability is the synthesis of new elements during the explosions, which is sensitive to differences in the geometry of the first sparks that ignite a thermonuclear runaway in the simmering core of the white dwarf. Nickel-56 is especially important, because the radioactive decay of this unstable isotope creates the afterglow that astronomers are able to observe for months or even years after the explosion.

“The decay of nickel-56 is what powers the light curve. The explosion is over in a matter of seconds, so what we see is the result of how the nickel heats the debris and how the debris radiates light,” Kasen said.
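The decay chain Kasen describes can be sketched with the Bateman equations for ⁵⁶Ni → ⁵⁶Co → ⁵⁶Fe. In the sketch below, the half-lives are the well-measured laboratory values, but the per-decay energy ratio is a rough illustrative figure; this shows only the shape of the radioactive heating curve, not a calibrated light-curve model.

```python
import math

# Laboratory half-lives (days)
T_HALF_NI56 = 6.08   # 56Ni -> 56Co (electron capture)
T_HALF_CO56 = 77.2   # 56Co -> 56Fe

LAM_NI = math.log(2) / T_HALF_NI56   # decay constants (1/day)
LAM_CO = math.log(2) / T_HALF_CO56

def decay_heating(t_days, q_ni=1.0, q_co=2.1):
    """Relative radioactive heating rate at time t after explosion.

    Bateman solution for a two-step chain starting from pure 56Ni.
    q_ni and q_co are the energies released per 56Ni and per 56Co
    decay, in units of the 56Ni decay energy (the ~2:1 ratio is a
    rough illustrative figure, not a precise nuclear datum).
    """
    n_ni = math.exp(-LAM_NI * t_days)                      # 56Ni remaining
    n_co = (LAM_NI / (LAM_CO - LAM_NI)) * (                # 56Co present
        math.exp(-LAM_NI * t_days) - math.exp(-LAM_CO * t_days))
    return q_ni * LAM_NI * n_ni + q_co * LAM_CO * n_co

# Early on, 56Ni decay dominates; after a few weeks the slower
# 56Co decay takes over and sets the light curve's late-time slope.
for t in (1, 20, 100):
    print(t, decay_heating(t))
```

At late times the heating falls off on the 77-day cobalt timescale, which is why the afterglow remains observable for months, as the article notes.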

Kasen developed the computer code to simulate this radiative transfer process, using output from the simulated explosions to produce visualizations that can be compared directly to astronomical observations of supernovae.

The good news is that the variability seen in the computer models agrees with observations of Type Ia supernovae. “Most importantly, the width and peak luminosity of the light curve are correlated in a way that agrees with what observers have found. So the models are consistent with the observations on which the discovery of dark energy was based,” Woosley said.

Another source of variability is that these asymmetric explosions look different when viewed at different angles. This can account for differences in brightness of as much as 20 percent, Kasen said, but the effect is random and creates scatter in the measurements that can be statistically reduced by observing large numbers of supernovae.
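The statistical point here is just the 1/√N behaviour of random errors: per-object scatter of roughly 20 percent shrinks in the sample average as more supernovae are observed. The Monte Carlo sketch below uses illustrative numbers (a 20 percent Gaussian scatter, samples of 100) to show the effect.

```python
import random
import statistics

random.seed(42)

def sample_mean_scatter(n_sne, sigma=0.20, trials=2000):
    """Standard deviation of the mean brightness of n_sne supernovae,
    each drawn with random (viewing-angle-like) scatter sigma about a
    true brightness of 1.0. Shrinks roughly as sigma / sqrt(n_sne)."""
    means = []
    for _ in range(trials):
        obs = [random.gauss(1.0, sigma) for _ in range(n_sne)]
        means.append(statistics.fmean(obs))
    return statistics.stdev(means)

# Averaging 100 supernovae cuts ~20% per-object scatter to ~2%.
print(sample_mean_scatter(1), sample_mean_scatter(100))
```

This is why random viewing-angle differences create harmless scatter, while the metallicity effect discussed next is more worrisome: a systematic bias does not average away no matter how many supernovae are observed.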

The potential for systematic bias comes primarily from variation in the initial chemical composition of the white dwarf star. Heavier elements are synthesized during supernova explosions, and debris from those explosions is incorporated into new stars. As a result, stars formed recently are likely to contain more heavy elements (higher “metallicity,” in astronomers’ terminology) than stars formed in the distant past.

“That’s the kind of thing we expect to evolve over time, so if you look at distant stars corresponding to much earlier times in the history of the universe, they would tend to have lower metallicity,” Kasen said. “When we calculated the effect of this in our models, we found that the resulting errors in distance measurements would be on the order of 2 percent or less.”

Further studies using computer simulations will enable researchers to characterize the effects of such variations in more detail and limit their impact on future dark-energy experiments, which might require a level of precision that would make errors of 2 percent unacceptable.

Source: EurekAlert

26 Replies to “Variability in Type Ia Supernovae Has Implications for Studying Dark Energy”

  1. So type Ia supernovae aren’t quite the ‘standard candles’ they were once thought to be, even after making corrections for elemental abundances (metallicity). Hopefully, these new computer sims will help clarify the observed luminosity range of these SN events. As noted, this will be crucial in teasing out accurate data on the evolution of Dark Energy in the universe and how it may vary over time.

  2. This seems to be the paper (link is to ArXiv preprint):

    @JH: Concerns over just how good Ia SNe are as standard candles can be found in the literature right from Day One; systematic effects are the bane of astronomers’ lives, and they are (or should be) acutely aware of them.

    What I think you’ll find is that the coverage of these objects outside the relevant community tends to downplay or completely ignore this aspect.

    In general, astronomers address systematic effects in two broad ways: they try to identify, characterise, and study them in as much detail as they can; and they try to circumvent them by using independent techniques, seeking multiple consistency checks, etc.

    In the case of using Ias as distance indicators, the independent/consistency channel includes investigations of methods of estimating cosmological distances that are completely independent of Ias (e.g. gravitational lensing, the SZE); wrt dark energy, independent constraints come from observations of the CMB and BAO.

  3. Nereid, thanks for your introduction to the supernova ‘as standard candles’ debate that I first encountered in my frosh year at OSU in 1976. Maybe less-informed readers will look into this well-known (in astronomical circles) effect and learn more about what we know about different types of SNe. Four years of college quickly brought to my attention the “constant standard luminosity” problem at the forefront of distant SNe studies at the time. I do find it encouraging that modern astrophysicists are using all means at their disposal to accurately determine the types and total luminosities of these celestial beacons. I even remember when GRBs were seriously being touted as far more distant ‘standard candles’ a few years back, IIRC mostly by work done by Bradley Schaefer. Too bad that didn’t pan out as hoped. I heartily agree that more robust measures of cosmic distances will probably supersede current methods, as the aforementioned gravitational lensing, SZE, CMB, BAO and gravitational microlensing measurements may provide. Should prove to be a couple of interesting years, though 🙂

  4. Type Ia? Cepheids? I say parallax is the way to go, man. We just need to put two telescopes at either end of the Milky Way and we should be all set 🙂

  5. First let me start my rant by saying that I appreciate the great work of the writers and site maintainers, and appreciate that I can read this content for free, which helps keep some fun in life for me as I cannot watch the intellectual void called TV….

    However, one sentence in this article really bothers me a bit:

    “The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of Type Ia supernovae”

    As far as I understand, Dark Energy and Dark Matter are theories brought about by cosmologists to explain and plug gaps in the mathematical and theoretical models of the big bang and accelerating expansion of the universe where less than 5% of matter/energy can be accounted for by observation alone.

    At this point the jury is still out on whether there is concrete evidence proving the existence of Dark Energy or Matter..

    However, there are interesting observations by Hubble and other telescopes of interacting galaxies such as the Bullet Cluster and others, but at this stage the observational evidence can best be described as anecdotal until different observational methods can confirm the existence of Dark Energy or Matter.

    I personally believe that Dark Energy / Matter will be looked back on in history in much the same light as ‘Luminiferous Aether’ (an imaginary substance in outer space) which was invoked by scientists in the 19th century to explain how light could be exhibiting its properties in the vacuum of outer space.
    The model and conception of space and the forces of light were fundamentally flawed, and no amount of luminiferous aether would have fixed that model until Dr Albert Einstein, Schrodinger, Bohr, Planck and others came along and proved light to be a duality of both energy/wave and particle/matter..
    However, although that theory was later proven wrong, it did help the scientists who came along later to focus their experiments.. This is the foundation of the scientific method… 1. Conjecture/Hypothesis, 2. Theory, 3. Observation & Testing (and then 4. refinement/acceptance/rejection of a theory)

    In the interest of accuracy, if in the abovementioned sentence we could replace

    “The discovery of dark energy, a mysterious force that is accelerating the expansion of the universe, was based on observations of Type Ia supernovae”

    with

    “The creation of the theory of Dark Energy, a mysterious force that is theorised to be accelerating the expansion of the universe, was based on observations of Type Ia supernovae”

    I would be pleased then, as I think there is always a danger in accepting theories too easily without a spirit of genuine, honest questioning… After all, this is how real discoveries and progress are made in any field or endeavour…

    /end rant/


  6. weeasle Says:
    August 12th, 2009 at 7:49 pm

    I agree with what you’re saying to a certain extent, however, there comes a point where prefacing every statement you make in science with a ‘this may or may not be the case’ starts to become a bit tedious. At some point, you have to just accept that everything in science is certain only in so much as it currently provides the best explanation for what we observe – the uncertainty is implicit in any scientific statement of ‘fact’ – there are only degrees of uncertainty.

    For the most part, I would think that almost anybody who visits this site would realise that there is a fair chance that DM, and certainly DE, may prove to be illusory, and a more all-encompassing theory may replace them. Then again, maybe not – perhaps exotic particles can account for DM, and the DE effect is real. Who knows? We’re likely nowhere near resolving that completely yet – we have barely begun to characterise the observational effects of the phenomena we’re trying to describe.

    Parallels can be drawn with the Aether hypothesis, but there are significant differences too. Maxwell’s equations were fundamentally Lorentz covariant, and predicted the speed of light in a vacuum to always be c, no matter what the velocity of the observer. In a way, the answer was staring everyone in the face. The problem was that the consequences of accepting this were so dramatic as to be unthinkable at the time. It’s still unthinkable if you ask me! The universe has a funny way of doing things.

    I’m not sure the resolution to our DM or DE issues will play out simply as an obvious misunderstanding of current theory…

  7. This is an unfortunate but not unexpected finding. A white dwarf which grows in mass from a companion will reach the Chandrasekhar limit at 1.4 M_{sol}. The subsequent implosion is a sort of stellar version of a nuclear fusion bomb. It is my understanding the fusion involves lots of He to C and C to Si and so forth. So the energy released will clearly depend upon the elemental (chemical) composition of the white dwarf.

    It is my hope that DE is not ruled out by this. The existence of what we call DE, or the influence of the cosmological constant, has some fascinating consequences for quantum gravity and cosmology.


  8. @ weeasle:

    Not that I’m an astronomer, but these things interest me of course. Standard cosmology (hey, it’s been ~ 5 years or more without replacement! :-D) is a nifty thing for a layman.

    To the best of my understanding, neither DM nor DE was brought out to plug untestable “gaps” in models; both were suggested as testable hypotheses.

    Take the Bullet Cluster (and now some more clusters) observations, for example, which provide a direct test of DM, albeit with indirect observation. Other hypotheses that match these observations are now very contrived, and must, according to the “Starts With A Bang” astrophysicist, be tailored specifically to each of them. I.e. the alternatives are AFAIU contradictory and so falsified, while DM passed.

    Both DM and DE have passed several such tests, which is why one can (tentatively, of course) adopt them as null hypotheses if one wishes.

    The comparison with aether theories is apt. These failed their first test; DM and DE didn’t. Even more, both DM and DE can be derived from underlying physics just as non-aether relativity could. (From particle theories and field theories, respectively. The cosmological constant/vacuum energy/string theory DE hypothesis was, in Astrofiend’s parlance, “staring everyone in the face”. So it’s not “mysterious” anymore; the mystery would occur if it wasn’t there.)

    Nitpick 1: You are using the everyday/creationist/religious idea of “theory”, while we should be discussing (scientific) theory.

    That is, when you say, correctly, that DE is a “force that is theorised” it means as soon as it is successfully tested in some manner, and it has been (say by using WMAP observations to test standard cosmology models), it is _stronger_. As it then validates _both_ the observations and the theory that both incorporates and predicts them.

    Often, as is the case for both DM and DE, it is with some measure (say the number of permutations of connections to other theory (see above for examples), or something such) _order of magnitudes_ stronger than the mere observational fact.

    Nitpicks 2 & 3: “Duality” is AFAIU of aether theory status, pre-QM ideas that have been replaced, not QM at all. And there was never any “energy/wave” or “particle/matter” such.

    Modern QM in the form of quantum field theory is AFAIU mostly a particle based field theory.

    “In summary, the classical visualisation of “everything is particles and fields”, in quantum field theory, resolves into “everything is particles”, which then resolves into “everything is fields”. … For example, a quantum theory of the electromagnetic field must be a quantum field theory, because it is impossible (for various reasons) to define a wavefunction for a single photon. … Quantum field theory thus provides a unified framework for describing “field-like” objects (such as the electromagnetic field, whose excitations are photons) and “particle-like” objects (such as electrons, which are treated as excitations of an underlying electron field).” [Wikipedia.]

    QFT gloms these objects together instead of separating them into dualities. And even if it didn’t, we would then have to rely on QM wavefunctions and particles, not “wave or particle behavior” as pre-quantum _mechanics_ hypotheses went.

    [And FWIW string theory and putative “particle less” sectors are embedding these in a larger QM setting, without fundamental particles at all.]

    In the end, science is by its competitive nature operating with “a spirit of genuine, honest questioning”. When scientists are suggesting new phenomena, it is against that nature for us others to question these ideas solely based on gut feeling (aka experienced “common sense”, observational science’s worst enemy) and misperceived history.

    But the spirit of this questioning is both admirable and necessary in all, so we shouldn’t buy baseless speculation, honest mistakes or perversions of science either. Kudos for your spirit! But note that such criticism takes some effort and studying to meaningfully produce.

    [Also note that I’m not suggesting that I’m there. For example, QFTs are unknown territory for me.]

  9. My bad, seems my nitpick #2 was invalid of sorts (“that theory was later proven wrong”). Except perhaps that I don’t think duality has been falsified (“proven wrong”) as such, it is just not part of today’s theory.

  10. Interesting discussion!
    I credit Weeasle with an opinion I share: what you 3 guys explained at length here, and Torbjorn summed up best, is THE most important thing that science information should convey to the public, that science writers should write about (they do! but not always…), that science readers should look for.
    Until you have come to recognize these major aspects of scientific knowledge (temporary, collective, open), you only come across apparently contradictory definitive statements, and that’s very frustrating.

    What makes me mad is that mainstream channels never, ever go into this, so science ends up being totally misrepresented.
    Bacteria in a Mars stone make headlines, the falsification thereof doesn’t, and whoever heard of Popper anyway?

  11. For now, I’d like to comment on just this part of what weeasle wrote:

    As far as I understand, Dark Energy and Dark Matter are theories brought about by cosmologists to explain and plug gaps in the mathematical and theoretical models of the big bang and accelerating expansion of the universe where less than 5% of matter/energy can be accounted for by observation alone.

    (bold added)

    Dark Matter has a fascinating, and intricate history!

    It began with Zwicky, in 1933 (or 1934), who discovered that there seemed to be more mass in a galaxy cluster (Coma?) than he could account for, based on the light detected from the galaxies in the cluster.

    Several decades later, x-ray astronomers discovered that indeed rich galaxy clusters – such as Coma – contain far more mass in the hot, diffuse intracluster medium than there is in the constituent galaxies.

    In between, Rubin and Ford discovered that the rotation curves of (normal) spiral galaxies are essentially flat (beyond a modest radius), implying that such galaxies contain an awful lot more mass than appears to be there, based on detection of light (and radio) from stars, gas, and dust.

    One of the very first tasks of the Hubble Space Telescope, several decades later, after it was handed over to scientists, was to look for very faint stars in the halo of our own galaxy, to test the idea that the mass responsible for the flat rotation curves was in the form of such stars … some were found, but far too few to account for the rotation curves.

    … (there’s a very great deal more!) …

    The bottom line: cold, dark (“non-baryonic”) matter – Dark Matter in short – is spectacularly successful in accounting for a great many (millions) of observations, of many different objects (millions), of several different classes (dozens), over enormous distance (and time) scales (from a few thousand parsecs to approx 14 billion years).

    When such astounding success happens in branches of science over which we have more direct control, almost no one baulks at calling the conclusions ‘fact’ (think of neutrinos, for example). And even in branches of science that have limitations similar to astronomy’s – geology say – such sweeping success is also called ‘fact’ (think of the Permian-Triassic mass extinction event – “the Great Dying” – for example). Why is CDM different? And where do the wildly inaccurate one-liners come from (I’m not attacking, or even complaining about, weeasle; the comment is, sadly, very common)?

  12. @Nereid, belated thanks for that earlier link to the relevant preprint that was eventually published in Nature. The authors point out limitations and possible (mis)interpretations in using Type Ia SNe as the only means of determining distances to objects far from us (as you have pointed out in your posts above, several other methods are used for this determination). Kudos for that last post that succinctly describes how astronomers came to realize the ‘missing mass’ problems that initially cropped up in detailed studies of galactic rotation curves and the problems that arose with ‘gravitationally bound’ members of galaxy clusters.

    And your keen observations on the downright rejection of the existence of DM in light of other ‘remarkable’ events are right on the money, IMHO. It is indeed sad that a great many posters here on UT hold that DM/DE research is a waste of time & money without having a deep understanding of how these two great ‘unknowns’ were finally regarded as real phenomena. Maybe the word ‘dark’ in these two terms convinces some that this is all smoke-and-mirrors designed to protect the status quo of ‘modern astronomy’.

    And who knows, Astrofiend may be on the right track after all πŸ™‚ These topics remain the great unknowns in modern astronomy and will probably continue to be so in the future.

  13. These results will not particularly challenge the existence of dark matter. That is a more local gravitational effect, and there is lots of evidence for DM by other means. This issue might have some bearing on the existence of dark energy. The accelerated expansion of the universe is calibrated with SNe Ia based on their regular luminosities.


  14. Wow! I get the feeling a few of you here must work in observatories or some more serious science-related field for your day jobs… You all probably noticed that I was trolling a bit there, but in a positive way, i.e. to elicit informative responses from such an educated group as opposed to just wanting to reinforce any of my own pet theories or notions…

    A personal issue I was having was accepting placeholders as theory without solid observational evidence (from more than 1 or 2 distinct fields or methods, i.e. not just telescope observations but it would be nice to have the particle physics people discover something that reinforces the idea).

    I accept, Nereid, you are right in saying that when all the evidence points one direction we generally accept that as ‘fact’, however I do feel there’s always room for a little honest questioning.. If it’s not easy to explain to a child then maybe the understanding and explanation could use further refinement 😉

    I guess DM and DE are similar in many ways to black holes; many people refused to accept they existed because it’s hard to observe something completely black that emits nothing… Indirect evidence from studies was compelling but not the ‘final proof’ to many people..

    However, most people generally accept black holes exist now that our telescopes and detectors have become orders of magnitude more powerful & sensitive.. We can see all the stuff swirling around ‘something’ which most accept are likely BH’s… Plus we now have particle accelerators smashing bits and observing particle showers and inferring micro-BH’s should soon be observable when the larger accelerator rings come online..

    Personally, I will wait until DM can be borne out by further observation, and in the case of DE would prefer to see some mathematical demonstration derived by an experiment here on earth.. i.e. some repeatable experiment in particle physics which demonstrates how these phenomena could exist (micro wormholes/BHs, etc)… I believe with an expanded understanding of the micro we can make more accurate assumptions about the macro and vice-versa..

    Thanks for all your comments, everyone.. I often enjoy reading the comments as much as the stories on this site 😉

  15. That was a nice little exchange. It is so refreshing to have a civilized discussion about the current state of play in regards to DM and DE, and the various merits and possible issues with these theories. A reasonable dose of scientific skepticism followed by an interesting discussion.

    Contrast this approach and outcome to the usual efforts of those who usually hijack such threads with their inane ramblings and crayon-on-the-wall physics.

    As I said – refreshing.

  16. @ weeasle

    We will see, where DM and DE will lead us. But maybe in a few years we will have direct proof of black holes. AFAIK it won’t be long till we have the resolution to resolve the event horizon of the black hole in Sgr A* (the central BH of the Milky Way). Then we will know for sure!

    @ Astrofiend:

    Indeed, although I didn’t participate, it was a real joy to read this thread!

  17. I’d like to go back to what weeasle wrote, and throw out some thoughts for further discussion …

    Dark Energy and Dark Matter are theories brought about by cosmologists to explain and plug gaps in the mathematical and theoretical models of the big bang and accelerating expansion of the universe where less than 5% of matter/energy can be accounted for by observation alone.

    Here I go …

    * in astrophysics – indeed, the whole of physics – is there any *theory* that is not mathematical? IOW, as all theories are mathematical, isn’t “mathematical” redundant?

    * in physics, are there any models which are not built from theories? IOW, isn’t “theoretical” redundant?

    * in physics, for relationships which are not yet derived from theoretical first principles, is there anything which is not mathematical? In astronomy, to take one example of such a relationship, think of the Tully-Fisher relationship; or, in physics, think of Röser’s equation.

    * other than observations you make using your unaided vision, are there any astronomical observations which are completely independent of models (and theories)? An extreme example might be TeV gamma radiation from the Crab nebula observed by an imaging atmospheric Cherenkov telescope (such as H.E.S.S.), but even a nice picture of the Moon shot with a modern digital camera is just as good an example.

    * Certainly from the time of Newton, hasn’t our understanding of how the universe works – through the lens of physics – been entirely built on mathematical theories and models?

    * One of my favourite examples of just how dramatically our understanding can change: gravity. Newton’s theory (of universal gravitation) worked perfectly well up until the mid-1800s, when the orbit of Mercury seemed to deviate from that predicted using Newton’s theory, by tiny, tiny amounts. And today, for everything other than astronomy (and your GPS), Newton’s theory still works perfectly; for some aspects of astronomy (and your GPS) it doesn’t … to get perfection you need to use General Relativity (GR). Here’s the kicker: in Newton’s theory, gravity is a force, an instantaneous action at a distance; in GR, gravity is geometry … and forces and geometry could not possibly be more different, could they?

    * who wants to take a shot at re-writing the words I quoted, with “quarks” instead of DM+DE? Of course, you’ll have to replace big bang, but see if you can keep “theories brought about by … to explain and plug gaps in …”.

    Maybe, one day, “Dark Matter” will be an esoteric tweak to GR (and not “matter” at all); however, the millions of observations explainable by using “Dark Matter” will remain just as explained … just as Cassini’s orbit, slingshotting around Saturn and Titan, remains explainable in terms of the gravitational force of those two on the spaceprobe, despite the fact that, in GR, gravity is geometry.

  18. Nereid, you want us to turn to philosophy, don’t you?
    Since it’s late and there is a beautiful clear sky out there, only a short note:
    I wonder what an experimentalist would say about the claim that everything in physics is theoretical 😉 .
    However, you are right, of course. Physicists always try to put everything into a nice beautiful mathematical formula. So physics is mathematical “by definition”, and it is right to be so, because mathematics is the one thing we can trust to always hold. And not just in time, but also in space. Mathematics is true everywhere.
    If there are aliens out there and they are clever enough to think about mathematics, they will find the same rules.
    That is useful in physics. We have a ground upon which we can build our theories. We assume that all our physical rules hold in the entire universe. At least the ground does – a good start, I’d say.

  19. “The miracle of the appropriateness of the language of mathematics for the formulation of the laws of physics is a wonderful gift which we neither understand nor deserve. We should be grateful for it and hope that it will remain valid in future research and that it will extend, for better or for worse, to our pleasure, even though perhaps also to our bafflement, to wide branches of learning.”

    Eugene Wigner, of course, in his famous article “The Unreasonable Effectiveness of Mathematics in the Natural Sciences” (see Wikip).

    Nereid: interesting example about gravitation. It goes further back: Descartes developed his idea of space-filling ‘vortices’ (is it the right English word? ‘tourbillons’ in French) because he abhorred the idea of distant action.
    Newton and Maxwell sent the swing back to distant action, Einstein and quantum physics bring it again to Descartes’ side.
    We haven’t seen the end of the swinging, I guess.

  20. I guess there’s a philosophical aspect, and I certainly enjoy a good science/philosophy discussion!

    However, I also want to draw attention to the fact (hah!) that so much of what we treat as fact (hah squared!!) is, in fact (well, you get the idea), merely a really brief, math-based, description that has astonishing explanatory and predictive powers (thanks Manu, Wigner is *exactly* one of the sources I had in mind) …

    If one is interested only (or mostly) in ‘observations’ (or, more generally, observables), then who cares if gravity is an instantaneous action at a distance, geometry, or mischievous invisible pink fairies (provided it’s 100% reliable, objective, repeatable, independently verifiable, etc, etc, etc)?

    In this respect, what’s the difference between CDM and quarks (say)?

  21. I guess there’s a philosophical aspect, and I certainly enjoy a good science/philosophy discussion!

    Oh, yes! As a physicist you have to be a little philosopher, especially if you enjoy the extremes of physics (cosmology and QM). And sometimes it is worth the effort to start such philosophical discussions about physics. I think, in some ways, it brings us back down to earth. πŸ˜‰

    In some cases you are right, Nereid, that our facts are, in fact (I like the way you said it πŸ˜‰ ), mathematical descriptions that seem to give correct interpretations of what’s going on. CDM is such a case.
    On the other hand, there are cases where the facts have gone beyond a “pure mathematical” description. Quarks, to use your example, are real objects. At least there are objects in our universe that behave as what we mathematically describe as quarks. And these objects have revealed themselves, we can see and detect them, we can almost put our hand on them. So things are, in fact, real!
    And I think, this is different from what we can say about CDM.

    Concerning “observation”.
    That is the point: physicists not only want to observe, they want to predict.
    I can observe that the Moon orbits the Earth in about 28 days. This meets your mentioned criteria and I need not care why it does so – it could be due to pink unicorns or geometry, it doesn’t matter. That would be “pure observation”.
    Of course, I can also make the prediction that the moon will be at (almost) the same position again in 28 days.
    But this is not what we want. Because our knowledge that the moon will be back again in 28 days does not apply to Mars (e.g.). We will see that Mars reappears in so-and-so-many days (sorry, I don’t know and don’t want to look up). So, probably the pink unicorns are at work here, too, and let Mars reappear after a longer time-period.
    That would be “observation-only”, somehow like Anaconda wishes it πŸ˜‰ .

    To make predictions we not only need to know when something will reappear in the skies, but also why. We could assume, like Newton did (or Kepler before him), that both observations could be due to the same cause. And what we find is a beautiful law that only depends on the distance, and we can apply it to other objects in the solar system and see, well, probably there is something that governs the motion of the planets. Let’s call it gravity (we could name it “pink unicorns”, but why?).
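    The “beautiful law that only depends on the distance” can be put into a few lines of code. This is a minimal sketch of Kepler’s third law, T² ∝ a³, in solar units (years and AU), where the proportionality constant for bodies orbiting the Sun is 1; the function name and the chosen planets are mine, not from the discussion:

    ```python
    # Kepler's third law for the Sun's planets: T^2 = a^3 when
    # T is measured in years and a (the orbital radius) in AU.
    def period_years(a_au: float) -> float:
        """Orbital period in years for an orbit of semi-major axis a_au (AU)."""
        return a_au ** 1.5

    # One law, one parameter (distance), applied to several objects:
    for name, a in [("Earth", 1.0), ("Mars", 1.524), ("Jupiter", 5.203)]:
        print(f"{name}: {period_years(a):.2f} years")
    # Mars comes out at about 1.88 years (~687 days)
    ```

    The point of the exercise is exactly the one made above: the same single-parameter law covers the Moon’s 28 days, Mars’s reappearance, and everything else in the solar system, with no unicorns required.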

    Probably there is some sense in these many words of mine.
    In short, there is a difference between pure observation, prediction, pure mathematics and (well) observation.
    Pure things won’t do anything. Pure observation without the power of forecast is as useless as pure mathematical works without any observation that shows if the work has anything to do with (what we call) reality.
    There is a difference between CDM and quarks, as I might have shown.
    As a little prediction ( πŸ˜‰ ): The future will tell, if this difference can be overcome….

  22. DrFlimmer: interesting example, that.
    “Quarks, to use your example, are real objects. At least there are objects in our universe that behave as what we mathematically describe as quarks. And these objects have revealed themselves, we can see and detect them, we can almost put our hand on them.”
    Well I’m not so sure…
    You’re aware, of course, that quarks were in the first place a mathematical concept which was helpful in building the Standard Model, and that their inventors (Gell-Mann and Zweig) were quite surprised to actually ‘see’ them come to some reality.
    Still, how ‘real’ is a particle that is, precisely, impossible to detect *alone*? That in fact can’t *exist* alone?
    We can only see them in twos or threes, and have to indirectly deduce their existence from observations of these clusters.
    The same goes in fact for much of modern physics.
    I think the more we delve down into the elementary, the more blurred familiar concepts such as *reality* become, and the more *real* abstract math concepts become: they might well be one and the same, at the bottom (provided there is a bottom).

    I for one dream of an ultimate physics where the elementary object, whatever it is, and the law that governs it, are one and only thing!
    Does this even *mean* anything? πŸ˜‰

  23. You have earned a point, Manu.

    Still, they are there. As are “virtual particles”. The proton has a mass of about 938 MeV (assuming c=1, natural units – Damn, I hate them!). The three valence quarks add up to a mass of only about 10 MeV. The rest is in relativistic effects, gluons and in virtual particles that just pop into and out of existence.
    That those quantum fluctuations are real is shown by the “Casimir effect”: two plates (neutral, of course) close together in a vacuum feel an “attraction” towards each other, because more virtual-particle modes are possible outside than between the plates. This is crazy and weird, but it is experimentally verified.
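    As an aside, for ideal parallel plates the Casimir attraction even has a closed-form expression, P = π²ħc/(240 d⁴). A minimal sketch (standard CODATA constants; the idealized perfectly-conducting-plate formula, not a model of any real experiment):

    ```python
    import math

    HBAR = 1.054_571_817e-34  # reduced Planck constant, J*s
    C = 2.997_924_58e8        # speed of light, m/s

    def casimir_pressure(d: float) -> float:
        """Attractive pressure (Pa) between ideal parallel plates a distance d (m) apart."""
        return math.pi**2 * HBAR * C / (240 * d**4)

    # At a 1-micron separation the pressure is tiny (~1.3 mPa),
    # yet it has been measured in the lab.
    print(casimir_pressure(1e-6))
    ```

    The steep 1/d⁴ dependence is why the effect only shows up when the plates are brought extremely close together.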
    So what is really real and what is possibly real? And when do we say we “found” quarks and CDM and when do we say these “theoretical” objects can possibly explain what’s happening in this strange universe?
    Now, you (and others) again πŸ˜‰

    Btw: Your dream is the same that everyone dreams of: a grand unified theory, or GUT for short.
    It is my opinion that the solution should be quite simple – maybe we don’t have the (mathematical) tools and the skills to find it, but maybe we will…… and maybe we won’t….
