Special Relativity. It’s been the bane of space explorers, futurists and science fiction authors since Albert Einstein first proposed it in 1905. For those of us who dream of humanity one day becoming an interstellar species, this scientific fact is like a wet blanket. Luckily, a few theoretical concepts have been proposed that suggest Faster-Than-Light (FTL) travel might still be possible someday.
A popular example is the idea of a wormhole: a speculative structure that links two distant points in spacetime and could enable interstellar space travel. Recently, a team of Ivy League scientists conducted a study indicating how “traversable wormholes” could actually be a reality. The bad news is that their results suggest these wormholes aren’t exactly shortcuts, and could be the cosmic equivalent of “taking the long way”!
In April of 2016, Russian billionaire Yuri Milner announced the creation of Breakthrough Starshot. As part of his non-profit scientific organization (known as Breakthrough Initiatives), the purpose of Starshot was to design a lightsail nanocraft that would be capable of achieving speeds of up to 20% the speed of light and reaching the nearest star system – Alpha Centauri (aka. Rigil Kentaurus) – within our lifetimes.
At this speed – roughly 60,000 km/s (37,282 mi/s) – the probe would be able to reach Alpha Centauri in 20 years, where it could then capture images of the star and any planets orbiting it. But according to a recent article by Professor Bing Zhang, an astrophysicist from the University of Nevada, researchers could get all kinds of valuable data from Starshot and similar concepts long before they ever reached their destination.
To recap, Breakthrough Starshot seeks to leverage recent technological developments to mount an interstellar mission that will reach another star within a single generation. The spacecraft would consist of an ultra-light nanocraft and a lightsail, the latter of which would be accelerated by a ground-based laser array up to a significant fraction of the speed of light (the stated goal being 20% of c).
Such a system would allow the tiny spacecraft to conduct a flyby mission of Alpha Centauri in about 20 years after it is launched, which could then beam home images of possible planets and other scientific data (such as analysis of magnetic fields). Recently, Breakthrough Starshot held an “industry day” where they submitted a Request For Proposals (RFP) to potential bidders to build the laser sail.
According to Zhang, a lightsail-driven nanocraft traveling at a fraction of the speed of light would also be a good way to test Einstein’s theory of Special Relativity. Simply put, the theory states that the speed of light in a vacuum is constant, regardless of the inertial reference frame or the motion of the source. In short, such a spacecraft would be able to take advantage of the features of Special Relativity and provide a new mode of astronomy.
Based on Einstein’s theory, different objects in different “rest frames” would have different measures of the lengths of space and time. In this sense, an object moving at relativistic speeds would view distant astronomical objects differently as light emissions from these objects would be distorted. Whereas objects in front of the spacecraft would have the wavelength of their light shortened, objects behind it would have them lengthened.
This phenomenon, known as the “Doppler Effect”, results in light being shifted towards the blue end (“blueshift”) or the red end (“redshift”) of the spectrum for approaching and retreating objects, respectively. In 1929, astronomer Edwin Hubble used redshift measurements to determine that distant galaxies were moving away from our own, thus demonstrating that the Universe was in a state of expansion.
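For the curious, the relativistic Doppler shift behind this blueshift/redshift picture can be sketched in a few lines of Python. This is a minimal illustration using the textbook formula; the 20% of c figure is Starshot’s target speed mentioned above:

```python
import math

def doppler_factor(beta: float) -> float:
    """Relativistic Doppler factor for a source (or camera) approaching
    at beta = v/c. Observed frequency = rest frequency * factor, and
    observed wavelength = rest wavelength / factor. A factor > 1 means
    blueshift; pass a negative beta for recession (redshift)."""
    return math.sqrt((1 + beta) / (1 - beta))

beta = 0.20  # Starshot's target speed, 20% of c
print(f"Approaching at 0.2c: factor {doppler_factor(beta):.3f} (blueshift)")
print(f"Receding at 0.2c:    factor {doppler_factor(-beta):.3f} (redshift)")
```

At 0.2c the factor is about 1.22, so wavelengths of objects ahead of the craft are shortened by roughly 18%, while those behind are stretched by the same factor.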
Because of this expansion (known as the Hubble Expansion), much of the light in the Universe is redshifted and only measurable in difficult-to-observe infrared wavelengths. But for a camera moving at relativistic speeds, according to Prof. Zhang, this redshifted light would become bluer since the motion of the camera would counteract the effects of cosmic expansion.
This effect, known as “Doppler boosting”, would cause the faint light from the early Universe to be amplified and allow distant objects to be studied in more detail. In this respect, astronomers would be able to study some of the earliest objects in the known Universe, which would offer more clues as to how it evolved over time. As Prof. Zhang explained to Universe Today via email, this would allow for some unique opportunities to test Special Relativity:
“In the rest frame of the camera, the emission of the objects in the hemisphere of the camera motion is blue-shifted. For bright objects with detailed spectral observations from the ground, one can observe them in flight. By comparing their blue-shifted flux at a specific blue-shifted frequency with the flux of the corresponding (de-blueshifted) frequency on the ground, one can precisely test the Doppler boosting prediction in Special Relativity.”
In addition, the frequency and intensity of light – and the apparent size of distant objects – would change from the observer’s perspective. In this respect, the camera would act as both a lens and a wide-field camera, magnifying the light it collects and letting astronomers observe more objects within the same field of view. By comparing the observations collected in flight to those collected from the ground, astronomers could also measure the probe’s Lorentz factor.
This factor indicates how time, length, and relativistic mass change for an object while that object is moving, which is another prediction of Special Relativity. Last, but not least, Prof. Zhang indicates that probes traveling at relativistic speeds would not need to be sent to any specific destination in order to conduct these tests. As he explained:
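The Lorentz factor itself is simple to evaluate. A minimal Python sketch, again using Starshot’s 20% of c as the example speed:

```python
import math

def lorentz_factor(beta: float) -> float:
    """Gamma = 1 / sqrt(1 - beta^2), where beta = v/c."""
    return 1.0 / math.sqrt(1.0 - beta * beta)

gamma = lorentz_factor(0.20)
print(f"Lorentz factor at 0.2c: {gamma:.4f}")
# Time dilation: a cruise that takes 20 years in Earth's frame
# lasts slightly less as measured on board the probe.
print(f"Onboard duration of a 20-year trip: {20 / gamma:.2f} years")
```

At 20% of c the effect is small (gamma ≈ 1.02), which is why such a test would require comparing in-flight and ground observations very precisely.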
“The concept of “relativistic astronomy” is that one does not really need to send the cameras to specific star systems. No need to aim (e.g. to Alpha Centauri system), no need to decelerate. As long as the signal can be transferred back to earth, one can learn a lot of things. Interesting targets include high-redshift galaxies, active galactic nuclei, gamma-ray bursts, and even electromagnetic counterparts of gravitational waves.”
However, there are some drawbacks to this proposal. For starters, the technology behind Starshot is all about accomplishing the dream of countless generations – i.e. reaching another star system (in this case, Alpha Centauri) – within a single generation.
And as Professor Abraham Loeb – the Frank B. Baird Jr. Professor of Science at Harvard University and the Chair of the Breakthrough Starshot Advisory Committee – told Universe Today via email, what Prof. Zhang is proposing can be accomplished by other means:
“Indeed, there are benefits to having a camera move near the speed of light toward faint sources, such as the most distant dwarf galaxies in the early universe. But the cost of launching a camera to the required speed would be far greater than building the next generation of large telescopes which will provide us with a similar sensitivity. Similarly, the goal of testing special relativity can be accomplished at a much lower cost.”
Of course, it will be many years before a project like Starshot can be mounted, and many challenges need to be addressed first. But it is exciting to know that scientific applications can be found for such a mission that go beyond exploration. In a few decades, when the mission begins its journey to Alpha Centauri, perhaps it will also be able to conduct tests of Special Relativity and other physical laws while in transit.
Since ancient times, philosophers and scholars have sought to understand light. In addition to trying to discern its basic properties (i.e. what is it made of – particle or wave, etc.) they have also sought to make finite measurements of how fast it travels. Since the late-17th century, scientists have been doing just that, and with increasing accuracy.
In so doing, they have gained a better understanding of light’s mechanics and the important role it plays in physics, astronomy and cosmology. Put simply, light moves at incredible speeds and is the fastest moving thing in the Universe. Its speed is considered a constant and an unbreakable barrier, and is used as a means of measuring distance. But just how fast does it travel?
Speed of Light (c):
Light travels at a constant speed of 1,079,252,848.8 (roughly 1.08 billion) km per hour. That works out to 299,792,458 m/s, or about 670,616,629 mph. To put that in perspective, if you could travel at the speed of light, you would be able to circumnavigate the globe approximately seven and a half times in one second. Meanwhile, a person flying at an average speed of about 800 km/h (500 mph) would take over 50 hours to circle the planet just once.
To put that into an astronomical perspective, the average distance from the Earth to the Moon is 384,398.25 km (238,854 miles). So light crosses that distance in a little over a second. Meanwhile, the average distance from the Sun to the Earth is ~149,597,886 km (92,955,817 miles), which means that light takes only about 8 minutes to make that journey.
Little wonder, then, why the speed of light is the metric used to determine astronomical distances. When we say a star like Proxima Centauri is 4.25 light-years away, we are saying that it would take – traveling at a constant speed of roughly 1.08 billion km per hour (670,616,629 mph) – about 4 years and 3 months to get there. But just how did we arrive at this highly specific measurement for “light-speed”?
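These light travel times follow directly from dividing distance by c. A quick Python check using the figures quoted above:

```python
C_KM_S = 299_792.458  # speed of light in km/s

distances_km = {
    "Earth-Moon (average)": 384_398.25,
    "Sun-Earth (average)": 149_597_886.0,
}

# Light travel time is simply distance divided by c.
for label, d in distances_km.items():
    seconds = d / C_KM_S
    print(f"{label}: {seconds:.2f} s ({seconds / 60:.2f} min)")
```

This gives about 1.3 seconds to the Moon and about 8.3 minutes from the Sun, matching the round figures above.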
History of Study:
Until the 17th century, scholars were unsure whether light traveled at a finite speed or instantaneously. From the days of the ancient Greeks to medieval Islamic scholars and scientists of the early modern period, the debate went back and forth. It was not until the work of Danish astronomer Ole Rømer (1644-1710) that the first quantitative measurement was made.
In 1676, Rømer observed that the periods of Jupiter’s innermost moon Io appeared to be shorter when the Earth was approaching Jupiter than when it was receding from it. From this, he concluded that light travels at a finite speed, and estimated that it takes about 22 minutes to cross the diameter of Earth’s orbit.
Christiaan Huygens combined this estimate with an estimate of the diameter of the Earth’s orbit to obtain a value of 220,000 km/s. Isaac Newton also spoke about Rømer’s calculations in his seminal work Opticks (1704). Adjusting for the distance between the Earth and the Sun, he calculated that it would take light seven or eight minutes to travel from one to the other. In both cases, they were off by a relatively small margin.
Later measurements made by French physicists Hippolyte Fizeau (1819 – 1896) and Léon Foucault (1819 – 1868) refined these measurements further – resulting in a value of 315,000 km/s (about 195,700 mi/s). And by the latter half of the 19th century, scientists became aware of the connection between light and electromagnetism.
This was accomplished by physicists who measured the ratio of electromagnetic and electrostatic units of charge, and found that the numerical value was very close to the speed of light (as measured by Fizeau). Based on his own work, which showed that electromagnetic waves propagate through empty space at this same speed, Scottish physicist James Clerk Maxwell proposed that light was an electromagnetic wave.
The next great breakthrough came during the early 20th century. In his 1905 paper, titled “On the Electrodynamics of Moving Bodies”, Albert Einstein asserted that the speed of light in a vacuum, measured by a non-accelerating observer, is the same in all inertial reference frames and independent of the motion of the source or observer.
Using this and Galileo’s principle of relativity as a basis, Einstein derived the Theory of Special Relativity, in which the speed of light in vacuum (c) was a fundamental constant. Prior to this, the working consensus among scientists held that space was filled with a “luminiferous aether” that was responsible for its propagation – i.e. that light traveling through a moving medium would be dragged along by the medium.
This in turn meant that the measured speed of the light would be a simple sum of its speed through the medium plus the speed of that medium. However, Einstein’s theory effectively made the concept of the stationary aether useless and revolutionized the concepts of space and time.
Not only did it advance the idea that the speed of light is the same in all inertial reference frames, it also introduced the idea that major changes occur when things move close to the speed of light. These include the time-space frame of a moving body appearing to slow down and contract in the direction of motion when measured in the frame of the observer (i.e. time dilation, where time slows as an object approaches the speed of light).
His observations also reconciled Maxwell’s equations for electricity and magnetism with the laws of mechanics, simplified the mathematical calculations by doing away with extraneous explanations used by other scientists, and accorded with the directly observed speed of light.
During the second half of the 20th century, increasingly accurate measurements using laser interferometers and cavity resonance techniques would further refine estimates of the speed of light. By 1972, a group at the US National Bureau of Standards in Boulder, Colorado, used laser interferometry to arrive at the currently-recognized value of 299,792,458 m/s.
Role in Modern Astrophysics:
Einstein’s theory that the speed of light in vacuum is independent of the motion of the source and the inertial reference frame of the observer has since been consistently confirmed by many experiments. It also sets an upper limit on the speeds at which all massless particles and waves (which includes light) can travel in a vacuum.
One of the outgrowths of this is that cosmologists now treat space and time as a single, unified structure known as spacetime – in which the speed of light can be used to define values for both (i.e. “lightyears”, “light minutes”, and “light seconds”). The measurement of the speed of light has also become a major factor when determining the rate of cosmic expansion.
Beginning in the 1920s with the observations of Lemaître and Hubble, scientists and astronomers became aware that the Universe is expanding from a point of origin. Hubble also observed that the farther away a galaxy is, the faster it appears to be moving. In what is now referred to as the Hubble Parameter, the speed at which the Universe is expanding is calculated to be about 68 km/s per megaparsec.
This phenomenon, which has been theorized to mean that some galaxies could actually be moving faster than the speed of light, may place a limit on what is observable in our Universe. Essentially, galaxies receding faster than the speed of light would cross a “cosmological event horizon”, beyond which they are no longer visible to us.
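Hubble’s law itself is a one-line calculation. A rough Python sketch using the 68 km/s per megaparsec value quoted above; the “Hubble radius” computed below is the distance at which the recession speed formally reaches c:

```python
C_KM_S = 299_792.458   # speed of light, km/s
H0 = 68.0              # Hubble parameter, km/s per megaparsec

def recession_velocity(distance_mpc: float) -> float:
    """Hubble's law: v = H0 * d, in km/s."""
    return H0 * distance_mpc

# Distance at which the recession velocity formally equals c:
hubble_radius_mpc = C_KM_S / H0
print(f"Recession velocity at 100 Mpc: {recession_velocity(100):,.0f} km/s")
print(f"Hubble radius: {hubble_radius_mpc:,.0f} Mpc "
      f"(~{hubble_radius_mpc * 3.262 / 1000:.1f} billion light-years)")
```

The resulting radius, roughly 14 billion light-years, marks the rough boundary beyond which recession becomes superluminal in this simple picture.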
Also, by the 1990s, redshift measurements of distant galaxies showed that the expansion of the Universe has been accelerating for the past few billion years. This has led to theories like “Dark Energy”, where an unseen force is driving the expansion of space itself instead of objects moving through it (thus not placing constraints on the speed of light or violating relativity).
Along with special and general relativity, the modern value of the speed of light in a vacuum has gone on to inform cosmology, quantum physics, and the Standard Model of particle physics. It remains a constant when talking about the upper limit at which massless particles can travel, and remains an unachievable barrier for particles that have mass.
Perhaps, someday, we will find a way to exceed the speed of light. While we have no practical ideas for how this might happen, the smart money seems to be on technologies that will allow us to circumvent the laws of spacetime, either by creating warp bubbles (aka. the Alcubierre Warp Drive), or tunneling through it (aka. wormholes).
Until that time, we will just have to be satisfied with the Universe we can see, and to stick to exploring the part of it that is reachable using conventional methods.
At the end of the millennium, Physics World magazine conducted a poll in which they asked 100 of the world’s leading physicists who they considered to be the top 10 greatest scientists of all time. The number one scientist they identified was Albert Einstein, with Sir Isaac Newton coming in second. Beyond being the most famous scientist who ever lived, Albert Einstein is also a household name, synonymous with genius and endless creativity.
As the discoverer of Special and General Relativity, Einstein revolutionized our understanding of time, space, and the universe. This discovery, along with the development of quantum mechanics, effectively brought to an end the era of Newtonian physics and gave rise to the modern age. Whereas the previous two centuries had been characterized by universal gravitation and fixed frames of reference, Einstein helped usher in an age of uncertainty, black holes and “spooky action at a distance”.
We’ve come a long way in 13.8 billion years; but despite our impressively extensive understanding of the Universe, there are still a few strings left untied. For one, there is the oft-cited disconnect between general relativity, the physics of the very large, and quantum mechanics, the physics of the very small. Then there is the problematic fate of a particle’s intrinsic information after it falls into a black hole. Now, a new interpretation of fundamental physics attempts to solve both of these conundrums by making a daring claim: at certain scales, space and time simply do not exist.
Let’s start with something that is not in question. Thanks to Einstein’s theory of special relativity, we can all agree that the speed of light is constant for all observers. We can also agree that, if you’re not a photon, approaching light speed comes with some pretty funky rules – namely, anyone watching you will see your length compress and your watch slow down.
But the slowing of time also occurs near gravitationally potent objects, which are described by general relativity. So if you happen to be sight-seeing in the center of the Milky Way and you make the regrettable decision to get too close to our supermassive black hole’s event horizon (more sinisterly known as its point-of-no-return), anyone observing you will also see your watch slow down. In fact, he or she will witness your motion toward the event horizon slow dramatically over an infinite amount of time; that is, from your now-traumatized friend’s perspective, you never actually cross the event horizon. You, however, will feel no difference in the progression of time as you fall past this invisible barrier, soon to be spaghettified by the black hole’s immense gravity.
So, who is “correct”? Relativity dictates that each observer’s point of view is equally valid; but in this situation, you can’t both be right. Do you face your demise in the heart of a black hole, or don’t you? (Note: This isn’t strictly a paradox, but intuitively, it feels a little sticky.)
And there is an additional, bigger problem. A black hole’s event horizon is thought to give rise to Hawking radiation, a kind of escaping energy that will eventually lead to both the evaporation of the black hole and the destruction of all of the matter and energy that was once held inside of it. This concept has black hole physicists scratching their heads. Because according to the laws of physics, all of the intrinsic information about a particle or system (namely, the quantum wavefunction) must be conserved. It cannot just disappear.
Why all of these bizarre paradoxes? Because black holes exist in the nebulous space where a singularity meets general relativity – fertile, yet untapped ground for the elusive theory of everything.
Enter two interesting, yet controversial concepts: doubly special relativity and gravity’s rainbow.
Just as the speed of light is a universally agreed-upon constant in special relativity, so is the Planck energy in doubly special relativity (DSR). In DSR, this value (1.22 × 10¹⁹ GeV) is the maximum energy (and thus, the maximum mass) that a particle can have in our Universe.
Two important consequences of DSR’s maximum energy value are minimum units of time and space. That is, regardless of whether you are moving or stationary, in empty space or near a black hole, you will agree that classical space breaks down at distances shorter than the Planck length (1.6 × 10⁻³⁵ m) and classical time breaks down at moments briefer than the Planck time (5.4 × 10⁻⁴⁴ sec).
In other words, spacetime is discrete. It exists in indivisible (albeit vanishingly small) units. Quantum below, classical above. Add general relativity into the picture, and you get the theory of gravity’s rainbow.
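The Planck scales quoted above are not arbitrary; they follow from combining the reduced Planck constant, Newton’s gravitational constant, and the speed of light. A quick Python derivation using standard CODATA-style values:

```python
import math

HBAR = 1.054_571_817e-34   # reduced Planck constant, J*s
G    = 6.674_30e-11        # gravitational constant, m^3 kg^-1 s^-2
C    = 2.997_924_58e8      # speed of light, m/s
EV_PER_JOULE = 1 / 1.602_176_634e-19

planck_length = math.sqrt(HBAR * G / C**3)   # ~1.6e-35 m
planck_time   = planck_length / C            # ~5.4e-44 s
planck_energy = math.sqrt(HBAR * C**5 / G)   # joules; ~1.22e19 GeV

print(f"Planck length: {planck_length:.3e} m")
print(f"Planck time:   {planck_time:.3e} s")
print(f"Planck energy: {planck_energy * EV_PER_JOULE / 1e9:.3e} GeV")
```

Each quantity is the unique combination of those three constants with the right dimensions, which is why physicists treat them as the natural scales where quantum gravity should matter.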
Physicists Ahmed Farag Ali, Mir Faizal, and Barun Majumder believe that these theories can be used to explain away the aforementioned black hole conundrums – both your controversial spaghettification and the information paradox. How? According to DSR and gravity’s rainbow, in regions smaller than 1.6 × 10⁻³⁵ m and at times shorter than 5.4 × 10⁻⁴⁴ sec… the Universe as we know it simply does not exist.
“In gravity’s rainbow, space does not exist below a certain minimum length, and time does not exist below a certain minimum time interval,” explained Ali, who, along with Faizal and Majumder, authored a paper on this topic that was published last month. “So, all objects existing in space and occurring at a time do not exist below that length and time interval [which are associated with the Planck scale].”
Luckily for us, every particle we know of, and thus every particle we are made of, is much larger than the Planck length and endures for much longer than the Planck time. So – phew! – you and I and everything we see and know can go on existing. (Just don’t probe too deeply.)
The event horizon of a black hole, however, is a different story. After all, the event horizon isn’t made of particles. It is pure spacetime. And according to Ali and his colleagues, if you could observe it on extremely short time or distance scales, it would cease to have meaning. It wouldn’t be a point-of-no-return at all. In their view, the paradox only arises when you treat spacetime as continuous – without minimum units of length and time.
“As the information paradox depends on the existence of the event horizon, and an event horizon like all objects does not exist below a certain length and time interval, then there is no absolute information paradox in gravity’s rainbow. The absence of an effective horizon means that there is nothing absolutely stopping information from going out of the black hole,” concluded Ali.
No absolute event horizon, no information paradox.
And what of your spaghettification within the black hole? Again, it depends on the scale at which you choose to analyze your situation. In gravity’s rainbow, spacetime is discrete; therefore, the mathematics reveal that both you (the doomed in-faller) and your observer will witness your demise within a finite length of time. But in the current formulation of general relativity, where spacetime is described as continuous, the paradox arises. The in-faller, well, falls in; meanwhile, the observer never sees the in-faller pass the event horizon.
“The most important lesson from this paper is that space and time exist only beyond a certain scale,” said Ali. “There is no space and time below that scale. Hence, it is meaningless to define particles, matter, or any object, including black holes, that exist in space and time below that scale. Thus, as long as we keep ourselves confined to the scales at which both space and time exist, we get sensible physical answers. However, when we try to ask questions at length and time intervals that are below the scales at which space and time exist, we end up getting paradoxes and problems.”
To recap: if spacetime continues on arbitrarily small scales, the paradoxes remain. If, however, gravity’s rainbow is correct and the Planck length and the Planck time are the smallest unit of space and time that fundamentally exist, we’re in the clear… at least, mathematically speaking. Unfortunately, the Planck scales are far too tiny for our measly modern particle colliders to probe. So, at least for now, this work provides yet another purely theoretical result.
The paper was published in the January 23 issue of Europhysics Letters. A pre-print of the paper is available on the arXiv.
It’s a cornerstone of modern physics that nothing in the Universe is faster than the speed of light (c). However, Einstein’s theory of special relativity does allow for instances where certain influences appear to travel faster than light without violating causality. These are what is known as “photonic booms,” a concept similar to a sonic boom, where spots of light are made to move faster than c.
And according to a new study by Robert Nemiroff, a physics professor at Michigan Technological University (and co-creator of Astronomy Picture of the Day), this phenomenon may help shine a light (no pun intended!) on the cosmos, helping us map it with greater efficiency.
Consider the following scenario: if a laser is swept across a distant object – in this case, the Moon – the spot of laser light will move across the object at a speed greater than c. Crucially, it is the spot, not any individual photon, that exceeds the speed of light as it traverses the surface and depth of the object.
The resulting “photonic boom” occurs in the form of a flash, which is seen by the observer when the speed of the light spot drops from superluminal to below the speed of light. It is possible because the spots carry no mass, and thereby do not violate the fundamental laws of Special Relativity.
Another example occurs regularly in nature, where beams of light from a pulsar sweep across clouds of space-borne dust, creating a spherical shell of light and radiation that expands faster than c when it intersects a surface. Much the same is true of fast-moving shadows, whose apparent speed across a distant surface can be far greater than c and is not restricted by the speed of light.
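To get a feel for how easy a superluminal spot is to produce, consider how fast a laser would need to be swept for its spot on the Moon to exceed c. A back-of-the-envelope Python sketch, assuming the average Earth-Moon distance:

```python
import math

C = 299_792_458.0        # speed of light, m/s
MOON_DISTANCE = 3.844e8  # average Earth-Moon distance, m

# A beam swept at angular rate omega traces a spot moving at v = omega * d,
# so the spot goes superluminal once omega exceeds c / d.
omega_min = C / MOON_DISTANCE
print(f"Minimum sweep rate: {omega_min:.2f} rad/s "
      f"(~{math.degrees(omega_min):.0f} degrees per second)")
```

A sweep of roughly 45 degrees per second, a flick of the wrist with a laser pointer, is enough; again, no photon exceeds c, only the spot does.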
At a meeting of the American Astronomical Society in Seattle, Washington earlier this month, Nemiroff shared how these effects could be used to study the universe.
“Photonic booms happen around us quite frequently,” said Nemiroff in a press release, “but they are always too brief to notice. Out in the cosmos they last long enough to notice — but nobody has thought to look for them!”
Superluminal sweeps, he claims, could be used to reveal information on the 3-dimensional geometry and distance of stellar bodies like nearby planets, passing asteroids, and distant objects illuminated by pulsars. The key is finding ways to generate them or observe them accurately.
For the purposes of his study, Nemiroff considered two example scenarios. The first involved a beam being swept across a scattering spherical object – i.e. spots of light moving across the Moon and pulsar companions. In the second, the beam is swept across a “scattering planar wall or linear filament” – in this case, Hubble’s Variable Nebula.
In the former case, asteroids could be mapped out in detail using a laser beam and a telescope equipped with a high-speed camera. The laser could be swept across the surface thousands of times a second and the flashes recorded. In the latter, shadows are observed passing between the bright star R Monocerotis and reflecting dust, at speeds so great that they create photonic booms that are visible for days or weeks.
This sort of imaging technique is fundamentally different from direct observation (which relies on lens photography), radar, and conventional lidar. It is also distinct from Cherenkov radiation – electromagnetic radiation emitted when charged particles pass through a medium at a speed greater than the speed of light in that medium. A case in point is the blue glow emitted by an underwater nuclear reactor.
Combined with the other approaches, it could allow scientists to gain a more complete picture of objects in our Solar System, and even distant cosmological bodies.
Nemiroff’s study was accepted for publication by the Publications of the Astronomical Society of Australia, with a preliminary version available online at arXiv Astrophysics.
It sounds like science fiction, but the time you experience between two events depends directly on the path you take through the universe. In other words, Einstein’s theory of special relativity postulates that a person traveling in a high-speed rocket would age more slowly than people back on Earth.
Although few physicists doubt Einstein was right, it’s crucial to verify time dilation to the best possible accuracy. Now, an international team of researchers, including Nobel laureate Theodor Hänsch, director of the Max Planck optics institute, has done just this.
Tests of special relativity date back to 1938. But once we started going to space regularly, we had to learn to deal with time dilation on a daily basis. GPS satellites, for example, are basically clocks in orbit. They travel at a whopping speed of 14,000 kilometers per hour at an altitude of about 20,000 kilometers above the Earth’s surface. Relative to an atomic clock on the ground, their speed alone makes them lose about 7 microseconds per day (gravitational time dilation, a separate general-relativistic effect, more than offsets this), a discrepancy that has to be taken into account for them to work properly.
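The 7-microseconds-per-day figure is straightforward to reproduce from the orbital speed quoted above. A quick Python check of the special-relativistic part alone (the gravitational shift is a separate calculation, not attempted here):

```python
import math

C = 299_792_458.0            # speed of light, m/s
v = 14_000 * 1000 / 3600     # 14,000 km/h converted to m/s

beta = v / C
gamma = 1.0 / math.sqrt(1.0 - beta**2)

# Seconds "lost" per day by the orbiting clock due to its speed:
lag_per_day = (gamma - 1.0) * 86_400
print(f"Velocity time dilation: {lag_per_day * 1e6:.1f} microseconds per day")
```

Even though gamma differs from 1 only in the tenth decimal place, the offset accumulates to whole microseconds over a day, which would translate to kilometers of positioning error if uncorrected.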
To test time dilation to a much higher precision, Benjamin Botermann of Johannes Gutenberg-University, Germany, and colleagues accelerated lithium ions to one-third the speed of light. Here the Doppler shift quickly comes into play. Any ions flying toward the observer will be blue shifted and any ions flying away from the observer will be red shifted.
The degree to which the ions undergo a Doppler shift depends on their motion relative to the observer. But that same motion also makes their internal clock run slow, which redshifts the light from the observer’s point of view – an effect that you should be able to measure in the lab.
So the team stimulated transitions in the ions using two lasers propagating in opposite directions. Any shifts in the absorption frequency of the ions then depend on both the classical Doppler effect, which is easily calculated, and the redshift due to time dilation.
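The logic of this two-laser arrangement can be sketched numerically. Special relativity predicts that the geometric mean of the parallel (blueshifted) and antiparallel (redshifted) frequencies recovers the ion’s rest frequency exactly; any deviation would signal a failure of time dilation. A Python illustration (the rest frequency below is a made-up optical value, not the actual lithium transition used):

```python
import math

def doppler_pair(nu0: float, beta: float) -> tuple[float, float]:
    """Frequencies of light from a source with rest frequency nu0 moving at
    beta = v/c, as seen parallel (approaching) and antiparallel (receding)."""
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    return nu0 * gamma * (1 + beta), nu0 * gamma * (1 - beta)

nu0 = 5.0e14        # illustrative rest frequency, Hz (not the real transition)
beta = 1.0 / 3.0    # the ions' speed: one-third of c

nu_blue, nu_red = doppler_pair(nu0, beta)
# The gamma factors cancel in the product: sqrt(nu_blue * nu_red) == nu0
print(math.isclose(math.sqrt(nu_blue * nu_red), nu0))
```

In the experiment, the test amounts to checking this relation between the two measured laser frequencies, which is how limits at the parts-per-billion level become possible.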
The team verified their time dilation prediction to a few parts per billion, improving on previous limits. The findings were published on Sept. 16 in the journal Physical Review Letters.
Time Reborn: From the Crisis of Physics to the Future of the Universe is one of those books intended to provoke discussion. Right from the first pages, author Lee Smolin — a Canadian theoretical physicist who also teaches philosophy — puts forward a position: time is real, and not an illusion of the human experience (as other physicists try to argue).
Smolin, in fact, uses that concept of time as a basis for human free will. If time is real, he writes, this is the result: “Novelty is real. We can create, with our imagination, outcomes not computable from knowledge of the present.”
Physics as philosophy. A powerful statement to make in the opening parts of the book. The only challenge is understanding the rest of it.
Smolin advertises his book as open to the general reader who has no background in physics or mathematics, promising that there aren’t even equations to worry about. He also breaks up the involved explanations with wry observations of fatherhood, or by bringing up anecdotes from his past.
It works, but you need to be patient. Theoretical physics is so far outside of the everyday that at times it took me (with education focusing on journalism and space policy, admittedly) two or three readings of the same passage to understand what was going on.
But as I took my time, a whole world opened up to me.
I found myself understanding more about Einstein’s special and general relativity than I did in readings during high school and university. The book also made me think differently about cosmology (the nature of the universe), especially in relation to biological laws.
While the book is enjoyable, it is probably best not read in isolation, as it is a positional one: a book that gathers its information scientifically and analytically, to be sure, but one that does not take a neutral point of view in its conclusions.
We’d recommend picking up other books, such as the classic A Brief History of Time (by physicist Stephen Hawking), to learn more about the universe and how other scientists understand time.
Einstein’s explanation of special relativity, delivered in his 1905 paper On the Electrodynamics of Moving Bodies, focuses on demolishing the idea of ‘absolute rest’, exemplified by the theoretical luminiferous aether. He achieved this very successfully, but many hearing that argument today are left puzzled as to why everything seems to depend on the speed of light in a vacuum.
Since few people in the 21st century need convincing that the luminiferous aether does not exist, it is possible to come at special relativity in a different way: through an exercise of pure logic, deduce that the universe must have an absolute speed, and from there derive special relativity as a logical consequence.
The argument goes like this:
1) There must be an absolute speed in any universe, since speed is a measure of distance moved over time. Increasing your speed reduces your travel time from a point A to a point B. A one-kilometre walk to the shops might take 25 minutes, but if you run it might take only 15 minutes, and if you take the car, only 2 minutes. At least in principle, you should be able to keep increasing your speed until that travel time approaches zero, and whatever speed you are moving at when that happens represents the universe’s absolute speed.
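The arithmetic behind Point 1 is just time = distance / speed. A minimal sketch, where the walking, running and driving speeds are simply back-calculated from the times in the example above:

```python
def travel_time_minutes(distance_km, speed_kmh):
    """Travel time in minutes for a given distance and speed."""
    return 60.0 * distance_km / speed_kmh

# One kilometre to the shops at walking, running and driving pace:
print(travel_time_minutes(1.0, 2.4))   # walking  -> 25 minutes
print(travel_time_minutes(1.0, 4.0))   # running  -> 15 minutes
print(travel_time_minutes(1.0, 30.0))  # driving  -> 2 minutes

# As speed grows without bound, travel time approaches zero:
for speed in (1e3, 1e6, 1e9):
    print(speed, travel_time_minutes(1.0, speed))
```

The limiting case, where travel time reaches zero, is what the argument identifies as the universe’s absolute speed.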
2) Now consider the principle of relativity. Einstein talked about trains and platforms to describe different inertial frames of reference. For example, you can measure someone on the platform throwing a ball forward at 10 km/hr. But put that someone on a train travelling at 60 km/hr, and the ball measurably moves forward at nearly 70 km/hr (relative to the platform).
3) Point 2 is a big problem for a universe that has an absolute speed (see Point 1). For example, if you had an instrument that projected something forward at the universe’s absolute speed and then put that instrument on the train, you would expect to measure something moving at the absolute speed + 60 km/hr. But nothing can exceed the absolute speed, since that would imply a travel time of less than zero.
4) Einstein deduced that when you observe something moving in a frame of reference different from your own, the components of speed (i.e. distance and time) must change in that other frame of reference, to ensure that nothing can ever be measured moving at a speed greater than the absolute speed.
Thus on the train, distances should contract and time should dilate (since time is the denominator of distance over time).
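Points 2 to 4 can be made concrete with the relativistic velocity-addition formula, w = (u + v) / (1 + uv/c²), which is what replaces simple addition once distances contract and time dilates. A minimal sketch, with speeds in km/h:

```python
C_KMH = 299_792.458 * 3600.0  # speed of light in km/h (about 1.08 billion)

def add_velocities(u, v, c=C_KMH):
    """Relativistic velocity addition: combined speeds never exceed c."""
    return (u + v) / (1.0 + u * v / c ** 2)

# Ball thrown at 10 km/h aboard a 60 km/h train:
# measurably just under 70 km/h relative to the platform.
print(add_velocities(10.0, 60.0))

# Light "thrown" forward from the same train still moves at exactly c:
print(add_velocities(C_KMH, 60.0))
```

At everyday speeds the denominator differs from 1 by a few parts in 10^16, which is why ordinary addition works perfectly well on train platforms; only near the absolute speed does the correction become noticeable.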
And that’s it really. From there, one can just look to the universe for examples of something that always moves at the same speed regardless of frame of reference. When you find that something, you will know it must be moving at the absolute speed. Einstein pointed to two such findings:
the electromagnetic output produced by the relative motion of a magnet and an induction coil is the same whether the magnet is moved or whether the coil is moved (a finding of James Clerk Maxwell‘s electromagnetic theory); and
the failure to demonstrate that the motion of the Earth adds any additional speed to a light beam moving ahead of the Earth’s orbital trajectory (presumably an oblique reference to the 1887 Michelson-Morley experiment).
In other words, electromagnetic radiation (i.e. light) demonstrated the very property that would be expected of something moving at the absolute speed of our universe.
The fact that light happens to move at the absolute speed of the universe is useful to know, since we can measure the speed of light and hence assign a numerical value to the universe’s absolute speed (roughly 300,000 km/sec), rather than just calling it c.
None! That was AWAT #100 – more than enough for anyone. Thanks for reading, even if it was just today. SN.