Oberst Tharnow wrote:Assuming that the laws of physics changed, such as c getting bigger or smaller, the weak nuclear force dissipating, or the first law of thermodynamics no longer applying, IS ridiculous. At least if we are not talking about the time shortly after the Big Bang, when they probably DID change. But if they were changing TODAY (or over the last millions of years), life and the universe as we know it could probably not exist.
If c is different, it's not so much that the "Laws of Physics" get changed, but rather that many other physical constants are affected. There are very few instances where the "Laws of Physics" themselves might be challenged, and for the most part theories need to be symmetric: true in all coordinate systems, etc.
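To illustrate the point that other constants inherit a change in c: the fine-structure constant α = e²/(4πε₀ħc) depends directly on c, so rescaling c (with everything else held fixed, which is of course itself an assumption about what "changing c" means) rescales α. A minimal sketch using the CODATA SI values:

```python
import math

# CODATA 2018 values (SI units)
E = 1.602176634e-19      # elementary charge, C
EPS0 = 8.8541878128e-12  # vacuum permittivity, F/m
HBAR = 1.054571817e-34   # reduced Planck constant, J*s
C = 299_792_458.0        # speed of light in vacuum, m/s

def fine_structure(c):
    """alpha = e^2 / (4*pi*eps0*hbar*c), dimensionless."""
    return E**2 / (4 * math.pi * EPS0 * HBAR * c)

print(fine_structure(C))      # ~1/137.036 with the real c
print(fine_structure(2 * C))  # doubling c halves alpha
```

Since α sets the strength of electromagnetic interactions (atomic energy levels, chemistry), this is one concrete way a different c would ripple through physics as we observe it.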
starslayer wrote:The last few posters got the gist of how light travels through dense media. Here's the full picture: When a light wave (really, billions upon billions of photons) encounters a dense medium, it strikes the atoms that make it up. Usually, a photon will hit one of the electrons orbiting the nucleus. This causes the electron to gain energy equal to the photon's. About 10^-8 seconds later, on average, the electron emits a photon of approximately the same energy in a random direction.

So what you get, in essence, is a collection of sources of spherical waves, as each atom is struck by millions of photons a second and emits new ones in random directions. This would seem to violate conservation of energy, but it doesn't. The beam is partially absorbed as it travels through the medium; backscatter isn't observed because the photons that aren't emitted in the original direction of the beam are, on average, out of phase and destructively interfere, so almost no energy is transmitted backwards or to the sides. The only portion that's in phase, and thus visible, is the portion that travels in the direction of the beam.
Anyway, photons can only exist at c, yet we can easily verify that light waves travel more slowly through dense media. The key is that time delay I mentioned earlier. Because each individual photon that makes up the beam is essentially now doing a random walk, and has a 10^-8 s delay at each "turn", the beam appears to slow down.
There is also no "average speed of light." c is the speed of a photon, period. They never slow down, and never speed up; they are only absorbed and emitted. If Magueijo is right, what I just said is still right; it's c that's changing, I presume (I haven't seen his actual papers, so I could be completely wrong here).
There are actually a number of other factors that come into play, not least dispersive effects such as phonon interactions, multi-wave mixing, etc.