Volume of $1e20,000

SLAM: debunk creationism, pseudoscience, and superstitions. Discuss logic and morality.

Moderator: Alyrium Denryle

User avatar
Dooey Jo
Sith Devotee
Posts: 3127
Joined: 2002-08-09 01:09pm
Location: The land beyond the forest; Sweden.
Contact:

Post by Dooey Jo »

Re: the OP:
Instead of using huge-bit encryption, why don't they use One-time Pads? It's AFAIK completely unbreakable, even using quantum computers. It might be a little impractical, but if they can have a 65536-bit encryption system, I'm sure they could pull it off ;)
They could send the keys via quantum encrypted lines, too. That would probably be as secure as it gets...
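As a minimal sketch of why the one-time pad is unbreakable in principle (the function name `otp` is just for illustration): encryption and decryption are the same XOR, and the whole guarantee rests on the key being truly random, at least message-length, and used exactly once.

```python
import secrets

def otp(data: bytes, key: bytes) -> bytes:
    # XOR with the pad; the same function encrypts and decrypts,
    # since (m ^ k) ^ k == m. Security rests entirely on the key
    # being truly random, message-length, and never reused.
    assert len(key) >= len(data)
    return bytes(d ^ k for d, k in zip(data, key))

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))  # CSPRNG stand-in for a true RNG
ciphertext = otp(msg, key)
assert otp(ciphertext, key) == msg   # round-trips back to the plaintext
```

The practical catch is exactly the key material: you need as much truly random key as you have traffic, delivered securely to both ends.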
Durandal wrote:It sounds like you're talking about using the measurement of the number of decays as a seed. Even a perfectly random seed will not produce perfectly random numbers. The algorithms which take the seed and produce a number from it are not random themselves.
I read it like he would take the number of recorded decays and perform a modulo 10 operation on it or something (but that first 8 should be a 9). Or mod 2 for the binary one. The sequences produced in such a fashion should never repeat themselves, even if the modulo operation itself is very non-random... :?
"Nippon ichi, bitches! Boing-boing."
Mai smote the demonic fires of heck...

Faker Ninjas invented ninjitsu
User avatar
phongn
Rebel Leader
Posts: 18487
Joined: 2002-07-03 11:11pm

Post by phongn »

Dooey Jo wrote:Re: the OP:
Instead of using huge-bit encryption, why don't they use One-time Pads? It's AFAIK completely unbreakable, even using quantum computers. It might be a little impractical, but if they can have a 65536-bit encryption system, I'm sure they could pull it off ;)
They could send the keys via quantum encrypted lines, too. That would probably be as secure as it gets...
OTPs are a pain in the ass to use ... and you have to generate the pads. You'd better hope that your generator really is random.
User avatar
Wyrm
Jedi Council Member
Posts: 2206
Joined: 2005-09-02 01:10pm
Location: In the sand, pooping hallucinogenic goodness.

Post by Wyrm »

Durandal wrote:Wyrm, did you miss my post or something? The observable universe is related only to the expansion rate of the Universe, not the total age.
BZZT! The "observable universe" is defined as the distance out to which light has had time to reach us, not the inverse of the Hubble law. Even if the universe was not expanding (0 km/s/mpc), the observable universe would still be 14 billion ly, whereas your rule would give an observable universe with an infinite extent. Conversely, if the universe was only a second old (but with the same 72 km/s/mpc Hubble constant), we could only see out to 299,792,458 m, the distance light travels in a second.

My astronomy is very clear on this. Heck, I even consulted an astronomer on this (my ace in the hole, William H. Jefferys), and your relationship is wrong. Actually, the horizon has been revised up to 70 billion ly, due to the recently discovered accelerating expansion of the universe, but my basic notion was essentially correct: there is a 1:1 correspondence between the age of the universe and the size of the observable universe.
Darth Wong on Strollers vs. Assholes: "There were days when I wished that my stroller had weapons on it."
wilfulton on Bible genetics: "If two screaming lunatics copulate in front of another screaming lunatic, the result will be yet another screaming lunatic. 8)"
SirNitram: "The nation of France is a theory, not a fact. It should therefore be approached with an open mind, and critically debated and considered."

Cornivore! | BAN-WATCH CANE: XVII | WWJDFAKB? - What Would Jesus Do... For a Klondike Bar? | Evil Bayesian Conspiracy
User avatar
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space
Contact:

Post by Kuroneko »

Durandal wrote:Right now, the Universe expands at 72 km/s/mpc. So for every megaparsec away from us a body is, it moves away at 72 km/s. Obviously, when the expansion rate between us and a body reaches c, that will delineate the edge of our universe.
There's a bit of an issue here. You're correct in that this delineates the edges from which no signal emitted now could possibly be observed by us, but that doesn't mean that a signal emitted in the past by an object currently in that position cannot be observed. Let's say the expansion of the universe is such that an object five units away travels luminally, so that no further communication is possible, but objects within that distance are still subluminal. Thus:
Past: <--(---+---)--> Object three units away emits a signal toward the origin.
Now: <--(------+------)--> The object is now six units away, but the origin is just now receiving the old signal. Because it was the space between the two that expanded, the signal actually travelled six units. Thus, the particle horizon, i.e., what is "visible," is actually greater than the luminal limit predicted by expansion. Realistically, the behaviour is not so nicely linear, but intuitively, the picture holds.

---

Some googling has revealed that a dollar bill is 6.6294cm wide, by 15.5956cm long, and 0.010922cm thick, thus having a volume of 1.1292cm³. I have no idea just how accurate these statistics truly are, but they are commonly repeated on the internet. Given a sphere of radius r>>1m, the number of bills is therefore around N = 3.71e6 (r/m)³. Given the 2004 WMAP particle horizon estimate of 78Gly radius, we have N = (3.71e6)(7.8e10*365.24*24*3600*2.9979e8)³ = 1.5e87 bills. Note that lg(N) ≈ 290 << 2^16. Let's see just by how much. Consider the reverse problem: given a particle of volume V, what must the size of the key be in order for that many particles to fill the visible universe? This is easy: N = [4/3πr³]/V. Let's take V to be the volume of a neutron, about 5.6e-45m³, so N = 3e125. Since lg(N) ≈ 417, this represents only a modest 417-bit key. Thus, the key space for 417-bit keys is already enough to fill the visible universe with neutrons.
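The arithmetic is easy to reproduce in a few lines (using the bill dimensions quoted above, which are the commonly repeated figures rather than anything official):

```python
from math import pi, log2

bill = 6.6294e-2 * 15.5956e-2 * 0.010922e-2   # bill volume in m^3, ~1.13e-6
ly = 365.24 * 24 * 3600 * 2.9979e8            # metres per light-year
r = 7.8e10 * ly                               # 78 Gly particle horizon radius
universe = 4 / 3 * pi * r**3                  # ~1.7e81 m^3

bills = universe / bill
neutrons = universe / 5.6e-45                 # neutron volume ~5.6e-45 m^3
print(f"{bills:.1e} bills (~{log2(bills):.0f} bits)")        # → 1.5e+87 bills (~290 bits)
print(f"{neutrons:.1e} neutrons (~{log2(neutrons):.0f} bits)")  # → 3.0e+125 neutrons (~417 bits)
```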

The requisite volume V of an object that would fit into a universe with particle horizon r exactly N times satisfies N = [4/3πr³]/V, or V = [4/3πr³]/N. The bracketed term is just the volume of the visible universe, 1.7e81m³, whereas N = 2^{2^16}. This makes V very small; an order-of-magnitude estimate is log10(V) = log10(1.7e81) - 2^{16}·log10(2) ≈ -19647. In other words, we need an object around 10^{-19647}m³ in volume, or 10^{-6549}m in radius. That is so many orders of magnitude below the Planck length (~1.6e-35 m) that it's completely ludicrous.
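The same estimate in code, working entirely in logarithms since 2^{2^16} is hopelessly beyond floating point (the radius 7.38e26 m corresponds to the 78 Gly horizon above):

```python
from math import pi, log10

log_universe = log10(4 / 3 * pi) + 3 * log10(7.38e26)  # log10 of ~1.7e81 m^3
log_N = 2**16 * log10(2)                               # log10 of 2^65536 keys
log_V = log_universe - log_N                           # log10 of object volume
log_r = (log_V - log10(4 / 3 * pi)) / 3                # log10 of object radius
print(round(log_V), round(log_r))                      # → -19647 -6549
```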
User avatar
Durandal
Bile-Driven Hate Machine
Posts: 17927
Joined: 2002-07-03 06:26pm
Location: Silicon Valley, CA
Contact:

Post by Durandal »

I'll just defer to Kuroneko. Thanks for the explanation. :)
Damien Sorresso

"Ever see what them computa bitchez do to numbas? It ain't natural. Numbas ain't supposed to be code, they supposed to quantify shit."
- The Onion
tharkûn
Tireless defender of wealthy businessmen
Posts: 2806
Joined: 2002-07-08 10:03pm

Post by tharkûn »

It sounds like you're talking about using the measurement of the number of decays as a seed. Even a perfectly random seed will not produce perfectly random numbers. The algorithms which take the seed and produce a number from it are not random themselves.
No, let's say I want a random number from 0 to 15. My QRNG measures decays for a certain amount of time. Over that period more than 10^9 decays occur, and I can accurately and precisely log them all. If the total number recorded is even, then the first bit of the generated number is 0; if odd, then 1. This process is repeated three additional times to give four random bits, thus generating a truly random number between 0 and 15.

Using such a method one could generate numbers of any size which are completely random, by virtue of physics. One could go faster by, say, recording mod 10, mod 100, or whatever rather than mod 2; but as long as the number of decays is sufficiently large, the result should be random.
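A minimal sketch of the scheme (the function names are just for illustration, and the detector is faked with a Gaussian draw, which approximates a Poisson-distributed count well at this mean, purely as a stand-in for real decay-counting hardware):

```python
import random

def decay_count(mean: float = 1e9) -> int:
    # Stand-in for logging decays over one counting interval. A real
    # QRNG reads this from a detector; here a Gaussian draw mimics
    # the Poisson-distributed count.
    return round(random.gauss(mean, mean ** 0.5))

def qrng(bits: int) -> int:
    # One bit per counting interval: the parity of the decay count.
    n = 0
    for _ in range(bits):
        n = (n << 1) | (decay_count() % 2)
    return n

x = qrng(4)  # four intervals give a number from 0 to 15
assert 0 <= x <= 15
```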
Now, an interesting thing to try would be to take the idea of measuring the even or odd number of decays, assigning even to 1 and odd to 0, and using those values to fill in the powers of 2 in a binary number. Of course, you'd have to randomly determine how big you wanted your number to potentially be using ... another RNG.
True enough, I think.
I read it like he would take the number of recorded decays and perform a modulo 10 operation on it or something (but that first 8 should be a 9). Or mod 2 for the binary one. The sequences produced in such a fashion should never repeat themselves, even if the modulo operation itself is very non-random...
Yes, as long as your divisor is sufficiently small compared to the number of decays, the resulting remainder should be random. There are other ways to harness truly random decays to get random numbers out; this is merely the most straightforward one I can recall.
Very funny, Scotty. Now beam down my clothes.
User avatar
Wyrm
Jedi Council Member
Posts: 2206
Joined: 2005-09-02 01:10pm
Location: In the sand, pooping hallucinogenic goodness.

Post by Wyrm »

I defer to Kuroneko too, but I thought it odd that you, Durandal, would've come up with a radius of the observable universe with a close mantissa but ten times smaller than the one wilfulton and I came up with, so I reexamined your analysis.

Let's go back to when the Hubble law was fit linearly to the distance calibrators available at the time. The Hubble law has the form

v = H D (1)

Now consider: the lefthand side of equation (1) has dimensions of [L/T], so H must have dimensions of [1/T] in order to match dimensions. Let's express H in units of only inverse time, specifically, in inverse years. There are 31,557,600 seconds in a year, and there are 3.08568e19 kilometers in a megaparsec, so we apply these conversions to the Hubble constant,

H = 71 km/s/Mpc * 1 Mpc/3.08568e19 km * 31,557,600 s/1 y
= 7.2614e-11 1/y

Now, 1/H = 13.77 Gy. If we solve for D in (1) and substitute the maximum speed physically available (c) for v, we get

D = c/H = c (13.77 Gy) = 13.77 Gly

And voilà! A figure close to ours. You must have lost a decimal place somewhere in your calculation.
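A quick way to double-check the conversion:

```python
H_per_s = 71 / 3.08568e19       # km/s/Mpc -> 1/s (the km cancel)
H_per_y = H_per_s * 31_557_600  # 1/s -> 1/y, ~7.26e-11 per year
age_Gy = 1 / H_per_y / 1e9      # 1/H in Gy, ~13.77; so c/H ~ 13.77 Gly
print(H_per_y, age_Gy)
```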

Also, the Hubble constant is not a rate at which the universe expands, because it has the wrong units: a rate of expansion would be a change in length per unit time, whereas the Hubble constant has dimensions [1/T]. Indeed, to first approximation, the Hubble constant is the inverse of the age of the universe.

To see why, rearrange the Hubble law and rename 1/H = T; for all galaxies we get the following relation:

vT = D (2)

I.e., the familiar time-distance-constant-speed relationship you learned in high school. Since T is constant with respect to the individual velocities and distances, at some time back in the past all of the galaxies were confined to a small volume. Then they flew out at various speeds to spread across the universe, and some time T later their velocities and distances display the above relation. In other words, T is the time since the beginning of this expansion (that is, the Big Bang).

You can also think about it this way: imagine the universe were half its present age, but the velocities of the galaxies were the same as they are now. Then the distances would be half, because of (2). This implies a Hubble constant twice today's, yet the universe is expanding at the same rate as it does now, because the velocities of all those little parts are constant (as per assumption), just crammed into a sphere of half the present radius.

Now, if you hadn't messed up your figures, you would've been in the same ballpark as wilfulton and me, because the only thing we did was skip the step where we went from the Hubble constant to the age of the universe. :wink:

Of course, I've left out a lot of details Kuroneko would not, but this is the basic idea.
User avatar
Kuroneko
Jedi Council Member
Posts: 2469
Joined: 2003-03-13 03:10am
Location: Fréchet space
Contact:

Post by Kuroneko »

I've attempted to repeat the calculation in GTR with the FRW metric, but the result is very strongly model-dependent. I see I've made an error in one of my above comments, which in retrospect is silly--the Hubble sphere representing luminal recession velocity is not the cutoff for observability. The Hubble sphere has radius R = c/H, where the Hubble parameter H varies in time in GTR (exactly how depends on the type of FRW model). A light signal emitted from beyond the Hubble sphere is initially receding, but the Hubble sphere itself does not stay constant; it is possible for the Hubble sphere to overtake the initially receding photon. Thus, light emitted by superluminally receding objects can be observed, as long as the Hubble sphere expands faster than c.

I'll look into how these things behave more precisely.