Information Rocket
Think of the information content as an object in orbit, which is a function of the mutation rate ("entropy field") and the replication rate ("centrifugal force field"), and be inspired to create an information rocket ;)
The law of gravity implies that a body of mass M within a gravitational field is attracted by a force F that depends on the strength and direction of the field. The force F manifests as a change of velocity (delta-v) and a change of coordinate (delta-d). If, however, the body has a tangential velocity v, it can compensate for the orbital fall delta-d, and so satellites don't fall from the sky.
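(For reference, the orbital balance alluded to here: a body stays in a circular orbit when gravity exactly supplies the required centripetal force.)

```latex
F = \frac{GMm}{r^2}, \qquad
\frac{GMm}{r^2} = \frac{mv^2}{r}
\;\Longrightarrow\;
v_{\mathrm{orbit}} = \sqrt{\frac{GM}{r}}
```

A satellite whose tangential speed equals v_orbit keeps falling around the planet rather than into it.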
Suppose that the mass of an "information object" (e.g., a bit string of length N) is defined as its Kolmogorov complexity (the length of the shortest computer program, in a predetermined programming language, that produces the object as output).

Suppose that the "information object" is in a field of entropy (e.g., "random noise/temperature"), defined as "information loss", whose integral with respect to time (just like the integral of deceleration) results in a loss of information (a number of bits erased), in a similar way that, for an object of matter in a field of gravity, the result is a loss of speed, and eventually of altitude (a number of meters).
Suppose that "information object" is in the field of random replication (e.g.,
field that engenders random copies of its substrings), that is defined as
"information gain", and its integral (just like integral of acceleration) with
respect to time, results in gain of information (in number of bits created), in
a
way similar to how object of matter with an engine of propulsion results in
gain
of speed, and eventually altitude (number of meters).
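As a toy illustration of the two fields (this sketch and its rates are invented here; compressed length is used as a crude, computable stand-in for Kolmogorov complexity, which is itself uncomputable):

```python
import random
import zlib

def complexity(bits: str) -> int:
    """Compressed length in bits: a crude, computable stand-in for
    Kolmogorov complexity (the real quantity is uncomputable)."""
    return 8 * len(zlib.compress(bits.encode()))

def step(bits: str, mutation_rate: float, replication_rate: float) -> str:
    """One tick: the 'entropy field' overwrites bits at random
    (information loss); the 'replication field' appends a copy of a
    random substring (information gain)."""
    bits = "".join(b if random.random() > mutation_rate else random.choice("01")
                   for b in bits)
    if random.random() < replication_rate:
        i = random.randrange(len(bits))
        j = random.randrange(i, len(bits))
        bits += bits[i:j + 1]  # engender a random copy of a substring
    return bits

bits = "01" * 64  # a highly ordered starting object
for _ in range(100):
    bits = step(bits, mutation_rate=0.01, replication_rate=0.5)
print(len(bits), complexity(bits))  # length vs. 'mass' after 100 ticks
```

Whether the object's "mass" grows or shrinks over time depends on which field dominates, which is the orbit analogy in miniature.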
Suppose the source of entropy is a ball in a direction perpendicular to the arrow of time (just as sources of gravity are often balls in directions perpendicular to orbital satellite motion). E.g., assume that we are traveling along a temporal curvature: if 5 billion years ago looks temporally straight behind us, perhaps 10 billion years ago is not directly behind, but at an angle, just as satellites traveling in circular or elliptical orbits are, because a planet is a ball-like structure.
Suppose the source of entropy is of a fixed density (i.e., just like the fixed mass of a massive object, this would be a fixed entropy, i.e., a temperature, or a thing that mutates bits) at a distance D in imaginary time (i.e., time perpendicular to the tangent of our time flow), where entropy is inversely proportional to the square of the distance to its source. (Note: think of imaginary time as the imaginary part of a complex number, i.e., the component perpendicular to our timeline. E.g., if the universe has existed for 13.7 billion years, that represents the distance covered in the real part, while the distance covered perpendicular to normal years is unknown; it may differ, depending on what happened to the complexity of the universe: if it became more random, it approached the entropy source; if it became more ordered, it may have increased its distance from it.)
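Putting the assumptions together in one hedged formula (the symbols R, k, and S_0 are introduced here for illustration, not part of the original):

```latex
\frac{dI}{dt} = R - \frac{k\,S_0}{D(t)^2}
```

where I(t) is the information content in bits, R the gain rate from the replication field, S_0 the fixed density of the entropy source, D(t) its distance in imaginary time, and k a coupling constant. The information object "orbits" when the two terms balance, and the rocket problem is to increase D(t).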
With these assumptions, begin plotting an information rocket to capture the bits in our minds, and let them escape that entropy.
Links:
Centrifetal -- Centripugal_20Force [MaxwellBuchanan, Feb 12 2019]
Infinity -- "Beating the Entropy" -- https://inf.li/#/en/@/topic/69/ -- The problem associated with the idea. [Mindey, Feb 15 2019]
Everything-List -- "On information Rockets..." -- https://groups.goog...ng-list/KRQXyndJv58 -- Thoughts posted to Everything-List in search of a rationale for the above. [Mindey, Feb 15 2019]
"IBM Selectric golf ball" -- https://www.youtube...watch?v=RTtKaqIpOJc -- To illustrate the anno by [Ian Tindale]. [Mindey, Feb 15 2019]
|
|
Yes, but have you accounted for Cole's Law? |
|
|
Umm, it shouldn't be information as mass... Indeed, because information actually does not attract (or warm things up). Or... maybe it does, if it interferes with other information... |
|
|
To some extent, the distance between events in time-space is the computational complexity between them. |
|
|
[Max], no, but I fixed the typo. :) Should it be "centripetal" force, rather
than "centrifugal"? |
|
|
So, will we have a prize for launching something like events (call them "processes") into timespace, and having them fly, like... almost forever? |
|
|
//Should it be "centripetal" force, rather than "centrifugal"//
There's a link for that. |
|
|
Murphy will have something to say about all of this. |
|
|
Suppose a duck is actually a walrus. [-] |
|
|
[Voice] is not giving anything like Levenshtein distance a chance... Hmm. |
|
|
It's just like the case where you define work in physics as an integral of force over time rather than an integral of force over distance: most people will have a hard time getting it. |
|
|
I'm putting this out as a half-baked idea. A baked one would actually be completely worked out. |
|
|
Information cannot be defined in terms of the physics definition of work. The other definition of work also cannot be defined in terms of the physics definition of work. You're saying "information is like energy, and therefore work is like rockets", and again I answer: bollocks. |
|
|
A novel use of centrifuges by midwives? |
|
|
I read it, and parts of it made me think of things. |
|
|
It is a fun way to look at things, but I am confused. What if you place two of the entropy balls at a distance from each other, making a gravityless Lagrange point between them? That would seem to be a place to store data absent entropy. The thing is, locating something between two entropy sources does not seem like it would preserve data. One possibility is that if the balls were constructed one bit out of synchronization with each other, then when a bit was pulled from 1 to 0 by one of the balls, the other ball would push it from 0 to 1. The thing is, though, then they wouldn't be entropy balls, and it might only work on things that were 1 bit big, or however many bits fit on a surface between the two balls. |
|
|
//Suppose a duck is actually a walrus.// |
|
|
... then we'll strive to get all our walruses in a row, which would
be admirable. [+] |
|
|
[beanangel], indeed, as your prediction suggests, we would have to have a Lagrange point where information would be safe, but this is not the case! Very, very interesting. |
|
|
Naturally, for heat sources, an increased mutation rate is expected. However, if we think of distance not as physical distance, but as complexity distance from one object to another (i.e., how much mutation we need for an event of time-space to collide with another event (become one process), and how much mutation we need for an event to disappear (i.e., for a process to die, to become randomness)), then we could imagine that the distance to the entropy source is the number of mutations needed for the object to become one with (i.e., in sync with) the entropy source. |
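A minimal sketch of that "complexity distance", assuming bit strings of equal length (this is Hamming distance; [Voice]'s Levenshtein distance, mentioned above, would generalize it to strings of different lengths):

```python
def hamming(a: str, b: str) -> int:
    """Number of single-bit mutations needed to turn a into b: a toy
    'complexity distance' from an object to an entropy source."""
    assert len(a) == len(b)
    return sum(x != y for x, y in zip(a, b))

obj    = "1010101010101010"  # an ordered information object
source = "0110001101001110"  # a snapshot of the entropy source's pattern
print(hamming(obj, source))  # mutations left before falling 'in sync'
```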
|
|
I guess, it depends on whether or not the entropy field has wave-like
properties. Falling towards entropy balls actually does not mean
becoming more random, but becoming more similar to (more in-sync
with) these entropy sources. |
|
|
Having entropy balls of opposite phase would make Lagrange points exist; that is, if entropy, like gravity, is a wave-like phenomenon, and the waveform depends on the direction between the events (i.e., time-space objects, which the entropy balls are too). |
|
|
Regarding attraction between events... Do events attract? Some say they do. Do processes tend to make other processes more similar to them? Sometimes they do. For example, some events, like earthquakes and gang crimes, were found to cluster (as self-exciting point processes, i.e., Hawkes processes). |
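For reference, such clustering is standard to simulate; a minimal sketch of a Hawkes process via Ogata's thinning method (the parameters here are arbitrary, chosen only so the process is stable):

```python
import math
import random

def simulate_hawkes(mu: float, alpha: float, beta: float, horizon: float):
    """Ogata's thinning algorithm for a Hawkes process with intensity
    lambda(t) = mu + alpha * sum_{t_i < t} exp(-beta * (t - t_i)).
    Every event raises the intensity, so events cluster in time."""
    events, t = [], 0.0
    while True:
        # Intensity just after t bounds the (decaying) intensity ahead.
        lam_bar = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        t += random.expovariate(lam_bar)  # propose the next candidate time
        if t >= horizon:
            return events
        lam_t = mu + alpha * sum(math.exp(-beta * (t - ti)) for ti in events)
        if random.random() <= lam_t / lam_bar:  # thin: accept w.p. lam_t/lam_bar
            events.append(t)

# Stable because alpha/beta < 1.
print(simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=50.0))
```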
|
|
Are we imagining entropy balls as observable and, if so, how
would you know if you were looking at one? |
|
|
[pertinax], if we had eyes that could see events (processes) in time-space like we see objects in space, I guess we would see them as objects of high complexity that make other objects become like them (or in sync with them). |
|
|
So you mean we're nearby a neighboring computational universe with
different laws of physics, and the aliens in it are attracting us to their kind
of randomness? |
|
|
Yeah, e.g., why is the future more random than the past? We may be on a collision course with an alien universe, and we need a technological singularity to amplify our kind of randomness to escape their pull. |
|
|
I think you should copy this to your https://inf.li site. |
|
|
[beanangel] well, it definitely matches the problem of "Beating Entropy" (see link), but https://inf.li doesn't work well enough; I want to rewrite it, because now some features are confusing. |
|
|
//we need a technological singularity to amplify our kind of randomness// Aha! Of course, the Infinite Improbability Drive is well worked out over the trilogy in five parts, and the many iterations of radio and film. It could work for this application. Or not. Too bad Douglas is dead; he'd like these crazy times in which we live. |
|
|
//If we had eyes that can see events (processes) in time-space// |
|
|
We *do* have eyes that can see events in time-space. The
theory of relativity, IIRC, is founded on observable events in time-
space. Unless "time-space" is somehow different from "space-
time", in which case please explain the difference. |
|
|
Processes, on the other hand, are things we infer from observed
events. So, does "see events (processes)" mean "see events and
infer processes from them" or something else? And, if
something else, then what? |
|
|
My simple two cents is, there is confusion between the information (virtual definitions) and the media (the paper and ink, the mass and energy movements) that hold the information. It's the media that is subject to physical interactions. |
|
|
[pertinax], I guess there are different possible ways to define the word "event": |
|
|
(A) - a time slice of minimal interval, say a Planck time-interval
(B) - an extrusion of a 3D object over the time dimension |
|
|
When I am talking about events above, I mean (B). For example, you can think of a piece of rock that decays over time, becomes smaller, and then disappears. You can imagine that as a sweep extrusion of a 3D object over time. That's what I mean by an "event" (in the B-sense). In the B-sense, every person is an event of an object (with some internal complexity) that spans over years and then decays (e.g., dies and gets dismantled). The event in the B-sense also coincides with what we think of as a "process". A piece of rock actually is a process. A human being is a process. That's why I say "events (processes)". These events cannot be "seen" in the literal sense; they can only be experienced. |
|
|
> Processes, on the other hand, are things we infer from observed
events. So, does "see events (processes)" mean "see events and infer
processes from them" or something else? And, if something else, then
what? |
|
|
Since you use "event" here in the (A)-sense (i.e., atomic events), yes, it means seeing events and inferring processes of high complexity -- for example, hot things, randomly jumpy things, like the "IBM Selectric golf ball" (see link), pointed out by [Ian Tindale]. |
|
|
For hot balls, you can measure ("observe") their temperature, which may seem like an instantaneous measurement, but we shouldn't forget that a temperature measurement is a statistic over multiple data inputs from the particles; so it's not instantaneous, and you'd see no temperature in an interval of Planck time. To observe the temperature, we watch the changes in one measurement with respect to another, with the assumption that they are related. That is impossible with just a single measurement of a Planck-time event. |
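A small sketch of that point, with invented numbers: the kinetic-theory temperature only emerges as a statistic over many particle samples.

```python
import random

K_B = 1.380649e-23  # Boltzmann constant, J/K
M   = 6.6e-26       # molecular mass, kg (roughly N2; illustrative)

# Equipartition per velocity component: (1/2) m <v_x^2> = (1/2) k_B T,
# so T = m <v_x^2> / k_B.  One sample tells you almost nothing; the
# estimate only stabilizes as a statistic over many particles.
speeds = [random.gauss(0.0, 300.0) for _ in range(10_000)]  # v_x in m/s
T = M * sum(v * v for v in speeds) / (len(speeds) * K_B)
print(round(T, 1), "K")
```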
|
|
To observe the entropy ball, one would have to create two identical probes as different events, e.g., at different times (e.g., two identical rocks that lived in the same place in different centuries): say, one probe 20 years ago, and another probe 10 years from now. Then, measure the discrepancy in their decay rates. To point at its direction, create more of these probes at different locations, and triangulate the general time-space direction. The closest thing I can think of that has been done is the kilogram experiment; however, those samples were manufactured at the same time, while the identical probes here would have to be manufactured at different times. The timescales and the kind of object sensitivity (is it a rock, a clock, a CPU, or a time crystal?) would have to be chosen depending on the kind (e.g., "wavelength") of entropy we want to measure (e.g., very short-term Brownian motion, a longer-term random flushing of seashores, or some inherent quantum entropy of space). |
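A toy version of the probe experiment (the decay law, rates, and epochs below are all made up for illustration):

```python
import random

def decay(bits: list, rate: float, steps: int) -> int:
    """Mutate a probe for `steps` ticks at a given per-bit rate; return
    how many bits still match the manufacturing pattern."""
    original = bits[:]
    for _ in range(steps):
        bits = [b if random.random() > rate else 1 - b for b in bits]
    return sum(a == b for a, b in zip(original, bits))

PROBE = [0, 1] * 500  # a 1000-bit 'identical rock'
# Hypothetical epoch-dependent mutation rates: if the entropy field is
# stronger toward one epoch, probes made then decay faster.
probe_past   = decay(PROBE[:], rate=0.001, steps=200)  # made 20 years ago
probe_future = decay(PROBE[:], rate=0.002, steps=200)  # made 10 years hence
print(probe_past - probe_future)  # discrepancy hints at the field gradient
```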
|
|
Sorry, can I change my annotation? What I meant to say is, the information in those scattered chicken bones is looking clearer and clearer. |
|
|
So... if I've got this straight (no bets on that, mind you), given the many-worlds interpretation, every 'instant' of every possibility is a slide in an infinite slide show, and so it should be possible to... encapsulate information in entropy-less bubbles to escape being lost to the flow of time? |
|
|
[2 fries shy of a happy meal], I suppose so, and these entropy-less bubbles might be the Lagrange points between the universes in orbit around each other. |
|
|
Well, it would make a decent premise for a sci-fi novel. |
|
|
Thing is though, if true, there can be no loss of information if only consciousness is trapped by the flow of time. The overall multiverse slide show remains static and is itself an entropy-less bubble. Maybe one of many. |
|
|
// I want to rewrite it, because now some features are confusing. // |
|
|
And because, currently, if you try to load a page without
having already allowed all of the scripts it needs, it destroys
its own URL, so you have to go back to the page where you
clicked the link to it and click the link again after
unblocking everything. |
|
|
So, any information source can actually be thought of as an entropy source following a certain probability distribution, and its attraction may be thought of as a distributional drag (any information source within another information source may be affected by a drag to acquire the pattern of that other source). |
|
|
The Lagrange points may exist when there are pseudo-random pattern interferences, or stable pattern-engendering probability-distributional drags. |
|