> Entropy exists, not just from thermodynamics but also from the existence of information theory.
Try “entropy is the existential phenomenon of potential distributing over the surface area of negative potential.” This fits both sides, invalidating only modern information theory. State is a single instance of potential resolving. The number of states is a boundary condition of potential interfering with itself (constructively or destructively). Existential reality is not made up of information; existential reality is potential resolving into state in the moment of now. Probability is the projected distribution of potential (over whatever surface area: the inverse square of distance, heat dissipation, voltage across a resistance, the number of cars on a highway).
Potential and potential resolving (true entropy) are the underlying phenomenon, not state (information).
This could somehow fit, such as how some minds are limited by what they interpret (numbers of states) while other minds see an intrinsic whole (a potential distribution, like a moment of force to an architect).
> Existential reality is not made up of information
> Potential and potential resolving (true entropy) are the underlying phenomenon, not state (information).
I think this leads to the question "is information a function of perception". Knowledge of resolution of potential is information, and the properties of entropy should pass through IMO
What about when information is encoded into physical state? If I sample an entropy source and record each sample as magnetism on a disk, information becomes a physical phenomenon?
How can we be sure there are no non-human entities observing all resolutions of potential at any given time and then modifying reality (e.g. entities existing beyond our understanding)? Also consider actors at future points in time that may observe and act upon information that formed in the past. E.g. it's easy to dismiss the information from a dinosaur dying millions of years ago, until some random human discovers its toe and triggers a full excavation.
I don't have an answer to this; I just don't feel confident dismissing information as not being a physical phenomenon itself. It's certainly a real feedback mechanism; it's just unclear if an actor is required for it to be a phenomenon.
Information I define as “the removal of uncertainty.” If it does not “remove uncertainty,” it is not information (it may be something else, like data or noise). Therefore, yes, you are right: information is dependent upon the beholder.
To solve this, I declare information as a vector.
A vector is meaningless without context.
A vector is its own dimensionality (size, speed, density, etc.) and value (magnitude).
This clears up some of your insightful wanderings. Other parts may be outside of information's responsibility.
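(As an aside, here is a minimal sketch of that “removal of uncertainty” definition in Shannon's terms; the fair-die numbers below are only an illustration, not anything claimed above.)

```python
# A minimal sketch of "information removes uncertainty" in Shannon's terms.
# The die-roll scenario is purely illustrative.
from math import log2

def entropy(probs):
    """Shannon entropy in bits of a discrete distribution."""
    return -sum(p * log2(p) for p in probs if p > 0)

# Before any observation: a fair six-sided die, six equally likely outcomes.
before = entropy([1/6] * 6)   # log2(6) ≈ 2.585 bits of uncertainty

# After being told only "the result is even": three outcomes remain.
after = entropy([1/3] * 3)    # log2(3) ≈ 1.585 bits of uncertainty

# The information conveyed by that message is the uncertainty it removed.
print(f"uncertainty removed: {before - after:.3f} bits")  # ≈ 1.000 bit
```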
Let us return to your apparently lesser footnote deity who is secretly the supreme instigator all along.
Decay!
Even your smartest reading shall someday be incomprehensible dust (or rust), or otherwise long since faded.
In a world governed by uncertainty, ignorance and confusion are the natural state of minds. It is the illuminant mind which wonders, restlessly tugging at a loose stitch until the whole fabric unravels (for better or worse).
All existential change comes by decay or interference (constructive or destructive). That's what this whole entropy thing is, and vectors transform potential into meaningful states (which themselves secretly decay when not being watched).
Ok, this took some time to think about. I've always conflated entropy and randomness/data, but this is wrong. Say that I sample 512 bytes of data from an entropy source. That data is no longer entropic, because it's no longer uncertain. It's now information (assuming it can be used to inform about the entropy source). It may be random-looking, but not entropic.
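A small sketch of that distinction, using `os.urandom` purely as a stand-in entropy source (my example, not anything from above): the recorded bytes are fixed data you can compute statistics on, while entropy describes the source's distribution, i.e. the uncertainty before you looked.

```python
# Sketch: a recorded sample is fixed data; entropy describes the source's
# distribution (the uncertainty *before* sampling). os.urandom is only a
# stand-in for "an entropy source" here.
import os
from collections import Counter
from math import log2

sample = os.urandom(512)  # 512 bytes, now fully known: no uncertainty left

# Empirical byte frequencies of the sample: a statistic of fixed data,
# not the entropy of the source itself (and biased low for a small sample).
counts = Counter(sample)
empirical = -sum((c / len(sample)) * log2(c / len(sample)) for c in counts.values())

print(f"empirical bits/byte of the recorded sample: {empirical:.2f}")
print("source entropy (uncertainty per byte before sampling): up to 8 bits/byte")
```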
I think I see what you mean. Information and entropy are fundamental opposites, information cannot be entropy and entropy cannot be information. Information is certainty, entropy is uncertainty.
So with information theory, we measure entropy sources to get information, but it's disconnected from the reality of entropy in physical processes. Which is maybe fine? It seems like a zero-cost abstraction. Like if I roll a die and compare that to flipping a coin, the relative amount of information yielded is based on the number of distinct outcomes given some arbitrary constraints (what side is facing "up" for each object, assuming the human operates the object fairly).
If we then do some reasoning using this relative difference in amount of information, the physical, thermodynamic reality of the human doing the action, the object moving through space, etc. is arbitrarily stripped away, but with no impact on the soundness of the reasoning about the information (assuming the human has not rigged the system / the entropy source is sound). Sorta like algebra over a field: any field can be used so long as the axioms of the algebra hold.
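A tiny sketch of that die-versus-coin comparison under the same fairness assumption, to show how the physical details drop out and only the outcome counts remain:

```python
# Sketch of the die-vs-coin comparison: once fairness is assumed, only the
# number of distinct outcomes matters; the physics of throwing is stripped away.
from math import log2

coin_bits = log2(2)  # 1 bit per fair coin flip
die_bits = log2(6)   # ≈ 2.585 bits per fair die roll

print(f"fair coin: {coin_bits:.3f} bits, fair die: {die_bits:.3f} bits")
print(f"one die roll is worth about {die_bits / coin_bits:.2f} coin flips")
```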
So it all comes down to soundness of entropy sources...?
(this has improved my understanding of information theory _significantly_ :pray:)
> Information is certainty, entropy is uncertainty.
That is wrong.
There is NEVER certainty.
Certainty is a lie.
Certainty is a delusion.
Listen carefully: “information removes uncertainty.” Do you see? The most well-intentioned self-deceptions begin with a misinterpretation.
Like Shannon and his prime example: he did not say “entropy is the number of possible states.”
If you excite a valence electron (increase its potential), would these values not change to the next shell? Everyone heard what they wanted to hear.
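For what it's worth, a minimal sketch of the point about Shannon's measure: it weights states by their probabilities, and only reduces to log2 of the state count when every state is equally likely (the biased-coin numbers are mine, purely illustrative).

```python
# Sketch: Shannon entropy is not "the number of possible states".
# It equals log2(N) only when all N states are equally likely.
from math import log2

def entropy(probs):
    return -sum(p * log2(p) for p in probs if p > 0)

uniform = [0.5, 0.5]  # two states, equally likely
biased = [0.9, 0.1]   # still two states, far less uncertain

print(f"log2(number of states) = {log2(2):.3f} bits")
print(f"uniform coin entropy   = {entropy(uniform):.3f} bits")  # 1.000
print(f"biased coin entropy    = {entropy(biased):.3f} bits")   # ≈ 0.469
```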
There is no certainty; even the memory fades. You will always ask yourself if you did the right thing or picked the right one. Ten years from now some missing piece will come to light and perspectives will change.
Certainty never “existed,” only the illusion of closure. We do the best we can, when necessary, with what we have. That is success, not certainty.
We cannot find certainty, only optimal solutions.
In a universe governed by entropy, that everything dissolves to time is the only certainty.
This is actually pretty good stuff.
Thanks!