This paper is basically statistical mechanics with a quantum veneer. Two major issues:
1. Scale: They're simulating just 13 qubits with QuTiP and making grand claims about quantum thermodynamics. The computational complexity they're glossing over here is astronomical. Anyone who's actually worked with quantum systems knows you can't just handwave away the scaling problems.
2. Measurement Problem: Their whole argument about instantaneous vs time-averaged measurements is just repackaging the quantum measurement problem without actually solving anything. They're doing the same philosophical shell game that every "breakthrough" quantum paper does by moving around where they put the observer and pretending they've discovered something profound.
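For what it's worth, the scaling complaint is easy to make concrete: simulating an n-qubit system with dense matrices means working in a 2^n-dimensional Hilbert space, so memory grows as 4^n. A minimal numpy sketch (not the paper's actual model; a generic transverse-field Ising chain with made-up couplings J and h):

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=complex)
sz = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def op_at(site_op, site, n):
    """Kronecker product placing site_op at `site` in an n-qubit chain."""
    out = np.array([[1.0 + 0j]])
    for k in range(n):
        out = np.kron(out, site_op if k == site else I2)
    return out

def ising_hamiltonian(n, J=1.0, h=0.5):
    """Transverse-field Ising chain: H = -J sum sz_i sz_{i+1} - h sum sx_i."""
    dim = 2 ** n
    H = np.zeros((dim, dim), dtype=complex)
    for i in range(n - 1):
        H -= J * op_at(sz, i, n) @ op_at(sz, i + 1, n)
    for i in range(n):
        H -= h * op_at(sx, i, n)
    return H

for n in (2, 4, 8):
    H = ising_hamiltonian(n)
    print(n, H.shape)  # dense H is (2**n, 2**n); memory grows as 4**n
```

At n = 13 the dense Hamiltonian is 8192 x 8192 complex entries, about 1 GB, which is roughly why 13 qubits is where exact simulations of this kind tend to stop.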
I disagree with you on both fronts.
1. The main underpinning of this article is the analytical theory they come up with independent of their simulation. The fact that it explains a few qubits well is exactly why this is interesting. If you were to scale up their model, a spin-1/2 Ising model, you would effectively get a classical magnet, which is obviously well described by classical thermodynamics. It's in the limit of small systems that quantum mechanics makes thermodynamics tricky.
2. Their time averaging is just to remove fluctuations in the state, not to avoid the measurement problem. They're looking at time averages of the density matrix, which still yields a quantum object that will collapse upon measurement. And as their mathematical model points out, this holds for arbitrary time-averaging windows; the bounds just change accordingly, since smaller averaging windows allow for larger fluctuations. There's nothing being swept under the rug here.
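You can check the time-averaging point on a toy system: the window average of a pure state's density matrix is still Hermitian, positive, and unit-trace, i.e. still a quantum state that collapses on measurement; the averaging just washes out the off-diagonal oscillations. A minimal numpy sketch with a hypothetical single-qubit Hamiltonian (not the paper's model):

```python
import numpy as np

# Toy qubit: H = sigma_z, initial superposition |+> = (|0> + |1>)/sqrt(2)
H = np.array([[1, 0], [0, -1]], dtype=complex)
psi0 = np.array([1, 1], dtype=complex) / np.sqrt(2)

evals, evecs = np.linalg.eigh(H)

def rho(t):
    """Density matrix of the unitarily evolved pure state at time t."""
    psi_t = evecs @ (np.exp(-1j * evals * t) * (evecs.conj().T @ psi0))
    return np.outer(psi_t, psi_t.conj())

# Time-average rho(t) over a window [0, T] on a grid
ts = np.linspace(0.0, 50.0, 5000)
rho_bar = sum(rho(t) for t in ts) / len(ts)

# Still a legitimate quantum state: Hermitian, trace 1, positive
assert np.allclose(rho_bar, rho_bar.conj().T)
assert np.isclose(np.trace(rho_bar).real, 1.0)
assert np.all(np.linalg.eigvalsh(rho_bar) > -1e-12)
print(np.round(rho_bar, 3))  # off-diagonals are suppressed by the averaging
```

Shrink the window (e.g. `ts = np.linspace(0, 2, 200)`) and the off-diagonal terms survive with larger magnitude, which is the "smaller windows allow larger fluctuations" trade-off in miniature.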
Quantum mechanics is statistical mechanics in the complex numbers.
Quantum mechanics is Markov chains in imaginary time.
Can you explain that?
State transitions are probabilistic and operators have complex coefficients.
State transitions are deterministic, it's only measurement that is probabilistic.
Even that is arguable. Subjective experience is probabilistic… kinda.
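The "Markov chains in imaginary time" quip can be made concrete: substituting t -> -i*tau turns the unitary e^{-iHt} into e^{-tau H}, and repeatedly applying that (with renormalization) relaxes any starting state to the ground state, much as a Markov chain relaxes to its stationary distribution. A sketch with a made-up 2x2 Hamiltonian:

```python
import numpy as np

# Hypothetical two-level Hamiltonian (lower eigenvector = ground state)
H = np.array([[1.0, 0.3], [0.3, -1.0]])
evals, evecs = np.linalg.eigh(H)  # eigenvalues sorted ascending

def imaginary_time_step(psi, dtau=0.1):
    """Apply e^{-H*dtau} (the Wick-rotated e^{-iH*dt}) and renormalize."""
    U = evecs @ np.diag(np.exp(-evals * dtau)) @ evecs.T
    psi = U @ psi
    return psi / np.linalg.norm(psi)

# Real-time evolution is unitary and preserves total probability, like a
# stochastic matrix does; in imaginary time the evolution instead damps
# the excited components, so the state converges to the ground state.
psi = np.array([1.0, 0.0])  # arbitrary starting state
for _ in range(500):
    psi = imaginary_time_step(psi)

ground = evecs[:, 0]
print(abs(psi @ ground))  # overlap with the ground state -> 1
```

This is the same trick behind diffusion/projector Monte Carlo methods, which is probably what the one-liner is gesturing at.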
ScholarlyArticle: "Emergence of a Second Law of Thermodynamics in Isolated Quantum Systems" (2025) https://journals.aps.org/prxquantum/abstract/10.1103/PRXQuan...
NewsArticle: "Even Quantum Physics Obeys the Law of Entropy" https://www.tuwien.at/en/tu-wien/news/news-articles/news/auc...
NewsArticle: "Sacred laws of entropy also work in the quantum world, suggests study" ... "90-year-old assumption about quantum entropy challenged in new study" https://interestingengineering.com/science/entropy-also-work...
> This implies that for macroscopic systems, the expected time one would be required to wait to observe such a decrease in entropy occurring is unobservably large.
Yeah, but we have virtual particles and the Casimir effect. Am I wrong, or aren't these perturbations evidencing themselves on a macroscopic scale?
Perturbations can mean either analogical reasoning (something is similar to something that it could come from with a small change) or actual perturbation (the effect of Venus on the orbit of the moon). Virtual particles are perturbations in the former sense, while quantum fluctuations are a small perturbation in the latter.
"The second law of thermodynamics states that the entropy of an isolated system can only increase over time. "
Isn't there a difference between "can only increase" and "cannot decrease"?
Well, it's an equals sign missing from one. For the latter, the entropy can stay the same, while for the former it cannot.
Over a long enough time, fluctuations to lower-entropy states will happen, so the law is statistical.
The trusty laws of thermodynamics strike again