tisdag 30 april 2024

Crisis of Modern Statistical Physics vs Classical Deterministic Physics

This is a further comment on Leibniz's Principle of the Identity of Indiscernibles, which seems to be in conflict with the mainstream idea of modern physics that electrons are all alike, like the equal probabilities of the outcomes of tossing a fair coin. 




That modern physics is in a state of deep crisis is acknowledged by leading physicists and also largely understood by the general public. Buzz words like dark energy, dark matter, inflation, Big Bang, multiverses, entanglement, collapse of the wave function, particles and quarks are floating around as elements of relativity theory on cosmological scales and quantum mechanics on atomic scales, both formed 100 years ago but still today harbouring toxic unresolved foundational problems, and on top of that being incompatible. A veritable mess. 

The root of the seemingly unresolvable problems of quantum mechanics can be traced back to the statistical interpretation of the multi-dimensional Schrödinger wave function as solution to the multi-dimensional Schrödinger equation serving as foundation. 

While classical physics is ontology about what reality is, modern physics is epistemology about what can be said. While classical physics is deterministic physics independent of human observer, modern physics in the form of quantum mechanics is statistical physics depending on human observers acting as mathematical statisticians in an insurance company busy computing insurance premiums. 

The departure from classical into modern physics was initiated by Boltzmann in the late 19th century, seeking an ontological realistic explanation of the 2nd Law of Thermodynamics, the main unresolved problem of classical physics giving time a direction, which had to be resolved to save physics from disbelief. When Boltzmann understood that he could not reach this main goal of his scientific life, he made a Faustian deal in the form of an explanation based on statistical mechanics. This served to save the life of physics, but not Boltzmann's own life, and opened the door into the heaven of modern physics as quantum mechanics as statistical mechanics, which is now in a state of crisis. 

The step from deterministic physics to statistical physics was taken in order to save classical physics from a credibility collapse in front of the 2nd Law. The medication worked for the moment, but the patient, classical physics, died and was replaced by modern physics, which however turned out to be quite sick, without any cure in sight still today. 

The first step in coming to grips with the crisis of modern physics is to ask if it is really impossible to explain the 2nd Law within classical deterministic physics. If it is not, then the step to statistics is unnecessary and much trouble can be avoided. More precisely, it appears to be possible to replace statistics by a concept of finite precision physics, as presented in Computational Thermodynamics and in popular form in The Clock and the Arrow, with a follow-up into a realistic deterministic form of quantum mechanics as Real Quantum Mechanics. 

This means a return to deterministic physics with a new element of finite precision computational physics, coming with resolutions of problems of classical physics and making it possible to avoid paying the very high price of taking the drug of statistical physics. 

Real physics is what it is and is given to us for free. Statistical physics is man-made physics, which needs massive data and human interference. Real physics seeks to describe the World as it is, while modern physicists have the reduced goal of statistical prediction of outcomes of man-made experiments. Schrödinger and Einstein could not accept physics as man-made statistics, and so were cancelled. Maybe the present crisis can open the door to a restart in their spirit?  

We may view real physics as a form of engineering or a professional soccer game with the basic questions: What is the basic mechanism/principle? How can it be improved? A statistical physicist, on the other hand, simply watches the game on TV and finds meaning in betting.  


måndag 29 april 2024

Cancellation of Self-Interaction as Renormalisation


The apparent clash between Leibniz Principle of Identity of Indiscernibles PII and the Copenhagen Interpretation of Quantum Mechanics (StdQM) has triggered quite a bit of discussion surveyed in the book Identity in Physics: A Historical, Philosophical and Formal Analysis.

The trouble is rooted in the interpretation of the wave function of StdQM as expressing probabilities of possible electron particle configurations. 

This is to be compared with actual real configurations as in Real Quantum Mechanics RealQM, in the sense of classical physics with non-overlapping charge densities with unique presence in space-time as expression of identity. 

PII is in harmony with classical physics and RealQM, but not with StdQM. 

Schrödinger as inventor of quantum mechanics could not accept the probabilistic interpretation of StdQM, and so was cancelled by the leading Copenhagen school of Bohr, Born and Heisenberg. 

We may ask if PII is of real importance, or only of some scholastic philosophical virtual importance?

The previous post brought up the idea that PII connects to self-interaction as a toxic element of Quantum Field Theory QFT, the generalisation of StdQM underlying the Standard Model capturing all of elementary particle physics. It is manifested in the appearance of "infinities" asking to be cancelled by "renormalisation", like techniques for ignoring elephants in the room. 

In classical physics prevention of self-interaction is possible because it is possible to distinguish each particle from all other particles and so to guarantee in particular that the electric/gravitational field created by a particle only affects other particles but not itself. This is the nature of Newton's Law of gravitation and Coulomb's Law. 

But StdQM describes probabilities of possible particle configurations, which lack particle paths and so lack identity over time. In StdQM bosons (such as photons) can occupy the same position in space-time, as can some fermions (such as electrons with different spin), and particle paths have no meaning. In this setting self-interaction cannot easily be prevented, and so asks for extraordinary techniques of cancellation in the form of "renormalisation". Nobody is happy with this trick, introduced to handle a fundamental difficulty of physics as statistics. 

The possibility that a specific particle occupies some specific position in space-time and the possibility that another particle does the same thing do not appear to be mutually exclusive, which means that particle identity is lost. Probably. Statistics is tricky.

The problem with self-interaction is that it has to steer away from both blow-up to infinity (too much ego) and decay to zero (too much self-criticism) in a very delicate balance threatened by instability.

Recall that the electron of the Hydrogen atom is prevented from disappearing into the potential hole of the proton nucleus by the presence of the Laplacian in Schrödinger's equation, giving the electron an extension in space as a charge density. Likewise the Earth is saved from being swallowed by the Sun by orbiting the Sun, as a form of spatial extension.

From the above book:

  • It is not clear how collections of non-individual objects can be captured by standard set theory. 
  • As the mathematician Yuri Manin put it: “We should consider possibilities of developing a totally new language ...” to deal with collections of entities which do not behave as standard sets (in the sense of obeying the axioms of the usual set theories), since the “new quantum physics has shown us models of entities with quite different behaviour. Even ‘sets’ of photons in a looking-glass box, or of electrons in a nickel piece, are much less Cantorian than the ‘set’ of grains of sand”.
  • It is our intention in this book to explore these different issues and, in particular, to go some way towards developing the ‘totally new language’ suggested by Manin.
PS When young men pull on a military uniform they lose identity and become soldiers all alike, in violation of PII:



lördag 27 april 2024

Identity Politics from Identity Physics?



Leibniz's famous Principle of the Identity of Indiscernibles PII states 
  • No two things are exactly alike.
  • Coexistence of two indiscernibles is metaphysically impossible.
A basic aspect of material physics is spatial extension as unique occupancy of some region of space over some period of time as unique identity. Material spatial overlap is impossible. Each cell of your body occupies its own region or volume in space, making it distinct or discernible from all other cells and thus equipped with a unique identity. The same holds for the H2O molecules filling the Ocean, even if they all have the same composition. Even point-like bodies of the same type carry unique identity by having unique positions in space-time.

Newton's Law of Gravitation and Coulomb's Law state that the force acting between two distinct point-like masses or charges scales with $\frac{1}{r^2}$, where $r>0$ is their Euclidean distance. The Laws break down for $r=0$, which means that spatial overlap of masses and charges is forbidden. This is a fundamental principle of classical physics as an expression of PII. 
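As a small numerical illustration (with the force constant and charges/masses set to 1, an assumption for illustration only), the blow-up at $r=0$ is immediate:

```python
# Minimal sketch (force constant and charges/masses set to 1): the 1/r^2 law
# grows without bound as the separation r -> 0, so overlap (r = 0) is forbidden.
for r in [1.0, 0.1, 0.01, 1e-6]:
    print(r, 1.0 / r**2)   # force blows up as r shrinks; r = 0 is undefined
```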

But in the modern atom physics of Standard Quantum Mechanics StdQM this fundamental principle is violated: Electrons are viewed to be indiscernible and to occupy space and satisfy Coulomb's Law only in a statistical sense. 

Real Quantum Mechanics RealQM offers a new approach to atom physics where electrons appear as non-overlapping charge densities with unique spatial occupancy in accordance with PII, which satisfy Coulomb's Law in a point-wise physical sense. 

The total Coulomb potential energy of two non-overlapping charge densities $\phi_1$ and $\phi_2$ takes the form of an integral: 
  • $\int\int\frac{\phi_1(x)\phi_2(y)}{\vert x-y\vert}dxdy$ where $x\neq y$.       (E)
From a strict mathematical point of view (E) can be viewed as meaningful even if the charge densities overlap with formally $x-y=0$, since the volume where $\vert x-y\vert$ is small is compensated by $dxdy$, making the singularity integrable. In StdQM it is thus possible for two electrons to have the same charge density and thus overlap (if the electrons have different spin). 
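As a numerical check (a minimal sketch, not from the post, assuming two identical normalized Gaussian charge densities of width $\sigma$), the double integral (E) can be estimated by Monte Carlo sampling and comes out finite even for complete overlap:

```python
import numpy as np

# Sketch: Monte Carlo estimate of (E) for two fully overlapping normalized
# Gaussian charge densities of width sigma (assumed example, illustrative units).
rng = np.random.default_rng(0)
sigma, N = 1.0, 200_000

x = rng.normal(0.0, sigma, size=(N, 3))   # samples drawn from phi_1
y = rng.normal(0.0, sigma, size=(N, 3))   # samples drawn from phi_2 (same density)

estimate = np.mean(1.0 / np.linalg.norm(x - y, axis=1))
exact = 1.0 / (sigma * np.sqrt(np.pi))    # analytic value for this Gaussian case

print(estimate, exact)   # both ≈ 0.564: the integral is finite despite overlap
```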

Overlapping point charges reflect self-interaction with infinite potential energy. This poses a serious problem to Quantum Field Theory for particles without identity, requiring "renormalisation" to artificially remove infinities. 

Self-interaction is toxic and has to be prevented, but without identity how can you distinguish between interaction with yourself (toxic) and interaction with other people?

For a Hydrogen atom with the proton modeled as a positive point charge at $x=0$, Schrödinger's equation models the electron as a distributed density $\phi (x)$ of negative charge with finite (negative) potential energy balanced by the kinetic energy $\frac{1}{2}\int\vert\nabla\phi (x)\vert^2dx$ determining the size of the electron.    
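A minimal sketch of this balance, using the standard textbook variational estimate in atomic units with an assumed trial amplitude $\phi\propto e^{-ar}$ (the parameter $a$ and the grid are illustrative):

```python
import numpy as np

# Variational sketch in atomic units: kinetic term (1/2)∫|∇φ|² dx = a²/2 and
# potential term -∫ φ²/|x| dx = -a for a normalized trial amplitude φ ∝ exp(-a r).
a = np.linspace(0.1, 3.0, 300)
E = 0.5 * a**2 - a                 # total energy as a function of inverse size a

a_opt = a[np.argmin(E)]
print(a_opt, E.min())              # ≈ 1.0 and -0.5 Hartree: a finite electron size
```

The Laplacian (kinetic) term grows as the electron shrinks, which is what stops the collapse into the point charge at $x=0$.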

Summary: StdQM violates PII. RealQM satisfies PII. It is impossible to avoid self-interaction with unphysical infinities if the identity of the self is not guaranteed.

Is this important? Is PII a basic principle of physical material existence, which cannot be violated? 
Is Identity Physics as violation of PII really physics? 

PS A central theme of this blog is that the roots of the present accelerating break-down of principles of civilisation can be traced back to the advent of modern physics in the beginning of the 20th century with quantum mechanics and relativity theory as a form of Identity Physics.  



 

fredag 26 april 2024

Primordial Gravitational and Electric/Magnetic Potentials

Dialog between the Two Greatest World Systems with primordial potentials vs densities.  

This is a further remark to previous posts on New Newtonian Cosmology with a gravitational potential $\phi_m (x,t)$ and electric potential $\phi_c(x,t)$ with $x$ a Euclidean space coordinate and $t$ a time coordinate, viewed as primordial with mass density $\rho_m (x,t)$ and electric charge density $\rho_c(x,t)$ given by 

  • $\rho_m=\Delta\phi_m$      (1)
  • $\rho_c=\Delta\phi_c$      (2)
Here $\rho_m \ge 0$ while $\rho_c$ can be both positive and negative, and $\Delta$ is the second order Laplacian differential operator. 

The corresponding gravitational force $f_m\sim -\nabla\phi_m$ is attractive between positive mass densities, and the corresponding Coulomb force $f_c\sim \nabla\phi_c$ is attractive between charge densities of opposite sign and repulsive for charge densities of the same sign. 

In principle $\rho_m<0$ is possible in (1), with then repulsion between mass densities of different sign, which would separate large scales into Universes with positive and negative mass, where we happen to live in one with positive mass. It is thinkable that the presence of negative mass density shows up as dark energy. It is thinkable that a very smooth $\Delta\phi_m$ corresponds to dark matter.  

The gravitational force $f_m$ acts on large masses at large distances. The electric Coulomb force $f_c$ acts on small charges at small distances, which requires physics preventing charges of different sign from coming too close, represented by the presence of the Laplacian in Schrödinger's equation. 

Including also a magnetic potential connected to the electric potential by Maxwell's equations and Newton's 2nd Law for mass motion subject to force, gives a model including Newton's mechanics, electromagnetics and gravitation, with potentials as primordial quantities from which mass and charge densities and forces are derived. Here Real Quantum Mechanics naturally fits in as a classical 3d continuum mechanics model. 

An important aspect of (1) and (2) is that $\rho_m$ and $\rho_c$ are derived by differentiation as an operation acting locally in space, which can be perceived to act instantly in time, thus avoiding the hard-to-explain instant action at a distance coming with the standard view with mass and charge densities as primordial. 
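A minimal sketch of this locality (with an assumed smooth softened potential on a finite-difference grid; all names and units are illustrative):

```python
import numpy as np

# Sketch: the densities (1)-(2) follow from the potential by the Laplacian,
# a purely local operation on a grid (illustrative softened potential and units).
n, L = 64, 10.0
h = L / n
x = np.linspace(-L / 2, L / 2, n)
X, Y, Z = np.meshgrid(x, x, x, indexing="ij")

phi = -1.0 / np.sqrt(X**2 + Y**2 + Z**2 + 1.0)   # smooth "primordial" potential

# 7-point finite-difference Laplacian (periodic wrap at the box edges)
rho = (np.roll(phi, 1, 0) + np.roll(phi, -1, 0) +
       np.roll(phi, 1, 1) + np.roll(phi, -1, 1) +
       np.roll(phi, 1, 2) + np.roll(phi, -1, 2) - 6 * phi) / h**2

# force field from the local gradient, f ~ -grad(phi)
fx, fy, fz = np.gradient(-phi, h)

print(rho.max(), np.abs(fx).max())   # each value depends only on nearby grid points
```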

The absence of magnetic monopoles corresponding to point charges makes magnetics different from electrics in the formation of electromagnetics.  

 

torsdag 25 april 2024

Temperature as Quality Measure of Energy.

In ideal gas dynamics temperature appears as an intensive variable $T$ connected to internal energy $e$ and density $\rho$ by 

  • $T=\frac{e}{\rho}$                          
with a corresponding pressure law 
  • $p=\gamma e$
where $\gamma$ is a gas constant. Internal energy is viewed as small scale kinetic energy from small scale molecular motion. Internal energy can be transformed into mechanical work in expansion, which without external forcing (or gravitation) is an irreversible process.  
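A small numerical sketch with assumed values of $e$, $\rho$ and $\gamma$ (illustrative units only):

```python
# Sketch with assumed numbers: temperature as intensive quality measure T = e/rho
# and pressure p = gamma*e for an ideal gas (illustrative units, gamma assumed).
e, rho, gamma = 2.5, 1.2, 0.4

T = e / rho          # intensive: doubling both e and rho leaves T unchanged
p = gamma * e
print(T, p)          # T ≈ 2.08, p = 1.0 in these units
```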

For a solid body viewed as a vibrating atomic lattice temperature scales with total internal energy as the sum of small scale kinetic energy and potential energy, which can be transferred by radiation and conduction to a body of lower temperature.   

In both cases temperature appears as a quality measure of internal energy as an intensive variable. 

The maximal efficiency of a Carnot heat engine transforming heat energy into work operating between two temperatures $T_{hot}>T_{cold}$ is equal to $1-\frac{T_{cold}}{T_{hot}}$. 

Radiative heat transfer from a hot body of temperature $T_{hot}$ to a cold body of temperature $T_{cold}$ scales with $(T_{hot}^4-T_{cold}^4)$ according to Stefan-Boltzmann-Planck. 

Conductive heat transfer scales with $(T_{hot}-T_{cold})$ according to Fourier.

In both cases the heat transfer from hot to cold can be seen as transformation from high quality energy into low quality energy in an irreversible process in conformity with the 2nd Law of Thermodynamics. 
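A small numerical sketch with assumed temperatures and an assumed conductive coefficient (only the Stefan-Boltzmann constant is a physical value):

```python
# Illustrative numbers (assumed, not from the post): heat transfer from hot to cold.
T_hot, T_cold = 600.0, 300.0          # K
sigma = 5.670e-8                      # Stefan-Boltzmann constant, W m^-2 K^-4
k = 1.0                               # assumed conductive coefficient, W m^-2 K^-1

carnot = 1 - T_cold / T_hot                  # maximal Carnot efficiency: 0.5
radiative = sigma * (T_hot**4 - T_cold**4)   # ~6.9e3 W m^-2, scales with T^4
conductive = k * (T_hot - T_cold)            # 300 W m^-2, scales with T

print(carnot, radiative, conductive)
```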

The Nobel Prize in Physics in 2006 was awarded for the experimental detection of Cosmic Microwave Background CMB radiation with perfect Planck spectrum, as an after-glow of a Big Bang, with temperature of 2.725 K and correspondingly very low quality energy.  

With radiation scaling with $T^4$, the ratio between 3 K as deep space CMB and 300 K as global temperature comes out as a factor of $10^{-8}$. The contribution to global warming from CMB thus appears to be very small. 

We see from $e=\rho T$ that low density and low temperature both connect to low energy quality making both wind and solar energy inefficient compared to fossil and nuclear energy.    


Cosmic Microwave Background Radiation vs Cosmic Inflation

Cosmic Microwave Background Radiation CMB is supposed to be an afterglow of a Big Bang, which started with Cosmic Inflation, a theory proposed by theoretical physicist Alan Guth of an extremely rapid expansion from a Universe the size of a proton to the size of a pea, taking place during a period of time from $10^{-36}$ to $10^{-32}$ seconds after zero time with an expansion factor of $10^{13}$.  

A common view is that Alan Guth's theory solves all three main problems of cosmology: the horizon problem, the flatness problem and the magnetic monopole problem. 

The Nobel Prize in Physics 2023 was awarded for experimental methods that generate attosecond pulses of light for the study of electron dynamics in matter, with an attosecond = $10^{-18}$ second. 

Visible light has a time scale of $10^{-15}$ seconds, x-rays $10^{-18}$ and $\gamma$-rays $10^{-22}$ seconds, the highest frequency of real physics presently known. Frequency is connected to energy through Planck's Law, which allows determining the frequency of $\gamma$-rays by measuring the energy of $\gamma$-radiation.  
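A small sketch of this connection via Planck's Law $E=h\nu$, taking the time scales above as periods (the conversion to electron volts is standard arithmetic):

```python
h = 6.626e-34                          # Planck's constant, J s
for name, period in [("visible", 1e-15), ("x-ray", 1e-18), ("gamma", 1e-22)]:
    nu = 1.0 / period                  # frequency from the time scale, Hz
    E_eV = h * nu / 1.602e-19          # Planck's Law E = h*nu, in electron volts
    print(name, nu, E_eV)              # visible ~4 eV, x-ray ~4 keV, gamma ~40 MeV
```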

Cosmic Inflation is described as follows in popular form:

  • According to the theory, for less than a millionth of a trillionth of a trillionth of a second after the universe's birth, an exotic form of matter exerted a counterintuitive force: gravitational repulsion. 
  • Guth says the existence of this material was reasonably likely.
  • Guth says that we don’t necessarily expect to answer those questions next year, but anything that makes small steps towards understanding the answers is thrilling.
If you feel that you need more information to be able to judge if Cosmic Inflation is a hoax, you can consult the following book by Guth:  The Inflationary Universe: The Quest For A New Theory Of Cosmic Origins.

The present inflation in Sweden of 10% appears to be pretty small when compared to cosmic inflation.


onsdag 24 april 2024

How to Measure Temperature

Measuring temperature accurately is a delicate procedure.

This is a comment to the discussion in recent posts of the proclaimed perfect blackbody spectrum of Cosmic Microwave Background CMB radiation with temperature 2.725 K.  

You can measure your body temperature by body contact with a mercury thermometer or at distance by an infrared thermometer. Both work on a principle of thermal equilibrium between source and thermometer sensor as a stable state over time. Your body is assigned the temperature recorded by the thermometer. 

Temperature can be seen as a measure of energy in the form of heat energy or vibrational energy of a vibrating system like an atomic lattice as the generator of radiation as radiative heat transfer.

Computational Blackbody Radiation offers a new analysis of radiative heat transfer using classical wave mechanics as a deterministic form of Planck's analysis based on statistics of quanta. The basic element of the analysis is a radiation spectrum from a vibrating atomic lattice: 

  • $E(\nu ,T)=\gamma T\nu^2$ for $\nu \le \frac{T}{h}$        (1a)
  • $E(\nu ,T)= 0$ for $\nu >\frac{T}{h}$                               (1b)
where $\nu$ is frequency on an absolute time scale, $T$ is temperature on a lattice specific energy scale, $\gamma$ and $h$ are lattice specific parameters, and $\frac{T}{h}$ is a corresponding high-frequency cut-off setting an upper limit to the frequencies being radiated. Here a common temperature $T$ for all frequencies expresses thermal equilibrium between frequencies. 
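A minimal sketch of (1a)-(1b) with assumed lattice parameters $\gamma=h=1$ and $T=2$ (illustrative values only):

```python
import numpy as np

# Sketch of (1a)-(1b) with illustrative lattice parameters gamma = h = 1, T = 2:
# intensity grows like nu^2 up to the cut-off nu = T/h and is zero above it.
gamma, h, T = 1.0, 1.0, 2.0

def E(nu, T):
    return np.where(nu <= T / h, gamma * T * nu**2, 0.0)

nu = np.linspace(0.0, 3.0, 7)
print(list(zip(nu, E(nu, T))))   # quadratic growth, then zero beyond nu = 2
```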

It is natural to define a blackbody BB to have a radiation spectrum of the form (1) with maximal $\gamma$ and high-frequency cut-off, and to use this as a universal thermometer measuring the temperature of different bodies by thermal equilibrium. 

Consider then a vibrating atomic lattice A with spectrum according to (1) with different parameters $\bar\gamma <\gamma$ and $\bar h >h$ and a different temperature scale $\bar T$, to be in equilibrium with the universal thermometer. The radiation law (1) then implies, assuming that A is perfectly reflecting for frequencies above its own cut-off:
  • $\bar\gamma \bar T = \gamma T$                                         (2)
to serve as the connection between the temperature scales of BB and A. This gives (1) a form of universality with a universal $\gamma$ reflecting the use of a BB as a universal thermometer.

In reality the abrupt cut-off at the radiation maximum is replaced by a gradual decrease to zero over some frequency range as a case-specific post-max part of the spectrum. A further case-specific element is non-perfect reflectivity above cut-off. Thermal equilibrium according to (2) is thus an ideal case.  

In particular, different bodies at the same distance to the Sun can take on different temperatures in thermal equilibrium with the Sun. Here the high-frequency part of the spectrum comes in as well as the route from non-equilibrium to equilibrium. 

Why CMB can have a perfect blackbody spectrum is hidden in the intricacies of the sensing. It may well reflect man-made universality. 

måndag 22 april 2024

Man-Made Universality of Blackbody Radiation 2

Man-made Universality of Shape

This is a clarification of the previous post on the perfect Planck blackbody spectrum of the Cosmic Microwave Background Radiation CMB as a 14 Billion years afterglow of Big Bang as the leading narrative of cosmology physics today. See also this recent post and this older illuminating post.

The Planck spectrum as the spectrum of an ideal blackbody, takes the form 
  • $E(\nu ,T) =\gamma T\nu^2\times C(\nu ,T)$                                         (1)
where $E (\nu ,T)$ is radiation intensity depending on frequency $\nu $ and temperature $T$, $\gamma$ a universal constant, and $C(\nu ,T)$ is a universal high frequency cut-off function of the specific form 
  • $C(\nu ,T)=\frac{x}{\exp(x)-1}$ with $x = \frac{\nu}{T}\times\alpha$       (2)
where $\alpha =\frac{h}{k}$ with $h$ Planck's constant and $k$ Boltzmann's constant as another universal constant, with the property that 
  • $C(\nu ,T)\approx 1$ for $x<<1$ and $C(\nu ,T)\approx 0$ for $x>>1$.  
We see that the radiation intensity proportional to $T$ increases quadratically with $\nu$ in accordance with deterministic wave mechanics, and reaches a maximum shortly before a cut-off scaling with $T$ in accordance with statistics of energy quanta, which kicked off the idea of atom physics as quantum mechanics, also based on statistics.    
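A minimal sketch locating this maximum, writing (1)-(2) in the dimensionless variable $x$ so that $\gamma$, $\alpha$ and $T$ are scaled out:

```python
import numpy as np

# Sketch of (1)-(2) in the dimensionless variable x = alpha*nu/T, where
# E = gamma*T^3/alpha^2 * x^3/(exp(x)-1), so only the shape in x matters.
x = np.linspace(1e-6, 20.0, 200_000)
shape = x**3 / np.expm1(x)

x_peak = x[np.argmax(shape)]
print(x_peak)   # ≈ 2.82: the maximum sits shortly before the exponential cut-off
```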

Computational Blackbody Radiation offers a different version of high frequency cut-off motivated by finite precision physics/computation instead of statistics of quanta opening to a deterministic form of atom physics as real quantum mechanics. The underlying physics model in both cases is that of an atomic lattice capable of generating a continuous spectrum of vibrational frequencies.

The basic assumptions behind a Planck spectrum as an ideal are:
  1. Model: Atomic lattice.
  2. Equilibrium: All frequencies take on the same temperature.
  3. High-frequency universal cut-off: Statistics of energy quanta.  
Observations show that most real blackbody spectra deviate substantially from the Planck spectrum and so have their own signature, reflecting a specific atomic lattice, non-equilibrium and a specific high frequency cut-off lower than the ideal. Graphite is just about the only substance showing a Planck spectrum. 

This was not welcomed by physicists in search of universality, and so the idea was born of determining the spectrum of a given material/body by putting it inside an empty box with graphite walls and measuring the resulting radiation peeping out from a little hole in the box, which not surprisingly showed a graphite Planck blackbody spectrum. 

Universality of radiation was then established in the same way as universality of shape can be attained by cutting everything into cubical shape, as was done by the brave men cutting paving stones out of the granite rocks of the West Coast of Sweden, which is nothing but man-made universality.  

The line spectrum of a gas is even further away from a blackbody spectrum. The idea of CMB as an afterglow of a young Universe gas cloud with a perfect Planck blackbody spectrum, as measured by the FIRAS instrument on the COBE satellite, serves as a corner stone of current Big Bang + Inflation cosmology. 

It is not far-fetched to suspect that also the COBE spectrum is man-made, and then also Big Bang + Inflation.

lördag 20 april 2024

Can Cosmic Microwave Background Radiation be Measured, Really?

The Cosmic Microwave Background radiation CMB is supposed to be a 14 billion year after-glow, with perfect Planck blackbody spectrum at temperature $T=2.725$ Kelvin K, of a Universe at $T=3000$ K dating back to 380,000 years after Big Bang. The apparent 1000-fold temperature drop from 3000 to 3 K is supposed to be the result of an expansion and not cooling.  

To get an idea of the magnitude of CMB, let us recall that a Planck spectrum at temperature $T$ stretches over frequencies $\nu\sim T$ and reaches maximum radiation intensity $E\sim T^3$ near the end, with a high frequency cut-off over an interval $\frac{\nu}{T}\sim 1$ (notice exponential scale):



 

The $10^3$-fold temperature drop thus corresponds to a $10^9$-fold decrease of maximum intensity and a $10^3$-fold decrease in spectrum width. Intensity over width decreases by a factor $10^6$ as a measure of precision in peak frequency. 
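A minimal sketch of this scaling, using the Planck form of the previous post with $\gamma=\alpha=1$ as illustrative normalisations:

```python
import numpy as np

# Sketch: for E = gamma*T*nu^2*x/(exp(x)-1), x = alpha*nu/T (gamma = alpha = 1),
# compare peak intensity and peak location of the 3000 K and 3 K spectra.
def peak(T):
    nu = np.linspace(1e-6, 30 * T, 200_000)
    E = T * nu**2 * (nu / T) / np.expm1(nu / T)
    return E.max(), nu[np.argmax(E)]

E_hot, nu_hot = peak(3000.0)
E_cold, nu_cold = peak(3.0)
print(E_hot / E_cold, nu_hot / nu_cold)   # ≈ 1e9 in intensity, ≈ 1e3 in width
```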

We understand that to draw conclusions concerning a 3000 K spectrum from a measured 3 K spectrum requires very high precision, on the level of a microKelvin or 0.000001 K. Is this really possible? Is it possible to pin down the temperature 2.725 K to such precision from the intensity maximum? 

Why is modern physics focussed on measuring quantities which cannot be measured, like ghosts?

CMB was first detected as noise, maybe from birds visiting the antennas, but the noise persisted even after the antennas were cleaned, and then the conclusion was drawn that CMB must be a left-over from a Big Bang 14 billion years ago and not from any birds of today. Big Bang is physics, while birds are ornithology. 

fredag 19 april 2024

The Ultra-Violet Catastrophe vs 2nd Law of Thermodynamics


Classical physics peaked in the late 19th century with Maxwell's equations aiming to describe all of electromagnetics as a form of continuum wave mechanics, but crumbled when confronted with the Ultra-Violet Catastrophe UVC of heat radiation from a body of temperature $T$ scaling like $T\nu^2$ with frequency $\nu$, threatening to turn everything into flames without an upper bound for frequencies, because wave mechanics did not seem to offer any escape from UVC.  
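A minimal sketch of the divergence (with $\gamma=T=1$ as illustrative units):

```python
import numpy as np

# Sketch of the UVC: without a high-frequency cut-off, the total radiated energy
# ∫ gamma*T*nu^2 d(nu) grows like the cube of the upper frequency limit
# (gamma = T = 1 are illustrative units).
for nu_max in [1e1, 1e2, 1e3, 1e4]:
    nu = np.linspace(0.0, nu_max, 100_000)
    d_nu = nu[1] - nu[0]
    total = np.sum(nu**2) * d_nu          # ≈ nu_max^3 / 3
    print(nu_max, total)                  # no upper bound, hence "catastrophe"
```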

Planck took on the role of saving physics from the looming catastrophe, but could not find a resolution within deterministic wave mechanics, and so finally gave up and resorted to statistical mechanics with high frequencies less likely, in the spirit of Boltzmann's thermodynamics and 2nd Law with order less likely than disorder. 

There is thus a close connection between UVC and the 2nd Law. Boltzmann would say that the reason we do not experience UVC is that high frequencies are not likely, but the physics of why is missing. Explaining that UVC is not likely would not explain why there is no observation of UVC whatsoever. 

I have followed a different route, replacing statistics by finite precision physics for UVC (and similarly for the 2nd Law), where high frequencies with short wave lengths cannot be radiated because finite precision sets a limit on the frequencies an atomic lattice can carry as coordinated synchronised motion. In this setting UVC can never occur.

A basic mission for a 2nd Law is thus to prevent UVC. This gives the 2nd Law a deeper meaning as a necessary mechanism preventing too fine structures/high frequencies from appearing and causing havoc. The 2nd Law is thus not a failure to maintain order over time, but a necessary mechanism to avoid catastrophe from too much order. 

Similarly, viscosity and friction appear as necessary mechanisms destroying fine structure/order in order to let the World continue, and not only as defects of an ideal physics without viscosity and friction. This is the role of turbulence as described in Computational Turbulent Incompressible Flow and Computational Thermodynamics.

We can compare with the role of the interest rate in an economy, where the zero interest rate of an ideal economy leads to catastrophe over time. If there is no cost of getting access to capital, any crazy mega project can get funding and catastrophe follows. This was the idea 2008-2023 preceding the collapse predicted for 2025. Too little friction makes the wheels turn too fast. Too much idealism leads to ruin.