
# Expanded Maxwellian Geometry of Space

## Description

A causality-based model intended to resolve the issues left open by Special Relativity and General Relativity, by integrating a relativistic effect in nucleons not taken into account by either theory, and to reconcile Quantum Mechanics with causality.


Electromagnetic Mechanics of Particles (EMM)

Description of a new space geometry that explains the existence of all stable, physically scatterable particles, relativistic velocities and gravity solely from Maxwell's electromagnetic theory, and that consequently opens a new and very promising avenue of research.

Even though it defines an electromagnetic mechanics of particles that paradoxically requires neither underlying fields nor a vacuum medium of any sort, this integration remains in perfect harmony with the equations of Special Relativity, Quantum Mechanics and Quantum Electrodynamics, with minor adjustments.

All electromagnetic properties usually associated with underlying fields become direct properties of each individual elementary particle.

While classical, relativistic and quantum mechanics describe the motion of bodies and particles, this electromagnetic mechanics proposes a description of their fundamental nature, an explanation of the cause of their motion, and the reason why they naturally tend to self-propel at constant velocity and self-guide in a straight line when no external force is acting on them, on top of possibly revealing a new and apparently inexhaustible source of energy.

A new physics that provides a quite unexpected solution to the magnetic-monopole issue and naturally explains, in a novel manner, light deflection, atomic stability, the electron magnetic moment "anomaly", so-called time dilation, the slowdown of Earth's rotation rate, the expansion of the Moon's orbit, the precession of Mercury's orbit, and the two so-called anomalies in the trajectories of the Pioneer 10 and 11 spacecraft. It does so by integrating a relativistic effect not yet taken into account in SR and GR, namely nucleon dilation, which results in a variation of nucleon rest mass as a function of the local gravity-gradient intensity.


## Book Preview

### Expanded Maxwellian Geometry of Space - Andre Michaud

*Foreword*

*A photon is a shimmering butterfly escaping from the chrysalis of the atom.*

Pierre Rousseau, 1941

Without any pretense of being complete or free of failings, this book is the end result of long years of research, reflection, verification with numerous physicists, and of an exhaustive exploration of the literature in search of the required confirmations.

I am particularly grateful to my son, Jean-François, whose deep understanding of the model often triggered long discussions that led to clarifying many details that could otherwise have been left out, particularly regarding the description of the carrier-photon.

I am also immensely appreciative of causalist physicist Paul Marmet's outstanding work that led to his remarkable discovery of the direct relation that exists between the varying magnetic mass of moving electrons and relativistic velocities (see **Chapter 13**), without which the intermediate discrete-fields equation set presented in **Chapter 25** could not have been developed.

My sincere thanks also to Russell Bagdoo, who put at my disposal his extensive knowledge of Maurice Allais' experimental results regarding solar eclipses and lunisolar periodicities, described in **Section 29.8**.

Since the purpose of this book is to describe a new fundamental space geometry derived from Maxwell's theory, and to highlight how seamlessly it integrates both already-explained observed physical phenomena and many as yet unexplained phenomena at the particle level as well as at the astronomical level, the mathematical treatment has been designed to remain easily accessible even to beginners in physics.

Mechanics traditionally deals with the relations between force, motion and matter, dealing with energy almost as an afterthought. But considering that force does not directly cause motion, but induces energy which is the actual cause of motion, the present analysis will bring more attention to the major role played by energy in mechanics.

Students who are looking for more than the current status quo, and who may even be open to complete reconsideration, will have the surprise, as they integrate this model, of becoming aware of a novel and surprisingly simple possible alternate power source, vastly more efficient than nuclear fusion energy and very possibly technically easier to harness than nuclear fission energy (**Sections 21.8** and **21.9**).

It must be said however that this work will be meaningful only to those who are firmly convinced that fundamental elementary particles remain permanently localized, even as they move.

Of course, permanent localization and point-like behaviour of moving elementary particles during scattering experiments is quite at odds with the most popular current fundamental theory, Quantum Field Theory (QFT), defined as combining the features of QM and SR. However, the fact is that it would be more precisely defined as combining the features of the Copenhagen interpretation of QM and SR.

It must be said that major causalist discoverers such as Einstein, Planck, Schrödinger, de Broglie and others were quite opposed to this interpretation introduced by Bohr and Heisenberg that negated all possibilities of causality at the fundamental level.

While Bohr and Heisenberg considered that Quantum Mechanics "**governed**" all aspects of objects at the fundamental level, which then could not have any intrinsic properties other than those that could be determined with measuring devices, Einstein et al. considered that Quantum Mechanics only "**described**" objects, presumably incompletely pending further discoveries, and that they could have as yet undiscovered intrinsic properties.

These causalists considered that the localization imprecision of moving particles with the equations of QM was a characteristic of the method, not of the underlying reality. Their view did not preclude at all the possibility of permanent localization of moving particles, thus of point-like behavior during scattering in physical reality. See **Section 3.6** for more on the causality issue.

Many attempts have been made to unify the various field theories into some final unified field theory. Of course, any such unification could apply to physical reality only inasmuch as the fields considered, such as the currently popular hypothetical Higgs field, really exist.

Regarding Special Relativity, which is the other foundation of Quantum Field Theory and has been the cornerstone of fundamental physics for the past century, this analysis will show that it could also be found lacking once the theory takes into account the fact that nucleons are not elementary, but are structures made up of interacting charged and massive elementary particles (up and down quarks), which puts in question the assumption that the rest mass of such complex particles (nucleons) could be universally invariant.

Regarding the electroweak force, which results from the unification of electromagnetism and the weak force and is an important component of QFT, there may be reason to question its validity, because despite the general belief, the magnetic aspect of electromagnetism has never been *completely* clarified at the level of localized elementary particles, and a complete understanding of this critically important aspect of electromagnetic energy could possibly render this unification moot and unrequired.

For example, is the magnetic aspect of energy (say, of a single photon) static, as charge is in electrons, or oscillating at the observed frequency of the photon's energy? We will analyze the implications in **Appendix A**. What about the charge associated with this magnetic energy of a photon (related to displacement current)? How could a frequency be observed for this photon if both its electric and magnetic aspects were static?

Since a photon of energy 1.022 MeV or more can split into a pair of unit-charge massive particles (one electron and one positron), and these two can convert back to the photon state, there has to be identity of characteristics between the electric and magnetic aspects of the energy making up the rest masses of these unit-charge particles and those of the energy of free photons.
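The 1.022 MeV threshold is simply twice the electron rest energy, since an electron and a positron have equal mass. A quick check using the standard CODATA value (not taken from this book):

```python
# Pair production requires the photon to supply at least the combined
# rest energy of the electron and the positron (equal masses).
ELECTRON_REST_ENERGY_MEV = 0.511  # electron rest energy, CODATA value (rounded)

threshold_mev = 2 * ELECTRON_REST_ENERGY_MEV
print(f"{threshold_mev:.3f} MeV")  # 1.022 MeV
```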

Also, since up and down quarks (the scatterable inner components of protons and neutrons) are charged, they must also have a magnetic aspect, and so there also has to be identity between the characteristics of their electric and magnetic aspects and those of electrons, positrons and photons!

What space geometry would then be required to allow such identity? Could it be that the popular Minkowski 4D spacetime geometry turns out to be too restrictive to allow clear understanding of the magnetic aspect of energy at the level of localized elementary particles?

We will explore here the fantastic landscape that unfolds as we analyze an expanded space geometry that allows logical existence of this identity, a clear and logical description of the magnetic aspect of energy at the elementary localized particles level and a coherent conversion process between these various states of energy.

To more easily position this new model with respect to Newton's gravity theory and the General Relativity Theory, let's say that if Newton's gravity theory corresponds to television 480P resolution, then GR corresponds to 720P resolution and this new model, by integrating a relativistic process at play in nucleons not taken into account by SR and GR, will correspond to full 1080P resolution.

*2 Speech in Plenary Session*


*Text of the speech given in plenary session of the International Congress on Physics PHYSICS-2000, on July 7, 2000, when this expanded space geometry was first formally presented* (**[17]**, pages 291 to 310).

From my perspective, a physical theory must be elaborated exclusively from experimentally obtained information, like for example, Newton's theory, Maxwell's Theory, and Planck's concept of the quantum of action.

**2.1 Identifying the Restricted Set of Stable Particles**

In the course of the past 100 years, all massive, elementary and stable particles that can be non-destructively scattered against, namely the electron, the positron and the up and down quarks, as well as all of their physical properties, have been clearly identified.

Some may feel surprised at finding neither proton nor neutron included in this category. The reason is that these baryons are not elementary, but are made up of three smaller scatterable particles in motion, which have been proven to be elementary and stable, that is, the up and down quarks (**uud** for the proton and **ddu** for the neutron).

Also, the deep non-destructive inelastic back-scattering observed during scattering experiments carried out with incoming bullet electrons in 1968 at the SLAC accelerator proves without any doubt that up and down quarks are only marginally more massive than electrons (**[55]**).

It will become obvious further on why only this very restricted set needs to be considered, and also why **photons**, despite being massless, must also be included in the set.

Special attention will also be given to **neutrinos**, despite the fact that they have no charge, no mass and are not directly scatterable.

See **Appendix C** for a complete analysis of the set of all particles.

The observation that gravitation has not yet been explained, in spite of these verified findings, led me to suspect, like many others, each for a variety of reasons, that something very fundamental may have been misunderstood or neglected in fundamental physics.

This suspicion caused me to reconsider the accepted space geometry and to carefully re-examine which properties of particles had truly been experimentally confirmed, which eventually led to the elaboration of this theoretical solution, based exclusively on objectively verified properties of the set of elementary stable particles.

Retrospectively, I find that the acceptance as physical theories, instead of as simple handy mathematical tools, of Quantum Electrodynamics, which introduced the idea of virtual photons, and of its direct offshoot Quantum Chromodynamics, which extended the concept of virtual photons to virtual particles, was a determining factor in the neglect, during the past half century, of the importance of the Coulomb interaction at the fundamental level. It generalized the perception that pseudo-quantized virtual entities could physically represent the Coulomb potential progressively induced between real electromagnetic particles during scattering events, as an inverse function of the square of the distance between the interacting particles.

**2.2 Loss of Interest for the Temporal Sequences of Events**


I also believe that the general acceptance of the static Lagrangian method instead of the dynamic Hamiltonian method, as suggested by Feynman in the framework of his definition of QED, was instrumental in a loss of interest in the fact that non-destructive scattering and destructive collisions between particles are precise temporal sequences of events.

Feynman's conclusion that the use of the Hamiltonian method was forcing the adoption of a field viewpoint rather than an interaction viewpoint is unfounded in my view, because it can easily be argued that a relative interaction between particles can be nothing other than simultaneous and mutual, just as with the field viewpoint.

If, as Feynman suggests, the Coulomb interaction is mediated solely by an exchange of virtual photons between the particles involved, the following questions jump to mind:

(1) How can mutual interaction be initiated if virtual photons are the sole mediators of the interaction, since a photon must first be emitted by one particle before it can be absorbed by the other?

(2) What causes one of the particles involved to emit the first virtual photon of the exchange, and how is the direction of its emission determined?

**2.3 Any Electromagnetic Interaction Involves both Force and Energy**

I see an obvious causality problem here since mediation of the exchange by virtual photons as proposed by Feynman bundles together two fundamentally very different aspects of the relation between particles:

(1) The Coulomb force proper, also known as the electrostatic force, whose nature and origin are still a mystery, which acts permanently between charged particles as a function of the inverse square of the distance between them, and which is the cause of the acceleration of the particles towards, or away from, each other (or, more precisely, the cause of the induction of kinetic energy in the particles, an energy vectorially directed towards the other particle in case of attraction and in the opposite direction in case of repulsion, and that naturally manifests its presence as a change in the velocity of the particles when external electromagnetic constraints do not partially or completely prevent such a change);

(2) And the kinetic energy that the Coulomb force progressively induces in the particles as a function of the distance between them, whether or not a change in velocity occurs, and that acts as carrying energy for the particles, and whose quantized form constitutes the very substance of all existing particles.

It can effectively be postulated that kinetic energy may have a physical existence as a substance. This energy, vectorially directed in the direction mandated by the causal force, would then tend to cause motion as a function of the accumulated quantity, being expressed as a velocity of the related particles, which in turn would give rise to their inertia; or alternately, if the local electromagnetic equilibrium makes it impossible for this velocity to be expressed, give rise to a pressure being applied (weight) against other particles in immediate contact with this particle, as occurs between bodies lying at the surface of the Earth and the Earth itself.

There is also the issue that virtual photons are by definition discrete quantities, which implies that the potential would be induced in discrete increments between the particles, which directly contradicts the fact that quantities of translational kinetic energy are induced at any distance as a function of the infinitely progressive inverse square law of the distance.
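The continuity that this argument invokes is just the standard behavior of Coulomb's law: the force varies smoothly with separation, with no discrete steps at any distance. A minimal numerical sketch using the textbook constants (not specific to this model):

```python
# Coulomb's law: the force between two elementary charges falls off
# continuously as 1/r^2, with no discrete increments at any separation.
K = 8.9875517923e9    # Coulomb constant, N*m^2/C^2
Q = 1.602176634e-19   # elementary charge, C

def coulomb_force(r: float) -> float:
    """Force magnitude (N) between two elementary charges separated by r (m)."""
    return K * Q * Q / r**2

# The inverse-square law: halving the distance quadruples the force,
# at any scale whatsoever.
r = 1.0e-10  # roughly atomic-scale separation, m
assert abs(coulomb_force(r / 2) / coulomb_force(r) - 4.0) < 1e-9
```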

**2.4 The Importance of Temporal Sequences of Events**


Scattering and collision events not actually being physically instantaneous, there are good reasons to question Feynman's opinion when he declares, and I quote:

"*In many problems, for example, the close collisions of particles, we are not interested in the precise temporal sequence of events. It is of no interest to be able to say how the situation would look at each instant of time during a collision and how it progresses from instant to instant*." (**[6]**, p. 771)

Needless to say, I deeply disagree with Feynman on this issue, because this research philosophy has induced the respectful following generations of physicists to refrain from exploring, for the past 50 years, the only remaining unexplored frontier in fundamental physics.

A further argument supporting the view that virtual photons cannot be a physical reality is Bohr's own observation that since the quantized states of electrons in atoms are stable, these states can be accompanied by no radiation whatsoever, and that the existence of such non-radiating states conforms to the idea of quanta stability (**[1]**, p. 134). See also **Sections 23.8** and **23.9** about this issue. I also fully agree with de Broglie that the existence of quanta implies an inferior limit of a very special nature to the perturbations that can exist in the systems considered (**[1]**, p. 20).

We will explore in **Appendix A** a promising avenue that could lead to a possible mechanical explanation of the stability of electronic and nuclear quantized states, one that naturally stems from the model that we will explore in this work.

Consequently, the perception that radiation could be emitted by electrons on rest orbitals, for example, by means of virtual photons or otherwise, appears to be in direct contradiction with the very foundation of quantum mechanics, because Bohr's and de Broglie's observations imply that some threshold of local excess intensity of energy relative to the local electromagnetic least action equilibrium level must be reached or exceeded before a photon can be emitted.

Between local least action equilibrium and this intensity threshold, the excess energy can only cause local oscillation but when this threshold is reached, conversion to photon state initiates and evacuation from the system considered of the energy locally in excess will allow local equilibrium to be re-established.

In view of the possible existence of such a relative quantization threshold (or separation threshold), the physical existence of virtual photons as mediators of an interaction which results in the induction, between the particles involved, of an energy unquantized by definition, simply because its local relative intensity has not yet reached that threshold, appears highly questionable except as a convenient mathematical artifact.

**2.5 The Need to Redefine Fundamental Space Geometry**


The space geometry that must underlie Maxwell's Theory requires that a magnetic field and an electric field orthogonally intersect each other while both of them intersect at right angle the direction of motion of any point on the wave front of any electromagnetic wave in three dimensional space.
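The mutual orthogonality just described is, for a plane wave with propagation vector $\mathbf{k}$, the standard transversality condition of Maxwell's theory:

$$
\mathbf{k}\cdot\mathbf{E} = 0, \qquad
\mathbf{k}\cdot\mathbf{B} = 0, \qquad
\mathbf{B} = \frac{1}{c}\,\hat{\mathbf{k}}\times\mathbf{E}
$$

That is, $\mathbf{E}$ and $\mathbf{B}$ are perpendicular to each other and to the direction of motion, which is the premise the argument below builds on.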

Given that such a wave, if it existed as such, would mandatorily expand spherically in space from its point source, the possibility came to light, on the assumption that the nascent wave already possessed all electromagnetic characteristics even at this point source, that the orthogonally intersecting magnetic and electric fields envisioned by Maxwell might well not reside within our tridimensional space at this very point source.

Given that it seems impossible to visualize more than 3 dimensions at a time, I took to the habit of mentally folding the 3 dimensions of normal space as if they were the ribs of an open three-rib metaphorical umbrella, in my attempts to more easily visualize an expanded space geometry that would allow for this possibility. This allowed visualizing the folded *normal space* umbrella as the major X-axis of some sort of expanded space geometry, giving rise to the possibility of a major Y-axis representing a folded space for the electric field and a major Z-axis representing a folded space for the magnetic field.

The further step of extending the folded umbrella idea to the other two major axes was easily taken, thus defining an intriguing new geometry of three orthogonally coexisting spaces, each internally possessing 3 dimensions: a metaphorical mental *Rubik's cube* that I became very fond of playing with, mentally opening and closing the umbrellas one at a time as needed to continue being able to easily visualize the whole geometry. Of course, this mental opening and folding of *umbrellas* has no impact on the real spaces that would be represented; they would be permanently open and fully extended at all times in reality.

It is to this space geometry that I then undertook to relate all verified properties of elementary scatterable particles, each of which is the focus of a local occurrence of intersection of these three spaces, and each separated from all others by total vacuum.

**2.6 De Broglie's Dynamic Photon Internal EM Structure**

The actual element of information that triggered the chain of reasoning leading to the solution proposed here, is a conclusion by Louis de Broglie that photons must be made up, not of one corpuscle, but of two corpuscles, or half-photons, that would be complementary like the electron is complementary to the positron.

I eventually came to clearly visualize in this expanded space geometry a possible mechanics of how a photon with energy 1.022+ MeV could convert to an electron/positron pair as experimentally confirmed in the 1930's. Eventually, a plausible mechanics of interaction of electrons and positrons took shape that provided a key to understanding how protons and neutrons could come into being.

The end result was a seamless series of clearly defined interaction sequences providing an uninterrupted path of causality:

1) from the unquantized quantities of translational kinetic energy induced in particles by natural acceleration due to Coulomb or gravitational interaction;

2) to the quantization, in the form of free-moving electromagnetic photons, of any quantity of this energy in excess of the precise amount required by the local stable or metastable electromagnetic equilibrium;

3) to the creation of electron-positron pairs from the destabilization of photons of energy 1.022 MeV or more;

4) to the creation of protons and neutrons from the interaction of electrons and positrons forced into groups of three, including both types, in sufficiently small volumes of space with insufficient energy to escape mutual capture;

5) to the final shedding, in the form of neutrino energy, of momentary metastable excess rest mass (different from velocity-induced momentary extra relativistic mass) as overexcited massive elementary particles are forced by the local electromagnetic equilibrium into lowering their mass to their lower and stable true rest mass.

This 9-dimensional 3-spaces geometry turns out to be the most restricted reference frame that would allow such a clearly defined causality sequence.

**2.7 Harmonization of Established Theories**


The most surprising outcome, however, appears to be a confirmation of Newton's Gravitational Theory in a more precise relativistic form which, by replacing the Newtonian concept of *point-like particles* with that of *charged point-like particles*, provides an alternate explanation to that of General Relativity for the Newtonian error in the calculation of the perihelion advance of Mercury, for the correct calculation of the gravitational deflection of photon trajectories by stars, and for the increase in frequency of cesium atoms in cesium clocks with altitude; provides a gravitational solution to the observed unexplained anomalous constant residual acceleration directed towards the Sun and the unexplained rotation anomaly observed for Pioneer 10/11 (**[7]** and **[38]**, p. 23), and to the unexplained anomalous planetary flybys of Galileo, Ulysses and other spacecraft (**[43]**), all of them related to inertial hyperbolic trajectories (see **Chapter 29**); and finally provides a workable solution to the problem of the corona's excessive heat.

Surprisingly, this solution draws a natural bridge between Maxwell's electromagnetic theory, which it confirms in a manner allowing it to directly describe photons, the Coulomb interaction, and Newton's gravitational theory upgraded to relativistic status, and it refocuses in a new perspective the bulk of accepted orthodox theories, namely Special Relativity, Quantum Mechanics and Quantum Electrodynamics, as well as many of the postulates that are now taken for granted.

**2.8 Unresolved Issues**

A number of unresolved issues will be addressed in this work. For example, de Broglie's important conclusion regarding a possible internal structure for photons, which, in conjunction with Abraham and Kaufmann's discovery regarding the insensitivity of translational kinetic energy to any force applied transversally (see **Section 24.2**), seems to be the very key to building the last missing causality link between the kinetic energy that accumulates by means of electromagnetic acceleration of particles and the energy that up and down quarks have to be made up of.

Note that the increase in kinetic energy through acceleration has traditionally been treated in correlation with the notion of a reciprocal decrease in potential energy, so that the *Principle of Conservation of Energy* is respected. This principle will be analyzed in **Section 16.2**.

Special Relativity on its part has not yet been adapted to account for the internal adiabatic contraction and expansion of complex particles such as protons and neutrons as a function of the local intensity of electrostatic interactions between them and elementary charged components of surrounding matter (other up and down quarks) and of the impact of this interaction on the local rest mass of these complex particles as a function of the local density of surrounding matter (the local intensity of gravity, aka the gravitational gradient). SR still deals with protons and neutrons as if they were rest mass invariant elementary particles!

Could this be why no one can currently properly calculate the trajectories of Pioneer 10 and 11, even with the General Relativity equations (**[38]**), a theory that is supposedly the final word on all observed inertial gravitational phenomena? Moreover, data gathered for other spacecraft definitely hints at the possibility that this assumed anomalous acceleration phenomenon is systematic and due to some aspect of fundamental reality not yet covered by traditional theories.

Presently, GR has been persistently researched, to no avail for the past decade, for ways to account for two distinct so-called anomalies observed regarding the Pioneer spacecraft: one pertaining to a seldom-documented so-called *anomalous loss of angular momentum* about their spin axis (**[38]**, p. 23), which we will address in **Chapter 23**, and the other pertaining to a so-called anomalous acceleration directed towards the Sun as they escape the Solar system on their hyperbolic trajectories, which we will address in **Chapter 29**.

Could these failings of GR be due to the fact that SR (to which GR is intimately related) does not yet take into account the relativistic implications of the fact that protons and neutrons are not elementary, and that their momentary rest mass may well depend on the velocity of the quarks making up their structure, a velocity that itself depends on the local gravity-gradient intensity? This model will put in perspective how the effective rest mass of complex particles can be correctly integrated.

Other very well documented phenomena that SR and GR are unable to explain are the facts that the Moon is progressively receding from the Earth at a rate of about 3.8 cm per year, and that the Earth's rotation rate is higher in summer than in winter and steadily decreases from year to year, in a manner that no current theory can explain.
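The quoted 3.8 cm/yr recession rate (measured by lunar laser ranging) is tiny on human scales but accumulates steadily; a one-line unit conversion makes the long-term scale concrete:

```python
# Lunar recession: ~3.8 cm per year accumulates to ~38 km per million years.
RECESSION_CM_PER_YEAR = 3.8

km_per_million_years = RECESSION_CM_PER_YEAR * 1.0e6 / 1.0e5  # cm -> km
print(km_per_million_years)  # 38.0
```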

We will study in **Chapter 29** how these phenomena can possibly be explained by the same fundamental reason that explains the apparent anomaly in the trajectories of the Pioneer 10 and 11 spacecraft and the slowing down of atomic clocks. The present model in fact predicts that this so-called anomaly is no anomaly whatsoever, but a normal behavior of all small bodies moving in space.

It is well known that at short gravitational range such as the distance between Mercury and the Sun, GR has proven to be more precise than Newton's classical theory.

Both theories, however, have been proven to work out the same results for all translational motion of all *observable matter* at galactic and intergalactic range, at which all planets and stars behave as if they were point-like relative to each other, for both the luminosity and virial-theorem methods (**[23]**, p. 389).

**2.9 Observable Matter vs Dark Matter**


A note of interest here regarding *observable matter* at galactic and intergalactic ranges concerns the so-called *dark matter*. Let us clearly understand that observable matter at these ranges is exclusively matter whose luminosity is detectable, either by directly radiating detectable energy or by reflecting energy from these far galaxies.

In 1933, astronomer Fritz Zwicky observed that the mass of a cluster of far galaxies calculated from its luminosity, compared to the mass of the same cluster calculated from Einstein's GR theory, gave a much larger figure with the latter method (the virial theorem) than could be estimated from the luminosity alone. This observation gave birth to the theory that *dark* invisible matter must exist to explain the difference, if GR was to match reality.
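Zwicky's virial estimate can be sketched with the textbook order-of-magnitude formula M ≈ σ²R/G, where σ is the cluster's velocity dispersion and R its radius. The numbers below are illustrative round values, not Zwicky's actual data:

```python
# Virial-theorem mass estimate for a galaxy cluster: M ~ sigma^2 * R / G.
G = 6.674e-11       # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30    # solar mass, kg

sigma = 1.0e6       # assumed line-of-sight velocity dispersion, m/s (1000 km/s)
radius = 3.1e22     # assumed cluster radius, m (about 1 Mpc)

virial_mass_kg = sigma**2 * radius / G
# On the order of 10^14 solar masses, far above typical luminosity-based
# estimates for a cluster's stars alone: Zwicky's discrepancy.
print(f"{virial_mass_kg / M_SUN:.1e} solar masses")
```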

From the get-go, it is easy to observe that most of the matter in the solar system (from asteroids, planets, and interplanetary particles and debris of all sorts held captive by the Sun's magnetic field, up to and including the Oort cloud, as well as the interstellar dust and debris clouds of all natures that have accumulated since the beginning of the universe) does not emit light that would be observable from intergalactic distances, and the little light that it reflects is too weak to be detectable by our instruments. The mass of all that non-light-emitting matter would also not be computable from the luminosity of the Sun, not to mention the masking effect it would contribute by interfering with the Sun's light as observed from such distances.

Given the age of the Universe and the constant ejection of matter from stars and stars' coronas from their birth on (see **Chapter 21**), doesn't common sense at least suggest that the greater part, if not all, of the so-called missing mass (as calculated from luminosity) could simply be this very normal type of matter, undetectable at such distances simply because it is not hot enough to emit light or not reflective enough to register on our instruments?

But more exotic theories seemed much more appealing from the start to the community at large, which tended to prefer postulating the existence of large quantities of some hypothetical abnormal, unknown, undetectable and all-pervading extra dark matter, and the existence of a just as hypothetical and undetectable strange or dark energy, to explain the discrepancy.

Shouldn't the fact that the exotic particles conjured up in theory to replace normal matter as the constituents of dark matter are nowhere to be found on the Earth or in the Solar system, but would be plentiful far away where it is impossible to detect them, be a telltale sign that they may not exist at all?

Hasn't the time come to return to simple and logical explanations and to explore these avenues, which are much more promising in concrete results? Let us see where following the lead provided by the really scatterable existing particles detectable here on the Earth will lead us.

We will also see that the Higgs boson, which is a simple fleetingly existing parton that quickly decays into more stable particles like all other partons (See **Appendix C**), is not even required to explain the existence of mass, and that a much simpler explanation, directly stemming from simple energy inertia and electromagnetism, completely justifies the existence of mass.

**3 Maxwellian Space Geometry**

**3.1 Neglected Maxwellian Space Geometry**


Maxwell’s theory has traditionally been considered strictly from the mathematical viewpoint offered by his famous equations (**Section 7.7**) and understood within the restrictive perspective of plane wave treatment, leaving the space geometry that underlies it to be mostly taken for granted, since it is sufficient for the needs of the continuous wave concept, which in turn is sufficient for precise calculations at the general level. This classical space geometry is of course the traditional Euclidean 3-dimensional flat space geometry to which the time dimension is added to justify motion.

Just as the habit of using the electromagnetic tensor to refer to the electric and magnetic fields relation as a single electromagnetic field concept keeps out of permanent awareness the fact that both fields are of equal and separate importance in Maxwell's theory, with different and irreconcilable characteristics, the habit of using plane wave treatment leaves in the background the fact that the wavefront of the electromagnetic wave of Maxwell's theory can only be in spherical expansion from its point-like source, a point-like source confirmed by experimental reality.

Maxwell's theory is in fact the natural end result of the integration of many discoveries made previously. His first equation is Gauss' law for electricity; his second is derived from Faraday's law, his third from Gauss' law on magnetism, and his fourth is a generalization of Ampère's law. What Maxwell did, in fact, was unify into one coherent integrated theory all these experimentally confirmed laws that had not previously been clearly linked to each other.

But his really brilliant personal contribution was his success in mathematically linking Faraday's law and his modified Ampère's law in such a way that no doubt could remain that light was intimately linked to electricity and magnetism, as confirmed by Faraday's experiments on light polarization by magnetic fields. Linking them provided as a side benefit the only way ever devised to calculate light velocity from first principles, a velocity that is the only velocity possible from these equations, since it rests on the product of only two other fundamental constants, that is, the electric permittivity and magnetic permeability constants of vacuum.
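That derivation of a single possible velocity from the two vacuum constants can be checked in a few lines. The CODATA values below are standard reference figures, not values taken from this book:

```python
import math

# Standard CODATA values for the two vacuum constants that
# Maxwell's equations combine:
epsilon_0 = 8.8541878128e-12  # vacuum electric permittivity, F/m
mu_0 = 1.25663706212e-6       # vacuum magnetic permeability, H/m

# Maxwell's result: the propagation speed follows from these two
# constants alone, c = 1 / sqrt(mu_0 * epsilon_0).
c = 1.0 / math.sqrt(mu_0 * epsilon_0)

print(f"c = {c:.6e} m/s")  # close to the defined 2.99792458e8 m/s
```

No other input is needed: the wave speed is fixed by the constants themselves, which is the point the text makes.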

As already mentioned, a fundamental and thoroughly verified aspect of Maxwell's theory is the mandatory state of orthogonality that must exist between the electric and magnetic fields of free-moving electromagnetic energy, both fields also being normal to the phase velocity vector that identifies the direction of motion of any point considered on the wavefront of the spherically propagating wave. Experimental reality reveals that this triple orthogonality also applies to the motion of charged massive particles, such as electrons being forced to move in a straight line when subjected to equal density external electric and magnetic fields.

Indeed, any elementary textbook on electricity and electromagnetism explains how the vectorial cross product of an electric force and a magnetic force being applied to a charged particle can generate a velocity vector in a straight line, forcing that particle to move in a direction perpendicular to both forces, which is represented in classical electrodynamics, from the Lorentz equation, by this well known relation:

**F** = q(**E** + **v** × **B**)

To avoid confusion between the symbol E for energy and the very similar symbol **E** for electric fields, we will generally use **E** to symbolize electric fields and reserve E for energy (in joules or eV). By symmetry, we will use **B** for magnetic fields.

The following orthogonal bases will be used in this book: **a)** the 3-D rectangular coordinate system and corresponding rectangular vectorial base, and **b)** the corresponding rectangular electromagnetic fields vs velocity vector base

or rather, in the present context, under the form of a vectorial cross product:

**F** = q**v** × **B**, whose magnitude is F = qvB sin θ

and since θ must be equal to 90° by definition in the present case:

F = qvB
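The straight-line condition just described is the classic velocity-selector arrangement: the electric force qE cancels the magnetic force qvB, so only particles with v = E/B pass undeflected. A minimal numerical sketch, where the field values are illustrative assumptions and not figures from the book:

```python
# Velocity-selector sketch: a charge moves in a straight line when
# the electric force qE exactly cancels the magnetic force qvB
# (theta = 90 degrees, so sin(theta) = 1), i.e. when v = E / B.
q = 1.602176634e-19   # elementary charge, C

E_field = 3.0e4       # electric field, V/m  (assumed value)
B_field = 1.0e-3      # magnetic field, T    (assumed value)

v = E_field / B_field            # selected speed, m/s
F_electric = q * E_field         # transverse electric force, N
F_magnetic = q * v * B_field     # transverse magnetic force, N

# The two transverse forces balance, so the net transverse force is
# zero and the particle keeps its straight-line motion at v = E/B.
print(f"v = {v:.3e} m/s")
```

Any particle faster or slower than E/B feels a net transverse force and is deflected away, which is why the crossed-field geometry selects a single speed.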

It must also be clearly understood that, despite the precision of the calculations that Maxwell's theory allows at the general level, it is deemed unable to directly describe photons as discrete localized electromagnetic particles, since it is founded on the notion that electromagnetic energy is a continuous wave phenomenon.

**3.1.1 Discrete particles as the only possible support of electromagnetic properties**

Maxwell's theory, as a matter of fact, was designed to account for electromagnetic energy behavior at the macroscopic level without the need to take quantization into account (quantization had not yet been clarified in Maxwell's time). That is, it treats electromagnetic energy as a featureless energy density per unit volume, or a featureless energy flow per unit surface, rather than adding up the energy of localized moving electromagnetic photons enclosed in a unit volume or flowing through a unit surface, a treatment that would take localization into account and would represent just as well all observed electromagnetic phenomena at the macroscopic level.

Considering that the electromagnetic waves that Maxwell conceived of were meant to animate what was perceived from our macroscopic level as a still hypothetical underlying and all-pervading ether, then if some means were found to associate with each individual localized photon all of the electric and magnetic properties that characterize the electromagnetic wave of Maxwell's theory, this would remove the theoretical need for such an all-pervading supporting medium, whose purpose was to support continuous electromagnetic waves that we now know to be nonexistent.

Let us note also that a second theoretical use of the various forms of the concept of ether was for it to constitute the very substance that massive particles were made of, as singularities that developed in such all-pervading ether fields in a variety of theories. Now if the kinetic energy of which discrete localized photons are demonstrably made turns out to have a physical presence with a volume that can be measured, this would altogether remove the last reason to resort to the theoretical concept of ether as a basis for explaining the fundamental level of physical reality.

All the more so since it has been conclusively verified since the 1930s that massive electrons and positrons can be made by destabilizing electromagnetic photons containing at least 1.022 MeV of this kinetic energy (See **Section 9**). Head-on collision experiments between beams of electrons and positrons ([56]) even lead one to suspect that protons and neutrons could be stable adiabatic equilibrium states involving triads of electrons and positrons that could have interacted in such a way that they locally adiabatically accelerated until they reached these two ultimate and irreversible equilibrium bound states (See **Section 15**).

Of course, such a possibility seems at first glance to be in total contradiction with the Principle of conservation of energy, but a separate analysis reveals that it may not be the case (See **Section 16.2.2**). If confirmed, this discovery could explain the existence of all atoms in the universe.
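The 1.022 MeV threshold quoted above is simply twice the electron rest energy m_ec², since a photon must supply the rest masses of both the electron and the positron it materializes into. A quick check, using standard CODATA constants rather than values from the book:

```python
# Pair-production threshold check: the minimum photon energy is
# twice the electron rest energy, 2 * m_e * c^2, in MeV.
m_e = 9.1093837015e-31    # electron rest mass, kg
c = 2.99792458e8          # speed of light, m/s
eV = 1.602176634e-19      # joules per electronvolt

rest_energy_MeV = m_e * c**2 / eV / 1e6   # about 0.511 MeV
threshold_MeV = 2 * rest_energy_MeV       # about 1.022 MeV

print(f"threshold = {threshold_MeV:.3f} MeV")
```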

We must not forget either that even if the ether can be done away with, more and more data seem to indicate that here on Earth we are permanently immersed in a combination of interacting magnetic fields: the Earth's magnetic field moves through the immense magnetic field of the Sun, which reaches way beyond Pluto and also interacts with the local magnetic fields of the other planets of the Solar system; and finally, there seems to be little doubt that the global magnetic field of our galaxy also interacts with the Sun's magnetic field.

So, whatever the final solution turns out to be, it will mandatorily involve this all-pervading underlying medium in what we consider to be the total vacuum of space.

**3.2 The Issue of Intensity Conservation with Maxwell's Spherically Expanding Wave Concept**


This leads us to attempt to clarify why an acceptable description of electromagnetic photons as stable, permanently localized moving particles, in line with their point-like localization demonstrated at the moments of emission and capture more than a century ago, has not yet been successfully reconciled with the verified aspects of Maxwell's theory, particularly after Louis de Broglie elaborated his intriguingly promising hypothesis ([1]).

According to Maxwell's theory, the electric and magnetic aspects of a wave must of necessity always be in phase at the wavefront, as described in the following familiar representation, that is, both at maximum at the same moment for the wave to exist at all and propagate.

When both aspects are 90° out of phase, we obtain a standing wave.

But, as an intriguing dead end, if both aspects are set 180° out of phase, we end up with the exact equivalent of both aspects being in phase (Figure A)!

Also, it is the conjunction of both fields, in phase and at right angles to each other at all points of the wavefront, that is deemed to maintain the intensity of the energy of the wave at every point of the wavefront, despite the inherent spherical spread from the mandatorily point-like origin of such a wave. This issue is of course familiar to all in the physics community but is accepted as a yet unexplained and unavoidable axiom.
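The tension being pointed out can be made concrete: in the classical picture, a fixed emission energy spreads over a sphere of area 4πr², so the per-area intensity falls as 1/r², whereas a captured photon delivers its full quantum at any distance. A minimal sketch, with an assumed photon energy:

```python
import math

# Classical picture: energy E emitted from a point source spreads
# over a sphere of area 4*pi*r^2, so intensity falls as 1/r^2.
# A photon, by contrast, delivers its full quantum at any distance.
E_emitted = 3.2e-19  # J, roughly one visible-light photon (assumed)

def classical_intensity(r):
    """Energy per unit area on the expanding wavefront at radius r (m)."""
    return E_emitted / (4.0 * math.pi * r**2)

# Doubling the distance quarters the classical per-area intensity...
ratio = classical_intensity(2.0) / classical_intensity(1.0)
print(ratio)  # 0.25

# ...yet a detector that captures the photon still registers the
# full E_emitted, which is the discrepancy the text highlights.
```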

Mathematically speaking, when any local point of the curved spherical wavefront surface is considered, this surface can be locally approximated by a plane surface at the infinitesimal level, and this is the origin of the plane wave equations set.

But space being three-dimensional, such treatment with the plane wave analogy can only be a mathematical approximation obscuring the fact that if such an electromagnetic wave really physically existed, it could only be in spherical expansion from its initial point-like state, assuming unbounded isotropic expansion. So plane wave treatment applied to Maxwell's theory currently does not describe electromagnetic energy as it starts existing at its point-like source, but only after the expanding wave has begun to propagate.

Also, the geometry of such a spherically propagating wave would be much more similar to the spherical expansion of a sound wave from its point-like source in some underlying medium than to the propagation of waves on a plane liquid surface, which immediately comes to mind when thinking about plane wave treatment. It then becomes very difficult to accept the idea that the initial intensity of the point source of the wave could be arbitrarily multiplied in such a way that it could be measured as equal to the energy of the source at any point of the expanding spherical wavefront, at any arbitrary distance from the source, as plane wave treatment seems to allow.

So the habit of dealing with the state of orthogonality of both fields with respect to each other and to the direction of motion in space of any point on the already expanding wavefront always leaves in the background the fact that such a spherically expanding wave can only be a single electromagnetic event originating from a single point-like source.

**3.3 Applying EM Properties to Maxwell's Wave Point-like Source**


Now, considering that the electromagnetic event is a single event, could it not be imagined that after appearing at its point-like origin, it could be represented as remaining locally point-like as it starts moving, harmonically oscillating as it moves, which is what de Broglie's hypothesis implies, instead of spherically expanding as Maxwell's theory implies by definition?

This would involve a precise trajectory being followed by this electromagnetic event, which would then behave point-like from emission to capture, in total harmony with the verified fact of its point-like capture, however much time could have elapsed after it was emitted and whatever distance it could have covered before being captured. This would also directly explain why the initial intensity of this electromagnetic quantum is conserved, barring energy losses or gains through red or blue shifting due to gravitational interaction along the path that it would have followed.

The idea naturally comes to mind, then, that the state of fundamental orthogonality of both electric and magnetic fields could possibly be served just as well, if not better, by being defined with respect to the electromagnetic event immediately as it initially appears point-like, instead of after spherical expansion is already under way. But the apparently insurmountable issue with this approach in classical electrodynamics is the mathematically assumed infinite energy associated with such a point-like electromagnetic concept.

Another problematic issue comes to light with the