The Infinity Delusion

Ebook, 255 pages, 6 hours

About this ebook

If Richard Dawkins had decided to write a book about irrational beliefs in mathematics, this is the book that he would have written. It is a book that is both controversial and illuminating, and which forces us to take a fresh look at what we believe and why we believe it.

The book asks why people might believe that numbers ‘exist’, rather than simply being concepts of our minds. In particular, why should we believe in the existence of numbers that consist of the sum of an infinite number of other numbers added together?

If you think that this is an inconsequential philosophical question, then this book will make you think again. It presents powerful and lucid arguments that demonstrate that many mathematical findings actually depend on a hidden incorporation of such beliefs, and shows precisely how this can happen.

If you have any interest in how some of the strangest results of mathematics are generated, then this book is sure to fascinate you.

Language: English
Publisher: James R Meyer
Release date: Mar 3, 2014
ISBN: 9781906706036
Author

James R Meyer

I am interested in how little attention is paid to the limitations of the language when it is used to make statements that are supposedly logical. The consequences of this are particularly evident in mathematics, where there are theories that are based on the philosophy that numbers and other mathematical concepts are ‘actual’ things that exist independently of any physical reality. Such beliefs are commonly held on an almost subliminal level; most people have never taken the time to carefully examine the basis and the consequences of such beliefs. It is because of such beliefs that detailed considerations of language are ignored - the ‘actual’ non-physical reality is considered all-important - with the result that a detailed evaluation of the possibility of errors due to limitations of language is generally considered unnecessary.

Every statement has to be stated in some language. If assumptions are made that ignore some aspects of the language of the statement, then how can we be sure that the statement is entirely logical? In particular, when a statement refers in some way, either implicitly or explicitly, to some language, whether it is the language of the statement itself or some other language, there is a significant possibility of confusion. Unless every aspect of such statements is very carefully analyzed, a statement that superficially appears to be logical may actually contain subtle errors of logic. In my work, I show how such errors can occur and how we can avoid such errors by careful analysis of language.



    The Infinity Delusion

    How irrational mystical beliefs have

    hindered the progress of mathematics

    by

    James R Meyer

    eBook ISBN: 978-1-906706-03-6

    Version: 5th Edition: 2 June 2023

    Copyright © James R Meyer 2013 - 2023

    Cover Design © James R Meyer 2013

    All Rights Reserved

    www.jamesrmeyer.com

    Please respect the copyright on this book. Please purchase another copy if you would like to give someone a copy of this book. Copyright means that this book may not be copied, lent, resold, hired out, or otherwise given to any person other than the purchaser, by way of trade or otherwise, in any format unless the publisher has given written consent.

    The format of this book was tested on several devices and was found to display reasonably. However, please note that many eBook readers have limitations and not all devices will give an identical display. Also note that many eBook reader apps do not give an accurate rendition of the author’s formatting. Please note that if this eBook is converted into other formats, it may not display correctly.

    Dedication

    For my dear wife Ann, who has been an endless source of support and who has taught me so much about life.

    Contents

    1. Myth: Humans are Innately Logical Beings

    2. Is Mathematics Unreasonably Effective?

    3. Falsifiability

    4. Fantasy-mathematics

    5. ‘Pure’ and ‘Applied’ Mathematics

    6. The Strange Story of Platonism

    6.1. Non-Physical ‘Existence’

    7. A Short History of Numbers

    8. Sums of an Infinite Number of Fractions

    8.1. Sums of Positive and Negative Numbers

    9. Are some Infinities ‘Bigger’ than other Infinities?

    9.1. The Courant & Robbins Contradiction

    9.2. Further Absurdity

    9.3. Lebesgue Measure Theory

    10. What about Cantor’s Diagonal Proof?

    10.1. Some Preliminaries to the Diagonal Proof

    10.2. Real numbers and Language

    10.3. The Diagonal Proof

    10.4. The Secondary Argument: Indefinable Real numbers

    10.5. One-to-one Correspondences and Properties

    10.6. A List of Real numbers with no Diagonal Number

    10.7. The Platonist Assumptions attached onto the Diagonal Proof

    11. Finite and Infinite

    11.1. Different types of Limitlessness

    11.2. Potentially Limitless?

    11.3. Limits and Limitlessness

    11.4. The Continuum Hypothesis

    12. Fallacy: Natural set theory is inconsistent

    12.1. Russell’s paradox

    12.2. Natural Set Theory

    12.3. Fantasy Set Theory

    12.4. The Fantasy ‘Empty Set’

    12.5. Myth: Fantasy-Set theory contains all of Mathematics

    13. Truth

    13.1. The Origin of the Notion of Truth

    13.2. Mathematical Truth

    13.3. Fantasy-set theory, Platonism and ‘Truth’

    13.4. Absolute Truth

    14. Fantasy misnaming

    15. Epilogue

    Appendices

    Appendix A: Number Systems

    Appendix B: What is 0.999…?

    Appendix C: Cantor’s Original 1891 Diagonal proof

    Appendix D: Cantor’s First ‘Non-Listability’ Proof

    Appendix E: The Power Set Theorem

    Appendix F: A Hypothetical Translate Function

    Appendix G: Geometric Series

    Appendix H: Listing the rational numbers

    Appendix I: The Law of the Excluded Middle

    Appendix J: Surjections

    Glossary

    Suggested Reading

    Notes to the Text

    1. Myth: Humans are Innately Logical Beings

    ‘When dealing with people, remember you are not dealing with creatures of logic, but with creatures of emotion, creatures bristling with prejudice, and motivated by pride and vanity’, Dale Carnegie, 1888-1955, writer and lecturer.[¹]

    Many people believe that humans, unlike other creatures, are born with a keen ability to think in a logical and analytical way. But the reality is that analytical logical thought does not come naturally to humans; man evolved to survive in an inhospitable environment, not to solve complex analytical problems. Because of the common presumption to the contrary, many people who find that they have great difficulty in thinking in an entirely logical and analytical manner assume that they are in some way deficient, and they give up rather than persevere and learn the required knowledge and techniques. They do not realize that they are not alone in finding it difficult to think in a strictly logical way.

    Even those trained in logical thinking often fail to think logically, despite their years of experience in such thinking. They are quite capable of coming to illogical conclusions in the same way as the rest of humanity; there are numerous examples of trained logicians arriving at erroneous conclusions because of mistakes in their reasoning. They may be marvelous when it comes to manipulating complex symbols on a sheet of paper according to predefined rules, but that does not mean that they are immune to illogical thoughts.

    For the vast majority of the day, people don’t use precise logical thought processes as they go about their everyday tasks. Decisions are made without precise rules of logic being applied to the decision process. During the course of a day, hundreds of decisions are made. Because there are so many possibilities and so many unknowns, it is impossible to process such decisions in a reasonable time without taking certain shortcuts. That does not mean that such decisions are wrong. What it does mean is that being good at making everyday decisions does not necessarily make a person good at thinking logically and analytically.

    Alan Cromer, in his book ‘Uncommon Sense’,[²] challenges the conventional assumption that humans are innately logical beings, and concludes that purely objective thinking – that is, being able to think without being influenced by pre-formed opinions – is not an attribute bestowed on the human species by evolution. In fact, as Cromer points out, it is only in the past 2,500 years or so that there has been any sustained practice of rational thought. Around 332 BC, Alexander the Great founded the city of Alexandria and created a grand centre of learning there, and the foundations of modern rational thinking arose from that centre of learning. Had that not happened, the history of human civilization might have been very different. It’s quite possible, Cromer argues, that the technological world that we live in today, founded as it is on rational and scientific thinking, isn’t the inevitable consequence of human evolution that it is generally believed to be. In genetic terms, we are virtually identical to the humans of the era of Alexander the Great, and also to the humans who lived 5,000 to 20,000 years ago, long before Alexander. If Alexander had not established an environment in which learning and rational thought could be communicated efficiently between many humans, surviving over many generations, our modern technological society might not have developed as it did. Before Alexander, human knowledge was passed on piecemeal, without any organized structure.

    Before Alexander, there were many leaders who had as many resources as Alexander, but they didn’t make the leap that he did. By and large, kings and other rulers wanted to control and hold on to knowledge rather than disseminate it. In 332 BC, human evolution hadn’t reached a point where it was inevitable that some leader would do what Alexander actually did. Our technological and scientific civilization hasn’t developed inexorably along a smooth course from the time our primitive ancestors started to use sticks and stones as tools. The assumption that all our technological and scientific progress has developed because humans always think rationally and logically isn’t necessarily accurate – humans often don’t think rationally and logically. Throughout history, many technological advances have been made before there was a full understanding of the principles underlying those advances.

    Rational and logical thinking must be learnt and cultivated, and it requires a great deal of practice before anyone becomes proficient at it. If anyone really wants to be able to think logically, then they must be able to cast their preconceptions aside and free themselves of fixed ideas. They must be able to stand back and think dispassionately about those things that they always thought to be true. They must examine them closely, and they must ask themselves: ‘Can I give any evidence to support those beliefs, evidence that would stand up in a fair court of law?’ If they can’t, then it’s time for them to seriously question those beliefs. However, many people, when confronted with this challenge, become inhibited and uncomfortable and retreat to the safe position that they always knew. Bertrand Russell summed up this state of mind perfectly when he wrote:

    ‘If a man is offered a fact which goes against his instincts, he will scrutinize it closely, and unless the evidence is overwhelming, he will refuse to believe it. If, on the other hand, he is offered something which affords a reason for acting in accordance to his instincts, he will accept it even on the slightest evidence. The origin of myths is explained in this way.’ [³]

    There is a wonderful sense of liberation awaiting anyone who dares to experience the elation that comes from discovering things afresh. But they won’t be able to do so unless they make a determined effort not to think like the man that Russell describes. If they want to feel the liberation of discovery, they mustn’t be afraid to think; instead, they should revel in the freedom that results when they question their deeply held beliefs. As Thomas Jefferson, third President of the United States, once said:

    ‘Neither believe nor reject anything because any other persons have rejected or believed it.’ [⁴]

    2. Is Mathematics Unreasonably Effective?

    ‘All the mathematical sciences are founded on relations between physical laws and laws of numbers, so that the aim of exact science is to reduce the problems of nature to the determination of quantities by operations with numbers’, James Clerk Maxwell, 1831-1879, physicist and mathematician.[¹]

    Some sixty years ago, the physicist Eugene Wigner published an article with the title ‘The Unreasonable Effectiveness of Mathematics in the Natural Sciences’.[²] In the article Wigner claimed that, because mathematics is so successful as the basis of so many scientific theories, it would seem that all of mathematics must include some essential kernel of an underlying truth. Since then, Wigner’s notion seems to have found a following among people who wish to see a deeper meaning in mathematics. Many people will cite examples where a mathematical idea was invented, and many years elapsed before it was found to have an application in some real world scientific theory. They will point to those instances and say, isn’t that amazing!

    But is it? Is it any more amazing than when someone takes a vaguely worded astrological prediction in a magazine and discovers that it can be interpreted as applying to an actual real world scenario? At that point some people might think, well, isn’t that amazing – while conveniently forgetting all the other times when the astrological predictions had been hopelessly wide of the mark, and so ignoring the possibility of simple coincidence.

    So, is the notion that mathematics is unreasonably effective in real world scientific theories just another instance of seeing what you want to see – by attributing more importance to coincidences than is warranted – or is there something more to it? In fact, it doesn’t take long to discover that Wigner’s notion is irredeemably flawed – because it conveniently ignores all those mathematical ideas that have never found any applicability in real world scientific theories. Supporters of the notion can only point to the mathematical ideas that have successfully been used in real world science. But there are plenty of mathematical ideas (or perhaps that should be: ‘ideas that are called mathematical’) that have never found any application in real world science. For example, suppose for a brief moment that there was a theory of physics that was based on mainstream set theory.[³] In that set theory, there is a proof of a theorem called the Banach-Tarski theorem, which states that a sphere of a certain size can be divided into a finite number of pieces that can be reassembled into two spheres, each of the same size as the original sphere.[⁴] So, using that set theory, we would have a physical theory that states that a ball of a given size is equivalent to two balls of that same size. One wouldn’t call this theory a demonstration of the unreasonable effectiveness of mathematics in its applicability to the science of the real world. Indeed, we would remark on quite the opposite, and point out how far the theory is removed from reality.[⁵]

    If you look at it dispassionately, it’s quite easy to see that Wigner was simply choosing those instances where mathematics is effective and ignoring the huge mass of what is called ‘mathematics’ that simply isn’t scientifically effective. And the real reason why so much of mathematics forms the underlying basis of so many scientific theories is simply that the mathematics that has actually been used in real world applications is a mathematics that has grown from empirical observation of the real world.

    3. Falsifiability

    ‘At the beginning of this century a self-destructive democratic principle was advanced in mathematics, according to which all axiom systems have equal right to be analyzed, and the value of mathematical achievement is determined, not by its significance and usefulness as in other sciences, but by its difficulty alone, as in mountaineering. This principle quickly led mathematicians to break from physics and to separate from all other sciences. In the eyes of all normal people, they were transformed into a sinister priestly caste of a dying religion, like Druids’, Vladimir Arnold, 1937-2010, mathematician.[¹]

    Over the years there has been controversy over what it means when we say that we think that a scientific theory is correct. The philosopher Karl Popper (1902-1994) argued that scientific theories can never be considered to be unquestionably correct, and that they are inherently hypothetical. He argued that the only way we have of reckoning whether a scientific theory might or might not be correct is by testing the predictions of the theory against actual events. He went on to argue that, if you look at it logically, it doesn’t matter how many tests fit in with the predictions of the theory; you still can’t be absolutely sure that the theory will always and invariably give a correct prediction. On the other hand, a demonstration that a prediction of the theory is wrong is enough to show that the scientific theory isn’t completely correct.

    Popper’s concept of ‘falsifiability’ lay at the heart of his philosophy of science. He claimed that for a theory to be considered a valid scientific theory, it must be possible to perform tests on it – tests that could reveal an inconsistency between what the theory predicts and what actually happens in the real world. In other words, a theory can be considered scientific if and only if it is falsifiable.[²] Various attempts have been made to wriggle out of this limitation on scientific theories – unsuccessfully.

    While Popper’s arguments were directed at scientific theories that deal with real world scenarios, the fact is that mathematics originally began as a scientific theory. When humans began using numbers, they used them in real world situations, such as in simple counting and adding. Researchers have found that when primitive people are confronted with what we would consider extremely simple mathematics, they will not accept it until they are shown that it does apply to real world situations.

    But today it is the fashionable view in mathematical circles that mathematics is completely independent of the real physical world. A common modern view is that the only consideration that we need apply to a mathematical system is whether the system can or cannot result in a contradiction. A mathematical system that can never result in a contradiction is called a consistent system. Clearly an inconsistent mathematical system is undesirable. However, most of today’s mathematicians consider any other aspect of a mathematical system to be immaterial – as long as a mathematical system never produces a contradiction, they will claim that no such system is ‘better’ than another, and that all consistent mathematical systems are equally ‘good’.

    An essential part of all mathematics is the idea of a proof. A mathematical proof results when you start off with a certain number of initial assumptions and then, by a sequence of logical steps, arrive at a conclusion. Since all the steps taken in arriving at the proof are completely logical, the conclusion that you arrive at must be absolutely inevitable (provided that everything is clearly defined and your mathematical system is consistent). And that makes a mathematical proof somewhat special. Every mathematical proof is based on a set of assumptions, and the result of the proof develops from that set of assumptions by a sequence of logical steps. A proof is considered to be correct if there are no mistakes in any of these logical steps. But simply because there are no mistakes in any of the steps of the proof does not mean that the proof is applicable to real world situations. To be able to say that the proof applies to a real world situation, you must be able to show that your mathematical system is a good model of that situation. That means that you should be able to make predictions using your mathematical theory that can be shown to apply to the real world. And if you can do that, then your claim that your mathematical system is a good model of a real world situation is testable. In that case, Popper’s concept of falsifiability does apply – it applies to any result that arises from a mathematical model of a real world physical situation.
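
    To make the idea of a proof as a sequence of logical steps concrete, here is a minimal sketch written in the Lean proof language (version 4 syntax); any formal system would serve equally well, and the names P, Q, R, h1 and h2 are purely illustrative. The assumptions are stated explicitly, and the conclusion follows from them by two explicit logical steps:

        -- A minimal sketch: the assumptions h1 and h2 are stated explicitly,
        -- and the conclusion P → R follows from them by two logical steps,
        -- with nothing hidden.
        theorem chain {P Q R : Prop} (h1 : P → Q) (h2 : Q → R) : P → R :=
          fun p => h2 (h1 p)

    Remove either assumption and the proof no longer goes through; and nothing in the proof itself tells us whether P, Q and R model anything in the real world – that is the separate, testable question of whether the system is a good model.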

    But if we look at some mathematical theories, we find, for example, proofs that state that there ‘exist’, in some non-physical way, infinities that are ‘bigger’ than other infinities, where the ‘existence’ is an existence that is entirely independent of the real physical world. A careful logical analysis of these proofs reveals, as will be shown later, that these proofs all rely on hidden existential assumptions, such as that there ‘exist’, in some non-physical way, real mathematical ‘things’ which ‘exist’ completely independently of any definition we might make. The
