Why No One Knows Anything
By David Metcalfe
September 14, 2018
Today I would just like to briefly discuss a basic concept of limitations in human knowledge. It is something that has frustrated me for quite some time, and today, after reading some articles from post-modern thinkers, I have become all the more frustrated with the concept. I suppose this is my version of venting, but also, I hope my venting will get your brain thinking a bit, and inspire you to continue on that path.
What I will be arguing today is very simple: you don’t know anything, and neither do I.
Let’s start by addressing something that you definitely think you know: 1+1=2.
It might sound simple, but it’s not. First of all, we don’t even know what 1 is. Do we mean an abstract, theoretical number or a physical object? If we mean an abstract, theoretical number, how do we even know that this number exists? If we mean a physical object, what does being “1” entail?
The classic example is 1 apple and 1 apple being placed together to make 2 apples. But what is “1” apple? Since size is a continuous, infinite quantity, there is no way two apples are ever exactly the same size. So, we can’t base “1” on size. What we might base it on is its identity as an apple. That’s essentially what Bertrand Russell and Alfred North Whitehead did in their massive work of mathematical logic, “Principia Mathematica”.
But it still doesn’t solve the problem of applying abstract identities to concrete objects. Like, why do we call it an “apple” anyway? What if we had instead called it an “orange”? Obviously, the name is arbitrary, based on nothing. But it does need to have an agreed-upon identity. We couldn’t call some apples “oranges” while still referring to most apples as “apples”, because our language system requires shared identities in order to make any sense at all. If we dropped identity-language adherence, we would essentially be reduced to babble.
Ok, so we agree that if a certain fruit is a descendant of the thing we have decided to call an “apple”, then it is, in fact, an apple. But what if one of the apples had a bite taken out of it? Would it still be 1 apple? What if it had 50 bites taken out, and only a tiny piece of the core was left? What makes 1 apple, 1 apple?
And if we can’t even figure out how 1 object can exist, good luck trying to figure out how 2 exist.
1+1=2 can get really complicated really quickly. It took Russell and Whitehead hundreds of pages of Principia Mathematica before they were even in a position to prove that 1+1=2. By the 23rd century, they may get to working on 2+2=4 🙂
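To give a concrete sense of what “proving 1+1=2” even involves, here is a small sketch in the Lean proof assistant. This is my own illustration, not Russell and Whitehead’s actual system: it builds Peano-style numbers from scratch, just to show how much machinery hides behind the equation.

```lean
-- Peano-style natural numbers, defined from scratch rather than
-- using Lean's built-in Nat, to expose the assumptions behind "1+1=2".
inductive N where
  | zero : N
  | succ : N → N

open N

-- Addition defined by recursion on the second argument.
def add : N → N → N
  | n, zero   => n
  | n, succ m => succ (add n m)

-- 1 is succ zero, 2 is succ (succ zero);
-- the proof is pure computation: both sides reduce to the same term.
example : add (succ zero) (succ zero) = succ (succ zero) := rfl
```

Note that even this tidy proof rests on prior choices: what counts as a number, and what “plus” means, had to be fixed by definition before the one-line proof could go through.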
The point I’m trying to make here is this: 1st graders know that 1+1=2, and genius mathematicians don’t.
The Tiny Knowledge Range
So, how does a 1st grader come into the knowledge that 1+1=2? Well, it’s what they are taught. But it’s also something they would likely come to through experience. They would see a tree beside another tree in a field, and they would recognize those as two distinct things, both defined as “trees”.
But if a person were, in theory, to never come across multiple objects their entire life, and were never taught that 1+1=2, would that still be true to them?
There are two kinds of knowledge: a posteriori and a priori. We’ve already discussed a posteriori; it is the knowledge gained from experience in the world, like being taught something, or seeing two trees. But a priori is knowledge that can be gained through reason alone. For example, if A is equal to B, and B is not equal to C, then we know that A is not equal to C.
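That little a priori inference can itself be checked mechanically. Here is a sketch in the Lean proof assistant (again my own illustration, not part of the original argument): the proof uses no facts about what A, B, and C actually are, only pure reasoning from the two premises.

```lean
-- If A = B, and B ≠ C, then A ≠ C — knowable by reason alone.
example (A B C : Nat) (h1 : A = B) (h2 : B ≠ C) : A ≠ C := by
  rw [h1]    -- rewrite A to B: the goal A ≠ C becomes B ≠ C
  exact h2   -- which is exactly our second premise
```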
While it’s great that certain pieces of knowledge can be acquired through reason alone, reason needs some kind of content to work with. So yes, we can deduce from things we know to be true, but our a priori knowledge is still heavily dependent on our a posteriori knowledge. Essentially, we need to know at least some things from experience in order to know anything from independent reason. As Immanuel Kant said in his famous work “Critique of Pure Reason”: “All our knowledge begins with the senses, proceeds then to the understanding, and ends with reason. There is nothing higher than reason.”
But here’s the problem: we know very little about anything, and the things we think we know, we don’t actually know.
Let’s say you went to the world’s best historian and asked them a few questions. I guarantee that even if you asked them the birth year of 10 very prominent historical figures, they would get several wrong. Even a historian who spends their entire life studying one year in history will still not know the vast majority of things that happened in that year. Same goes for biologists, sociologists, or any academic discipline.
Beyond that, we don’t know that the information they’re getting is completely true, and in fact, there’s no way it’s completely true. Even if they have the best research methods possible, they still have to interpret the research into a conclusion, and there’s no way to remove all bias, and no way to perfectly process information.
And these are the best conditions possible. Very few of us devote ourselves to academic pursuits at all, and the majority of people don’t have the faintest clue about much of anything. But, of course, less educated people, ignorant of the vastness of possible knowledge and of their own bias, often believe that they have everything figured out. As Russell used to say, “The whole problem with the world is that fools and fanatics are always so certain of themselves, but wiser people so full of doubts.”
It’s true, in a great many senses. Of course, most people interpret the quote as being about social and political discourse, which is certainly part of it. Anyone who thinks they have solved political philosophy or economic policy is clearly not aware of their own ignorance, and there seem to be a lot of simpletons who believe they have accomplished that task. But more than that, think back to the math example. When we study, we become aware of how much possible knowledge there is, and realize that we have essentially none in comparison. It’s the old adage: “the more you learn, the less you know.”
Is Learning Even Worth It?
The question arises, I think, of whether one should bother to invest in the pursuit of knowledge at all. After all, one can study one’s whole life and not gain a fraction more knowledge than a common idiot. In fact, they might find they know less than the idiot, relatively speaking.
In Bertrand Russell’s fantastic work, “The Will To Doubt”, he says this of knowledge:
“None of our beliefs are quite true; all have at least a penumbra of vagueness and error. The methods of increasing the degree of truth in our beliefs are well known; they consist in hearing all sides, trying to ascertain all the relevant facts, controlling our own bias by discussion with people who have the opposite bias, and cultivating a readiness to discard any hypothesis which has proved inadequate. These methods are practiced in science, and have built up the body of scientific knowledge.
Every man of science whose outlook is truly scientific is ready to admit that what passes for scientific knowledge at the moment is sure to require correction with the progress of discovery; nevertheless, it is near enough to the truth to serve for most practical purposes, though not for all. In science, where alone something approximating to genuine knowledge is to be found, men’s attitude is tentative and full of doubt.”
Basically, all scientific theories are wrong, in the sense that they are constantly being revised. No modern scientist holds Newton’s theory of gravity, or Darwin’s theory of natural selection, exactly as its author conceived it. However, these theories approximate truth closely enough that we can see tangible results, and the results get better as we get closer to the actual truth. Or, in other words, we know that science works because we have cars, cell phones, spacecraft, etc. that really do work.
But, in other matters, we may not be able to see tangible results so easily. Russell also said that “Science is what you know, philosophy is what you don’t know.”
And it’s true. There are a great many things in philosophy which are unknown, and possibly can never be known. But just as in science, we have figured out certain things that seem to work. Philosophy only works when grounded in a more tangible discipline, such as science, history, or social science. We can look back to history as a sort of “scientific test” of whether certain philosophies have worked. When we look back on history, for example, it’s not hard to see that we are not interested in having a tyrant control our every thought and action.
The problem that arises, though, is that we are coming at every philosophical idea with some kind of bias. Like, when we think it’s better to not be ruled by a tyrant, we are speaking based on our own preconceived ideas of what good government is. But who’s really to say what’s better? Michel Foucault brings up this question in his book, “Discipline and Punish”. It’s a fair question: why do we, for example, assume upholding individual rights to be superior to maintaining order by any means necessary?
Many people love religion for this reason. It allows one to cling to the idea that “God said so”. It’s like when a parent can’t give a child a good reason for something, so they say, “because I said so”. It’s an appeal to authority that forecloses real philosophical inquiry, replacing it with human assumptions dressed up as something superior. However, if one does serious investigation into any religion, one finds that theology is just as complex and unknowable as secular philosophy. Eventually, studying anything enough will make you realize you know nothing.
The response to this fact, that we know nothing, has been in the form of what is called “post-modernism”. It rejects any knowledge as being superior to other forms, because it is all socially situated, and therefore, biased. We also have so little access to all of the possible knowledge, that we cannot form any objective views on anything, because they will undoubtedly be wrong.
It’s not necessarily a bad way to think, because it does take a lot into account, and it can make us less confident in things we should not be confident in. But I think the trouble lies in that very fact: humans need something to believe in. America was founded on the strong belief that all humans are endowed by their creator with natural rights. Then you have thinkers like Jeremy Bentham who come along and offer very strong critiques of that idea. And that’s good, because any truth should be able to stand the test of argument. But the problem is, if we don’t have any kind of unity in belief, it gets really difficult for us to co-operate, and even just to live our own lives with a sense of purpose.
The reason I don’t think religion will ever die is that it offers people a sense of objective meaning and purpose. Philosophy is great, but ultimately, it leaves us not knowing anything. That’s not to say that all philosophers should give up and join some religion, of course; rather, for philosophy to be effective in the world, it needs some kind of objectivity that places certain things above others. The Enlightenment thinkers thought that unifying thing was reason: things that were unreasonable were worse than reasonable things, basically. But, of course, we needed to leave room for emotion, which is where the Romantic movement came from.
I would think, and I suppose hope, that we can move towards accepting our limitations in knowledge (which are admittedly significant) without throwing out everything we do know. We’ve seen the devastating consequences of treating certain people as superior because of their skin color. We’ve seen theories in biology yield treatments for cancer. We’ve seen our physics theories take us to the moon. We’ve seen our psychological theories save people from depression. We’ve seen a political philosophy based on human rights lead to the world’s most prosperous and successful nation of all time.
We’ve seen it, and seeing is believing.
So yes, no one knows anything, but I think we can all believe in something.
I am going to have to come out and say you lost me on applying post-modernism to mathematics. Judging from your wording, I am a simpleton who knows nothing, yet the more I learn, the more I learn of what I don’t know. So you may disregard this comment as a fool’s talk.
I still, however, have my pool of knowledge to pull from. I think it’s very anti-intellectual to proceed on the basis that the pool of expertise of physicists is poisoned and cannot be used to reason. As a matter of discrete mathematics, 1+1=2 is true, and if you want to play with abstract mathematics and proofs, you are more than welcome to; you will quickly find an endless supply of them. But producing more proofs does not enumerate what we don’t know; it builds upon what we already know. After three millennia of fine-tuning mathematical theories, saying 1+1≠2 is quite out there.
Bertrand Russell himself acknowledged the limits of his proofs, both in the tools and in their practicality. It’s like the dialogue between theoretical physicists and experimental physicists: we are going to disagree when something that works correctly in theory goes awry in practice.
To end, I do appreciate your comment that we can all believe in something. I would love to see how post-modernism can be positive when it comes to morality, as that is something I have yet to see for myself.
I’ll leave you with this quote from Carl Sagan: “Science is far from a perfect instrument of knowledge. It’s just the best we have. In this respect, as in many others, it’s like democracy. Science by itself cannot advocate courses of human action, but it can certainly illuminate the possible consequences of alternative courses of action.”
Thanks for the feedback. In hindsight, one thing I would’ve done is define more clearly what it means “to know” something. People seem to be getting confused between “having knowledge” and “being absolutely certain”. I use “to know” interchangeably many times, and am just hoping people get it from the context (which is not an ideal way to do it).
When I say, “no one knows anything”, I am trying to communicate that no one can know anything with absolute certainty. The kind of provocative, hook aspect is that they initially think I may be attempting to argue that no knowledge exists at all.
I wanted to go straight into the mathematics aspect because I know that is a huge barrier to people accepting post-modernism. If I argued it from an ethics or religious perspective, people would be thinking in the back of their minds, “well…we know there’s objective truth because of mathematics.” The main point with the math argument is that saying even the simplest, most self-evident thing possible carries with it unlimited intellectual baggage. Once we say that 1+1=2, there are tons of assumptions being made that we are not even aware of. And don’t worry, I’m definitely not saying that 1+1 does not equal 2! I’m just saying that we can’t prove this truth, although seemingly self-evident, with absolute certainty.
I think it would be really interesting for me to write something arguing in favor of post-modern morality. I think I am up to the task but it would be difficult, for sure.
Ah, yes, I see your point! We can know that 1+1=2, yet be lost as to what fundamentals got us there. So, getting to your argument: math in nature is not objective; it is an art that allows for the communication of very complex and meaningful truths. It takes scientists to discern what truth they can derive from their findings, which results in many messages being communicated (whether about global warming, datasets from other disciplines, etc.).
My own epistemology would hold that the telos of this argument is that mathematics allows for the communication of objective truth because it is seen as fundamental. I’ll even quote the Nobel laureate Paul Dirac, who said: “God used beautiful mathematics in creating the world.” Dirac, though not a Christian, saw beauty and complexity in his own mathematics. This, however, blows in the wind with post-modernism, which invalidates my opinion as just an opinion. Which, I think, is the hurdle you are trying to jump.
As a thought experiment, I would argue that the fundamental is inherently present: if I were deaf and blind, and could not smell, taste, or touch, would I still have the concept of what 1 is? But because I have never been that way, and have the privilege of never being that way, I will never know for certain.
I agree with that. There is likely some kind of objective truth in many things, and definitely mathematics is the most clear one.
There is a huge problem right now with post-modernism, not in the theory itself, but in the low quality thinking of many who embrace it. The majority of post-modernists represent a view that is not true post-modernism. True post-modernism, as held by top theorists in the field, is much more advanced. It doesn’t necessarily reject objective truth as existing theoretically, and CERTAINLY doesn’t suggest that all opinions are equally valid. It is more so trying to incorporate a broader, more critical understanding of truth by considering social and historical contexts, subjectivity of experience, and a host of other important critical concerns.
Common critics of post-modernism often criticize its lower form.