Old 05-17-2009, 03:36 AM   #36
VulcanRevenge
Default Re: 1 + 1 = Invalid: cannot add distinct objects

I think I understand your argument, Saik0Shinigami: "But really, why bother anticipating those circumstances? Just hit backspace and continue on :P" I completely agree with you on this. I'm not saying we should be paranoid about objects or situations where uncertainty is very low. I have to repeat it again: "this is why uncertainty should be understood as something that applies to all things, but only to a somewhat limited degree." We should feel confident about objects with very low uncertainty while still recognizing that some uncertainty remains. If you feel you can't accept that concept without becoming paranoid, then perhaps rereading and rethinking it would help.

Now, as to your argument, Reach: you are correct that, by mathematical terms and definitions, 1 + 1 equals 2. But here is the question: what does the number 1 represent in the real world? It could be anything, right? You could be referring to something that doesn't physically exist, but when you use the number one to refer to something that takes up space, you are referring to an object of either matter or energy.

Before I continue with this idea, let's establish a basic algebraic concept. By mathematical definitions, what does x + y equal? You probably learned around 5th grade that x and y are not like terms and thus cannot be combined. So you would conclude that x + y equals just that: x + y, because you cannot add variables that are not alike.
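To make that like-terms idea concrete, here's a minimal sketch in Python (the `add_terms` helper is hypothetical, purely for illustration): like terms combine into a coefficient, unlike terms just sit side by side.

```python
from collections import Counter

def add_terms(*terms):
    """Add symbolic terms (each with coefficient 1) by counting like terms.

    Returns a dict mapping variable name -> coefficient.
    """
    return dict(Counter(terms))

# x + x: like terms combine into a single term with coefficient 2, i.e. 2x
print(add_terms("x", "x"))  # {'x': 2}

# x + y: unlike terms cannot be combined; the "sum" is still x + y
print(add_terms("x", "y"))  # {'x': 1, 'y': 1}
```

The point of the sketch is just that the addition never erases which variables were involved; unlike terms stay distinct in the result.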

I'm suggesting that since no two objects are alike, they are not like numbers that can simply be added or subtracted; they are more like variables, in more than one sense: they don't necessarily have a fixed value, and when you add x + x you get 2x, not just 2. At the end of the equation you still have a variable to work out and define. I believe that variable is the amount of uncertainty the specific object carries (or the level of uncertainty associated with it). The thread title should be understood to mean that, in reality, each "1" is a variable unlike any other object, so the equation 1 + 1 could be seen as x + y. (Yes, x + y isn't necessarily invalid; the title was meant to spark your intellect and interest in how this equation could have a different result than the presumed one.) Since numbers are not always exact in the real world, they carry an element of uncertainty. This is one reason chemistry and engineering teachers allow a margin of error on homework answers: in the real world, mathematical equations and numbers can get you close, but they are not always exact.
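The margin-of-error idea above can be sketched in code. This is a minimal, assumed illustration (the `add_measurements` helper and the worst-case interval rule are my own choices for the example, not anything from the thread): each real-world "1" carries an uncertainty, and the uncertainties survive into the sum.

```python
def add_measurements(value_a, uncertainty_a, value_b, uncertainty_b):
    """Add two measured quantities using worst-case interval arithmetic:
    the values add, and so do their uncertainties."""
    return value_a + value_b, uncertainty_a + uncertainty_b

# Two objects each measured as "1", but only to within 0.05
total, total_uncertainty = add_measurements(1.0, 0.05, 1.0, 0.05)
print(f"{total} ± {total_uncertainty}")  # 2.0 ± 0.1
```

So "1 + 1" gives something close to 2, but the answer still carries that leftover uncertainty term, much like the leftover variable in 2x.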

Once again, you bring up a very valid point: for some objects the level of uncertainty is low enough that mathematical reasoning is very helpful and sufficiently accurate. However, it is only completely accurate, useful, and applicable to real life when the uncertainty is so low that the uncertainty variable becomes insignificant, making very nearly all case results the same. That is true for situations and objects of relatively low uncertainty. For example, yes: if you press the "a" key on your keyboard while your computer is hooked up and your devices are working reasonably well, you can be confident that an "a" will be registered.

The main reason this concept is difficult to understand, I imagine, is that most of your examples involve objects and situations I consider to have a low level of uncertainty. The problem and fallacy I speak of arise when we treat these numbers as accurate for objects and situations of higher uncertainty: for example, trying to predict human behavior. We see this all the time. Researchers and marketers think a product will do phenomenally well, and sometimes they are correct; other times they are completely and utterly wrong. Many people do not understand this and do not account for the bias, mathematical error, and uncertainty involved in these studies. That is the fallacy I am trying to help people understand and correct, but only if they are under its influence (and, as I've said before, I don't believe many of the posters here are, which probably makes it hard for them to see its relevance and usefulness).