Thread: Eric Dollard
Old 12-01-2012, 02:25 PM
rickinva
Junior Member
Join Date: Apr 2010
Location: Manassas, VA, USA
Posts: 24
Originally Posted by GSM View Post
If a + b = b

then 'a' must be zero

and apart from the impossibility of multiplying or dividing anything by zero, of course 2 zeroes equal *0*.

Mathematicians make these conceptual mistakes all the time -

just like Einstein did!

Cheers .......... Graham.
The mistake was NOT in making "a" = 0. It occurred at the step where both sides were divided by (a - b). Since a = b, we have (a - b) = 0, and division by 0 is NOT allowed, for very good and obvious reasons.

So, actually, "a" did NOT have to equal 0; it could originally have been ANY value at all.
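The full "proof" under discussion isn't quoted in the thread, but it is presumably the classic 2 = 1 fallacy. Here is a minimal sketch in Python (the starting value 3 is arbitrary, as the post notes): every step checks out as equality of equal quantities, right up until the point where you would cancel (a - b), which is cancelling zero.

```python
# Sketch of the classic a = b fallacy; the value 3 is arbitrary.
a = b = 3

# Legitimate steps: multiply both sides by a, then subtract b^2.
assert a * a == a * b                    # a^2 = ab
assert a * a - b * b == a * b - b * b    # a^2 - b^2 = ab - b^2
assert (a + b) * (a - b) == b * (a - b)  # factored: both sides are 0

# The illegitimate step: "dividing" both sides by (a - b).
# Since a = b, that divisor is zero, so the cancellation is invalid.
assert a - b == 0
```

Every assertion above passes because both sides of the factored equation are literally 0 = 0; the contradiction (a + b = b, i.e. 2 = 1) only appears if you divide through by the zero factor.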
