• Quick note - the problem with YouTube videos not embedding on the forum appears to have been fixed, thanks to ZiprHead. If you do still see problems let me know.

BBC reports professor can divide by 0

This reminds me of the very silly media interest back in 1999 in some 12-year-old kid in New Zealand or some other far-flung place who was touted as having solved the Y2K problem. It was unbelievable that the news organizations that fell for it missed the basic fact that the problem was well understood and the solution was simply hard slogging through old code, which was already well underway.

Same goes for the divide-by-zero problem. If your software crashes on that, you need to build in better error handling.
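A minimal sketch of what "better error handling" looks like, in Python (the function name is made up for illustration):

```python
def safe_ratio(numerator, denominator):
    """Return numerator/denominator, or None when the denominator is zero."""
    try:
        return numerator / denominator
    except ZeroDivisionError:
        # Handle the bad input here instead of letting the whole program crash.
        return None

print(safe_ratio(8, 2))  # 4.0
print(safe_ratio(8, 0))  # None
```

The caller then decides what a missing value means; nothing crashes, and no new kind of number is required.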

Y2K... yeah, that sure did expose the ignorance of the masses, didn't it? Before 2000, everyone was freaking out because they vastly overestimated the impact the glitch might have and vastly underestimated the ability of programmers to solve it just by recoding the old programs (which, while tough, was still not nearly so tough as to be impractical). Then Y2K came and went, and everyone wondered where that Earth-shattering kaboom was. Well, not only was that never going to happen, what little WOULD have happened (little being relative to the hype) was fixed in plenty of time. Those that didn't get it fixed didn't really do any damage for not having done so. So now people just assume there never was a problem, and everyone blames programmers for all the fuss instead of themselves.

Annoying, really... Too many people didn't bother to do any research. The problem was real, though entirely overblown. The majority of electronic devices didn't even keep the date, and those that did weren't going to self-destruct just because they maxed out their timers. It was the more important bookkeeping programs, mainly in banks, that were the major issue. The damage would have been in terms of money and that's about it, but it was solved in plenty of time.

So now there's the "stigma" of that whole episode which by all rights was caused by the media and the masses.
 
Unbelievable! This guy's invention solves a 1200-year-old problem! :oldroll:
Just watch the video. It is sooo simple.

Or rather: gives the solution a name.

His approach to solving things is useful for many real world applications:
- I have a solution to all the world's problems
- Okay, so what is it?
- It is my solution. I call it "Golliboggelotz".

Here's the guy's explanation why 0^0 = Nullity:

0^0 = 0^(1-1)
0^0 = (0^1)*(0^(-1))
0^0 = ((0/1)^1)*((0/1)^(-1))
0^0 = (0/1)*(1/0)
0^0 = 0/0

I am instantly reminded of that old trick that proves that 1 = 2, but sneakily divides by 0 along the way.

[attached image: mathmistake.jpg]


According to Dr. Math, mathematicians usually agree that the solution to the "1200-year-old problem" is 1, though for reasons that are not entirely intuitive to me.
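For whatever it's worth, that convention is baked into Python: the power operator gives 1 for 0^0 (for both ints and floats), while plain division by zero stays an error rather than getting a new name.

```python
# Python follows the usual convention that 0**0 == 1:
print(0 ** 0)      # 1
print(0.0 ** 0.0)  # 1.0

# Plain division by zero, by contrast, remains undefined (an error):
try:
    1 / 0
except ZeroDivisionError as e:
    print("still undefined:", e)
```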

ETA: The guy should read this.
 
hey, I posted (under 'Erik') a comment on 'Ryan's' number circle thing, the one with 1 on top, -1 on the bottom, 0 on the right, and 'nothing' on the left, saying that -0 belongs on the left.
Did I do something clever?
 
Funny, the Y2K problem was being fixed many years before the public became generally aware of it. I first came across it as a Year 2050 problem in 1988, which is another way of saying it's not a Year 2000 problem, just a 2-digit-year problem. I was working at a portfolio management firm, and yields to maturity on fixed-income instruments were calculated against the 2-digit maturity date, wherein the software would take anything <= 49 as 21st century and anything >= 50 as 20th century. Then some damn portfolio manager bought Japanese warrants maturing in 2050. Boom, bad calculations. The fix was a cinch.
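That windowing scheme is simple enough to sketch in a few lines of Python (the names `PIVOT` and `expand_year` are made up; the pivot value is the one described above):

```python
PIVOT = 50  # two-digit years below 50 read as 20xx, 50 and up as 19xx

def expand_year(two_digit_year):
    """Expand a 2-digit year using a fixed pivot, as many Y2K fixes did."""
    if two_digit_year < PIVOT:
        return 2000 + two_digit_year
    return 1900 + two_digit_year

print(expand_year(49))  # 2049
print(expand_year(50))  # 1950 -- a warrant maturing in 2050 breaks this window
```

Which is exactly why a fixed window only postpones the problem: any date past the pivot lands in the wrong century.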
 
My favorite part is that he says that "nullity" isn't on the number line. If not, then it can't be used in one-dimensional computations such as 1/0 = nullity.
 
a/0 = ∞ for all a ∈ C* \ {0}

In the extended complex plane (C*), you can divide a complex number by zero so long as a ≠ 0.

in maths anything is possible :)
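A toy sketch of that extended-plane rule in Python (the function and the stand-in for the point at infinity are invented for illustration; the real structure carries more arithmetic than this):

```python
INF = "infinity"  # stand-in for the single point at infinity in C*

def ext_div(a, b):
    """Division on the extended complex plane: a/0 = infinity for a != 0,
    while 0/0 stays undefined (modeled here as an error)."""
    if b == 0:
        if a == 0:
            raise ValueError("0/0 is undefined even on the extended plane")
        return INF
    return a / b

print(ext_div(1 + 2j, 0))  # infinity
print(ext_div(3, 2))       # 1.5
```

Note that even this generous system refuses 0/0, which is the case "nullity" claims to settle.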
 
Not automatically: when you catch this error, you can implement a procedure to end the program without necessarily catastrophic consequences.

nimzo
With accompanying techno-babble error message and instructions to phone [insert name, home number and email of person you don't like, probably someone in Accounts].
 
Perhaps the word Nullity could be useful, in some way, if we define it thusly:

nullity: n. The delusion that one has solved the problem of dividing numbers by zero. As in: "My math teacher, Dr. Anderson, has clearly been suffering from nullity, since he replaced the word 'undefined' with a new symbol, and claims he solved a 1200-year-old problem."
adjective form: nullible. See also gullible.


I wonder if we can get that added to SkepDic.com, or something.
 
I don't really trust the hyperreal numbers, since they sound more like bad math than a real branch of mathematics.

1/0, let's see. There's no way to divide 1 into zero groups, and no matter how many times you take zero away from one or add zero to one, you don't get any closer to zero. We could also look for the number c where 0c = 1, but anything times 0 is zero, so there is no such number (ignoring those iffy hyperreals). Either way it's undefined.

Seriously, nullity is stupid. The concept of x/0 (x ≠ 0) has been around for a while. Naming it "nullity" does not expand our knowledge by anything more than giving us another word to remember.
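The reasoning above can be checked mechanically in Python (plain floats, nothing hyperreal; powers of two are used so the divisions stay exact):

```python
# There is no number c with 0*c == 1: zero times anything is zero.
assert all(0 * c == 0 for c in (1, -7, 10**6))

# And 1/x just grows without bound as x shrinks toward 0, so no finite
# value presents itself as a candidate for "1/0":
for x in (1.0, 0.5, 0.25, 2.0 ** -20):
    print(x, "->", 1 / x)
```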
 
The scary thing is how the kids lap it up and "totally" accept the teacher's word. It shows what a position of great responsibility teachers hold, and how they step over the line when they decide to teach their flights of fancy as "the truth".
 
My favorite line: "If your heart pacemaker divides by zero, you're dead."

Speaking as someone who has a pacemaker, I cannot imagine a scenario where a pacemaker would take it upon itself to divide by anything, let alone zero. The device has been quite useless in helping me finish my math homework.
 
rwguinn said:
Dividing by zero usually means either 1) you messed up the calculations, or 2) you screwed up the valid domain
Or there are no data points for which to calculate the mean. Or the array to be sorted is empty. Or the string to be centered is null.
Or you're trying to transform a point on the y axis into polar coordinates, using the arctangent function.

- Myriad
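Myriad's polar-coordinate case is a nice one, because the standard fix isn't a new number at all: `math.atan2` takes y and x separately, so the point (0, 3) on the y axis never triggers a division. A quick Python check (`atan2` is real stdlib):

```python
import math

# Converting (x, y) to polar with atan(y/x) divides by zero on the y axis.
# atan2 takes the two coordinates separately, so no division happens at all.
x, y = 0.0, 3.0
theta = math.atan2(y, x)  # no crash, no "nullity" needed
print(theta)              # the angle straight up: pi/2
```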
 
I must be clevererer than this professor, because I have invented a way to divide by zero without invoking any "numbers off the number line".

  1. I define "numerator" as the number of indivisible marbles I have.
  2. I define "denominator" as the number of people, excluding myself, I give marbles to.
  3. I define "division" as the number of marbles each person has when I have given each person an equal number of marbles.
  4. I define "remainder" as the number of marbles I have left when I gave all those people an equal number of marbles.
Examples
  1. Suppose I have 2 marbles, and there are 2 people I can give marbles to. That means they get 1 each, and I have lost all my marbles. 2/2=1, remainder 0.
  2. Suppose I have 3 marbles, and there are 2 people. 3/2=1 remainder 1
  3. 4 marbles, 2 people: 4/2=2 remainder 0.
Now dividing by zero

  1. Suppose I have 2 marbles and no one to give them to. That means I am left with 2 marbles: 2/0=0 remainder 2.
  2. Suppose I have 100 marbles and no one to give them to: 100/0=0 remainder 100.
To put it in more general terms:
x/0 = 0 with a remainder of x.

That also works with 0/0:
Suppose I have 0 marbles and no one to give marbles to. That means I am left with 0 marbles and nobody gets any. 0/0=0 remainder 0.

Find the flaw in that, mathematologists!
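The marble scheme above writes down directly in Python (the function name is made up; this just restates the definitions, it doesn't referee the challenge):

```python
def marble_divide(marbles, people):
    """Quotient = marbles each person gets; remainder = marbles kept.
    With zero people, nobody gets any and you keep them all."""
    if people == 0:
        return 0, marbles  # the scheme's rule: x/0 = 0, remainder x
    return marbles // people, marbles % people

print(marble_divide(3, 2))  # (1, 1)
print(marble_divide(2, 0))  # (0, 2)
print(marble_divide(0, 0))  # (0, 0)
```

One thing to test it against: in ordinary Euclidean division the remainder must be smaller than the divisor, and a remainder of 2 is not smaller than 0.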
 
Article said:
"Imagine you're landing on an aeroplane and the automatic pilot's working," he suggests. "If it divides by zero and the computer stops working - you're in big trouble."

OK, so the idea is that "nullity" will stop a computer program from crashing: when a value is divided by zero, it returns a value of nullity. How do we represent nullity in binary? If I have a 32-bit floating-point variable, what would the ones and zeros be? Or would we need to change the way all existing number variables in all (or almost all) programming languages handle binary? Do 32-bit numbers become 31-bit numbers with an extra bit to hold an "is nullity" flag, or does every number double in size for it?
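As it happens, IEEE 754, the format essentially all 32-bit floats already use, reserves bit patterns for exactly this kind of "not a number" value (exponent bits all ones plus a nonzero fraction), so no extra bit would be needed. A quick Python peek using the stdlib `struct` module:

```python
import struct

# Pack a NaN into 32-bit IEEE 754 format and inspect its bits.
nan_bits = struct.unpack(">I", struct.pack(">f", float("nan")))[0]
print(f"{nan_bits:032b}")  # exponent field all ones, fraction nonzero
```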

And even if we did this conversion (which would be a zillion times grander than any Y2K conversions), how would code handle the value of nullity?

Let's say I have code that calculates a landing angle for an aeroplane, where N = 0.

X = 8 / N
LandingAngle = 6 * X

So X = nullity. Woo hoo! That line of code didn’t crash or return an error. Woo hoo!

But what the hell would be six times nullity? Would it be nullity as well? Or a value of nullity six? What would LandingAngle be?
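If "nullity" behaved like IEEE 754's NaN (an assumption on my part; Anderson's transreal arithmetic is its own thing), the answer is that six times nullity is just nullity again, and it infects everything downstream:

```python
nullity = float("nan")  # treating "nullity" as IEEE NaN for this sketch

landing_angle = 6 * nullity
print(landing_angle)  # nan -- it propagates through every later calculation
# NaN even compares unequal to itself, so checks like angle == angle fail:
print(landing_angle == landing_angle)  # False
```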

Do you want to be on a plane flying on auto pilot landing at a nullity six degree angle? And after you crash and die, the programmers tell you everything worked perfectly because no errors were thrown?

You can do the exact same thing with an “On Error Resume Next” statement. It’s easy to resolve 1200-year-old problems when you just ignore them.

WHAT A DANGEROUS DORK, suggesting programmers should ignore division-by-zero errors by assigning a meaningless "nullity" value, especially in critical programs like autopilots and pacemakers. :mad:
 
