389 posts
Posted 30 May 2015 - 03:05 PM
I'm not sure if this belongs here; I just found it slightly interesting. Would someone care to explain why 100 / 0 returns INF (and likewise why dividing by -0 returns -INF) rather than returning 0?
1140 posts
Location
Kaunas, Lithuania
Posted 30 May 2015 - 03:20 PM
Because you cannot divide by zero. In other programming languages dividing by zero raises an error, but Lua decided not to, for some reason. The intuition here is a limit: keep shrinking the divisor toward zero and watch the quotient grow. For example: 100 / 1 = 100; 100 / 0.1 = 1000; 100 / 0.01 = 10000; and so on. The smaller the divisor gets, the larger the result, without bound, so the result at zero is taken to be infinite.
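The limit intuition above can be checked directly in Lua; `math.huge` is Lua's name for IEEE positive infinity (a small sketch, assuming a standard Lua 5.x interpreter):

```lua
-- math.huge is IEEE 754 positive infinity
assert(100 / 0 == math.huge)      -- positive number divided by zero gives +inf
assert(-100 / 0 == -math.huge)    -- negative numerator gives -inf
assert(100 / -0.0 == -math.huge)  -- a negative-zero divisor flips the sign too

-- the quotient grows without bound as the divisor shrinks toward zero:
assert(100 / 1e-300 > 1e300)
```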
Edited on 30 May 2015 - 01:20 PM
389 posts
Posted 30 May 2015 - 03:22 PM
Okay, thanks for clarifying.
2679 posts
Location
You will never find me, muhahahahahaha
Posted 30 May 2015 - 03:30 PM
It returns INF because an infinite number of 0s would have to fit into 100.
1426 posts
Location
Does anyone put something serious here?
Posted 30 May 2015 - 04:16 PM
JavaScript also implements 100/0 as Infinity. For those who are interested, this is because the IEEE floating-point spec says that it should: any nonzero number divided by zero is a signed infinity, and only the indeterminate form 0/0 is NaN. Arguably it should all be NaN instead.
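The IEEE 754 distinction between nonzero/0 and 0/0 is visible from Lua as well (a small sketch, assuming a standard Lua 5.x interpreter; NaN's self-inequality is the usual portable way to detect it):

```lua
local inf = 1 / 0   -- nonzero/0: defined by IEEE 754 as signed infinity
local nan = 0 / 0   -- 0/0: indeterminate form, defined as NaN

assert(inf == math.huge)
assert(-1 / 0 == -math.huge)
assert(nan ~= nan)      -- NaN is the only value not equal to itself
assert(inf + 1 == inf)  -- infinity absorbs finite addition
```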
7083 posts
Location
Tasmania (AU)
Posted 31 May 2015 - 12:11 AM
… arguably it should all be NaN instead.
Indeed. "Infinity" is not a number, and therefore has no business been returned by mathematical statements.
But Lua seems to be the "I'll let you get away with whatever" language. It's not built to be "fast to execute"; it's built to be "easy to use".
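Since Lua won't raise the error for you, code that cares has to check the divisor (or the result) itself. A minimal sketch; `safeDiv` is a hypothetical helper, not a standard function:

```lua
-- Hypothetical helper: divide, but treat a zero divisor as a real error
-- instead of silently returning inf/-inf like plain Lua division does.
local function safeDiv(a, b)
  if b == 0 then
    error("division by zero", 2)  -- level 2 blames the caller
  end
  return a / b
end

print(safeDiv(100, 4))  -- 25 (printed as 25.0 under Lua 5.3's float division)

-- pcall catches the error instead of crashing the program:
local ok, err = pcall(safeDiv, 100, 0)
print(ok, err)  -- false, followed by the error message
```

The `b == 0` check also catches `-0.0`, since negative zero compares equal to zero under IEEE 754.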