Posted 15 May 2015 - 06:37 PM
So I have noticed that not too many people know you don't need to define `arg` in a function that uses `...` for its arguments (example below).
--# this does work
foo = function(...)
  print(arg[1])
end
foo("bar")
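For what it's worth, the implicit table also seems to carry the argument count in `arg.n`, at least in the builds I've tried (a small sketch, not guaranteed on every Lua version):

```lua
--# arg.n holds how many arguments were passed
foo = function(...)
  print(arg.n)   -- argument count
  print(arg[1])  -- first argument
end
foo("bar")  --# prints 1, then bar
```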
However, my confusion is this: I've recently noticed you can't use both `arg` and `...` in a function, or else `arg` is no longer automatically defined for you. Is there an actual reason for this? It seems rather random that something suddenly stops happening just because I chose to use something else. This code will demonstrate what I mean.
print("test one using only arg")
test = function(...)
  print(tostring(arg))
end
test(true) --# table expected and got a table, so this is good

print("test two using only ...")
test = function(...)
  print(tostring({...}))
end
test(true) --# table expected and got a table, still doing good

print("test three using both arg and ...")
test = function(...)
  print(tostring(arg))
  print(tostring({...}))
end
test(true) --# two tables expected, but we got nil and a table, so now suddenly arg doesn't exist
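In the meantime, the workaround I've settled on (assuming Lua 5.1-style `select` is available) is to just build the table myself at the top of the function, so it behaves the same whether or not I also use `...`:

```lua
--# explicit workaround: rebuild arg by hand, shadowing the implicit one
test = function(...)
  local arg = { n = select("#", ...), ... }  -- my own copy of the varargs
  print(tostring(arg))
  print(tostring({...}))
end
test(true)  --# now both lines print a table
```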
If someone could explain this quirk to me please :P
Edited on 15 May 2015 - 04:40 PM