13 posts
Posted 11 March 2012 - 06:16 PM
Does a modem transmit/receive messages the same way a wired connection does (two 7-bit chars per tick plus 2 service bits), or does it act differently?
What I mean is: how long does it take to transmit huge messages, do you have to wait until one transmission ends before starting a new one, does a zero byte mean end of message, and is it possible to correctly transmit 8-bit characters (codes 128+) without converting them into a (maybe 2-char) 7-bit representation?
454 posts
Location: London
Posted 11 March 2012 - 06:48 PM
Wireless modems transmit instantly, but are limited to a range of roughly 60 blocks (17 in a storm).
Currently, I think characters with byte values of 128+ are bugged.
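For what it's worth, a minimal send/receive sketch using the rednet API (the modem side, "right", is just an assumption; adjust it for your setup). It shows there's no need to wait between transmissions:

-- Sender computer: open the modem and fire off two messages back to back.
rednet.open("right")                  -- side the wireless modem is attached to (assumption)
rednet.broadcast("first message")
rednet.broadcast("second message")    -- no need to wait for the first to finish

-- Receiver computer: open its modem and block until a message arrives.
rednet.open("right")
local senderId, message = rednet.receive()
print(senderId, message)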
161 posts
Posted 11 March 2012 - 11:34 PM
The range limit is 64 normally, or 16 in a thunderstorm. The distance is measured between the modems, not the computers. Imagine CXC, where each C is a computer and the single X between them contains both modems (not actually possible, but suppose it could be done): that would be distance zero, since the modems share a block; CXXC would then be distance 1, and so on.
Strings containing characters 128 or above fail to transfer properly, as Advert mentioned (http://www.computercraft.info/forums2/index.php?/topic/248-modems-are-not-binary-safe/page__fromsearch__1). However, you can send messages of pretty much arbitrary length instantly; I once tried sending a ten-megabyte message and it worked fine (though you have to be careful when building such a message, as making it directly by concatenating ten million strings will take too long and time out your program).
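To illustrate that last point, here is a rough sketch of building a large payload without repeated concatenation (the modem side and the ~1 MB size are just assumptions for the example):

-- Build a ~1 MB payload from 1 KB chunks; table.concat joins them in one
-- pass, avoiding the cost of appending to a growing string thousands of times.
local chunk = string.rep("A", 1024)
local parts = {}
for i = 1, 1024 do
  parts[i] = chunk
end
local payload = table.concat(parts)

rednet.open("right")          -- side with the wireless modem (assumption)
rednet.broadcast(payload)     -- sends the whole thing in one go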
13 posts
Posted 12 March 2012 - 04:03 AM
Thanks. But could someone explain why modems ALSO need 7-bit characters to transmit/receive correctly? They don't use RP bundled cables, so there's no hard limit of 16 bits per frame on the transfer lines.
161 posts
Posted 12 March 2012 - 04:49 AM
Not having written ComputerCraft, I can only speculate, not answer. My guess is that the data is at some point shoved into a java.lang.String, which stores UTF-16, and that the conversion from a Lua string (which I believe is implemented as a byte array) to a java.lang.String is where the problem happens.
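In the meantime, one possible workaround (just a sketch, assuming the rednet API and a modem on the right side; I haven't verified it against the bug): hex-encode each byte before sending so the transmitted string only contains the characters 0-9 and A-F, well below code 128, and decode on the other end.

local function encode(s)
  -- each byte becomes two hex digits
  return (s:gsub(".", function(c) return string.format("%02X", c:byte()) end))
end

local function decode(s)
  -- each pair of hex digits becomes one byte again
  return (s:gsub("%x%x", function(h) return string.char(tonumber(h, 16)) end))
end

-- Sender:
rednet.open("right")                       -- modem side is an assumption
rednet.broadcast(encode("\200\201\202"))   -- bytes above 127 survive as hex text

-- Receiver:
rednet.open("right")
local senderId, message = rednet.receive()
local original = decode(message)

The obvious cost is that every message doubles in length, but given that transfers are instant and size seems to be effectively unlimited, that may not matter much.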