130 posts
Location
Here
Posted 17 June 2015 - 08:15 PM
I'm working on a small system that will require things like a timer and miniature loops to monitor other things, which I'd like to run alongside one another simultaneously (well, you know what I mean), and I'm wondering what the general limits are to CC's coroutines in terms of memory usage. Can I run 100 small loops in separate coroutines without it lagging so hard it's unusable, or is this something that is fairly common?
2679 posts
Location
You will never find me, muhahahahahaha
Posted 17 June 2015 - 08:22 PM
The coroutines will not actually be running at the same time. So you can run whatever you want, and as long as you don't pass the 10-second yield limit there should be no problemo.
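To illustrate the point above: only one coroutine ever runs at a time, and each must hand control back before anything else gets CPU time. This is a plain-Lua sketch (inside CC you would yield via `os.pullEvent()` or `sleep()` rather than a bare `coroutine.yield()`); the worker names and step counts are made up for the example.

```lua
-- Plain-Lua sketch of cooperative scheduling: only one coroutine runs
-- at a time, and each must yield back before the next one can run.
local function makeWorker(name, steps)
  return coroutine.create(function()
    for i = 1, steps do
      -- In ComputerCraft you would yield via os.pullEvent() or sleep();
      -- a bare coroutine.yield() shows the same cooperative hand-off.
      coroutine.yield(name .. " step " .. i)
    end
  end)
end

local a, b = makeWorker("a", 2), makeWorker("b", 2)
local order = {}
-- Round-robin: resume each worker in turn until both are dead.
while coroutine.status(a) ~= "dead" or coroutine.status(b) ~= "dead" do
  for _, co in ipairs({a, b}) do
    if coroutine.status(co) ~= "dead" then
      local ok, msg = coroutine.resume(co)
      if ok and msg then order[#order + 1] = msg end
    end
  end
end
-- order is {"a step 1", "b step 1", "a step 2", "b step 2"}:
-- the workers interleave, they never run at the same instant
```

The 10-second limit mentioned above is CC's watchdog: a computer whose code runs that long without yielding gets terminated, which is why long loops should call `sleep()` or otherwise yield regularly.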
355 posts
Location
Germany
Posted 17 June 2015 - 08:32 PM
I think he knows how they work. And I can tell from personal experience that you don't want too many coroutines running. Depending on the server you are playing on, you might be able to run about 150-300 computers with 2 routines each by default (because of the rednet protocol) with acceptable responsiveness to all kinds of input. If you start multiple computers running 100 coroutines each, the stuff will run, but it will ultimately lower TPS and responsiveness to the point where you no longer have fun even right-clicking the computer.
These numbers are estimates though, and may well have changed in recent versions.
Several benchmarks show that drawing became more intensive with the window API, for example. So it really depends on the load the coroutines are facing.
I think you just gotta try some things out ;)
355 posts
Location
Germany
Posted 17 June 2015 - 08:38 PM
Just to clear that up: if routines are doing nothing but yielding, you should be fine^^
The number itself does not affect the performance too much, unlike when you have a process starting way too many threads, so that thread management draws more resources than the actual threads. That should not be the case if you stay under 1k coroutines.
As for the memory stuff, I'm fairly interested in that too. I think once the Lua state of a computer allocates too much memory, the garbage collector forcefully removes stuff. At least I know of some tests I've done with a loop just writing random numbers to a table. It did not crash.
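A quick plain-Lua sketch backing that claim: coroutines that do nothing but yield add very little overhead per resume, even in the hundreds. The counts here (500 workers, 10 yields each) are arbitrary; inside CC you would normally hand the worker functions to `parallel.waitForAll()` instead of rolling your own loop.

```lua
-- Create 500 idle coroutines that do nothing but yield a few times,
-- then drive them round-robin and count how many resumes it takes.
local cos = {}
for i = 1, 500 do
  cos[i] = coroutine.create(function()
    for _ = 1, 10 do
      coroutine.yield() -- idle worker: just hand control back
    end
  end)
end

local resumes = 0
local alive = #cos
while alive > 0 do
  for _, co in ipairs(cos) do
    if coroutine.status(co) ~= "dead" then
      coroutine.resume(co)
      resumes = resumes + 1
      if coroutine.status(co) == "dead" then alive = alive - 1 end
    end
  end
end
-- 500 coroutines * (10 yields + 1 final resume) = 5500 resumes,
-- and the whole loop finishes near-instantly on any machine
```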
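A rough, standalone version of the test described above: fill a table with random numbers, then drop the reference and let the collector reclaim it. The sizes and seed are made up for the sketch; `collectgarbage("count")` reports the Lua state's current usage in kilobytes.

```lua
-- Allocate a large table of random numbers, then verify the garbage
-- collector reclaims it once the only reference is dropped.
math.randomseed(1234) -- fixed seed so the run is repeatable
local t = {}
for i = 1, 100000 do
  t[i] = math.random()
end

collectgarbage("collect")
local before = collectgarbage("count") -- KB in use with the table live
t = nil                                -- drop the only reference
collectgarbage("collect")
local after = collectgarbage("count")  -- KB in use after collection
-- 'after' ends up well below 'before': the table was reclaimed
```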
7083 posts
Location
Tasmania (AU)
Posted 18 June 2015 - 03:23 AM
For what it's worth,
my WorldPorter script runs about 230 coroutines at a time while "scanning", and 250 while "building". I haven't noticed any ill effects on performance, though my CPU fans spin up to high whenever I run it. Presumably a lesser processor (or a Minecraft server with more activity than my single-player worlds) might struggle.
The vast majority of the memory the script allocates is released almost immediately. During a test in early development, though, I had it storing data for about two million blocks simultaneously, and even that didn't seem to cause any performance problems.