Updated readme files

Ryan Ward 2018-09-10 22:27:20 -04:00
parent 8db42e19f9
commit b68b43fc26
3 changed files with 154 additions and 38 deletions

File diff suppressed because one or more lines are too long


@@ -1,4 +1,4 @@
-# multi Version: 12.1.0 Fixing bugs and making the library eaiser to use
+# multi Version: 12.2.0 Added better priority management, function chaining, and some bug fixes
 My multitasking library for lua. It is a pure lua binding, if you ignore the integrations and the love2d compat. If you find any bugs or have any issues, please let me know. **If you don't see a table of contents try using the ReadMe.html file. It is easier to navigate than readme**</br>


@@ -12,184 +12,303 @@
<h1 id="changes"><a name="changes" href="#changes"></a>Changes</h1><p class="toc" style="undefined"></p><ul> <h1 id="changes"><a name="changes" href="#changes"></a>Changes</h1><p class="toc" style="undefined"></p><ul>
<li><ul> <li><ul>
<li><span class="title"> <li><span class="title">
<a href="#update:-12.0.0-big-update-(lots-of-additions-some-changes)" title="Update: 12.0.0 Big update (Lots of additions some changes)">Update: 12.0.0 Big update (Lots of additions some changes)</a> <a href="#update-12.2.0" title="Update 12.2.0">Update 12.2.0</a>
</span> </span>
<!--span class="number"> <!--span class="number">
0 0
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.11.1" title="Update: 1.11.1">Update: 1.11.1</a> <a href="#update-12.1.0" title="Update 12.1.0">Update 12.1.0</a>
</span> </span>
<!--span class="number"> <!--span class="number">
1 1
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.11.0" title="Update: 1.11.0">Update: 1.11.0</a> <a href="#update:-12.0.0-big-update-(lots-of-additions-some-changes)" title="Update: 12.0.0 Big update (Lots of additions some changes)">Update: 12.0.0 Big update (Lots of additions some changes)</a>
</span> </span>
<!--span class="number"> <!--span class="number">
2 2
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.10.0" title="Update: 1.10.0">Update: 1.10.0</a> <a href="#update:-1.11.1" title="Update: 1.11.1">Update: 1.11.1</a>
</span> </span>
<!--span class="number"> <!--span class="number">
3 3
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.9.2" title="Update: 1.9.2">Update: 1.9.2</a> <a href="#update:-1.11.0" title="Update: 1.11.0">Update: 1.11.0</a>
</span> </span>
<!--span class="number"> <!--span class="number">
4 4
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.9.1" title="Update: 1.9.1">Update: 1.9.1</a> <a href="#update:-1.10.0" title="Update: 1.10.0">Update: 1.10.0</a>
</span> </span>
<!--span class="number"> <!--span class="number">
5 5
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.9.0" title="Update: 1.9.0">Update: 1.9.0</a> <a href="#update:-1.9.2" title="Update: 1.9.2">Update: 1.9.2</a>
</span> </span>
<!--span class="number"> <!--span class="number">
6 6
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.8.7" title="Update: 1.8.7">Update: 1.8.7</a> <a href="#update:-1.9.1" title="Update: 1.9.1">Update: 1.9.1</a>
</span> </span>
<!--span class="number"> <!--span class="number">
7 7
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.8.6" title="Update: 1.8.6">Update: 1.8.6</a> <a href="#update:-1.9.0" title="Update: 1.9.0">Update: 1.9.0</a>
</span> </span>
<!--span class="number"> <!--span class="number">
8 8
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.8.5" title="Update: 1.8.5">Update: 1.8.5</a> <a href="#update:-1.8.7" title="Update: 1.8.7">Update: 1.8.7</a>
</span> </span>
<!--span class="number"> <!--span class="number">
9 9
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.8.4" title="Update: 1.8.4">Update: 1.8.4</a> <a href="#update:-1.8.6" title="Update: 1.8.6">Update: 1.8.6</a>
</span> </span>
<!--span class="number"> <!--span class="number">
10 10
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.8.3" title="Update: 1.8.3">Update: 1.8.3</a> <a href="#update:-1.8.5" title="Update: 1.8.5">Update: 1.8.5</a>
</span> </span>
<!--span class="number"> <!--span class="number">
11 11
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.8.2" title="Update: 1.8.2">Update: 1.8.2</a> <a href="#update:-1.8.4" title="Update: 1.8.4">Update: 1.8.4</a>
</span> </span>
<!--span class="number"> <!--span class="number">
12 12
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.8.1" title="Update: 1.8.1">Update: 1.8.1</a> <a href="#update:-1.8.3" title="Update: 1.8.3">Update: 1.8.3</a>
</span> </span>
<!--span class="number"> <!--span class="number">
13 13
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.7.6" title="Update: 1.7.6">Update: 1.7.6</a> <a href="#update:-1.8.2" title="Update: 1.8.2">Update: 1.8.2</a>
</span> </span>
<!--span class="number"> <!--span class="number">
14 14
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.7.5" title="Update: 1.7.5">Update: 1.7.5</a> <a href="#update:-1.8.1" title="Update: 1.8.1">Update: 1.8.1</a>
</span> </span>
<!--span class="number"> <!--span class="number">
15 15
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.7.4" title="Update: 1.7.4">Update: 1.7.4</a> <a href="#update:-1.7.6" title="Update: 1.7.6">Update: 1.7.6</a>
</span> </span>
<!--span class="number"> <!--span class="number">
16 16
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.7.3" title="Update: 1.7.3">Update: 1.7.3</a> <a href="#update:-1.7.5" title="Update: 1.7.5">Update: 1.7.5</a>
</span> </span>
<!--span class="number"> <!--span class="number">
17 17
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.7.2" title="Update: 1.7.2">Update: 1.7.2</a> <a href="#update:-1.7.4" title="Update: 1.7.4">Update: 1.7.4</a>
</span> </span>
<!--span class="number"> <!--span class="number">
18 18
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.7.1-bug-fixes-only" title="Update: 1.7.1 Bug Fixes Only">Update: 1.7.1 Bug Fixes Only</a> <a href="#update:-1.7.3" title="Update: 1.7.3">Update: 1.7.3</a>
</span> </span>
<!--span class="number"> <!--span class="number">
19 19
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.7.0" title="Update: 1.7.0">Update: 1.7.0</a> <a href="#update:-1.7.2" title="Update: 1.7.2">Update: 1.7.2</a>
</span> </span>
<!--span class="number"> <!--span class="number">
20 20
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.6.0" title="Update: 1.6.0">Update: 1.6.0</a> <a href="#update:-1.7.1-bug-fixes-only" title="Update: 1.7.1 Bug Fixes Only">Update: 1.7.1 Bug Fixes Only</a>
</span> </span>
<!--span class="number"> <!--span class="number">
21 21
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.5.0" title="Update: 1.5.0">Update: 1.5.0</a> <a href="#update:-1.7.0" title="Update: 1.7.0">Update: 1.7.0</a>
</span> </span>
<!--span class="number"> <!--span class="number">
22 22
</span--> </span-->
</li> </li>
<li><span class="title"> <li><span class="title">
<a href="#update:-1.4.1---first-public-release-of-the-library" title="Update: 1.4.1 - First Public release of the library">Update: 1.4.1 - First Public release of the library</a> <a href="#update:-1.6.0" title="Update: 1.6.0">Update: 1.6.0</a>
</span> </span>
<!--span class="number"> <!--span class="number">
23 23
</span--> </span-->
</li> </li>
<li><span class="title">
<a href="#update:-1.5.0" title="Update: 1.5.0">Update: 1.5.0</a>
</span>
<!--span class="number">
24
</span-->
</li>
<li><span class="title">
<a href="#update:-1.4.1---first-public-release-of-the-library" title="Update: 1.4.1 - First Public release of the library">Update: 1.4.1 - First Public release of the library</a>
</span>
<!--span class="number">
25
</span-->
</li>
</ul> </ul>
</li> </li>
</ul> </ul>
<h2 id="update-12.2.0"><a name="update-12.2.0" href="#update-12.2.0"></a>Update 12.2.0</h2><p><strong>Added:</strong></p><ul>
<li>multi.nextStep(func)</li><li>Method chaining</li><li>Priority 3 has been added!</li><li>ResetPriority() — This sets a flag for a process to be re-evaluated for how much of an impact it is having on the performance of the system.</li><li>setting: auto_priority added! — If only Lua's os.clock were more fine-grained… milliseconds are not enough for this to work as well as it could.</li><li>setting: auto_lowerbound added! — When using auto_priority this lets you set the lower bound for priority. The default is a hybrid value that was calculated to reach the max potential with a delay of .001, but it can be changed to whatever you like. Remember, this is applied to processes that perform really badly! If Lua could handle more detail in os.clock() I would set the value a bit lower, around .0005.</li><li>setting: auto_stretch added! — Another way to modify the extent of the lowest setting. This reduces the impact that a low-performing process has! Setting this higher reduces the number of times that a process is called. Only in effect when using auto_priority.</li><li>setting: auto_delay added! — Sets the time in seconds after which the system rechecks for low-performing processes and manages them. It will also upgrade a process if it starts to run better. (See the settings sketch after the chaining example below.)</li></ul>
<p>All methods that did not return anything before now return the object itself, which allows chaining. Most if not all mutators returned nil, so chaining can now be done. I will eventually write up full documentation of everything, which will show this.</p>
<pre><code class="lua">multi = require("multi")
multi:newStep(1,100):OnStep(function(self,i)
	print("Index: "..i)
end):OnEnd(function(self)
	print("Step is done!")
end)
multi:mainloop{
	priority = 3
}
</code></pre>
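<p>For illustration only, here is a minimal sketch of how the new auto_* settings might be used. It assumes they are passed through the same settings table that mainloop already accepts (as priority is above); the changelog lists the names but not the exact types or defaults, so the values here are placeholders.</p>
<pre><code class="lua">multi = require("multi")
multi:newTLoop(function()
	-- stand-in for a process whose cost varies over time
end, 0.1)
-- Assumption: auto_priority and friends are mainloop settings like priority;
-- adjust names/values against the actual release if they differ.
multi:mainloop{
	auto_priority   = true,  -- let the scheduler demote/promote processes on its own
	auto_lowerbound = .001,  -- processes slower than this (seconds) get demoted
	auto_stretch    = 2,     -- illustrative: reduce how often demoted processes run
	auto_delay      = 3,     -- recheck low performers every 3 seconds
}
</code></pre>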
<p>Priority 3 works a bit differently than the other two.</p>
<p>P1 follows a formula that resembles ~n = I*PRank, where n is the number of steps given to an object with priority rank PRank and I is the number of steps the idle tier gets; see the chart below. The aim of this priority scheme was to make core objects run fastest while still letting idle processes get decent time.</p>
<pre><code>C: 3322269 ~I*7
H: 2847660 ~I*6
A: 2373050 ~I*5
N: 1898440 ~I*4
B: 1423830 ~I*3
L: 949220 ~I*2
I: 474610 ~I
~n=I*PRank
</code></pre>
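<p>As a quick, purely illustrative sanity check on that formula, the chart can be reproduced from the idle baseline:</p>
<pre><code class="lua">-- Reproduce the P1 chart from the idle-tier baseline: n ≈ I * PRank
local I = 474610
local tiers = {{"I",1},{"L",2},{"B",3},{"N",4},{"A",5},{"H",6},{"C",7}}
for _, t in ipairs(tiers) do
	-- C prints 3322270 vs the chart's measured 3322269; the other tiers match exactly
	print(t[1], I * t[2])
end
</code></pre>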
<p>P2 follows a formula that resembles ~n = n*4: starting from the idle tier, each tier gets roughly four times the steps of the one below it; see the chart below. The goal of this one was to push core processes much higher while keeping idle processes low.</p>
<pre><code>C: 6700821
H: 1675205
A: 418801
N: 104700
B: 26175
L: 6543
I: 1635
~n=n*4
</code></pre>
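<p>Again just as an illustration of the growth rate: starting from the measured idle tier and multiplying by four at each step lands close to the measured numbers.</p>
<pre><code class="lua">-- P2 growth: each priority tier gets roughly 4x the steps of the one below it
local n = 1635  -- measured idle-tier count from the chart
for _, tier in ipairs({"I", "L", "B", "N", "A", "H", "C"}) do
	print(tier, n)  -- 1635, 6540, 26160, ... vs measured 1635, 6543, 26175, ...
	n = n * 4
end
</code></pre>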
<p>P3 does not use a basic function; instead it bases its processing time on the amount of CPU time available. If CPU time is low and a process is set at a lower priority, it will get its time reduced. There is no formula; at idle, almost all processes work at the same speed!</p>
<pre><code>C: 2120906
H: 2120906
A: 2120906
N: 2120906
B: 2120906
L: 2120906
I: 2120506
</code></pre>
<p>Auto Priority works by deciding what should be set high or low. Because Lua does not give more precision than milliseconds, I was unable to build a detailed manager that could set things to high, above normal, normal, etc.; this one only has high or low. If a process takes longer than .001 seconds it will be set to low priority. You can change this with the setting <code>auto_lowest = multi.Priority_[PLevel]</code>. The default is low, not idle, since idle-priority processes tend to get only about one pass each second, though you can change it to idle using that setting.</p><p><strong>Improved:</strong></p><ul>
<li>Performance at the base level has been doubled! On my machine the benchmark went from ~9 mil to ~20 mil steps/s.<br>Note: if you write slow code, this library's improvements won't make much of a difference.</li><li>Loops have been optimised as well! Being the most used objects, I felt they needed to be made as fast as possible.</li></ul>
<p>I usually give an example of the changes made, but this time I have an explanation of multi.nextStep(). It's not an entirely new feature, since multi:newJob() does something like this, but it is completely different. nextStep adds a function that is executed first on the next step. If multiple things are added, they are executed in the order that they were added, as sketched below.</p>
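<p>A minimal sketch of that ordering, based only on the description above; treat the exact timing as the claim being illustrated rather than a guarantee:</p>
<pre><code class="lua">multi = require("multi")
multi:newTLoop(function()
	-- queue two functions; they should run first on the next step, in this order
	multi.nextStep(function() print("queued first") end)
	multi.nextStep(function() print("queued second") end)
end, 1)
multi:mainloop()
</code></pre>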
<p>Note:<br>The upper limit of this library's performance on my machine is ~39 mil. That number is simply a while loop counting up from 0 that stops after 1 second. The ~20 mil I am currently getting is probably as fast as it can get, since it is half of the maximum possible and I have noticed that each layer doubles the complexity. Throughout the years with this library I have seen massive improvements in speed. In the beginning we had only ~2000 steps per second. Fast, right? Then after some tweaks we went to about 300,000 steps per second, then 600,000. Some more tweaks brought me to ~1 mil steps per second, then to ~4 mil, then ~9 mil, and now finally ~20 mil… the doubling effect I have been seeing means the odds are I have reached the limit. I will aim to add more features and optimize individual objects. If it is possible to make the library even faster, I will go for it.</p>
<h2 id="update-12.1.0"><a name="update-12.1.0" href="#update-12.1.0"></a>Update 12.1.0</h2><p><strong>Fixed:</strong></p><ul>
<li>Bug causing arguments not to go through when spawning a new thread.</li></ul><p><strong>Changed:</strong></p><ul>
<li>thread.hold() now returns the arguments that were passed by the event function.</li><li>Event objects now contain a copy of the values returned by the function that triggered them, in a table called returns that lives inside the object.</li></ul>
<pre><code class="lua">package.path="?/init.lua;?.lua;"..package.path
multi = require("multi")
local a = 0
multi:newThread("test",function()
	print("lets go")
	b,c = thread.hold(function() -- hold() now returns what the condition function returned
		return b,"We did it!"
	end)
	print(b,c)
end)
multi:newTLoop(function()
	a=a+1
	if a == 5 then
		b = "Hello"
	end
end,1)
multi:mainloop()
</code></pre>
<p><strong>Note:</strong> Only if the first return value is non-nil/false will any of the other returns be passed! So while variable b above is nil, the string “We did it!” will not be passed along. Also, while this seems simple enough to get working, I had to modify a bit of how the scheduler worked to add such a simple feature. Quite a bit is going on behind the scenes, which made this a bit tricky to implement, but not hard; it just needed a bit of tinkering. Plus, event objects had not been edited since the creation of the EventManager. They have remained mostly the same since 2011.</p>
<h1 id="going-forward:"><a name="going-forward:" href="#going-forward:"></a>Going forward:</h1><p>Continue to make small changes as I come across them. This change was inspired while working on the net library. I was adding simple binary file support over TCP and needed to pass the data from the socket once the requested amount had been received. While upvalues did work, I felt returning data was cleaner, so I added this feature.</p>
<h2 id="update:-12.0.0-big-update-(lots-of-additions-some-changes)"><a name="update:-12.0.0-big-update-(lots-of-additions-some-changes)" href="#update:-12.0.0-big-update-(lots-of-additions-some-changes)"></a>Update: 12.0.0 Big update (Lots of additions some changes)</h2><p><strong>Note:</strong> <del>After doing some testing, I have noticed that using multi-objects is quite a bit faster than using (coroutine) multi:newThread(). Only create a thread if there is no other possibility! System threads are different and will improve performance if you know what you are doing. Using a (coroutine) thread as a loop with a timer is slower than using a TLoop! If you do not need the holding features, I strongly recommend that you use the multi-objects. This could be due to the scheduler that I am using, and I am looking into improving the performance of the scheduler for (coroutine) threads. This is still a work in progress, so expect things to only get better as time passes!</del> This was the reason threadloop was added. It binds the thread scheduler into the mainloop, allowing threads to run much faster than before. The use of locals is also now possible since I am not dealing with separate objects. And finally, reduced function overhead helps keep the threads running better.</p><h1 id="added:"><a name="added:" href="#added:"></a>Added:</h1><ul>
<li><code>nGLOBAL = require("multi.integration.networkManager").init()</code></li><li><code>node = multi:newNode(tbl: settings)</code></li><li><code>master = multi:newMaster(tbl: settings)</code></li><li><code>multi:nodeManager(port)</code></li><li><code>thread.isThread()</code> — for coroutine-based threads</li><li>New setting for the main loop: stopOnError, which defaults to true. This causes objects that crash, when under protect, to be destroyed so the error does not keep happening.</li><li>multi:threadloop(settings) works just like mainloop, but prioritizes (coroutine-based) threads. Regular multi-objects will still work. This improves the performance of (coroutine-based) threads greatly (see the sketch below).</li><li>multi.OnPreLoad — an event that is triggered right before the mainloop starts</li></ul>
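<p>A rough sketch of how threadloop, stopOnError and OnPreLoad might fit together. Two assumptions are baked in: that OnPreLoad is connected by calling it with a handler (mirroring how the changelog describes connecting to multi.OnError()), and that stopOnError goes through the same settings table that mainloop accepts.</p>
<pre><code class="lua">package.path="?/init.lua;?.lua;"..package.path
multi = require("multi")
-- Assumption: events like OnPreLoad are connected by calling them with a handler.
multi.OnPreLoad(function()
	print("the loop is about to start")
end)
multi:newThread("worker",function()
	print("inside a coroutine thread?", thread.isThread())
	local t = os.clock()
	thread.hold(function() return os.clock()-t > 1 end) -- wait roughly a second
	print("worker resumed")
end)
-- threadloop favors coroutine-based threads; plain multi-objects still run.
-- Assumption: stopOnError is passed through the settings table like priority is.
multi:threadloop{
	stopOnError = true
}
</code></pre>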
<p><strong>Changed:</strong></p><ul>
<li>When a (coroutine-based) thread errors, it does not print anymore! Connect to multi.OnError() to get errors when they happen!</li><li>Connections get yet another update. Connect now takes an additional argument: the position in the call table at which the function should be called. Note: Fire calls methods backwards, so 1 is the back and the number of connections (the default value) is the beginning of the call table.</li><li>The love2d compat layer has been revamped, allowing module creators to connect to events without the user having to add lines of code for those events. It is all done automagically.</li><li>This library is about 8 years old, and using 2.0.0 makes it seem young. I changed it to 12.0.0 since this release has some huge changes and there were indeed 12 major releases that added some cool things. Going forward I'll use major.minor.bugfix.</li><li>multi.OnError() is now required to capture errors that are thrown when in protected mode.</li></ul><h1 id="node:"><a name="node:" href="#node:"></a>Node:</h1><ul>
<li>node:sendTo(name,data)</li><li>node:pushTo(name,data)</li><li>node:peek()</li><li>node:pop()</li><li>node:getConsole() — has only one function, print, which lets you print to the master.</li></ul><h1 id="master:"><a name="master:" href="#master:"></a>Master:</h1><ul>
<li>master:doToAll(func)</li><li>master:register(name,node,func)</li><li>master:execute(name,node,…)</li><li>master:newNetworkThread(tname,func,name,…)</li><li>master:getFreeNode()</li><li>master:getRandomNode()</li><li>master:sendTo(name,data)</li><li>master:pushTo(name,data)</li><li>master:peek()</li><li>master:pop()</li><li>master:OnError(nodename, error) — if a node has an error this is triggered.</li></ul><h1 id="bugs"><a name="bugs" href="#bugs"></a>Bugs</h1><ul>
<li>Fixed a small typo I made which caused a hard crash when a (coroutine) thread crashes. This only happened if protect was true.</li></ul><h1 id="going-forward:"><a name="going-forward:" href="#going-forward:"></a>Going forward:</h1><ul>
<li>I am really excited to finally get this update out there, but I left one important thing out: enabling environments for each master connected to a node. This would allow a node to isolate code from multiple masters so they cannot interact with each other. This will come out in version 12.1.0, but it might take a while due to the job hunt that I am currently going through.</li><li>Another feature that I am on the fence about is adding channels. They would work like queues, but are named, so you can separate the data into different channels where only one part of the system can see certain data.</li><li>I also might add a feature that allows different system threads to consume from a network queue if they are spawned on the same physical machine. This is possible at the moment; it just doesn't have a dedicated object for handling it seamlessly. You can do this yourself though.</li><li>Another feature that I am thinking of adding is crosstalk, a setting that would allow nodes to talk to other nodes. I did not add it in this release since there are some issues that need to be worked out and it is very messy at the moment. However, since nodes are named, I may allow pushing data to another node by default, but not sync the global table, since that is where the issue lies.</li><li>Improve Performance</li><li>Fix supporting libraries (Bin and net need tons of work)</li><li>Look for the bugs</li><li>Figure out what I can do to make this library more awesome</li></ul>
<p><strong>Note On Queues:</strong> Network queues only send one way. What I mean by that is that if the master sends a message to a node, its own queue will not get populated at all. The reason is that syncing who popped what from which network queue would make things really slow and would not perform well at all. This means you have to code a bit differently: use master:getFreeNode() to get the name of the node under the least amount of load, then handle the sending of data to each node that way (a sketch follows below).</p>
<p>Now there is a little trick you can do. If you combine both the networkManager and the systemThreading manager, you can have a proxy queue for all system threads that pull from that “node”. Data passing within a LAN (and a WAN if using the node manager, though p2p isn't working as I would like and you would need to open ports to make it work; remember you can define a port for your node, so you can port forward that if you want) is fast enough, but the waiting problem is something to consider. Ask yourself what you are coding and whether network parallelism is worth using.</p>
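<p>Here is a rough sketch of that "send to the least-loaded node" pattern, using only the calls listed above. The name field in the settings table is an assumption (the changelog only says newMaster/newNode take a settings table), and the node-side consumption of the queue is left out.</p>
<pre><code class="lua">package.path="?/init.lua;?.lua;"..package.path
multi = require("multi")
nGLOBAL = require("multi.integration.networkManager").init()
-- Assumption: the settings table accepts a name; adjust to your setup.
master = multi:newMaster{name = "master1"}
multi:newTLoop(function()
	local node = master:getFreeNode() -- name of the node under the least load
	if node then
		master:pushTo(node, "work unit queued at "..os.time())
	end
end, 1)
multi:mainloop()
</code></pre>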
<p><strong>Note:</strong> These examples assume that you have already connected the nodes to the node manager. You do not need to use the node manager, but sometimes broadcast does not work as expected and the master does not connect to the nodes. Using the node manager offers nice features like removing nodes from the master when they have disconnected and automatically telling the master when nodes have been added. A more complete example, showing connections working regardless of order, is in the examples folder; check it out. New naming scheme too.</p>
<p><strong>NodeManager.lua</strong></p>
<pre><code class="lua">package.path="?/init.lua;?.lua;"..package.path
multi = require("multi")
local GLOBAL, THREAD = require("multi.integration.lanesManager").init()
nGLOBAL = require("multi.integration.networkManager").init()