EDIT: found the answer - https://www.youtube.com/watch?v=8aGhZQkoFbQ
_
Okay, so I have some C# background, and C#'s "async environment" is a bit of a mixed bag of concurrency and "small parallelism": in a heavily async environment you can get race conditions and deadlocks, and you need to protect shared resources.
Now I'm trying to understand how the JavaScript/ES6 async environment works. Consider the following code:
// current context: background "page"
// independent from content page
let busy = false;
// This is an event handler that receives event from content page
// It can happen at any time
runtimeObj.onMessage.addListener((request) =>
{
if(request.action === 'AddEntry')
{
AddEntry(request);
return true;
}
return false;
} );
function AddEntry(data)
{
if (!busy)
groups.push({url: data.url, time: Date.now(), session: data.session});
else
setTimeout(() => AddEntry(data), 10000); // simulating Semaphore wait
}
// called from asynchronous function setInterval()
function SendPOST()
{
if (!groups || groups.length < 1)
return;
busy = true; // JS has no semaphores so I "simulate it"
let del = [];
groups.forEach(item =>
{
if (Date.now() - item.time > 3600000)
{
del.push(item);
let xhr = new XMLHttpRequest();
let data = new FormData();
data.append('action', 'leave');
data.append('sessionID', item.session);
xhr.withCredentials = true;
xhr.onreadystatechange = function()
{
if(xhr.readyState == 4 && xhr.status !== 200) {
console.log(`Unable to part group ${item.url}! Reason: ${xhr.status}. Leave group manually.`);
}
}
xhr.open('POST', item.url, true);
xhr.send(data);
}
});
del.forEach(item => groups.splice(groups.indexOf(item), 1));
busy = false;
}
setInterval(SendPOST, 60000);
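One pitfall worth flagging in this pattern: setTimeout expects a function reference, not the result of a call. Writing setTimeout(AddEntry(data), 10000) invokes AddEntry immediately and passes its return value (undefined) to the timer. A minimal sketch of the difference, with an illustrative addEntry, not the code above:

```javascript
// setTimeout(fn(arg), delay) evaluates fn(arg) right now and hands
// its return value (here undefined) to setTimeout.
const calls = [];
function addEntry(data) { calls.push(data); }

// Wrong: addEntry runs immediately, before any timer fires.
const handler = addEntry('immediate'); // handler === undefined
// setTimeout(handler, 10000) would then schedule nothing useful
// (and in Node it throws, since handler is not a function).

// Right: wrap the call so it runs only when the timer fires.
const timer = setTimeout(() => addEntry('deferred'), 10000);
clearTimeout(timer); // cancel so this sketch exits cleanly

console.log(calls); // ['immediate']
```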
This is not really the best example, since it doesn't have a bunch of async keywords paired with functions, but in my understanding neither SendPOST() nor AddEntry() is a purely sequential operation. Nonetheless, I've been told that in AddEntry(), busy will always be false because:
sendPost is queued to execute in a minute
an event is added to the event loop
the event is processed and AddEntry is called
since busy = false the group is pushed
a minute passes and SendPost is added to the event loop
the event is processed and SendPost is called
groups.length === 1 so it continues
busy = true
each group causes a request to get queued
an event comes in
busy = false
the event is processed, AddEntry is called
busy is false, like it will always be
group is pushed
eventually the requests from before are resolved, and the onreadystatechange callbacks are put on the event loop
eventually each of the callbacks are processed and the logging statements are executed
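If it helps, the run-to-completion guarantee behind that walkthrough can be sketched directly (the names here are illustrative, not from the original code):

```javascript
// Run-to-completion: a queued callback can never interleave with the
// middle of a synchronous function, so a "message handler" can never
// observe busy === true while a SendPOST-like body is running.
const log = [];
let busy = false;

function sendPOSTLike() {
  busy = true;
  log.push(`start, busy=${busy}`);
  // ...synchronous work here; nothing can interleave...
  busy = false;
  log.push(`end, busy=${busy}`);
}

// Schedule a "handler" with a 0 ms delay, then do the work anyway.
setTimeout(() => log.push(`handler sees busy=${busy}`), 0);
sendPOSTLike();

// The timer callback only runs after the current stack unwinds,
// so it always logs busy=false.
```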
Is this correct? From what I understand, it essentially means there can be no race conditions or deadlocks, and I never need to protect a shared resource.
If I were to write similar code for the C# runtime, where SendPOST() and AddEntry() are asynchronous Task methods that can be called in a non-blocking way from different event sources, there could be a situation where I access a shared resource while the iterating context is temporarily suspended in a context switch by the thread scheduler.
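One caveat I'd add to the "no race conditions ever" conclusion: it holds for synchronous code, but an async function yields at every await, and other tasks can touch shared state in between. That gives you the same logical races C# developers know, just without data races on memory. A sketch of a lost update under that assumption (illustrative code, not from the snippet above):

```javascript
// Logical race across await points: both tasks read the same stale
// value before either writes, producing a lost update - no threads
// required.
let counter = 0;

async function incrementSlowly() {
  const read = counter;      // read shared state
  await Promise.resolve();   // yield to the event loop here
  counter = read + 1;        // write back a possibly stale value
}

const done = Promise.all([incrementSlowly(), incrementSlowly()])
  .then(() => console.log(counter)); // logs 1, not 2: a lost update
```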
EDIT: as the quoted text in my post explains, I changed groups.length < 1 || groups === undefined into !groups || groups.length < 1, since the former might throw an error.