In my threadpool engine the worker threads enter a wait state
when there is no job in the lock-free queues - for more efficiency -
but if you look at the threadpool engine source code, I am using
the following code on the producer side:

So a question follows:
if, for example, the consumer thread is in its wait state and the
producer pushes two items in a row, can it forget to process the
second item, because the consumer thread will reset the event while
one item is still on the queue?
No, because the consumer thread checks
the number of items with the following code on the consumer side:
if ThreadPool.Queues[self.threadcount].count <> 0
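To make that answer concrete, here is a minimal sketch of the pattern in Python (an assumption on my part - the engine itself is in Object Pascal; threading.Event stands in for the event and collections.deque stands in for the lock-free queue). The consumer resets the event first and then re-checks the queue count, so an item that was pushed just before the reset is still drained:

```python
import threading
from collections import deque

queue = deque()            # stands in for the lock-free queue
event = threading.Event()  # stands in for the event the workers wait on
results = []

def producer():
    # Producer side: push the job, then signal the event.
    for job in range(2):   # push two items back to back
        queue.append(job)
        event.set()

def consumer():
    processed = 0
    while processed < 2:
        event.wait()       # sleep while there is no job
        event.clear()      # reset the event first...
        # ...then drain the queue: this re-check is why the second
        # item cannot be forgotten, even if the event was reset
        # while that item was already sitting on the queue.
        while len(queue) != 0:   # mirrors: if Queues[...].count <> 0
            results.append(queue.popleft())
            processed += 1

c = threading.Thread(target=consumer)
p = threading.Thread(target=producer)
c.start(); p.start()
p.join(); c.join()
print(results)  # both items are processed, in FIFO order: [0, 1]
```

Without the re-check loop, resetting the event after the first pop would leave the second item stranded until the next push wakes the worker again.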
Amine Moulay Ramdane.
As you will notice, my threadpool engine is a simple and efficient threadpool engine; I have designed it like that to ease the learning step for those who want to learn how to implement a simple and efficient threadpool.
On a multicore system, your goal is to spread the work efficiently among many cores so that it executes simultaneously. In the ideal case, the performance gain is directly related to how many cores you have. So a quad-core system should be able to get the work done 4 times faster than a single-core system, and a 16-core platform should be 4 times faster than a quad-core system, and 16 times faster than a single core...
That's where my Threadpool is useful: it spreads the work efficiently among many cores. The Threadpool (and the Threadpool with priority) consists of lock-free, thread-safe/concurrent local FIFO queues of work items, so when you call ThreadPool.execute(), your work item gets queued in the local lock-free queues. The worker threads pick the items out in First In First Out (i.e., FIFO) order and execute them.
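As a sketch of how the local queues behave, here is a toy pool in Python rather than the engine's Object Pascal; the names (ThreadPool, execute, threadcount) only mimic the engine's API and are illustrative, not the actual implementation. Each worker owns one FIFO queue, and execute() distributes work items round-robin:

```python
import threading
import queue
import itertools

class ThreadPool:
    """Toy sketch: one local FIFO queue per worker thread;
    execute() queues work items round-robin across the queues."""
    def __init__(self, threadcount=4):
        self.queues = [queue.Queue() for _ in range(threadcount)]
        self.next = itertools.cycle(range(threadcount))
        for i in range(threadcount):
            threading.Thread(target=self._worker, args=(i,),
                             daemon=True).start()

    def execute(self, func, *args):
        # Queue the work item on one worker's local FIFO queue.
        self.queues[next(self.next)].put((func, args))

    def _worker(self, i):
        while True:
            func, args = self.queues[i].get()  # blocks when empty
            func(*args)

# Usage: distribute 8 small jobs across 4 workers.
results = []
lock = threading.Lock()
all_done = threading.Event()

def work(n):
    with lock:
        results.append(n * n)
        if len(results) == 8:
            all_done.set()

pool = ThreadPool(threadcount=4)
for n in range(8):
    pool.execute(work, n)
all_done.wait(timeout=5)
print(sorted(results))  # [0, 1, 4, 9, 16, 25, 36, 49]
```

This sketch uses a blocking queue.Queue instead of a lock-free queue, and it has no work-stealing; it only shows the execute()-into-local-FIFO-queues shape described above.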
The following features have been added to the Threadpool:
- A lock-free MPMC queue - flqueue that I have modified, enhanced and improved...
- It uses a lock-free queue for each worker thread, and it uses work-stealing - for more efficiency -
- The worker threads enter a wait state when there is no job in the lock-free queues - for more efficiency -
- You can distribute your jobs to the worker threads and call any method with the threadpool's execute() method.
The Work-Stealing scheduling algorithm offers many features over the ordinary scheduling algorithm:
- Minimized contention: using local queues minimizes contention.
- Load balancing: every thread can steal work from the other threads, so Work-Stealing implicitly provides load balancing.
So my Threadpool provides load balancing, and it also minimizes contention.
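The stealing idea above can be sketched as follows, again in Python with illustrative names (StealingPool, push, _take are my inventions, not the engine's API), and with one coarse lock standing in for the lock-free queues: a worker drains its own FIFO queue first, and when it runs dry it takes an item from another worker's queue.

```python
import threading
from collections import deque

class StealingPool:
    """Sketch of work-stealing: each worker drains its own FIFO
    queue first, then steals from the other workers' queues."""
    def __init__(self, nthreads, njobs):
        self.nthreads = nthreads
        self.queues = [deque() for _ in range(nthreads)]
        self.lock = threading.Lock()   # guards all queues (sketch only)
        self.remaining = njobs
        self.done = threading.Event()

    def push(self, i, item):
        with self.lock:
            self.queues[i].append(item)

    def _take(self, i):
        # Own queue first (FIFO order), then steal from the others.
        with self.lock:
            for q in [self.queues[i]] + [self.queues[j]
                      for j in range(self.nthreads) if j != i]:
                if q:
                    return q.popleft()
        return None

    def run(self, handler):
        def worker(i):
            while not self.done.is_set():
                item = self._take(i)
                if item is None:
                    continue           # spin for brevity of the sketch
                handler(item)
                with self.lock:
                    self.remaining -= 1
                    if self.remaining == 0:
                        self.done.set()
        threads = [threading.Thread(target=worker, args=(i,))
                   for i in range(self.nthreads)]
        for t in threads: t.start()
        for t in threads: t.join()

# All 6 jobs land on worker 0's queue; worker 1 has nothing local,
# so anything it completes was stolen - that is the load balancing.
out = []
outlock = threading.Lock()

def handle(n):
    with outlock:
        out.append(n)

pool = StealingPool(nthreads=2, njobs=6)
for n in range(6):
    pool.push(0, n)
pool.run(handle)
print(sorted(out))  # every job processed exactly once: [0, 1, 2, 3, 4, 5]
```

A real work-stealing deque usually has the owner pop from one end and thieves pop from the other to reduce contention; the single lock here is only to keep the sketch short.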
Amine Moulay Ramdane.