Intel® oneAPI Threading Building Blocks
Ask questions and share information about adding parallelism to your applications when using this threading library.

Another tbb::pipeline Problem / Discussion

jaredkeithwhite
New Contributor I

Ok, so I have read quite a few of the posts on here about tbb::pipeline, and I think that my application is definitely a candidate for this processing model. Essentially, I receive a very large stream of data from various sensor sources (7 different types of messages from my sensor, to be exact), and then a variable number of instruments (as few as 5, as many as 500) read that sensor data and produce calculation results. The model flows:

sensors ==> preprocess ==> instrumentset ==> (instruments: A, B, C, D .... etc) ==> clean up buffers / return memory to pool

Each instrument within the instrumentset calculates one or more output results (some keep track of sensor data for the previous 20 minutes, and calculate various averages, scales, etc).

Sensor data arrives in real-time fashion. It's positively inconceivable that no sensor data would be arriving. Therefore, it seems I can avoid some of the issues that others have had regarding pipeline starvation -- where they periodically run out of data to process, and need to stop / start the pipeline. That will never be my problem.

Having said that, I would like your comments on the following process:

My first tbb::filter is a wrapper around the sensor entrypoint into my application
- tbb::filter(serial_out_of_order) -- open for debate on this.
- there are methods "onmessage(type1)", "onmessage(type2)", etc., which the sensor thread invokes
- each of these methods is overridden, and the messages are pushed onto a queue
- these methods are invoked asynchronously, but only by 1 other thread
- the tbb::filter::operator()(void*) method pops an item from the queue and returns it (sketched below)
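
To make that concrete, here is a rough sketch of what I have in mind for the first stage, using the classic tbb::pipeline / tbb::filter interface. SensorMessage and its fields are placeholders, and I am assuming a tbb::concurrent_bounded_queue so the filter can block on pop while waiting for the sensor thread:

```cpp
#include "tbb/pipeline.h"
#include "tbb/concurrent_queue.h"

// Placeholder for one of the 7 sensor message types; the real fields differ.
struct SensorMessage {
    int    type;
    double value;
    bool   valid;
};

// First stage: serial (out-of-order) filter that pops whatever the sensor
// thread has queued and feeds it into the pipeline.
class SensorInputFilter : public tbb::filter {
public:
    explicit SensorInputFilter(tbb::concurrent_bounded_queue<SensorMessage*>& q)
        : tbb::filter(tbb::filter::serial_out_of_order), queue_(q) {}

    void* operator()(void* /*ignored for the input stage*/) {
        SensorMessage* msg = NULL;
        queue_.pop(msg);   // blocks until the sensor thread pushes a message
        return msg;        // returning NULL would end the pipeline, e.g. after
                           // popping a special shutdown sentinel
    }

private:
    tbb::concurrent_bounded_queue<SensorMessage*>& queue_;
};
```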


My second tbb::filter is a very lightweight preprocess
- tbb::filter(parallel)
- literally a few instructions -- basically just error checking (making sure a few fields are within a reasonable range); a sketch follows below
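
Roughly, I picture the second stage as nothing more than this (is_in_valid_range and its bounds are made up for the sake of the example):

```cpp
// Hypothetical validation helper: checks that a few fields are in a sane range.
inline bool is_in_valid_range(const SensorMessage& msg) {
    return msg.value >= 0.0 && msg.value < 1.0e6;   // placeholder bounds
}

// Second stage: parallel filter that only does cheap error checking.
class PreprocessFilter : public tbb::filter {
public:
    PreprocessFilter() : tbb::filter(tbb::filter::parallel) {}

    void* operator()(void* item) {
        SensorMessage* msg = static_cast<SensorMessage*>(item);
        msg->valid = is_in_valid_range(*msg);   // just flag bad messages; later
                                                // stages decide what to do
        return msg;
    }
};
```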

My third tbb::filter is responsible for receiving the message and distributing it to all of the instruments
- tbb::filter(parallel)
- some instruments take longer than others
- this is the area that I want to parallelize, and it can benefit enormously from this model (a sketch follows below)
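
Something along these lines is what I have in mind for the third stage. Instrument and on_message are placeholders, and the inner parallel_for is optional; either way, since the filter is parallel, each instrument has to protect or privatize its rolling 20-minute history:

```cpp
#include "tbb/parallel_for.h"
#include "tbb/blocked_range.h"
#include <vector>

// Placeholder interface; real instruments keep rolling state, averages, etc.
class Instrument {
public:
    virtual void on_message(const SensorMessage& msg) = 0;
    virtual ~Instrument() {}
};

// Third stage: parallel filter that fans each message out to all instruments.
class InstrumentSetFilter : public tbb::filter {
public:
    explicit InstrumentSetFilter(std::vector<Instrument*>& instruments)
        : tbb::filter(tbb::filter::parallel), instruments_(instruments) {}

    void* operator()(void* item) {
        SensorMessage* msg = static_cast<SensorMessage*>(item);
        if (msg->valid) {
            tbb::parallel_for(tbb::blocked_range<size_t>(0, instruments_.size()),
                [&](const tbb::blocked_range<size_t>& r) {
                    for (size_t i = r.begin(); i != r.end(); ++i)
                        instruments_[i]->on_message(*msg);
                });
        }
        return msg;   // always forward so the cleanup stage can recycle it
    }

private:
    std::vector<Instrument*>& instruments_;
};
```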

My fourth tbb::filter is a cleanup operation, fairly lightweight
- returns data to a pool, which the sensor can then use for the next message
- I haven't chosen a memory pool yet, but I might write one myself (a sketch of this stage follows below)
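
The last stage would then just hand the buffer back to whatever pool I end up with. MessagePool is a hypothetical interface here, and the filter is marked parallel on the assumption that the pool is thread-safe (otherwise it should be serial):

```cpp
// Hypothetical pool interface; the real pool is still to be chosen or written.
class MessagePool {
public:
    virtual SensorMessage* acquire() = 0;
    virtual void release(SensorMessage* msg) = 0;
    virtual ~MessagePool() {}
};

// Fourth stage: return the buffer so the sensor thread can reuse it.
class CleanupFilter : public tbb::filter {
public:
    explicit CleanupFilter(MessagePool& pool)
        : tbb::filter(tbb::filter::parallel), pool_(pool) {}

    void* operator()(void* item) {
        pool_.release(static_cast<SensorMessage*>(item));
        return NULL;   // last stage; the return value is ignored
    }

private:
    MessagePool& pool_;
};
```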

I'm sure everyone has a few holes to punch in the above design. Punch away. Also, if you can recommend a better queue structure than concurrent_queue for my first tbb::filter, please do.
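
For reference, this is roughly how I would expect to wire the four stages together. The token count of 16 is an arbitrary starting point; it bounds how many messages are in flight at once, and therefore how much of the pool can be checked out:

```cpp
void run_sensor_pipeline(tbb::concurrent_bounded_queue<SensorMessage*>& queue,
                         std::vector<Instrument*>& instruments,
                         MessagePool& pool) {
    // (older TBB versions also need a tbb::task_scheduler_init object alive here)
    SensorInputFilter   input(queue);
    PreprocessFilter    preprocess;
    InstrumentSetFilter dispatch(instruments);
    CleanupFilter       cleanup(pool);

    tbb::pipeline pipeline;
    pipeline.add_filter(input);
    pipeline.add_filter(preprocess);
    pipeline.add_filter(dispatch);
    pipeline.add_filter(cleanup);

    pipeline.run(16);    // runs until the input filter returns NULL
    pipeline.clear();
}
```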


Regards,
Jared

Alexey-Kukanov
Employee
The overall design, as described, makes sense to me. A few notes:

- for the input filter, the distinction between in-order and out-of-order has only one practical consequence: whether the order for subsequent ordered filters is set at the input stage or later. It looks like you do not use ordered filters at all in this setup, so making the input filter unordered seems fine.

- for the queue question: from what you said, it seems the queue will have a single producer (your other thread) and a single virtual consumer (since the input filter is serial, the TBB threads executing it will not contend). Thus you might benefit from a queue optimized for the single-producer/single-consumer model; I believe Dmitry Vyukov referenced his implementation on the forum. Make sure, however, that any queue implementation you choose does not require a single actual consumer (e.g. by depending on thread-local variables).

- I would also experiment with combining the 2nd and the 3rd filters to see which way performance is better (since you said the second stage is very lightweight, it might not justify the overhead of another filter). On second thought, if the 2nd stage does error checking and can cancel the whole pipeline or throw an exception, it might make sense to keep it separate from the 3rd one. A rough sketch of the combined stage is below, in case you want to try it.
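
For illustration, a combined stage might look roughly like this, reusing the placeholder names from the earlier sketches; treat it as an outline rather than a recommendation:

```cpp
// One parallel filter that validates and then dispatches, instead of two.
class PreprocessAndDispatchFilter : public tbb::filter {
public:
    explicit PreprocessAndDispatchFilter(std::vector<Instrument*>& instruments)
        : tbb::filter(tbb::filter::parallel), instruments_(instruments) {}

    void* operator()(void* item) {
        SensorMessage* msg = static_cast<SensorMessage*>(item);
        if (is_in_valid_range(*msg)) {            // placeholder range check
            for (size_t i = 0; i < instruments_.size(); ++i)
                instruments_[i]->on_message(*msg);
        }
        return msg;   // forward even bad messages so the cleanup stage recycles them
    }

private:
    std::vector<Instrument*>& instruments_;
};
```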
