Intel® oneAPI Threading Building Blocks
Ask questions and share information about adding parallelism to your applications when using this threading library.

Nondeterministic processing order for function node with queueing policy

Zhan__Larry
Beginner

When a function node with the queueing policy receives input faster than it can handle, the next job it processes can come directly from the input even if its internal buffer isn't empty.

    For example, the attached code processes a list of consecutive numbers in a simple TBB graph:

    tbb graph: source -> limiter -> func1 -> terminal
                    where terminal is a function_node with queueing policy, and is serial
    result: the terminal node processes input out of order when multiple TBB threads are assigned

    Say

    1. the terminal node is processing input 1
    2. func1 pushes input 2; 2 is now in the terminal node's internal buffer
    3. func1 pushes input 3, and at the same time the terminal node finishes processing input 1
    4. the next job the terminal node processes can be either 2 or 3

    Since the function node has the queueing policy, items are always pushed rather than pulled. So when a job is done, how does the function node decide where to get the next job? It isn't obvious to me from the source code: https://github.com/oneapi-src/oneTBB/blob/2019_U8/include/tbb/internal/_flow_graph_node_impl.h#L250

      Questions:

      • Is this a bug or expected behaviour?
      • If it's expected, does that mean I have to use a sequencer_node to guarantee the order?

      Sample Code Result

      ➜  bin ✗ ./functionNodeTester
      terminal node actual: 1713, expected:1712
      terminal node actual: 1714, expected:1713
      terminal node actual: 1712, expected:1714

        Sample Code

        #include <iostream>
        
        #include "tbb/flow_graph.h"
        #include "tbb/task_scheduler_init.h"
        
        using namespace tbb::flow;
        
        struct TerminalNode_t {
            continue_msg operator()(int v)
            {
                if (v != counter)
                    std::cout << "terminal node actual: " << v << ", expected:" << counter << std::endl;
                counter++;
                return continue_msg();
            }
        private:
            int counter = 0;
        };
        
        static int const THRESHOLD = 3;
        static int const CYCLES = 10000;
        
        int main()
        {
            int count = 0;
            
            tbb::task_scheduler_init init(3);
        
            graph g;
            source_node<int> input(g,
                [&count](int& output) -> bool {
                    if (count < CYCLES)
                    {
                        output = count;
                        count++;
                        return true;
                    }
                    
                    return false;
                });
            
            limiter_node<int> l( g, THRESHOLD);
            function_node<int,int> func1( g, serial, [](const int& val){ return val; } );
            function_node<int, continue_msg> terminal( g, serial, TerminalNode_t() );
        
        
            make_edge( l, func1 );
            make_edge( func1, terminal );
            make_edge( terminal, l.decrement );  // each finished item lets the limiter admit another
            make_edge( input, l );
        
            g.wait_for_all();
            return 0;
        }


        4 Replies
        PatrickBeaulieu
        Beginner

        Any feedback, Intel?

        Is this a bug to be fixed or are we forced to use sequencer nodes throughout our pipeline if we need to guarantee serial processing order in a part of our pipeline?

        Jordan_O
        Beginner

        Have you tried constructing the source node as inactive and then activating it after connecting the edges?

        Mark_L_Intel
        Moderator

        I apologize for the long delay. I have asked the engineering team for help with your question.


        lackhole
        Beginner

        This bug still exists. Any updates?


        -----


        I found this issue: https://github.com/oneapi-src/oneTBB/issues/289. Nevermind.
