Intel® oneAPI Threading Building Blocks
Ask questions and share information about adding parallelism to your applications when using this threading library.

TBB 2.2 - first 'encounter'

memaher
Beginner
258 Views
As of 9am this morning, I began the process of moving from TBB 2.1 to 2.2. Although still in heavy development, our application was up and running with TBB 2.1. For reference, we are programming under 64-bit Linux on a dual-Xeon (8-core) machine. The application uses TBB to create two concurrent pipelines. This is achieved by creating two tbb_threads, then starting a pipeline in each.

The first job of the day was to remove the calls to task_scheduler_init in the 3 active threads (main + 2 pipelines). Nothing else was changed in the code, other than commenting out these calls, but I am immediately faced with a segmentation fault:

Program received signal SIGSEGV, Segmentation fault.
[Switching to Thread 0x404bb940 (LWP 16665)]
0x00002b1553e75a74 in tbb::task_group_context::init ()
from /opt/intel/tbb/tbb21_017oss/em64t/cc4.1.0_libc2.4_kernel2.6.16.21/lib/libtbb.so.2
(gdb) bt
#0 0x00002b1553e75a74 in tbb::task_group_context::init ()
from /opt/intel/tbb/tbb21_017oss/em64t/cc4.1.0_libc2.4_kernel2.6.16.21/lib/libtbb.so.2
#1 0x00002b1553e709b1 in tbb::pipeline::run () from /opt/intel/tbb/tbb21_017oss/em64t/cc4.1.0_libc2.4_kernel2.6.16.21/lib/libtbb.so.2
#2 0x00000000004241dc in ECMSI::CTIRPipeline::Start (this=0x2b155484ff60) at CTIRPipeline.cpp:69
#3 0x000000000041fde0 in ECMSI::CThreadRun::operator() (this=0x2b1554843e78) at COperationControl.h:69
#4 0x000000000041fea6 in tbb::internal::thread_closure_0<ECMSI::CThreadRun>::start_routine (c=0x2b1554843e78)
at /opt/intel/tbb/tbb21_017oss/include/tbb/tbb_thread.h:76
#5 0x0000003478206367 in start_thread () from /lib64/libpthread.so.0
#6 0x00000034776d30ad in clone () from /lib64/libc.so.6
(gdb)

The easy fix for me is to re-insert these calls (it works with v2.2 when they are enabled). Just thought you should know!

Mat.
2 Replies
pvonkaenel
New Contributor III
Quoting - memaher

From your dump it looks like you might still be linking with TBB 2.1: libtbb.so.2 is listed as being loaded from the tbb21_017oss directory.

I have also started migrating from 2.1 to 2.2 (Win32) and was able to remove task_scheduler_init without a problem. However, I'm not using a pipeline, just parallel_for.

Peter
memaher
Beginner
Doh!

Thanks for that - Monday mornings are a lame excuse.

Mat