I'm interested in publishing a mixed C++/Python package using TBB on PyPI, the Python Package Index. Binary packages for Python (e.g. on Linux) are deployed by compiling them on a very old CentOS-based environment defined by the "manylinux" standard, which pins base versions of core libraries like 'libc', 'libm', etc.
However, TBB is not part of the "manylinux" core libraries, so by default any Python project requiring TBB must bundle it. Bundling TBB itself is easy, no problems there. However, I am concerned about what would happen if multiple Python extensions bundle their own TBB libraries with different versions and load them into the same interpreter. That's not only wasteful but also sounds like a recipe for symbol clashes that would cause the program to segfault shortly thereafter.
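To see how much duplication this actually causes in practice, one could scan an environment's site-packages for vendored copies of the TBB shared library (tools like auditwheel typically place bundled libraries in `<pkg>.libs/` directories). A minimal sketch; the function name and scan strategy are my own, not part of any existing tool:

```python
import pathlib
import sysconfig

def find_bundled_tbb(root):
    """Return relative paths of all vendored libtbb copies under 'root',
    e.g. shared libraries placed in '<pkg>.libs/' dirs by auditwheel."""
    root = pathlib.Path(root)
    return sorted(str(p.relative_to(root)) for p in root.rglob("libtbb*.so*"))

# Example: scan the current interpreter's site-packages.
site_packages = sysconfig.get_paths()["purelib"]
for lib in find_bundled_tbb(site_packages):
    print(lib)
```

If this prints several hits with different version suffixes (e.g. `libtbb.so.2` next to `libtbb.so.12`), that is exactly the conflict scenario described above.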
Another way to accomplish the same goal would be if there were a "tbb" package on PyPI that ships TBB so that different Python extensions can compile against a consistent version. And that actually exists: [tbb]
But looking more closely, I don't understand this package at all. It only covers a subset of platforms (linux, win64, no macos/...), and it doesn't include header files, making it impossible to use in a package compilation step. There is another package [tbb-devel], which appears abandoned/out of sync with [tbb].
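For what it's worth, if a complete "tbb-devel"-style package existed, a downstream extension's build script could locate its headers and libraries roughly like this. This is a minimal sketch only; the layout under `site-packages/tbb/` is purely an assumption, since the existing packages don't ship headers at all:

```python
import pathlib
import sysconfig

def tbb_build_paths(site_packages):
    """Guess the include/lib directories of a hypothetical 'tbb-devel'
    package that installs its files under site-packages/tbb/."""
    root = pathlib.Path(site_packages) / "tbb"  # assumed layout
    return {
        "include": root / "include",  # would hold the tbb/*.h headers
        "lib": root / "lib",          # would hold libtbb.so / tbb.lib
    }

# A setup.py could then feed these into
# Extension(..., include_dirs=[...], library_dirs=[...]).
paths = tbb_build_paths(sysconfig.get_paths()["purelib"])
```

The point is just that one shared, header-carrying package would let every extension resolve the same paths at build time instead of vendoring its own copy.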
For many years, I've been maintaining a CMake overlay over TBB (https://github.com/wjakob/tbb), and I'm tempted to publish the build output of that to PyPI with more complete coverage + headers. But that would create yet another package and likely add to the confusion.
Before taking this step, I was wondering if anybody knows the role of those existing Python packages. Is anyone maintaining them? How are they used by Intel, and are they even meant to be used by others?