Intel® Integrated Performance Primitives

High CPU usage when 2 applications use IPP

Tarun_D_
Beginner

Hi,

I am running 2 applications that use IPP, one for decoding and one for image processing. When I run both applications together, the CPU usage of the decoding application increases; when I stop the second application (image processing), the CPU usage of the first application decreases.

For example:

 - There are 2 applications, A and B.

 - When I run only application A, its CPU usage is X%.

 - When I also start application B, the CPU usage of A increases to (X+w)%.

My platform is Linux (Debian).

What could be the reason that starting B increases the CPU usage of A?

Chao_Y_Intel
Moderator

Hello,

Do applications A and B use large data sets for their computation (so that the CPU cache size matters for them)? If so, starting another application may take some of the cache away from the other application and increase its computation time (CPU usage).

Also, are they multithreaded applications? If so, it is better to set the correct CPU affinity for them, to make sure these applications do not migrate between CPU processors.
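For example, on Linux a process can pin itself to a fixed range of cores with sched_setaffinity(); a minimal sketch (the 0-7 core range is illustrative):

#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

/* Pin the calling process (pid 0) to logical cores first..last. */
int pin_to_cores(int first, int last)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    for (int c = first; c <= last; ++c)
        CPU_SET(c, &set);

    if (sched_setaffinity(0, sizeof(set), &set) != 0) {
        perror("sched_setaffinity");
        return -1;
    }
    return 0;
}

int main(void)
{
    /* e.g., application A pins itself to cores 0-7 */
    return pin_to_cores(0, 7) == 0 ? 0 : 1;
}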

Thanks,
Chao

Tarun_D_
Beginner

Hello Chao,

Thank you for your response.

We are decoding 15 H.264 streams of 1920×1080 resolution in parallel (application A), so yes, it is a large-data computation.

Application B takes the decoded YUV420 frames as input from application A (via shared memory) and then does some computation on them.
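(For context, the handoff is roughly like the minimal POSIX shared-memory sketch below, producer side only. The segment name "/yuv_frame" and the single-frame layout are illustrative; a real pipeline would also need a ring of buffers and synchronization.)

/* Producer side of a POSIX shared-memory handoff for one
 * 1920x1080 YUV420 frame; link with -lrt on older glibc. */
#include <fcntl.h>
#include <stdio.h>
#include <string.h>
#include <sys/mman.h>
#include <unistd.h>

enum { FRAME_W = 1920, FRAME_H = 1080 };
/* YUV420: full-size luma plane + two quarter-size chroma planes */
enum { FRAME_BYTES = FRAME_W * FRAME_H * 3 / 2 };

int main(void)
{
    int fd = shm_open("/yuv_frame", O_CREAT | O_RDWR, 0600);
    if (fd < 0) { perror("shm_open"); return 1; }
    if (ftruncate(fd, FRAME_BYTES) != 0) { perror("ftruncate"); return 1; }

    unsigned char *frame = mmap(NULL, FRAME_BYTES,
                                PROT_READ | PROT_WRITE, MAP_SHARED, fd, 0);
    if (frame == MAP_FAILED) { perror("mmap"); return 1; }

    memset(frame, 0, FRAME_BYTES);  /* the decoder writes the frame here */

    /* application B opens the same name and maps it read-only */
    munmap(frame, FRAME_BYTES);
    close(fd);
    return 0;
}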

Both applications are multithreaded, but we have already set the CPU affinity of the two applications to different cores.

Is there any way to resolve the issue?

Thanks again!

Tarun

Tarun_D_
Beginner

Hello,

I noticed a pattern while investigating this problem. My server has 2 processors, each with 8 physical cores (16 logical cores), so the server has 32 logical cores in total.

When the cores allocated to applications A and B belong to different processors, the problem occurs. But when I allocate cores belonging to the same processor to both applications, the problem does not occur.

For example, processor 1 has cores 0-15 and processor 2 has cores 16-31.

1. If I allocate cores 0-7 (processor 1) to application A and cores 16-23 (processor 2) to application B, then the CPU usage of A increases when application B starts.

2. If I allocate cores 0-7 (processor 1) to application A and cores 8-15 (processor 1) to application B, then the CPU usage of A does not increase when application B starts.

What can be the reason for this behavior? Is there any way to make it independent of which cores are used?
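(One thing I can try, if the cause is cross-socket memory traffic, is to keep the shared frame buffers on a single NUMA node; a minimal sketch assuming libnuma, with the node number and buffer size illustrative:)

/* Allocate the shared frame buffer on one NUMA node so that, with both
 * applications pinned to that node's cores, frame accesses stay local.
 * Link with -lnuma. */
#include <numa.h>
#include <stdio.h>

int main(void)
{
    if (numa_available() < 0) {
        fprintf(stderr, "NUMA is not supported on this system\n");
        return 1;
    }

    size_t bytes = 1920 * 1080 * 3 / 2;       /* one YUV420 frame */
    void *buf = numa_alloc_onnode(bytes, 0);  /* place on node 0 */
    if (buf == NULL) {
        fprintf(stderr, "numa_alloc_onnode failed\n");
        return 1;
    }

    /* ... decoder writes frames, consumer reads them ... */

    numa_free(buf, bytes);
    return 0;
}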

Thanks

Tarun

Igor_A_Intel
Employee

Hi Tarun,

Why do you consider this behavior an issue? Do you see different behavior if both of your applications don't use IPP (for example, a reference implementation of H.264 decoding and a reference implementation of your computations on YUV420)?

Regards,
Igor
