Intel® Integrated Performance Primitives
Community support and discussions relating to developing high-performance vision, signal, security, and storage applications.

Using ippsFIRMR_16sc


I am seeing odd behaviour with ippsFIRMR_16sc. The documentation seems to indicate that it expects a complex filter:

      Ipp32fc filter_taps[max_number_taps];

I initialize my filter taps with a windowed sinc function in the real part and zeros in the imaginary part. Then I initialize p_spec as shown:

      IPP_status = ippsFIRMRInit_32fc(filter_taps, tap_length, up_factor, 0, down_factor, 0, p_spec);

When I apply the filter in the following way, the result looks like a bandstop filter. That is what I would expect to see if the complex filter created above were being used as a real filter, with my filter coefficients in the even taps and zeros in the odd taps:

       IPP_status = ippsFIRMR_16sc(buffer, sampled_buffer, sampled_size, p_spec, NULL, NULL, p_buf);

Does ippsFIRMR_16sc expect the filter to be real instead of complex?


Hi Mark,

ippsFIRMR_16s is for real filters, while ippsFIRMR_16sc is for complex ones.

Can I presume that you already called ippsFIRMRGetSize() and allocated the memory before calling Init and FIRMR? For example:

IppStatus status;
status = ippsFIRMRGetSize(tapsLen, upFactor, downFactor, ipp32f, &specSize, &bufSize);
printf("ippsFIRMRGetSize / status = %s\n", ippGetStatusString(status));
pSpec = (IppsFIRSpec_32f*)ippsMalloc_8u(specSize);
pBuf = ippsMalloc_8u(bufSize);
status = ippsFIRMRInit_32f(pTaps, tapsLen, upFactor, upPhase, downFactor, downPhase, pSpec);

And could you provide your sample code, so that I can try to reproduce the issue?

Best Regards,