Excuse me, I am testing the frequency response of the CIC filter IP using Quartus 19.1 Standard Edition and an Arria V GX device, and I have a question about the clock signal of the CIC filter IP core. I use the filter IP as a decimator to convert a PDM signal to a PCM signal. I mainly refer to this documentation for the IP's connections.
Does the clock signal (clk) determine the input sampling frequency (F_s) on page 4 of this second documentation? If so, should the clock of the filter and the clock of the input source (av_st_in_data) be the same signal? I first tried this, but the CIC filter output a constant 0. I then tried a higher-frequency clock for the filter and a lower-frequency clock for the input source. With this setup the filter's output is roughly correct, but the noise is larger than expected and does not match the filter's frequency response.
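To sanity-check what the hardware should produce, a small software reference model of a CIC decimator can be compared against the simulation output. The sketch below is a generic integrator-comb model with hypothetical parameters (3 stages, decimation by 8, differential delay 1), not the Intel IP itself, so the parameters must be set to match your configuration before comparing.

```python
# Minimal reference model of an N-stage CIC decimator (hypothetical
# parameters; not the Intel IP) for checking expected output values.

def cic_decimate(samples, stages=3, decimation=8, diff_delay=1):
    """Decimate integer samples through integrator and comb sections."""
    # Integrator section: runs at the input sample rate.
    integrators = [0] * stages
    integrated = []
    for x in samples:
        acc = x
        for i in range(stages):
            integrators[i] += acc
            acc = integrators[i]
        integrated.append(acc)

    # Keep every R-th sample (rate change by the decimation factor).
    decimated = integrated[::decimation]

    # Comb section: runs at the output sample rate.
    delays = [[0] * diff_delay for _ in range(stages)]
    out = []
    for x in decimated:
        for i in range(stages):
            delayed = delays[i].pop(0)
            delays[i].append(x)
            x = x - delayed
        out.append(x)
    return out

# A constant input of +1 settles to the filter's DC gain,
# (R * M) ** N = (8 * 1) ** 3 = 512.
print(cic_decimate([1] * 200)[-1])  # -> 512
```

Feeding the same PDM bit stream (mapped to signed values) into both this model and the simulated IP makes a mismatch in frequency response or DC level easy to spot.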
Another minor problem is that the CIC filter's output value is always non-positive when I assign the higher-frequency clock to the filter. Is this expected behavior, or is it abnormal?
When I use the same clock signal for both the filter and the input source, I get constant warnings in ModelSim:
Warning: NUMERIC_STD."=": metavalue detected, returning FALSE
The problematic signal is "cic_ii_0/core/<protected>/ready_FIFO". The warnings disappear when I assign the higher-frequency clock to the filter. This may help with debugging.
This IP has only one input clock, which drives all of its registers. The input sampling rate also depends on the "in_valid" signal. In addition, "in_ready" is an output of the IP; an input sample is accepted only when both "in_valid" and "in_ready" are asserted.
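The handshake described above can be modeled in a few lines: a toy cycle-by-cycle sketch (not the Intel IP) in which a sample is transferred only on clock cycles where both valid and ready are high.

```python
# Toy model of ready/valid streaming: the sink captures in_data only
# on cycles where both in_valid and in_ready are asserted.

def accepted_samples(data, valid, ready):
    """Return the samples transferred under the valid/ready handshake."""
    return [d for d, v, r in zip(data, valid, ready) if v and r]

# Five clock cycles: cycle 2 is not transferred because in_ready is
# low, and cycle 3 is not transferred because in_valid is low.
data  = [10, 11, 12, 13, 14]
valid = [1,  1,  1,  0,  1]
ready = [1,  1,  0,  1,  1]
print(accepted_samples(data, valid, ready))  # -> [10, 11, 14]
```

This is why driving the filter from the wrong clock, or presenting data without respecting in_ready, changes the effective input sampling rate seen by the filter.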
Please refer to Figure 7 (the timing diagram) in the user guide below and check whether your waveform differs from it.
Thank you very much for your reply! It turns out that my issue was indeed caused by timing and the in_valid signal, and I have managed to solve the main problem. The CIC filter now works with the same clock for the input signal, and the noise is gone.
A minor problem remains: the filter outputs a constant non-zero value when it should output zero. In my case, I set the output to 8 bits with Hogenauer pruning, and the filter outputs a signed decimal -48 when it should output zero. A screenshot of ModelSim's output is attached; in this case the output is supposed to attenuate to 0. Also, the filter's output decreases when it should increase, although the shape of the output is correct. Do you know what could cause these problems?
Sorry for the late reply... I found that I fed a 1-bit PDM signal into the CIC filter's input, but a signed input must be at least 2 bits wide, and this caused the problem. Thank you again for your help! It might be better if Platform Designer issued a warning when the filter's input data width is set to 1 bit.
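The 1-bit case can be illustrated numerically: in two's complement, an N-bit signed value with its most significant bit set is negative, so a 1-bit signed field can only encode 0 or -1. Every PDM '1' is then read as -1, which would explain both the non-positive, sign-inverted output and the negative DC offset; a 2-bit field encodes +1 correctly. The helper below is a generic illustration, not taken from the IP.

```python
# A 1-bit two's-complement field can only encode 0 or -1, so a PDM
# '1' fed to a signed 1-bit input is interpreted as -1.

def signed_value(bits, width):
    """Interpret an unsigned bit pattern as a two's-complement value."""
    return bits - (1 << width) if bits & (1 << (width - 1)) else bits

# 1-bit field: '1' becomes -1, so the filter output is non-positive
# and sign-inverted relative to the intended 0/+1 stream.
print([signed_value(b, 1) for b in (0, 1)])  # -> [0, -1]

# 2-bit field: '01' correctly encodes +1.
print([signed_value(b, 2) for b in (0, 1)])  # -> [0, 1]
```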
Thank you for the update and suggestions. If further support is needed in this thread, please post a response within 15 days. After 15 days, this thread will be transitioned to community support. The community users will be able to help you with your follow-up questions. If you have other questions, please open a new forum thread.