
FFT MegaCore Function Output is not as expected

Altera_Forum
Honored Contributor II

Hi,

I have generated an FFT MegaCore function in my FPGA design and simulated it using a test bench.

I compared the output with the expected values, but they do not match.

 

The parameters I selected are:

Device: Stratix II

Transform length: 1024

Data input precision: 16 bits

Twiddle precision: 14 bits

I/O data flow: Streaming

 

In the test bench I am driving the real input with integers ranging from 0 to 65535 (16 bits wide), and the imaginary input is 0. The final output is calculated as:

output = 10*log10(sqrt(out_real^2 + out_imag^2)) 

 

where out_real and out_imag are calculated as:

out_real = source_real * 2^(-exp) 

out_imag = source_imag * 2^(-exp) 
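For reference, here is the same calculation as a minimal Python sketch (the variable names mirror this post; the sign convention on exp is an assumption and is exactly what needs checking against the FFT MegaCore documentation):

```python
import math

def fft_output_db(source_real, source_imag, exp):
    # Rescale the block-floating-point output as described above:
    # out = source * 2^(-exp)
    out_real = source_real * 2.0 ** (-exp)
    out_imag = source_imag * 2.0 ** (-exp)
    # Then convert the magnitude to the posted output value:
    return 10 * math.log10(math.sqrt(out_real ** 2 + out_imag ** 2))
```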

 

 

Can anybody please help me, or let me know if my calculations are wrong?
Altera_Forum
Honored Contributor II

One mistake, I suspect, is that when we convert integer values above 32767 to binary, they become negative, since a signed representation is used.

Is this the cause of the problem?

If so, how can I give 16-bit unsigned input to the FFT MegaCore?
Altera_Forum
Honored Contributor II

 

--- Quote Start ---

One mistake, I suspect, is that when we convert integer values above 32767 to binary, they become negative, since a signed representation is used.

Is this the cause of the problem?

If so, how can I give 16-bit unsigned input to the FFT MegaCore?

--- Quote End ---

 

 

What you interpret as a positive value above 32767 will be viewed as negative by the FFT if it is set to 16-bit signed input.

For a start, use only positive values below 32767 and see. Then, for negative numbers, convert using the rule that a negative number has the same bit pattern as the positive number obtained by adding 2^16 to it.
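A minimal sketch of that conversion rule in Python (the function name is illustrative, not part of the core's interface):

```python
def to_twos_complement_16(value):
    # A negative number maps to the same 16 bits as value + 2**16,
    # exactly the rule described above.
    assert -32768 <= value <= 32767, "outside 16-bit signed range"
    return value & 0xFFFF  # equivalent to value + 2**16 for negatives

print(to_twos_complement_16(-1))      # 65535 (0xFFFF)
print(to_twos_complement_16(-32768))  # 32768 (0x8000)
print(to_twos_complement_16(1234))    # 1234 (positives are unchanged)
```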
Altera_Forum
Honored Contributor II

Now I am checking with positive numbers only, but the output is still not as expected.

Can you please verify whether my calculations are correct?
Altera_Forum
Honored Contributor II

I don't know about your FFT scaling issue with exp, as it depends on your IP's specifics. But what are you comparing against, and why do you use the log?

If you are comparing with the MATLAB FFT, you don't need the log. Just compare the FFT outputs directly; at most you may run into scaling issues.
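A sketch of that direct comparison, with NumPy's FFT standing in for MATLAB as the reference model (the hardware-side data here is a fabricated stand-in so the comparison step is runnable):

```python
import numpy as np

N = 1024
x = np.random.default_rng(0).integers(0, 32768, N)  # known positive 16-bit frame

ref = np.fft.fft(x)          # reference spectrum; no log10 needed to compare

# In practice, hw would be (source_real + 1j*source_imag) rescaled by the
# exponent, captured from the testbench. Here it is faked from the reference:
hw = ref + np.random.default_rng(1).normal(scale=1e-3, size=N)

print("max abs error:", np.max(np.abs(hw - ref)))
# A constant ratio between magnitudes would point to a scaling problem:
print("median magnitude ratio:", np.median(np.abs(hw) / np.abs(ref)))
```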
Altera_Forum
Honored Contributor II

For example, I am getting the following in simulation:

 

source_real: 2659 

source_imag: 3490 

source_exp: -10 

 

So, as per my calculation:

out_real = source_real * 2^(exp) = 2659 * 2^(-10) = 2.596679688

out_imag = source_imag * 2^(exp) = 3490 * 2^(-10) = 3.408203125

So:

output = 10*log10(sqrt(out_real^2 + out_imag^2))
       = 10*log10(sqrt(6.742745399 + 11.61584854))
       = 10*log10(4.284692981)
       = 6.319197081

 

But the value I am expecting here is -5.42.

Could you please help me?
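The arithmetic in that post is internally consistent; a quick check in Python (note it scales by 2^(exp), while the first post wrote 2^(-exp); which sign is correct should be settled by the core's documentation):

```python
import math

source_real, source_imag, exp = 2659, 3490, -10

out_real = source_real * 2.0 ** exp    # 2.5966796875
out_imag = source_imag * 2.0 ** exp    # 3.408203125

output = 10 * math.log10(math.sqrt(out_real ** 2 + out_imag ** 2))
print(output)  # ~6.3192, matching the post, not the expected -5.42
```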
Altera_Forum
Honored Contributor II

Can you please tell me how to calculate the frequency-domain output from the real and imaginary parts?

Altera_Forum
Honored Contributor II

I am calculating 10*log10() to convert the linear value to power in dB.

Anyhow, can you please tell me what scaling factor I have to use? Is it the -10 seen in source_exp?
Altera_Forum
Honored Contributor II

What are all your input values, i.e., the whole 1024-sample frame?

Altera_Forum
Honored Contributor II

The input samples (15 bits wide), as integers, are:

 

21577,18450,14977,11706,9150,7709,7613,8871,11289,14486,17963,21173,23610,24891,24820,23404,20867,17606,14132,10993,8680,7556,7798,9369,12021,15338,18799,21860,24041,25003,24592,22870,20114,16750,13308,10329,8282,7488,8069,9935,12796,16197,19610,22496,24401,25031,24281,22275,19322,15888,12513,9725,7963,7503,8416,10562,13601,17059,20389,23070,24683,24973,23896,21620,18501,15031,11752,9181,7722,7601,8840,11243,14433,17910,21127,23581,24882,24832,23436,20914,17660,14184,11036,8708,7564,7785,9336,11973,15284,18748,21820,24018,24999,24608,22906,20163,16804,13359,10370,8306,7490,8049,9898,12745,16144,19561,22457,24381,25031,24303,22314,19373,15942,12560,9760,7981,7500,8394,10521,13550,17005,20341,23036,24667,24978,23922,21663,18553,15083,11799,9214,7735,7594,8813,11201,14382,17858,21083,23549,24872,24843,23468,20959,17712,14237,11079,8735,7571,7772,9303,11927,15230,18695,21778,23992,24994,24623,22942,20210,16858,13410,10411,8329,7494,8032,9862,12696,16090,19510,22419,24360,25031,24324,22351,19423,15996,12610,9797,7999,7497,8370,10482,13499,16952,20293,23002,24652,24985,23947,21704,18606,15138,11844,9246,7747,7586,8785,11157,14328,17805,21038,23520,24862,24853,23496,21004,17765,14290,11124,8763,7578,7757,9270,11879,15177,18644,21736,23968,24988,24639,22976,20257,16911,13460,10450,8350,7493,8012,9823,12645,16035,19460,22381,24340,25032,24346,22390,19473,16050,12658,9834,8017,7494,8346,10440,13448,16897,20245,22968,24636,24991,23974,21747,18658,15190,11892,9279,7762,7578,8756,11112,14276,17751,20991,23488,24851,24864,23527,21049,17819,14342,11166,8790,7587,7744,9239,11834,15125,18592,21693,23940,24983,24655,23010,20306,16966,13512,10491,8375,7496,7993,9789,12597,15982,19411,22344,24320,25033,24366,22430,19524,16103,12708,9870,8034,7491,8323,10401,13396,16843,20197,22933,24621,24996,23998,21789,18709,15245,11938,9310,7775,7570,8729,11068,14223,17700,20946,23458,24839,24873,23557,21093,17872,14395,11211,8820,7596,7731,9207,11787,15071,18540,21650,23915,24976,24672,23045,20353,17019,13564,10531,8399,7500,7976,9752,12549,15929,19359,22303,24298,25032,24387,22467,19574,16159,12758,9908,8054,7489,8299,10360,13347,16789,20149,22897,24603,24999,24023,21829,18761,15297,11986,9344,7788,7562,8701,11025,14170,17646,20902,23427,24829,24885,23586,21139,17925,14447,11255,8849,7605,7719,9175,11740,15018,18489,21608,23888,24971,24686,23080,20401,17073,13615,10573,8425,7504,7959,9716,12500,15875,19309,22264,24276,25029,24407,22506,19623,16211,12807,9945,8073,7487,8277,10320,13295,16735,20100,22862,24586,25004,24050,21872,18812,15351,12032,9378,7804,7556,8674,10982,14119,17592,20856,23397,24817,24894,23617,21184,17978,14500,11299,8877,7614,7706,9142,11694,14963,18436,21566,23862,24965,24701,23113,20449,17126,13666,10613,8447,7508,7942,9680,12451,15821,19259,22224,24254,25029,24427,22543,19673,16265,12856,9982,8094,7484,8256,10282,13244,16682,20052,22826,24571,25008,24073,21912,18864,15406,12080,9412,7818,7549,8647,10940,14066,17539,20810,23365,24806,24904,23646,21229,18031,14553,11344,8907,7624,7696,9112,11649,14910,18384,21525,23835,24957,24715,23146,20497,17180,13718,10654,8472,7512,7925,9644,12403,15767,19209,22185,24231,25027,24446,22581,19722,16320,12905,10058,8133,7483,8212,10203,13143,16574,19955,22755,24535,25014,24120,21994,18967,15513,12174,9479,7849,7537,8595,10856,13963,17433,20718,23303,24780,24921,23702,21316,18135,14659,11434,8965,7643,7673,9050,11558,14805,18280,21436,23780,24943,24744,23213,20590,17286,13820,10738,8523,7522,7892,9574,12307,15660,19107,22105,24186,25023,24484,22654,19820,16427,13007,10095,8154,7484,8191,10165,13094,16520,19906,22719,24517,25017,24146,22035,19018,15566,12223,9514,7864,7530,8567,10812,13911,17379,20671,23268,24767,24928,23732,21360,18188,14712,11479,8996,7653,7661,9019,11514,14751,18227,21393,23753,24935,24757,23245,20637,17339,13872,10780,8548,7528,7876,9540,12259,15606,19056,22066,24164,25021,24503,22692,19870,16480,13057,10135,8175,7485,8169,10125,13044,16467,19857,22682,24499,25021,24169,22076,19068,15620,12271,9548,7880,7525,8542,10769,13859,17326,20625,23237,24754,24937,23759,21404,18240,14765,11524,9027,7664,7651,8988,11468,14699,18175,21349,23724,24928,24771,23279,20684,17394,13924,10823,8574,7533,7860,9505,12211,15552,19005,22025,24140,25018,24521,22729,19919,16534,13106,10173,8196,7484,8148,10086,12993,16413,19808,22646,24479,25023,24192,22116,19120,15673,12319,9582,7895,7521,8517,10728,13807,17272,20577,23203,24739,24944,23787,21448,18292,14818,11569,9057,7675,7640,8959,11423,14646,18123,21305,23695,24920,24784,23310,20730,17447,13976,10864,8600,7537,7844,9473,12163,15499,18953,21984,24116,25014,24539,22764,19968,16589,13157,10211,8217,7484,8129,10049,12943,16359,19759,22608,24460,25025,24215,22156,19171,15726,12367,9618,7911,7515,8492,10686,13756,17220,20530,23171,24727,24954,23814,21492,18346,14871,11614,9088,7686,7629,8929,11377,14592,18069,21261,23666,24910,24796,23342,20776,17501,14028,10907,8627,7544,7829,9438,12116,15445,18902,21943,24091,25010,24556,22800,20016,16642,13207,10252,8239,7486,8110,10010,12893,16305,19709,22571,24441,25028,24236,22195,19221,15781,12415,9653,7928,7511,8467,10644,13704,17165,20484,23138,24712,24960,23842,21534,18397,14924,11660,9118,7698,7621,8899,11333,14539,18017,21217,23638,24901,24807,23372,20821,17554,14080,10950,8653,7548,7813,9403,12068,15390,18852,21902,24067,25008,24573,22836,20065,16695,13257,10291,8260,7486,8089,9973,12844,16251,19660,22534,24420,25029,24259,22234,19272,15835,12463,9689,7944,7507,8442,10604,13652,17112,20437,23104,24698,24967,23869
Altera_Forum
Honored Contributor II

Can you just input all zeros except for the first sample (made nonzero), then compare your output with the MATLAB FFT?
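The point of this test: the FFT of a single impulse is perfectly flat, so every output bin should carry the same value, which makes scaling or ordering errors obvious. A sketch of the reference side (NumPy standing in for MATLAB):

```python
import numpy as np

N = 1024
x = np.zeros(N)
x[0] = 1000.0           # single nonzero sample at the start of the frame

X = np.fft.fft(x)

# An impulse has a flat spectrum: every bin equals x[0].
print(X[:4])                    # [1000.+0.j 1000.+0.j 1000.+0.j 1000.+0.j]
print(np.allclose(X, 1000.0))   # True
```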

Altera_Forum
Honored Contributor II

Your data vector in the post above is not 1024 samples but 1159 samples, and it lacks continuity at the wrap-around.

You must feed a known 1024-sample frame into both your FFT and your reference model.

Your power equation should not include the sqrt unless you use 20*log10(...).

Regarding scaling by exp: you can find that in your FFT documentation, or, once you get a correct signal, you can work out the right scaling yourself.
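The dB point being made, as a quick check in Python: 10*log10 of the squared magnitude equals 20*log10 of the magnitude, while applying 10*log10 to the square-rooted magnitude (the formula used earlier in this thread) gives only half of that:

```python
import math

re, im = 2659 * 2.0 ** -10, 3490 * 2.0 ** -10   # values from the earlier post

mag = math.sqrt(re ** 2 + im ** 2)

print(10 * math.log10(re ** 2 + im ** 2))   # power in dB
print(20 * math.log10(mag))                 # same number, equivalent form
print(10 * math.log10(mag))                 # the thread's formula: half the value
```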