Hi everyone,

For the FFT, is there a formula to compute how many input samples (clock cycles) are needed before the first output sample appears, as a function of the transform length? I have simulated transform lengths of 64 and 1024 and observed that 169 and 2108 clock cycles (input samples) were needed before the first output, respectively. So far I don't see the connection between the transform length and the number of cycles required.

Thanks and best regards,
grit
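With only the two measurements above, one rough sanity check is to fit a straight line (latency ≈ a·N + b) through them and see whether the trend looks linear in the transform length N. This is just a two-point fit on the numbers reported here, not the core's documented latency formula, so it should only be treated as a starting guess until more lengths are simulated:

```python
# Two observed measurements from the simulations described above:
# (transform length N, clock cycles until the first output sample).
points = [(64, 169), (1024, 2108)]

# Fit a straight line latency = a*N + b through the two points.
(n1, c1), (n2, c2) = points
a = (c2 - c1) / (n2 - n1)
b = c1 - a * n1

print(f"latency ~= {a:.2f}*N + {b:.1f}")
```

This yields roughly latency ≈ 2N + 40, which suggests a per-sample load phase plus a fixed pipeline overhead, but simulating a third length (e.g. 256) would be needed to confirm the model holds.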
0 Replies