
eFFT-C++: An Open-Source Implementation of the Event-Based Fast Fourier Transform

Open Access | Apr 2026

Figures & Tables

Table 1

Number of non-trivial floating-point operations required by different 1D FFT algorithms to compute an N-length data vector: Radix-2 (Rad2), Radix-4 (Rad4), Rader–Brenner (RB) [2, 3], Split-Radix (SR) [4, 5], and Quick Fourier Transform (QFT) [6, 7].

N       Rad2     Rad4     RB       SR       QFT
2^4     176      168      168      168      77
2^5     496      -        492      456      -
2^6     1296     1184     1300     1160     587
2^7     3216     -        3236     2824     -
2^8     7696     6880     7748     6664     3491
2^9     17926    -        17972    15368    -
2^10    40976    36232    41016    34824    18293

(Dashes mark sizes for which no count is reported; Radix-4 requires N to be a power of four.)
Figure 1

Scheme of the system architecture of eFFT-C++. The flow begins with asynchronous events, which are converted into stimulus abstractions and processed by the core eFFT<N> library based on a Radix-2 quadtree structure. The architecture supports operation on both single stimuli and batches of stimuli. Intermediate coefficients are handled through the Eigen3 backend, while the current spectrum is exposed via getFFT(). Validation and benchmarking modules (Google Test and Google Benchmark) operate independently from the core, and the CMake-orchestrated build system provides header-only distribution and Python bindings for seamless integration.

Table 2

Concise public API of eFFT-C++. Types: cfloat = std::complex<float> and cfloatmat = Eigen::Matrix<cfloat, Eigen::Dynamic, Eigen::Dynamic>.

Class     Method        Input        Output         Summary
eFFT<N>   eFFT          -            -              Builds lookup twiddles and allocates quadtree buffers (constructor).
eFFT<N>   ~eFFT         -            -              Releases FFTW plans when enabled; no-op otherwise (destructor).
eFFT<N>   framesize     -            unsigned int   Compile-time frame size N as a runtime integer.
eFFT<N>   initialize    -            void           Initialize internal state from a zero image.
eFFT<N>   initialize    cfloatmat&   void           Initialize from an N×N complex image (Eigen matrix).
eFFT<N>   update        Stimulus&    bool           Apply one stimulus; returns true if the spectrum changed.
eFFT<N>   update        Stimuli&     bool           Apply a batch of stimuli; prunes redundancies.
eFFT<N>   getFFT        -            cfloatmat&     Current Fourier spectrum.
eFFT<N>   initializeGT  cfloatmat&   void           Prepare the FFTW plan and set the input image.
eFFT<N>   updateGT      Stimulus&    bool           Apply one stimulus; returns true if the spectrum changed.
eFFT<N>   updateGT      Stimuli&     bool           Apply a batch of stimuli; prunes redundancies.
eFFT<N>   getGTFFT      -            cfloatmat      Ground-truth FFT computed with FFTW (if enabled).
eFFT<N>   check         -            double         Norm of the difference ∥getFFT() - getGTFFT()∥.
Stimulus  on            -            Stimulus&      Set state to true.
Stimulus  off           -            Stimulus&      Set state to false.
Stimulus  set           bool         Stimulus&      Explicitly set state.
Stimulus  toggle        -            Stimulus&      Flip state (on/off).
Stimuli   on            -            void           Set all contained stimuli to true.
Stimuli   off           -            void           Set all contained stimuli to false.
Stimuli   set           bool         void           Apply the same state to all contained stimuli.
Stimuli   toggle        -            void           Flip the state of all contained stimuli.
Figure 2

Event-by-event benchmark (Benchmark 1) results. Google Benchmark time per iteration (wall-clock) versus frame size {16,32,64,128,256}. Each iteration processes Ne = 250 events, including random-event generation, updates, and spectrum retrieval after every event.

Figure 3

Packet-based benchmark (Benchmark 2) results. Google Benchmark time per iteration (wall-clock) versus packet size {1, 5, 10, 50, 100}×10³ for frame sizes 128 (top) and 256 (bottom). Each iteration integrates a fixed total of Ne = 5×10⁵ events. Timing includes random-event generation, packet updates, and spectrum retrieval after each packet.

Figure 4

Examples of applications of eFFT. From top to bottom: denoising, pattern analysis, and registration. (1) Denoising: a low-pass filter in the frequency domain suppresses high-frequency noise. Artificial noise (5000 random noise events; see top-left) was added to the original events. (2) Pattern analysis: a directional edge filter in the frequency domain enhances edges within a specific angular range (∼90°). Green lines (middle-right) have been thickened for better visualization. (3) Registration: two event slices from different time instants are aligned via phase cross-correlation. The sequence used is urban from the Event Camera Dataset [23].

DOI: https://doi.org/10.5334/jors.642 | Journal eISSN: 2049-9647
Language: English
Submitted on: Nov 12, 2025
Accepted on: Mar 16, 2026
Published on: Apr 16, 2026
Published by: Ubiquity Press
In partnership with: Paradigm Publishing Services
Publication frequency: 1 issue per year

© 2026 Raul Tapia, José Ramiro Martínez-de Dios, Anibal Ollero, published by Ubiquity Press
This work is licensed under the Creative Commons Attribution 4.0 License.