arxiv:2602.20857

Functional Continuous Decomposition

Published on Feb 24 · Submitted by Teymur Aghayev on Feb 26

Abstract

Functional Continuous Decomposition (FCD) performs parametric optimization of time-series data with guaranteed continuity, capturing both local and global patterns and improving machine learning model performance through richer feature extraction.

AI-generated summary

The analysis of non-stationary time-series data requires insight into its local and global patterns with physical interpretability. However, traditional smoothing algorithms, such as B-splines, Savitzky-Golay filtering, and Empirical Mode Decomposition (EMD), lack the ability to perform parametric optimization with guaranteed continuity. In this paper, we propose Functional Continuous Decomposition (FCD), a JAX-accelerated framework that performs parametric, continuous optimization over a wide range of mathematical functions. Using Levenberg-Marquardt optimization to achieve up to C^1 continuous fitting, FCD transforms raw time-series data into M modes that capture different temporal patterns, from short-term fluctuations to long-term trends. Applications of FCD include physics, medicine, financial analysis, and machine learning, where it can be used to analyze a signal's temporal patterns, optimized parameters, derivatives, and integrals of the decomposition. FCD supports physical analysis and feature extraction with an average SRMSE of 0.735 per segment and a runtime of 0.47 s for a full decomposition of 1,000 points. Finally, we demonstrate that a Convolutional Neural Network (CNN) enhanced with FCD features, such as optimized function values, parameters, and derivatives, achieved 16.8% faster convergence and 2.5% higher accuracy than a standard CNN.

Community

Paper author Paper submitter

I am excited to announce my latest research: Functional Continuous Decomposition (FCD) - a JAX-accelerated framework designed for parametric, continuous signal decomposition.

Traditional signal processing algorithms like Empirical Mode Decomposition (EMD) often lack continuity, while smoothing techniques such as B-splines cannot provide an analytical formulation.

FCD addresses these limitations by decomposing a signal over a wide range of mathematical functions with guaranteed continuous modes. Applications of FCD include physics, medicine, financial analysis, and machine learning, where it can be used to analyze a signal's temporal patterns, optimized parameters, derivatives, and integrals of the decomposition.
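To make the idea concrete, here is a minimal sketch of fitting one parametric "mode" (a damped sinusoid, chosen only for illustration) to a signal with a basic Levenberg-Marquardt loop. This is not the paper's implementation: FCD's actual function presets, continuity constraints, and JAX acceleration are not reproduced here, and NumPy is used for brevity.

```python
import numpy as np

def mode(t, p):
    # Hypothetical mode shape: a damped sinusoid with parameters
    # (amplitude, decay rate, angular frequency, phase).
    a, lam, w, phi = p
    return a * np.exp(-lam * t) * np.sin(w * t + phi)

def jacobian(t, p, eps=1e-6):
    # Forward-difference Jacobian of the model w.r.t. the parameters.
    J = np.empty((t.size, p.size))
    f0 = mode(t, p)
    for i in range(p.size):
        dp = p.copy()
        dp[i] += eps
        J[:, i] = (mode(t, dp) - f0) / eps
    return J

def levenberg_marquardt(t, y, p, mu=1e-2, iters=50):
    # Classic damped Gauss-Newton: solve (J^T J + mu I) step = J^T r,
    # shrinking the damping mu on accepted steps and growing it otherwise.
    for _ in range(iters):
        r = y - mode(t, p)
        J = jacobian(t, p)
        step = np.linalg.solve(J.T @ J + mu * np.eye(p.size), J.T @ r)
        p_new = p + step
        if np.sum((y - mode(t, p_new)) ** 2) < np.sum(r ** 2):
            p, mu = p_new, mu * 0.5   # accept step, relax damping
        else:
            mu *= 2.0                 # reject step, damp harder
    return p

t = np.linspace(0.0, 4.0, 200)
true_p = np.array([1.0, 0.3, 5.0, 0.2])
y = mode(t, true_p)
fit = levenberg_marquardt(t, y, np.array([0.8, 0.2, 4.8, 0.0]))
```

In FCD this per-mode fit would be written in JAX (so the Jacobian comes from automatic differentiation and the loop can be JIT-compiled), and the fitted modes are subtracted and refitted to separate short-term from long-term structure.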

Key achievements:

✅ 16.8% faster CNN convergence by using FCD-derived features over a standard CNN.

✅ Guaranteed value and derivative continuity of each mode, and default presets for commonly used functions and initial guesses.

✅ Full JAX implementation for efficient fitting on massive datasets.
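The C^1 guarantee above means adjacent segments agree in both value and first derivative at their junction. A minimal sketch of that constraint, using quadratic segments purely for illustration (not FCD's actual function presets): the right segment's constant and linear coefficients are pinned to the left segment's value and slope at the knot, so continuity holds by construction.

```python
import numpy as np

def poly(c, t):
    # Evaluate c[0] + c[1]*t + c[2]*t^2 (coefficients in ascending order).
    return np.polyval(c[::-1], t)

knot = 1.0
left = np.array([0.5, 1.2, -0.3])                # fitted left-segment coefficients
val = poly(left, knot)                           # left value at the knot
der = np.polyval(np.polyder(left[::-1]), knot)   # left derivative at the knot

# Right segment in local coordinates s = t - knot:
# f(s) = val + der*s + c2*s^2, so f(0) and f'(0) match the left segment
# exactly; only c2 remains free for the optimizer to fit the data.
c2 = -0.8
right = np.array([val, der, c2])

assert abs(poly(right, 0.0) - poly(left, knot)) < 1e-12  # value continuity
```

Pinning boundary coefficients this way removes the continuity constraint from the optimization problem entirely, rather than enforcing it with a penalty term.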

Check out the full paper on arXiv:

http://arxiv.org/abs/2602.20857

If you are interested, please leave a comment or feedback.

#MachineLearning #SignalProcessing #JAX #AI #Research #Optimization #VGTU #DeepLearning
