kraegarthegreat
kraegarthegreat t1_j1cnl5h wrote
Reply to comment by AdFew4357 in [D] Simple Questions Thread by AutoModerator
Kats by Meta is a good tool for investigating feature extraction. I haven't done time-series classification, but from my brief work with Kats it seemed promising.
(Look for ideas there, use better tools for implementation)
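To show the kind of thing meant by "feature extraction" here, below is a hand-rolled sketch of a few summary features a tool like Kats computes from a time series (mean, spread, trend slope, lag-1 autocorrelation). This is not the Kats API, just an illustration of the idea with the standard library:

```python
# Illustrative time-series feature extraction (NOT the Kats API).
from statistics import mean, stdev

def extract_features(series):
    n = len(series)
    mu = mean(series)
    sd = stdev(series)
    # Least-squares slope against the time index, as a crude trend feature.
    t_mean = (n - 1) / 2
    slope = sum((t - t_mean) * (x - mu) for t, x in enumerate(series)) / \
            sum((t - t_mean) ** 2 for t in range(n))
    # Lag-1 autocorrelation: how strongly each point predicts the next.
    acf1 = sum((series[i] - mu) * (series[i + 1] - mu) for i in range(n - 1)) / \
           sum((x - mu) ** 2 for x in series)
    return {"mean": mu, "std": sd, "trend_slope": slope, "acf1": acf1}

feats = extract_features([1.0, 2.0, 3.0, 4.0, 5.0])
```

Features like these can then feed a conventional classifier, which is often all a time-series classification task needs.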
kraegarthegreat t1_iyorlrr wrote
Reply to comment by SrPinko in [R] Statistical vs Deep Learning forecasting methods by fedegarzar
From my personal experience:
- Univariate with a few timesteps: XGBoost or statistical methods.
- Multivariate with many timesteps: NN-based models.
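For the univariate case, even the simplest statistical baseline is cheap and surprisingly hard to beat. A seasonal-naive forecast (repeat the value from one season earlier) is a common starting point; this is an illustrative stdlib sketch, not code from the linked paper:

```python
# Seasonal-naive baseline: forecast each future step by copying the value
# from one season earlier. Illustrative sketch only.
def seasonal_naive(history, season_length, horizon):
    return [history[-season_length + (h % season_length)] for h in range(horizon)]

# Toy series with period 3; the last full season is [11, 21, 31].
history = [10, 20, 30, 11, 21, 31]
forecast = seasonal_naive(history, season_length=3, horizon=4)
# forecast repeats the last season: [11, 21, 31, 11]
```

Any NN-based model should have to clear this kind of baseline before its extra parameters are justified.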
kraegarthegreat t1_iyor5g6 wrote
Reply to comment by Internal-Diet-514 in [R] Statistical vs Deep Learning forecasting methods by fedegarzar
This is something I have found in my research. I keep seeing people building models with millions of parameters when roughly 1k parameters achieves 99% of the performance.
kraegarthegreat t1_iyolbmo wrote
Reply to comment by picardythird in [R] Statistical vs Deep Learning forecasting methods by fedegarzar
This PLAGUES my research.
Most papers provide too little detail about the statistical methods they use as baselines to replicate them. "We outperformed ARIMA" — no model orders, no hyperparameters, no values.
kraegarthegreat t1_ivix7ir wrote
Reply to [D] Do you think there is a competitive future for smaller, locally trained/served models? by naequs
This concept is a key part of my research.
I kept seeing these massive models for time series that were frankly mediocre at best. With better preprocessing, smaller models (1k versus 1M parameters), and curated datasets, I have matched or exceeded similar works.
Smaller, curated models are the future IMO.
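To make the 1k-versus-1M gap concrete, here is a quick parameter-count comparison for fully connected layers; the layer widths are hypothetical examples, not from any specific paper:

```python
# Parameter count of an MLP: each dense layer has (n_in + 1) * n_out
# parameters (weights plus bias). Widths below are hypothetical.
def mlp_params(layer_sizes):
    return sum((n_in + 1) * n_out
               for n_in, n_out in zip(layer_sizes, layer_sizes[1:]))

small = mlp_params([8, 32, 24, 1])    # ~1.1k parameters
large = mlp_params([8, 512, 512, 1])  # ~268k parameters, and wide
                                      # transformer-style models go far beyond
```

The small model trains in seconds on a laptop; the large one needs orders of magnitude more data and compute for (often) marginal gains on narrow forecasting tasks.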
kraegarthegreat t1_j316z7p wrote
Reply to comment by enterthesun in Does anyone here use newer or custom frameworks aside from TensorFlow, Keras and PyTorch? by ConsciousInsects
I found PL (PyTorch Lightning) helped reduce boilerplate code while still giving the niceties of torch versus tf.
The main thing I like is that it abstracts the training loops while still giving you the ability to add custom code to any part of the training loop. This likely sounds weird, but check out their page. 12/10 recommend.
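To unpack "abstracts the training loop while still letting you add custom code to any part of it": the pattern is roughly hooks that the framework calls for you. This is a plain-Python sketch of the idea, not the real Lightning API:

```python
# Conceptual sketch of the Lightning pattern: the Trainer owns the loop,
# the user's module supplies training_step plus optional hooks.
# Plain Python -- NOT the actual PyTorch Lightning API.
class Trainer:
    def fit(self, module, batches, epochs=1):
        for epoch in range(epochs):
            module.on_epoch_start(epoch)      # optional hook
            for batch in batches:
                module.training_step(batch)   # user-supplied logic
            module.on_epoch_end(epoch)        # optional hook

class MyModule:
    def __init__(self):
        self.losses = []
    def on_epoch_start(self, epoch):
        pass  # override only if you need custom behavior here
    def training_step(self, batch):
        # Stand-in for forward pass + loss; here "loss" is just sum(batch).
        self.losses.append(sum(batch))
    def on_epoch_end(self, epoch):
        pass

module = MyModule()
Trainer().fit(module, batches=[[1, 2], [3, 4]], epochs=2)
```

You write only the parts that differ between projects; the boilerplate (loop structure, and in real Lightning also device placement, logging, checkpointing) lives in the trainer.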