OPEN-SOURCE SCRIPT

Universal Moving Average

🙏🏻 UMA (Universal Moving Average) represents the most natural and probably 'the' final general universal entity for calculating a rolling typical value for any type of time series. Simply via different weighting schemes applied together, it encodes:
  • The location of each datapoint in the corresponding fields (price, time, volume)
  • The informational relevance of each datapoint, via windowing functions that are fundamental in nature and go beyond DSP inventions & approximations
  • The innovation in state space (in our case = volatility)


The real beauty of this development is that it is simply a weighting scheme, so it can be applied to anything: a weighted median, weighted quantile regression, a weighted KDE, or a simple weighted mean (as in this script). As long as a method accepts weights, you can harness the power of this entity. That means the final algorithmic complexity will match that of your initial tool.
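To make the "weighting scheme that plugs into anything" idea concrete, here is a minimal sketch (not the script's actual code, which is in Pine Script): several weight vectors are combined by elementwise multiplication and handed to an ordinary weighted mean. The specific weight values are made up for illustration.

```python
import numpy as np

def uma(values: np.ndarray, weight_sets: list) -> float:
    """Combine several weighting schemes by elementwise multiplication,
    then take a plain weighted mean. Any weight-accepting estimator
    (weighted median, weighted KDE, ...) could be swapped in here."""
    w = np.ones(len(values))
    for ws in weight_sets:
        w = w * ws
    return float(np.average(values, weights=w))

prices = np.array([100.0, 101.0, 102.0, 101.5])
time_w = np.array([1.0, 2.0, 3.0, 4.0])     # newer bars weigh more
vol_w  = np.array([10.0, 5.0, 20.0, 15.0])  # per-bar volume as weight
print(uma(prices, [time_w, vol_w]))          # ≈ 101.57
```

Because the combined weights are just a vector, the algorithmic complexity stays that of the underlying estimator, exactly as claimed above.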

As a moving 'average' it beats ALMA, KAMA, MAMA, VIDYA and all the others, because it is a simple and general entity: all it does is encode 'all' the available information. I think this post might anger a lot of people, because a lot of things will be revealed as legacy and many paywalls will be ignored, especially among followers of the DSP cult, the ones who don't yet understand that aggregated tick data is not a signal: it's a completely different type of time series, where their methods simply don't fit even closely. I am also sorry to inform you all that spectral analysis is much closer in spirit to state-space methods than to DSP. But in fact DSP is cool and I love it, well, for actual signals xD

...

Weights explained & how to use them: as I already said, the whole thing is based on combining different sets of weights, and you can turn them on/off in the script settings. By the way, I've set up defaults so you can use it on price data out of the box, right away.

Price, Time, Volume weights: encode the location of every datapoint in the price, time, and volume fields.
How to use: disable the one weight that corresponds to the field you apply UMA to. E.g. if you apply UMA to prices, turn off price weighting and turn on time and volume weighting. If you apply UMA to volume delta, turn off volume weighting and turn on price and time weighting.

Higher prices are more important; this asymmetry is supported by the fact that prices can't be negative (don't even mention that incorrect rollover on the CL contract in 2020...).
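The "disable the target field's own weight" rule above can be sketched as follows. This is my own illustrative reading of the scheme, not the script's code: each field's (normalized) raw values serve as location weights, and the field UMA is applied to is skipped.

```python
import numpy as np

def location_weights(price, time_idx, volume, target="price"):
    """Combine normalized field values into location weights,
    skipping the target field (a series never weights itself)."""
    fields = {"price": price, "time": time_idx, "volume": volume}
    w = np.ones(len(price))
    for name, vals in fields.items():
        if name == target:
            continue
        w = w * (vals / vals.sum())   # normalize each field's contribution
    return w

price  = np.array([100.0, 101.0, 99.5, 102.0])
t      = np.arange(1.0, 5.0)          # later bars get larger time weight
volume = np.array([50.0, 80.0, 40.0, 90.0])
w = location_weights(price, t, volume, target="price")
uma_price = float(np.average(price, weights=w))
```

Here the most recent, highest-volume bar dominates, while the price series itself contributes no weight, matching the rule stated above.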


Signal weights: encode the actuality/importance/relevance of datapoints.
How to use: in DSP terms, this provides smoothing, but it also compensates for the lag that smoothing introduces. The smoothness is useful if you use slope reversals for signal generation, i.e. watching peaks and valleys in the moving average's shape. It's also better to perturb smoothed outputs with this: that way you inject the high-frequency content back, but in a controlled way!

Signal = information.

The fundamental universal entity behind so-called “smoothing” in DSP has nothing to do with signals and goes eons beyond DSP. This is simply about measuring the relevance of data in time.

First, new datapoints need some time to be "embedded" into the timeline. You can think of it as proof-in-time: things need time to be proved and accepted, while the earliest datapoints lose relevance over time.

Second, alongside the first notion there is a counter-notion that simply weights new data more, acting as a counterweight to the down-weighting of the latest datapoints introduced by the first notion.

The first part can be represented as the PDF of a Beta(2, 2) window (a set of weights in our case). It's actually well known as the Welch window, which lives between the so-called statistical and DSP worlds and emerges in multiple contexts. Mainstream DSP users mostly don't use this one, though; they use primitive legacy windowing functions, all kinds of which you can find on this wiki page.

Now the second part, where DSP adepts usually stop, is to introduce a second, compensating windowing function. Instead they try to reduce the window size, introduce other kinds of volatility weights, or do other tricks, but that obviously doesn't deliver. The natural step here is to simply use the integral of the initial window: if the initial window is Beta(2, 2), then all we need is the CDF of Beta(2, 2), in fact its vertically inverted shape, aka the survival function. That's it. Simple as that.

When both of these are applied you get something magical: your output becomes smooth and yet does not lag. No arbitrary windowing functions, no tricks with data modification, etc.
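The two components can be written out directly. The Beta(2, 2) PDF is 6x(1−x) (the Welch window) and its CDF is 3x² − 2x³. In the sketch below I assume the newest bar maps to x ≈ 0 and the oldest to x ≈ 1, so the survival function 1 − CDF up-weights the newest bars; measuring "lag" as the weighted mean distance from the newest bar shows the compensation at work.

```python
import numpy as np

n = 50
x = (np.arange(n) + 0.5) / n            # bar positions in (0, 1); x≈0 = newest

welch    = 6.0 * x * (1.0 - x)           # Beta(2,2) PDF = Welch window
survival = 1.0 - (3 * x**2 - 2 * x**3)   # 1 - Beta(2,2) CDF, up-weights new bars
signal_w = welch * survival              # combined signal weights

# "Lag" as the weighted mean distance from the newest bar:
lag_signal = float(np.average(x, weights=signal_w))
lag_boxcar = float(np.average(x, weights=np.ones(n)))
print(lag_signal, lag_boxcar)            # combined weights sit closer to now
```

Analytically the combined window's center of mass is 13/35 ≈ 0.371 versus 0.5 for the boxcar, so the survival factor pulls the average toward the present while the Welch factor keeps it smooth.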

Why Beta(2, 2)? It naturally arises in many contexts, and it's based on one of the most fundamental functions in the universe: x^2. It has finite support. I can talk more about it on request, but I am absolutely sure this is it.


[Image: impulse response of the resulting weights together (green) compared with uniform weights, aka boxcar (red). Made with this script.]


Weighting by state: encodes the state-space innovation of each datapoint, basically the magnitude and strength of changes, aka volatility.
How to use: this makes your moving average volatility-aware in a mathematically proper way. The influence of datapoints is stronger where changes are stronger. This is weighting by innovations, i.e. weighting by volatility using squared returns.

Why squared returns? They encode state‑space innovations properly because the innovation of any continuous‑time semimartingale is about its quadratic variation, and quadratic variation is built from squared increments, not absolute increments.
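A minimal sketch of the state weighting, with made-up prices for illustration: each bar is weighted by its squared log return (a discrete proxy for the quadratic-variation increment), with a tiny epsilon so perfectly flat bars don't vanish entirely.

```python
import numpy as np

prices  = np.array([100.0, 100.5, 99.0, 103.0, 103.1])
logp    = np.log(prices)
returns = np.diff(logp, prepend=logp[0])   # first bar gets a zero return
state_w = returns**2 + 1e-12               # squared returns + epsilon floor

uma_vol_aware = float(np.average(prices, weights=state_w))
```

The bar with the largest move (99 → 103) dominates the weights, so the average is pulled toward where the innovation happened, not merely toward the most recent print.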

Adaptive length is not the right way to introduce volatility adaptivity xD. When you weight datapoints by squared returns you are already dynamically varying the 'effective' data size; you don't need anything else.
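The "effective data size" claim can be quantified with the Kish effective sample size, (Σw)² / Σw² (my choice of measure, not something the script computes): uniform weights keep the full window, while a single volatility burst collapses the effective window onto it, which is exactly the adaptivity an adaptive-length scheme tries to fake.

```python
import numpy as np

def effective_size(w: np.ndarray) -> float:
    """Kish effective sample size of a weight vector."""
    return float(w.sum()**2 / (w**2).sum())

calm  = np.array([1.0, 1.0, 1.0, 1.0])      # uniform weights: quiet market
burst = np.array([0.01, 0.02, 4.0, 0.03])   # one dominant squared return
print(effective_size(calm))                  # 4.0: all bars count
print(effective_size(burst))                 # ≈ 1.03: window collapses to the burst
```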


...

It's all good, progress happens, that's how the Universe works, and that's how the Universal Moving Average works. Time to evolve. I might update my other scripts with this complete weighting scheme, either on my own initiative or at your request.

...
